Monday, February 16, 2026

Object detection with Ultralytics YOLO26 on Raspberry Pi

In celebration of #MakerMonday this week, we’re taking a look at how well YOLO’s AI models deploy and run on Raspberry Pi. This is an exceptionally in-depth tutorial, so we’re sharing just part of the installation with you here; the rest of the tutorial can be found in the latest issue of Raspberry Pi Official Magazine, pages 70–77.

YOLO (You Only Look Once) is a powerful object detection model created by Ultralytics that enables you to identify content in images and videos from the command line and Python. From here, you can perform classification and respond to images or videos with your code.

When paired with a Raspberry Pi Camera Module, YOLO forms a powerful means of identifying objects that your Raspberry Pi board can react to; you can use it with sensors and actuators connected to the Raspberry Pi to perform real-time identification and reaction. You can also use it to analyse images and video files.

Using YOLO to download an image and perform inference on it

YOLO26 has just been released, and the YOLO26n model is what we are using here. It’s custom-built for speed, accuracy, and versatility. You can use YOLO out of the box or train your own datasets on it.
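
If you already have the ultralytics Python package installed, a first inference run looks something like this minimal sketch (the yolo26n.pt weights name is our assumption, following Ultralytics’ usual naming convention):

from ultralytics import YOLO

# Load the YOLO26 nano model (weights are downloaded on first use)
model = YOLO("yolo26n.pt")

# Run inference on the sample bus image from the Ultralytics website
results = model("https://ultralytics.com/images/bus.jpg")

# Print each detected class name with its confidence score
for result in results:
    for box in result.boxes:
        print(model.names[int(box.cls)], float(box.conf))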

In this tutorial, we’re going to look at installing the Ultralytics framework and using it with images and video files, both online and stored locally on our computer. We’ll also look at setting up Docker so that you can install the environment and the programs needed.

The YOLO26n model performing image inference alongside the image of a bus, downloaded from the Ultralytics website

You don’t need a Raspberry Pi Camera Module for this, but a reasonably powerful Raspberry Pi will help — we are using a Raspberry Pi 5 for this tutorial. In future tutorials, we will look at integrating a Raspberry Pi Camera Module.

Install Docker

Set up your Raspberry Pi 5 with Raspberry Pi OS (see Raspberry Pi Documentation for help with these steps). We start by installing Docker Engine in Raspberry Pi OS.

Add Docker to apt

To install Docker Engine, you should be running the latest version of Raspberry Pi OS, based on Debian Trixie (though it will also work on Bookworm and Bullseye). These instructions follow the Docker documentation guide for Debian. Docker provides separate Raspberry Pi installation instructions, but these are geared towards the old 32-bit version of Raspberry Pi OS, so stick with the Debian install.

Our Python code in Thonny alongside the YOLO26 Docker instance performing image recognition

First, make sure you have uninstalled any old Docker packages. Open a terminal window and enter:

$ sudo apt remove $(dpkg --get-selections docker.io docker-compose docker-doc podman-docker containerd runc | cut -f1)

Unless you have Docker already installed, apt will report that these packages are not found.

Next we’ll add Docker’s official GNU Privacy Guard (GPG) key to the keyrings folder. First, we update apt, then install ca-certificates and curl:

$ sudo apt update
$ sudo apt install ca-certificates curl

These should already be installed. We make sure our keyrings directory has the correct permissions: 0755. This enables the file owner (our admin account) to read, write, and execute; just read and execute permissions are set for groups and others. We do this with a funky install command that is normally used for copying files, but in this instance it’s being used to adjust permissions:

$ sudo install -m 0755 -d /etc/apt/keyrings
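
You can check the result with:

$ ls -ld /etc/apt/keyrings

…which should show permissions of drwxr-xr-x.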

Now we use curl to download Docker’s GPG key and place it into our keyrings directory with the file name docker.asc:

$ sudo curl -fsSL https://download.docker.com/linux/debian/gpg -o /etc/apt/keyrings/docker.asc

We need to ensure that all users can read the docker.asc file. We do this with the standard chmod command with a+r options:

$ sudo chmod a+r /etc/apt/keyrings/docker.asc

Next comes a funky multi-line piece of code that creates a file called docker.sources in our /etc/apt/sources.list.d/ directory, containing the details of the Docker repository. Enter the first line and you will see a > prompt in the terminal. Enter each subsequent line carefully and press RETURN after each one. Each line is added to the docker.sources text file until you enter EOF (at which point you return to the command line):

$ sudo tee /etc/apt/sources.list.d/docker.sources <<EOF
Types: deb
URIs: https://download.docker.com/linux/debian
Suites: $(. /etc/os-release && echo "$VERSION_CODENAME")
Components: stable
Signed-By: /etc/apt/keyrings/docker.asc
EOF

Check that the docker.sources file has been created correctly:

$ cat /etc/apt/sources.list.d/docker.sources

The output should list the following, where Suites is the VERSION_CODENAME of your operating system (trixie); because the EOF marker was unquoted, the shell expanded the $(…) command before tee wrote the file:

Types: deb
URIs: https://download.docker.com/linux/debian
Suites: trixie
Components: stable
Signed-By: /etc/apt/keyrings/docker.asc 

If there’s a problem, use vim or nano to edit your file:

$ sudo nano /etc/apt/sources.list.d/docker.sources

Check the update

Now update the system and check access to Docker downloads:

$ sudo apt update

The output should include a line like this:

Get:5 https://download.docker.com/linux/debian trixie InRelease [32.5 kB]

Install Docker

Now that the Docker repository is in apt, it’s time to install the various elements. Enter this line in the terminal:

$ sudo apt install docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin

Docker should run automatically after installation. To verify that Docker is running, use:

$ sudo systemctl status docker

Press q to exit the status view and return to the command line. Some systems may have this behaviour disabled and will require a manual start:

$ sudo systemctl start docker

Finally, verify that the installation is successful by running the hello-world image:

$ sudo docker run hello-world

If this is the first run, Docker will pull the hello-world image from Docker Hub. You will see a message containing:

Hello from Docker!

This message shows that your installation appears to be working correctly.
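
By default, every docker command needs sudo. If you would rather run Docker as your own user, you can add yourself to the docker group, then log out and back in (note that members of the docker group effectively have root-level access, so only do this on accounts you trust):

$ sudo usermod -aG docker $USER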

Check out the latest issue of Raspberry Pi Official Magazine to learn how to finish setting up Docker and start using YOLO26n.

Issue 162 of Raspberry Pi Official Magazine is out now!

You can grab this issue from Tesco, Sainsbury’s, Asda, WHSmith, and other newsagents, including the Raspberry Pi Store in Cambridge. It’s also available from our online store, which ships around the world. And you can get a digital version via our app on Android or iOS.

You can also subscribe to the print version of our magazine. Not only do we deliver worldwide, but people who sign up to the six- or twelve-month print subscription get a FREE Raspberry Pi Pico 2 W!


Friday, February 13, 2026

Certifying third-party antennas for use with Raspberry Pi Compute Modules

When designing and producing Raspberry Pi devices, we consider as many potential use cases as possible — particularly when it comes to criteria like wireless (WLAN and Bluetooth) performance and antenna usage. While our single-board computers (such as Raspberry Pi 5) include only an on-board PCB antenna, our Raspberry Pi Compute Module range offers two pre-approved options: an on-board PCB antenna and the external whip antenna from the official Raspberry Pi Antenna Kit.

However, we recognise that some industrial and commercial customers may need to employ third-party antennas for their applications. Example scenarios include:

  • Embedding a Compute Module within a metal enclosure, where the PCB antenna would perform poorly due to the Faraday cage effect
  • Extending the communication distance of a device, which requires increased antenna gain
  • Integrating an antenna with a different form factor, such as a flexible PCB antenna

In such cases, the Compute Module and new antenna may be required to undergo additional testing and certification before the product can be sold. While procedures vary depending on the market and the device’s features, Raspberry Pi is well placed to support our customers in meeting these additional requirements — either by updating our existing certifications or by obtaining new certifications on their behalf.

Compliance requirements

For new antennas, compliance requirements depend on whether the antenna gain is less than, equal to, or higher than the approved gain value. Alternative antenna options are therefore split into two categories:

  • Antenna gain is equal to or less than the approved antenna gain
  • Antenna gain is higher than the approved antenna gain

Antenna gain plot for the external whip antenna included in the Raspberry Pi Antenna Kit (Source: Antenna Patterns)

To help our commercial and industrial customers meet regulatory requirements in either gain scenario, we’ve put together a white paper outlining the certifications and testing procedures required in a number of our key markets.

Different markets, different regulations

For example, in the UK and EU, integrators can adopt an antenna with a gain less than or equal to the gain of the antenna used for the original certification without needing to carry out any further spectrum usage testing. For antennas with higher gain, the required action depends on how much higher the new antenna’s gain is, as this determines whether some or all of the spectrum usage tests need to be repeated. Integrators are, however, encouraged to carry out spurious emissions tests and other electromagnetic compatibility tests on all alternative antennas, regardless of their gain.

Antenna gain plot for our on-board PCB Niche antennas (Source: Antenna Patterns)

In Japan, all antennas must be approved by the country’s Ministry of Internal Affairs and Communications (MIC), and all antenna options must be listed, but no additional testing is required. Similarly, in South Korea and Taiwan, all antennas must comply with each country’s regulations — but further testing is required for antennas with higher gain. In Vietnam and Mexico, no modifications to the device’s existing certifications are required; however, manufacturers must ensure that the radiated output power of the antenna does not exceed the regulatory limits.
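
As a rough worked example of how radiated output power is assessed (ours, not taken from the white paper): effective isotropic radiated power is calculated as

EIRP (dBm) = transmitter output power (dBm) + antenna gain (dBi) - cable loss (dB)

so a 17dBm transmitter feeding a 3dBi antenna through a lossless cable radiates 20dBm (100mW) EIRP, which is exactly the EU limit for 2.4GHz Wi-Fi. Swapping in a higher-gain antenna without reducing transmit power would push such a device over the limit.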

For a full list of requirements in several of Raspberry Pi’s key markets, refer to the handy table in our white paper.

Using pre-approved Raspberry Pi antennas

To avoid potential compliance issues or additional costs altogether, manufacturers, integrators, and end users can employ Raspberry Pi’s existing antenna architecture, which is already fully compliant in all of our key markets.

Newer Raspberry Pi single-board computers and microcontrollers include an integrated PCB Niche antenna, providing on-board Wi-Fi and Bluetooth connectivity as standard. Raspberry Pi Compute Module 4 and 5 also feature one of these PCB Niche antennas, along with a built-in U.FL connector for attaching an external antenna.

The U.FL connector on Compute Module 4 and 5 can be fitted with the omnidirectional external whip antenna included in our pre-approved Raspberry Pi Antenna Kit, or with another compatible third-party antenna.

Next steps: How Raspberry Pi can help

Should you need further assistance with integrating an alternative antenna —  either during the product design process or after launch — our in-house Global Market Access (GMA) team is fully equipped to handle any additional tests, documentation submissions, or approvals on your behalf. Contact gma@raspberrypi.com with your product requirements, including the proposed antenna options and a list of your target markets (including any not listed above).

The GMA team will review your antenna specifications and advise whether compliance with the relevant market regulations is possible. Once confirmed, the team will update the existing approvals or obtain new ones to include the new antenna, carrying out any additional testing as required.

Disclaimer:

The information provided here and in our white paper is intended to be used as initial guidance only. Customers should always refer to the official regulations and publications issued by the relevant authorities.


Thursday, February 12, 2026

Accessibility improvements for screen readers on raspberrypi.com

We’re committed to making raspberrypi.com work well for everyone. Recently, we’ve been endeavouring to make the site more accessible to people using screen readers, such as JAWS and NVDA on Windows or VoiceOver on macOS and iOS.

Previously, screen reader users had no way to quickly identify the main parts of pages or skip banners and navigation. To address this, we’ve now given each region proper labels and ‘landmarks’ (standard markup recognised by screen readers), allowing you to jump between different parts of the page.

We’ve also corrected our headings so that they follow a logical order and improved our links to reduce duplication; each link is now labelled clearly so that it makes sense on its own, no matter where it appears. Hints and error messages are now associated with their relevant form fields, making it easier to complete forms while using a screen reader.

Simplifying our CAPTCHA protections

The high volume of automated traffic we receive means we often need to distinguish between human users and bots. While we previously relied on hCaptcha to do this, we’ve now implemented Cloudflare Turnstile instead. Rather than presenting users with frustrating visual challenges, Turnstile verifies them behind the scenes using automatic checks.

While hCaptcha does offer an accessibility mode, it requires users to sign up separately and complete extra setup. Turnstile, however, works with screen readers without any extra steps. This change has been applied across raspberrypi.com, including on our forums, the Raspberry Pi ID page, and the Raspberry Pi Connect page. Our board member Chris Mairs found it to be a helpful improvement; he discusses his experience and encourages others to make the switch in a recent post on his blog, The Open Eyed Man.

We now test new and updated pages with a screen reader as part of our development process, checking landmarks, headings, links, and forms. We found Adam Liptrot’s guide to VoiceOver and Deque’s axe accessibility tools particularly helpful here.

Get in touch

If you use a screen reader and run into any issues on our site, or have ideas for further improvements, please get in touch. And if you’d like to improve your own website’s accessibility for screen readers and want to know more details, we’ll try to answer your questions in the comments below.


Monday, February 9, 2026

Pioreactor: An automated Raspberry Pi bioreactor

Welcome to another glorious #MakerMonday, on which we celebrate your Raspberry Pi builds. Today, let’s take a look at ‘Pioreactor’ — an amazing project designed to automate long bioreactor experiments, featured in the latest issue of Raspberry Pi Official Magazine.

Whilst at the Open Hardware Summit 2025 in Edinburgh, UK, we met Gerrit, who spoke about growing food with electricity while representing AMYBO, an online community dedicated to developing sustainable protein food sources. 

The talk is excellent, and it’s available on YouTube. In it, you can learn a great deal about the Pioreactor: a tiny automated bioreactor that allows complex science to take place on your desk. It’s powered by a Raspberry Pi and has become a go-to tool for many professional, amateur, and hobbyist biologists and chemists, including the AMYBO research community.

What is a bioreactor?

A bioreactor is a vessel that provides an optimised environment for growing cells, microorganisms, and microbial cultures. In its simplest form, it could just be a jar, but the term ‘bioreactor’ commonly describes more complex setups in which the environment can be controlled and automated. Bioreactors are typically used in the development of pharmaceuticals, as well as in the food sciences, medical sciences, and many other chemistry- and biology-adjacent sectors. 

Figure 1: The small 20ml glass vial allows for small samples and cultures to be grown

The Pioreactor is small: the working volume of our version is just 20ml (see Figure 1). You definitely aren’t going to grow enough algae for your fuel cell, or to create a decent food supply. For experiments and research, however, it offers a wide range of environmental controls straight out of the box. It’s capable of automating experiments over long periods of time and can also log data about the experiments you schedule it to perform. You can see the bill of materials (BOM) on the AMYBO documentation website.

How to build the Pioreactor

Building the Pioreactor is pretty straightforward — all you need is a Pioreactor kit. Gerrit is the co-founder of LabCrafter, a company that supplies open-sourced science equipment, including the OpenFlexure microscope kits we wrote about in issue 158. It also stocks the current models of the Pioreactor, as well as numerous add-on accessories and expansions. The kit arrived from LabCrafter really well packed, in nice, sustainable, recyclable packaging, ready to be built. 

You can build a Pioreactor using any choice of Raspberry Pi model, whether that’s a Model A, a Model B, or any of the Zero-series boards. We went for a Raspberry Pi 5 with 4GB RAM (Figure 2), as we knew this would provide great performance. You begin by simply attaching a base to your Raspberry Pi, followed by some standoffs, before finally fitting the Pioreactor HAT onto Raspberry Pi 5’s GPIO header. The instructions are online and they are excellent. Do, however, double-check which version of the Pioreactor you have, as the assembly approach has changed slightly for the recent v1.5 hardware design update. 

We then move on to assembling the ‘wetware’ section: the main chamber of the Pioreactor that holds the glass vial that contains your experiment. Fit the supplied O-rings to the base of the vial chamber and the chamber wall, then insert the small heater element into the chamber. The clearances for various parts of the mechanism are quite accurate (Figure 3), so you need to double-check that you are assembling it using the correct bolts; handily, the packaging has a labelled, to-scale image of all the bolts to check them against. 

Figure 2: You can build a Pioreactor with various models; our build uses a 4GB Raspberry Pi 5

There are numerous holes in the side of the chamber wall, allowing for the addition of an optical system later on. The included optical system consists of an infrared (IR) LED and two photodiodes in the same 5mm LED form factor. These are fitted later in the build, allowing you to automatically measure the optical density of your experiment, an obvious metric for growth. Imagine starting with a reasonably clear liquid in which an organism, such as yeast, is growing; over the course of the experiment, the optical density of the liquid would be expected to increase as the IR LED’s light is increasingly obscured by the cloudiness of the mixture.
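
As a rough sketch of the principle (illustrative only, not the Pioreactor’s actual code), optical density is the negative log of transmittance: the ratio of light received through the culture to light received through a clear blank:

import math

def optical_density(reading, blank_reading):
    # reading: photodiode signal with the culture in place
    # blank_reading: signal recorded through a clear (blank) vial
    transmittance = reading / blank_reading
    return -math.log10(transmittance)

# As the culture grows and the liquid clouds, the signal drops and OD rises
print(optical_density(0.5, 1.0))  # ~0.30
print(optical_density(0.1, 1.0))  # 1.0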

Also on the chamber wall is a small rectangular aperture with some small threaded inserts in the corner. This is a ‘viewing window’ (Figure 4), and there is a supplied blanking plate for this area. The viewing window is also designed to receive additional hardware. One option is to add an Adafruit AS7341 sensor, which is a pretty well-featured spectrometer. You can purchase this separately, and there is software for this device that enables you to directly retrieve readings from it. 

Figure 3: The underside of the vial chamber has flush screws, allowing the fan to freely rotate close to the surface

An upper faceplate sits between the Pioreactor HAT and the vial chamber. The faceplate has a mount for a fan unit, and the vial chamber assembly fits on top of the fan’s mounting bolts. The fan, you will notice, has been retrofitted with a pair of strong magnets (Figure 5). This is because the fan isn’t really used as a fan — the magnets actually create a stirring mechanism for inside the vial. Supplied with the kit is a tiny, plastic-covered metal stir bar that sits inside the vial; when the fan is instructed to turn, the magnets cause the stir bar to spin, allowing you to schedule periodic agitations of your experiment.

Assembly continues by attaching the fan and the vial chamber to the upper faceplate, then mounting the faceplate and attachments onto the HAT and the Raspberry Pi. Rugged connections are made for the fan/stirrer cable, and the heater element’s ribbon cable is also fitted at this point.

Figure 4: There’s a window in the chamber wall for viewing your experiment, or for attaching add-ons like the Adafruit AS7341 spectrometry board

Finally, add the IR LED and the two supplied photodiodes and cover them with the neatly designed protective covers (Figure 6). The kit includes some spare covers, so for now, we can cover the additional chamber holes with them (these holes are there so you can add further LEDs, depending on your experiment’s needs). Many bioreactor experiments require some form of light source, so it’s common to mount 5mm LEDs of the target wavelength into these holes. 

Installing the software

Once you have all of the hardware assembled, it’s time to grab a microSD card and install the software that runs your Pioreactor. This is neatly achieved using the custom Raspberry Pi OS image supplied by the Pioreactor team. With the latest version of the official Raspberry Pi Imager application, installation is easy.

Figure 5: The included fan unit isn’t actually used as a fan; it’s modified to run the magnetic stirring system

After booting Raspberry Pi Imager, you need to click the ‘App Options’ button on the main page. 

If you’re using the AppImage version of the Raspberry Pi Imager tool on Ubuntu, you’ll need to run Imager with root permissions. To do this, navigate to the directory where the AppImage is and then launch Imager with:

$ sudo ./imager_2.0.0_amd64.AppImage
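
If this fails with a ‘Permission denied’ error, the downloaded AppImage probably isn’t marked as executable yet; fix that and try again:

$ chmod +x imager_2.0.0_amd64.AppImage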

On the App Options page, you can edit the ‘content repository’ tab and add the custom URL for the Pioreactor OS image. If you then reboot Imager, you should be able to select your Raspberry Pi device and see the Pioreactor OS available to install to your microSD card. 

Figure 6: Adding the IR LED and photodiodes

The Pioreactor instructions walk you through this process really well. In essence, Imager will prompt you to set localisation settings, then enter a specific username and password from the Pioreactor instructions, along with your own Wi-Fi network credentials. Don’t worry, though: you can change these down the line. 

Booting your Pioreactor

Once the software is installed, you can boot your Pioreactor by connecting a power supply. We made sure to use an official Raspberry Pi power supply, and after a few minutes we saw a blue LED blinking on the Pioreactor HAT. Then, on a laptop connected to the same Wi-Fi network, we opened a web browser and navigated to http://pioreactor.local. A pop-up window asked us to confirm which Pioreactor version we have; after selecting this, a wonderful dashboard for our Pioreactor appeared (Figure 7).

Figure 7: The web interface is neat and easy to explore

There’s lots you can check, even without beginning to run an experiment on your Pioreactor. As a simple test, you can select the ‘Profiles’ tab from the left-hand side of the screen and then choose the ‘Demo Stirring Example’ from the drop-down list of available profiles. This little community-contributed example will turn on the stirring system at a particular number of revolutions per minute (RPM), increase the speed of stirring after 90 seconds, then stop stirring after three minutes. If you remove the lid from your vial chamber before running this, you can watch the stir bar in action.

Similarly, if you click the ‘Pioreactors’ tab from the list on the left-hand side, you can select your Pioreactor (you can run multiple from a single interface) and assign it as ‘leader’. Then, if you click the ‘Manage’ button, you will see a list of ‘Activities’ that you can run or stop running inside your experiment, impacting things like stirring, optical density, temperature control, and more. 

Experimenting

A good first experiment is described on the AMYBO website. As written, it’s used to calibrate two Pioreactors to each other, but you could also run the experiment as a test for a single Pioreactor. Essentially, you are going to grow some yeast using a yeast extract peptone dextrose (YPD) broth, which is a common growth medium used in all manner of microbiological cultivations. The experiment basically grows yeast in the YPD broth, stirring and warming the mixture while taking periodic optical density measurements to track its growth (Figure 8). 

Figure 8: An example set of results from a calibration test between two Pioreactors growing yeast

The Pioreactor is a capable device in its standalone form, but there are lots of add-ons and modifications available or in development within the community. For example, Figure 9 shows an expanded Pioreactor system that can push CO2 through the liquid in the chamber, which can be used to remove other volatile compounds from a sample. This process is known as ‘sparging’. The CO2 sparging system is well engineered, but can be made using simple items like the CO2 bottle from a Sodastream device, which is widely available and readily refillable. This modified Pioreactor also has peristaltic pumps with surgical tubing, enabling accurate dosing of additional material into a given experiment. Some of the AMYBO experiments need hydrogen and oxygen to be present in the vial chamber, which can be achieved through in-chamber electrolysis; all of this is being explored and developed. It’s superb to see the community building and developing these complex tools for everyone.

Figure 9: A Pioreactor setup with lots of additional features, including CO2 sparging, in-chamber electrolysis, and more

The new issue of Raspberry Pi Official Magazine is out now!

You can grab this issue from Tesco, Sainsbury’s, Asda, WHSmith, and other newsagents, including the Raspberry Pi Store in Cambridge. It’s also available from our online store, which ships around the world. And you can get a digital version via our app on Android or iOS.

You can also subscribe to the print version of our magazine. Not only do we deliver worldwide, but people who sign up to the six- or twelve-month print subscription get a FREE Raspberry Pi Pico 2 W!


Thursday, February 5, 2026

Beige is back: Remembering the BBC Micro with Raspberry Pi 500+

The BBC Microcomputer System, or BBC Micro, taught a generation how to use personal computers. Raspberry Pi exists partly because of that legacy. Our CEO and co-founder Eben Upton’s own journey began with a Beeb, and when he recently floated the idea of making a Raspberry Pi 500+ look like a BBC Micro, it felt less like a gimmick and more like a polite nod to four decades of British computing.

The BBC Micro was released in 1981. Manufactured by Acorn Computers, it had an 8-bit CPU running at 2MHz, and came in two main variants: the 16KB Model A, initially priced at £299, and the more popular 32KB Model B, priced at £399. According to the Bank of England’s inflation calculator, Model B would set you back something in the region of £1600 today. So, it was expensive to say the least. Despite this, it went on to sell over 1.5 million units, and was found in almost every UK school at the time. The BBC Micro’s entire memory could comfortably fit inside a modern emoji, but at the time it felt revolutionary, offering up a whole new world to the masses.

Back to BASICs

Within minutes of starting the makeover, I discovered that beige spray paint is unsurprisingly not very popular anymore — especially this exact shade, which reminds me of nicotine-stained pub wallpaper. A couple of purchases later, I found one that just about did the job. After a quick disassembly of a Raspberry Pi 500+ (which is designed to be taken apart so you can upgrade the SSD), a coat of primer, and a top coat of RAL 1001 Beige enamel spray paint, we had the base of our imitation Micro.

But that old-school beige was not the classic computer’s only distinguishing feature; the BBC Micro also had a very distinctive set of keycaps. For those above a certain age, the keyboard is instantly recognisable — mostly for its bright red function keys, which seem to cry out “we do something powerful”. In practice, they were programmable macros for BBC BASIC commands (RUN, LIST, etc.), and their vibrant colour made them feel special, almost like hardware buttons rather than just keys.

Because Raspberry Pi 500+ was built with customisation in mind, recreating this look was easy; the keycaps could easily be swapped out using the removal tool included with every purchase. Signature Plastics LLC offer a variety of unique, high-quality keycaps, and they certainly delivered on our request for this project. Within minutes, the transformation was complete. My hat respectfully doffed to an iconic British computer that introduced millions of people to computing.

Microcomputer, major impact

Raspberry Pi’s all-in-one PCs have always been inspired by the home computers of the 1980s, and much like the classics, they help put high-performance, programmable computers into the hands of people all over the world.

Raspberry Pi 500+ is our most premium product yet, giving you a quad-core 2.4GHz processor, 16GB of memory, 256GB of solid-state storage, modern graphics and networking, and a complete Linux desktop, all built into a beautiful mechanical keyboard. In 1981, this would have represented more raw processing power than every BBC Micro in a typical school combined. In simple terms, it delivers computing on an entirely different scale: around a million times more processing power, well over half a million times more memory, and several million times more storage. Not bad for the price of a routine car service — before they “find something”, anyway…


Tuesday, February 3, 2026

RP2350 Hacking Challenge 2: Less randomisation, more correlation

At the end of July 2025 — so almost six months ago — we launched the second RP2350 Hacking Challenge, searching for practical side-channel attacks on the power-hardened AES implementation underpinning RP2350’s secure boot. So far, we don’t have a winner, so we decided to evolve the challenge by removing one of the core defence-in-depth features: the randomisation of memory accesses.

Our AES implementation was designed to withstand side-channel attacks by using multi-way secret sharing (where sensitive values are split into random components that must be XORed together) and by randomly permuting the order of operations and data. We hope that even just the multi-way shares are enough to protect us against side-channel attacks; hence, we have decided to update our challenge:

If you manage to demonstrate a successful attack on our AES implementation without the randomisation, you win!
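
If the secret-sharing terminology is unfamiliar, here is a toy Python sketch of the idea (purely illustrative; the real implementation is hand-written assembly such as aes_no_random.S):

import secrets

def split_byte(value, n=4):
    # Split one byte into n shares that XOR back together to the value.
    # Any n-1 of the shares are uniformly random and reveal nothing alone.
    shares = [secrets.randbelow(256) for _ in range(n - 1)]
    masked = value
    for share in shares:
        masked ^= share
    return shares + [masked]

def combine(shares):
    value = 0
    for share in shares:
        value ^= share
    return value

secret_byte = 0xA5
shares = split_byte(secret_byte)
assert combine(shares) == secret_byte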

To support the updated challenge, we have created a new version of it in the Hacking Challenge 2 GitHub repository. You will notice the new aes_no_random.S, which disables all RNG-based randomisation.

We’ve also added a Unicorn-based emulation example to help you develop attacks virtually!

I didn’t understand any of this?!

The secure boot protection of firmware on RP2350 relies on AES — the Advanced Encryption Standard — to decrypt the firmware from external flash into the on-chip SRAM. AES in itself is considered very secure; however, a lot of software and hardware implementations are susceptible to so-called side-channel attacks. By recording and analysing hundreds of thousands (or even millions) of power traces on the chip, attackers might be able to recover the encryption key.

To protect against this, we worked with some very smart folks to build an AES implementation that is hardened against these kinds of attacks. Now we are putting it to the test by offering a bounty to the first person who successfully manages to attack our AES via side channels!

I’m almost there…

Getting close but don’t have a successful attack yet? Write to us! We care more about protecting our implementation than about having a full end-to-end attack. If you’ve identified a leak, we want to talk to you!

What we know so far

During our initial work on the AES implementation, we found some weak correlation that lets us differentiate between an all-zeros key and an all-ones key. However, we were unable to build a model that significantly reduces the key space.

A bit more time on the clock

To give you a little more time to keep hacking, we’re extending the deadline to 30 April 2026. The prize remains unchanged at $20,000.

Head to the Hacking Challenge 2 repo to view the updated challenge software.


Monday, February 2, 2026

More memory-driven price rises

Two months ago, we announced increases to the prices of some Raspberry Pi 4 and 5 products. These were driven by an unprecedented rise in the cost of LPDDR4 memory, thanks to competition for memory fab capacity from the AI infrastructure roll-out.

Price rises have accelerated as we enter 2026, and the cost of some parts has more than doubled over the last quarter. As a result, we now need to make further increases to our own pricing, affecting all Raspberry Pi 4 and 5, and Compute Module 4 and 5, products that have 2GB or more of memory.

Memory density    Price increase
1GB               no increase
2GB               $10
4GB               $15
8GB               $30
16GB              $60

Raspberry Pi 500 and 500+ are affected, but not Raspberry Pi 400, which remains our lowest-cost all-in-one PC at $60. We have also been able to protect the pricing of 1GB products, including the $35 1GB Raspberry Pi 4 variant, and the $45 1GB Raspberry Pi 5 variant that we launched in December.

We don’t anticipate any changes to the price of Raspberry Pi Zero, Raspberry Pi 3, and other older products, as we currently hold several years’ inventory of the LPDDR2 memory that they use.

Looking ahead

2026 looks likely to be another challenging year for memory pricing, but we are working hard to limit the impact. We’ve said it before, but we’ll say it again: the current situation is ultimately a temporary one, and we look forward to unwinding these price increases once it abates.


Thursday, January 29, 2026

SmartCoop: Controlling chickens with Java

The new issue of Raspberry Pi Official Magazine is here, and with it, this smart chicken coop project. With SmartCoop, a Raspberry Pi monitors feed and water levels, and schedules the opening and closing of the main door based on preconfigured times and weather data.

Owning a small flock of chickens means regularly opening and closing the coop’s main door, collecting the eggs, and making sure there is enough food and water. Given that most of this needs to be done daily, you’ll need to arrange for someone to perform these tasks if you want to get away for more than a day or two.

Enter SmartCoop. One of the key design goals behind this project was to ensure the system was robust enough that its creator, Dave Duncanson, could be away for up to a week without anyone physically attending to it, while still preventing the local foxes from getting to the chickens.

The main gate is opened and closed automatically; sensors measure things like water and food levels

Dave started working on SmartCoop over ten years ago, and the current version contains the fourth generation of his custom-made PCB. This new design allowed him to upgrade the system to a Raspberry Pi Zero 2 W.

The full system contains an array of automated doors, light sensors, manual push buttons, water tank measurement tools, feeders, and so on. On the software side, an MQTT broker distributes the data, while a Java application based on Pi4J uses live weather data from an API, along with measurements from the sensors, to open and close the gates, keep track of feeding, and perform other tasks. 

Raspberry Pi Zero 2 W is mounted on a custom PCB with ports connected to multiple sensors

The project evolved not only because the technology changed, but also because it was being influenced by nature. Dave was struggling with a fox that loved to hunt the chickens and had learnt when the gates would open automatically. Because of this, the system was adapted to treat the expected dawn and dusk times only as a window, opening and closing the gate based on live light sensor measurements so that the schedule varies naturally from day to day; a sketch of the idea follows below.
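
A minimal Python sketch of that logic (illustrative only; SmartCoop itself is a Java application built on Pi4J, and the thresholds and windows here are invented):

from datetime import datetime, time

LIGHT_THRESHOLD = 300                      # hypothetical "daylight" sensor reading
DAWN_WINDOW = (time(5, 30), time(8, 30))   # assumed configurable time windows
DUSK_WINDOW = (time(16, 30), time(21, 0))

def should_open_gate(now: datetime, light: int) -> bool:
    # Open only when we are inside the dawn window AND it is actually light,
    # so the opening time drifts with the seasons and the weather
    in_window = DAWN_WINDOW[0] <= now.time() <= DAWN_WINDOW[1]
    return in_window and light >= LIGHT_THRESHOLD

def should_close_gate(now: datetime, light: int) -> bool:
    # Close once real darkness falls within the dusk window
    in_window = DUSK_WINDOW[0] <= now.time() <= DUSK_WINDOW[1]
    return in_window and light < LIGHT_THRESHOLD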

Another problem was caused by Dave’s teenager. As anyone with kids will confirm, teenagers tend to forget a lot of important things, like closing the gate of the coop. To combat this, the SmartCoop monitors the gate and the status of the food and water supply, alerting a configurable number of people when something is wrong.

The chicken coop door is controlled by a daylight sensor

In the future, a UHF RFID reader, combined with an RFID ring for each of the chickens, could be added to the system to monitor whether they are all inside at night. By installing another one of these readers in each of the laying boxes, it would even be possible to keep track of the most (or least) productive chickens.

Raspberry Pi + ESP32

Around 80% of the core functionality is handled by a Raspberry Pi Zero 2 W running a Java application, which uses the Pi4J library to control the GPIO pins and interact with I2C devices. It also stores data in an H2 database and provides GPS and NTP functionality, event scheduling, and a template-based web interface.

The remaining work is handled by the ESP32. Its initial role was simply to power the Raspberry Pi on and off at preset, configurable times via RTC interrupts, in order to conserve battery; its functionality has since been extended, and it now also checks the door positions and motor encoders. Because many existing Arduino examples also work on the ESP32, these were used to understand how some components are controlled before the code was ported to Java and Pi4J.

Chicken town

Dave is the first to admit that this solution is most likely over-engineered and therefore not cost-effective — but it’s the perfect way to fully automate his chicken coop. He has no plans to turn it into a commercial product either; instead, he shares both the software and hardware on Bitbucket.

The new issue of Raspberry Pi Official Magazine is out now!

You can grab this issue from Tesco, Sainsbury’s, Asda, WHSmith, and other newsagents, including the Raspberry Pi Store in Cambridge. It’s also available from our online store, which ships around the world. And you can get a digital version via our app on Android or iOS.

You can also subscribe to the print version of our magazine. Not only do we deliver worldwide, but people who sign up to the six- or twelve-month print subscription get a FREE Raspberry Pi Pico 2 W!


Wednesday, January 28, 2026

Raspberry Pi Smart Display Module: coming soon

For those attending Integrated Systems Europe (ISE) 2026 in Barcelona, a visit to the Sharp booth might reveal something new, exciting, and not yet released…

We’ve been working with Sharp Display Solutions Europe to develop the Raspberry Pi Smart Display Module: an adapter board for Raspberry Pi Compute Module 5 that is designed to deliver high-quality, low-power display experiences for professional signage applications.

The Raspberry Pi Smart Display Module enables users in the audio-visual and digital signage markets to integrate the power, flexibility, and energy efficiency of Compute Module 5 into compatible display screens, with no external media player, cabling, or power source required. The module also provides HDMI output to support a second independent video stream, along with an M.2 expansion slot for optional AI acceleration.

Conforming to the Intel® SDM specification, the Raspberry Pi Smart Display Module slots directly into displays that support Intel’s standard, drawing power from the display itself. With the computer embedded inside the screen, installations are clean, reliable, and easy to maintain, making the Smart Display Module ideal for applications such as flight information systems, retail and corporate signage, and industrial displays.

We designed the Raspberry Pi Smart Display Module to be as straightforward to assemble as possible — customers can install it themselves without any specialist tools.

Enabling edge AI for digital signage

As organisations increasingly explore AI-powered digital signage, the Raspberry Pi Smart Display Module offers an efficient and practical solution. Able to integrate easily with compatible AI accelerators, the module enables edge AI processing to take place directly inside the screen it is paired with. This allows users to run analytics and AI-driven applications locally, privately, and in real time, without reliance on cloud-based services.

Raspberry Pi technology is already used by thousands of businesses and powers hundreds of thousands of screens worldwide; the introduction of the Raspberry Pi Smart Display Module further expands that ecosystem. By embedding AI capability directly into their display solutions, businesses can innovate rapidly and adapt to changing requirements with an energy-efficient, easy-to-integrate modular solution.

See it for yourself

ISE 2026 is taking place from 3–6 February 2026 at Fira de Barcelona, Gran Via. Visitors to the Sharp booth will be able to see the Raspberry Pi Smart Display Module in action ahead of its launch later this year.


Tuesday, January 27, 2026

Streamline dataset creation for the Raspberry Pi AI Camera

An AI project often begins with building a quality dataset, which can be a complex and time-consuming task. This dataset contains the data you will use to train, test, and verify your AI model. This tutorial introduces a practical approach to help simplify the process.

With the Sony IMX500 sensor on the Raspberry Pi AI Camera, you can use your own datasets to improve your AI models. Whether you’re an experienced maker or just beginning to explore the world of edge AI, this guide will help you organise, refine, and export datasets with ease. Let’s look at how this tool can support you in building smarter AI models, faster.

The challenge of dataset creation

Dataset preparation is an important yet sometimes challenging aspect of vision AI projects. Capturing images, organising them, cropping out irrelevant details, and ensuring they’re formatted correctly is a lot of work. This process can be a roadblock that slows down progress or discourages you from starting in the first place. But with the right setup and tools, you can simplify these tasks and focus on your AI development.

Figure 1: The GUI Tool web interface

Setting up and getting started

For this tutorial, we will use a tool that provides some convenient features for dataset creation: GUI Tool. This makes it easier to capture images that are very close to the deployment environment and highly suitable for training, since the data comes directly from the IMX500 image sensor.

GUI Tool runs on a Raspberry Pi with an AI Camera attached, and you access it via a web browser using another computer on the same network.

To run the tool, you’ll need Node.js and uv software:

$ sudo apt install nodejs npm
$ curl -LsSf https://astral.sh/uv/install.sh | sh

Check that everything installed correctly with:

$ node --version
$ npm --version
$ uv --version

Now clone the repository from GitHub:

$ git clone https://github.com/SonySemiconductorSolutions/aitrios-rpi-sample-app-gui-tool

Navigate into the new folder and run the setup from the root of the repository:

$ cd aitrios-rpi-sample-app-gui-tool
$ make setup

To start the GUI Tool, run:

$ uv run main.py

To access the tool over the network, you’ll need your Raspberry Pi’s IP address:

$ hostname -I

…or its hostname:

$ hostname

Access the GUI Tool

Now move to the second computer on your local network and open a browser. Navigate to:

http://<your-raspberrypi-IP-address>:3001

…to access the tool’s interface.

You can also access the GUI Tool directly from your Raspberry Pi and AI Camera via:

http://127.0.0.1:3001

You will see the GUI Tool web interface as shown in Figure 1.

Creating a dataset using the IMX500 sensor

Once the setup is complete, you can use the GUI Tool to create and organise your dataset. Choose the ‘Images’ tab in the sidebar and click ‘Add’ to create a new dataset. Give the dataset a name in the pop-up window; for example, ‘car-dataset’ (Figure 2). Click ‘Add’ to create the dataset.

Now we need to add images by uploading them from your computer. For this tutorial, we have used the Vehicles-OpenImages Dataset from Roboflow (Figure 3).

Click ‘Upload’ and choose an image from your Raspberry Pi OS file system. The image will appear in the car dataset (as in Figure 4).

Figure 2: Create a new dataset

Capture images with the camera

It is also possible to use the GUI Tool to automate image capture directly from a camera attached to your Raspberry Pi. If you have a Raspberry Pi AI Camera connected, you can also gather input tensor data alongside the raw image.

Choose the ‘Camera preview’ tab to view the image from your camera.

Select collection: Click ‘Select Collection’ and choose a dataset to add the images to.

Input: Click the ‘Timer’ switch to automate image capture at set intervals. For example, to capture a frame every 10 seconds for 50 images, set the capture rate to 0.1 and the number of photos to 50. Activate the image capture and let the tool handle the rest.

Input tensor: The Raspberry Pi AI Camera works differently to traditional image processing systems. The IMX500 sensor includes an internal ISP that preprocesses the sensor data and supplies the input tensor directly to its on-board AI accelerator chip. So, for optimal performance, it’s highly recommended that you train models using the exact input tensor data produced by the IMX500 sensor, rather than relying on raw images or preprocessed images only. This ensures that the model learns from data that precisely matches the runtime conditions, which leads to better model performance.

Fortunately, we can very quickly get this input tensor data by enabling the ‘Input Tensor’ flag during the image capturing process.

Start capture: Click the camera icon to start the image capture process.

Figure 3: The Vehicles-OpenImages Dataset from Roboflow is a good test bed of images for training a vehicle detection model

Manage images

Head to the ‘Images’ tab to upload, delete, or capture images directly into your dataset to keep it organised.

Once your dataset is ready, click ‘Images’, then click the cog icon next to your dataset. Select ‘Download’ to save the images as a ZIP file on your computer.

Practical example: Recognising cars

Imagine you’re developing an AI model for car recognition with the IMX500 sensor. Here’s what the process might look like:

  1. Create a ‘car-dataset’ dataset
  2. Capture images of cars using the IMX500 sensor
  3. Automate the capture process to ensure consistency
  4. If needed, crop images to focus on relevant areas, such as individual cars
  5. Organise and manage these images within the tool
  6. Export the dataset and use it to annotate and train your AI model

Figure 4: The vehicle dataset added to GUI Tool

Training your AI model

Once your dataset is ready, the next step is annotation, followed by training with TensorFlow or PyTorch. Alternatively, for a streamlined and user-friendly experience, you can use a dedicated tool to simplify these steps. One tool that can assist you is Brain Builder for AITRIOS (Figure 5) from the Studio Series of AI tools and services for AITRIOS.

Annotating

Annotating your dataset is a critical step in training an AI model because it teaches the AI exactly what you want it to learn. If the annotations contain mistakes, the model will learn those mistakes as well, which can reduce its accuracy.

There are many tools available for annotation, such as Roboflow or COCO Annotator, that help you label your datasets according to the type of model you plan to train.

When choosing an annotation tool, make sure to check which export formats it supports. Your dataset must be exported in a format compatible with the AI model you want to train.

Figure 5: Sony AITRIOS Brain Builder software can simplify the process of training AI models

Training

Once your dataset is annotated and exported, you are ready to start training. We suggest you follow your chosen framework’s guides on how to create a training script and what hardware you might need.

Brain Builder for AITRIOS

This tool is designed to simplify the annotation and training process, which might be helpful for users with varying levels of AI expertise. With Brain Builder for AITRIOS, you can annotate and train your AI models in a few steps, all inside the same tool. This means your annotated dataset can be sent straight into training, already in the right format.

Brain Builder for AITRIOS currently supports three types of models: Classification, Object Detection, and Anomaly Hi-Fi. You can train and evaluate your model and, when you are happy with the accuracy, export it for IMX500 without any hassle.

Deploying your AI model

Once your model is trained, you can package it and then deploy it on the IMX500:

  1. Package your model on your Raspberry Pi
  2. Build an application to visualise the results, such as counting cars

Creating datasets isn’t just a technical task — it’s a gateway to collaboration, learning, and real-world innovation. The possibilities are wide-ranging: educators can introduce students to AI and machine learning; makers can build smarter IoT devices, such as home security systems or gesture recognition tools; and researchers can accelerate their work on projects including wildlife conservation, medical imaging, and more.

This tutorial featured in Raspberry Pi Official Magazine #161

You can grab this issue from Tesco, Sainsbury’s, Asda, WHSmith, and other newsagents, including the Raspberry Pi Store in Cambridge. It’s also available from our online store, which ships around the world. And you can get a digital version via our app on Android or iOS.

You can also subscribe to the print version of our magazine. Not only do we deliver worldwide, but people who sign up to the six- or twelve-month print subscription get a FREE Raspberry Pi Pico 2 W!
