Monday, February 9, 2026

Pioreactor: An automated Raspberry Pi bioreactor

Welcome to another glorious #MakerMonday, on which we celebrate your Raspberry Pi builds. Today, let’s take a look at ‘Pioreactor’ — an amazing project designed to automate long bioreactor experiments, featured in the latest issue of Raspberry Pi Official Magazine.

Whilst at the Open Hardware Summit 2025 in Edinburgh, UK, we met Gerrit, who spoke about growing food with electricity while representing AMYBO, an online community dedicated to developing sustainable protein food sources. 

The talk is excellent, and it’s available on YouTube. In it, you’ll find plenty of information about the Pioreactor: a tiny automated bioreactor that allows complex science to take place on your desk. It’s powered by a Raspberry Pi and has become a go-to tool for many professional, amateur, and hobbyist biologists and chemists, including the AMYBO research community.

What is a bioreactor?

A bioreactor is a vessel that provides an optimised environment for growing cells, microorganisms, and microbial cultures. In its simplest form, it could just be a jar, but the term ‘bioreactor’ commonly describes more complex setups in which the environment can be controlled and automated. Bioreactors are typically used in the development of pharmaceuticals, as well as in the food sciences, medical sciences, and many other chemistry- and biology-adjacent sectors. 

Figure 1: The small 20ml glass vial allows small samples and cultures to be grown

The Pioreactor is small: the working volume of our version is just 20ml (see Figure 1). You definitely aren’t going to grow enough algae for your fuel cell, or to create a decent food supply. For experiments and research, however, it offers a wide range of environmental controls straight out of the box. It’s capable of automating experiments over long periods of time and can also log data about the experiments you schedule it to perform. You can see the bill of materials (BOM) on the AMYBO documentation website.

How to build the Pioreactor

Building the Pioreactor is pretty straightforward — all you need is a Pioreactor kit. Gerrit is the co-founder of LabCrafter, a company that supplies open-source science equipment, including the OpenFlexure microscope kits we wrote about in issue 158. It also stocks the current models of the Pioreactor, as well as numerous add-on accessories and expansions. The kit arrived from LabCrafter really well packed, in nice, sustainable, recyclable packaging, ready to be built.

You can build a Pioreactor using any choice of Raspberry Pi model, whether that’s a Model A, a Model B, or any of the Zero-series boards. We went for a Raspberry Pi 5 with 4GB RAM (Figure 2), as we knew this would provide great performance. You begin by simply attaching a base to your Raspberry Pi, followed by some standoffs, before finally fitting the Pioreactor HAT onto Raspberry Pi 5’s GPIO header. The instructions are online and they are excellent. Do, however, double-check which version of the Pioreactor you have, as the assembly approach has changed slightly for the recent v1.5 hardware design update. 

We then move on to assembling the ‘wetware’ section: the main chamber of the Pioreactor that holds the glass vial that contains your experiment. Fit the supplied O-rings to the base of the vial chamber and the chamber wall, then insert the small heater element into the chamber. The clearances for various parts of the mechanism are quite accurate (Figure 3), so you need to double-check that you are assembling it using the correct bolts; handily, the packaging has a labelled, to-scale image of all the bolts to check them against. 

Figure 2: You can build a Pioreactor with various models; our build uses a 4GB Raspberry Pi 5

There are numerous holes in the side of the chamber wall, allowing for the addition of an optical system later on. The included optical system consists of an infrared (IR) LED and two photodiodes in the same 5mm LED form factor. These are fitted later in the build, allowing you to automatically measure the optical density of your experiment, an obvious proxy for growth. Imagine starting with a reasonably clear liquid in which an organism, such as yeast, is growing; over the course of the experiment, the optical density would be expected to increase as the growing culture clouds the mixture and obscures more and more of the IR LED’s light.
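The optical density calculation itself is simple. The Pioreactor’s own software handles this for you, but as a rough sketch (the function and variable names here are illustrative, not the Pioreactor API), the idea looks like this:

```python
import math

def optical_density(photodiode_reading, blank_reading):
    """Estimate optical density from an IR photodiode reading.

    Uses the Beer-Lambert relation OD = -log10(I / I0), where I0 is the
    intensity measured through a 'blank' vial of clear medium and I is
    the intensity measured through the growing culture.
    """
    if photodiode_reading <= 0 or blank_reading <= 0:
        raise ValueError("readings must be positive")
    return -math.log10(photodiode_reading / blank_reading)

# A culture that lets through only 10% of the blank's light has an OD of 1
print(optical_density(0.1, 1.0))  # ≈ 1.0
```

As the culture grows and the mixture clouds, the photodiode reading falls and the computed OD rises.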

Also on the chamber wall is a small rectangular aperture with some small threaded inserts in the corner. This is a ‘viewing window’ (Figure 4), and there is a supplied blanking plate for this area. The viewing window is also designed to receive additional hardware. One option is to add an Adafruit AS7341 sensor, which is a pretty well-featured spectrometer. You can purchase this separately, and there is software for this device that enables you to directly retrieve readings from it. 

Figure 3: The underside of the vial chamber has flush screws, allowing the fan to freely rotate close to the surface

An upper faceplate sits between the Pioreactor HAT and the vial chamber. The faceplate has a mount for a fan unit, and the vial chamber assembly fits on top of the fan’s mounting bolts. The fan, you will notice, has been retrofitted with a pair of strong magnets (Figure 5). This is because the fan isn’t really used as a fan — the magnets actually create a stirring mechanism for inside the vial. Supplied with the kit is a tiny, plastic-covered metal stir bar that sits inside the vial; when the fan is instructed to turn, the magnets cause the stir bar to spin, allowing you to schedule periodic agitations of your experiment.

Assembly continues by attaching the fan and the vial chamber to the upper faceplate, then mounting the faceplate and attachments onto the HAT and the Raspberry Pi. Rugged connections are made for the fan/stirrer cable, and the heater element’s ribbon cable is also fitted at this point.

Figure 4: There’s a window in the chamber wall for viewing your experiment, or for attaching add-ons like the Adafruit AS7341 spectrometry board

Finally, add the IR LED and the two supplied photodiodes and cover them with the neatly designed protective covers (Figure 6). The kit includes some spare covers, so for now, we can cover the additional chamber holes with them (these holes are there so you can add further LEDs, depending on your experiment’s needs). Many bioreactor experiments require some form of light source, so it’s common to mount 5mm LEDs of the target wavelength into these holes. 

Installing the software

Once you have all of the hardware assembled, it’s time to grab a microSD card and install the software that runs your Pioreactor. This is neatly achieved using the custom Raspberry Pi OS image supplied by the Pioreactor team. With the latest version of the official Raspberry Pi Imager application, installation is easy.

Figure 5: The included fan unit isn’t actually used as a fan; it’s modified to run the magnetic stirring system

After launching Raspberry Pi Imager, you need to click the ‘App Options’ button on the main page.

If you’re using the AppImage version of the Raspberry Pi Imager tool on Ubuntu, you’ll need to run Imager with root permissions. To do this, navigate to the directory where the AppImage is and then launch Imager with:

 $ sudo ./imager_2.0.0_amd64.AppImage

On the App Options page, you can edit the ‘content repository’ tab and add the custom URL for the Pioreactor OS image. If you then reboot Imager, you should be able to select your Raspberry Pi device and see the Pioreactor OS available to install to your microSD card. 

Figure 6: Adding the IR LED and photodiodes

The Pioreactor instructions walk you through this process really well. In essence, Imager will prompt you to set localisation settings, then enter a specific username and password from the Pioreactor instructions, along with your own Wi-Fi network credentials. Don’t worry, though: you can change these down the line. 

Booting your Pioreactor

Once the software is installed, you can boot your Pioreactor by connecting a power supply. We made sure to use an official Raspberry Pi power supply, and after a few minutes we saw a blue LED blinking on the Pioreactor HAT. Then, on a laptop connected to the same Wi-Fi network, we opened a web browser and navigated to http://pioreactor.local. A pop-up window asked us to confirm which Pioreactor version we have; after selecting this, a wonderful dashboard for our Pioreactor appeared (Figure 7).

Figure 7: The web interface is neat and easy to explore

There’s lots you can check, even without beginning to run an experiment on your Pioreactor. As a simple test, you can select the ‘Profiles’ tab from the left-hand side of the screen and then choose the ‘Demo Stirring Example’ from the drop-down list of available profiles. This little community-contributed example will turn on the stirring system at a particular number of revolutions per minute (RPM), increase the speed of stirring after 90 seconds, then stop stirring after three minutes. If you remove the lid from your vial chamber before running this, you can watch the stir bar in action.
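Under the hood, Pioreactor experiment profiles are written in YAML. As a rough sketch (field names and values here are approximate; check the Pioreactor documentation for the exact schema), a stirring profile like this demo might look something like:

```yaml
experiment_profile_name: demo_stirring_example
metadata:
  author: Pioreactor community

common:
  jobs:
    stirring:
      actions:
        - type: start            # begin stirring immediately
          hours_elapsed: 0.0
          options:
            target_rpm: 400
        - type: update           # speed up after 90 seconds (0.025 h)
          hours_elapsed: 0.025
          options:
            target_rpm: 800
        - type: stop             # stop after three minutes (0.05 h)
          hours_elapsed: 0.05
```

Timings in profiles are expressed in hours elapsed since the profile started, which is why 90 seconds appears as 0.025.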

Similarly, if you click the ‘Pioreactors’ tab from the list on the left-hand side, you can select your Pioreactor (you can run multiple from a single interface) and assign it as ‘leader’. Then, if you click the ‘Manage’ button, you will see a list of ‘Activities’ that you can run or stop running inside your experiment, impacting things like stirring, optical density, temperature control, and more. 

Experimenting

A good first experiment is described on the AMYBO website. As written, it’s used to calibrate two Pioreactors against each other, but you could also run it as a test for a single Pioreactor. Essentially, you grow yeast in a yeast extract peptone dextrose (YPD) broth, a common growth medium used in all manner of microbiological cultivations. The Pioreactor stirs and warms the mixture while taking periodic optical density measurements to track the yeast’s growth (Figure 8).

Figure 8: An example set of results from a calibration test between two Pioreactors growing yeast

The Pioreactor is a capable device in its standalone form, but there are lots of add-ons and modifications available or in development within the community. For example, Figure 9 shows an expanded Pioreactor system that can push CO2 through the liquid in the chamber, which can be used to remove other volatile compounds from a sample. This process is known as ‘sparging’. The CO2 sparging system is well engineered, but can be made using simple items like the CO2 bottle from a Sodastream device, which is widely available and readily refillable. This modified Pioreactor also has peristaltic pumps with surgical tubing, enabling accurate dosing of additional material into a given experiment. Some of the AMYBO experiments need hydrogen and oxygen to be present in the vial chamber, and this can be achieved through in-chamber electrolysis, all of which is being explored and developed. It’s superb to see the community building and developing these complex tools for everyone.

Figure 9: A Pioreactor setup with lots of additional features, including CO2 sparging, in-chamber electrolysis, and more

The new issue of Raspberry Pi Official Magazine is out now!

You can grab this issue from Tesco, Sainsbury’s, Asda, WHSmith, and other newsagents, including the Raspberry Pi Store in Cambridge. It’s also available from our online store, which ships around the world. And you can get a digital version via our app on Android or iOS.

You can also subscribe to the print version of our magazine. Not only do we deliver worldwide, but people who sign up to the six- or twelve-month print subscription get a FREE Raspberry Pi Pico 2 W!

The post Pioreactor: An automated Raspberry Pi bioreactor appeared first on Raspberry Pi.



from News - Raspberry Pi https://ift.tt/orOTWKD


Thursday, February 5, 2026

Beige is back: Remembering the BBC Micro with Raspberry Pi 500+

The BBC Microcomputer System, or BBC Micro, taught a generation how to use personal computers. Raspberry Pi exists partly because of that legacy. Our CEO and co-founder Eben Upton’s own journey began with a Beeb, and when he recently floated the idea of making a Raspberry Pi 500+ look like a BBC Micro, it felt less like a gimmick and more like a polite nod to four decades of British computing.

The BBC Micro was released in 1981. Manufactured by Acorn Computers, it had an 8-bit CPU running at 2MHz, and came in two main variants: the 16KB Model A, initially priced at £299, and the more popular 32KB Model B, priced at £399. According to the Bank of England’s inflation calculator, Model B would set you back something in the region of £1600 today. So, it was expensive to say the least. Despite this, it went on to sell over 1.5 million units, and was found in almost every UK school at the time. The BBC Micro’s entire memory could comfortably fit inside a modern emoji, but at the time it felt revolutionary, offering up a whole new world to the masses.

Back to BASICs

Within minutes of starting the makeover, I discovered that beige spray paint is unsurprisingly not very popular anymore — especially this exact shade, which reminds me of nicotine-stained pub wallpaper. A couple of purchases later, I found one that just about did the job. After a quick disassembly of a Raspberry Pi 500+ (which is designed to be taken apart so you can upgrade the SSD), a coat of primer, and a top coat of RAL 1001 Beige enamel spray paint, we had the base of our imitation Micro.

But that old-school beige was not the classic computer’s only distinguishing feature; the BBC Micro also had a very distinctive set of keycaps. For those above a certain age, the keyboard is instantly recognisable — mostly for its bright red function keys, which seem to cry out “we do something powerful”. In practice, they were programmable macros for BBC BASIC commands (RUN, LIST, etc.), and their vibrant colour made them feel special, almost like hardware buttons rather than just keys.

Because Raspberry Pi 500+ was built with customisation in mind, recreating this look was easy; the keycaps could easily be swapped out using the removal tool included with every purchase. Signature Plastics LLC offer a variety of unique, high-quality keycaps, and they certainly delivered on our request for this project. Within minutes, the transformation was complete. My hat respectfully doffed to an iconic British computer that introduced millions of people to computing.

Microcomputer, major impact

Raspberry Pi’s all-in-one PCs have always been inspired by the home computers of the 1980s, and much like the classics, they help put high-performance, programmable computers into the hands of people all over the world.

Raspberry Pi 500+ is our most premium product yet, giving you a quad-core 2.4GHz processor, 16GB of memory, 256GB of solid-state storage, modern graphics and networking, and a complete Linux desktop, all built into a beautiful mechanical keyboard. In 1981, this would have represented more raw processing power than every BBC Micro in a typical school combined. In simple terms, it delivers computing on an entirely different scale: around a million times more processing power, well over half a million times more memory, and several million times more storage. Not bad for the price of a routine car service — before they “find something”, anyway…
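The memory comparison, at least, is easy to sanity-check with a quick back-of-the-envelope calculation (using binary units, so 1GB = 1024³ bytes):

```python
# Raspberry Pi 500+ memory vs a BBC Micro Model B (32KB)
pi500_plus_ram = 16 * 1024**3   # 16GB in bytes
bbc_model_b_ram = 32 * 1024     # 32KB in bytes

ratio = pi500_plus_ram // bbc_model_b_ram
print(ratio)  # → 524288, i.e. "well over half a million times more memory"
```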


Tuesday, February 3, 2026

RP2350 Hacking Challenge 2: Less randomisation, more correlation

At the end of July 2025 — almost six months ago — we launched the second RP2350 Hacking Challenge, searching for practical side-channel attacks on the power-hardened AES implementation underpinning RP2350’s secure boot. So far, we don’t have a winner, so we have decided to evolve the challenge by removing one of the core defence-in-depth features: the randomisation of memory accesses.

Our AES implementation was designed to withstand side-channel attacks by using multi-way secret sharing (where sensitive values are split into random components that must be XORed together) and by randomly permuting the order of operations and data. We hope that even just the multi-way shares are enough to protect us against side-channel attacks; hence, we have decided to update our challenge:

If you manage to demonstrate a successful attack on our AES implementation without the randomisation, you win!

For this, we have created a new version of the challenge in the Hacking Challenge 2 GitHub repository. You will notice the new aes_no_random.S, which disables all RNG-based randomisation.

We’ve also added a Unicorn-based emulation example to help you develop attacks virtually!

I didn’t understand any of this?!

The secure boot protection of firmware on RP2350 relies on AES — the Advanced Encryption Standard — to decrypt the firmware from external flash into the on-chip SRAM. AES in itself is considered very secure; however, a lot of software and hardware implementations are susceptible to so-called side-channel attacks. By recording and analysing hundreds of thousands (or even millions) of power traces on the chip, attackers might be able to recover the encryption key.
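To get a feel for why unprotected implementations leak, here is a toy correlation attack on a simulated Hamming-weight leakage model. This is a deliberately simplified illustration of the statistical idea, nothing like attacking real hardened hardware:

```python
import random

HW = [bin(x).count("1") for x in range(256)]  # Hamming-weight lookup table

def correlation(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

random.seed(0)
true_key = 0x3C
plaintexts = [random.randrange(256) for _ in range(500)]
# Simulated 'power traces': leakage proportional to the Hamming weight
# of a key-dependent intermediate value, plus measurement noise
traces = [HW[p ^ true_key] + random.gauss(0, 0.5) for p in plaintexts]

# For every key guess, correlate the predicted leakage against the traces;
# the correct guess produces the strongest correlation
best_guess = max(range(256),
                 key=lambda k: correlation([HW[p ^ k] for p in plaintexts], traces))
print(hex(best_guess))  # recovers the key byte
```

Masking and randomisation exist precisely to destroy this kind of correlation between guessed intermediates and measured power.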

To protect against this, we worked with some very smart folks to build an AES implementation that is hardened against these kinds of attacks. Now we are putting it to the test by offering a bounty to the first person who successfully manages to attack our AES via side channels!
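One of the key countermeasures mentioned above, multi-way secret sharing, can be illustrated in a few lines of Python (a toy sketch, not the production implementation): a sensitive byte is split into random shares whose XOR reconstructs the original, so no single share on its own reveals anything about the secret.

```python
import secrets
from functools import reduce

def split_into_shares(value: int, n_shares: int = 3) -> list[int]:
    """Split one secret byte into n random shares that XOR back to it."""
    shares = [secrets.randbelow(256) for _ in range(n_shares - 1)]
    # The final share is chosen so the XOR of all shares equals the secret
    final = reduce(lambda a, b: a ^ b, shares, value)
    return shares + [final]

def reconstruct(shares: list[int]) -> int:
    return reduce(lambda a, b: a ^ b, shares)

secret = 0x2B
shares = split_into_shares(secret)
assert reconstruct(shares) == secret  # each share alone looks random
```

A hardened implementation computes on the shares individually and only ever recombines them implicitly, so the power consumption at any instant depends on random values rather than the secret itself.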

I’m almost there…

Getting close but don’t have a successful attack yet? Write to us! We care more about protecting our implementation than about having a full end-to-end attack. If you’ve identified a leak, we want to talk to you!

What we know so far

During our initial work on the AES implementation, we found some correlation that lets us differentiate between an all-zeros key and an all-ones key. However, we were unable to build a model that significantly reduces the key space.

A bit more time on the clock

To give you a little more time to keep hacking, we’re extending the deadline to 30 April 2026. The prize remains unchanged at $20,000.

Head to the Hacking Challenge 2 repo to view the updated challenge software.


Monday, February 2, 2026

More memory-driven price rises

Two months ago, we announced increases to the prices of some Raspberry Pi 4 and 5 products. These were driven by an unprecedented rise in the cost of LPDDR4 memory, thanks to competition for memory fab capacity from the AI infrastructure roll-out.

Price rises have accelerated as we enter 2026, and the cost of some parts has more than doubled over the last quarter. As a result, we now need to make further increases to our own pricing, affecting all Raspberry Pi 4 and 5, and Compute Module 4 and 5, products that have 2GB or more of memory.

Memory density   Price increase
1GB              no change
2GB              $10
4GB              $15
8GB              $30
16GB             $60

Raspberry Pi 500 and 500+ are affected, but not Raspberry Pi 400, which remains our lowest-cost all-in-one PC at $60. We have also been able to protect the pricing of 1GB products, including the $35 1GB Raspberry Pi 4 variant, and the $45 1GB Raspberry Pi 5 variant that we launched in December.

We don’t anticipate any changes to the price of Raspberry Pi Zero, Raspberry Pi 3, and other older products, as we currently hold several years’ inventory of the LPDDR2 memory that they use.

Looking ahead

2026 looks likely to be another challenging year for memory pricing, but we are working hard to limit the impact. We’ve said it before, but we’ll say it again: the current situation is ultimately a temporary one, and we look forward to unwinding these price increases once it abates.


Thursday, January 29, 2026

SmartCoop: Controlling chickens with Java

The new issue of Raspberry Pi Official Magazine is here, and with it, this smart chicken coop project. With SmartCoop, a Raspberry Pi monitors feed and water levels, and schedules the opening and closing of the main door based on preconfigured times and weather data.

Owning a small flock of chickens means regularly opening and closing the coop’s main door, collecting the eggs, and making sure there is enough food and water. Given that most of this needs to be done daily, you’ll need to arrange for someone to perform these tasks if you want to get away for more than a day or two.

Enter SmartCoop. One of the key design goals behind this project was to ensure the system was robust enough that its creator, Dave Duncanson, could be away for up to a week without anyone physically attending to it, while still preventing the local foxes from getting to the chickens.

The main gate is opened and closed automatically; sensors measure things like water and food levels

Dave started working on SmartCoop over ten years ago, and the current version contains the fourth generation of his custom-made PCB. With this new design, he was able to upgrade the system to a Raspberry Pi Zero 2 W.

The full system contains an array of automated doors, light sensors, manual push buttons, water tank measurement tools, feeders, and so on. On the software side, an MQTT broker distributes the data, while a Java application based on Pi4J uses live weather data from an API, along with measurements from the sensors, to open and close the gates, keep track of feeding, and perform other tasks. 

Raspberry Pi Zero 2 W is mounted on a custom PCB with ports connected to multiple sensors

The project evolved not only because the technology changed, but also because it was shaped by nature. Dave was struggling with a fox that loved to hunt the chickens and had learnt when the gates would open automatically. Because of this, the system was adapted to open and close the gate based on light sensor measurements, constrained by the expected dawn and dusk times, rather than at fixed, predictable times.
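The resulting gate logic can be sketched in a few lines. This is a hypothetical simplification, not SmartCoop’s actual Java code, and the threshold and window values are invented for illustration (the real values are configurable):

```python
from datetime import time

# Illustrative values only -- the real SmartCoop settings are configurable
LIGHT_THRESHOLD = 300                      # sensor units meaning "daylight"
DAWN_WINDOW = (time(6, 0), time(9, 0))     # gate may only open in this window
DUSK_WINDOW = (time(16, 30), time(20, 0))  # gate may only close in this window

def in_window(now: time, window: tuple) -> bool:
    start, end = window
    return start <= now <= end

def gate_action(now: time, light_level: int, gate_open: bool) -> str:
    """Decide what the coop gate should do right now.

    The gate reacts to actual light levels, but only inside the expected
    dawn/dusk windows, so it never moves at a predictable fixed time.
    """
    if not gate_open and light_level >= LIGHT_THRESHOLD and in_window(now, DAWN_WINDOW):
        return "open"
    if gate_open and light_level < LIGHT_THRESHOLD and in_window(now, DUSK_WINDOW):
        return "close"
    return "hold"

print(gate_action(time(7, 15), light_level=450, gate_open=False))  # → open
```

Because the trigger is the measured light level rather than a clock time, the exact opening moment drifts day to day with the weather and the season.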

Another problem was caused by Dave’s teenager. As anyone with kids will confirm, teenagers tend to forget a lot of important things, like closing the gate of the coop. To combat this, the SmartCoop monitors the gate and the status of the food and water supply, alerting a configurable number of people when something is wrong.

The chicken coop door is controlled by a daylight sensor

In the future, a UHF RFID reader, combined with an RFID ring for each of the chickens, could be added to the system to monitor whether they are all inside at night. By installing another one of these readers in each of the laying boxes, it would even be possible to keep track of the most (or least) productive chickens.

Raspberry Pi + ESP32

Around 80% of the core functionality is handled by a Raspberry Pi Zero 2 W running a Java application, which uses the Pi4J library to control the GPIO pins and interact with I2C devices. It also stores data in an H2 database and provides GPS and NTP functionality, event scheduling, and a template-based web interface.

The remaining work is handled by the ESP32. Its initial role was simply to use RTC interrupts to power Raspberry Pi on and off at preset, configurable times to conserve battery; its functionality has since been extended, and it now also checks the door positions and motor encoders. Because many existing Arduino examples also work on the ESP32, these were used to understand how some components are controlled before the code was ported to Java and Pi4J.

Chicken town

Dave is the first to admit that this solution is most likely over-engineered and therefore not cost-effective — but it’s the perfect way to fully automate his chicken coop. He has no plans to turn it into a commercial product either; instead, he shares both the software and hardware on Bitbucket.



Wednesday, January 28, 2026

Raspberry Pi Smart Display Module: coming soon

For those attending Integrated Systems Europe (ISE) 2026 in Barcelona, a visit to the Sharp booth might reveal something new, exciting, and not yet released…

We’ve been working with Sharp Display Solutions Europe to develop the Raspberry Pi Smart Display Module: an adapter board for Raspberry Pi Compute Module 5 that is designed to deliver high-quality, low-power display experiences for professional signage applications.

The Raspberry Pi Smart Display Module enables users in the audio-visual and digital signage markets to integrate the power, flexibility, and energy efficiency of Compute Module 5 into compatible display screens, with no external media player, cabling, or power source required. The module also provides HDMI output to support a second independent video stream, along with an M.2 expansion slot for optional AI acceleration.

Conforming to the Intel® SDM specification, the Raspberry Pi Smart Display Module slots directly into displays that support Intel’s standard, drawing power from the display itself. With the computer embedded inside the screen, installations are clean, reliable, and easy to maintain, making the Smart Display Module ideal for applications such as flight information systems, retail and corporate signage, and industrial displays.

We designed the Raspberry Pi Smart Display Module to be as straightforward to assemble as possible — customers can install it themselves without any specialist tools.

Enabling edge AI for digital signage

As organisations increasingly explore AI-powered digital signage, the Raspberry Pi Smart Display Module offers an efficient and practical solution. Able to integrate easily with compatible AI accelerators, the module enables edge AI processing to take place directly inside the screen it is paired with. This allows users to run analytics and AI-driven applications locally, privately, and in real time, without reliance on cloud-based services.

Raspberry Pi technology is already used by thousands of businesses and powers hundreds of thousands of screens worldwide; the introduction of the Raspberry Pi Smart Display Module further expands that ecosystem. By embedding AI capability directly into their display solutions, businesses can innovate rapidly and adapt to changing requirements with an energy-efficient, easy-to-integrate modular solution.

See it for yourself

ISE 2026 is taking place from 3–6 February 2026 at Fira de Barcelona, Gran Via. Visitors to the Sharp booth will be able to see the Raspberry Pi Smart Display Module in action ahead of its launch later this year.


Tuesday, January 27, 2026

Streamline dataset creation for the Raspberry Pi AI Camera

An AI project often begins with building a quality dataset, which can be a complex and time-consuming task. The dataset contains the data you want to use to train, test, and verify your AI model. This tutorial introduces a practical approach to help simplify the process.

With the Sony IMX500 sensor on the Raspberry Pi AI Camera, you can use your own datasets to improve your AI models. Whether you’re an experienced maker or just beginning to explore the world of edge AI, this guide will help you organise, refine, and export datasets with ease. Let’s look at how this tool can support you in building smarter AI models, faster.

The challenge of dataset creation

Dataset preparation is an important yet sometimes challenging aspect of vision AI projects. Capturing images, organising them, cropping out irrelevant details, and ensuring they’re formatted correctly is a lot of work. This process can be a roadblock that slows down progress or discourages you from starting in the first place. But with the right setup and tools, you can simplify these tasks and focus on your AI development.

Figure 1: The GUI Tool web interface

Setting up and getting started

For this tutorial, we will use a tool that provides some convenient features for dataset creation: GUI Tool. This makes it easier to capture images that are very close to the deployment environment and highly suitable for training, since the data comes directly from the IMX500 image sensor.

GUI Tool runs on a Raspberry Pi with an AI Camera attached, and you access it via a web browser using another computer on the same network.

To run the tool, you’ll need Node.js and uv software:

$ sudo apt install nodejs npm
$ curl -LsSf https://astral.sh/uv/install.sh | sh

Check that everything installed correctly with:

$ node --version
$ npm --version
$ uv --version

Now clone the repository from GitHub:

$ git clone https://github.com/SonySemiconductorSolutions/aitrios-rpi-sample-app-gui-tool

Navigate into the new folder and install the software in the root of the folder:

$ make setup

To start the GUI Tool, run:

$ uv run main.py

To access the tool over your network, you’ll need your Raspberry Pi’s IP address or hostname. Find the IP address with:

$ hostname -I

…or the hostname with:

$ hostname

Access the GUI Tool

Now move to the second computer on your local network and open a browser. Navigate to:

http://<your-raspberrypi-IP-address>:3001

…to access the tool’s interface.

You can also access the GUI Tool directly from your Raspberry Pi and AI Camera via:

http://127.0.0.1:3001

You will see the GUI Tool web interface as shown in Figure 1.

Creating a dataset using the IMX500 sensor

Once the setup is complete, you can use the GUI Tool to create and organise your dataset. Choose the ‘Images’ tab in the sidebar and click ‘Add’ to create a new dataset. Give the dataset a name in the pop-up window; for example, ‘car-dataset’ (Figure 2). Click ‘Add’ to create the dataset.

Now we need to add images by uploading them from your computer. For this tutorial, we have used the Vehicles-OpenImages Dataset from Roboflow (Figure 3).

Click ‘Upload’ and choose an image from your Raspberry Pi OS file system. The image will appear in the car dataset (as in Figure 4).

Figure 2: Create a new dataset

Capture images with the camera

It is also possible to use the GUI Tool to automate image capture directly from a camera attached to your Raspberry Pi. If you have a Raspberry Pi AI Camera connected, you can also gather input tensor data alongside the raw image.

Choose the ‘Camera preview’ tab to view the image from your camera.

Select collection: Click ‘Select Collection’ and choose a dataset to add the images to.

Input: Click the ‘Timer’ switch to automate image capture at set intervals. For example, to capture a frame every 10 seconds for 50 images, set the capture rate to 0.1 and the number of photos to 50. Activate the image capture and let the tool handle the rest.

Input tensor: The Raspberry Pi AI Camera works differently to traditional image processing systems. The IMX500 sensor includes an internal ISP that preprocesses the sensor data and supplies the input tensor directly to its on-board AI accelerator chip. So, for optimal performance, it’s highly recommended that you train models using the exact input tensor data produced by the IMX500 sensor, rather than relying on raw images or preprocessed images only. This ensures that the model learns from data that precisely matches the runtime conditions, which leads to better model performance.

Fortunately, we can very quickly get this input tensor data by enabling the ‘Input Tensor’ flag during the image capturing process.

Start capture: Click the camera icon to start the image capture process.
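The timer-driven capture above can also be reproduced from the command line with rpicam-still on Raspberry Pi OS, for reference. A sketch, assuming a camera is attached (the output file name is a placeholder):

```shell
#!/bin/sh
# A capture rate of 0.1 frames per second means one frame every 10 seconds
RATE=0.1
COUNT=50
INTERVAL_MS=$(awk -v r="$RATE" 'BEGIN { printf "%d", 1000 / r }')         # 10000 ms between frames
TOTAL_MS=$(awk -v i="$INTERVAL_MS" -v n="$COUNT" 'BEGIN { print i * n }') # run long enough for all frames
echo "Capturing $COUNT frames, one every $((INTERVAL_MS / 1000))s"

# Only attempt the capture when the camera stack is actually installed
if command -v rpicam-still >/dev/null 2>&1; then
  rpicam-still -t "$TOTAL_MS" --timelapse "$INTERVAL_MS" -o car_%03d.jpg
fi
```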

Figure 3: The Vehicles-OpenImages Dataset from Roboflow is a good test bed of images for training a vehicle detection model

Manage images

Head to the ‘Images’ tab to upload, delete, or capture images directly into your dataset to keep it organised.

Once your dataset is ready, click ‘Images’, then click the cog icon next to your dataset. Select ‘Download’ to save the images as a ZIP file on your computer.
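Once the ZIP is on your computer, a quick check of its contents before annotation can save time later. A sketch, assuming the export was saved as car-dataset.zip (the file name is a placeholder):

```shell
#!/bin/sh
# Count the images in an extracted dataset directory
count_images() {
  find "$1" \( -name '*.jpg' -o -name '*.png' \) 2>/dev/null | wc -l
}

ZIP=car-dataset.zip   # the file saved via the cog icon's 'Download' option
if [ -f "$ZIP" ]; then
  mkdir -p car-dataset
  unzip -q "$ZIP" -d car-dataset
  echo "Extracted $(count_images car-dataset) images"
fi
```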

Practical example: Recognising cars

Imagine you’re developing an AI model for car recognition with the IMX500 sensor. Here’s what the process might look like:

  1. Create a ‘car-dataset’ dataset
  2. Capture images of cars using the IMX500 sensor
  3. Automate the capture process to ensure consistency
  4. If needed, crop images to focus on relevant areas, such as individual cars
  5. Organise and manage these images within the tool
  6. Export the dataset and use it to annotate and train your AI model
Figure 4: The vehicle dataset added to GUI Tool

Training your AI model

Once your dataset is ready, the next step is annotation, followed by training with TensorFlow or PyTorch. Alternatively, for a streamlined and user-friendly experience, you can use a dedicated tool to simplify these steps. One tool that can assist you is Brain Builder for AITRIOS (Figure 5) from the Studio Series of AI tools and services for AITRIOS.

Annotating

Annotating your dataset is a critical step in training an AI model because it teaches the AI exactly what you want it to learn. If the annotations contain mistakes, the model will learn those mistakes as well, which can reduce its accuracy.

There are many tools available for annotation, such as Roboflow or COCO Annotator, that help you label your datasets according to the type of model you plan to train.

When choosing an annotation tool, make sure to check which export formats it supports. Your dataset must be exported in a format compatible with the AI model you want to train.
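As a concrete example, many object-detection pipelines expect the YOLO text format, in which each label line holds five fields: a class ID followed by a normalised centre x, centre y, width, and height. The format choice here is an assumption, so substitute whatever your training framework requires; a minimal sanity check might look like this:

```shell
#!/bin/sh
# Return success if a label line has exactly five whitespace-separated
# fields, as the YOLO detection format requires
check_yolo_line() {
  echo "$1" | awk 'NF == 5 { ok = 1 } END { exit !ok }'
}

check_yolo_line "0 0.48 0.53 0.22 0.31" && echo "valid label"
check_yolo_line "0 0.48 0.53" || echo "malformed label"
```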

Figure 5: Sony AITRIOS Brain Builder software can simplify the process of training AI models

Training

Once your dataset is annotated and exported, you are ready to start training. We suggest you follow your chosen framework’s guides on how to create a training script and what hardware you might need.

Brain Builder for AITRIOS

This tool is designed to simplify the annotation and training process, which might be helpful for users with varying levels of AI expertise. With Brain Builder for AITRIOS, you can annotate and train your AI models in a few steps, all inside the same tool. This means your annotated dataset can be sent straight into training, already in the right format.

Brain Builder for AITRIOS currently supports three types of models: Classification, Object Detection, and Anomaly Hi-Fi. You can train and evaluate your model and, when you are happy with the accuracy, export it for IMX500 without any hassle.

Deploying your AI model

Once your model is trained, you can package it and then deploy it on the IMX500:

  1. Package your model on your Raspberry Pi
  2. Build an application to visualise the results, such as counting cars

Creating datasets isn’t just a technical task — it’s a gateway to collaboration, learning, and real-world innovation. The possibilities are wide-ranging: educators can introduce students to AI and machine learning; makers can build smarter IoT devices, such as home security systems or gesture recognition tools; and researchers can accelerate their work on projects including wildlife conservation, medical imaging, and more.

This tutorial featured in Raspberry Pi Official Magazine #161

You can grab this issue from Tesco, Sainsbury’s, Asda, WHSmith, and other newsagents, including the Raspberry Pi Store in Cambridge. It’s also available from our online store, which ships around the world. And you can get a digital version via our app on Android or iOS.

You can also subscribe to the print version of our magazine. Not only do we deliver worldwide, but people who sign up to the six- or twelve-month print subscription get a FREE Raspberry Pi Pico 2 W!

The post Streamline dataset creation for the Raspberry Pi AI Camera appeared first on Raspberry Pi.




Thursday, January 22, 2026

Raspberry Pi Flash Drive available now from $30: a high-quality essential accessory

A USB flash drive is one of those small essentials you reach for from time to time to back up data or transfer files between your Raspberry Pi and other computers. For basics like these, it’s tempting to grab the cheapest thing on Amazon or whatever you find in your local supermarket, but you can easily end up with a device that has sluggish read and write speeds, fragile casing, or – worst of all – far less storage capacity than it claims. Better to go with something you can rely on: introducing the Raspberry Pi Flash Drive, a compact, high-capacity USB 3.0 USB‑A device with fast data transfer and an all‑aluminium enclosure. It’s available now at $30 for 128GB, or $55 for 256GB.

We’ve brought our usual exacting standards and attention to detail to our new accessory. It can sustain a write speed of 75MB/s (128GB variant) or 150MB/s (256GB variant), and our thorough testing has made sure it can handle the demands of real life when it comes to sudden disconnection and power failure. Its ergonomic all-aluminium enclosure is easy to grasp and almost impossible to break, although you’ll manage it if, like jdb of this parish, you go at it with a blowtorch. It has an attachment hole so you can keep it on a keyring or similar. The Raspberry Pi logo is etched with classy understatement onto its upper surface.

Fast and robust

Like many high-density NAND flash storage devices, the Raspberry Pi Flash Drive employs a small reservation of pseudo-SLC (pSLC) cache to improve performance under bursty write workloads. In the background, any writes that landed in pSLC are streamed out to the higher-density, but slower, QLC flash. The advantage is significant: for short periods, the sequential write speed can be almost as fast as USB 3.0 allows.

This cache does, however, make benchmarking challenging. For this reason, the USB 3.0 performance figures we quote are sustained figures, where writes are measured when the cache is forced to do write‑through due to the volume of writes already committed, and reads are measured with the cache empty.
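You can get a rough at-home version of a sustained-write number with dd, provided you write more data than the pSLC cache can absorb. The sketch below writes only 64MiB to keep it quick; for a real measurement, point OUT at a file on the mounted drive and write several gigabytes (the path is a placeholder):

```shell
#!/bin/sh
# Write zeros and flush to the device before dd reports its rate; dd's final
# output line includes the measured throughput
OUT=${OUT:-./dd-test.bin}
dd if=/dev/zero of="$OUT" bs=4M count=16 conv=fsync 2>&1 | tail -n 1
```

For rigorous numbers, a tool such as fio offers direct I/O and controlled queue depths.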

It goes without saying that whatever internal storage arrangement is used, it must be robust against surprise removal or power failure. We verified that our new flash drive meets this requirement over tens of thousands of random power cycles while running intermittently intensive I/O workloads.

Bonus features

In addition to being fast, we made sure that these drives support SSD-style SMART health reporting to help you to manage the device lifespan, as well as supporting TRIM operations. They will also autonomously enter low-power USB 3.0 states when idle.
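On a Linux host, both features can be exercised with standard tools. A sketch, defining the commands as functions so you can run them as root with your own device node and mount point (both defaults are placeholders; smartmontools provides smartctl):

```shell
#!/bin/sh
# Query SMART health data; USB-SATA/UAS bridges often need '-d sat'
flash_health() {
  smartctl -d sat -a "${1:-/dev/sda}"
}

# Discard unused blocks on the mounted filesystem (TRIM)
flash_trim() {
  fstrim -v "${1:-/media/usb}"
}
```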

More handy essentials from Raspberry Pi

Our new flash drive joins a growing range of rigorously specified and robustly tested Raspberry Pi accessories designed to make your day-to-day computing life as friction-free as possible. Raspberry Pi SD Cards and Raspberry Pi SSDs offer you a choice of storage solutions; the four-way Raspberry Pi USB 3 Hub provides an excellent alternative to unsatisfactory price/quality compromises elsewhere; and the Raspberry Pi Bumper is exactly what you need to protect the base and edges of your Raspberry Pi 5, without obstructing access to anything else.

The new Raspberry Pi Flash Drive gives you compact, portable storage with reliable performance for both 128GB and 256GB capacity options. Grab one from a Raspberry Pi Approved Reseller today.

The post Raspberry Pi Flash Drive available now from $30: a high-quality essential accessory appeared first on Raspberry Pi.




Wednesday, January 21, 2026

USB gadget mode in Raspberry Pi OS: SSH over USB

If you’ve ever tried using a Raspberry Pi — or any single-board computer — while travelling, you probably know how frustrating it can be. Hotel rooms with no spare Ethernet ports, conference Wi-Fi behind captive portals, networks that block local discovery tools, or simply not knowing what IP address your headless board received can all turn a simple task into a hassle.

Last year, I came across a concept that sounded like the ideal solution: Ethernet over USB. The idea is beautifully simple — plug the Raspberry Pi into a laptop and it appears as a USB network adapter, just like when you enable USB tethering on a smartphone. That would mean no Wi-Fi setup, no IP scanning, no captive portal headaches — just plug in, SSH, and start working. Bonus: The host computer could even share its internet connection over that same cable.

At least, that’s the theory.

Raspberry Pi Zero 2 W connected to a Laptop

In reality, getting this to work has traditionally involved a mix of outdated scripts, manual configuration steps, and platform-specific instructions that only reliably supported one host OS at a time — Windows, macOS, or Linux, but rarely all three. Many great community efforts exist, but they often require you to clone repositories, edit system files, or manually switch the Raspberry Pi between Internet Connection Sharing (ICS) and normal local networking — and ICS is typically treated as an optional afterthought, rather than part of a unified workflow.

I wanted to streamline that experience — not to replace community solutions, but to offer a clean, all-in-one option that “just works”, regardless of whether the user is a first-time Raspberry Pi owner or someone deploying a fleet of headless boards.

So I started a project with a clear goal in mind: to make a single Debian package that enables USB gadget networking straight out of the box on all supported Raspberry Pi boards, and across all major host operating systems.

Introducing rpi-usb-gadget

Starting with Raspberry Pi OS Trixie images dated 20.10.2025 and later, a new package called rpi-usb-gadget is included by default. It can be enabled with a single toggle in Raspberry Pi Imager, making USB networking setup drastically simpler.

Once enabled:

  • Your Raspberry Pi will present itself as a USB Ethernet device when connected to a PC
  • You can SSH directly using the hostname you set in Raspberry Pi Imager — no Wi-Fi or Ethernet setup required
  • If your PC has an active internet connection and ICS is enabled, the Raspberry Pi will automatically receive internet access through the same USB cable
  • A lightweight background service runs on the Raspberry Pi to detect host connectivity and automatically switch between standalone mode and ICS-backed networking
  • In practice, it behaves very similarly to USB tethering on a smartphone — but for Raspberry Pi

The package is supported on all major host systems: Windows, macOS, and Linux.

Important hardware note

To use USB gadget mode, the Raspberry Pi must be connected to a USB port that supports OTG (device mode):

  • Raspberry Pi Zero, Zero W, Zero 2 W: the micro USB port closest to HDMI — not ‘PWR IN’
  • Raspberry Pi 4, 5, 500, 500+: the USB-C port directly on the board
  • Compute Module 5: the USB-C port on the Raspberry Pi CM5 IO Board
  • Compute Module 4: requires additional manual setup and is not auto-configured

Warning:
Once gadget mode is enabled, the selected port will function exclusively as USB networking + power input; it will no longer operate as a regular USB host port. This means that keyboards, storage devices, or other peripherals cannot be connected to that port while gadget mode is active.

Supported boards:

  • Raspberry Pi Zero (W) and Zero 2 W
  • Raspberry Pi 3 Model A+
  • Raspberry Pi 4 Model B
  • Raspberry Pi 5, 500, and 500+
  • Compute Module 5
  • Compute Module 4 (technically supported, but additional manual setup is required)

For optimal stability — especially on Raspberry Pi 4, 5, and 500/500+ — connect the Raspberry Pi directly to a USB port on your PC. Some laptop USB ports cannot provide sufficient power, which may cause reboots or USB link drops.

Recommended accessory: The Raspberry Pi USB 3 Hub allows you to power the device externally while still passing only data over the USB connection; this is ideal for laptops with weak USB power delivery.

Enabling gadget mode the easy way: Raspberry Pi Imager 2.0

  1. Generate a capabilities-enhanced manifest
  2. Double-click the generated os_list_local.rpi-imager-manifest file
  3. Select a Raspberry Pi OS Trixie image (20.10.2025 or newer)
  4. In the ‘Customisation’ menu, set a hostname
    • This is the name you’ll use to SSH into the Raspberry Pi
    • If ICS is disabled on the host, the fallback IP will be 10.12.194.1
  5. Go to ‘Interfaces & Features’ and toggle ‘Enable USB Gadget Mode’
  6. Write the image, insert the card into your Raspberry Pi, and connect it to your PC using the correct USB/OTG port (not just the power input)
  7. Power on the Raspberry Pi (the first boot may take longer than usual and might reboot once — this is expected)
  8. Once booted, your Raspberry Pi should appear as a new Ethernet adapter on your host machine
    • You can now SSH using the hostname you set
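Once the adapter appears, connecting is a single command. The sketch below tries the mDNS hostname first and falls back to the documented standalone address; the hostname raspberrypi.local and the user pi are placeholders for whatever you set in Imager:

```shell
#!/bin/sh
# Pick the SSH target: the mDNS hostname if it resolves, otherwise the
# fallback IP used when ICS is disabled on the host
gadget_target() {
  if getent hosts "$1" >/dev/null 2>&1; then
    echo "$1"
  else
    echo "10.12.194.1"
  fi
}

echo "Connect with: ssh pi@$(gadget_target raspberrypi.local)"
```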

Windows driver requirement

Windows does not include a generic driver for USB Ethernet gadget devices. To avoid relying on impersonated vendor IDs or unofficial drivers, a dedicated installer is provided: download and run rpi-usb-gadget-driver-setup.exe from the project’s releases page on GitHub.

This only needs to be done once per Windows machine.

Internet Connection Sharing (ICS)

If you want your Raspberry Pi to access the internet through the USB connection, enable Internet Connection Sharing (ICS) on your host computer.

On Windows

  1. Plug in the Raspberry Pi and confirm it shows up as a new Ethernet adapter
    → It will appear under a name like ‘Ethernet 7 — Raspberry Pi USB Remote NDIS Network Device’
  2. Open ‘Network Connections’:
    Control Panel → Network and Internet → Network and Sharing Center → Change adapter settings
    → Or press Win + R and enter ncpa.cpl
  3. Identify your primary internet-connected adapter (e.g. Wi-Fi) and open ‘Properties’
  4. Go to the ‘Sharing’ tab, enable ‘Allow other network users to connect…’, and, in the dropdown, choose the Raspberry Pi USB Ethernet adapter
  5. Confirm your selection; within around 60 seconds, the Raspberry Pi should obtain an IP address

Note on Windows ICS behaviour:
If ICS is enabled for the Raspberry Pi’s adapter while the Raspberry Pi is not connected, Windows may bind its DHCP service to another interface (such as a Hyper-V adapter). In that case, the Raspberry Pi interface may show as shared but will not receive a DHCP lease. To fix this, fully disable ICS on all adapters you shared the network from, plug in the Raspberry Pi, and then re-enable ICS.

On macOS

  1. Connect the Raspberry Pi and wait for it to appear as a new USB Ethernet device
  2. Open System Settings → General → Sharing → Internet Sharing
  3. Before enabling, click the info icon to configure:
    → Your internet source (e.g. Wi-Fi)
    → The Raspberry Pi USB Gadget interface as the target to share to
  4. Save and toggle ‘Internet Sharing’ on

On Linux

Enable routing and NAT from your primary internet connection to the Raspberry Pi USB Gadget network interface using your distribution’s NetworkManager or equivalent. Instructions vary depending on the desktop environment or init system.
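If your environment has no sharing toggle, the underlying mechanism is IP forwarding plus NAT. A minimal iptables sketch follows; the interface names wlan0 and usb0 are assumptions (check yours with ip link), and nftables or firewalld equivalents work just as well:

```shell
#!/bin/sh
# Define (but do not run) the NAT setup; call as root: setup_ics wlan0 usb0
setup_ics() {
  WAN=${1:-wlan0}   # interface with internet access
  USB=${2:-usb0}    # the Raspberry Pi gadget interface
  sysctl -w net.ipv4.ip_forward=1
  iptables -t nat -A POSTROUTING -o "$WAN" -j MASQUERADE
  iptables -A FORWARD -i "$USB" -o "$WAN" -j ACCEPT
  iptables -A FORWARD -i "$WAN" -o "$USB" \
    -m state --state RELATED,ESTABLISHED -j ACCEPT
}
```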

Enabling USB gadget mode without Imager

This only applies to Raspberry Pi OS Trixie–based images.

Fresh images (before first boot):

Cloud-init can enable USB gadget mode automatically on first boot:

  1. Mount the boot partition
  2. Edit user-data and append:
rpi:
  enable_usb_gadget: true
enable_ssh: true  # Optional but recommended when using gadget mode

It’s recommended that you also define a user and an SSH key in the same file, as the setup wizard cannot be used over SSH, and connecting USB peripherals is not possible on Zero boards while gadget mode is active.
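Putting that advice together, a complete user-data for this scenario might look like the following; the user name and key are placeholders, and the users stanza is standard cloud-init:

```yaml
rpi:
  enable_usb_gadget: true
enable_ssh: true
users:
  - name: pi                             # placeholder user name
    shell: /bin/bash
    sudo: ALL=(ALL) NOPASSWD:ALL
    ssh_authorized_keys:
      - ssh-ed25519 AAAA... you@laptop   # paste your own public key
```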

Existing installation:

1. Verify that you are running Raspberry Pi OS Trixie:

cat /etc/os-release

→ Confirm that VERSION_CODENAME=trixie.

2. Install and enable gadget mode:

sudo apt update
sudo apt install rpi-usb-gadget
sudo rpi-usb-gadget on
sudo reboot

After reboot, the USB port will switch into gadget mode. Any active SSH sessions will temporarily drop and then reconnect once the USB Ethernet interface becomes available. Depending on the host system, give it up to one minute for DHCP/ICS negotiation to settle.
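The version check from step 1 can also be scripted, so that provisioning tools bail out early on the wrong release. A sketch:

```shell
#!/bin/sh
# Succeed only on Raspberry Pi OS Trixie (Debian 13)
is_trixie() {
  [ -r /etc/os-release ] && . /etc/os-release
  [ "${VERSION_CODENAME:-}" = "trixie" ]
}

if is_trixie; then
  echo "OK: Trixie detected"
else
  echo "rpi-usb-gadget needs Raspberry Pi OS Trixie" >&2
fi
```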

Technical details

The rpi-usb-gadget package configures the g_ether USB gadget kernel module, which exposes a virtual Ethernet interface according to the host OS’s capabilities:

  • Windows hosts → RNDIS mode
  • macOS/Linux hosts → CDC-ECM mode

The correct mode is automatically selected at runtime based on USB descriptor negotiation — no manual selection is required.

Why no USB serial console?

CDC-ACM (serial over USB) is not included, as Windows cannot bind both RNDIS/ECM and ACM to a single composite USB device using one .inf file without vendor-specific drivers.

A lightweight background service runs on the Raspberry Pi and continuously monitors:

  • USB link state
  • DHCP/ICS availability on the host
  • Routing and DNS status

If ICS is detected on the host, gateway and DNS configurations are automatically applied to provide seamless internet access over USB.
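On the Raspberry Pi itself, you can see what the service has applied with a couple of standard commands. A sketch, in which the interface name usb0 is an assumption:

```shell
#!/bin/sh
# Report the gadget interface address and the default route state
gadget_status() {
  IF=${1:-usb0}
  ip addr show "$IF" 2>/dev/null || echo "$IF: interface not present"
  ip route 2>/dev/null | grep '^default' \
    || echo "no default route (ICS off, or negotiation still in progress)"
}

gadget_status usb0
```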

Troubleshooting

If the connection doesn’t work immediately, work through the following checks:

Hardware and cabling

  • Ensure you are using the correct USB/OTG port:
    → On Zero models, this must be the port labeled USB, not PWR IN
  • Connect directly to a USB port on the host PC — avoid hubs or docks that may block OTG negotiation or limit power
  • Only connect one Raspberry Pi at a time in gadget mode to avoid host-side interface conflicts
  • If the Raspberry Pi reboots repeatedly or disconnects, it may not be receiving enough power
    → Use a powered USB hub or the Raspberry Pi USB 3 Hub to supply external power while keeping data routed through the host

ICS and DHCP behaviour

  • After enabling ICS, wait up to one minute for DHCP to issue an IP, and for hostname resolution to begin working
  • On Windows, if the gadget adapter shows as ‘Shared’ but no IP is assigned:
    → Disable ICS completely on all interfaces, plug in the Raspberry Pi, then re-enable ICS and reassign the correct adapter
  • Windows may list multiple ‘Ethernet X’ adapters from previous attempts
    → Consider removing or disabling unused adapters to prevent routing conflicts

Hostname and mDNS

  • If ssh pi@hostname.local does not resolve on Windows, install Bonjour/mDNS support or use the assigned IP address directly (shown in arp -a)

Source code and driver downloads

This feature is now included by default in Raspberry Pi OS Trixie images, but all code and tooling remain open for inspection and contribution. You can find the full source, the documentation, and the Windows driver installer (rpi-usb-gadget-driver-setup.exe) here.

The repository also contains an issue tracker, as well as reports from real-world setups — particularly involving ICS on Windows and macOS — which are extremely valuable. Different host systems behave slightly differently when assigning DHCP, routing, or firewall rules, so community feedback helps make the experience more reliable for everyone.

Wrapping up

USB gadget mode brings a Raspberry Pi much closer to being a plug-and-go development device — no Wi-Fi setup, no IP scanning, no HDMI, and no keyboard required. Just connect a single USB cable, SSH in, and start building.

This feature has been designed to be simple enough for beginners to use, yet robust and scriptable enough for power users and large-scale deployments. Future updates will continue to refine host detection, ICS handling, and diagnostics. If you have ideas or edge cases to share, the GitHub issue tracker and the Raspberry Pi Forums are open.

The post USB gadget mode in Raspberry Pi OS: SSH over USB appeared first on Raspberry Pi.


