
In 2021, Raspberry Pi entered the world of microcontrollers — a world that's nothing new, but quite different from that of single-board computers.

How is it different? For the camera applications we discuss in this blog, the Pico has no dedicated camera connector and no official Pi camera module.

Most microcontroller boards are not designed with cameras in mind. For them, image signal processing is no simple task.

This is where Arducam comes in. Arducam has been a long-time player in microcontroller cameras, going back to the Arduino era.

Back in the old days, we managed to add 2MP OV2640 cameras to the Arduino, even multiple ones at once.

This blog is a guide to setting up and using a…

TensorFlow Lite for Microcontrollers is designed to run machine learning models on microcontrollers and other devices with only a few kilobytes of memory.

It supports microcontroller platforms like Arduino Nano 33 BLE Sense, ESP32, STM32F746 Discovery kit, and so on. Since the release of the $4 Raspberry Pi Pico, which has gained increasing popularity among makers, Arducam has been trying to bring what’s possible on other microcontroller platforms to the Pico.

This article is a tutorial on using the machine learning framework TensorFlow Lite Micro on the Pico for person detection.

Getting started

See Getting Started with the Raspberry Pi Pico

The Raspberry Pi is a powerful single-board computer that helps you build proofs of concept for your ideas.

To make the Raspberry Pi smart enough to realize what we want, we’ll need it to sense the world.

Which sense has the biggest impact on our everyday lives? I believe it’s vision. A camera gives the Raspberry Pi vision, and with multiple cameras, you get even more features to explore.

Here are 6 exciting applications enabled by Arducam multi-camera adapter for the Raspberry Pi.

1. A Low-Cost Multi-Camera System With Multi-Spectral Illumination

ArduCAM offers camera multiplexer boards that facilitate the operation of up to 4 cameras with a single…

In nature, schools of fish exhibit complex, synchronized behaviors without following a leader. Robotics is still far behind.

The natural world abounds with self-organizing collectives, where large numbers of relatively simple agents use local interactions to produce impressive global behaviors.

Well-known examples include social insect colonies, bird flocks, and fish schools. Mathematicians and engineers have strived to understand the mapping from local interactions onto global behaviors and vice versa in a quest to understand natural collective intelligence.

Implicit coordination for 3D underwater collective behaviors in a fish-inspired robot swarm from Harvard

Now, Harvard researchers have developed fish-inspired robots that can synchronize their movements like a school of fish. The underwater robots, dubbed Blueswarm, have no external control.

Each underwater robot, called a Bluebot, is equipped with two cameras and three LED…

This article explains everything you need to know about the Raspberry Pi camera autofocus.

It applies to all Raspberry Pi Models (Pi 3, Pi 4, Pi Zero, Compute Module, and all others), and all Raspberry Pi camera modules (V1.3, V2.1, HQ).

So let’s dive right in and clarify this issue once and for all.

Prerequisites for autofocus

Here is the definition of autofocus from Wikipedia:

An autofocus (or AF) optical system uses a sensor, a control system, and a motor to focus on an automatically or manually selected point or area.

From this definition alone, we get all the answers we need. …


Camera solutions expert for embedded systems like Raspberry Pi, NVIDIA Jetson, and Arduino — from lens to sensor to driver to PCB and industrial design.
