2026 ideas needed

Can anyone help out here with some project ideas? I’m struggling to find the time.

Greybus has a lot of unimplemented protocols (see the Greybus project on GitHub).
Maybe the camera and audio protocols? That would make a nice IoT project.


Interfacing a 7-inch DSI LCD (H) display with the BeagleY-AI board using a MIPI DSI cable

If some of us knew the direction BeagleBoard is going, it would help.


I’d like to propose a GSoC project to implement an IT66121 HDMI bridge driver and a TI McASP audio driver for U-Boot on BeaglePlay.
Current gap: BeaglePlay boots with a blank HDMI screen and no audio feedback.
Deliverables:

  1. IT66121 driver for HDMI display + audio
  2. McASP driver for I2S audio output
  3. Boot splash framework with custom logos
  4. Audio feedback system (beeps, boot melodies, error codes)

I would like to see some stuff on using golang with the BBB

I would like to propose a GSoC project to continue work on BeagleMind, this time focusing entirely on fine-tuning a model on Q&A pairs from the docs, forum, and Discord channels (using SFT), then quantizing it to give users local, offline access to the assistant on their own machines.


PocketBeagle2 PWM output controlled from Zephyr.

On a BeagleBone Black (Trixie, 5.10): how can I use EHRPWM1 & EHRPWM2 from PRU0, with messaging over RPMsg? I had this working years ago and am trying to get it updated.

Perhaps related to @kevinacahalan:
I think an interesting project would be to use PWM + DMA to drive Neopixels/smart LEDs instead of having to dedicate a full PRU to it as we currently do. One may even be able to do it with GPIO + DMA instead of using the PWM peripheral as long as you can find a suitable timer. For instance, with a 3.2MHz timer, you could write 1 1 1 0 (937 ns high pulse) for a “logic high” or 1 0 0 0 (312 ns high pulse) for a “logic low”.
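The 4-ticks-per-bit encoding described above can be sketched in Python. This is purely an illustration of the buffer a DMA channel would stream out at 3.2 MHz; the 1110/1000 patterns come from the post, while the GRB byte order and MSB-first bit order are WS2812 assumptions, not something specific to the PB2 hardware:

```python
# Sketch: expand WS2812-style data bits into a 3.2 MHz DMA buffer,
# 4 timer ticks (312.5 ns each) per LED bit:
#   logic 1 -> 1 1 1 0  (~937 ns high pulse)
#   logic 0 -> 1 0 0 0  (~312 ns high pulse)

def encode_ws2812(data: bytes) -> list[int]:
    """Return one 0/1 output level per 312.5 ns timer tick, MSB first."""
    ticks = []
    for byte in data:
        for bit in range(7, -1, -1):
            if (byte >> bit) & 1:
                ticks += [1, 1, 1, 0]   # long high pulse
            else:
                ticks += [1, 0, 0, 0]   # short high pulse
    return ticks

# Example: one GRB pixel, full green (0xFF, 0x00, 0x00)
buf = encode_ws2812(bytes([0xFF, 0x00, 0x00]))
assert len(buf) == 3 * 8 * 4            # 4 ticks per bit
assert buf[:4] == [1, 1, 1, 0]          # MSB of 0xFF is a logic 1
assert buf[-4:] == [1, 0, 0, 0]         # LSB of 0x00 is a logic 0
```

A real implementation would pack these levels into whatever word format the PWM/GPIO + DMA path consumes, but the expansion step itself is this simple.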

The scope could be widened slightly to also provide PWM overlays for all available peripherals on PB2.

I would also like to revive this idea from last year that didn’t get to completion.

An added goal for this year’s GSoC would be to make a driver/device tree that allows using all of the following at the same time (as found on the Bela Gem Multi cape for the PB2):

  • 1x TLV320AIC3106 2-channel codec
  • 2x TLV320ADC3140 4-channel ADC
  • 1x ES9080Q 8-channel DAC

For full 10-in, 10-out multichannel audio capabilities.
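As a very rough sketch of the device-tree side, the three converters might hang off a single sound card node. Every label here (`&mcasp0`, `&aic3106`, `&adc3140_a`, `&adc3140_b`, `&es9080q`) is a placeholder, not a real PB2 node name, and whether one `simple-audio-card` link can fan out to multiple codecs like this needs to be checked against the current kernel bindings:

```dts
/* Hypothetical overlay fragment for the Bela Gem Multi cape.
 * Assumes multi-codec dai-link support in simple-audio-card;
 * all phandles below are made up for illustration. */
sound {
    compatible = "simple-audio-card";
    simple-audio-card,name = "bela-gem-multi";
    simple-audio-card,format = "i2s";

    simple-audio-card,cpu {
        sound-dai = <&mcasp0>;
    };
    simple-audio-card,codec {
        /* codec + 2x ADC + DAC sharing one I2S/TDM link */
        sound-dai = <&aic3106>, <&adc3140_a>, <&adc3140_b>, <&es9080q>;
    };
};
```

If a single link turns out not to work, the fallback would be separate dai-links per converter, synchronized via a shared bit/frame clock.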

The work done so far, from last year, can be found here: GitHub - jaydon2020/ALSA-Linux-Driver-for-BELA

LMR-H²: FPGA-Accelerated Language-Driven Robotics for BeagleBoard Ecosystem.

Problem Statement: Current language-enabled robots depend on cloud APIs or expensive edge GPUs, limiting accessibility for education, research, and offline deployment.

Solution: This project can be started by developing an FPGA LLM Inference Engine to demonstrate that lightweight LLMs can run efficiently on BeagleV-Fire. We can then create reusable IP cores and APIs that benefit the entire BeagleBoard community.

Tech Stack:

  • Libero SoC for FPGA design
  • Quantisation: ONNX Runtime/llama.cpp INT4 format
  • Inference engine: Custom RTL + HLS hybrid
  • API: Python bindings via ctypes/pybind11
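To make the quantisation step above concrete, here is a minimal pure-Python sketch of symmetric INT4 quantisation with a single per-tensor scale. Real INT4 formats in ONNX Runtime and llama.cpp are usually per-group with extra metadata, so treat this only as an illustration of the principle:

```python
# Symmetric INT4 quantisation sketch: map floats to [-8, 7]
# with one shared scale, then reconstruct approximate values.

def quantize_int4(weights: list[float]) -> tuple[list[int], float]:
    """Quantise to signed 4-bit integers with a per-tensor scale."""
    scale = max(abs(w) for w in weights) / 7.0
    q = [max(-8, min(7, round(w / scale))) for w in weights]
    return q, scale

def dequantize_int4(q: list[int], scale: float) -> list[float]:
    return [v * scale for v in q]

w = [0.12, -0.5, 0.33, 0.07]
q, s = quantize_int4(w)
w_hat = dequantize_int4(q, s)
err = max(abs(a - b) for a, b in zip(w, w_hat))
assert all(-8 <= v <= 7 for v in q)
assert err <= s / 2 + 1e-9   # rounding error bounded by half a step
```

The FPGA inference engine would then operate directly on the packed 4-bit values, dequantising (or accumulating in the quantised domain) inside the RTL/HLS datapath.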

An internal layer that makes Beagle boards HAT-compatible.

I noticed that while some newer Beagle boards expose a 40-pin header, there isn’t a full Raspberry Pi HAT compatibility layer (especially around EEPROM detection and automatic overlay loading).

I’m wondering whether it would make sense to work on something like:

  • Implementing Raspberry Pi HAT EEPROM parsing support

  • Auto-loading appropriate device tree overlays

  • Creating a compatibility framework for Pi HATs on Beagle boards

  • Possibly a CLI validation tool to detect pin conflicts

The hardware adapter part would just be an optional proof of concept; the main focus would be Linux/kernel and device tree tooling.
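For the EEPROM parsing bullet, a hedged sketch of reading the HAT ID EEPROM header and vendor-info atom is below. The field layout follows my reading of the raspberrypi/hats `eeprom-format` spec ("R-Pi" signature, 12-byte header, 8-byte atom headers, vendor-info atom type 0x0001); verify the offsets against the spec before relying on it, and note the example image uses a zeroed CRC, which a real tool must compute and check:

```python
# Hedged sketch: parse a Raspberry Pi HAT ID EEPROM image and pull
# out the vendor/product strings from the vendor-info atom (0x0001).
import struct

def parse_hat_eeprom(blob: bytes) -> dict:
    sig, ver, _res, numatoms, _eeplen = struct.unpack_from("<4sBBHI", blob, 0)
    if sig != b"R-Pi":
        raise ValueError("not a HAT EEPROM image")
    info, off = {"version": ver}, 12
    for _ in range(numatoms):
        atype, _count, dlen = struct.unpack_from("<HHI", blob, off)
        data = blob[off + 8 : off + 8 + dlen - 2]    # strip 2-byte CRC
        if atype == 0x0001:                           # vendor info atom
            # data: 16-byte uuid, u16 pid, u16 pver, u8 vslen, u8 pslen
            pid, pver, vslen, pslen = struct.unpack_from("<HHBB", data, 16)
            info["vendor"] = data[22 : 22 + vslen].decode()
            info["product"] = data[22 + vslen : 22 + vslen + pslen].decode()
        off += 8 + dlen
    return info

# Synthetic image: header + one vendor-info atom, CRC left as zeros.
vdata = bytes(16) + struct.pack("<HHBB", 0x0001, 0x0001, 4, 6) + b"AcmeBeagle"
atom = struct.pack("<HHI", 0x0001, 0, len(vdata) + 2) + vdata + b"\x00\x00"
img = struct.pack("<4sBBHI", b"R-Pi", 1, 0, 1, 12 + len(atom)) + atom
print(parse_hat_eeprom(img))  # {'version': 1, 'vendor': 'Acme', 'product': 'Beagle'}
```

The overlay-autoload step would then map the product string (or UUID) to a device tree overlay and apply it at boot, much like the Pi firmware does.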

Multimodal Troubleshooting Assistant (Update over BeagleMind Project)

I’m building a multimodal AI assistant for BeagleBoard platforms that combines vision and language capabilities to help users troubleshoot hardware. The system will analyze images of hardware setups alongside the official documentation to identify wiring mistakes, incorrect pin connections, and common debugging issues.

The goal is to create an intelligent assistant that makes hardware development more accessible, reduces debugging time, and enhances the overall developer experience within the BeagleBoard ecosystem.

Sounds interesting! Can you please share your insights in the BeagleMind thread? Wiring, diagrams, and data embedded in images generally aren’t easy to process. I’d like to hear how you plan to implement the idea.

Sure, I will share this there as well.

I’ll provide my insights, but I also suggest that mentees bring their own implementation ideas to the table.