GSoC Project Idea: Logic Analyser with the PRU-ICSS on the BeagleBone Black

Hello Jason and fellow community members,

This is Kumar Abhishek, a second-year student at the Indian Institute of Technology, Kharagpur. I have been pursuing electronic circuit-building (with microcontrollers) and computer programming (in C, C++, C# and Java) as a hobby since I was 12 years old. I have experience working with 8051, AVR (bare metal) and STM32 microcontrollers (with an RTOS), as well as with a wide range of peripherals and LCDs. I have recently been exploring various aspects of the BeagleBone Black, and have been fascinated by its capabilities.

One of the ideas that came to me for a project was to use the PRU on the BeagleBone Black to build a full-fledged logic analyser with support for decoding various protocols. This would exploit the PRU R31 GPI Mode direct connection to capture PRU0 to PRU15 inputs. The maximum possible sample rate would be 100 MHz, and the aim of the project would be to get as close as possible to this rate and also provide, if possible, a web-based front-end to the user, accessible via RNDIS or Ethernet, from almost any PC, smartphone or tablet.

When I did an initial search, I found that there have already been projects exploiting the PRU for data capture, like this one or http://linux.thaj.net63.net/2014/01/logic-analyzer-with-your-beaglebone.htm. However, these were only basic proofs of concept. If welded together into a proper framework, these applications would turn the BeagleBone Black into an open and inexpensive instrumentation toolbox, indispensable for any budding hacker.

The project would be split into two parts:

  • The core / data-capturing back-end: This would be responsible for capturing GPIO data using the PRU, combined with user-mode / kernel-mode code for maximum throughput. Setting up triggers, etc. would be handled by this layer, and support for protocol-decoding plug-ins would also be added here.

  • The front-end:

  • The preferred front-end will be a web-based UI similar to a standard logic analyser interface, displaying captured waveforms and decoded data.

  • There are also two alternative ideas for the front end:

  • A standalone device with a display and GUI. However, the PRU pins are shared with the HDMI framer, the onboard eMMC, the external microSD card cage and UART0, so configuring all 16 PRU pins would render at least one of these peripherals inaccessible.

  • Transporting the data over USB for use by a pre-existing PC client, like the SUMP logic analyzer tool [simplest, but least preferred, as it would not fully utilize the Cortex-A8 processor for processing and the user interface, tasks it is certainly capable of handling].
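To give a feel for the SUMP option, here is a skeletal sketch (not code from any existing project) of how the back-end could answer a SUMP client over a serial/CDC-ACM link. The command values reflect my understanding of the original SUMP protocol (0x00 = reset, 0x01 = arm/run, 0x02 = ID query answered with "1ALS") and should be double-checked against the SUMP/OLS documentation; the helper functions are hypothetical placeholders.

```c
#include <stdint.h>
#include <unistd.h>

static void reset_capture_engine(void)
{
    /* stub: would reset the PRU and clear the capture buffers */
}

static void arm_capture_and_stream(int fd)
{
    /* stub: would arm the trigger, run a capture and stream samples to fd */
    (void)fd;
}

static void sump_handle_command(int fd, uint8_t cmd)
{
    switch (cmd) {
    case 0x00:                          /* Reset */
        reset_capture_engine();
        break;
    case 0x01:                          /* Arm trigger / run capture */
        arm_capture_and_stream(fd);
        break;
    case 0x02:                          /* Query device ID */
        write(fd, "1ALS", 4);
        break;
    default:
        /* Long (5-byte) commands such as divider and trigger setup
         * would be handled here; omitted in this sketch. */
        break;
    }
}
```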

I would like to hear from you on the above proposal. I am excited to see the possibilities with the BeagleBone and I look forward to working with the BeagleBoard community in the near future.

Best Regards
Kumar Abhishek
Undergraduate Student
Department of Electronics & Electrical Communication Engineering
Indian Institute of Technology, Kharagpur

Hello Jason and fellow community members,

This is Kumar Abhishek, a second-year student at the Indian Institute of Technology, Kharagpur. I have been pursuing electronic circuit-building (with microcontrollers) and computer programming (in C, C++, C# and Java) as a hobby since I was 12 years old. I have experience working with 8051, AVR (bare metal) and STM32 microcontrollers (with an RTOS), as well as with a wide range of peripherals and LCDs. I have recently been exploring various aspects of the BeagleBone Black, and have been fascinated by its capabilities.

One of the ideas that came to me for a project was to use the PRU on the BeagleBone Black to build a full-fledged logic analyser with support for decoding various protocols. This would exploit the PRU R31 GPI Mode direct connection to capture PRU0 to PRU15 inputs. The maximum possible sample rate would be 100 MHz, and the aim of the project would be to get as close as possible to this rate and also provide, if possible, a web-based front-end to the user, accessible via RNDIS or Ethernet, from almost any PC, smartphone or tablet.

Love it. OSCPRIME for Android is interesting if you were to add support for the ADCs and more of an oscilloscope mode. I agree a web-based front-end would be great, but I suggest starting with the back-end work and an existing front-end first.

When I did an initial search, I found that there have already been projects exploiting the PRU for data capture, like this one or http://linux.thaj.net63.net/2014/01/logic-analyzer-with-your-beaglebone.htm. However, these were only basic proofs of concept. If welded together into a proper framework, these applications would turn the BeagleBone Black into an open and inexpensive instrumentation toolbox, indispensable for any budding hacker.

Sweet!

The project would be split into two parts:

  • The core / data-capturing back-end: This would be responsible for capturing GPIO data using the PRU, combined with user-mode / kernel-mode code for maximum throughput. Setting up triggers, etc. would be handled by this layer, and support for protocol-decoding plug-ins would also be added here.

Can you break this task down a bit? The shared memory between Linux userspace and PRU seems to solve most of the communications issues. Have you already done a “hello world” PRU program? Do you have a BeagleBone Black already?

Have you looked at the parallel capture feature of the PRU? Can you break down some of the features of your proposed PRU firmware and divide up the major components between the PRU firmware, the A8 back-end library/daemon and the A8/remote front-end?

  • The front-end:

  • The preferred front-end will be a web-based UI similar to a standard logic analyser interface, displaying captured waveforms and decoded data.

  • There are also two alternative ideas for the front end:

  • A standalone device with a display and GUI. However, the PRU pins are shared with the HDMI framer, the onboard eMMC, the external microSD card cage and UART0, so configuring all 16 PRU pins would render at least one of these peripherals inaccessible.

  • Transporting the data over USB for use by a pre-existing PC client, like the SUMP logic analyzer tool [simplest, but least preferred, as it would not fully utilize the Cortex-A8 processor for processing and the user interface, tasks it is certainly capable of handling].

I believe starting with the SUMP client on a remote machine is the best place to start. Moving forward, I think working on an HTML5 canvas client would be good.

For limited channels, running SUMP on the BeagleBone would be good. Also, you can run SUMP via X11, so performing an ‘ssh -X’ would free those HDMI pins back up again.

I would like to hear from you on the above proposal. I am excited to see the possibilities with the BeagleBone and I look forward to working with the BeagleBoard community in the near future.

I like this idea, but I think you need to try to attract some mentors from existing projects (like SUMP) that have active developers and hit the IRC channels to try to excite some other possible mentors.

Hello Jason,

Love it. OSCPRIME for Android is interesting if you were to add support for the ADCs and more of an oscilloscope mode. I agree a web-based front-end would be great, but I suggest starting with the back-end work and an existing front-end first.

:slight_smile: I saw their project page and I liked their concept. I had indeed thought of adding an ADC and an oscilloscope mode too, but decided to start with just a logic analyser for simplicity. As I had written in my previous mail, my idea is to “turn the BeagleBone Black into an open and inexpensive instrumentation toolbox”. I wanted an FPGA-free solution so as to keep the hardware down to the bare minimum - just the BBB - and focus more on the software side. And if we can indeed sample at 100 MHz and utilize the full potential available, that would, I suppose, be more than enough for most basic purposes.

Can you break this task down a bit? The shared memory between Linux userspace and PRU seems to solve most of the communications issues. Have you already done a “hello world” PRU program? Do you have a BeagleBone Black already?

I just acquired a BeagleBone Black two weeks ago. I should be able to get a “Hello, PRU World” app running this weekend.
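For reference, and as a sanity check of my understanding, the ARM-side half of such a "Hello, PRU World" looks roughly like the sketch below. This assumes the am335x_pru_package userspace API (prussdrv) and a pasm-assembled binary named hello.bin that raises PRU0_ARM_INTERRUPT when it finishes; both the file name and the exact event plumbing are placeholders to be verified against the library headers.

```c
#include <stdio.h>
#include <prussdrv.h>
#include <pruss_intc_mapping.h>

#define PRU_NUM 0

int main(void)
{
    tpruss_intc_initdata pruss_intc_initdata = PRUSS_INTC_INITDATA;

    prussdrv_init();
    if (prussdrv_open(PRU_EVTOUT_0) != 0) {
        fprintf(stderr, "prussdrv_open failed (is the uio_pruss module loaded?)\n");
        return 1;
    }
    prussdrv_pruintc_init(&pruss_intc_initdata);

    /* Load and run the pasm-assembled firmware (placeholder file name) */
    prussdrv_exec_program(PRU_NUM, "./hello.bin");

    /* Block until the firmware raises PRU_EVTOUT_0, then clean up */
    prussdrv_pru_wait_event(PRU_EVTOUT_0);
    printf("PRU signalled completion\n");
    prussdrv_pru_clear_event(PRU_EVTOUT_0, PRU0_ARM_INTERRUPT);

    prussdrv_pru_disable(PRU_NUM);
    prussdrv_exit();
    return 0;
}
```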

Have you looked at the parallel capture feature of the PRU? Can you break down some of the features of your proposed PRU firmware and divide up the major components between the PRU firmware, the A8 back-end library/daemon and the A8/remote front-end?

The reference manual suggests that the parallel-capture feature is intended to capture PRU pins 0 to 15 on a clock edge (configurable positive/negative) provided externally through pin PRU16, the 17th PRU R31 input pin. Since that mode needs an external clock, direct connection mode would be utilized for reading the PRU GPIO pins.

The PRU firmware, in its most basic form, would be responsible for the actual data capture and for moving the samples into memory accessible from userspace. I have yet to read more about how this actually takes place, the various ways in which it can be done, and whether the EDMA can be used for the purpose.
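To make the capture step concrete, here is a minimal sketch of the innermost PRU-side loop, written in C for readability (assuming TI's PRU C compiler and its __R31 register variable; the actual firmware would more likely be hand-written PASM to get cycle-exact timing). It simply copies the lower 16 bits of R31 into the 12 KB PRU shared RAM, with no triggering, double-buffering or signalling to the ARM host; a naive loop like this also shows why hitting the theoretical 100 MHz is hard, since every iteration spends several PRU cycles on the store and branch.

```c
/* PRU-side sketch only: assumes TI's PRU C compiler (__R31 register
 * variable) and the PRU-local address of the 12 KB shared RAM. */
volatile register unsigned int __R31;   /* GPI inputs in direct connection mode */

#define SHARED_RAM   ((volatile unsigned short *)0x00010000)
#define NUM_SAMPLES  4096               /* 8 KB of 16-bit samples */

void main(void)
{
    volatile unsigned short *buf = SHARED_RAM;
    unsigned int i;

    for (i = 0; i < NUM_SAMPLES; i++)
        buf[i] = (unsigned short)(__R31 & 0xFFFF);   /* sample PRU0..PRU15 */

    /* In a real firmware: raise a system event towards the ARM host
     * (so the back-end can drain the buffer) and loop or halt here. */
    while (1)
        ;
}
```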

The A8 back-end daemon/library would ideally take care of setting up the triggers and ensuring a real-time flow of data from the PRU into main memory. For example, to attach to a SUMP client, the A8 back-end would be responsible for converting the captured byte-stream into USB packets; actually, the SUMP client just uses a standard serial port for receiving its data, so this should be comparatively simple. For a web-based client, the back-end could use Console.IO to stream packets, with some form of RLE applied to each bitstream. I would, however, like to design a combination of a kernel-mode driver and a user-mode daemon, and I look forward to comments and support on this architecture design from my to-be mentors.
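As an illustration of the RLE idea (a sketch only, all names hypothetical): the back-end could collapse runs of identical 16-bit samples before streaming them to a web client, which is very effective on mostly-idle logic signals.

```c
#include <stddef.h>
#include <stdint.h>

struct rle_entry {
    uint16_t value;   /* state of the 16 capture channels */
    uint32_t count;   /* number of consecutive samples with that value */
};

/* Collapse runs of identical samples; returns the number of entries
 * written to 'out' (caller must provide room for up to 'n' entries). */
size_t rle_encode(const uint16_t *samples, size_t n, struct rle_entry *out)
{
    size_t n_out = 0;

    for (size_t i = 0; i < n; i++) {
        if (n_out > 0 && out[n_out - 1].value == samples[i]) {
            out[n_out - 1].count++;
        } else {
            out[n_out].value = samples[i];
            out[n_out].count = 1;
            n_out++;
        }
    }
    return n_out;
}
```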

I do not know whether it would be better to implement protocol analysis as an additional layer after the back-end daemon or entirely in the front-end client. I have begun studying the SUMP client code; it should give me more insight into how this has been done there and how it could be done on the BBB.
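Wherever the decoding ends up living, a decoder pass over captured samples is conceptually simple. Purely as an illustration (channel bit assignments and function names are hypothetical, not taken from any existing tool), a naive SPI mode-0 decoder might look like this:

```c
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

#define BIT_CS    (1u << 0)   /* hypothetical channel assignments */
#define BIT_CLK   (1u << 1)
#define BIT_MOSI  (1u << 2)

/* Decode MOSI bytes from already-captured samples (SPI mode 0,
 * MSB first): shift in MOSI on each rising edge of CLK while CS is low. */
void decode_spi_mosi(const uint16_t *samples, size_t n)
{
    uint8_t shift = 0;
    int nbits = 0;

    for (size_t i = 1; i < n; i++) {
        if (samples[i] & BIT_CS) {        /* CS high: bus idle, reset state */
            shift = 0;
            nbits = 0;
            continue;
        }
        if (!(samples[i - 1] & BIT_CLK) && (samples[i] & BIT_CLK)) {
            shift = (uint8_t)((shift << 1) | ((samples[i] & BIT_MOSI) ? 1 : 0));
            if (++nbits == 8) {
                printf("MOSI byte: 0x%02X\n", shift);
                shift = 0;
                nbits = 0;
            }
        }
    }
}
```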

I believe starting with the SUMP client on a remote machine is the best place to start. Moving forward, I think working on an HTML5 canvas client would be good.

I agree, this would be the preferred way.

For limited channels, running SUMP on the BeagleBone would be good. Also, you can run SUMP via X11, so performing an ‘ssh -X’ would free those HDMI pins back up again.

As of now, I do not have a display or cape I could connect my BBB to, so I guess I will have to stick with cross-compilation and SSH for now.

I like this idea, but I think you need to try to attract some mentors from existing projects (like SUMP) that have active developers and hit the IRC channels to try to excite some other possible mentors.

I checked: the last release of the SUMP client was in 2007, quite a while back. It has since given way to the Open Logic Sniffer, whose last release was in Aug '13. I will contact the project maintainer for his opinion.

My IRC nick is Abhishek_ .

Looking forward to your reply

Best Regards
Kumar Abhishek
Undergraduate Student
Department of Electronics & Electrical Communication Engineering
Indian Institute of Technology, Kharagpur


For limited channels, running SUMP on the BeagleBone would be good. Also, you can run SUMP via X11, so performing an ‘ssh -X’ would free those HDMI pins back up again.

As of now, I do not have a display or cape I could connect my BBB to, so I guess I will have to stick with cross-compilation and SSH for now.

What does running headless have to do with cross-compilation? You can easily ssh into the board and use gcc/pasm directly on it. ‘pasm’ now comes included on the Debian images (Latest Software Images - BeagleBoard).

I like this idea, but I think you need to try to attract some mentors from existing projects (like SUMP) that have active developers and hit the IRC channels to try to excite some other possible mentors.

I checked: the last release of the SUMP client was in 2007, quite a while back. It has since given way to the Open Logic Sniffer, whose last release was in Aug '13. I will contact the project maintainer for his opinion.

Great. Hopefully they have some interest.

My IRC nick is Abhishek_ .

See you on IRC!


Another angle on the basic logic analyzer is as a bootstrapping instrument for people learning hardware on the Beagle board. Right now, people can visualize GPIO with an LED. But to take it further, they need to hook up chips and have blind faith that it will just work. What if, instead of building a generic logic analyzer as a project, we integrate it into a "learning" module? Something like -

- 4 inputs, to minimize the number of pins we are burning. 4 pins will let us look at an SPI signal (CS, MOSI, MISO, CLK).
- Ability to trigger off a pattern (a rough sketch of this follows below). This might knock the sample rate down a little bit.
- Interface is via the web. Perhaps something built alongside nodejs.
A user can then get some wires (a la the breadboards of days long past) and wire the logic analyzer pins to signals of interest elsewhere on the Beagle, say the I2C or SPI lines, run whatever software they are debugging, and watch their commands go out over the SPI or I2C lines. Since this is the same board, this sidesteps some of the EE issues - voltages, grounding.
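A rough sketch of the "trigger off a pattern" item above, in plain C with hypothetical names: the simplest software trigger just scans the captured samples for the first one whose masked channels match a pattern. Doing comparable per-sample work inside the PRU capture loop is exactly what would knock the sample rate down.

```c
#include <stddef.h>
#include <stdint.h>

/* Return the index of the first sample whose masked channels equal
 * 'pattern', or -1 if nothing matches. With 4 channels, mask and
 * pattern use only the low 4 bits. */
long find_trigger(const uint8_t *samples, size_t n,
                  uint8_t mask, uint8_t pattern)
{
    for (size_t i = 0; i < n; i++) {
        if ((samples[i] & mask) == pattern)
            return (long)i;
    }
    return -1;
}
```

For example, find_trigger(buf, len, 0x1, 0x0) would locate the first sample where a chip-select wired to channel 0 goes low.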

This doesn't preclude the same firmware implementing a standalone logic analyzer. It just provides some basic learning tools - things to do after you've blinked the LED.

Just another angle...


Focusing on this sort of learning tool seems *extremely* helpful. It means several examples for exploring interface controls can be worked through very quickly.

It makes me start to imagine adding a pattern generator or other logic to use to test your peripheral, but I don't want to get too far ahead of ourselves.

Anyway, adding more signals later should be relatively easy. Getting some solid examples that allow people to visualize what they are doing with SPI, I2C, UART, PWM, etc. seems incredible to me and great for building test-benches/visualization-tools for some of these high-level abstractions like libsoc, BoneScript, PyBBIO, etc.

Thanks for the input, Hunyue. Do you have any suggestions on questions the student should answer in any proposal made?


Focusing on this sort of learning tool seems extremely helpful. It means several examples for exploring interface controls can be worked through very quickly.

It makes me start to imagine adding a pattern generator or other logic to use to test your peripheral, but I don't want to get too far ahead of ourselves.

That can be the other PRU. It would be a companion project for another student. My only request/preference is to align the UI so there is a measure of sanity. Having said that, a signal generator has a hardware risk element to it - we are driving pins.

Anyway, adding more signals later should be relatively easy. Getting some solid examples that allow people to visualize what they are doing with SPI, I2C, UART, PWM, etc. seems incredible to me and great for building test-benches/visualization-tools for some of these high-level abstractions like libsoc, BoneScript, PyBBIO, etc.

Thanks for the input, Hunyue. Do you have any suggestions on questions the student should answer in any proposal made?

Nothing specific. Students are newer/closer to the learning process than either of us. IMO, I would like to hear inputs and comments from the students. An important element is demonstrating some understanding of what they are programming, as the summer isn't long enough to start from absolute zero and learn what a logic analyzer is.

Some kind of sanity-test code to go along with it is highly desirable. Exactly what it tests, and how well, is a useful thing to mention in the proposal.

Dear All,

The idea has been formed into a proposal and posted to the Melange portal.

Attached herein is my statically linked “Hello World” binary; I have verified that it executes on the actual hardware [BeagleBone Black].

Looking forward to your comments.

Best Regards
Kumar Abhishek
Undergraduate Student
Department of Electronics & Electrical Communication Engineering
Indian Institute of Technology, Kharagpur

helloworld.bin (454 KB)

Hi,

Here is the PDF copy of the proposal.

https://drive.google.com/file/d/0B7U2bJEjkNeZQWc0Njg2T0NuTXM/edit?usp=sharing

Request your comments.

Regards

Hello Jason,

Request you to provide feedback & inputs on answer 3 / 3 of:

If your project is successfully completed, what will its impact be on the BeagleBoard.org community? Consider who will use it and how it will save them effort. Give 3 answers, each 1-3 paragraphs in length. The first one should be yours. The other two should be answers received from feedback of members of the BeagleBoard.org community, at least one of whom should be a BeagleBoard.org GSoC mentor. Provide email contact information for non-GSoC mentors.

My Answer (1/3):

The goal of the project (as a whole, beyond the GSoC timeline) is to become a tool bundled in the default system image on every BeagleBone Black. Implementation of this tool will add significant value to the BeagleBone Black, which, apart from being used in projects, will then also be usable as a debugging tool for intermediate users and a learning tool for beginners. Users will be able to debug their circuits right away after installing our software. Successful completion of this project will also attract more users to the BeagleBone Black.

Beginners may use our project to visualise actual hardware communication between the BBB and the peripherals they are using by connecting their SPI, I2C or other ports to the PRU inputs of the BeagleBone Black.

Best Regards
~Abhishek

Hello Jason,

Request you to provide feedback & inputs on answer 3 / 3 of:

If your project is successfully completed, what will its impact be on the BeagleBoard.org community? Consider who will use it and how it will save them effort. Give 3 answers, each 1-3 paragraphs in length. The first one should be yours. The other two should be answers received from feedback of members of the BeagleBoard.org community, at least one of whom should be a BeagleBoard.org GSoC mentor. Provide email contact information for non-GSoC mentors.

My Answer (1/3):

The goal of the project (as a whole, beyond the GSoC timeline) is to become a tool bundled in the default system image on every BeagleBone Black. Implementation of this tool will add significant value to the BeagleBone Black, which, apart from being used in projects, will then also be usable as a debugging tool for intermediate users and a learning tool for beginners. Users will be able to debug their circuits right away after installing our software. Successful completion of this project will also attract more users to the BeagleBone Black.

Beginners may use our project to visualise actual hardware communication between the BBB and the peripherals they are using by connecting their SPI, I2C or other ports to the PRU inputs of the BeagleBone Black.

Can you give a few more answers in-line to the queries elsewhere on this thread? In-line answers help me follow the overall conversation without having to re-read everything over again.

As discussed yesterday, here is a link to the Balsamiq mockup of the web interface I just built.

Request feedback.

https://theembeddedkitchen.mybalsamiq.com/mockups/1663073.png?key=c0dc89889f2abc5817f880fb8d180913d2da0033

Best Regards

As discussed yesterday, here is a link to the Balsamiq mockup of the web interface I just built.

Request feedback.

https://theembeddedkitchen.mybalsamiq.com/mockups/1663073.png?key=c0dc89889f2abc5817f880fb8d180913d2da0033

Looks great to me. Not sure why I’m the only one replying! I’m going to paste it here so some don’t need to follow the link.

I’m thinking the signal names don’t match the pin names. Also, do you know what will be providing the clock?

This seems super-cool to me and helps me to visualize what will be done.

If completed, such a system would enable BeagleBone developers to learn all about the signals they are generating, without needing to pay for any extra equipment. Visualization is an absolutely critical tool for building an intuitive understanding and this project has a tremendous potential for enabling both new and experienced users to visualize signals in an easier way than ever.

Now, convince us you can implement the PRU side and libsigrok side of things!

As discussed yesterday, here is a link to the Balsamiq mockup of the web interface I just built.

Request feedback.

https://theembeddedkitchen.mybalsamiq.com/mockups/1663073.png?key=c0dc89889f2abc5817f880fb8d180913d2da0033

Looks great to me. Not sure why I’m the only one replying! I’m going to paste it here so some don’t need to follow the link.

I’m thinking the signal names don’t match the pin names. Also, do you know what will be providing the clock?

The “configure names” option allows the user to alias the LA input pins (Pin 0 to Pin 7, or Pin 0 to Pin 15 if it can be achieved in the GSoC timeframe) and assign their own names to them. The user simply taps the hardware signals of interest onto the inputs and changes the names to map them to those signals. For example, the user taps the SCL and SDA lines onto Pin 4 and Pin 5 and then configures their names as SCL and SDA, so they appear as SCL and SDA in the waveform.
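For illustration only (everything here is hypothetical, not part of any existing code), the aliasing could be as simple as a pin-to-label table that the web UI edits and the waveform view reads:

```c
struct channel_alias {
    int  pin;        /* physical LA input, 0..7 (or 0..15 later) */
    char name[16];   /* user-assigned label shown next to the waveform */
};

/* Default labels; the "configure names" dialog would overwrite these,
 * e.g. mapping Pin 4 -> "SCL" and Pin 5 -> "SDA". */
static struct channel_alias channels[8] = {
    { 0, "Pin 0" }, { 1, "Pin 1" }, { 2, "Pin 2" }, { 3, "Pin 3" },
    { 4, "Pin 4" }, { 5, "Pin 5" }, { 6, "Pin 6" }, { 7, "Pin 7" },
};
```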

My example was meant to visualise a transmission sending 2 bits at a time. The clock, in this example, is provided by the user’s hardware. I have shown 4 signals tapped using the PRU inputs.

The PRU itself, of course, runs off its 200 MHz internal clock.

This seems super-cool to me and helps me to visualize what will be done.

If completed, such a system would enable BeagleBone developers to learn all about the signals they are generating, without needing to pay for any extra equipment. Visualization is an absolutely critical tool for building an intuitive understanding and this project has a tremendous potential for enabling both new and experienced users to visualize signals in an easier way than ever.

Now, convince us you can implement the PRU side and libsigrok side of things!

Let my proposal speak for me. With the support of the mentors from the BeagleBoard.org community, together we can make it happen.

Best Regards.

As discussed yesterday, here is a link to the Balsamiq mockup of the web interface I just built.

Request feedback.

https://theembeddedkitchen.mybalsamiq.com/mockups/1663073.png?key=c0dc89889f2abc5817f880fb8d180913d2da0033

Looks great to me. Not sure why I’m the only one replying! I’m going to paste it here so some don’t need to follow the link.

I’m thinking the signal names don’t match the pin names. Also, do you know what will be providing the clock?

The “configure names” option allows the user to alias the LA input pins (Pin 0 to Pin 7, or Pin 0 to Pin 15 if it can be achieved in the GSoC timeframe) and assign their own names to them. The user simply taps the hardware signals of interest onto the inputs and changes the names to map them to those signals. For example, the user taps the SCL and SDA lines onto Pin 4 and Pin 5 and then configures their names as SCL and SDA, so they appear as SCL and SDA in the waveform.

Just UI nitpicking here, but it would be nice to have the name in the pin list as well, e.g. “Pin 4 - SCL”. Also it would be good to have a unique marker to show the trigger point. But yeah, this looks great!