I am a grad student, and I am particularly interested in the idea relating to the Android-based remote display. My research currently involves developing a distributed gossip framework for the BeagleBone Black with the XBee radio. This framework is eventually intended to perform ambient noise tomography for seismic imaging. I already have a working prototype of the gossip framework, both in simulation and on the BeagleBone, and I am working to enhance it. I want to develop a way to control, automate, and visualize the gossip taking place using an external device such as an Android tablet, hence my interest in this particular idea. I genuinely believe this idea has the potential to improve the data aggregation experience in the field with seismic sensors, or on a testbed for evaluating seismic algorithms. I have some ideas I would like to develop and examine for feasibility, and I would appreciate your help. Kindly advise.
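To give a sense of what I mean by gossip, here is a minimal sketch of pairwise randomized averaging, the classic gossip primitive: each round, a random pair of nodes exchanges values and averages them, so all nodes converge to the network mean. This is an illustrative toy, not code from my actual framework; all names are hypothetical.

```python
import random

def gossip_average(values, rounds=200, seed=42):
    """Pairwise randomized gossip: every node converges to the mean.

    `values` maps node id -> local measurement (e.g. a seismic
    sensor reading). Illustrative sketch only.
    """
    rng = random.Random(seed)
    state = dict(values)
    nodes = list(state)
    for _ in range(rounds):
        a, b = rng.sample(nodes, 2)       # pick a random pair of nodes
        avg = (state[a] + state[b]) / 2.0 # they exchange and average
        state[a] = state[b] = avg
    return state

readings = {"node0": 1.0, "node1": 3.0, "node2": 5.0, "node3": 7.0}
final = gossip_average(readings)
# every node's value ends up near the true mean of the readings, 4.0
```

A tablet-based visualizer would essentially watch `state` evolve round by round, which is exactly the kind of live view I have in mind.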
The project was already taken up last GSoC, and is more or less complete.
That’s right. Take a look at these repositories. If you are interested, you are welcome to contribute.
I was caught up with academic and lab work after last year's GSoC, but I'm planning to make some code changes and make it easier to set up in the coming summer (outside of GSoC scope).
Is it all integrated into mainline and/or the Debian release images? Is there any hope of making more phones/tablets support it?
It’s not integrated into the mainline. There are still some stability issues. I plan to work on them during the coming summer.
As for supporting more devices, the issue is that the USB descriptors for each device have to be added manually. Currently it supports about four devices that I have access to. Once the code is improved, I will try to collect the device descriptors for popular Android devices and also figure out a better way to add support for newer devices.
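One lightweight shape such a manual list could take is a lookup table keyed on the USB vendor/product ID pair, so adding a device is just one new entry. This is a hypothetical sketch, not the project's actual code; the IDs and names below are examples only.

```python
# Hypothetical table mapping USB (vendor_id, product_id) -> device name.
# The entries are illustrative examples, not the project's real list.
KNOWN_DEVICES = {
    (0x18D1, 0x4EE2): "Nexus 7 (example entry)",
    (0x04E8, 0x6860): "Samsung Galaxy (example entry)",
}

def identify_device(vendor_id, product_id):
    """Return a human-readable name for a known device, else None."""
    return KNOWN_DEVICES.get((vendor_id, product_id))

print(identify_device(0x18D1, 0x4EE2))  # "Nexus 7 (example entry)"
print(identify_device(0xFFFF, 0x0001))  # None: descriptor not yet collected
```

Collecting descriptors from the community could then amount to submitting new table entries rather than code changes.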