Question: What is the best way to get real time data from the Raspberry to influence a p5.js drawing?
Context: I am running a coding + art project as a school teacher (with students aged 11 to 15). We have a couple of Raspberry Pis (RPi) with Sense HATs (boards with sensors for humidity, an accelerometer, a magnetometer, a gyroscope, etc., attached to the GPIO pins).
Students have done some Python, and some of the great Coding Train p5.js tutorials. My objective is for them to create a box that can make their art interactive. For example:
You blow on the raspberry pi, and the drawing becomes more red.
Now, the challenge is making that Sense HAT <-> browser connection. Here are some ideas I explored:
- Processing's Hardware I/O library (it means learning yet another language, Java, and a lot of work figuring out the pins; probably too much for the little time they have)
- Websockets (Sensors -> Python -> Websockets -> Server <- Browser <- JS <- p5.js) … I like the sound of that. The students would only need to touch the ends (Python + p5.js, if I can build the bridge), but I’m struggling to make most of the libraries work. Any recommendations?
- Spacebrew looks amazing … but the last commit was 2 years ago, and there doesn’t seem to be much Python support.
- Is there something obvious and easier that I am missing?
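The sensor end of the websocket idea above can be sketched in a few lines of Python. This is a minimal sketch assuming the official `sense_hat` library (present on the Pi); the mock branch, with made-up value ranges, is only there so the snippet also runs on a machine without the hardware:

```python
import json
import time

try:
    from sense_hat import SenseHat  # official library, preinstalled on Raspbian

    sense = SenseHat()

    def read_sensors():
        return {
            "humidity": sense.get_humidity(),
            "temperature": sense.get_temperature(),
            "accel": sense.get_accelerometer_raw(),  # {"x": ..., "y": ..., "z": ...}
        }
except ImportError:
    import random  # mock readings so the sketch also runs off the Pi

    def read_sensors():
        return {
            "humidity": random.uniform(30, 60),
            "temperature": random.uniform(18, 25),
            "accel": {"x": 0.0, "y": 0.0, "z": 1.0},
        }

def make_payload():
    """One JSON message per reading, ready to push over a websocket."""
    reading = read_sensors()
    reading["t"] = time.time()
    return json.dumps(reading)
```

JSON keeps the browser side trivial: p5.js can `JSON.parse` each message and map, say, humidity to a colour.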
Hello @alphydan - I am not an expert on p5.js, but I do a fair bit of work with the Pi. From my perspective, you enumerate all the major options very thoughtfully, and I don’t know of any other approach (besides, perhaps, doing it all in Python) that you haven’t already considered.
The biggest reason for the Sensors -> Python -> Websockets route is that the supporting code that talks to all these sensors already exists in Python, maintained by the Raspberry Pi folks. Most of the sensors would actually be rather easy to read using e.g. Processing and the Hardware I/O library, or something built on top of node.js. But the accelerometer and magnetometer in particular have some rather sophisticated software driving them, which would be hard to replicate elsewhere. And unfortunately the accelerometer seems to be the most fun sensor to play around with.
So my suggestion would be this route. Perhaps you can make it so that the Python code continuously transmits all the sensor readings over the websocket to p5.js; this way the students could focus on interaction, and not have to dig through two programming languages.
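The continuous-transmission idea could look like this on the Pi. A sketch assuming the third-party `websockets` package (`pip install websockets`); the readings here are stand-ins for real `sense_hat` calls, and the import is guarded so the file still loads where the package is absent:

```python
import asyncio
import json
import random
import time

try:
    import websockets  # pip install websockets (assumed; not in the stdlib)
except ImportError:
    websockets = None

def read_sensors():
    # Stand-in reading; on the Pi, replace with sense_hat calls
    return {"humidity": random.uniform(30, 60), "t": time.time()}

async def stream(websocket, path=None):
    # Push a fresh reading to each connected browser ~10 times a second
    while True:
        await websocket.send(json.dumps(read_sensors()))
        await asyncio.sleep(0.1)

async def main():
    # Browsers connect with: new WebSocket("ws://<pi-address>:8765")
    async with websockets.serve(stream, "0.0.0.0", 8765):
        await asyncio.Future()  # run forever

# On the Pi, start the bridge with:  asyncio.run(main())
```

On the p5.js side, the students would then only open a `new WebSocket(...)`, `JSON.parse` each message event, and use the values in `draw()`.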
If you wanted to use Java Processing on the Pi to interact with the Sense HAT, I’d be happy to jump in and implement example sketches that interact with all of its major sensors. But I very much share your thoughts on not exposing students to too many languages (and again, it probably wouldn’t be able to do the fun sensor fusion that Python does).
Thank you for the guidance. I managed to build a prototype for it.
I hope the students can show off their interactive art on Tuesday!
@alphydan I hope students will want to know what “Yak Shaving” means
… @gohai, that part is meant for the instructor. I provide the infrastructure and they focus on the creativity of sketch.js.