Groovebox

Violet Cowell, Frank Santen, Patrick Tirch

Musical Robotics B25

The goal of Groovebox was to create a small box that you could hand-drum on; it would run your hand-drumming sequence through a machine learning model and play a response back to you. To accomplish this, piezoelectric sensors detect pressure when the user taps the top of the box. The connected Raspberry Pi processes the input and drives a surface transducer to play audio, as well as controlling feedback systems such as LEDs and vibration motors. The three corners of the box map to different parts of a drum kit, letting the user express themselves more fully. The device would also handle changing audio live during performance, and sustained sliding presses across the top would modulate the volume. We planned to fabricate a Pi HAT to streamline the electronics and to train our own model to accept three drum inputs and output a complex response rhythm.
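To make the corner mapping concrete, the idea is simply that each piezo corner triggers one drum voice, with velocity scaled from how hard it was hit. Below is a minimal sketch of that mapping in Python using mido; the specific General MIDI note numbers, the percussion channel, and the velocity scaling are illustrative assumptions, not values taken from the repo.

```python
import mido

# Hypothetical corner-to-drum mapping (General MIDI percussion notes).
# The actual note choices in the Groovebox code may differ.
CORNER_TO_DRUM = {
    0: 36,  # corner 0 -> bass drum
    1: 38,  # corner 1 -> snare
    2: 42,  # corner 2 -> closed hi-hat
}

def hit_to_message(corner, strength):
    """Turn a detected tap on one corner into a MIDI note-on.

    `strength` is the normalized peak sensor reading (0.0-1.0), scaled to
    velocity 1-127. Channel 10 (index 9) is the standard percussion channel.
    """
    velocity = max(1, min(127, int(strength * 127)))
    return mido.Message('note_on', channel=9, note=CORNER_TO_DRUM[corner],
                        velocity=velocity)
```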

That was the goal. Now, let’s talk about what we actually made.

As you can see in the image, the Groovebox has escaped the confines of its box. We had trouble fabricating a proper Pi HAT, so the electronics are wired together in a slapdash way and take up more space than the box allows, but they function. Most of the feedback systems, such as the vibration motors and LEDs, were cut for time. Additionally, we never got around to making the program start when the Pi boots, so it needs to be plugged into a monitor, keyboard, and mouse to launch the requisite programs.

The onset detection works, for the most part. Some of the more advanced features, such as volume sliding, were left out due to time constraints, but it reliably detects user taps and assembles them into a MIDI file ready to be passed to the model. It also provides audio for the surface transducer to play, but the timing is iffy because Python is slow.
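For a sense of what the capture step looks like, here is a rough sketch of the approach assuming a polled ADC: threshold each corner's reading, debounce re-triggers, and append timed note events to a mido track. The threshold, debounce window, tempo handling, and the `read_corner` placeholder are all assumptions for illustration; the real code is in the GitHub repo linked below.

```python
import time
import mido

CORNER_TO_DRUM = {0: 36, 1: 38, 2: 42}  # same assumed mapping as the sketch above
THRESHOLD = 0.2      # placeholder: normalized level that counts as a hit
DEBOUNCE_S = 0.05    # placeholder: ignore re-triggers within 50 ms
TICKS_PER_BEAT = 480

def read_corner(corner):
    """Placeholder for reading one piezo channel (0.0-1.0) from the ADC board."""
    raise NotImplementedError

def capture(duration_s, bpm=120):
    """Record taps for `duration_s` seconds and return them as a MIDI file."""
    mid = mido.MidiFile(ticks_per_beat=TICKS_PER_BEAT)
    track = mido.MidiTrack()
    mid.tracks.append(track)

    sec_per_tick = 60.0 / (bpm * TICKS_PER_BEAT)
    start = time.monotonic()
    written_ticks = 0                                 # ticks already written to the track
    last_hit = {corner: 0.0 for corner in CORNER_TO_DRUM}

    while time.monotonic() - start < duration_s:
        now = time.monotonic()
        for corner, note in CORNER_TO_DRUM.items():
            level = read_corner(corner)
            if level > THRESHOLD and now - last_hit[corner] > DEBOUNCE_S:
                last_hit[corner] = now
                abs_ticks = int((now - start) / sec_per_tick)
                # Delta-time note-on, followed by an immediate note-off
                # (zero-length notes are fine for a percussion sketch).
                track.append(mido.Message('note_on', channel=9, note=note,
                                          velocity=min(127, int(level * 127)),
                                          time=abs_ticks - written_ticks))
                track.append(mido.Message('note_off', channel=9, note=note,
                                          velocity=0, time=0))
                written_ticks = abs_ticks
        time.sleep(0.001)  # simple polling loop; real code may use interrupts/callbacks
    return mid
```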

Regarding the model, it seems to work! It takes in simple MIDI input and responds with complex MIDI output. However, we never properly connected it to the input-handling program, so we can't say for sure whether it would sound good in an end-to-end test.
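Had we finished the glue code, the end-to-end cycle would roughly be: save the captured MIDI, hand it to the model, then hand the model's reply to the playback side. A sketch of that pipeline is below; `generate_response` and `play_midi` are stand-ins for the model interface and the surface-transducer playback code (we're assuming a simple file-in/file-out call, which may not match how the repo is actually structured).

```python
import mido

def generate_response(input_path):
    """Placeholder for the trained model: read the user's simple rhythm and
    return the path of a more complex reply. The real interface is in the repo."""
    raise NotImplementedError

def play_midi(midi_file):
    """Placeholder for driving the surface transducer from a MIDI file."""
    raise NotImplementedError

def groovebox_cycle(capture_seconds=8):
    """One full interaction: record taps, ask the model for a reply, play it back."""
    recorded = capture(capture_seconds)        # capture() from the sketch above
    recorded.save('/tmp/user_input.mid')
    reply_path = generate_response('/tmp/user_input.mid')
    play_midi(mido.MidiFile(reply_path))
```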

Regardless, most of the important pieces of the puzzle were created. The biggest issue was that we didn't leave ourselves enough time to connect them all together, so unfortunately there isn't an impressive video of the full device working. In fact, the most recent iteration, which included onset detection, the surface transducer, and tempo changes, was never recorded while it was working. The very next day, the main power wire frayed off and now nothing turns on, which is proof that the PCB would have been useful.

Materials:

Final paper in NIME format:

Video of early onset detection functioning:

Video of LEDs functioning in isolation:

Video of the vibration motor functioning in isolation:

Link to the GitHub repository with all the code and the model (warning: it's quite messy): https://github.com/WheatleyTheCore/Groovebox

Final 3D design of the box:

Plan for the Pi HAT PCB to manufacture:


Bill of Materials:

Relevant links for BOM: Power supply, USB-C breakout board, Amplifier, Motor drivers, Haptic motors, ADC board, Surface transducers, Raspberry Pi 3 B+, SD card
