Keep on Keepin’ on

My redefined thesis: Sound balls is an interactive and collaborative soundscape controlled by a set of smart balls.

Since my last update, a lot of my work has gone toward building the physical ball. I coated one in resin, sanded it, and painted it, and am fairly pleased with the look and texture. The resin I am using provides a hard, paintable outer layer for the ball and does not block the signal at all. I had originally planned on sanding the ball smooth, but as I went, I realized I liked the wavy grain that resulted, so I left it a little rough. The only thing that needs fixing is the hideous seam that runs along the ball. I will either be covering it with tape and painting over it or possibly filling it with foam.

Each ball has a power switch and a JST port that allows it to be charged. One of the advantages of the JST connector is that it is keyed, which prevents power from being connected to ground and vice versa; as I have found through experience, reversing them is a great way to fry a charger. The setup of the switch also prevents the Arduino from drawing power while the ball is charging.

Originally I thought I would be sealing the balls permanently, but I realized that was not realistic. If a single wire fell out or anything else went wrong, I needed a way to get into the ball. I played around with a few methods of holding the ball together that could be undone if needed, and settled on screws. I tested with some hard resin and found that a screw could be driven through it without the resin cracking, and it provided a strong hold. So in addition to the resin on the outside, each ball has two little screw holes reinforced with resin.

The custom-made Arduino board is working really well, and is actually quite a bit smaller than even the Ardweenies. Here’s a Fritzing diagram of the final setup of a gyro ball. The accelerometer balls are pretty much identical, with the gyro swapped out for an accelerometer.

The new boards came together just in time, as one of my Ardweenies started spitting out very questionable data. Even when the sensor was still, its readings would fluctuate across a wide enough range to cause problems.

One good thing came out of this, though: I developed auto-calibration for the gyroscope balls. Previously, I had to press a key while the balls were still to calibrate them; now the system waits for the same reading to come up a few times in a row, assumes the ball is at rest, and sets that as the rest value. As the battery dies, the readings change slightly, so this turned out to be a really good way of keeping my readings accurate.
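The rest-detection idea above can be sketched in a few lines of C++. This is my own minimal reconstruction, not the project's actual code, and the class and parameter names (`GyroCalibrator`, `tolerance`, `requiredRepeats`) are assumptions: once a few consecutive readings land within a small tolerance of each other, the ball is assumed to be at rest and that reading becomes the new zero point.

```cpp
#include <cassert>
#include <cmath>

// Hypothetical sketch of gyro auto-calibration: when several consecutive
// readings are nearly identical, assume the ball is at rest and adopt that
// reading as the rest value. This tracks drift as the battery drains.
class GyroCalibrator {
public:
    GyroCalibrator(float tolerance, int requiredRepeats)
        : tolerance(tolerance), requiredRepeats(requiredRepeats) {}

    // Feed each raw reading; returns the reading relative to the rest value.
    float update(float raw) {
        if (std::fabs(raw - lastRaw) <= tolerance) {
            ++repeats;
            if (repeats >= requiredRepeats) {
                restValue = raw;   // ball looks still: recalibrate here
                repeats = 0;
            }
        } else {
            repeats = 0;           // movement resets the streak
        }
        lastRaw = raw;
        return raw - restValue;
    }

    float rest() const { return restValue; }

private:
    float tolerance;       // how close two readings must be to count as "same"
    int requiredRepeats;   // how many near-identical readings before recalibrating
    float lastRaw = 0.0f;
    int repeats = 0;
    float restValue = 0.0f;
};
```

Because the streak resets on any large jump, normal motion never triggers a recalibration; only genuinely still periods do.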

I also updated the accelerometer balls to auto-calibrate their max and min readings, so that I can get the full range I want out of them even if the raw values come in a little below that range.
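A min/max auto-calibration like this boils down to widening the observed range as readings arrive and then normalizing against it. The sketch below is my own guess at the shape of that logic, with invented names (`RangeCalibrator`, `observe`, `normalized`), not the project's code:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Hypothetical sketch of accelerometer range auto-calibration: track the
// smallest and largest readings seen so far, then map any reading onto a
// full 0..1 output range, even if the raw values span a narrower band.
class RangeCalibrator {
public:
    void observe(float raw) {
        minSeen = std::min(minSeen, raw);
        maxSeen = std::max(maxSeen, raw);
    }

    // Normalize a reading into [0, 1] using the range seen so far.
    float normalized(float raw) const {
        float span = maxSeen - minSeen;
        if (span <= 0.0f) return 0.0f;   // not enough data yet
        float t = (raw - minSeen) / span;
        return std::max(0.0f, std::min(1.0f, t));
    }

private:
    float minSeen =  1e9f;   // sentinel: any real reading will shrink this
    float maxSeen = -1e9f;   // sentinel: any real reading will grow this
};
```

The clamp at the end keeps the output sane if a later reading exceeds the range seen during calibration.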

In terms of the software running on these things, I am still working on getting good collision detection. For the gyros, I started using the distance between two consecutive readings instead of calculating a rough speed based on their distance from the midpoint. This seems to work a little better.

More importantly, I set up an XML read/write system so that I can record an action and play it back. This lets me test the recognition of collisions and other gestures without needing to actually move the balls around, which makes things easier for me and gives me more consistent test data.
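The record/replay idea can be sketched as a round trip: serialize timestamped readings to XML, then parse them back for playback. The element names and schema below are my own invention (the post doesn't describe the actual format), and a real implementation would likely use an XML library rather than hand-rolled parsing:

```cpp
#include <cassert>
#include <cmath>
#include <cstdio>
#include <sstream>
#include <string>
#include <vector>

// One timestamped sensor reading (names assumed, not the project's schema).
struct Reading { long t; float v; };

// Serialize a session of readings as simple XML elements.
std::string toXml(const std::vector<Reading>& rs) {
    std::ostringstream out;
    out << "<session>\n";
    for (const Reading& r : rs)
        out << "  <reading t=\"" << r.t << "\" v=\"" << r.v << "\"/>\n";
    out << "</session>\n";
    return out.str();
}

// Parse the readings back so a recorded action can be replayed.
std::vector<Reading> fromXml(const std::string& xml) {
    std::vector<Reading> rs;
    std::istringstream in(xml);
    std::string line;
    while (std::getline(in, line)) {
        Reading r;
        if (std::sscanf(line.c_str(), " <reading t=\"%ld\" v=\"%f\"/>", &r.t, &r.v) == 2)
            rs.push_back(r);
    }
    return rs;
}
```

During playback, the recorded timestamps can drive the same gesture-recognition code that live sensor data would, which is what makes the test data repeatable.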

I have a fairly consistent method of checking if the gyro balls have been spun, so this should be a good way to switch modes.

I also created a new sound for the accelerometer ball that makes sound, and made sure that the gyro balls make very different sounds from each other.

The singing bowls are now their own mode, with each ball creating a specific tone that grows in volume and width as the ball moves more. In my user testing this has proved to be a very popular mode. It’s very meditative and somewhat surreal.
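The motion-to-tone mapping behind a mode like this can be sketched as a fixed pitch per ball whose amplitude tracks motion with some smoothing, so the tone swells and fades like a struck bowl instead of jumping. This is my own guess at the shape of that mapping (the class and parameter names are assumptions), not the actual Singing Bowl code:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Hypothetical sketch of the Singing Bowl motion-to-amplitude mapping:
// each ball holds a fixed pitch while its normalized motion amount drives
// amplitude through a simple exponential smoother, so the tone swells and
// decays gradually rather than tracking every jitter in the sensor.
class BowlVoice {
public:
    explicit BowlVoice(float smoothing) : smoothing(smoothing) {}

    // motion: normalized movement amount in 0..1; returns amplitude in 0..1.
    float update(float motion) {
        float target = std::max(0.0f, std::min(1.0f, motion));
        amplitude += smoothing * (target - amplitude);
        return amplitude;
    }

private:
    float smoothing;          // 0..1; higher = faster swell and decay
    float amplitude = 0.0f;
};
```

With a low smoothing value, the tone keeps ringing briefly after the ball stops moving, which is a large part of what makes the mode feel meditative.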


And of course, there was more testing.

My new favorite picture of the project

As I mentioned, Singing Bowl mode went over very well. Several people mentioned how much they would like to see some visuals with it. Time permitting, I may implement a very simple color-based visualization of what is happening with the balls. This is a lower priority than getting the balls working, though.

For the other mode, though, the big takeaway is causality. There is still not a clear link between the user's action and what is happening with the balls. Testers figure out a little bit over time, but not enough. I don't mind a degree of initial confusion; part of this project is to encourage exploration, so having the role of each ball be completely obvious would not be as fun. And there were some great moments in testing where, after the initial cacophony as people moved all of the balls at once, they stopped and tried to figure out what each ball did. This was the sort of interaction I was looking for.

However, after these experiments people had a better idea of what did what, but not a real grasp of it. There are a few issues with the balls:
- There is a slight delay between what happens and the sound that is created.
- Collisions are not always detected, and occasionally non-collisions register as collisions.
- The ball will sometimes stop transmitting for a few seconds.

I am fairly certain that any one of these issues on its own is not enough to break causality: when the balls were working most of the way (even if one aspect was causing issues), people responded really well and were able to feel like their actions were affecting what the balls were doing. But when a few links broke at once, people often felt like they were just moving the balls around randomly.

So that’s my big push: getting the gesture recognition better. I would really like to use the gesture recognition to frame parts of the game, such as switching modes.

And here’s my presentation.