AniBall goes Bloosy

A number of years ago, I wrote a very simple Java applet that animated a few solid blue circles in a small box in your web browser. I called it AniBall, and I still have the source code somewhere.

That exercise was important to me for a boatload of reasons.

For one, it was my introduction to object-oriented programming. I remember being very confused by it at the time because it seemed like everything was just a series of new statements and parentheses. In hindsight, this was probably also my first glimpse of the importance of the API over the language. In my previous procedural programming, everything was about simple statements used to express logic that I created and understood. In new-fangled Java applets, everything depended on opaque APIs built around objects, and I didn’t get to see where the fundamentals resided anymore. My logic took a back seat to assembling someone else’s components in tricky variations. It took years before I got back to a point where I could respect myself enough to write procedural code.

AniBall was also important because a guy I respected a lot gave me the most understated compliment you might imagine. He said it was “soothing” or something like that.

The applet didn’t do much. It just painted a solid blue circle on the canvas and then animated it by letting it bounce off the edges – kind of like how Pong works. There was some mechanism for introducing new balls to the animation too; I think you would just mouse-click somewhere in the canvas. After a few clicks, there were a few of these soothing blue balls bouncing around the web browser.
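For the curious, the core of that bounce behavior is tiny. Here is a minimal sketch in plain Java. It is not the original AniBall source (which I would have to dig up), just the idea, with field names I am making up here: each ball keeps a position and a velocity, and a velocity component flips sign whenever the ball reaches an edge.

```java
// A minimal sketch of the Pong-style bounce described above.
// Not the original AniBall code; names and values are illustrative.
class Ball {
    double x, y;          // center position
    double dx, dy;        // velocity per animation tick
    double radius = 10;   // drawn as a solid blue circle of this radius

    // Advance one tick, reversing direction at the edges of a canvas
    // of the given width and height.
    void step(int width, int height) {
        x += dx;
        y += dy;
        if (x - radius < 0 || x + radius > width) {
            dx = -dx;
        }
        if (y - radius < 0 || y + radius > height) {
            dy = -dy;
        }
    }
}
```

Each animation frame would call step() on every ball and then repaint the canvas, and a mouse click would simply add another Ball to the list.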

Fast-forward to today, when I am in the midst of learning how to program on the Android operating system. The AniBall concept, simple as it may seem, has been on my mind’s back-burner for more than a decade. But I see now that there is some other game or something out there with the name “AniBall” – so I decided to name my new application “Bloosy” (sort of a take-off on “bluesy”).

For now, I’m just figuring out how to get everything built and deployed onto my Android phone, and how to get it published in the Google Play store. But I’ve got a whole bunch of neato ideas for things to add to my mobile app. To a great extent, it will be a playground for me to learn Android OS concepts and techniques. You can check out the code on GitHub, and if you’re polite I might even add you as a contributor to the project.

The bigger picture is that I’ve always been fascinated by visualizations. And I’ve been thinking a lot about how interesting it would be to use a mobile device as a tool to visualize data based on information signals gleaned directly from your immediate local environment.

For example, your phone has a latitude and longitude. Those can be visualized as some sort of animation – maybe as a number, or as a modification to one of the blue balls based on those values, say a speed vector or a shading of color. Your phone has access to other local inputs too – e.g., number of unread emails, time of day, something about a news headline, free memory, whatever. All of these are items of information that normally go unnoticed but that could be interesting to see as a visualization of some sort. A way of more deeply immersing yourself in your environment by SEEING something that is normally not seen.
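To make that slightly more concrete, here is one hypothetical mapping, again sketched in plain Java. None of this is in Bloosy yet, and the class and method names are invented for illustration: fold latitude into a hue and longitude into a speed multiplier for one of the bouncing balls.

```java
// Hypothetical mapping from a location signal to visual attributes.
// The ranges and formulas are arbitrary; the point is signal -> visualization.
class SignalMapper {
    // Latitude runs from -90 to 90 degrees; spread it across the hue wheel (0..360).
    static float latitudeToHue(double latitude) {
        return (float) ((latitude + 90.0) / 180.0 * 360.0);
    }

    // Longitude runs from -180 to 180 degrees; map it to a speed factor
    // between 0.5x and 1.5x applied to a ball's velocity.
    static double longitudeToSpeedFactor(double longitude) {
        return 0.5 + (longitude + 180.0) / 360.0;
    }
}
```

On Android the actual coordinates would come from something like LocationManager or the fused location provider, but the interesting part is the mapping itself, and the same shape works for any of the other signals: unread emails, time of day, free memory, and so on.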

I know that’s abstract, and my elevator pitch is not quite yet… pitchy. But as I work on the project it will become more coherent. And I am betting you are going to enjoy some aspect of the finished app!