Wednesday, July 11, 2012

we maed a v1d w1th f!shez 1n it!!1

        So I've been following this whole OUYA thing somewhat closely, and as much as I feel like I don't know enough about their business plan to kickstart the project, I feel like I'm actually a pretty fair representation of the target audience.  I think about how I use my Xbox nowadays, and yeah, it's pretty much just to play XBLA games.  That makes the OUYA, at $99, pretty much an impulse buy for me, so I'm supposing that if there's enough content, it would make sense as a gaming solution.  Here's the problem: I don't play XBLA games that often.  I either play a PC game because I'm on my PC working and need a break after a few hours, or I play games on a mobile device because I'm out and bored waiting on something for a bit.  In that vein, I thought it might be cool to have a mobile Android box with an HDMI out to run Processing sketches on, if only I didn't have to plug it in.  Really, that's probably not a huge issue though; if the hardware ends up being as open as they say, it might be moddable to that end.  But then I suppose the question remains why I wouldn't still just use an Android tablet... I know, I know, don't kickstart if you don't believe, but hey, it's the internet, I'm allowed to have opinions.


...Mobile p5 engine...?

        ...But let's be honest, you're not here to listen to me rain on the internet's parade, so let's get to the meat of it.  As the title sort of alludes to, we built a sort of installation!  Or at least, we got something to an early workable stage.  Sometime last week, Chris had the idea that we should motion track Annie's betta and use the resulting data to drive a flocking simulation.  To keep the footprint small, we decided on a webcam-based solution.  I spent a few hours building a basic frame differencing tracker from some example code off processing.org, tweaked the overall performance and output, and came up with this:


        I did put a version up on my github.  It's not quite as usable as I want it to be (a few more dependencies than I'd like), but in a revision or two it'll be where I'd like it.  Meantime, it's definitely usable enough to build your own simple tracking-based sketches, so have at it!  If you're interested in putting your own motion tracker together, check these out (there's also a stripped-down sketch just after the links):

Frame Differencing by Golan Levin
Frame Differencing with GSVideo
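
        To give a feel for the approach, here's a minimal frame-differencing sketch in the spirit of those examples.  It uses the stock processing.video Capture class rather than GSVideo, and the threshold is just a starting guess you'd tune for your own camera and lighting:

import processing.video.*;

Capture video;
PImage prevFrame;
int threshold = 50;  // difference needed to count a pixel as "moved"; tune this

void setup() {
  size(640, 480);
  video = new Capture(this, width, height);
  video.start();  // Processing 2.x; older versions start the capture automatically
  prevFrame = createImage(width, height, RGB);
}

void draw() {
  if (video.available()) {
    // stash the current frame before reading the new one
    prevFrame.copy(video, 0, 0, video.width, video.height, 0, 0, width, height);
    prevFrame.updatePixels();
    video.read();
  }

  loadPixels();
  video.loadPixels();
  prevFrame.loadPixels();

  float sumX = 0, sumY = 0;
  int moved = 0;

  for (int i = 0; i < width * height; i++) {
    color now = video.pixels[i];
    color then = prevFrame.pixels[i];
    // color distance between this frame and the last one at this pixel
    float diff = dist(red(now), green(now), blue(now),
                      red(then), green(then), blue(then));
    if (diff > threshold) {
      pixels[i] = color(255);
      sumX += i % width;
      sumY += i / width;
      moved++;
    } else {
      pixels[i] = color(0);
    }
  }
  updatePixels();

  // the centroid of the changed pixels is a crude "where's the fish"
  if (moved > 0) {
    noStroke();
    fill(255, 0, 0);
    ellipse(sumX / moved, sumY / moved, 16, 16);
  }
}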

        We got all the code merged and tweaked into an initial state last night, and here's the result so far (fish courtesy of our buddy Ermal's Fishcam):


Live Fish Flocking from Chris Rojas on Vimeo.
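
        For anyone wondering how the two halves hook together: the tracker's centroid simply becomes the point the flock steers toward.  This isn't our actual merged code (that one uses toxiclibs vectors and GSVideo capture); it's a toy stand-in where the mouse plays the fish:

// Toy version of the pipeline: agents seek a target point.
// In the installation the target is the tracked centroid;
// here the mouse stands in for Annie's betta.

Agent[] flock = new Agent[40];

void setup() {
  size(640, 480);
  smooth();
  for (int i = 0; i < flock.length; i++) {
    flock[i] = new Agent(random(width), random(height));
  }
}

void draw() {
  background(0);
  PVector target = new PVector(mouseX, mouseY);  // swap in the tracker's centroid here
  for (int i = 0; i < flock.length; i++) {
    flock[i].seek(target);
    flock[i].update();
    flock[i].display();
  }
}

class Agent {
  PVector pos, vel, acc;
  float maxSpeed = 3, maxForce = 0.1;

  Agent(float x, float y) {
    pos = new PVector(x, y);
    vel = new PVector(random(-1, 1), random(-1, 1));
    acc = new PVector(0, 0);
  }

  // classic Reynolds-style steering: desired velocity minus current velocity
  void seek(PVector target) {
    PVector desired = PVector.sub(target, pos);
    desired.normalize();
    desired.mult(maxSpeed);
    PVector steer = PVector.sub(desired, vel);
    steer.limit(maxForce);
    acc.add(steer);
  }

  void update() {
    vel.add(acc);
    vel.limit(maxSpeed);
    pos.add(vel);
    acc.mult(0);
  }

  void display() {
    noStroke();
    fill(200, 220, 255);
    ellipse(pos.x, pos.y, 8, 8);
  }
}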

        Next up: live tracking Annie's fish, per the original spec.  Annie had some interesting ideas about enhancing the piece with projection or some kind of external display to liven the whole thing up.  Version 2.0 incoming!  Created with Processing, toxiclibs, and GSVideo.

2 comments:

  1. I think a two-part cam setup would be super awesome. One to capture the RGB, with flocking on a screen behind the bowl/tank, and one to do the tracking in IR so that the display won't interfere with the tracking.

  2. That'd be an interesting experiment; a good addition for v1.5, methinks...
