Friday, March 23, 2012

NOW I remember what being happy feels like

        It's 10:30pm on a Friday and I'm sitting in my apartment, mere blocks away from the downtown San Jose nightlife, working.  And not on anything really "cool" or "sexy", at least not to most people; in fact it's about as dry as you can get.  I'm working on a comparative analysis doc for the different software components of the continuous integration system we're putting together at work, and I'm having a blast.  Realizing this made me stop and think, not about the specifics of the task itself, but about the implications of the situation.

        My friend and fellow tech artist Rob Galanakis made a blog post a while back about a change in jobs/life he'd gone through recently, and I guess it's my turn now.  It's so funny, you know, I'm the guy that's always saying "Ah, it's just a job, work's just a passing phase, etc.", but looking back, I spent so much time propping myself up that I don't think I realized I wasn't following my own admonitions.  Even when I was in L.A., I feel, in retrospect, like I was the guy who was married to someone he wasn't really that into, but publicly proclaimed how incredibly hot his wife was, in a really feeble attempt to make himself feel better about it.  Clumsy analogy, but you get what I'm saying; that is, I feel like I've been deluding myself (poorly, I might add) in an attempt to avoid the bigger issue.

"That's delude yourself, dummy..."

        Maybe it's too early to tell, I've only been with Perceptual for a month, still in that honeymoon phase, but you know, I've never felt this sold after just a month.  That could just be because I'm so happy to be doing something different, but I think it's more than that.  I've told everyone this and it's true: this is seriously the hardest job I've ever had.  I'm having to pretty much draw on every bit of software development experience I've gathered in the last 12 or so years just to keep up, but you know, I wouldn't have it any other way.  Not being challenged at work is worse than death, and I mean GENUINE challenges, not the challenges of having to manage your managers.  I have ONE manager now who I report to, and by report to, I mean I tell him what I'm working on for the month and, if it aligns with the group's charter, he gives me his blessing and doesn't bother me again for a month.  And when people ask my opinion and tell me to go do something, they mean, "we trust you to leverage your experience, so we're just going to get out of the way and let you execute".  Having been paid that lip service for the last 2+ years, this is a refreshing change.

        Ultimately, I know we're sitting in the pressure cooker.  In a year, we may not be around, but you know, I'm going to try my damnedest to make sure that doesn't happen.  And you know, I feel like at this point, I actually can do that.  I feel like my contribution matters, I feel like my contributions are actually valued, I feel like I really can make a difference.  I haven't felt that at a job in...about 4 years.

        And you know, the most telling part overall is that on a high-stakes, high-pressure team inside a large company like Intel, I've found the working environment I've been looking for.  The one the games industry claims to espouse, but really doesn't have a clue how to create...that's interesting.

       Ah well...not gloating, not whining, just thinking.  It's good to be happy at work again.

Not quite like this, but close enough considering...

Tuesday, March 20, 2012

prototyper's toolbox

        Nothing special here, just a bunch of tools I've been using recently to do rapid prototyping.  It's such an alien concept to me to just throw a bunch of hardware and software into a blender and see what comes out, but it's pretty freakin fun.

Unity
Probably needs no explanation; a really good platform for prototyping, as it's open enough to do things that aren't necessarily games.

openFrameworks
Everything you need to create rich interactive applications under one roof. This is the model for what easy-to-use SDKs should be.

Processing
It's like an IDE for doing cool graphical stuffs. For you Pythonistas, check out pyprocessing as well.

vvvv
Just found out about this the other day; node-based madness.

OpenNI
Definitive framework for natural interaction (gestures, etc). A bit of a learning curve, but good to know.

        So what are some of your favorite rapid prototyping tools?  And if you say UDK, I swear I WILL hunt you down and punch you in the heart...

Saturday, March 17, 2012

git some!

        So I spent a decent chunk of the day mucking around with Unity projects and version control...well, OK, truth be told, it's actually been a few days, first with SVN and now with git.  Now, obviously you're not going to get all the DVCS benefits with content, but being able to branch scripts could be pretty cool.  We're not doing the sort of focused development that's really going to require that, I don't think, but who knows?  I have some folks that I want to get up to speed a bit on Unity scripting, so having safe sandboxes for them to script in is definitely a plus.

For every branch you merge, two more...

        GitHub's been unreachable for a few days and I needed to get up and functional pretty quick, so I went with bitbucket.  God knows I pull enough code from bitbucket, might as well join the revolution.  Also, free private repos, which is good because I may or may not be posting somewhat sensitive content.  So, there are a few things we need to do first.  I'm going to assume you have Unity installed; if not, think about doing that at some point ;)  Other than Unity, you'll want to get git on your machine too.  For Windows users, you'll want to grab two software packages:
  • Git for Windows (msysgit) - You'll want to grab the most recent Git-1.7.x-Preview file; at the time of this writing it's 1.7.9
  • TortoiseGit (optional) - If you've ever used Tortoise, you'll know what this is; otherwise, it's a very easy-to-use git shell extension for Windows.  Most of your day-to-day will happen here, unless you just love command lines...and there's nothing wrong with that.  I'll be covering how to do everything through TortoiseGit where possible here, but again, you can also do it all with the command line/GitBash.


        Make sure you install msysgit first, as TortoiseGit obviously requires it.  I would uncheck the Windows Explorer integration option, but otherwise it's pretty straightforward.  TortoiseGit is equally straightforward; make sure you install OpenSSH vs the PuTTY option, as we'll want OpenSSH for bitbucket.  Speaking of, grab yourself a bitbucket account too; the free 5-user account is more than sufficient for what we're doing, unless you have a bigger team that's going to need access to the repo.  In that case, I might still keep the number of users on the bitbucket repos fairly small and set something up to push to a different repo for a larger group (hmmm...another blog post??).  Be sure to refer to the bitbucket docs on setting up git as well.  You'll also need to make sure you're SSH friendly, which is a bit of a process.  Thankfully, it's also documented on bitbucket, and they do a much better job of explaining it than I'm going to.  You can skip step 5, although it is a cool little trick.  So the whole process, then, looks like this:
  • Create a bitbucket account
  • Install Git for Windows - bitbucket Documentation
  • Install TortoiseGit (optional)
  • Setup SSH - bitbucket Documentation
  • Create a git repo on bitbucket
  • Create a folder for your Unity project
  • Create a new project in said folder (don't import any Unity assets)
  • Enable external version control in Unity
  • Delete the Library folder in your project
  • Create a git repo in your Unity project folder
  • Commit the changes
  • Add your bitbucket URL to your local repo settings
  • Push the local repo to bitbucket (if you'd rather use GitBash than TortoiseGit, a quick sketch of the last few steps is below)
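        If you do want to do those last few steps from GitBash instead of TortoiseGit, they boil down to something like this.  This is just a sketch: the SSH key bit is the short version of what the bitbucket docs walk you through, the .gitignore lines are my own addition to keep the Library folder from sneaking back into the repo, and you'll obviously want to swap in your own account and repo names.

    # one-time: generate an SSH key and paste the public half into your bitbucket account
    ssh-keygen -t rsa
    cat ~/.ssh/id_rsa.pub

    # from inside your Unity project folder (external version control enabled,
    # Library folder deleted)
    git init
    echo "Library/" > .gitignore
    echo "Temp/" >> .gitignore
    git add .
    git commit -m "initial Unity project"

    # point the local repo at bitbucket and push
    git remote add origin git@bitbucket.org:youruser/yourrepo.git
    git push -u origin master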
        If you're familiar with TortoiseGit, this is a pretty straightforward process, but I'm guessing for a lot of the readers that's not the case, so in the next post, I'll step through the rest of the process (with pictures).  There are a lot of steps, but they're not long, involved steps, so the whole process actually goes pretty quickly.  I just got lazy because I've spent the last two days taking screen caps and I'm realizing that I need to rethink how I do my images for this tutorial, so stay tuned...


Wednesday, March 14, 2012

from my cold, dead, hands...

        Well, OK, maybe not quite that dark, but at least I look somewhat civilized when I'm using a Kinect, unlike, well...


        Gah, ok ok, I promised myself I'd stop talking smack about former large software companies I may or may not have worked for.  I dunno, it's weird; if you're like me, you always feel like once you leave a company on bad terms, you're forever professionally competing with them as long as you're in the same industry.  Not to put too fine a point on anything, or anything, hehe...

        Aaaanyway, so I feel like I'm at a good enough spot with KinectCam to drop some knowledge on some folks, if you're interested in playing with this stuff yourself.  I gotta say I might have written Kinect off a bit early; I'm having a good time playing around with it and its ilk.  Granted, none of the games I've ever played on Kinect are doing the sort of stuff I've been doing, and I sorta wonder what kind of crazy game you could come up with using the depth stream...OK, sure, I'm just reaching there, but probably not; I bet someone (Jeff Minter) could come up with some insane depth stream game.  Actually, I'm thinking of some sort of pattern recognition app that I can feed my CD collection into, or maybe just stuff the audio array, you know, anything to avoid this sort of thing (Hmm, this would be funnier if you didn't have to squint for the text):

[image: MLFail]

        Alright, well, mindless self-indulgence aside, let's get to this MS Kinect SDK wrapper.  It's odd that this will probably be the only such post I make that applies to this particular wrapper, therein crawling another bug up my nose: I wish MS would just release an official Unity wrapper/plugin/whatever.  Truth be told, though, it probably wouldn't be terribly hard to compile down my own DLL; it's probably more of a time issue to make sure I expose and marshal everything properly, not to mention keeping up with the proper SDK.  But SDK specifics aside, all the maths herein should be pretty easy to re-apply, since it's just a few simple space conversion tricks.

        So, recapping from my previous post, make sure you install all the softwarez listed there: the MS Kinect SDK beta, the Kinect Wrapper Example Project, the DirectX SDK and runtime, and the KinectPlugin fix from the update.

        I've alluded to the camera control thing for a while, so let's just dive right in and get that taken care of; then I'll do a separate post about the Kinect Wrapper guts.  Again, that probably won't be terribly interesting, mainly it's just for me.

        Alright, so getting stuff working in this environment is pretty simple.  Import the KinectWrapperPackage and you'll have all the scripts and objects you need to get started.  I recommend just starting with a blank scene and building up from there, although you could pop open the KinectExample scene and make blobs dance around wildly.

        Once you have a new scene (taxing task that that is), all that's required to make it Kinect-aware is to drag a Kinect_Prefab from the Project into the scene, like so:

[screenshot: Kinect_Prefab dragged into the scene]

        This scene doesn't actually do anything yet, obviously, but you can use the Kinect_Prefab to setup your camera.  If you play the scene, you may notice the camera moving around a bit and focusing in odd places (hey buddy, my eyes are UP HERE), which can be fixed by setting some values in the KinectSensor attached to the Kinect_Prefab:

[screenshot: KinectSensor settings on the Kinect_Prefab]

       I've been using the Skeletal Viewer included in the SDK to make sure I'm getting good coverage; once you find some numbers that work, you can always go into the KinectSensor code and set your own defaults if you want.

       Alrighty then, time to get the camera moving.  So, I went the hack route and used Unity's built-in MouseLook script as a jumping-off point, as it only required a few minor changes to the camera transform code.  The part that requires a bit of work is getting Kinect skeletal numbers into Unity-friendly numbers, and when I say "bit", I mean it, it's actually not that hard.  So the first thing I did was for my own sanity, and that's to set up a little GUI indicator that tells me whether I'm tracking or not.  Polling the camera for the depth image in this wrapper is a bit annoying because you have to restart Unity every time you stop running.  Might be a fun project to figure that out and re-implement depth and color display as a GUI texture.  Something to keep in mind for my next project.  Anyway, verifying tracking is pretty simple; I put this all in a script called FromBonePosition:


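        (What follows is a rough sketch of that script rather than the exact code: pollSkeleton() and trackedPlayers[] are the real names from the ETC wrapper's SkeletonWrapper, but the field names and the GUI label are just illustrative.)

    using UnityEngine;

    // Minimal "are we tracking?" indicator.
    public class FromBonePosition : MonoBehaviour
    {
        public SkeletonWrapper skeletonWrapper;    // drag the wrapper object in via the Inspector
        private string trackingStatus = "Not tracking";

        void Update()
        {
            // pollSkeleton() is basically a thin call to the native NuiSkeletonGetNextFrame()
            skeletonWrapper.pollSkeleton();

            // trackedPlayers[] starts at -1 (no skeleton acquired at all); anything else
            // for either player means we've got someone
            if (skeletonWrapper.trackedPlayers[0] != -1 || skeletonWrapper.trackedPlayers[1] != -1)
                trackingStatus = "Tracking";
            else
                trackingStatus = "Not tracking";
        }

        void OnGUI()
        {
            // dirt-simple readout so we know what's up
            GUI.Label(new Rect(10, 10, 200, 20), trackingStatus);
        }
    }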
        Pretty straightforward, we're just pulling some functionality from the included SkeletonWrapper class, which...well, wraps Skeleton functionality.  Truth be told, I feel like this class is a bit of an extra, but whatever.  Anyway, breaking down the code:
  • Really all pollSkeleton() does is call the native NuiSkeletonGetNextFrame(); I'll leave what that probably does as an exercise for the reader.
  • trackedPlayers[] is interesting.  The details of how we get there are a bit unimportant at this stage; the important note is that we're grabbing an enum that tells us what we're tracking.  KinectWrapper initializes the values in trackedPlayers[] to -1, which indicates that we haven't even acquired a skeleton.  The possible enums are NUI_SKELETON_NOT_TRACKED, NUI_SKELETON_POSITION_ONLY, and NUI_SKELETON_TRACKED.  So if either of the two players is acquired, we can say we're tracking.
  • So once we've tracked, we spit out a string to the UI that tells us what's up.  
        Easy-peasy, yeah?  Well fear not, most of the Kinect polling is that straightforward; it's just flipping the numbers that takes some code.  So let's continue by polling the Kinect for a bone position, which we'll use to drive our camera.  Yeah, yeah, I know, classic Kinect "Hello, World!", recreate a mouse driver, but it's an easy place to start.

        Joints are pretty easy to poll, as the SkeletonWrapper class provides a few different arrays that contain joint information and the SDK provides us a convenient enum for specifying which joint to query for data.  So let's add some code to our class:


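        (Again, a sketch; these additions live inside the same FromBonePosition class.  The HAND_RIGHT constant is my own stand-in for the right-hand entry in the SDK's joint enum, so double-check the value against KinectInterop.)

    // index of the right hand in the SDK's joint enum -- verify against KinectInterop
    private const int HAND_RIGHT = 11;

    private Vector3 rightHandPos;    // latest right-hand joint position from the Kinect

    void PollRightHand()
    {
        // bonePos[] is 2D: first index is which skeleton, second is which joint
        if (skeletonWrapper.trackedPlayers[0] != -1)
            rightHandPos = skeletonWrapper.bonePos[0, HAND_RIGHT];
    }

    // ...and in OnGUI(), spit the number out for verification:
    // GUI.Label(new Rect(10, 30, 400, 20), "Right hand: " + rightHandPos);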
        Simple, efficient (like the body itself): we add a variable to hold the right hand's position, then we pull a value from the bonePos[] array on the SkeletonWrapper.  You'll notice bonePos[] is a 2D array: the first index is which skeleton to poll (we might be tracking two players), and the second index is, pretty obviously, the joint we want to get the position for.  You can see the full enum definition in the KinectInterop class, should you ever want to poll a different joint, or update a bunch of joints, or a whole skeleton, or make tea, or something like that.  We output that number to the GUI too, a) for verification purposes and b) so we can set some other numbers later.

       By now we pretty much know all the Kinect stuff we need to get our camera moving, so let's get our camera look on.  Here's our member list (with comments!):


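        (Another sketch: the sensitivity/clamp fields are lifted straight from Unity's standard MouseLook script, while the Old* ranges and the rest of the names are mine, so rename to taste.)

    // --- Kinect side ---
    public SkeletonWrapper skeletonWrapper;           // the wrapper object, assigned in the Inspector
    public Vector2 OldX = new Vector2(-2.2f, 2.2f);   // expected min/max hand X coming from the Kinect
    public Vector2 OldY = new Vector2(-1.6f, 1.6f);   // expected min/max hand Y coming from the Kinect
    private Vector3 rightHandPos;                     // current right-hand position

    // --- borrowed more or less wholesale from MouseLook ---
    public float sensitivityX = 15f;                  // horizontal look speed
    public float sensitivityY = 15f;                  // vertical look speed
    public float minimumY = -60f;                     // clamp so we can't flip over backwards
    public float maximumY = 60f;
    private float rotationY = 0f;                     // accumulated vertical rotation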
        And now...Maths!  The basic theory behind what we're doing is pretty simple:
  • Find the distance (as a percent) between the minimum and maximum x (oldX) that we get from the Kinect
  • Repeat for y
  • Cast that into (-1,1) so it mimics Unity's mouse input values
  • Rotate the camera
  • Profit, or at least impress your friends (or your mom)
        This is pretty simple; we can use some of the Unity built-ins to do all the scary maths for us.  I'll just post the relevant code here:

       
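        (Here's the gist of it; a sketch of the idea rather than the exact shipping code.  The rotation bookkeeping at the bottom is straight out of MouseLook's MouseXAndY case, just fed by the Kinect instead of Input.GetAxis().)

    void Update()
    {
        skeletonWrapper.pollSkeleton();
        if (skeletonWrapper.trackedPlayers[0] == -1)
            return;

        rightHandPos = skeletonWrapper.bonePos[0, HAND_RIGHT];

        // how far along is the hand between our calibrated min and max? (0..1)
        float tX = Mathf.InverseLerp(OldX.x, OldX.y, rightHandPos.x);
        float tY = Mathf.InverseLerp(OldY.x, OldY.y, rightHandPos.y);

        // recast into (-1, 1) so it mimics Input.GetAxis("Mouse X"/"Mouse Y")
        float axisX = tX * 2f - 1f;
        float axisY = tY * 2f - 1f;

        // since these are positions rather than deltas, the camera keeps turning while
        // your hand is off-center (joystick style), so keep the sensitivities small
        float rotationX = transform.localEulerAngles.y + axisX * sensitivityX;
        rotationY += axisY * sensitivityY;
        rotationY = Mathf.Clamp(rotationY, minimumY, maximumY);

        transform.localEulerAngles = new Vector3(-rotationY, rotationX, 0f);
    }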
        So yeah, we're basically parroting MouseLook's rotation setting and replacing Input.GetAxis() with the data we poll from the Kinect.  This SHOULD work, as it's almost verbatim the code I was using, minus things like aim assist and project-specific passing.  Play around with the OldX and OldY values; the Kinect SDK says that values for X can lie between (-2.2, 2.2) and Y can be (-1.6, 1.6), but that's going to vary a bit depending on where you're hanging out.  For example, the values we ended up using were OldX = (-0.3, 0.7) and OldY = (1.3, 2.4).  Drop a texture onto the GUI and set its rectangle based on hand position; that's a pretty simple way to get some good debug output (a quick sketch of that is below).  It's not too tricky to cast your value into screen space, try it out!
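        (For the lazy, that debug bit might look something like this; guiCursor is just some Texture2D you assign in the Inspector, and the name's made up.)

    public Texture2D guiCursor;    // any old texture to use as the hand marker

    void OnGUI()
    {
        // reuse the same 0..1 mapping, then scale into screen space;
        // GUI space runs top-down, hence the flip on Y
        float tX = Mathf.InverseLerp(OldX.x, OldX.y, rightHandPos.x);
        float tY = Mathf.InverseLerp(OldY.x, OldY.y, rightHandPos.y);
        GUI.DrawTexture(new Rect(tX * Screen.width - 16f, (1f - tY) * Screen.height - 16f, 32f, 32f), guiCursor);
    }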
        I think that's all I've got for now; going to try and port this to OpenNI/PrimeSense just for fun this weekend too.

Sunday, March 11, 2012

to the buttery depth(array)s

UPDATE(20120313):
{
You'll also need this plugin.  If you notice Unity throwing DllNotFoundExceptions and grousing about MSRKINECTNUI.DLL, you probably don't have it installed.

KinectPlugin.zip
}
        Man, I wish I could talk about all the awesome stuff I saw at GDC, but sadly, most of my GDC was prepping for my talks, running around organizing events, some modicum of time lost commuting from San Jose to San Francisco, and trying not to suffer a total breakdown overall.  Nah, I make it sound worse than it really was; I think the most taxing part of the conference had...not much to do with the actual conference, though it was definitely tangential.  As much as I'd like to whine about it here, that's another post for another blog for a much more inebriated state of mind.  I should have my GDC slides up in the very near future, so stay tuned.  I may do some highlights here or something like that.

        What I can show you are these awesome pictures from some of the Tech Art related events.  The trend of escalating Tech Art presence at GDC continued in 2012 in spades, as the photos from Boot Camp and Decompression can attest:








Boot Camp about to get underway, another great crowd!

Decompressing after the show Friday...

...started with about 15 people back in 2010(?)...

...logged about 50 in 2012.  Not bad!

        But now it's back to the real world and demo crunch, although I have to admit this hasn't been a horrible demo crunch at all, and in fact I can probably blame my own poor scheduling.  Admittedly, I can also blame the fact that I didn't think the problem through and didn't write my own Kinect wrapper back when I started.  Yeah, yeah, there are a few out there, but I'm really not sold on any of them.  I used the zigFu one for a bit, but I think I'm philosophically opposed to using that one: it crashes Unity, it's a known bug, and it seems the answer is "It'll be fixed when our commercial version comes out".  I appreciate everyone needing to make a buck but...no.

        So we decided to take a look at some of the MS SDK based solutions, despite the Microsoft stuff seeming to be a mixed bag, just because it feels like there's been so much splintering between SDK releases.  I did find a version from the good folks at Carnegie Mellon's Entertainment Technology Center (ETC), an organization at which I've had the pleasure of speaking, as well as the opportunity to work with many of its alums.  Having failed miserably to get a hand-tracking-based solution working, we decided to focus a bit on skeletal tracking instead.  Now, I feel more and more like this was a failed design decision more than a failure of technology, but at this point, I'll try anything once.  I just need to get something usable by Tuesday.

        Quick aside: I really feel at this point like it's my duty to write a good Kinect-to-Unity wrapper.  I feel like all the ones that have been made available to date have been written by people who have more experience using SDKs and less experience developing SDKs.  That could just be down to the fact that there hasn't been a lot of feedback, but anyway, I digress...

        So head over to the MS Kinect - MS SDK page on ETC's Unity3D site and grab a few files.  You'll need at minimum:

  • Microsoft Kinect SDK - Grab the one linked on the page, as this is a beta release.  From what I've heard, the projects don't work with the official release.  Also, I've only loosely verified this, but the drivers in this package only work with Xbox 360 Kinects.
  • Kinect Wrapper Example Project - Basic stuff to get you going without too much other overhead
  • DirectX June 2010 SDK - Some of the samples in the Kinect SDK require this.  If you get a redistributable error on installing (s1023 or something like that), uninstall all VS2010 redistributables later than 30139 and try again.  Should be good to go.
  • DirectX End-User Runtime - I believe some of the Unity samples require this

        Install everything in no particular order, plug your Kinect in, and you'll be good to go.  Windows Update might toss you some noise about looking for updated drivers (because like everything MS, it thinks you're stupid), but that probably isn't too big a deal.  Next installment we'll talk about some of the flow in and out of the Unity Kinect wrapper...Fun fun!  I'm using this mainly as a way to step through some key bits of the Nui namespace in prep for more Kinect/depth camera work coming up.  I'm working on the final Kinect camera driver over the next two days, so I'll finally be able to toss that up as part of the wrapper exploration.  It's a fairly simple jumping-off point so...anyway.  Sleep comes.