Posts tagged with “experimental”, “fx”, and “tracking”

September 28, 10

Boiling the soft egg aka Webcam Stories

If you came here after seeing my FOTB talk – that's great! Thank you!

Make sure you've seen the Black or White vs Machine Vision mash-up first.

Then head to the HiSlope GitHub repo to download/fork the HiSlope toolkit and have fun.

Yes, HiSlope is open source!

I am going to add more examples, a couple of cool and previously unreleased Pixel Bender filters, and more tutorials/documentation soon, so make sure you follow me on Twitter @blog2t.

Please send comments and feedback – I'd really like to hear from you! For any Q&A, catch me at the conference.

HiSlope

Update: I've added the video from my pitch.

12:16 AM | 2 Comments
August 02, 09

Machines are looking for Michael Jackson

Today marks my 5th year of living and working in the UK... feeling in a sort of nostalgic/festive mood, I did some cool VJing last night and decided to spend a semi-hungover afternoon doing some (softcore) Flash coding. I got some good feedback and suggestions (thanks, you know who) on my recent Terminator Salvation "machine vision" experiment and decided to explore that area a bit further.

This time I've managed to add "the real face tracking", ported from OpenCV by Masakazu "Mash" Ohtsuka (with some great optimizations by Quasimondo), to my video processing framework (codename HiSlope), which should hopefully be released within a couple of weeks (it still needs some major refactoring). Follow me on twitter.com/blog2t for updates.

So I was looking for the perfect video to use for testing... couldn't think of anything, really. Then suddenly the spirit of Michael Jackson (RIP) came to me and whispered into my ear: "Black or white?" – and it was all clear then :)

Enough words, click the image to sing along.

Terminator Machine Vision plus Michael Jackson's Black or White mashup by Og2t

I found this video particularly challenging – with loads of head banging and different races (skin tones, facial hair etc.) – which actually makes it perfect source material for testing.

And again, I am really surprised by the final result – it's still not the fastest (I am getting 20 FPS in the browser and about 35 FPS in the standalone projector) but the main task is achieved.

The filters' settings were optimized for the video, so if you switch to webcam mode you probably won't get your eyes detected (there's still a bit of work for me to do on it – that's why I am not releasing the sources yet). But do try to play with the sliders, especially HSBC (no, not the bank – it's Hue, Saturation, Brightness, Contrast; enable it by clicking the checkbox on the left) and Eye Finder – enable debug and adjust blur and fuzziness.
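
If you're curious what the HSBC pass boils down to, here's a minimal sketch built on ColorMatrixFilter – a hypothetical helper of my own, not HiSlope's actual API, with hue rotation omitted for brevity:

    import flash.display.BitmapData;
    import flash.filters.ColorMatrixFilter;
    import flash.geom.Point;

    // Hypothetical helper (not HiSlope's actual API): brightness, contrast
    // and saturation folded into a single ColorMatrixFilter.
    function hsbcFilter(brightness:Number, contrast:Number, saturation:Number):ColorMatrixFilter
    {
        // Haeberli luminance weights, the usual choice for saturation matrices
        var lr:Number = 0.3086;
        var lg:Number = 0.6094;
        var lb:Number = 0.0820;

        var s:Number = saturation;                  // 1 = unchanged
        var c:Number = contrast;                    // 1 = unchanged
        var o:Number = 128 * (1 - c) + brightness;  // offset keeps mid-grey in place

        var sr:Number = (1 - s) * lr;
        var sg:Number = (1 - s) * lg;
        var sb:Number = (1 - s) * lb;

        // contrast and brightness folded into the saturation matrix
        return new ColorMatrixFilter([
            c * (sr + s), c * sg,       c * sb,       0, o,
            c * sr,       c * (sg + s), c * sb,       0, o,
            c * sr,       c * sg,       c * (sb + s), 0, o,
            0,            0,            0,            1, 0
        ]);
    }

    // usage: filter one webcam frame in place
    // frame.applyFilter(frame, frame.rect, new Point(), hsbcFilter(10, 1.2, 0.8));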

So, where's Michael? He's wandering somewhere out there in that black puma outfit, fighting racism. Watch out!

05:05 PM | 21 Comments
July 19, 09

Realtime Terminator Salvation "Machine Vision" fx

Have you seen Terminator Salvation yet? There's a bunch of cool visual effects developed by Imaginary Forces, showing the world as seen by machines. With all the object tracking going on there, I was wondering whether I could recreate the whole thing in pure AS3. And, well, here's the result (which I am actually very proud of) ;-)

Terminator Salvation Machine Vision in AS3 by Og2t

Click the image to activate, wait for the video to buffer (1.6MB), then press the EDIT button to play with the filters (in full screen mode). Enable your webcam (if you have one) and play about with the sliders and checkboxes – see if your face can be tracked too – but then watch out for evil Terminators – they'll come and get you! ;-) Btw. you can turn on histograms for every filter – thanks to Quasimondo for the code.
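
Under the hood, the webcam mode boils down to drawing Video frames into a BitmapData on every frame – here's a minimal sketch (my own illustration, not the demo's actual code):

    package
    {
        import flash.display.Bitmap;
        import flash.display.BitmapData;
        import flash.display.Sprite;
        import flash.events.Event;
        import flash.media.Camera;
        import flash.media.Video;

        // Minimal sketch: grab webcam frames into a BitmapData
        // that a filter chain can then process.
        public class WebcamSource extends Sprite
        {
            private var video:Video;
            private var frame:BitmapData;

            public function WebcamSource()
            {
                var cam:Camera = Camera.getCamera();
                if (cam == null) return;          // no webcam attached
                cam.setMode(320, 240, 30);        // width, height, fps

                video = new Video(320, 240);
                video.attachCamera(cam);          // triggers the permission prompt

                frame = new BitmapData(320, 240, false, 0);
                addChild(new Bitmap(frame));
                addEventListener(Event.ENTER_FRAME, grab);
            }

            private function grab(e:Event):void
            {
                frame.draw(video);                // snapshot the current frame
                // ...run the filter chain on `frame` here...
            }
        }
    }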

This is a part of a whole video filter framework I am developing just now; the inspiration came from Joa Ebert's Image Processing library (as far as I know, he's cooking up a complete rewrite). The full source code (including Pixel Bender kernels and examples) will soon be released on Google Code and will feature face/eye tracking/gestures and a few other things (surprise!). A lot of people are very sceptical about the whole eye tracking idea – they don't believe it's precise enough to make any use of it. I will prove that it is, and that it works! (Just watch closely how it tracks my eyeballs in the video!)
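
Conceptually, the framework is just an ordered chain of filters, each transforming the frame in place – something along these lines (a hypothetical sketch; the real API may well differ):

    import flash.display.BitmapData;

    // Hypothetical interface – every filter receives the current frame,
    // mutates it, and passes it down the chain.
    interface IVideoFilter
    {
        function process(frame:BitmapData):void;
    }

    // Running the chain is then just an ordered loop over the filters:
    function runChain(frame:BitmapData, filters:Vector.<IVideoFilter>):void
    {
        for each (var f:IVideoFilter in filters)
            f.process(frame);   // e.g. HSBC, Motion Capture, Eye Finder...
    }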

My approach is to make everything as simple as I can. If something cannot be achieved using this rule, I either abandon the idea completely or look for a simpler solution.

The face tracking is actually relatively simple; I'll briefly describe each step (there's a code sketch of two of the steps after the list):

  1. Brightness/Contrast (HSBC filter) – initial adjustment of the input (will be replaced with auto levels)
  2. Motion Capture – works the same way as the "movement watchdog" implemented in the brains of almost all animals (including humans) in order to survive – it finds the rectangular area containing all the differences between two frames. This step could be much more sophisticated (e.g. I might use face detection, or Eugene's motion tracker once he decides to release the source) but simple motion capture is good enough for the Machine Vision experiment here.
  3. Shape Depth Detector – finds the centres of local colour maxima; play with the levels slider carefully to get more details. It works by posterising the image, then running a very fast blob detection on every resulting colour – thanks to Kynd and Kampei Baba for sharing this technique.
  4. Color Grading – identical to Photoshop's Gradient Map – uses paletteMap to remap the colors.
  5. Machine Vision – the final and most complicated filter – utilises the Delaunay triangulation and Voronoï diagram by Nicoptere – fast enough to run in realtime (thanks for sharing!). It then plots the points and lines and applies my spotlight effect class (another blog post on that subject coming soon) to achieve the final look. Btw. I've found another very cool experiment using Delaunay for face triangulation by Neuro Productions.
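
Here's the promised sketch of steps 2 and 4 – my own illustration of the techniques (function names and threshold values are made up), not the framework's actual code:

    import flash.display.BitmapData;
    import flash.display.BlendMode;
    import flash.filters.BlurFilter;
    import flash.geom.Point;
    import flash.geom.Rectangle;

    // Step 2 sketch: frame differencing – the rectangle bounding all movement.
    function motionRect(prev:BitmapData, curr:BitmapData):Rectangle
    {
        var diff:BitmapData = curr.clone();
        diff.draw(prev, null, null, BlendMode.DIFFERENCE);   // per-channel |curr - prev|

        // blur away single-pixel noise, then binarise on the green channel
        diff.applyFilter(diff, diff.rect, new Point(), new BlurFilter(4, 4));
        diff.threshold(diff, diff.rect, new Point(), ">", 0x00002000, 0xFFFFFFFF, 0x0000FF00);

        var bounds:Rectangle = diff.getColorBoundsRect(0x00FF00, 0x00FF00, true);
        diff.dispose();
        return bounds;
    }

    // Step 4 sketch: gradient map via paletteMap, assuming a grayscale,
    // opaque input (so the green channel equals the luminance).
    function gradientMap(bmp:BitmapData, dark:uint, light:uint):void
    {
        var lut:Array = [];
        var zero:Array = [];
        for (var i:int = 0; i < 256; i++)
        {
            var t:Number = i / 255;
            var r:int = (dark >> 16 & 0xFF) + t * ((light >> 16 & 0xFF) - (dark >> 16 & 0xFF));
            var g:int = (dark >> 8  & 0xFF) + t * ((light >> 8  & 0xFF) - (dark >> 8  & 0xFF));
            var b:int = (dark       & 0xFF) + t * ((light       & 0xFF) - (dark       & 0xFF));
            lut[i]  = (r << 16) | (g << 8) | b;   // full colour per grey level
            zero[i] = 0;                          // mute red/blue contributions
        }
        // output = zero[r] + lut[g] + zero[b] – green indexes the gradient
        bmp.paletteMap(bmp, bmp.rect, new Point(), zero, lut, zero);
    }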

Other thanks go to Mr. Doob for his stats widget, Bit-101 for the Minimal Comps and SubBlue for lots of inspiring technical discussions we've had during lunch breaks at tictoc.

Feel free to leave any comments, questions and suggestions – I am really interested in what you think. You can also follow my blog updates on Twitter or RSS. It's getting very late now, so I'd better go.

UPDATE: I am giving up, it's just too hard to track a human's head – I'm gonna do my next experiments with chickens:

UPDATE 2:
If you liked this experiment, make sure you see the new version.

11:53 PM | 5 Comments
June 17, 09

Dried eye syndrome

A few days ago I saw this eye blinking detector written in JavaScript using HTML5 and canvas (Firefox 3.5 needed) and set myself the challenge of writing a similar one in AS3 from scratch during my lunch break today.

Actually, it turned out to be much simpler than I had initially thought!

Click the image to activate, hold your head still and blink your eyes. Hit space to toggle the visibility of the motion areas.

In case it's not working, move your head closer/further away from the camera.

The SWF is just 2.5 kilobytes – no heavy calculations are needed to detect eye blinks. Here's how it works (there's a code sketch of the blob detection after the list):

  1. Detect all motion areas (hit space to see them)
  2. Apply blur filter to get rid of the noise
  3. Apply a threshold to get a 1-bit image
  4. Use a blob detection algorithm to find the blobs
  5. Reject all blobs that are either too big or too small
  6. Draw bounding boxes around blobs that meet the size criteria
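
And the promised sketch of steps 4–5 – not the actual (still messy) source, just an illustration of the classic getColorBoundsRect + floodFill trick:

    import flash.display.BitmapData;
    import flash.geom.Rectangle;

    // Sketch: detect blobs in a 1-bit (black/white) image, keeping only
    // the ones within a plausible eye size.
    function findBlobs(bmp:BitmapData, minSize:int, maxSize:int):Vector.<Rectangle>
    {
        var blobs:Vector.<Rectangle> = new Vector.<Rectangle>();
        var fill:uint = 0x0000FF;   // flood colour, distinct from black/white

        while (true)
        {
            // bounding box of all remaining white (motion) pixels
            var all:Rectangle = bmp.getColorBoundsRect(0xFFFFFF, 0xFFFFFF, true);
            if (all.isEmpty()) break;

            // the top row of that box must contain at least one white pixel
            var x:int = all.x;
            while (bmp.getPixel(x, all.y) != 0xFFFFFF) x++;

            // flood one connected blob with a unique colour and measure it
            bmp.floodFill(x, all.y, fill);
            var blob:Rectangle = bmp.getColorBoundsRect(0xFFFFFF, fill, true);

            // step 5: reject blobs that are either too big or too small
            if (blob.width >= minSize && blob.width <= maxSize &&
                blob.height >= minSize && blob.height <= maxSize)
                blobs.push(blob);

            fill++;   // next blob gets a different colour
        }
        return blobs;
    }

Drawing the bounding boxes (step 6) is then just a graphics.drawRect() over each returned rectangle.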

Currently, the code is a mess (or, as I'd rather call it, in an experimental state), so no source code yet.

But I am planning to improve it a lot, e.g. make it possible to track head movement and position, and maybe even the eyes. There is also an AIR app to stop your eyes from drying out coming soon; meanwhile, make sure you read a few tips on that very subject.

10:15 PM | 2 Comments