Posts tagged with “tracking”
September 28, 10
Boiling the soft egg aka Webcam Stories
If you came here after seeing my FOTB talk – that's great! Thank you! Make sure you've seen the [Black or White vs Machine Vision mash-up](http://play.blog2t.net/files/black-or-white/) first. Then head to the HiSlope GitHub repo to [download/fork the HiSlope toolkit](http://github.com/og2t/HiSlope) and have fun. **Yes, HiSlope is open source!** I am going to add more examples, a couple of cool and previously unreleased Pixel Bender filters, and more tutorials/documentation soon, so make sure you [follow me on Twitter @blog2t](http://twitter.com/blog2t). Please send comments and feedback – I'd really like to hear from you! Any Q&As – catch me at the conference.
August 04, 10
Webcam stories
Pressure. It's all coming along. This Saturday, the Art Tech Seminars at Assembly Summer in Helsinki, Finland, 7th of August at 2.00PM: [WebCam stories](http://www.assembly.org/summer10/seminars/sessions#webcam). If you want to see what Flash and a typical webcam can achieve – face recognition, eye tracking and motion detection using real-time image processing – come and see. [I will share](http://www.assembly.org/summer10/seminars/speakers) how we can use our often dusty and under-used webcams to make our eyes less tired after hours of gazing into pixels. More good news: the HiSlope toolkit will finally be released there. Although Flash Player has recently been eschewed in favour of emerging HTML5+JS technologies (most notably by Apple), I will show that it still offers [decent processing power](http://apiblog.youtube.com/2010/06/flash-and-html5-tag.html) and support for external devices such as the webcam and microphone (which HTML5 cannot yet access), and can significantly contribute to improving real human-computer interfaces. Thanks to Justyna for the cheeky title :) PS I also got the chance to [speak at the Elevator Pitch](http://www.flashonthebeach.com/sessions/index.php?pageid=2999) this autumn, part of the Flash on the Beach conference in Brighton, UK (26-29 September 2010) – my very own 3 minutes!
10:51 PM | posts | 2 Comments | Tags: demoscene, as3, tracking, talk, presentation
August 02, 09
Machines are looking for Michael Jackson
Today marks [my 5th year of living and working in the UK](http://www.flickr.com/photos/og2t/sets/72157621798561063/detail/)... Feeling in a sort of nostalgic/festive mood, I did some cool VJing last nite and decided to spend a semi-hangovery afternoon doing some (softcore) flashcoding. I got some good feedback and suggestions (thanks, you know who) on my recent [Terminator Salvation "machine vision" experiment](http://play.blog2t.net/terminator-salvation-realtime-machine-vision-as3/) and decided to explore that area a bit further. This time I've managed to add "the real face tracking", ported from OpenCV by Masakazu "Mash" Ohtsuka (with some great optimizations by Quasimondo), to my video processing framework (codename **HiSlope**), which should hopefully be released within a couple of weeks (I still need to do some major refactoring). Follow me on [twitter.com/blog2t](http://twitter.com/blog2t) for updates. So I was looking for the perfect video to use for testing... couldn't think of anything, really. Then suddenly the spirit of Michael Jackson (RIP) came to me and whispered into my ear: __"Black or white?"__ – and it was all clear then :) Enough words, **click the image to sing along**.
July 19, 09
Realtime Terminator Salvation "Machine Vision" fx
Have you seen Terminator Salvation yet? There's a bunch of cool visual effects developed by Imaginary Forces showing the [world as seen by machines](http://www.imaginaryforces.com/featured/5/539). There's a lot of object tracking going on there, and I was wondering whether I could recreate the whole thing in pure AS3. And, well, here's the result (which I am actually very proud of) ;-) **Click the image to activate, wait for the video to buffer (1.6MB), then press the EDIT button to play with the filters (in full screen mode).** Enable your webcam (if you have one) and play about with the sliders and checkboxes – see if your face can be tracked too – but then watch out for evil Terminators – they'll come and get you! ;-) Btw. you can turn on histograms for every filter – thanks to [Quasimondo](http://www.quasimondo.com/) for the code. This is part of a whole video filter framework I am developing just now; the inspiration came from [Joa Ebert's Image Processing library](http://blog.joa-ebert.com/imageprocessing-library/) (as far as I know, he's cooking up a complete rewrite). **The full source code (including Pixel Bender kernels and examples) will soon be released on Google Code and will feature face/eye tracking/gestures and a few other things (surprise!)** A lot of people are very sceptical about the whole eye tracking idea; they don't believe it's precise enough to be of any use – I will prove that it is, and that it works! (Just watch closely how it tracks my eyeballs in the video!) My approach is to keep everything as simple as I can. If something cannot be achieved using this rule, I either abandon the idea completely or look for a simpler solution. The face tracking is actually relatively simple, I will briefly describe each step:
1. **Brightness/Contrast (HBSC filter)** – initial adjustment of the input (will be replaced with auto levels).
2. **Motion Capture** – works the same way as the "movement watchdog" implemented in the brains of almost all animals (including humans) in order to survive – it finds the rectangular area containing all the differences between two frames (a minimal sketch of the idea follows the list). This step could be much more elaborate (e.g. I might use face detection or [Eugene's motion tracker](http://blog.inspirit.ru/?p=305) once he decides to release the source), but simple motion capturing is good enough for the Machine Vision experiment here.
3. **Shape Depth Detector** – finds the centres of local colour maxima; **play with the levels slider carefully to get more details**. It works by posterising the image and then running [very fast blob detection](http://play.blog2t.net/fast-blob-detection/) on every resulting colour (also sketched below) – thanks to [Kynd](http://www.kynd.info/dev/2009/05/flash-finding-objects-from-a-photo.html/) and [Kampei Baba](http://faces.bascule.co.jp/motiondetection/) for sharing this technique.
4. **Color Grading** – identical to Photoshop's Gradient Map – uses `paletteMap` to remap the colours (sketched below).
5. **Machine Vision** – the final and most complicated filter – utilises [Delaunay triangulation and Voronoï diagram by Nicoptere](http://en.nicoptere.net/?p=10) – it's fast enough to run in realtime (thanks for sharing!). It then plots the points and lines and applies my spotlight effect class (another blog post on that subject coming soon) to achieve the final look – the plotting pass is sketched below too. Btw. I've found another [very cool experiment using Delaunay for face triangulation by Neuro Productions](http://www.neuroproductions.be/experiments/fle_delaunay_triangulation/).
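For the curious, here's roughly how the Motion Capture step can be done with plain `BitmapData` calls. This is only a minimal sketch, not the actual HiSlope filter – the class name and the sensitivity value are mine, purely for illustration:

```actionscript
package
{
    import flash.display.BitmapData;
    import flash.display.BlendMode;
    import flash.geom.Point;
    import flash.geom.Rectangle;

    // Rough sketch of the frame-differencing idea: keep the previous frame,
    // diff it against the current one and return the box of everything that moved.
    public class MotionBounds
    {
        private static const ZERO:Point = new Point();

        private var _previous:BitmapData;
        private var _diff:BitmapData;

        public function MotionBounds(width:int, height:int)
        {
            _previous = new BitmapData(width, height, false, 0);
            _diff = new BitmapData(width, height, false, 0);
        }

        // `level` (0-255) is an arbitrary sensitivity value picked for the sketch.
        public function update(current:BitmapData, level:uint = 0x30):Rectangle
        {
            // |previous - current| per channel: unchanged pixels end up (near) black.
            _diff.copyPixels(_previous, _previous.rect, ZERO);
            _diff.draw(current, null, null, BlendMode.DIFFERENCE);

            // Paint pure white wherever any channel changed by more than `level`.
            _diff.threshold(_diff, _diff.rect, ZERO, ">", level << 16, 0xFFFFFFFF, 0x00FF0000);
            _diff.threshold(_diff, _diff.rect, ZERO, ">", level << 8,  0xFFFFFFFF, 0x0000FF00);
            _diff.threshold(_diff, _diff.rect, ZERO, ">", level,       0xFFFFFFFF, 0x000000FF);

            // Bounding box of all white pixels; zero-sized if nothing moved.
            var bounds:Rectangle = _diff.getColorBoundsRect(0xFFFFFFFF, 0xFFFFFFFF, true);

            // Remember the current frame for the next call.
            _previous.copyPixels(current, current.rect, ZERO);
            return bounds;
        }
    }
}
```

Both `threshold` and `getColorBoundsRect` run natively inside the player, which is why this stays cheap even on full webcam frames.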
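The Shape Depth Detector boils down to something like the sketch below – again, not the real filter: it assumes an opaque source (a webcam frame) and takes the centre of each colour's bounding box rather than proper per-blob centres:

```actionscript
import flash.display.BitmapData;
import flash.geom.Point;
import flash.geom.Rectangle;

// Sketch only: posterise to 4 levels per channel, then use getColorBoundsRect
// as a very cheap blob finder and take the centre of each colour's bounds.
function findColourCentres(source:BitmapData):Vector.<Point>
{
    // Quantise every channel to its top two bits (values 0, 64, 128, 192).
    var reds:Array = [], greens:Array = [], blues:Array = [];
    for (var i:int = 0; i < 256; i++)
    {
        var q:uint = i & 0xC0;
        reds[i] = q << 16;
        greens[i] = q << 8;
        blues[i] = q;
    }

    var posterised:BitmapData = source.clone();
    posterised.paletteMap(posterised, posterised.rect, new Point(), reds, greens, blues);

    var centres:Vector.<Point> = new Vector.<Point>();

    // 4 x 4 x 4 = 64 candidate colours after posterisation.
    for (var r:uint = 0; r < 256; r += 64)
    {
        for (var g:uint = 0; g < 256; g += 64)
        {
            for (var b:uint = 0; b < 256; b += 64)
            {
                var colour:uint = 0xFF000000 | (r << 16) | (g << 8) | b;
                var bounds:Rectangle = posterised.getColorBoundsRect(0xFFFFFFFF, colour, true);
                if (bounds.width > 0 && bounds.height > 0)
                {
                    centres.push(new Point(bounds.x + bounds.width * 0.5,
                                           bounds.y + bounds.height * 0.5));
                }
            }
        }
    }

    posterised.dispose();
    return centres;
}
```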
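And the gradient map step is mostly about building the lookup tables for `paletteMap` – roughly like this (the two colours and the luminance weights here are illustrative defaults, not what the Color Grading filter actually uses):

```actionscript
import flash.display.BitmapData;
import flash.filters.ColorMatrixFilter;
import flash.geom.Point;

// Sketch of a Photoshop-style gradient map: desaturate, then remap luminance
// onto a dark -> light colour ramp via paletteMap.
function gradientMap(target:BitmapData, dark:uint = 0x101840, light:uint = 0xFFE8A0):void
{
    // Desaturate first so a single channel can drive the lookup.
    var lum:ColorMatrixFilter = new ColorMatrixFilter([
        0.3, 0.59, 0.11, 0, 0,
        0.3, 0.59, 0.11, 0, 0,
        0.3, 0.59, 0.11, 0, 0,
        0,   0,    0,    1, 0]);
    target.applyFilter(target, target.rect, new Point(), lum);

    // One 256-entry table maps luminance to a colour on the gradient; the other
    // two channel tables are zeroed so they add nothing to the result.
    var reds:Array = [], greens:Array = [], blues:Array = [];
    for (var i:int = 0; i < 256; i++)
    {
        var t:Number = i / 255;
        var r:uint = uint((dark >> 16 & 0xFF) + t * ((light >> 16 & 0xFF) - (dark >> 16 & 0xFF)));
        var g:uint = uint((dark >> 8 & 0xFF)  + t * ((light >> 8 & 0xFF)  - (dark >> 8 & 0xFF)));
        var b:uint = uint((dark & 0xFF)       + t * ((light & 0xFF)       - (dark & 0xFF)));
        reds[i] = 0;
        greens[i] = (r << 16) | (g << 8) | b;   // full RGB carried by the green table
        blues[i] = 0;
    }

    target.paletteMap(target, target.rect, new Point(), reds, greens, blues);
}
```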
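Finally, the plotting pass of the Machine Vision filter, stripped of the triangulation itself – that part is Nicoptere's class; the flat vector of three points per triangle below is my assumption for the sketch, not his actual API, and the line colour is arbitrary:

```actionscript
import flash.display.Shape;
import flash.geom.Point;

// Sketch of the wireframe plotting only: `triangles` is assumed to hold three
// vertices per triangle, `points` are the tracked points to mark on top.
function plotWireframe(canvas:Shape, triangles:Vector.<Point>, points:Vector.<Point>):void
{
    canvas.graphics.clear();
    canvas.graphics.lineStyle(1, 0x66FF99, 0.6);

    // Draw each triangle as a closed outline.
    for (var i:int = 0; i + 2 < triangles.length; i += 3)
    {
        canvas.graphics.moveTo(triangles[i].x, triangles[i].y);
        canvas.graphics.lineTo(triangles[i + 1].x, triangles[i + 1].y);
        canvas.graphics.lineTo(triangles[i + 2].x, triangles[i + 2].y);
        canvas.graphics.lineTo(triangles[i].x, triangles[i].y);
    }

    // Mark the tracked points themselves.
    for each (var p:Point in points)
    {
        canvas.graphics.drawCircle(p.x, p.y, 2);
    }
}
```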
Other thanks go to Mr. Doob for his stats widget, Bit-101 for the Minimal Comps and [SubBlue](http://www.subblue.com/blog) for lots of inspiring technical discussions during lunch breaks at [tictoc](http://www.tictocfamily.com).
Feel free to leave any comments, questions and suggestions – I am really interested in what you think. You can also follow my [blog updates](http://twitter.com/blog2t) on Twitter or RSS. It's getting very late now, so I'd better go.
**UPDATE: I am giving up, it's just too hard to track a human's head – I'm gonna do my next experiments with chickens:**
**UPDATE 2:**
If you liked this experiment, make sure you see the [new version](http://play.blog2t.net/realtime-as3-face-and-eye-detection-with-michael-jackson/).
11:53 PM | posts | 5 Comments | Tags: tracking, experimental, fx, as3, pixelbender, twitter, hislope
June 17, 09