How do robots see the world? How do they gather meaning from our streets, cities, media and from us?

This is an experiment in found machine-vision footage, exploring the aesthetics of the robot eye.

Hat tip to:

Uses material from the following sources:

Line Queueing Analysis –
Tracking in a Parking Lot –
Vehicle classification –
Video Analytics Identifies Tailgating –
High density crowd tracking –
Mono-Camera based Road Marking and Lane Detection – Vacek, Dillmann 2007 –
Human Tailgating –
Crowd Analysis and Tracking –
Exit Lane Analysis –
Tunnel Intrusion Detection –
Traffic Counting and Congestion –
Car Counting –
IriSyS IRC People Counting Cameras –
Eye-Tracking of Outdoor Advertising –
iOnRoad Demo –
Eye Tracking by SMI: Bee Swarm TV Commercial –
Eyetrack and Heatmap using Computer Vision based Human Visual Attention model vs Real Eyetracking study – catalogue Carrefour –
Real Time Face Tracking with pose estimation on tv clips – OPENCV –
Eye Tracking, Gaze Tracking –
Road / Traffic Sign Recognition –
Traffic signs detection and recognition –
Real Time Pedestrians Tracking with MOTION DETECTOR – OPENCV –
Face tracking stereo system video sample 1/3 –
India Driving – Computer Vision Challenge –
Vision based Navigation and Localization –
ENCARA2 (Face detection), 2008 –
Face detection v6 – TV clips –
Multiple car tracking with blob tracking & MHT –
CellTracker: program for automated cell tracking on biological images –
Optical flow demo –
Choppy output 2 –
Face Tracking with OpenCV –

“Cold Summer Landscape” by Blear Moon

Cast: Timo

Tags: robot readable world, ai, computer vision, analysis, facial recognition, motion tracking, technology, complexity, systems, newaesthetic, machine vision, face tracking, cameras, algorithms, robots, bots, seeing, sight and looking

Feb 06, 2012 12:02 by lhli.