Using Time-of-Flight Cameras for 3D Sensing
There was a great deal of buzz this past year around various reality-based applications: augmented reality, virtual reality, immersive reality, haptics and domotics, digital-out-of-home (DOOH), and so on. This whole reality-based phenomenon can be traced back to the era of simple game consoles. These started with wired joysticks, then transitioned to wireless position-type sensors (e.g., the Wii), where the controller senses position through its orientation and acceleration in space. We have also been exposed to a number of different sensing technologies: laser-based, infrared-based, combinations of infrared and laser, laser and RGB, optical sensors, and more. Some of these have found their way into both consumer and commercial use. Take gaming as an example: Microsoft's Kinect is quite popular in the gaming space today, with reported sales in the billions of dollars. Non-gaming markets (hospitality, entertainment, retail, airports, interactive kiosks, etc.) have also been reported to show double-digit growth, all the way to 2015. Let's look at time-of-flight (TOF) based sensors.
Time-of-flight (TOF) is one of the simplest and oldest sensing technologies around. Many instruments today use this technique: spectrometry devices, ultrasonic flow meters, optical flow meters, even gesture-control devices. The concept of TOF is simple. A TOF-based camera emits infrared light (much like a TV remote control) that hits a target. The light then bounces back to the sensor, which measures how long the light took to travel out and back, and from that time computes the distance to the target.
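The underlying arithmetic is just the speed of light times the round-trip time, halved. A minimal sketch (the function name and the 10 ns sample value are illustrative, not from any particular sensor's API):

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the target from the round-trip travel time of a light pulse.

    The pulse travels out and back, so the one-way distance is half
    the total path: d = c * t / 2.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after 10 nanoseconds implies a target roughly 1.5 m away.
print(round(tof_distance(10e-9), 3))  # → 1.499
```

Note the timescales involved: resolving centimeter-level depth differences means resolving sub-nanosecond timing differences, which is why practical TOF cameras rely on specialized sensor hardware rather than general-purpose timers.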
Each pixel of the sensor captures the infrared light bouncing off the target, and this per-pixel distance data is then extrapolated (via middleware) to produce the target's equivalent avatar. See above.