The Microsoft Kinect is a novel input interface for the Xbox 360 game console and a perfect example of an embedded vision system. Eight million Kinect units were sold within just two months of launch, which shows the power of embedded vision systems.
Embedded vision can be defined as a microcontroller-based system that incorporates a vision sensor (e.g. a camera) and is able to understand its environment through that sensor. A digital camera is a microcontroller-based system that contains a vision sensor, and its output is pictures; but a camera is incapable of interpreting the pictures it takes, so a digital camera is NOT an embedded vision system. A System on Chip (SoC), Graphics Processing Unit (GPU), Digital Signal Processor (DSP) or Field Programmable Gate Array (FPGA) can be used in place of a microcontroller, but a general-purpose personal computer is strictly ruled out. Smartphones, tablet computers and surveillance systems can be upgraded into embedded vision systems.
Applications:
- To find a child struggling in a swimming pool.
- To find intruders.
- To detect whether a lane change has occurred and, if it has, warn the driver of the automobile.
An embedded vision system will carry out the following three functions:
1. Image acquisition and optimization
- Noise reduction, image stabilization and colour space correction
- The outcome of the optimization stage need not be aesthetically pleasing pictures, but they should be easily processable by the later stages.
2. Building objects through pixels
- First-level operations used are image filtering, Haar filters, edge detection, histograms, optical flow, erosion and dilation, and thresholding.
- Second-level operations used are connected-component labelling, contour tracing, clustering and the Hough transform (a small sketch of these two stages follows this list).
3. Object analysis and interpretation
- Object movement tracking, classification and obstacle detection
- Kalman filters as predictive filters, hidden Markov models, correlation, finite-state models and neural networks (a tracking sketch also follows this list)

All the above operations are computation intensive, and extensive DSP algorithms are also used.
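To make stages 1 and 2 concrete, here is a minimal sketch in Python using OpenCV. The synthetic test frame, the 5x5 blur, Otsu thresholding and the 3x3 morphology kernel are my own illustrative choices, not anything prescribed by the article.

```python
import cv2
import numpy as np

# Synthetic noisy test frame (a bright disc on a dark background),
# standing in for a real camera frame.
frame = np.zeros((120, 160), np.uint8)
cv2.circle(frame, (80, 60), 20, 255, -1)
noise = np.random.randint(0, 40, frame.shape, dtype=np.uint8)
frame = cv2.add(frame, noise)

# Stage 1: acquisition and optimization (here, simple noise reduction).
denoised = cv2.GaussianBlur(frame, (5, 5), 0)

# Stage 2, first-level operations: thresholding, then erosion and
# dilation to clean up the binary mask.
_, mask = cv2.threshold(denoised, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
kernel = np.ones((3, 3), np.uint8)
mask = cv2.dilate(cv2.erode(mask, kernel), kernel)

# Stage 2, second-level operation: connected-component labelling groups
# the foreground pixels into candidate objects.
num_labels, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
for i in range(1, num_labels):          # label 0 is the background
    x, y, w, h, area = stats[i]
    print(f'object {i}: bounding box ({x},{y},{w},{h}), area {area}')
```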
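For stage 3, here is a sketch of how a Kalman filter can act as a predictive tracker for an object's centroid. It is written in plain NumPy rather than any particular vision library, and the constant-velocity motion model, noise covariances and sample measurements are assumptions made only for illustration.

```python
import numpy as np

dt = 1.0                                  # time step between frames
F = np.array([[1, 0, dt, 0],              # state transition: x, y, vx, vy
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], float)
H = np.array([[1, 0, 0, 0],               # only position (x, y) is measured
              [0, 1, 0, 0]], float)
Q = 0.01 * np.eye(4)                      # process noise (assumed)
R = 4.0 * np.eye(2)                       # measurement noise (assumed)

x = np.zeros(4)                           # initial state estimate
P = np.eye(4)                             # initial covariance

def kalman_step(x, P, z):
    # Predict where the object moved, then correct with the measurement z.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(4) - K @ H) @ P_pred
    return x_new, P_new

# Feed in a few noisy centroid measurements of an object drifting right.
for z in [np.array([10.0, 50.0]), np.array([12.1, 49.8]), np.array([13.9, 50.2])]:
    x, P = kalman_step(x, P, z)
    print('estimated position:', x[:2], 'estimated velocity:', x[2:])
```

The predicted state lets the system keep following an object even when a measurement is momentarily missing or noisy, which is why Kalman filters are listed among the predictive tools above.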
The Embedded Vision Alliance is an organisation that looks into every aspect of embedded vision. Its website address is http://www.embeddedvision.com/ . On the website, go to 'Industry Analysis' and then to 'Market Analysis'; this section seems to be very informative. The 'News' section gives a reasonable amount of information, while the 'Technical Articles' section needs registration. Most of the website content directs us to where information is available rather than providing it directly. There is no advertisement section, and the website has a professional look. It is worth visiting the site.
The following link shows how much importance IEEE gives to embedded vision technology.
A note on DSP
Linear filtering is a convolution operation. After the advent of the Fast Fourier Transform (FFT), it became desirable to transform the signal into the frequency domain, multiply it with the desired frequency response (in the time domain this is the impulse response), and transform the result back to the time domain. If the signal is an image, it is transformed to the spatial-frequency domain and, after the multiplication, the resultant image is converted back to the spatial domain. The FFT was proposed by Cooley and Tukey in 1965, but their presentation was very mathematical. In 1967 Tom Stockham and Charlie Rader gave a flow-graph representation for the FFT; I think it is what is called the 'butterfly diagram' nowadays.
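As a small illustration of the idea in that note, the sketch below (plain NumPy, a 1-D signal for brevity; the signal and moving-average filter are just examples I chose) filters a signal once by direct convolution and once by multiplying in the frequency domain, and shows the two results agree. The same approach extends to images using np.fft.fft2 and np.fft.ifft2.

```python
import numpy as np

# A noisy sinusoid and a short moving-average impulse response.
rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 4 * np.pi, 256)) + 0.2 * rng.standard_normal(256)
h = np.ones(8) / 8.0

# Direct linear convolution in the time domain.
y_direct = np.convolve(x, h)

# Same filter via the FFT: zero-pad both to the full output length,
# multiply in the frequency domain, and transform back.
n = len(x) + len(h) - 1
y_fft = np.fft.ifft(np.fft.fft(x, n) * np.fft.fft(h, n)).real

print('max difference:', np.max(np.abs(y_direct - y_fft)))   # ~1e-15
```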
Courtesy:
- Eye Robot: Embedded Vision, the Next Big Thing in DSP, by Brian Dipert and Amit Shoham, IEEE Solid-State Circuits Magazine, Vol. 4, No. 2, Spring 2012. [doi: 10.1109/MSSC.2012.2193077]
- The note on DSP is from page 36 of the same magazine issue.
- Special thanks to Mr. B. Srinath