Interpreting Sensory Data

Eye: We developed biometrics software to identify a person based on a scan of their retina.

A wide variety of tasks centers on interpreting data from a sensor or set of sensors. Sometimes the data must be interpreted in real time, as when monitoring ECG data for signs of heart problems, recognizing speech in a dialogue system, or running computer vision for a robot on a factory floor. Other applications allow the data to be interpreted offline, such as finding tumors in PET scans or searching satellite images for signs of enemy troops.

Data from any sensor must be interpreted before the sensor is of any use, and sometimes that interpretation is trivial. However, cameras, microphones, sonar, radar, EEG, MRI, and many other kinds of sensors produce so much data that interpreting it automatically is rarely trivial. Fortunately, researchers in artificial intelligence have been developing methods for interpreting these and many other kinds of sensors, and this work has paid off. For a broad variety of problems, techniques now exist to interpret such sensor data reliably without human intervention, often more reliably than humans, much faster, and at far lower cost.
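To make the idea of automated interpretation concrete, here is a minimal sketch of one of the simplest such techniques: flagging anomalous readings in a sensor stream using a rolling mean and standard deviation. The signal values, window size, and 3-sigma threshold are illustrative assumptions, not taken from any particular deployed system.

```python
# Minimal sketch: flag readings that deviate sharply from the recent
# average, as a stand-in for "automated sensor-data interpretation".
# Window size and 3-sigma threshold are illustrative assumptions.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(samples, window=20, threshold=3.0):
    """Yield (index, value) for samples far from the recent average."""
    recent = deque(maxlen=window)
    for i, x in enumerate(samples):
        if len(recent) == window:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(x - mu) > threshold * sigma:
                yield i, x
        recent.append(x)

# Example: a steady baseline with one injected spike at index 30.
signal = [1.0, 1.1, 0.9] * 10 + [9.0] + [1.0] * 5
print(list(detect_anomalies(signal)))  # flags the spike: [(30, 9.0)]
```

Real systems replace the rolling-threshold rule with learned models, but the structure is the same: a stream of raw readings goes in, and a much smaller stream of meaningful events comes out.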

If you have a problem that involves interpreting sensory data, please contact us about it.