Researchers at MIT have taught artificial intelligence to identify people through solid objects using radio signals that reflect off surfaces and bend around corners. The technology could find uses in military, policing, healthcare, disaster relief, and augmented-reality gaming.
They started by compiling thousands of images of people walking, talking, sitting, and opening doors, recorded simultaneously with a wireless device and a camera. The process resembles standard machine learning, in which programmers feed a system thousands, if not millions, of examples, photos or otherwise, so it can distill a general pattern to recognize.
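The pairing described above can be sketched in miniature. This is an illustrative toy only, not the researchers' actual pipeline (which uses deep neural networks); the names `rf_frames`, `camera_labels`, and the nearest-neighbor stand-in for a trained model are all assumptions made for the example.

```python
import math

def build_training_pairs(rf_frames, camera_labels):
    """Pair each radio frame with the label the camera recorded at the
    same instant: the camera supplies ground truth, the RF frame is input."""
    return list(zip(rf_frames, camera_labels))

def predict(pairs, rf_frame):
    """Classify a new RF frame by its nearest training example,
    a crude stand-in for the learned association."""
    return min(pairs, key=lambda p: math.dist(p[0], rf_frame))[1]

# Toy data: two RF "feature vectors" labeled by what the camera saw.
pairs = build_training_pairs([[0.1, 0.9], [0.8, 0.2]],
                             ["walking", "sitting"])
print(predict(pairs, [0.15, 0.85]))  # closest to the "walking" example
```

Once enough such pairs are collected, the camera can be removed and the model queried with radio data alone, which is what makes through-wall sensing possible.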
This particular achievement, however, was made by virtue of neuromorphic computing, a concept related to machine learning but more concerned with actively simulating the human brain in its more interconnected, non-binary way of processing information.
Using these collected reservoirs of data, the system learns the association between certain radio signals and the human body, which it portrays as stick figures. That is a far cry from a full depiction of the human form, but a startling achievement nonetheless.
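A stick figure is essentially a set of predicted joint coordinates connected by line segments. The sketch below shows one plausible representation; the joint names and skeleton connectivity here are hypothetical simplifications, not the system's actual output format.

```python
# Hypothetical joint list; the real system predicts a richer skeleton.
JOINTS = ["head", "neck", "l_shoulder", "r_shoulder",
          "l_hip", "r_hip", "l_knee", "r_knee"]

# "Bones" connecting pairs of joints form the stick figure.
BONES = [("head", "neck"), ("neck", "l_shoulder"), ("neck", "r_shoulder"),
         ("neck", "l_hip"), ("neck", "r_hip"),
         ("l_hip", "l_knee"), ("r_hip", "r_knee")]

def stick_figure(keypoints):
    """Turn predicted joint coordinates into drawable line segments."""
    return [(keypoints[a], keypoints[b]) for a, b in BONES]

# Toy pose: arbitrary 2-D coordinates standing in for model predictions.
pose = {j: (i * 0.1, 1.0 - i * 0.1) for i, j in enumerate(JOINTS)}
segments = stick_figure(pose)
print(len(segments))  # one segment per bone
```

The point of the reduced representation is that it preserves posture and motion while discarding identifying surface detail the radio signal cannot recover anyway.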
The neuromorphic device can even grasp postural subtleties and identify a person by their gait, a particularly useful feature for intelligence agencies both known and unknown.
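Gait recognition rests on the fact that walking is periodic and the rhythm differs between people. A minimal sketch, assuming a one-dimensional motion signal extracted from the radio data (the signal, threshold, and method here are illustrative, not the system's actual algorithm):

```python
import math

def gait_period(signal, threshold=0.0):
    """Estimate stride period as the mean gap between upward
    threshold crossings of a toy motion signal."""
    crossings = [i for i in range(1, len(signal))
                 if signal[i - 1] < threshold <= signal[i]]
    gaps = [b - a for a, b in zip(crossings, crossings[1:])]
    return sum(gaps) / len(gaps) if gaps else None

# Synthetic "gait" signal: one stride every 8 samples.
signal = [math.sin(2 * math.pi * i / 8) for i in range(32)]
print(gait_period(signal))
```

In practice a whole profile of such rhythm and posture features, rather than a single period, would distinguish one walker from another.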
Since the technology can be linked up wirelessly, some have proposed using it in advanced warfare. A soldier or special operative might wear augmented-reality glasses that let them peer through walls and gain the upper hand; a firefighter could do the same. The technology is clearly malleable and diverse in how it could potentially be used.
The next step for researchers is improving the device to portray the full spectrum of human anatomy more accurately, so that it could potentially even identify nervous, injured, or violent people through subtle indicators such as shaking hands or particular gestures.