NodeNs have developed a ‘smart’, centimetre-precise, millimetre-wave real-time locating sensor that incorporates novel AI capabilities for active monitoring. The sensor includes ‘edge-AI’ technologies on the embedded chip, with data transmitted via Wi-Fi to the hospital command centre / IT system. The plug-and-play sensor simply plugs into a mains outlet and, unlike current systems, requires no rewiring or ward closures. It is novel in its ability to track users passively, without the use of wearables. This can be used for: room utilisation, patient flow optimisation, fall detection, and monitoring of heart / breathing rates.
Millimetre-waves (mmWaves) are the foundation of our sensors, as well as of many other future technologies, such as next-generation Wi-Fi (known as WiGig), WirelessHD, 5G mobile networks, and self-driving cars.
But what are mmWaves? These are electromagnetic waves with frequencies of roughly 30–300 GHz; i.e. they are really, really high frequencies. In comparison, current mobile phones (4G LTE) and Wi-Fi operate at a maximum of around 5 GHz. The much higher frequency means far more available bandwidth and much shorter wavelengths, so mmWaves allow much, much faster wireless speeds, better accuracy, and smaller devices.
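The ‘millimetre’ in mmWave refers to the wavelength, which shrinks as frequency rises (λ = c / f). A quick sketch of the arithmetic, purely illustrative (the 60 GHz figure is just a typical mmWave band, not a NodeNs specification):

```python
# Free-space wavelength: lambda = c / f.
# Illustrative comparison of a 5 GHz Wi-Fi carrier with a 60 GHz mmWave carrier.
C = 299_792_458  # speed of light, m/s

def wavelength_mm(freq_ghz: float) -> float:
    """Return the free-space wavelength in millimetres."""
    return C / (freq_ghz * 1e9) * 1000

wifi_wavelength = wavelength_mm(5)     # ~60 mm
mmwave_wavelength = wavelength_mm(60)  # ~5 mm -- hence 'millimetre' wave
```

The shorter wavelength is why mmWave antennas can be so small, and why several of them fit on a single chip-scale sensor.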
And in comparison to optical or infrared technologies (IR, LIDAR, cameras), mmWaves can scan through smoke, steam, and other obstructions, allowing operation in a far greater range of environments. They also offer far greater privacy, as there is no camera recording anyone’s images.
One of the underpinning technologies behind our sensors is a technique called Multiple-Input Multiple-Output (MIMO) radar. MIMO makes use of antenna arrays, together with increasingly powerful and affordable processing hardware, to vastly improve the performance of wireless devices.
As a technology, MIMO is a key component of recent data transfer protocols, vastly improving the speeds of the latest Wi-Fi and mobile technologies. It has also very recently been applied to radar, enabling ultra-high-resolution location tracking at a fraction of the cost previously possible.
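The reason MIMO radar is so cost-effective is the ‘virtual array’ effect: N transmit and M receive antennas behave like N × M virtual antenna elements, so angular resolution improves without adding physical hardware. A minimal sketch, assuming a standard uniform MIMO layout and the common rule of thumb that a half-wavelength array of N elements resolves roughly 2/N radians (the 3 × 4 antenna count below is illustrative, not a NodeNs specification):

```python
import math

def virtual_positions(ntx: int, nrx: int, d: float = 0.5) -> list[float]:
    """Virtual element positions (in wavelengths) for a standard MIMO layout:
    receivers spaced d apart, transmitters spaced nrx * d apart. Each TX/RX
    pair contributes one virtual element at the sum of the two positions."""
    rx = [i * d for i in range(nrx)]
    tx = [j * nrx * d for j in range(ntx)]
    return sorted(t + r for t in tx for r in rx)

def angular_resolution_deg(n_elements: int) -> float:
    """Rule-of-thumb beamwidth of a half-wavelength-spaced uniform array."""
    return math.degrees(2 / n_elements)

vpos = virtual_positions(ntx=3, nrx=4)   # 12 virtual elements from 7 antennas
res_physical = angular_resolution_deg(4)   # ~28.6 degrees with 4 RX alone
res_virtual = angular_resolution_deg(12)   # ~9.5 degrees with the virtual array
```

Seven physical antennas thus deliver the angular resolution of a twelve-element array, which is exactly the kind of cost saving the text refers to.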
How will the ultra-high-resolution radar data help track people and assets? We are developing novel algorithms to give our sensors ‘smart’ capabilities, enabling them to give context to the scenes they capture and to make decisions without human input.
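To make the idea of sensor-side decision-making concrete, here is a deliberately simplified toy rule, not NodeNs’ actual algorithm: flag a possible fall when a tracked person’s estimated height drops sharply and stays low. All names and thresholds are hypothetical; a real system would use learned models over much richer radar features.

```python
def detect_fall(heights_m: list[float], drop: float = 0.8,
                low: float = 0.4, hold: int = 3) -> bool:
    """Toy heuristic: heights_m is a per-frame estimate of one tracked
    person's centroid height in metres. Flag a fall if the height drops
    by at least `drop` metres in one frame and then stays below `low`
    metres for `hold` consecutive frames."""
    for i in range(1, len(heights_m) - hold + 1):
        sudden = heights_m[i - 1] - heights_m[i] >= drop
        stays_low = all(h <= low for h in heights_m[i:i + hold])
        if sudden and stays_low:
            return True
    return False

walking = [1.0, 1.0, 0.9, 1.0, 1.0]   # normal movement: no alert
fall = [1.0, 1.0, 0.2, 0.2, 0.2]      # sharp sustained drop: alert
```

Even this crude rule illustrates the pattern: the sensor itself turns raw location data into an actionable event, with no camera feed and no human watching a screen.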