A primary component of such a system is non-intrusive implantable sensors that can gather important body statistics. As part of a cyber-physical system initiative, we are collaborating with nano-electrical engineers at the University of Arkansas, Fayetteville to build such an infrastructure. In particular, we draw on the resources of the High Density Electronics Center to design and fabricate small, non-intrusive bio-nano sensors for healthcare diagnostics.
We are addressing several issues that will make such a cyber-physical system practical. To detect anomalies in real time, we need to transfer and process large volumes of health monitoring data. We are building surrogate devices that download data from nano-sensors and stream it to our back-end server for processing. Further, we are designing efficient machine learning algorithms that can process large volumes of data to detect anomalies.
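To make the real-time anomaly detection step concrete, the sketch below shows one simple approach: flag any streamed reading that deviates sharply from a rolling baseline. This is a minimal illustration only; the window size, z-score threshold, and class name are assumptions for the example, not the algorithms used in the actual system.

```python
from collections import deque
from math import sqrt

class StreamingAnomalyDetector:
    """Flags readings that deviate sharply from a rolling baseline
    of recent values, using a simple z-score test."""

    def __init__(self, window=50, threshold=3.0):
        # Illustrative defaults, not tuned system parameters.
        self.window = deque(maxlen=window)
        self.threshold = threshold

    def is_anomaly(self, reading):
        if len(self.window) < self.window.maxlen:
            # Still warming up the baseline; treat as normal.
            self.window.append(reading)
            return False
        mean = sum(self.window) / len(self.window)
        var = sum((x - mean) ** 2 for x in self.window) / len(self.window)
        std = sqrt(var) or 1e-9  # guard against a zero-variance window
        self.window.append(reading)
        return abs(reading - mean) / std > self.threshold
```

Because the detector keeps only a fixed-size window, it runs in constant memory per sensor stream, which matters when the back end ingests data from many devices at once.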
Another important design consideration specific to the healthcare application domain is security. Since the data is highly confidential and both the surrogate and sensor devices are computationally weak platforms, we are working on lightweight security protocols for transferring data from the sensor to the cloud.
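One common lightweight building block for such protocols is symmetric-key message authentication, which avoids expensive public-key operations on the constrained device. The sketch below illustrates the general idea with HMAC-SHA256; the pre-shared key and payload format are hypothetical, and this is not the protocol under development.

```python
import hmac
import hashlib

# Hypothetical pre-shared key; in practice it would be securely
# provisioned on both the sensor and the cloud endpoint.
SHARED_KEY = b"example-preshared-key"

def tag_payload(payload: bytes) -> bytes:
    """Append an HMAC-SHA256 tag so the server can verify the
    payload's integrity and origin."""
    tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()
    return payload + tag

def verify_payload(message: bytes):
    """Return the payload if the tag checks out, else None."""
    payload, tag = message[:-32], message[-32:]
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()
    # compare_digest avoids leaking information via timing.
    return payload if hmac.compare_digest(tag, expected) else None
```

Authentication alone does not provide confidentiality; a full protocol would also encrypt the payload, but the symmetric primitive keeps the per-message cost low enough for weak hardware.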
Quadriplegia and paraplegia are common disabilities that result from spinal cord injuries, neuromuscular disorders such as cerebral palsy and multiple sclerosis, and strokes. Patients suffering from quadriplegia have varying levels of impaired motor movement; many are unable to speak or breathe autonomously and experience loss of sensation. Performing quotidian tasks like controlling home appliances is challenging for quadriplegics. Gesture recognition is a plausible remedy, but available solutions use obtrusive sensors and often assume considerable limb movement, making them inapplicable to paralysis patients. To address this problem, we present the design, implementation, and evaluation of a multi-sensor gesture recognition system that uses non-intrusive wearable sensors and requires minimal limb movement. We have designed an EOG-based headband built from textile electrodes and a wearable glove that uses flex sensors and an accelerometer to infer eye and hand gestures. The gestures are used to control appliances remotely in a home setting. We present simple and robust gesture recognition and sensor fusion algorithms that accurately detect gestures on computationally weak end-user devices such as smartphones and microcontrollers. We have prototyped and thoroughly evaluated the gesture-based home automation system, and show that it has good accuracy, latency, and energy consumption characteristics.
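The kind of simple, robust recognition suited to weak end-user devices can be illustrated with a rule-based classifier over glove readings. The gesture names, sensor scaling, and thresholds below are purely illustrative assumptions, not the system's actual gesture vocabulary or fusion algorithm.

```python
def classify_hand_gesture(flex, accel):
    """Toy rule-based classifier over wearable-glove readings.

    flex:  list of normalized flex-sensor bends
           (0.0 = finger straight, 1.0 = fully bent)
    accel: (x, y, z) accelerometer reading in g

    All thresholds are illustrative, chosen for this sketch only.
    """
    bent = sum(1 for f in flex if f > 0.6)
    x, y, z = accel
    if bent == len(flex):
        return "fist"
    if bent == 0 and z > 0.8:
        # Palm roughly horizontal, facing up.
        return "open-palm-up"
    if bent == 0:
        return "open-palm"
    return "unknown"
```

Threshold rules like these need no floating-point-heavy training or inference, so they fit comfortably on a microcontroller; the trade-off is that thresholds must be calibrated per user.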