Wearables: Tracking, Computing and Making Sense of Context Awareness
Wearable devices are moving quickly from concept to reality based on evolving technologies such as sensor fusion, communications and vision. Wearables have evolved into perceptually aware agents that learn from users’ experiences, activities and environment through observation via diverse physical sensors. Bluetooth connectivity and low-power wireless communication are driving demand for wireless wearable devices. In fact, Juniper Research has predicted that global retail revenue from smart wearable devices will reach $53.2 billion by 2019.*
The design and development of wearables is not only making use of existing technology, but also prompting the development and evolution of others. For example, wearables require ever-smaller components, lower power consumption and an ever-increasing number of embedded features. MEMS sensors, as a result, are becoming even more highly integrated with multiple sensors such as an accelerometer, gyroscope and magnetometer in a single package.
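As a simple illustration of how readings from an integrated MEMS package can be fused, a complementary filter blends a gyroscope's short-term accuracy with an accelerometer's long-term gravity reference to estimate orientation. A minimal sketch; the blend factor and axis conventions here are illustrative assumptions, not values from any particular part:

```python
import math

def complementary_filter(pitch, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into a pitch estimate.

    pitch      -- previous pitch estimate (radians)
    gyro_rate  -- angular rate about the pitch axis (rad/s)
    accel_x/z  -- accelerometer readings used as a gravity reference
    alpha      -- blend factor: trust the gyro short-term, accel long-term
    """
    gyro_pitch = pitch + gyro_rate * dt          # integrate the gyro rate
    accel_pitch = math.atan2(accel_x, accel_z)   # gravity-derived pitch
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch
```

Run once per sample period: the gyro term tracks fast motion without accelerometer noise, while the small accelerometer weight slowly corrects the drift that pure gyro integration accumulates.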
It’s not just individual sensors that make up wearable technology. Algorithms, camera technologies and body area networks (BANs), collections of miniature sensor and actuator nodes, are also integral to the development of wearable devices. Battery capacities need to expand for “always on” use, and system designs must focus on overall power consumption while being mindful of security, privacy, precision and data storage. Sensor fusion solutions are evolving to feature greater embedded power management with minimal overhead, enabling better performance in wearables.
Dynamic wearable models must recognize when an individual has put on the device, and perhaps more importantly, that individual’s location. Technologies involved include GPS and satellite positioning, mountable sensor boards and vision algorithms. Measurements span the physical environment, time of day, shapes, colors, texture, and temperature, as well as the person’s mental state and the presence of liquids and light.
Applications — From Digestible Devices to Disaster Management
To date, health and fitness applications dominate the wearables industry, but entertainment applications are gaining momentum and perceived importance. Applications will expand to include:
- Wrist-based devices such as the Moto 360 smart watch (running Android Wear) and the Samsung Gear Fit primarily provide fitness tracking now, but are expected to soon incorporate photo tracking, notifications and journaling capabilities.
- In the automotive arena, key fobs can now authenticate users and start an engine. Soon, wearable devices are expected to be able to provide hazard data as well as information on the position of a car, door lock status and fuel levels.
- Smart fabrics and high-tech clothing with embedded solar panels will be capable of charging smartphones and other wearable devices.
- Wireless body area networks (WBANs) will be used for disaster management, employee safety in roadside and construction work environments, mobile health monitoring and ambient assisted living apps that detect negative changes in resident behavior.
- Motion tracking using accelerometers, gyroscopes and other motion sensors will increasingly find its way into activity monitors, fitness devices, pedometers, golf and tennis swing analysis tools and sports kinetics applications.
- Additionally, recently published Apple patent filings describe a system of interconnected sensors — including wearable devices — that work with an iPhone 5 to monitor activity levels, dynamically set or cancel alarms and manage push notification settings, among other automated tasks.
- Future applications of wearables are also predicted to include ingestible devices, context aware computing, military mission reporting, and greater use in disability and impairment applications to track status and improvement.
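Motion tracking of the kind described above often starts with something as simple as peak detection on the accelerometer magnitude. A minimal pedometer sketch; the 11 m/s² threshold is an illustrative assumption, and real devices add filtering and adaptive thresholds:

```python
import math

def count_steps(samples, threshold=11.0):
    """Count steps from raw 3-axis accelerometer samples (m/s^2).

    A step is registered each time the acceleration magnitude crosses
    `threshold` from below -- a deliberately simple peak detector.
    """
    steps = 0
    above = False
    for ax, ay, az in samples:
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        if mag > threshold and not above:
            steps += 1          # rising edge: one footfall
            above = True
        elif mag <= threshold:
            above = False       # re-arm for the next peak
    return steps
```

The rising-edge latch (`above`) prevents one sustained peak from being counted as several steps, which is why the detector tracks crossings rather than raw threshold exceedance.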
As wearable devices grow in popularity, research continues into how to best obtain, calculate and assess the sensor measurements involved. Users require wearable devices that are minimally obtrusive, yet reliable, and that best record movement or physiological signals. Efforts continue to develop a system, or systems, that gather data from multiple wearable sensors and interpret that data in a meaningful way. And engineers continue to attempt to design and implement algorithms that can extract relevant information from the data recorded by these devices.
Context awareness, the detection of the internal or external state of the user, enables a wearable computer to perceive its surroundings and modify its behavior accordingly, interpreting both physical and biochemical signals. It is one of the most important aspects of wearable technology. As the technology develops and more applications are discovered, it becomes critical, for example, that users are sensed near a lake and not in it, or that winter coats are not recommended for wearable users who are in Phoenix on a summer day.
The technology enabling context awareness in wearable devices includes wireless, ambient intelligence, user interfaces, powerful search engine capabilities, power management, software, mobile computing and myriad perceiving and data-collecting sensors. Added to this list are such human factor enablers as emotional state, biophysiological condition, goals and social interaction that, when combined with the technological factors, provide the potential for a meaningful and individualized experience.
The key to creating context-aware applications rests on effective context sensing and modeling systems powerful enough to perform accurately, yet simple enough to be implemented and operate under tight resource constraints.
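Under tight resource constraints, context sensing often reduces to inexpensive rules over a few fused signals rather than heavyweight machine learning. A hypothetical sketch, where the sensor inputs and thresholds are assumptions chosen purely for illustration:

```python
def classify_context(light_lux, motion_variance, hour):
    """Crude rule-based context model for a resource-constrained wearable.

    light_lux       -- ambient light level from a photodiode
    motion_variance -- variance of recent accelerometer magnitudes
    hour            -- local hour of day (0-23)
    Thresholds are illustrative placeholders, not tuned values.
    """
    if motion_variance > 2.0:
        return "active"                      # sustained movement detected
    if light_lux < 10 and (hour >= 22 or hour < 6):
        return "sleeping"                    # dark and nighttime
    return "idle"
```

A table of rules like this costs almost nothing in memory and power, which is why it remains a common first tier even in devices that escalate ambiguous cases to a more capable model.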
Current and Future Challenges
The challenges are immense for wearable designers. Managing power consumption and communications has been the greatest challenge to date. Embedding the sensors requires every component to be smaller and thinner, necessitating chip-scale packaging that replaces QFN (quad-flat no-leads) and LGA (land grid array) packages in several applications.
Today's fitness trackers still miscalculate speed and distance. Accuracy matters, whether monitoring sleep or the number of steps taken. MEMS sensors have variable performance from the point of manufacture and throughout the sensor’s lifetime. Each sensor reacts differently to changes in environmental factors such as temperature, voltage, interference and sensor aging, making calibration essential. Engineers must also keep in mind the integration of multi-sensor data, effective context recognition, ensuring “always on” long-term use, and decisions about the body sensor network itself: how many sensors to use, their timing, security, and which features to select. All of that makes the barriers and challenges of product creation easy to see.
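Because each MEMS part leaves the factory with its own offset and scale errors, and those drift with temperature and age, a basic per-axis correction is usually the first calibration step. A minimal sketch; the function names and the rest-based offset estimate are illustrative assumptions, not any vendor's calibration procedure:

```python
def estimate_offset(rest_samples):
    """Estimate per-axis zero-rate offset from samples taken at rest,
    e.g. a gyroscope lying still on a bench."""
    n = len(rest_samples)
    return tuple(sum(axis) / n for axis in zip(*rest_samples))

def calibrate(raw, offset, scale):
    """Apply per-axis offset and scale correction to a raw MEMS reading."""
    return tuple((r - o) * s for r, o, s in zip(raw, offset, scale))
```

In practice the offsets would be re-estimated periodically (for instance whenever the device detects it is stationary), since the whole point is that they do not stay fixed over temperature or lifetime.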
The demands put on design engineers are dramatically changing as well. Improving the quality of sensor data integrated into the device has typically not been their focus until now. Now, designers must consider the whole system. It’s one thing to optimize the power and security of a single device and quite another to accomplish this feat across many devices in a variable network. All the while, the number of devices and device versions might change rapidly, a recipe for design issues. With wearable devices, critical decisions rely on the validity of data, so addressing these challenges is extremely important.