Sensor Fusion: The future of intelligent devices
Combining the data from multiple sensors can tell us a great deal more about the application environment than each sensor could on its own.
Sensor fusion is an intriguing idea: if data from more than one sensor is combined in the right way, the combined data can be more accurate, more reliable, or simply provide a better understanding of the context in which it was gathered. It is perfectly possible to combine data from two or more (or, in fact, many) sensors to produce extremely rich, context-aware data that overcomes the limitations in range or accuracy of the individual sensors. The whole effectively exceeds the sum of its parts, using only standard sensor technologies, since all of the analysis is done in software. For today's small, remote sensor nodes in the Internet of Things, this analysis can readily be offloaded to the ample computational power of the cloud.
At the simplest end of the scale, a classic example of sensor fusion is combining the data from a moisture sensor with that from a temperature sensor to calculate relative humidity: the amount of water vapour present in the air, expressed as a percentage of the amount needed for saturation at the same temperature. Relative humidity is an important parameter in heating, ventilation and air conditioning (HVAC) systems, as well as in metrology equipment. It is also important in the painting and coating industries, as these processes can be very sensitive to the environment's dew point.
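As a minimal sketch of this fusion step, relative humidity can be derived from an absolute moisture reading and a temperature reading using the Magnus approximation for saturation vapour pressure. The function names and constants here are illustrative, not taken from any particular sensor vendor's library:

```python
import math

def saturation_vapour_pressure(temp_c: float) -> float:
    """Magnus approximation for saturation vapour pressure in hPa,
    valid roughly over -45 to 60 degrees C."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def relative_humidity(abs_humidity_g_m3: float, temp_c: float) -> float:
    """Fuse an absolute humidity reading (g/m^3) with a temperature
    reading (degrees C) into relative humidity (%)."""
    temp_k = temp_c + 273.15
    # Actual vapour pressure from absolute humidity (ideal gas relation)
    e_actual = abs_humidity_g_m3 * temp_k / 216.7
    return 100.0 * e_actual / saturation_vapour_pressure(temp_c)
```

For example, at 25°C an absolute humidity of about 11.5 g/m³ works out to roughly 50% relative humidity; the same moisture reading at a lower temperature would give a much higher percentage, which is exactly why the two readings must be fused.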
For example, ALPS has a humidity and temperature sensor module mounted on a small PCB, the HSHCAL series (right, top), which offers a lot of design freedom as it can be mounted in the optimal location within the system. Meanwhile, TE Connectivity's HTU21D humidity and temperature sensor module (right, bottom) comes in a reflow-solderable DFN package for automated assembly, measuring just 3 x 3 x 0.9mm for compact applications. Both of these relative humidity sensors operate across the full range of 0 to 100% relative humidity, and provide a digital (I2C) output for direct interfacing with a microcontroller.
Another classic application of sensor fusion is determining the orientation of a system in three-dimensional space. A gyroscope can measure the system's angular velocity about all three axes, and the result for each axis can be mathematically integrated to obtain an angular position, but even with today's technology this data is not always very accurate. Gyroscopes are prone to bias error, which produces a non-zero reading even when the sensor is stationary, and this error varies with temperature and the sensor's age. An accelerometer could be used to detect linear motion in three dimensions, but these sensors are susceptible to vibration. A three-dimensional magnetometer, which detects the Earth's magnetic field to give an idea of orientation, could also be used, but these sensors are not always accurate in the face of interference from nearby devices. The best approach is to combine the data from all three sensors, and ideally a temperature sensor too, to cancel out the inaccuracies of any individual sensor. Complex filtering algorithms (such as the widely used Kalman filtering technique) are used to combine the data. 9-DoF (nine degrees of freedom) sensor systems, which combine 3D accelerometers, gyroscopes and magnetometers with a microcontroller running a filtering algorithm in a single package, are available and output position data that is easy to work with.
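A full Kalman filter is beyond the scope of a short example, but the underlying idea can be illustrated with a complementary filter, a simpler technique also widely used for orientation estimation. This sketch (all values illustrative) fuses a gyroscope's pitch rate, which is smooth but drifts due to bias, with the pitch angle implied by the accelerometer's gravity vector, which is absolute but noisy:

```python
import math

def fuse_pitch(pitch_deg, gyro_rate_dps, accel_x_g, accel_z_g, dt_s, alpha=0.98):
    """One step of a complementary filter: mostly trust the integrated
    gyro reading short-term, but pull slowly towards the accelerometer's
    gravity reference to cancel long-term drift."""
    gyro_pitch = pitch_deg + gyro_rate_dps * dt_s                 # dead reckoning
    accel_pitch = math.degrees(math.atan2(accel_x_g, accel_z_g))  # gravity angle
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch

# Stationary sensor with a 0.5 deg/s gyro bias error: pure integration
# drifts without bound, while the fused estimate stays bounded near zero.
pitch_fused, pitch_gyro_only = 0.0, 0.0
for _ in range(2000):  # 20 s of samples at 100 Hz
    pitch_fused = fuse_pitch(pitch_fused, 0.5, 0.0, 1.0, 0.01)
    pitch_gyro_only += 0.5 * 0.01
```

After 20 simulated seconds the gyro-only estimate has drifted by 10 degrees, while the fused estimate settles at a fraction of a degree, which is the essence of why these sensors are combined.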
Different combinations of accelerometers, gyroscopes and magnetometers are also available to suit various applications. For example, Murata has a combined single-axis gyro and 3D accelerometer solution, the SCC2000 series (right), which comes in X- or Z-axis gyro configurations, allowing six degrees of freedom to be implemented on a single application board. This component is intended for sensor fusion in harsh-environment applications, offering high reliability and performance in the most demanding systems. These might include industrial machine control, where robotic arms complete manufacturing tasks such as welding, painting and material handling. These robots can move quickly in any direction, so they need accurate knowledge of the position of their "hand" (actually the end-of-arm tooling, or EOAT) in order to complete their tasks. They often operate in areas deemed too dangerous for humans to work in, so it's important that the sensors used to gather the data for fusion can withstand extreme temperatures, shock and vibration.
Another harsh environment that uses sensor fusion extensively is the automotive world. Here, the SCC2000 series may be used for applications such as electronic stability control (ESC), which detects skidding using a number of different sensors. Data is input from a steering wheel angle sensor, which determines the direction the driver intends to go; a gyroscope, which measures yaw rate and possibly roll rate; an accelerometer, which measures linear acceleration; and wheel speed sensors, which detect changes in speed due to loss of traction. This data is fed into a control algorithm; if the result indicates a skid, an intervention such as braking individual wheels may be applied for safety.
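The core of that control algorithm can be sketched very simply: compare the yaw rate the driver is asking for (estimated from steering angle and speed with a kinematic bicycle model) against the yaw rate the gyroscope actually measures. A large mismatch suggests the car is not turning as commanded, i.e. skidding. The wheelbase and threshold values below are illustrative, not from any production ESC system:

```python
import math

WHEELBASE_M = 2.7          # illustrative wheelbase for a mid-size car
YAW_TOLERANCE_RAD_S = 0.1  # illustrative intervention threshold

def skid_detected(speed_mps, steering_angle_rad, measured_yaw_rate_rad_s):
    """Flag a skid when the measured yaw rate deviates too far from the
    yaw rate implied by the driver's steering input at the current speed."""
    expected = speed_mps * math.tan(steering_angle_rad) / WHEELBASE_M
    return abs(measured_yaw_rate_rad_s - expected) > YAW_TOLERANCE_RAD_S
```

At 20 m/s with 0.05 rad of steering, the expected yaw rate is about 0.37 rad/s; a gyroscope reading near that value indicates normal cornering, while a much lower reading indicates understeer and would trigger an intervention.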
Healthcare and medical electronics is another area where sensor fusion from accelerometers and gyroscopes is enabling exciting new systems. Advances in the miniaturisation and power consumption of MEMS sensor devices and microcontrollers are enabling wearable sensor systems that can be used in a variety of medical environments. For example, body-worn systems that monitor the movement of limbs can be helpful in physiotherapy, to ensure exercises are being done correctly. Wearable activity trackers, already popular in the consumer wellness market, may in future have their data fused with data from wearable heart rate monitors, temperature sensors and the like as part of telehealth services or remote monitoring of patient conditions. Uploading and analysing this data in the cloud means it can be accessed and reviewed by doctors at any time. Intelligent fusion of vital-signs data gathered by body-worn sensors could even make it possible for electronic systems to diagnose common diseases without the patient seeing a doctor at all.
Behind these new sensor fusion applications are many innovations in both hardware and software. On the hardware side, MEMS sensors can be integrated together in any number of combinations, in tiny, power-efficient packages. MEMS sensors have also come down vastly in price in recent years thanks to miniaturisation and new automatic calibration techniques, while their limitations in accuracy have been offset by advanced sensor fusion techniques. These innovations are set to bring sensor fusion to more varied applications than ever before.
Using other technology advances such as digital signal processing, huge amounts of data can now be fused very quickly to allow a system response to be provided in real time, while wireless internet access provides sensor systems with access to huge computing power in the cloud. The eventual aim is to emulate with electronics the ultimate in sensor fusion hardware – the human body – which uses the brain as a processor to fuse data from the nervous system, visual systems and other sensory inputs to allow people to perform incredibly complex tasks.
As Technical Manager, Martin is responsible for marketing strategy across IP&E, power and battery products into key market segments. Martin has over 15 years' experience in electronics, having begun his career at Nortel Networks before occupying roles at RS Components, Avnet and Altera.