What is Sensor Data and Information Fusion?
Sensor Data and Information Fusion is the process of combining incomplete and imperfect pieces of mutually complementary sensor data or non-sensor information in such a way that a better understanding of an underlying real-world phenomenon is achieved. Typically, this insight is either unobtainable otherwise, or the fusion result exceeds what any single sensor output or other information source can provide in terms of accuracy, reliability, or cost-effectiveness. Sensor data fusion comprises appropriate collection, registration and alignment, stochastic filtering, logical analysis, space-time integration, exploitation of redundancies, quantitative evaluation, and appropriate display, as well as the integration of related context information.
Why is it omnipresent?
Sensor data fusion is an omnipresent phenomenon that existed prior to its technological realization or the scientific reflection on it. In fact, all living creatures, including human beings, by nature or intuitively perform sensor data fusion. Each in their own way, they combine or fuse sensations provided by different and mutually complementary sense organs with knowledge learned from previous experiences and communications from other creatures. As a result, they produce a mental picture of their individual environment, the basis of behaving appropriately in their struggle to avoid harm or successfully reach a particular goal in a given situation. Technical sensor data fusion systems try to automate parts of these "natural" fusion capabilities resulting in "cognitive tools", which enhance the perceptive faculties of human beings in the same way conventional tools enhance their physical strength. In this type of interactive assistance system, the strengths of automated data processing (dealing with mass data, fast calculation, large memory, precision, reliability, robustness etc.) are put into service for the human beings involved. Automated sensor data fusion actually enables them to bring their characteristically human strengths into play, such as qualitatively correct over-all judgment, expert knowledge and experience, intuition and creativity, i.e. their natural intelligence that cannot be substituted by automated systems in the foreseeable future.
How is it defined?
Hall and Llinas (2001): "Information fusion is an Information Process dealing with the association, correlation, and combination of data and information from single and multiple sensors or sources to achieve refined estimates of parameters, characteristics, events, and behaviors for observed entities in an observed field of view. It is sometimes implemented as a Fully Automatic process or as a Human-Aiding process for Analysis and/or Decision Support." [1]
[1] D. L. Hall and J. Llinas (eds.). Handbook of Multisensor Data Fusion. CRC Press, USA, 2001.
What are its technological prerequisites?
The modern development of sensor data fusion systems was made possible by substantial progress in the following areas over the recent decades:
1. Advanced and robust sensor systems, the technical equivalents of sense organs, offer high sensitivity or coverage and may open dimensions of perception usually inaccessible to most living creatures.
2. Communication links with sufficient bandwidths, small latencies, stable connectivity, and robustness against interference are the backbones of spatially distributed networks of homogeneous or heterogeneous sensors.
3. Mature navigation systems are prerequisites of (semi-)autonomously operating sensor platforms and common frames of reference for the sensor data based on precise space-time registration including mutual alignment.
4. Information technology provides not only sufficient processing power for dealing with large data streams, but also efficient database technology and fast algorithmic realizations of data exploitation methods.
5. Technical interoperability, the ability of two or more sub-systems or components to exchange information and to use the information that has been exchanged, is indispensable for building distributed systems of systems for sensor exploration and data exploitation.
6. Advanced and ergonomically efficient human-machine interaction (HMI) tools are an integral part of man-machine systems, presenting the results of sensor data fusion systems to the users in an appropriate way.
Where does it come from?
The technical term "Sensor Data and Information Fusion" was coined in George Orwell's very year 1984 in the US defence domain, but the applications and scientific topics in this area have much deeper roots. Today, sensor data fusion is evolving at a rapid pace and is present in countless everyday systems and civilian products.
Who are the pioneers?
Since sensor data fusion can be considered as a branch of automation with respect to imperfect sensor data, a historical reflection on its roots could identify numerous predecessors in automation engineering, cybernetics, and Bayesian statistics, who developed fundamental notions and concepts relevant to sensor data fusion. Among many other pioneers, Carl Friedrich Gauß, Thomas Bayes and the Bayesian statisticians, as well as Rudolf E. Kalman have created the methodological and mathematical prerequisites of sensor data fusion that made the modern development possible.
Carl Friedrich Gauß
Many achievements in science and technology that have altered today's world can be traced back to the great mathematician, astronomer, geodesist, and physicist Carl Friedrich Gauß (1777-1855). This general tendency seems also to be true in the case of sensor data fusion. After finishing his opus magnum on number theory, Gauß re-oriented his scientific interests to astronomy. His motive was the discovery of the planetoid Ceres by the Theatine monk Giuseppe Piazzi on January 1, 1801, whose position was lost shortly after the first astronomical orbit measurements. Gauß succeeded in estimating the orbit parameters of Ceres from a few noisy measurements by using a recursively written least-squares error compensation algorithm, a methodology which can be interpreted as a limiting case of Kalman filtering, one of the most important backbone algorithms of modern target tracking and sensor data fusion. Based on his results, Heinrich Olbers was able to rediscover Ceres on January 1, 1802. The discovery of three other planetoids followed (Pallas 1802, Juno 1804, Vesta 1807). Although until then Gauß was well-known to mathematical experts only, this success made his name popular, leading to his appointment at Göttingen University in 1807 as a Professor of Astronomy and Director of the Observatory. Gauß' personal involvement in this new scientific branch of reasoning with imprecise observation data is indicated by the fact that he called his first-born child Joseph, after Father Giuseppe Piazzi. Three others of his children were named after the discoverers of Pallas, Juno, and Vesta.
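Gauß' recursive least-squares idea can be illustrated with a minimal sketch (the parameter values and measurement geometry below are invented for illustration, not Piazzi's orbit data): a constant parameter vector is refined measurement by measurement, which is exactly the static limiting case of Kalman filtering.

```python
import numpy as np

def rls_update(x, P, H, z, R):
    """One recursive least-squares step: refine estimate x
    (with covariance P) using measurement z = H x + noise (covariance R)."""
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # gain matrix
    x = x + K @ (z - H @ x)           # corrected estimate
    P = (np.eye(len(x)) - K @ H) @ P  # reduced uncertainty
    return x, P

# estimate a constant 2-vector from a stream of noisy scalar observations
rng = np.random.default_rng(0)
truth = np.array([1.0, -2.0])
x, P = np.zeros(2), 100.0 * np.eye(2)   # vague prior
for _ in range(200):
    H = rng.normal(size=(1, 2))                          # observation direction
    z = H @ truth + rng.normal(scale=0.1, size=1)        # noisy measurement
    x, P = rls_update(x, P, H, z, np.array([[0.01]]))
print(x)  # converges toward [1.0, -2.0]
```

Each update tightens the covariance P, so later measurements are weighted against an increasingly confident estimate, the same recursive principle that underlies the Kalman filter.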
Bayesian Statistics
In sensor data fusion, the notion of Bayesian probability is of fundamental importance. It interprets the concept of probability as a measure of a state of knowledge and not as a relative frequency as in classical statistics. According to this interpretation, the probability of a hypothesis given the sensor data is proportional to the product of the likelihood function and the prior probability. The likelihood function represents the incomplete and imperfect information provided by the sensor data themselves as well as context information on the sensor performance and the sensing environment, while the prior specifies the belief in the hypothesis before the sensor data were available.
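The posterior-proportional-to-likelihood-times-prior rule can be made concrete with a small sketch (the detection and false-alarm probabilities are invented for illustration):

```python
def bayes_update(prior, likelihoods):
    """Posterior over hypotheses: normalize elementwise prior * likelihood."""
    unnorm = [p * l for p, l in zip(prior, likelihoods)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

# hypotheses: [target present, target absent]
prior = [0.1, 0.9]
# a sensor reports a detection; assumed sensor model:
# P(detection | present) = 0.9, P(detection | absent) = 0.05
posterior = bayes_update(prior, [0.9, 0.05])
print(posterior)  # [0.666..., 0.333...]
```

Even though the prior belief in a target was only 10 percent, one detection from a reasonably reliable sensor raises it to two thirds; further reports would be fused by applying the same update again with the posterior as the new prior.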
Bayesian Statisticians
The term 'Bayesian' refers to Thomas Bayes (1702-1761), a British mathematician and Presbyterian minister, who proved a special case of this proposition, now called Bayes' theorem (published posthumously by his friend Richard Price in 1763). The roots of 'subjective probability' can even be traced back to the great Jewish philosopher Moses Maimonides (1135/38-1204) and medieval rabbinic literature. Subsequently, the foundations of Bayesian statistics were laid by many eminent statisticians.
Pierre Simon Laplace
It was Pierre-Simon Laplace (1749-1827), however, who introduced a more general version of Bayes' theorem, apparently unaware of Bayes' work, and used it to approach problems in celestial mechanics, medical statistics, reliability, and jurisprudence.
Abraham Wald
Of particular importance is Abraham Wald (1902-1950), an Austro-Hungarian mathematician, who immigrated to the USA in 1938, where he created Sequential Analysis, a branch of applied statistical decision making, which is of enormous importance for sensor data fusion, especially in track management and consistency testing. In his influential final work on Statistical Decision Functions, he recognized the fundamental role of Bayesian methods and called his optimal decision methods 'Bayes strategies'.
Rudolf E. Kalman and his Predecessors
The beginning of modern sensor data fusion is inextricably bound up with the name of Rudolf E. Kalman (*1930), a Hungarian-American system theorist, though he had many predecessors. The Kalman filter is a particularly influential example of a processing algorithm for inferring a time-variable object state from uncertain data assuming an uncertain object evolution, which can elegantly be derived from Bayesian statistics. Among Kalman's predecessors, Thorvald Nicolai Thiele (1838-1910), a Danish astronomer, actuary and mathematician, derived a geometric construction of a fully developed Kalman filter in 1889. Also Ruslan L. Stratonovich (1930-1997), a Russian physicist, engineer, and probabilist, and Peter Swerling (1929-2000), one of the most influential RADAR theoreticians in the second half of the 20th century [16, Appendix], developed Kalman-type filtering algorithms earlier using different approaches. Stanley F. Schmidt (*1926) is generally credited with developing the first application of a Kalman filter, to the problem of trajectory estimation for the NASA Apollo spaceflight program in 1960, leading to its incorporation in the Apollo navigation computer.
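A linear Kalman filter cycle can be sketched in a few lines (the constant-velocity model and all noise parameters below are invented for the example):

```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One predict-correct cycle of a linear Kalman filter."""
    # predict: propagate the state and its uncertainty through the motion model
    x = F @ x
    P = F @ P @ F.T + Q
    # correct: blend the prediction with the new measurement z
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# track a 1-D target (position, velocity) from noisy position fixes
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity motion model
Q = 0.01 * np.eye(2)                    # process noise
H = np.array([[1.0, 0.0]])              # only position is measured
R = np.array([[0.5]])                   # measurement noise variance

rng = np.random.default_rng(1)
x, P = np.array([0.0, 0.0]), 10.0 * np.eye(2)
true_pos, true_vel = 0.0, 1.0
for _ in range(50):
    true_pos += true_vel * dt
    z = np.array([true_pos + rng.normal(scale=np.sqrt(0.5))])
    x, P = kalman_step(x, P, z, F, Q, H, R)
print(x)  # position near 50, velocity near 1
```

Note that the velocity is never measured directly; it is inferred from the sequence of noisy position fixes, which is precisely the kind of state reconstruction that makes the filter the backbone of target tracking.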
Contemporary Researchers
Independently of each other, Günther van Keuk (1940-2003) and R. A. Singer first applied Kalman filtering techniques to single air target tracking problems in multiple radar data processing. The foundations of multiple hypothesis tracking methods for dealing with data of uncertain origin related to multiple objects were laid by Robert W. Sittler, who first posed the problem, while Donald B. Reid published a method for solving it. Van Keuk, Sam S. Blackman, and Yaakov Bar-Shalom were among the first to transform Reid's method into practical algorithms. In the vast research literature published since then, however, it is impossible to identify all important scientists and engineers. The following discussion of significant contributions is therefore by no means complete. In particular due to their monographs on target tracking and sensor data fusion issues, Yaakov Bar-Shalom, Sam S. Blackman, and Alfonso Farina are highly influential researchers and have inspired many developments. Henk A.P. Blom introduced stochastic hybrid processes into data fusion, which under the name of Interacting Multiple Models still define the state-of-the-art in target dynamics modeling. In particular, he applied Bayesian data fusion to large air traffic control systems under severe reliability constraints. Countless realization aspects of fusion systems design are covered by Oliver Drummond's contributions; in his PhD thesis, he already addressed many important issues in multiple object tracking at a very early stage. Larry Stone is a pioneer in Bayesian sonar tracking and data fusion in complex propagation environments. Neil Gordon was among the first to apply sequential Monte-Carlo techniques to non-linear tracking problems, known under the name of Particle Filtering, and inspired a rapid development in this area. Numerous contributions to problems at the borderline between advanced signal processing, distributed detection theory, and target tracking were made by Peter K. Willett.
Xiao-Rong Li provided important solutions to radar data fusion. The integration of modern mathematical non-linear filtering into practical radar implementations is the merit of Fred Daum. Hugh Francis Durrant-Whyte is generally credited with creating decentralized data fusion algorithms as well as simultaneous localization and mapping (SLAM). Efficient multitarget tracking based on random set theory, with Probability Hypothesis Density (PHD) filtering as an efficient realization, has seen a stormy development driven by Ronald Mahler. Finally, Roy Streit first introduced Expectation Maximization techniques to solve the various data association problems in target tracking and sensor data fusion efficiently and exploited the use of Poisson point processes in this area.
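The bootstrap particle filter mentioned above can also be sketched minimally (scalar random-walk state, invented noise levels, and a nonlinear measurement h(x) = x**2 chosen purely for illustration):

```python
import numpy as np

def particle_filter_step(particles, weights, z, rng,
                         motion_std=0.1, meas_std=0.2):
    """One bootstrap particle filter cycle for a scalar random-walk state
    observed through the nonlinear map h(x) = x**2."""
    # predict: diffuse each particle through the motion model
    particles = particles + rng.normal(scale=motion_std, size=particles.shape)
    # weight: likelihood of the measurement under each particle
    weights = weights * np.exp(-0.5 * ((z - particles**2) / meas_std) ** 2)
    weights = weights / weights.sum()
    # resample: draw a fresh, equally weighted particle set
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

rng = np.random.default_rng(2)
n = 2000
particles = rng.uniform(0.5, 2.5, size=n)   # prior belief over the state
weights = np.full(n, 1.0 / n)
true_x = 1.5
for _ in range(30):
    z = true_x**2 + rng.normal(scale=0.2)   # noisy nonlinear observation
    particles, weights = particle_filter_step(particles, weights, z, rng)
print(particles.mean())  # estimate near 1.5
```

Because the posterior is represented by samples rather than by a Gaussian, the same few lines handle nonlinear measurement models that a Kalman filter cannot represent exactly.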
Pioneering Work at Fraunhofer FKIE
The Fraunhofer Research Institute for Communications, Information Systems, and Ergonomics (FKIE) and its predecessor FFM (FGAN Research Institute for Radio Technology and Mathematics) have been engaged in sensor data fusion since 1965. Under the responsibility of Günther van Keuk, these activities were initially related to distributed target tracking and data fusion in multiple radar networks for the German Agency of Air Traffic Security (DFS). In 1969, van Keuk became responsible for multiple target tracking using multifunctional phased-array radar. These activities led in 1975 to the foundation of the first organizational unit in Germany exclusively devoted to research on sensor data fusion (department Sensordatenverarbeitung und Steuerungsverfahren (SuS) at FGAN-FFM, now department Sensor Data and Information Fusion (SDF) at Fraunhofer FKIE). Over many years, active sensor management, tracking, and data fusion for the phased-array radar system ELRA were an important focal point of the activities in SuS. Van Keuk was among the first to propose and realize a sequential track initiation scheme based on an optimality criterion related to state estimates. In this context he developed a performance prediction model for phased-array radar, which has been called the Van Keuk equation in the tracking literature. He contributed to ELRA (Elektronisches Radar), the first European experimental system that covered all aspects of a phased-array radar, from the antenna through digital signal processing, adaptive beam control, tracking and track management, to prioritized graceful degradation, in which the task sequences were derived from the current track states according to a system threat model. Pioneering work was also done in multiple emitter tracking within networks of electromagnetic and acoustic sensors under the effect of hostile measures in challenging Cold-War reconnaissance scenarios.
Today, the research activities of the department SDF at FKIE cover many aspects of sensor data fusion related to localization and navigation, wide-area surveillance, and threat recognition for defence and security applications.
What are the modern roots?
Sensor data fusion systems have been developed primarily for applications, where a particular need for support systems of this type exists, for example in time-critical situations or in situations with a high decision risk, where human deficiencies must be complemented by automatically or interactively working data fusion techniques. Examples are fusion tools for compensating decreasing attention in routine and mass situations, for focusing attention on anomalous or rare events, or complementing limited memory, reaction, and combination capabilities of human beings. In addition to the advantages of reducing the human workload in routine or mass tasks by exploiting large data streams quickly, precisely, and comprehensively, fusion of mutually complementary information sources typically produces qualitatively new and important knowledge that otherwise would remain unrevealed.
The demands for developing such support systems are particularly pressing in defence and security applications, such as surveillance, reconnaissance, threat evaluation, and even weapon control. The earliest examples of large sensor data fusion projects were designed for air defence against missiles and low-flying bombers and influenced the development of civilian air traffic control systems. The development of modern sensor data fusion technology and the underlying branch of applied science was stimulated by the advent of sufficiently powerful and compact computers and high frequency devices, programmable digital signal processors, and last but not least by the Strategic Defense Initiative (SDI) announced by US President Ronald Reagan on March 23, 1983.
After a certain level of maturity had been reached, the Joint Directors of Laboratories (JDL), an advisory board to the US Department of Defense, coined the technical term Sensor Data and Information Fusion in George Orwell's very year 1984 and undertook the first attempt at a scientific systematization of the new technology and the research areas related to it. To the present day, the scientific fusion community speaks of the JDL Model of Information Fusion and its subsequent generalizations and adaptations. The JDL model provides a structured and integrated view of the complete functional chain from distributed sensors, databases, and human reports to the users and their options to act, including various feedback loops at different levels. It seems to be valid even in the upcoming large fields of civilian applications of sensor data fusion and computer security. Obviously, the fundamental concepts of sensor data fusion were developed long before their full technical feasibility and robust realizability in practical applications.
Figure: Overview of the JDL Model of Sensor Data and Information Fusion, which provides a structured and integrated view of the complete functional chain from distributed sensors, databases, and human reports to the users and their options to act, including various feedback loops at different levels.
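As a sketch, the functional levels commonly attributed to the revised JDL model can be captured in a small data structure; the level names follow widespread usage in the fusion literature, and exact naming varies between publications:

```python
from enum import IntEnum

class JDLLevel(IntEnum):
    """Functional levels of the (revised) JDL fusion model.

    Names follow common usage in the fusion literature and are
    not identical across all publications."""
    SUB_OBJECT_ASSESSMENT = 0   # signal/feature-level pre-processing
    OBJECT_ASSESSMENT = 1       # tracking and identification of single objects
    SITUATION_ASSESSMENT = 2    # relations between objects, context integration
    IMPACT_ASSESSMENT = 3       # threat/impact of the assessed situation
    PROCESS_REFINEMENT = 4      # sensor management and feedback loops

for level in JDLLevel:
    print(level.value, level.name)
```

The feedback loops mentioned above correspond to the highest level: process refinement steers the sensors and lower fusion levels according to what the situation and impact assessments currently need.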
Where lies the future?
Due to the increasing availability of inexpensive but powerful sensor, communication, and information technology, its technical prerequisites, sensor data fusion, or, more generally, information fusion, is increasingly emancipating itself from its roots in defence-related applications.
A commonplace example of this trend is the advent of navigation systems, which have developed a mass market by fusing military global navigation satellite system data with digital road maps in combination with an appealing graphical interface.
We can therefore expect that information fusion will become a key technology driver for developing numerous innovative products penetrating everyone's daily life and changing it profoundly. In this context, many new research questions are expected to emerge that will foster the further evolution of information fusion as an also economically eminent branch of applied informatics.
What are everyday life applications?
Even now, intelligent filtering, analysis, evaluation, and graphical presentation of multiple sensor information enable numerous products that make everyday life safer or more secure. For example, in intelligent car-driver assistance systems, image and video data from cameras and miniaturized automotive radar sensors are automatically fused in order to perceive road obstacles and pedestrians or to exclude ghost objects.
Airport Check Points
At airport security checks, assistance systems can be used which directly take advantage of military surveillance technology. By fusing signatures of stand-off chemical sensors and miniaturized gamma-spectrometers, for example, with person trajectories, carry-on items contaminated with hazardous materials or explosives can be detected. This can contribute to averting threats or preventing terrorist attacks.
Omnipresent Sensors
Other areas where information fusion based assistance systems will increasingly be important are medical and health care, process control, logistics, industrial production, precision agriculture, and traffic monitoring. A particularly stormy evolution can currently be observed for assistance systems that monitor the physical activities and health status of elderly or handicapped persons, allowing them to live in their accustomed everyday environment much longer than is currently possible. In the vast fields of fire, disaster, and pollution control, quick exploitation and fusion of complex data streams can be essential for safety analysis and the design of corresponding concepts as well as for developing sophisticated emergency information and management systems.
Industrial Potential
Sensor data fusion has evolved into a mature technology in major fields and provides a coherent and powerful inventory of methodologies and algorithms already proven in ambitious applications. The further realization of its inherent application potential is therefore much facilitated: R&D for new products can build on a sound technology base that does not need to be created in a time-consuming and expensive way. For this reason, the expected development cycles for innovative products are short, while the development risks involved are calculable. Due to their traditional strengths in high-tech industries, such as system technology, software engineering, and sensor or RFID technology, highly industrialized and research-intensive countries like Germany can exploit this potential especially in those branches where they are traditionally well-positioned, for example in automotive technology, automation and aerospace industries, in security, safety and medical technology, and, last but not least, in information system technology in general.
What are the large scale trends?
Human Assistance Systems
However, typically human fusion processes, characterized by associative reasoning, the negotiation of reasonable compromises, or the creative and intuitive extrapolation of incomplete information, still seem unfit for automation. Nevertheless, technical data fusion systems can offer assistance functionalities here as well, by which specifically human competencies of judgment are freed from routine or mass tasks, quite in the sense of a cognitive tool as discussed earlier. Moreover, highly promising research areas are, and will increasingly be, those that aim at modeling and formalizing this specific human expert knowledge and expertise in situation assessment and at incorporating it into the process of automated multiple sensor data fusion.
Context Data Integration
Furthermore, a large-scale technology trend to be highlighted is the large potential of quantitative non-sensor information available in comprehensive databases, such as Geographical Information Systems (GIS), which is still waiting to be integrated into multiple sensor data fusion systems. This is especially true in the vast area of ground, air, sea, and underwater robotics, but it also has strong implications for guaranteeing high levels of air transportation security, even in the case of high traffic densities, and for advanced logistics support systems, such as container monitoring and tracking, topics with direct implications for the global economy.
Network-centric Operations
A predominant trend in defence applications is the demand for supporting Network-centric Operations, which will remain in effect for the next decade. Sensor data and information fusion technology is one of the major forces shaping this process of transformation away from more standard operational doctrines. Especially for out-of-area operations and operations in urban terrain, as well as for dealing with asymmetric opponents, distributed high-performance reconnaissance is indispensable. In particular, wide-area ground, sea, and underwater surveillance belong to this field, especially by making use of unmanned reconnaissance robots (unmanned ground, aerial, or underwater vehicles). Moreover, intelligent security systems for harbours, critical infrastructure, or camp protection are likely to raise many research-intensive data fusion problems.
Fusion-driven Communications
The communications sub-systems within a large sensor network are typically characterized by many internal degrees of freedom, which can be controlled and adapted. This opens up the vast area of fusion-driven communications, where the communications and the distributed data fusion system architectures are closely tied together and optimized with respect to the particular surveillance goals to be reached. In the focus are multi-component systems consisting of sensors, databases, and communication infrastructures that collectively behave as a single dynamically adaptive system. Important aspects are network scalability given a limited communication bandwidth, adaptive and optimal spectrum-sharing protocols, the trade-off of sensor data quality against network objectives, and in-network information processing. In addition, the growing use and ubiquitous nature of sensor networks pose issues when networks deployed for multiple applications need to be combined or need to exchange information at the network level.
Pervasive Passive Surveillance
A particularly exciting topic of recent research is advanced distributed signal and data fusion for passive radar systems, where radio, TV, or mobile phone base stations are used as sources for illuminating targets of interest. Even in remote regions of the world, each transmitter of electromagnetic radiation becomes a potential radar transmitter station, which enables air surveillance by passively receiving reflections of non-cooperatively emitted signals of opportunity. In this way, the reconnaissance process remains covert and is not revealed by actively transmitting radiation. Analogous considerations are valid for sub-sea surveillance.
"Add-on" Research
Since a stormy evolution of civilian information fusion applications is to be expected in the near future, defence-related R&D on information fusion technology will increasingly show the character of add-on research, which adapts existing civilian problem solutions to specifically military requirements. This trend is analogous to the evolution in advanced communication systems, a technology that also had its roots in the military domain, before the civilian market opportunities became the predominant force driving its technological and scientific progress.