Closely spaced targets can result in merged measurements, which complicate data association. Such merged measurements violate the assumption that each measurement relates to a single target. As a result, it is not possible to use the auction algorithm in its simplest form (or other two-dimensional assignment algorithms) to solve the two-dimensional target-to-measurement assignment problem. We propose an approach that uses the auction algorithm together with Lagrangian relaxation to incorporate the additional constraints resulting from the presence of merged measurements. We conclude with simulation results demonstrating the concepts introduced, and discuss the application of this research within a particle filter context.
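For reference, the sketch below shows the standard forward auction algorithm for the unconstrained two-dimensional assignment problem; it does not include the merged-measurement constraints or the Lagrangian relaxation described above, and the function names and parameters are illustrative.

```python
import numpy as np

def auction_assignment(benefit, eps=1e-3):
    """Forward auction algorithm for the classical 2-D assignment problem
    (maximise total benefit). Assumes benefit is (n_targets x n_measurements)
    with at least as many measurements as targets. This is the unconstrained
    version; the paper augments it to handle merged measurements."""
    n_targets, n_meas = benefit.shape
    prices = np.zeros(n_meas)
    owner = -np.ones(n_meas, dtype=int)           # which target owns each measurement
    assignment = -np.ones(n_targets, dtype=int)

    unassigned = list(range(n_targets))
    while unassigned:
        t = unassigned.pop(0)
        values = benefit[t] - prices              # net value of each measurement for target t
        j_best = int(np.argmax(values))
        v_best = values[j_best]
        v_second = np.max(np.delete(values, j_best)) if n_meas > 1 else v_best - eps
        prices[j_best] += v_best - v_second + eps  # bid raises the price of the preferred measurement
        if owner[j_best] >= 0:                     # previous owner becomes unassigned
            assignment[owner[j_best]] = -1
            unassigned.append(owner[j_best])
        owner[j_best] = t
        assignment[t] = j_best
    return assignment, prices

# Toy usage: two targets, three measurements.
benefit = np.array([[10.0, 5.0, 2.0],
                    [ 4.0, 8.0, 3.0]])
print(auction_assignment(benefit)[0])
```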
The Unscented Kalman Filter offers significant improvements in the estimation of non-linear discrete-time models in comparison to the Extended Kalman Filter [12]. In this paper we use a technique introduced by Casella and Robert [2], known as Rao-Blackwellisation, to calculate the tractable integrations that are found in the Unscented Kalman Filter. We show that this leads to a reduction in the quasi-Monte Carlo variance, and a decrease in the computational complexity, by considering a common tracking problem.
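The variance-reduction argument behind Rao-Blackwellisation rests on the law of total variance; the statement below is the standard form of that argument and is not notation taken from the paper.

```latex
% Law of total variance: conditioning an estimator of E[f(X,Y)] on Y
% can only reduce (never increase) its variance.
\operatorname{Var}\big[f(X,Y)\big]
  = \mathbb{E}\big[\operatorname{Var}\big(f(X,Y)\mid Y\big)\big]
  + \operatorname{Var}\big[\mathbb{E}\big(f(X,Y)\mid Y\big)\big]
  \;\ge\; \operatorname{Var}\big[\mathbb{E}\big(f(X,Y)\mid Y\big)\big].
```

Where the conditional expectation $\mathbb{E}[f(X,Y)\mid Y]$ is analytically tractable, substituting it for raw samples of $f(X,Y)$ therefore yields an estimator with no greater variance; this is the general mechanism exploited by Rao-Blackwellised estimators.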
Over-the-horizon radar (OTHR) uses the refraction of high frequency radiation through the ionosphere in order to detect targets beyond the line-of-sight horizon. The complexities of the ionosphere can produce multipath propagation, which may result in multiple resolved detections for a single target. When there are multipath detections, an OTHR tracker will produce several spatially separated tracks for each target. Information conveying the state of the ionosphere is required in order to determine the true location of the target and is available in the form of a set of possible propagation paths, and a transformation from measured coordinates into ground coordinates for each path. Since there is no a priori information as to how many targets are in the surveillance region, or which propagation path gave rise to which track, there is a joint target and propagation path association ambiguity which must be resolved using the available track and ionospheric information. The multipath track association problem has traditionally been solved using a multiple hypothesis technique, but a shortcoming of this method is that the number of possible association hypotheses increases exponentially with both the number of tracks and the number of possible propagation paths. This paper proposes an algorithm based on a combinatorial optimisation method to solve the multipath track association problem. The association is formulated as a two-dimensional assignment problem with additional constraints. The problem is then solved using Lagrangian relaxation, which is a technique familiar in the tracking literature for the multidimensional assignment problem arising in data association. It is argued that, due to a fundamental property of relaxations, convergence cannot be guaranteed for this problem. However, results show that a multipath track-to-track association algorithm based on Lagrangian relaxation, when compared with an exact algorithm, provides a large reduction in computational effort, without significantly degrading association accuracy.
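As background, a generic Lagrangian relaxation of an assignment problem with side constraints takes the following form; the notation is illustrative and is not taken from the paper.

```latex
% Illustrative relaxation of an assignment problem with side constraints:
% dualise Ax <= b and keep the ordinary 2-D assignment constraints in the set X.
q(\lambda) \;=\; \min_{x \in X}\; c^{\mathsf T} x + \lambda^{\mathsf T}\,(A x - b),
\qquad \lambda \ge 0,
\qquad
\lambda \;\leftarrow\; \max\!\Big(0,\; \lambda + \alpha_k\,\big(A\,x^{\star}(\lambda) - b\big)\Big).
```

The inner minimisation is an ordinary two-dimensional assignment problem (solvable by, for example, the auction algorithm), the multipliers are adjusted by subgradient steps, and the possible duality gap between $\max_{\lambda \ge 0} q(\lambda)$ and the constrained optimum is the fundamental property of relaxations that prevents a general convergence guarantee.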
The protection of infrastructure and facilities within the UK is of prime importance in the current environment where terrorist threats are present. Surveillance of large areas within such facilities is a complex, manpower intensive and demanding task. To reduce the demands on manpower, new systems will need to be developed that use a mixed sensor suite together with access to databases containing historical data and known threats. This requires fusion of mixed type data from disparate sources. The methods used for the fusion process, and the location of the fusion process, will be dependent on the data, sensor or database. The communication requirements will also be of paramount importance within the monitoring network. As computers increase in performance and reduce in cost and power consumption, there is a growing trend for more processing to be carried out locally. This raises issues of compatibility, timeliness, global awareness of the situation and distributed versus centralised control of the system. This paper presents a generic solution to the wide-area surveillance problem through the application and combination of covariance inflation (a distributed fusion mathematical framework that circumvents problems with data incest) with agent-based technologies (allowing the dynamic formation of sensor coalitions) to track, and potentially risk assess, targets within the region of interest. Simulations are presented of the distributed detection and tracking of an intruding vehicle at a commercial airport.
In May 2020 the UK introduced a Test, Trace, Isolate programme in response to the COVID-19 pandemic. The programme was first rolled out on the Isle of Wight and included Version 1 of the NHS contact tracing app. We used COVID-19 daily case data to infer incidence of new infections and estimate the reproduction number R for each of 150 Upper Tier Local Authorities in England, and at the National level, before and after the launch of the programme on the Isle of Wight. We used Bayesian and Maximum-Likelihood methods to estimate R, and compared the Isle of Wight to other areas using a synthetic control method. We observed significant decreases in incidence and R on the Isle of Wight immediately after the launch. These results are robust across each of our approaches. Our results show that the sub-epidemic on the Isle of Wight was controlled significantly more effectively than the sub-epidemics of most other Upper Tier Local Authorities, changing from having the third highest reproduction number R (of 150) before the intervention to the tenth lowest afterwards. The data is not yet available to establish a causal link. However, the findings highlight the need for further research to determine the causes of this reduction, as these might translate into local and national non-pharmaceutical intervention strategies in the period before a treatment or vaccination becomes available.
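One common way to estimate a reproduction number from an incidence time series is a renewal-equation estimator. The sketch below is illustrative only: the serial-interval parameters and the smoothing window are placeholder assumptions, not values from the study, and it does not reproduce the study's Bayesian, maximum-likelihood, or synthetic control analyses.

```python
import numpy as np
from scipy.stats import gamma

def reproduction_number(incidence, si_mean=5.5, si_sd=2.1, window=7):
    """Crude renewal-equation estimate of R_t from daily case counts.
    si_mean, si_sd and window are placeholder assumptions."""
    incidence = np.asarray(incidence, dtype=float)
    n = len(incidence)

    # Discretise a gamma-distributed serial interval onto days 1..n.
    shape, scale = (si_mean / si_sd) ** 2, si_sd ** 2 / si_mean
    w = np.diff(gamma.cdf(np.arange(n + 1), shape, scale=scale))
    w /= w.sum()

    # Infection pressure Lambda_t = sum_u I_{t-u} w_u (discrete renewal equation).
    lam = np.array([incidence[:t][::-1] @ w[:t] for t in range(n)])

    # Windowed ratio of new cases to infection pressure.
    R = np.full(n, np.nan)
    for t in range(window, n):
        num = incidence[t - window + 1 : t + 1].sum()
        den = lam[t - window + 1 : t + 1].sum()
        R[t] = num / den if den > 0 else np.nan
    return R

# Toy usage on a synthetic case series.
cases = np.array([1, 2, 4, 8, 12, 20, 28, 35, 40, 42, 41, 38, 33, 27], dtype=float)
print(np.round(reproduction_number(cases), 2))
```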
Bayesian inference is a vital tool for consistent manipulation of the uncertainty that is present in many military scenarios. However, in some highly complex environments, it is hard to write down an analytic form for the likelihood function that underlies Bayesian inference. Approximate Bayesian computation (ABC) algorithms address this difficulty by enabling one to proceed without analytically specifying or evaluating the likelihood distribution. This is achieved through the use of computer simulation models that stochastically simulate measurements for a given set of parameter values. This paper gives an overview of standard ABC methods such as rejection and Markov chain Monte Carlo (MCMC) sampling. It then goes on to discuss ABC versions of sequential Monte Carlo (SMC) samplers, which are the recently-developed next-generation particle filters for Bayesian sampling. SMC samplers have properties that make them highly suitable for complex estimation and inference problems in the presence of uncertainty. This includes an ability to efficiently explore multi-modal distributions. One application of SMC-ABC algorithms is source term estimation for chemical, biological, radiological and nuclear (CBRN) defence when the number of releases is unknown. A proof-of-principle study has been conducted using a bar-sensor model and a Gaussian dispersion model for agent behaviour, including the effect of wind. The outcome is that the algorithms are able to estimate model parameters to a reasonable degree of accuracy, but there is some ambiguity between the location and time of releases due to wind effects.
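A minimal ABC rejection sketch on a toy one-dimensional source problem is given below; the simulator, prior, distance, and tolerance are illustrative assumptions and do not represent the paper's bar-sensor or Gaussian dispersion models.

```python
import numpy as np

def abc_rejection(observed, simulate, sample_prior, distance, eps, n_samples=200):
    """Generic ABC rejection sampler: keep parameter draws whose simulated
    data fall within eps of the observation under the chosen distance."""
    accepted = []
    while len(accepted) < n_samples:
        theta = sample_prior()
        if distance(simulate(theta), observed) <= eps:
            accepted.append(theta)
    return np.array(accepted)

# Toy source problem: a single release at unknown location x0, observed by
# sensors whose noisy readings decay with distance from the source.
rng = np.random.default_rng(0)
sensors = np.array([0.0, 2.0, 5.0, 9.0])
true_x0 = 3.0
observed = np.exp(-np.abs(sensors - true_x0)) + 0.05 * rng.standard_normal(4)

posterior = abc_rejection(
    observed,
    simulate=lambda x0: np.exp(-np.abs(sensors - x0)) + 0.05 * rng.standard_normal(4),
    sample_prior=lambda: rng.uniform(0.0, 10.0),
    distance=lambda a, b: np.linalg.norm(a - b),
    eps=0.2,
)
print("posterior mean / std of release location:", posterior.mean(), posterior.std())
```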
The use of multiple scans of data to improve target tracking performance is widespread in the tracking literature. In this paper, we introduce a novel application of a recent innovation in the SMC literature that uses multiple scans of data to improve the stochastic approximation (and hence the data association ability) of a multiple-target Sequential Monte Carlo based tracking system. Such an improvement is achieved by resimulating sampled variates over a fixed-lag time window by artificially extending the space of the target distribution. In doing so, the stochastic approximation is improved and the data association ambiguity is more readily resolved.
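The sketch below illustrates the general idea of a fixed-lag resimulation (resample-move style) step in a particle filter on a toy linear-Gaussian model: after resampling, the most recent block of each particle's trajectory is re-proposed from the dynamics and accepted with a Metropolis-Hastings test against the recent measurements. The model, lag length, and move are illustrative assumptions, not the paper's multiple-target algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear-Gaussian model (illustrative only).
a, q, r = 0.95, 0.5, 1.0          # transition coeff, process std, measurement std
T, N, LAG = 50, 500, 5            # time steps, particles, fixed lag

# Simulate data.
x_true = np.zeros(T); y = np.zeros(T)
for t in range(1, T):
    x_true[t] = a * x_true[t - 1] + q * rng.standard_normal()
    y[t] = x_true[t] + r * rng.standard_normal()

def loglik(x, obs):
    return -0.5 * ((obs - x) / r) ** 2

# Bootstrap particle filter with a fixed-lag resimulation step.
particles = np.zeros((N, T))
for t in range(1, T):
    particles[:, t] = a * particles[:, t - 1] + q * rng.standard_normal(N)
    logw = loglik(particles[:, t], y[t])
    w = np.exp(logw - logw.max()); w /= w.sum()
    particles = particles[rng.choice(N, N, p=w)]          # resample whole trajectories

    if t >= LAG:
        start = t - LAG
        # Re-propose the last LAG states of each particle from the dynamics.
        prop = np.empty((N, LAG))
        prev = particles[:, start]
        for k in range(LAG):
            prev = a * prev + q * rng.standard_normal(N)
            prop[:, k] = prev
        # Independence MH: proposal is the prior dynamics, so the acceptance
        # ratio reduces to a likelihood ratio over the lag window.
        ll_new = sum(loglik(prop[:, k], y[start + 1 + k]) for k in range(LAG))
        ll_old = sum(loglik(particles[:, start + 1 + k], y[start + 1 + k]) for k in range(LAG))
        accept = np.log(rng.uniform(size=N)) < ll_new - ll_old
        particles[accept, start + 1 : t + 1] = prop[accept]

print("posterior mean at final step:", particles[:, -1].mean(), "truth:", x_true[-1])
```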
In data fusion systems, one often encounters measurements of past target locations and then wishes to deduce where the targets are currently located. Recent research on the processing of such out-of-sequence data has culminated in the development of a number of algorithms for solving the associated tracking problem. This paper reviews these different approaches in a common Bayesian framework and proposes an architecture that orthogonalises the data association and out-of-sequence problems such that any combination of solutions to these two problems can be used together. The emphasis is not on advocating one approach over another on the basis of computational expense, but rather on understanding the relationships between the algorithms so that any approximations made are explicit.
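As a point of comparison for the algorithms reviewed, the conceptually simplest treatment of out-of-sequence data is to buffer measurements and re-run the filter in time order whenever a delayed measurement arrives. The scalar Kalman filter below is an illustrative baseline only (with unit time steps assumed between measurements) and is not one of the paper's proposed architectures.

```python
class ReprocessingKF:
    """Baseline out-of-sequence handling: buffer measurements and, when a
    delayed one arrives, re-run a scalar Kalman filter in time order.
    One predict step per measurement (unit time steps assumed)."""
    def __init__(self, f, q, h, r, x0, p0):
        self.f, self.q, self.h, self.r = f, q, h, r
        self.x0, self.p0 = x0, p0
        self.buffer = []                                   # list of (timestamp, measurement)

    def add(self, t, z):
        self.buffer.append((t, z))

    def estimate(self):
        x, p = self.x0, self.p0
        for _, z in sorted(self.buffer):                   # process in time order
            x, p = self.f * x, self.f * p * self.f + self.q        # predict
            k = p * self.h / (self.h * p * self.h + self.r)        # Kalman gain
            x, p = x + k * (z - self.h * x), (1 - k * self.h) * p  # update
        return x, p

kf = ReprocessingKF(f=1.0, q=0.1, h=1.0, r=0.5, x0=0.0, p0=1.0)
kf.add(1, 0.9); kf.add(3, 1.7); kf.add(2, 1.2)             # measurement at t=2 arrives late
print(kf.estimate())
```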