CAT-I GBAS Availability Improvement Through Ionosphere Field Monitor (IFM)
2011
Mitigating ionosphere anomalies has been the largest challenge in GBAS system development and certification. Extreme spatial gradients in ionosphere delay can cause large differential correction errors and could lead to a loss of integrity if not detected or otherwise mitigated. This is difficult in today's single-frequency (L1-only) GBAS because detection of severe ionosphere gradients cannot be guaranteed before users are threatened. In existing GBAS systems that support CAT I precision approach, the ionosphere threat is mitigated by inflating the broadcast integrity parameters under the assumption that the worst-case ionosphere condition exists at all times. The integrity parameters are inflated so that the resulting protection levels, which airborne systems calculate from those parameters, bound the worst-case position error. This approach therefore ensures user integrity, but it causes a significant loss of availability. To achieve acceptable availability while retaining this conservative approach, our CAT-I GBAS prototype implements an Ionosphere Field Monitor (IFM). The IFM is designed to detect anomalous ionosphere delay gradients, whose impact on the user is assumed to be proportional to the distance from the GBAS reference stations. With the IFM, we can reduce the worst-case differential correction error and thus reduce the degree of inflation of the integrity parameters, thereby improving availability. This paper shows how, and by how much, the IFM increases availability while guaranteeing user integrity.
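The availability trade-off described above can be sketched numerically: a protection level is compared against the CAT I alert limit, and inflating the broadcast error sigma to cover the worst-case ionosphere can push the protection level past that limit. The following toy calculation is illustrative only; the one-term error budget, geometry factor, and sigma values are assumptions, not taken from the paper.

```python
# Hypothetical, simplified GBAS vertical protection level (VPL) illustration.
# A real VPL sums contributions over all satellites; here a single lumped
# term K * s_vert * sigma stands in for that sum.

K_FFMD = 5.847      # fault-free missed-detection multiplier (typical CAT I value)
VAL_CAT1 = 10.0     # CAT I vertical alert limit [m]

def vpl(sigma_pr_gnd, s_vert=2.0):
    """Toy vertical protection level.
    sigma_pr_gnd: broadcast pseudorange-correction error sigma [m] (assumed values)
    s_vert: vertical geometry projection factor (assumed)."""
    return K_FFMD * s_vert * sigma_pr_gnd

def available(v):
    """Service is available when the protection level is within the alert limit."""
    return v <= VAL_CAT1

nominal  = vpl(0.4)   # nominal sigma, no ionosphere inflation
inflated = vpl(1.2)   # sigma inflated for worst-case ionosphere at all times
with_ifm = vpl(0.7)   # reduced inflation once the IFM bounds the gradient threat

print(f"nominal  VPL = {nominal:.1f} m, available = {available(nominal)}")
print(f"inflated VPL = {inflated:.1f} m, available = {available(inflated)}")
print(f"with IFM VPL = {with_ifm:.1f} m, available = {available(with_ifm)}")
```

Under these assumed numbers, full worst-case inflation drives the VPL above the 10 m alert limit (availability lost), while the smaller inflation permitted by the IFM keeps it below the limit, which is the mechanism by which the monitor recovers availability.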