Methodology to Incorporate the Effect of Plant Operating State During Surveillance Testing in Determining Optimal Surveillance Test Interval

2020 
Periodic surveillance tests are performed on standby safety-critical systems in nuclear plants to maintain their operational readiness. Test intervals are mostly decided qualitatively from operational experience and expert judgement, in line with manufacturers' recommendations. Probabilistic techniques allow risk-informed decisions on the surveillance test interval (STI) that aim at minimising mean system unavailability. System unavailability during testing is an important parameter influencing the optimum STI, and the contribution of testing time to mean system unavailability depends on the demand occurrence frequency during testing. Many safety systems are tested during plant shutdown or low-power operation, which significantly reduces the demand occurrence frequency during testing. Traditional approaches to STI optimisation do not consider this effect: they either assume no demand during testing or take the demand occurrence frequency to be the same as during power operation. This paper discusses a methodology to incorporate the effect of reduced demand occurrence frequency during testing into the determination of the optimum STI, using a factor defined as the ratio of the demand occurrence frequency during system testing to that during standby. This approach strikes a balance between the two traditional extremes. A case study on the Emergency Core Cooling System of the Dhruva Reactor examines the effect of this parameter on the optimum STI.
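The trade-off described in the abstract can be illustrated with a simple, conventional standby-unavailability model. The sketch below is an assumption for illustration only, not the paper's actual formulation: mean unavailability over a test interval T is taken as λT/2 (undetected standby failures) plus f·τ/T (the test-outage contribution), where τ is the test duration and f is the factor the paper introduces, the ratio of demand occurrence frequency during testing to that during standby. The numerical values are hypothetical.

```python
import math

def mean_unavailability(T, lam, tau, f):
    """Approximate mean standby unavailability over a test interval T (hours).

    lam : standby failure rate (per hour)
    tau : test duration (hours)
    f   : ratio of demand occurrence frequency during testing
          to that during standby (0 <= f <= 1)

    First term: average unavailability from failures that remain
    undetected between tests. Second term: contribution of the test
    outage itself, scaled by f because demands are less likely while
    the plant is shut down or at low power.
    """
    return lam * T / 2 + f * tau / T

def optimal_sti(lam, tau, f):
    """Interval minimising mean_unavailability: T* = sqrt(2*f*tau/lam)."""
    return math.sqrt(2 * f * tau / lam)

# Hypothetical parameter values, not taken from the paper:
lam, tau = 1e-5, 4.0  # failures per hour, hours per test
for f in (1.0, 0.1):
    T_star = mean_u = optimal_sti(lam, tau, f)
    u_star = mean_unavailability(T_star, lam, tau, f)
    print(f"f={f}: T*={T_star:.0f} h, U(T*)={u_star:.2e}")
```

Under this model, reducing f (testing mostly during shutdown) both shortens the optimum STI and lowers the achievable mean unavailability, which is the qualitative effect the paper's factor is meant to capture; the two traditional extremes correspond to f = 0 and f = 1.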