Inverse modelling requirements for a nuclear materials safeguards tool

Miller, Euan Colin (2001) Inverse modelling requirements for a nuclear materials safeguards tool. PhD thesis, University of Glasgow.

Full text available as: PDF (digitised version of the original print thesis), 13MB
Printed Thesis Information: https://eleanor.lib.gla.ac.uk/record=b2019034

Abstract

The work presented in this thesis has been carried out in support of the specification of a solution monitoring system to assist United Nations inspectors performing nuclear materials safeguards, primarily pertaining to the chemical separation areas of nuclear reprocessing facilities. The system is designed to provide assurances over hours and days; other methods are more appropriate for the provision of assurances over weeks. The impetus for this system derives from the fact that conventional material accountancy methods are unable to satisfy the protracted loss detection goal specified by the International Atomic Energy Agency when applied to large commercial reprocessing plants. Based on the concept of model-based reasoning, the system estimates the distribution of plutonium throughout the plant via simulation, and then attempts to justify any discrepancies between the estimated distribution and the observed distribution. Because the simulation's structure is fixed, the process of justification involves hypothesising additional forcing functions and parameter changes that result in the simulation predicting what is observed.

The simulation inputs are largely in the form of flow rates and concentrations, which are obtained via indirect measurement. Plant operators discourage invasive measurement systems on the grounds of maintenance expense and plant containment, so the direct measurement of material flow rates is not possible. However, the volume and density of liquor in process tanks are measured, so it is possible to obtain the flow rates indirectly by analysing these measurements, a process known as inverse modelling. Concentration measurements are obtained from the laboratory analysis of samples. Inverse modelling is not confined to flow rate estimation, because one of the aims of the system is itself an inverse modelling problem: to hypothesise a set of forcing functions and boundary conditions which, when input into the simulation, predict the observed distribution. Thus inverse modelling is required at two levels: locally for flow rate estimation and more globally for distribution estimation over the entire plant. Inverse modelling is problematic because inverse solutions have a propensity to be non-unique and unstable. Furthermore, since the solutions are obtained by analysing the measurements, they are adversely affected by the presence of noise and/or biases.

This thesis describes some of the tools that have been developed as part of this system. A number are based on common statistical process control algorithms such as the Shewhart control chart and the V-mask; others involve more novel algorithms such as simulated annealing. Different tools are used over different time-scales: the short term and the medium term. Over the short term, disagreements between the simulation and the observations are analysed to generate forcing function hypotheses, using banks of observers to produce a list of possible causes. The most likely hypothesis is chosen on the basis of user-specified subjective probabilities, which reflect the view that some events are more likely to be acceptable to the operator than others. The problem over the medium term is more difficult: the inverse modelling process is imperfect, so the model diverges from the real plant over time, with the net effect that quantities of material are predicted to be in the wrong place. This imperfection can stem from both the simulation and the plant data.
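The local level of inverse modelling can be illustrated with a minimal sketch: assuming the net transfer flow can be approximated by the rate of change of tank volume, a windowed least-squares slope over the volume readings gives a noise-tolerant flow-rate estimate. The function name, window length, and simple slope estimator below are illustrative assumptions, not the estimator developed in the thesis.

```python
# Minimal sketch of local inverse modelling: estimating an unmeasured
# transfer flow rate from noisy tank volume readings.  The windowed
# least-squares slope is an illustrative choice only.
import numpy as np

def estimate_flow_rate(times, volumes, window=10):
    """Return flow-rate estimates (volume change per unit time) by fitting
    a straight line to each sliding window of volume measurements."""
    times = np.asarray(times, dtype=float)
    volumes = np.asarray(volumes, dtype=float)
    rates = []
    for i in range(len(times) - window + 1):
        t = times[i:i + window]
        v = volumes[i:i + window]
        slope, _ = np.polyfit(t, v, 1)   # slope of V(t) ~ net flow rate
        rates.append(slope)
    return np.array(rates)

# Example: a tank emptying at roughly -0.5 m^3/h with measurement noise.
t = np.arange(0.0, 24.0, 0.5)                              # hours
v = 100.0 - 0.5 * t + np.random.normal(0, 0.05, t.size)    # m^3
print(estimate_flow_rate(t, v, window=8).mean())            # close to -0.5
```

In practice the averaging window trades responsiveness against noise suppression, which is one reason noisy or biased measurements make the inverse solutions unstable.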
The possible causes are biases that may exist on the plant and inaccuracies in the estimation of flow rates that affect the simulation. A method is proposed for identifying and estimating gross multiplicative biases. If no bias is found, an event is created describing the redistribution necessary to achieve parity. A method is also proposed to correct the flow rates, the net effect being a redistribution that minimises the divergence. If a large redistribution is necessary to achieve parity, then an incident may have occurred on the plant. The emphasis in the design of the algorithms is on the development of a practical system, one that could easily be adapted for use on a real plant. A number of different activities were needed to convert the conceptual design into a practical additional safeguards system. A considerable amount of work has been spent designing virtually identical algorithms and testing them on real data. This activity is not central to the work described in this thesis and has been relegated to Appendix 1. However, it provides evidence of the credibility of the algorithms and their ability to work in a real situation, and its importance cannot be stressed too much. (Abstract shortened by ProQuest.)
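As an illustration of the medium-term idea, the sketch below fits a single scale factor between simulated and observed inventories and attributes the divergence either to a gross multiplicative bias or to a residual redistribution. The least-squares scale factor, the fixed tolerance, and the function names are assumptions made for illustration, not the identification procedure developed in the thesis.

```python
# Minimal sketch of gross multiplicative bias identification over the
# medium term.  The single scale factor and fixed tolerance are
# illustrative assumptions only.
import numpy as np

def estimate_multiplicative_bias(simulated, observed):
    """Least-squares estimate of b in observed ~ b * simulated."""
    simulated = np.asarray(simulated, dtype=float)
    observed = np.asarray(observed, dtype=float)
    return float(np.dot(simulated, observed) / np.dot(simulated, simulated))

def classify_divergence(simulated, observed, tolerance=0.02):
    """Attribute the model/plant divergence to a gross bias if one explains
    it; otherwise report the redistribution needed to achieve parity."""
    b = estimate_multiplicative_bias(simulated, observed)
    if abs(b - 1.0) > tolerance:
        return {"cause": "multiplicative bias", "factor": b}
    residual = np.asarray(observed, float) - np.asarray(simulated, float)
    return {"cause": "redistribution", "amounts": residual}

sim = np.array([10.2, 11.0, 9.8, 10.5])   # simulated inventories (kg)
obs = 1.03 * sim                           # observations carrying a 3% bias
print(classify_divergence(sim, obs))       # reports the ~1.03 factor
```

A large residual redistribution under this kind of check would correspond to the case where an incident may have occurred on the plant.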

Item Type: Thesis (PhD)
Qualification Level: Doctoral
Additional Information: Published article removed: Tank measurement data compression for solution monitoring, Journal of Nuclear Materials Management (1999) 25: 120-127.
Keywords: Nuclear engineering, nuclear nonproliferation, nuclear industry, nuclear facilities.
Subjects: T Technology > T Technology (General)
T Technology > TK Electrical engineering. Electronics. Nuclear engineering
Colleges/Schools: College of Science and Engineering > School of Engineering
Supervisor's Name: Howell, Dr. J.
Date of Award: 2001
Depositing User: Enlighten Team
Unique ID: glathesis:2001-73732
Copyright: Copyright of this thesis is held by the author.
Date Deposited: 14 Jun 2019 08:56
Last Modified: 13 Jul 2022 13:41
Thesis DOI: 10.5525/gla.thesis.73732
URI: https://theses.gla.ac.uk/id/eprint/73732
