Optimal Inspection and Maintenance Policies for Infrastructure Networks
State‐of‐the‐art infrastructure management systems use Markov decision processes (MDPs) as a methodology for maintenance and rehabilitation (M&R) decision making. The underlying assumption in this methodology is that inspections are performed at predetermined and fixed time intervals and that they reveal the true condition state of the facility, with no measurement error. As a result, after an inspection, the decision maker can apply the activity prescribed by the optimal policy for that condition state of the facility.
In previous research, the second author applied a methodology for M&R activity selection that accounts for the presence of both forecasting and measurement uncertainty. This methodology is the latent Markov decision process (LMDP), an extension of the traditional MDP that relaxes the assumption of error‐free annual facility inspections.
In this article we extend this methodology to include network‐level constraints. This can be achieved by extending the LMDP model to the network‐level problem through the use of randomized policies. We present both finite‐horizon (transient) and infinite‐horizon (steady‐state) formulations of the network‐level LMDP. A case study application demonstrates the expected savings in life‐cycle costs that result from increasing the measurement accuracy used in facility inspections and from optimal scheduling of inspections.
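The LMDP described above treats the facility's true condition state as latent, so the decision maker works with a probability distribution over states that is updated after each noisy inspection. A minimal sketch of this kind of Bayesian belief update and expected-cost activity choice (all matrices, state counts, and cost figures below are hypothetical illustrations, not values from the article):

```python
def belief_update(belief, transition, observation_likelihood, measurement):
    """One step of latent condition-state estimation.

    belief: prior probability over condition states, e.g. [0.7, 0.2, 0.1]
    transition: row-stochastic deterioration matrix, P[s][s'] for one period
    observation_likelihood: L[s][m] = P(inspection reads m | true state is s),
        encoding the measurement error that the LMDP accounts for
    measurement: index of the (noisy) inspected condition state
    """
    n = len(belief)
    # Predict: propagate the belief through the deterioration model.
    predicted = [sum(belief[s] * transition[s][sp] for s in range(n))
                 for sp in range(n)]
    # Correct: reweight by the likelihood of the noisy measurement (Bayes rule).
    unnorm = [predicted[s] * observation_likelihood[s][measurement]
              for s in range(n)]
    z = sum(unnorm)
    return [u / z for u in unnorm]


def min_expected_cost_activity(belief, activity_costs):
    """Choose the M&R activity with lowest expected cost under the belief.

    activity_costs[a][s] = cost of applying activity a when the true
    condition state is s (a one-step myopic stand-in for the optimal policy).
    """
    expected = [sum(b * c for b, c in zip(belief, costs))
                for costs in activity_costs]
    return min(range(len(expected)), key=lambda a: expected[a])
```

For example, with two condition states (good, poor), a prior belief of [0.9, 0.1], and an inspection that (noisily) reports "poor", the updated belief shifts probability mass toward the poor state, and the activity choice then minimizes cost in expectation over that belief rather than assuming the inspection reading is the true state.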
Smilowitz, Karen (author) / Madanat, Samer (author)
2000-01-01
9 pages
Article (Journal)
Electronic Resource
English
Optimal Inspection and Maintenance Policies for Infrastructure Networks
Online Contents | 2000
British Library Online Contents | 1999

Optimizing Inspection Policies for Buried Municipal Pipe Infrastructure
British Library Online Contents | 2012
Online Contents | 2012