Road Network Performance Monitoring & Management Guideline
Quality Record Sheet
Road Network Performance Monitoring & Management Guideline
Project Champion: Peter Higgs, IPWEA
Prepared By: Henning, Holland and Tapper
Reviewed By: Waugh
Date: October 2019
CONTENTS
1 This Guideline
  1.1 Objectives of this guideline
  1.2 Structure of the Guideline
Part I Context to Performance Monitoring and Management
2 Performance Management a Critical Part of the Asset Management Processes
  2.1 Asset Management processes
  2.2 The business case for performance management
3 Finding the Optimal Performance Level to Maintain – The Long Term Objective
  3.1 Realistic expectations of performance reporting planning applications
  3.2 What is a good performance for the road network under different LoS expectations
  3.3 Good performance for individual road sections
Part II Performance Monitoring and Management Process
4 Data
  4.1 Data needs and processing
  4.2 Recommended data collection frequencies
5 Frameworks
  5.1 Key performance frameworks for managing road asset
  5.2 Additional references to consider
  5.3 Available measures
  5.4 Measures, composite indices, and future performance areas
    5.4.1 Performance measures
    5.4.2 Composite indices
    5.4.3 Future performance areas / measures
6 Story
  6.1 Story-telling
  6.2 Understanding the full statistical distribution
  6.3 Understanding network condition distribution and distribution changes over time
  6.4 Tips for effectively communicating performance measures
7 Business Case Support
  7.1 Linking the performance to investment and maintenance strategy
  7.2 Peer Comparison and Benchmarking
    7.2.1 Peer Group Comparisons
    7.2.2 Benchmarking using data envelopment analyses
8 Application
  8.1 Underlying principles of making optimal decisions on an operational level
  8.2 Using trends during field investigations / RAPT reviews
9 References
10 Glossary of Terms
Appendix A Case Study Examples for Level of Service Reporting Frameworks
Appendix B Case Study Example Performance Outputs from Main Roads Western Australia
TABLE OF TABLES
Table 1: Performance Monitoring Application
Table 2: Data Quality Report on Each Data Item (Source: REG)
Table 3: Tips for Data Processing
Table 4: Data collection requirements for performance monitoring (Adapted from Henning et al., 2015)
Table 5: Confidence level rating framework for most common road condition data (Henning et al., 2015)
Table 6: Relevant Performance Frameworks for RCAs (Road Controlling Authorities)
Table 7: Additional References
Table 8: Available Measures
Table 9: The advantages and limitations of composite indices
Table 10: Potential additional composite indices
Table 11: Future Performance Monitoring Areas / Measures
Table 12: Condition Distribution
Table 13: Changes in Distribution over time – example 1
Table 14: Changes in Distribution over time – example 2
Table 15: Using Performance Monitoring and Reporting in an Investment Strategy
Table 16: Terms Used in this guideline

TABLE OF FIGURES
Figure 1: Information Dashboard
Figure 2: Resulting Benefits from Performance Management
Figure 3: The Asset Management Road Map (Henning, 2015)
Figure 4: The Investment Approach (Based on Karlaftis and Kepaptsoglou, 2012)
Figure 5: Application of Performance Monitoring in Respective Asset Management Levels
Figure 6: Performance monitoring applied to different datasets
Figure 7: Outcomes from relative comparison analyses
Figure 8: Road network maintenance planning is a balancing act between investment, risk and LoS
Figure 9: The danger of making maintenance decisions on the basis of condition alone
Figure 10: Linking Grading Frequency to Customer Complaints (Robertson, 2018)
Figure 11: Related Performance Frameworks to Road Asset Management Levels
Figure 12: Measures that describe the surface condition
Figure 13: Measures that describe the pavement make-up and condition
Figure 14: A Typical Distribution of SCI for an Urban Authority
Figure 15: Suggested condition ranges for SCI
Figure 16: iRAP Star Rating System for Roads (source: https://www.irap.org/)
Figure 17: "The Flaw of Averages" (Savage, 2012)
Figure 18: Considering specific statistical percentiles within the full distribution
Figure 19: Explaining the Box-and-Whisker Graphs
Figure 20: Building Blocks for a Business Case (Source: IDS Training Material)
Figure 21: An example of Connecting Performance and Costs (Source: Main Roads Western Australia)
Figure 22: Data Envelopment Analysis
Figure 23: Data Envelopment Analyses Results
Figure 24: Efficiency Classification of all Local Councils in NZ (Shivaramu, 2018)
Figure 25: The value of investment into preservation treatments (Source: REG, 2010)
Figure 26: The business process for taking account of modelling outcomes in field decisions
Figure 27: Different measures carrying different weightings depending on road class (Source: IDS)
Figure 28: Viewing Condition Trends During Field Inspections (Source: Juno Viewer)
Figure 1: Information Dashboard

1. This Guideline

Imagine having to fly an aircraft with no instruments. There is some information we would want at our fingertips:
• How high am I?
• Am I going the right way?
• Will I get lost in a cloud?
• Will I have enough fuel?
This information keeps us safe, on track, and able to reach our destination successfully, and there is more we can draw on if things get rough. Likewise, managing infrastructure requires complete and accurate information on the full extent of the network, its performance over time, and the costs required to maintain and operate it.

This guideline was developed by the New Zealand Road Infrastructure Management Support Group (RIMS). It forms part of the Body of Knowledge that provides sector guidelines to assist road asset managers with the management planning of their assets.

1.1 Objectives of this guideline

Problems / Key Questions
During the development of the need for this guideline, common issues raised by a number of road controlling authorities in New Zealand included:
• How can performance management be used more effectively in the management and maintenance decision making of road assets?
• How do different performance frameworks relate to each other?
• How do statistical performance outputs relate to the health of the network, and how do performance measures relate to different work classes / treatments?
• The linkages between performance outcomes and the appropriate maintenance programme and investment profile are not clearly demonstrated.
• Agencies are not fully aware of the data implications of performance monitoring and need guidance on the type, frequency, quality and processing of condition data.
Aim
The aim of this guideline is to assist authorities in better decision making for investment in maintenance programmes through performance monitoring, reporting and management. This aim will be achieved by addressing the following objectives:
1. Improve the understanding of how the network performs, both through specific measures (what they measure and how they can be used) and from a statistical point of view
2. Clarify the performance management implications as viewed from the different performance monitoring frameworks used in New Zealand, including the One Network Road Classification (ONRC), Department of Internal Affairs (DIA) and Council-specific frameworks
3. Improve the understanding of how performance outcomes link to the appropriate maintenance programme and investment profile
4. Define the data implications of performance monitoring and provide guidance on the type, frequency, quality and processing of condition data

Benefits
A robust performance monitoring and reporting process will result in benefits for agencies, as illustrated in Figure 2 below.

Figure 2: Resulting Benefits from Performance Management
• Increased efficiencies of maintenance funds
• Reduced risks associated with inappropriate maintenance investments
• More valuable, consistent and transparent communication of condition outcomes
• Better focused maintenance investment

1.2 Structure of the Guideline
Part I – The Context for Performance Monitoring and Management
• Asset management process
• Why we monitor performance
• What performance monitoring could achieve
Part II – Performance Monitoring and Management Process
• Data Needs
• Performance Frameworks
• Telling the Story
• Business Case Support
• Application
2. Performance Management a Critical Part of the Asset Management Processes

2.1 Asset Management processes
Too often asset managers believe they will achieve successful network outcomes by executing only one of the asset management processes well. In fact, it is a balanced approach across the entire spectrum of asset management activities that creates a platform of robust linkages between the objectives (including investment strategies) of an organisation and, ultimately, the services provided to the community through the asset.

Figure 3 illustrates the main activities involved in road asset management. Although performance management is specifically indicated within the lifecycle management grouping, it links to, and could also be categorised in, most of the other processes:
• The outcomes from Asset Knowledge (e.g. condition data) are given meaning through performance reporting and management
• Performance outcomes are a crucial part of the overall decision processes, in particular those related to Level of Service (LoS) management
• Performance monitoring and reporting is a vital process for monitoring the asset delivery cycle by reporting the outcome of both operations and maintenance. It is also a significant component within the procurement processes
• Lastly, performance monitoring and management is ultimately the communication vehicle, reporting on progress with strategies and legislative requirements.

Figure 3: The Asset Management Road Map (Henning, 2015). (Diagram: customers, legislation and investors feed institutional arrangements (governance, strategy, asset management plans); enabling people (skills, training and guidelines) and decision making support lifecycle management (demand management, lifecycle cost analysis, pavement management tools, acquire / operate / maintain / dispose, depreciation and valuation, risk management, delivery, performance and Level of Service management, procurement); the whole rests on Asset Knowledge (road hierarchy, database, inventory and condition data).)
2.2 The business case for performance management

Why?
We report on the outcome of road investment in order to make better decisions towards providing the travelling public with the service they require at an affordable, sustainable and optimal cost. Figure 4 illustrates the investment approach, where we apply inputs to achieve outputs and outcomes. The systems used to convert the inputs into outputs require monitoring to ensure the investment is appropriate and the results will contribute to the outcomes we seek from the system.

Figure 4: The Investment Approach (Based on Karlaftis and Kepaptsoglou, 2012). (Diagram labels: road administrator, inputs, physical subsystem, human activity subsystem, user subsystem, values, outputs, concomitant outputs, conditioned outcomes.)
What?
Performance monitoring and reporting have to assist decision processes at a strategic / investment level while informing programming and planning processes at tactical and operational levels. Figure 5 illustrates the three levels of management and planning in asset management and the typical questions performance management is able to cover.

Figure 5: Application of Performance Monitoring in Respective Asset Management Levels
• Strategic: How does the actual performance compare to target levels? Is the investment targeting the right outcomes? Are investment levels sustainable? Are the risks appropriately managed?
• Tactical: Ensure sustainable investment levels; timing and type of renewal and maintenance
• Operational: Linking technical inputs to performance outcomes

It is vital to monitor the level of achievement of the outputs and outcomes sought. This tests whether the investment and methods we use are fit for purpose, and builds our evidence base for future decision making.
How?
• "Snapshot" performance reporting – usually relative to performance targets. This technique is often used in Customer LoS reporting
• Trend monitoring – tracking the change in the performance of the network over time: "has it got better or worse?"
• Benchmarking – a process of comparison where targets are set on the basis of other peer organisations: "I want to be as good as that team"

In order to answer the different asset management questions, performance reporting is undertaken using different data sets, as illustrated in Figure 6. The figure also shows the different purposes of the respective monitoring processes.

Figure 6: Performance monitoring applied to different datasets
In some cases we have known target performance levels; in other cases, we have to rely on benchmarking ourselves against others. Keep in mind that "good" performance is relative to the type of comparison used. Figure 7 shows the potential outcome of relative comparison analyses. It shows the value of undertaking internal performance comparisons (say, comparing current performance to set target performance levels) combined with comparisons against external authorities. Combining internal and external comparisons highlights the appropriateness of the defined target performance levels. For example, my authority may be performing very well against internally defined targets, but when compared to other authorities, we may find that it is over- or under-performing.

Figure 7: Outcomes from relative comparison analyses
Note: Over-performance against both internal targets and external comparison could also be classified as a bad outcome.
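To make the combined comparison concrete, the sketch below classifies a measure against both an internal target and a peer-group median, in the spirit of Figure 7. It is a minimal illustration: the function name, the example values and the "lower is better" convention (as for roughness) are assumptions, not part of any framework.

```python
# Minimal sketch of Figure 7's logic: combine an internal target
# comparison with an external (peer) comparison. Values, target and
# peer median are hypothetical; assumes "lower is better" (e.g. IRI).

def classify_performance(value, internal_target, peer_median):
    """Classify a condition measure against an internal target and a peer group."""
    meets_target = value <= internal_target
    beats_peers = value <= peer_median
    if meets_target and beats_peers:
        return "on target and competitive with peers"
    if meets_target:
        return "meets internal target, but peers do better - target may be too soft"
    if beats_peers:
        return "misses internal target, yet ahead of peers - target may be too hard"
    return "under-performing on both comparisons"

# Example: network average roughness (IRI, m/km)
print(classify_performance(value=3.1, internal_target=3.5, peer_median=2.8))
```

The value of the two-way comparison is exactly the second and third branches: an internal target that everyone beats, or that nobody can reach, is itself the finding.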
3. Finding the Optimal Performance Level to Maintain – The Long Term Objective of Asset Management

3.1 Realistic expectations of performance reporting planning applications
Section 2.1 describes how performance monitoring and management is relevant to most of the processes and steps involved in the entire asset management process. Because performance reporting is so widely used, it often attracts unrealistic expectations of what reporting on performance is able to tell us. Table 1 lists the range of application areas for performance monitoring; applications are further discussed in Section 8.

Table 1: Performance Monitoring Application

Performance question: What are the overall network / subnetwork performance levels or targets we should aim for?
What performance reporting is able to tell us: What standard is being achieved; general indications or suggested trends in future performance; what the past performance of the network was; what the current performance or performance distribution of the network is; how my authority's performance compares to others.
Additional processes / considerations required: Suggested target performance levels are also a combination of available funding (short, medium and long term); the performance expectations of the community; the risks we need to manage; and the expected / forecasted performance of each road link into the future.

Performance question: What is the optimal maintenance regime on a network?
What performance reporting is able to tell us: Through performance reporting and benchmarking techniques, we are able to report on efficiencies.
Additional processes / considerations required: An optimised maintenance programme for different investment levels requires forecasted performance modelling, lifecycle analyses and optimisation techniques.

Performance question: What is the best treatment on a road section at any point in time?
What performance reporting is able to tell us: Performance management can only be used for fixed decision algorithms; these do not optimise a programme, as they often resort to a "worst first" approach.
Additional processes / considerations required: This is not a static answer, because it requires knowledge of past performance and available funding (on a network level), and must take account of road function and user expectations.
3.2 What is a good performance for the road network under different LoS expectations
"Good performance" can only be defined in the context of the situation the network serves. Figure 8 shows LoS, investment and risk as an interlinked system for planning and managing road maintenance and renewals. While each of these elements could be considered in isolation using performance reporting, the interactions between them are more complex; forecasted outcomes, lifecycle costing and optimisation are therefore needed to determine the optimum outcomes.

Often the status quo becomes the default level of service, as that is what customers are used to receiving. However, if there is a significant service level gap, the customers have already made it clear that the existing level of service is not satisfactory. In this situation, the default level of service is the anticipated improved service. This approach is embraced by the ONRC philosophy, where an asset serving a given function in one location should perform similarly wherever that asset / network is.

When looking at network performance it is important to consider: is this a network issue or a local issue? Extrapolating localised issues across the network is not a helpful intervention strategy.

While roading authorities have traditionally chosen the Level of Service that will be provided, along with the level of risk that is acceptable, there has since been some change in how the corresponding investment is secured.

Figure 8: Road network maintenance planning is a balancing act between investment, risk and LoS
3.3 Good performance for individual road sections
Just as we are unable to use performance monitoring and reporting alone to determine the network maintenance and renewal programme, the same applies to making decisions on individual road sections. Some of the reasons why it is dangerous to base decisions only on historic and current performance data are:
• Although engineers have the skills to make an appropriate engineering decision on the maintenance or renewal needs of an individual road section, it is difficult to make that decision while also accounting for available funding and all the other road sections on the network. It is thus not surprising that maintenance and renewal programmes based solely on field processes and condition reporting often resort back to a "worst first" strategy;
• Likewise, engineers are often unable to bring life-cycle costing into consideration when making maintenance decisions for individual road sections. The main decision is not only "what is the most appropriate choice of treatment now"; it is often more important to know how the current decision will impact future maintenance costs.

Figure 9 provides an example of using a fixed performance level for making maintenance decisions. Assume for this example that both road sections carry equal traffic and have the same functional classification. Just because the two road sections reach the same condition threshold at the same time does not mean that a common treatment or treatment timing would produce a good outcome for both. It would perhaps have made more sense to apply a preservation treatment much earlier to Road Section A in order to arrest its fast deterioration rate.

Figure 9: The danger of making maintenance decisions on the basis of condition alone

A more holistic approach to field decisions, incorporating input from an array of considerations or "decision lenses", is explained in Section 8.2.
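The deterioration-rate point in Figure 9 can be sketched in a few lines of code: two sections reach the same threshold at the same time, but the faster deteriorator is the stronger candidate for early preservation. The condition histories, the linear-rate simplification and the flagging threshold below are invented for illustration only.

```python
# Illustrative sketch of Figure 9: two road sections reach the same
# condition threshold together, but their deterioration rates differ.
# Condition histories are invented; a simple linear rate is assumed.

def annual_deterioration_rate(history):
    """Average annual change in a condition measure between first and last survey."""
    years = sorted(history)
    span = years[-1] - years[0]
    return (history[years[-1]] - history[years[0]]) / span

section_a = {2014: 10, 2019: 55}   # fast deteriorator
section_b = {2004: 10, 2019: 55}   # slow deteriorator

for name, hist in [("A", section_a), ("B", section_b)]:
    rate = annual_deterioration_rate(hist)
    flag = "consider early preservation treatment" if rate > 5 else "monitor"
    print(f"Section {name}: {rate:.1f} units/yr -> {flag}")
```

Both sections sit at the same condition value in 2019, yet Section A has deteriorated three times as fast; a threshold-only rule would treat them identically.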
Part II Performance Monitoring and Management Process
Part II Sections

Data
4.1 Data needs and processing
4.2 Recommended data collection frequencies

Frameworks
5.1 Key performance frameworks for managing the road asset
5.2 Additional references to consider
5.3 Available measures
5.4 Measures, composite indices, and future performance areas

Story
6.1 Story-telling
6.2 Understanding the full statistical distribution
6.3 Understanding network condition distribution & distribution changes over time
6.4 Tips for effectively communicating performance measures

Business Case Support
7.1 Linking the performance to investment and maintenance strategy
7.2 Peer Comparison and Benchmarking

Application
8.1 Underlying principles of making optimal decisions on an operational level
8.2 Using trends during field investigations / RAPT reviews
4. Data

4.1 Data needs and processing
"Public organisations that manage their information well will treat data as a strategic asset. This means that they recognise its value and that they have a deliberate strategy for how they manage and govern information." (Controller & Auditor General, 2018)

There is an obvious and direct relationship between the quality of performance reporting and the quality of the data. For that reason, it is good practice to report on data quality as part of any performance reporting. A simple example of this is the data quality tables provided with asset valuation reports. The Road Efficiency Group (REG) data quality tools are well suited to more thorough reporting on these issues. Table 2 shows an example of data quality reporting from the REG tool. It is good practice to report on the data quality of each of the data items used in a performance monitoring and reporting process. Also note that reporting on data quality needs to cover both inventory data items and the performance measures used.

Table 2: Data Quality Report on Each Data Item (Source: REG)

Ref: AM-TL2 (Network – Treatment Length)
Metric: Treatment length sectioning maintained. Percentage of treatment length records added or updated during the last three financial years. Excludes pavement type "U" and disabled treatment lengths.
Primary dimension: Timeliness. Secondary dimensions: Accuracy, Completeness.
Initial 2016/17 national result & comments: The result is a lot higher than expected and very similar to AM-Ca2. This needs to be explored further.

Ref: AM-Su1ba (Asset Inventory – Surfacing)
Metric: Achieved chipseal resurfacing renewal programme as-builted. Percentage of achieved chipseal resurfacing renewals reported in TIO and as-builted in RAMM (in m2) for the reported financial year.
Primary dimension: Completeness. Secondary dimensions: Timeliness, Consistency.
Initial 2016/17 national result & comments: Significantly more data loaded to RAMM than reported as achieved in TIO. Script needs to be updated to include surfaces with appropriate works origin / category only.

Ref: AM-Su1bb (Asset Inventory – Surfacing)
Metric: Achieved asphaltic concrete resurfacing renewal programme as-builted. Percentage of achieved asphaltic concrete resurfacing renewals reported in TIO and as-builted in RAMM (in lane.km) for the reported financial year.
Primary dimension: Completeness. Secondary dimensions: Timeliness, Consistency.
Initial 2016/17 national result & comments: Same as AM-Su1ba. Larger variation largely due to the smaller quantities.

Ref: AM-Su4 (Asset Inventory)
Metric: Surface records have valid attribute data. Percentage of treatment length records with top surface records with a valid chip size (AM>=7, CS
Primary dimension: Accuracy.
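As a hedged illustration of how a REG-style completeness metric such as AM-TL2 might be computed, the sketch below calculates the percentage of treatment length records updated within the last three financial years, excluding pavement type "U" and disabled records as the metric definition states. The record structure and field names are assumptions, not the actual RAMM schema or the REG scripts.

```python
# Sketch of a data quality metric in the spirit of AM-TL2: percentage
# of treatment length records added or updated during the last three
# financial years. Field names and records are hypothetical, not the
# real RAMM schema or REG implementation.

from datetime import date

def pct_recently_updated(records, as_at, years=3):
    """Share of live records touched within the last `years` years, as a %."""
    cutoff = date(as_at.year - years, as_at.month, as_at.day)
    live = [r for r in records
            if r["pavement_type"] != "U" and not r["disabled"]]
    recent = [r for r in live if r["last_updated"] >= cutoff]
    return 100.0 * len(recent) / len(live) if live else 0.0

records = [
    {"pavement_type": "T", "disabled": False, "last_updated": date(2018, 5, 1)},
    {"pavement_type": "T", "disabled": False, "last_updated": date(2012, 7, 9)},
    {"pavement_type": "U", "disabled": False, "last_updated": date(2019, 1, 3)},
]
print(f"{pct_recently_updated(records, as_at=date(2019, 10, 1)):.0f}% updated")
```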
The quality of performance reporting is also a function of appropriate data processing techniques. Table 3 provides some useful tips for data processing.

Table 3: Tips for Data Processing

Processing aspect: Data aggregation / reporting intervals
Why is this important? Statistically robust processing requires working with consistent base data sectioning lengths.
Recommended practice: Use consistent 100 or 200 m lengths; the ONRC performance reporting tool uses 100 m lengths. Given the length and changing nature of treatment lengths, using treatment lengths is not appropriate for statistical reporting.

Processing aspect: Time-based analysis – network changes
Why is this important? The RAMM data structure is not temporal. Therefore, for specific analyses such as "how long do my surfaces last", care should be taken that the appropriate surface types are matched to the performance data and dates.
Recommended practice: Use annual snapshot data in order to allow for time-based changes, e.g. when surface types change. Note that the annual snapshot will be a time slice of the inventory and condition data at a given point. As the different data items are collected at different frequencies and at different times of the year, the timing of such a snapshot is important; e.g. the condition data has to relate to an updated inventory.

Processing aspect: Time-based / trend analyses – network sampling
Why is this important? In the past, some authorities used sampled surveys; for example, they may have covered only one third of the network in annual roughness surveys. Inconsistent condition monitoring samples cannot be used for trend analyses and have limited value as "snapshot" or "latest" data.
Recommended practice: Undertake full network analysis or, when sampling (e.g. by road class) is used, do separate performance trend analyses on consistently measured samples. For trends to be valid, they must be based on continuously monitored parts of the network. E.g. if a network is split into three parts (a, b and c), with "a" surveyed every year and "b" and "c" in alternating years, valid trend analysis can only be undertaken on "a", "b" and "c" as separate analysis sets, not as a combined set.

Processing aspect: Frequency of measurements
Why is this important? Industry minimum data collection requirements often limit trend analyses. A minimum of three data points is required for any meaningful trend analysis.
Recommended practice: The ultimate purpose of the trend analyses should be considered when deciding on the data collection frequency. For example, with the NLTP cycle running in three-year blocks, collecting the data only once during the three years will have limited value (weak evidence) for trend analysis.

Processing aspect: Check data quality / validation, not only completeness
Why is this important? Completed condition data collection surveys do not necessarily mean the data quality is good.
Recommended practice: Validation of trend analyses should prove that observed trends are real and not subject to bias within the measurements themselves. Simple checks include checking for outliers and for correlations between maintenance quantities and condition outcomes.

Processing aspect: Contextual data
Why is this important? Trend analyses need to be fully supported by contextual data that may have impacted the trends.
Recommended practice: Always include routine maintenance cost trends alongside condition trends. Significant shifts in condition performance also need to be explained by presenting potential causal factors, such as changed or excessive rainfall patterns and / or usage changes such as significant loading increases due to timber logging or dairy expansion.

Processing aspect: Meaningful reporting
Why is this important? Condition trends by themselves are seldom meaningful. It is good to connect reporting to the end customer.
Recommended practice: Correlation between different trends increases the understanding and value of reporting. Figure 10 is an example suggesting an initial increase in customer complaints when the blading frequency changed; over time the new LoS was accepted and complaints reduced.
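The first tip in Table 3 — reporting on consistent 100 m sections rather than variable treatment lengths — can be illustrated with a short sketch. The input format (metre position, condition value) and the use of a simple mean as the aggregate are assumptions for illustration only.

```python
# Sketch of Table 3's first tip: aggregate raw condition readings onto
# consistent 100 m reporting sections instead of variable treatment
# lengths. Readings (start position in metres, value) are invented.

from collections import defaultdict

def aggregate_100m(readings, section_len=100):
    """Mean condition value per fixed-length section along a road."""
    buckets = defaultdict(list)
    for position_m, value in readings:
        buckets[int(position_m // section_len)].append(value)
    return {s * section_len: sum(v) / len(v) for s, v in sorted(buckets.items())}

# 20 m interval roughness readings (IRI, m/km) for a 300 m road
readings = [(0, 2.1), (20, 2.3), (40, 2.2), (60, 2.8), (80, 3.0),
            (100, 3.4), (120, 3.9), (140, 4.1), (160, 3.8), (180, 3.6),
            (200, 2.5), (220, 2.4), (240, 2.6), (260, 2.7), (280, 2.5)]
for start, mean_iri in aggregate_100m(readings).items():
    print(f"{start:>4}-{start + 100} m: mean IRI {mean_iri:.2f}")
```

Because the sections are fixed, this year's 0–100 m value is directly comparable with last year's, which is what makes the statistical trend reporting in the rest of this section valid.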
Figure 10: Linking Grading Frequency to Customer Complaints (Robertson, 2018)
Note: The data for 2018 does not include a full financial year; complaints were only recorded from 2011 onwards.

4.2 Recommended data collection frequencies
Condition data collection returns significant benefits for the investment, yet authorities scrutinise the cost of data collection because of the perception that it does not directly result in "fixing roads". The NZTA research report on performance monitoring (Henning et al., 2015) produced Table 4 and Table 5, which include suggested minimum data collection practices for the purpose of performance reporting and management.
Table 4: Data collection requirements for performance monitoring (Adapted from Henning et al., 2015)

Functional Road Classification | Data confidence level (see Table 5) | Survey frequency | Data items
National | 4 | Annual | Full HSD
Regional | 4 | Annual | Full HSD
Arterial | 4 | Annual | Full HSD
Primary collector | 4 | 2 years | Full HSD
Secondary collector | 3 | 2 years | Full HSD
Access | 3 | 2 years | R&R and FWD
Access Low Volume | 3 | 2 years | Visual

Legend: Full HSD = roughness, rutting, texture, skid, FWD (50 to 100% cover); R&R and FWD = roughness, rutting and Falling Weight Deflectometer (33% sample)

Table 5: Confidence level rating framework for most common road condition data (Henning et al., 2015)

Equipment sophistication – Very low (1): Visual; Low (2): Automated, non-laser; Medium (3): Automated laser; High (4): Automated laser
Calibration standard – Very low (1): No calibration; Low (2): Internal calibration process; Medium (3): Contractual calibration; High (4): Calibrated according to NZTA state highway standards
Quality assurance (QA) – Very low (1): No evidence of QA; Low (2): Internal QA; Medium (3): Contractual QA, e.g. loop method; High (4): Calibrated according to NZTA state highway standards
Post survey confirmation – Very low (1): None; Low (2): Compare overall network trends; Medium (3): Consider individual sections and exception reporting; High (4): Benchmark with LTPP sites

Note: the overall rating is calculated as the average score from each confidence factor
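The note under Table 5 states that the overall confidence rating is the average of the four factor scores. The sketch below simply applies that stated rule; the factor names come from Table 5, while the example scores are invented.

```python
# Sketch of Table 5's note: the overall data confidence rating is the
# average of the scores (1-4) for each confidence factor. Example
# scores are hypothetical.

def overall_confidence(scores):
    """Average of per-factor scores; each score is 1 (very low) to 4 (high)."""
    return sum(scores.values()) / len(scores)

survey = {
    "equipment_sophistication": 4,  # automated laser
    "calibration_standard": 3,      # contractual calibration
    "quality_assurance": 3,         # contractual QA
    "post_survey_confirmation": 2,  # compare overall network trends
}
print(f"Overall confidence rating: {overall_confidence(survey):.1f} / 4")
```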
5. Frameworks

5.1 Key performance frameworks for managing the road asset
Councils have to consider performance reporting for different organisations; the question is how these frameworks relate to a Council's network performance. This section puts the frameworks into perspective.

Performance may be measured for internal business / Council needs or to meet the reporting requirements of other agencies, such as the Department of Internal Affairs and NZTA. The performance of the asset also needs to be understood to inform decision making on how the asset can best be managed.

Figure 11 shows the different performance frameworks currently in use within the sector and where each fits in relation to the respective asset management processes. Of these frameworks, the ONRC and Council-specific performance measures cover the entire asset management level spectrum. Appendix A demonstrates how Auckland Transport has integrated its own and ONRC reporting frameworks. The focus of the DIA and Wellbeing frameworks is at a national level, and they are thus not necessarily integrated at the local level.

Figure 11: Related Performance Frameworks to Road Asset Management Levels. (Diagram labels: Treasury Living Standards Framework; Council outcomes and objectives; Council Level of Service reporting and monitoring; non-financial performance measures (DIA); One Network Road Classification performance measures.)
The following examples of performance management frameworks illustrate the range of approaches in use, and the differences between measuring inputs, outputs and outcomes.

Table 6: Relevant Performance Frameworks for RCAs (Road Controlling Authorities)

Framework: Treasury Living Standards Framework (LSF) (Treasury, 2015)
Description: Being relatively new, the LSF is setting the pace at the national level for government entities to report the impact of policies and investment across four impact areas (capitals): economic, natural, social and human. It assesses the impact of policy across key living standards dimensions.
Relevance to managing roads: Although not widely used within transport performance monitoring frameworks, there will be a drive towards more consistent reporting in these areas in future activity management plans. Opportunities should be maximised to relate road infrastructure reporting to the LSF capitals.

Framework: Department of Internal Affairs Non-financial Performance Measures
Description: The rules for non-financial reporting measures came into force on 30 July 2014 under the Local Government Act 2002. Through these rules, Councils are obligated to report on certain performance measures as part of their asset management process. Performance measures to report on include:
1. How safe are the local roads?
2. What is the overall condition of sealed roads in the local road network?
3. Is the sealed road network being maintained adequately?
4. Are the footpaths that form part of the local road network being maintained adequately?
5. Does the local authority responsible for the service provide a timely response if there is a problem?
Relevance to managing roads: The DIA performance measures are included in Councils' Long-term Plans and asset management planning processes. Although at a strategic level, these measures are actively managed and monitored through delivering tactical performance measures that feed into the strategic outcomes.

Framework: One Network Road Classification Performance Framework (ONRC)
Description: Developed as part of the Road Efficiency Group's (REG) ONRC initiative, a set of expected performance measures was developed for each respective ONRC class. These measures are broadly categorised according to the following customer outcome areas: Amenity, Accessibility, Efficiency, Resilience, Travel Time and Safety. The Auckland Transport case study (Appendix A) illustrates how the ONRC framework is integrated with the Council-specific LoS performance framework.
Relevance to managing roads: The ONRC reporting framework is foundational in the strategic planning and reporting of road investment outcomes. Maintenance planning processes that determine the quantity, type and timing of maintenance ultimately have to be checked against achieving the stated customer outcomes.
Framework: Council-specific Level of Service monitoring and reporting
Description: Councils will have specific performance monitoring frameworks that incorporate the ONRC framework but add some Council-specific objectives. Most Councils will add asset preservation type measures to ensure sustainable maintenance programmes.
Relevance to managing roads: Specific tactical performance measure targets are specified to ensure that ONRC customer outcomes are met and that the road network is maintained in an optimal and sustainable manner.

5.2 Additional references to consider

Table 7: Additional References

Reference: OECD Performance Measurement in the Road Sector: A Cross-Country Review of Experience (Karlaftis and Kepaptsoglou, 2012)
Description and value to NZ authorities: Provides an excellent summary of performance frameworks used in the USA, Canada, NZ, Australia and Japan. It also provides many examples of performance measures to use for pavement and structure preservation; operational efficiency; capacity expansion; safety and the environment; and sustainability.

Reference: AP-T176/11 – Network Performance Indicators – Next Generation (Chin, et al., 2011)
Description and value to NZ authorities: Some good examples of developed composite indices for the overall performance of pavements, surfaces, safety and efficiency.

Reference: Guide to Asset Management (Hassan, et al., 2018)
Description and value to NZ authorities: Provides an overall road asset management approach, with a good explanation of how performance monitoring and reporting is used within strategic and tactical asset management.

Reference: ONRC – Best Practice Guides (obtainable from https://www.nzta.govt.nz/roads-and-rail/road-efficiency-group/resources/)
Description and value to NZ authorities: A number of ONRC guidelines provide significant guidance on performance monitoring and applying performance measures in planning. Some of the most relevant are:
• Auckland Transport ONRC Gap Analysis: a project plan
• Lag measures vs real-time measures
• Reporting on mandatory non-financial performance measures – a Waikato guideline
• Incorporating ONRC into asset management planning
• Bridge management framework

Reference: AP-T84/07 Application of the Analytic Hierarchy Process in Road Asset Management: User Manual (Su and Hassan, 2007)
Description and value to NZ authorities: The analytic hierarchy process is a decision technique that uses performance measures in decision making. For example, it considers the different impact areas of roughness (e.g. preservation, LoS and safety) for ranking interventions.
5.3 Available measures
The following table combines the suites of measures typically used by roading asset managers. It should be noted that the ONRC output measures are operational / tactical, while the outcome measures are more strategic. The outcome measures consider the performance of the transport system overall, while the output measures are more asset-specific.

Table 8: Available Measures
(For each measure: reporting type – Snapshot (S), Trend (T) or Benchmark (B); level of influence on the business case and on maintenance planning; works that impact on the measure; and review comments.)

Framework: ONRC

Safety – Customer Outcome
• Number of fatal and serious injuries: the total number of fatal and serious injuries per year. S, T, B. Influence: business case major, maintenance planning minor. Works: strategic safety programme investment, black spot improvements. Comment: direct measure of safety outcomes.
• Collective risk (fatal and serious injury rate per km): an intensity measure that highlights dangerous routes or parts of the network. S, T, B. Influence: business case major, maintenance planning minor. Works: strategic safety programme investment, black spot improvements. Comment: direct measure of safety outcomes for specific parts of the network.
• Personal risk (fatal and serious injury rate by traffic volume): the total number of fatal and serious injuries by traffic volume per year. S, T, B. Influence: business case major, maintenance planning minor. Works: strategic safety programme investment, black spot improvements. Comment: direct measure of safety outcomes for specific parts of the network.

Safety – Technical Output
• Permanent hazards: the number of permanent hazards not marked in accordance with national standards (e.g. building / narrow bridge within the road corridor). S, T, B. Influence: business case minor, maintenance planning intermediate. Works: marking of hazards. Comment: specific output focus.
• Temporary hazards: the number of sites inspected and the number of audits compliant with COPTTM. S, T, B. Influence: business case minor, maintenance planning minor. Works: contractual compliance. Comment: specific output focus, indicative of network.
• Sight distances: the number of locations where sight distance or signs are obstructed. S, T, B. Influence: business case minor, maintenance planning intermediate. Works: minor safety works. Comment: specific output focus, indicative of network.
• Loss of control on wet roads: the number of fatal and serious injuries attributable to loss of driver control (including on wet roads). S, T, B. Influence: business case major, maintenance planning minor. Works: skid treatments and / or curve re-alignment. Comment: specific output focus, indicative of network.
• Loss of driver control at night: the number of fatal and serious injuries occurring in crashes at night. S, T, B. Influence: business case major, maintenance planning minor. Works: signs, lighting, skid treatments and / or curve re-alignment. Comment: specific output focus, indicative of network.
• Intersections: the number of fatal and serious injuries at intersections. S, T, B. Influence: business case minor, maintenance planning minor. Works: skid treatments. Comment: specific output focus, indicative of network.
• Hazardous faults: the number of hazardous faults requiring evasive action by road users (e.g. a large pothole). S, T, B. Influence: business case minor, maintenance planning intermediate. Works: routine maintenance, minor safety works. Comment: specific output focus, indicative of network.
• Cycle path faults: the number of cycle path hazards requiring evasive action by cyclists. S, T, B. Influence: business case minor, maintenance planning intermediate. Works: routine maintenance, minor safety works. Comment: direct measure of safety outcomes for specific users.
• Vulnerable users: the number of fatal and serious injuries involving vulnerable users. S, T, B. Influence: business case major, maintenance planning minor. Works: strategic safety programme investment, black spot improvements. Comment: direct measure of safety outcomes for specific users.
• Roadside obstructions: the number of locations with unauthorised items placed within the road reserve. S, T, B. Influence: business case minor, maintenance planning minor. Works: inspection, routine maintenance. Comment: specific output focus, indicative of network.

Resilience – Customer Outcome
• Number of journeys impacted by unplanned events: the number of unplanned road closures and the number of vehicles affected by closures. T. Influence: business case major, maintenance planning major. Works: bridge capacity, drainage improvements. Comment: impact of local issues on network performance.
• Number of instances where road access is lost: the number of unplanned road closures, and vehicles affected, where there was no viable detour route. T. Influence: business case major, maintenance planning major. Works: bridge capacity, drainage improvements, creation of emergency redundancy. Comment: impact of local issues on network performance.

Amenity – Customer Measures
• Smooth Travel Exposure (STE): the percentage of travel on sealed roads smoother than a defined threshold. S, T, B. Influence: business case minor, maintenance planning minor. Works: localised repairs or full rehabilitation and smoothing treatments. Comment: direct measure of outcomes; strong link to the customer.
• Peak roughness: the 85th and 95th percentile roughness of your roads. S, T, B. Influence: business case minor, maintenance planning minor. Works: localised repairs or full rehabilitation and smoothing treatments. Comment: direct measure of outcomes; strong link to the customer.
• Average roughness: average roughness (IRI) measured by laser or bump integrator. S, T, B. Influence: business case minor, maintenance planning moderate. Works: rehabilitation (smoothing treatments such as granular or asphalt overlays). Comment: direct measure of outcomes; impacts directly on drivers' perception and comfort. (Not always an indication of road deterioration; it is heavily impacted by topography and urban / rural differences.)
• Aesthetic faults: the number of aesthetic faults that detract from the customer experience. S, T. Influence: business case minor, maintenance planning major. Works: the full spectrum of maintenance options. Comment: specific output focus, indicative of network.

Accessibility – Customer Outcome
• The proportion of network not available to (a) Class 1 heavy vehicles and (b) 50MAX vehicles: the proportion of each road classification not accessible to Class 1 heavy vehicles and 50MAX vehicles. S, B. Influence: business case major, maintenance planning minor. Works: rehabilitation. Comment: impact of local issues on network performance (subjective and a function of risk tolerance).

Accessibility – Technical Output
• Accessibility: the number of instances where the road is not marked in accordance with national standards. T. Influence: business case minor, maintenance planning minor. Works: traffic services, marking. Comment: specific output focus, indicative of network.

Travel Time Reliability – Customer Outcome
• Output at indicator information sites: the hourly traffic volume during the peak morning hour and the peak afternoon / evening hour. T, B. Influence: business case major, maintenance planning minor. Works: contracts that encourage night works. Comment: impact of local issues on network performance; strong customer focus.

Cost Efficiency
• Pavement rehabilitation: the total quantity and cost of pavement rehabilitation. S, T, B. Influence: business case major, maintenance planning major. Works: rehabilitation. Comment: lag indicator of network health and preservation.
• Chipseal resurfacing: the total quantity and cost of sealed road chipseal resurfacing. S, T, B. Influence: business case major, maintenance planning major. Works: resurfacing. Comment: lag indicator of network health and preservation.
• Asphalt resurfacing: the total quantity and cost of asphaltic sealed road resurfacing. S, T, B. Influence: business case major, maintenance planning major. Works: AC resurfacing. Comment: lag indicator of network health and preservation.
• Unsealed road metalling: the total quantity and cost of metalling undertaken over the previous year as renewal work. S, T, B. Influence: business case major, maintenance planning major. Works: maintenance, grading and re-gravelling. Comment: lag indicator of network health and preservation.
• Overall network cost, and cost by work category: the overall cost per km and per vkt of routine maintenance activities, and cost by work category on each road. S, T, B. Influence: business case major, maintenance planning minor. Works: all. Comment: lag indicator of network health and preservation.

Framework: DIA

Road Safety
• Crash rate trend: the change from the previous financial year in the number of fatalities and serious injury crashes. T. Influence: business case minor, maintenance planning intermediate. Works: safety improvements. Comment: direct measure of safety outcomes.

Road Condition
• Smooth Travel Exposure (STE): the percentage of travel on sealed roads smoother than a defined threshold. S, T, B. Influence: business case minor, maintenance planning minor. Works: localised repairs or full rehabilitation and smoothing treatments. Comment: direct measure of outcomes; impacts directly on drivers' perception and comfort. (Not always an indication of road deterioration; it is heavily impacted by topography and urban / rural differences.)

Maintenance (planned vs actual)
• Maintenance of a sealed local road network: the percentage of the sealed local road network that is resurfaced (compared to the target area set in the Asset Management Plan). S, T. Influence: business case minor, maintenance planning minor. Works: resurfacing. Comment: an indication of effective programme delivery.

Footpath Conditions
• The proportion of footpaths above LoS expectation: the percentage of footpaths within a territorial authority district that fall within the level of service or standard for the condition of footpaths. S, T. Influence: business case major, maintenance planning major. Works: footpath maintenance and renewals. Comment: outcome measure, noting the result is relative to the authority's defined LoS.

Framework: Additional useful measures

Asset Preservation
• 75th percentile rutting: the 75th percentile rutting value as measured by high-speed data collection. S, T, B. Influence: business case major, maintenance planning major. Works: rehabilitation. Comment: a strong indicator of pavement health (requires an HSD survey).

Safety
• Network portion above peak rutting: the portion of the network with rutting exceeding 15 mm, which may lead to water ponding and unsafe driving conditions. S, T, B. Influence: business case minor, maintenance planning major. Works: localised repairs. Comment: direct impact measure for wet-road crashes.

Asset Preservation
• Surface Condition Index (SCI): the composite index for surfacing defects as calculated by NZTA. S, T, B. Influence: business case major, maintenance planning major. Works: resurfacing. Comment: an overall measure of surface condition (does not indicate specific surface issues).

Customer / Life-cycle Management
• Portion of the network in very poor condition (e.g. 95th percentile): relative measures as per the LoS definition for the authority. S, T. Influence: business case major, maintenance planning minor. Works: all maintenance treatments. Comment: customer LoS perception focused (subjective and relative to the specific council).
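Smooth Travel Exposure appears in both the ONRC and DIA suites above. A common way to compute it is as the share of vehicle-kilometres travelled (length × AADT) on sections smoother than the roughness threshold; the sketch below follows that interpretation with invented section data and threshold, so treat it as illustrative rather than the prescribed NZTA calculation.

```python
# Sketch of Smooth Travel Exposure (STE): the percentage of travel
# (vehicle-kilometres) occurring on sealed sections smoother than a
# defined roughness threshold. Data and threshold are illustrative;
# actual thresholds vary by road category.

def smooth_travel_exposure(sections, iri_threshold):
    """sections: iterable of (length_km, AADT, IRI). Returns STE as a %."""
    total_vkt = sum(length * aadt for length, aadt, _ in sections)
    smooth_vkt = sum(length * aadt for length, aadt, iri in sections
                     if iri < iri_threshold)
    return 100.0 * smooth_vkt / total_vkt

sections = [
    (2.0, 8000, 2.4),   # length km, AADT, IRI m/km
    (1.5, 3000, 4.6),
    (0.8, 12000, 3.1),
]
print(f"STE: {smooth_travel_exposure(sections, iri_threshold=4.0):.1f}%")
```

Weighting by traffic is what makes STE a customer measure: a rough but lightly used access road drags the result down far less than a rough arterial.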
5.4 Measures, composite indices, and future performance areas
Performance monitoring and measurement are developing rapidly as our understanding of the questions and problems we encounter increases, measurement technology improves, and our ability to report and convey performance results improves. This section provides some general guidance on the future development of measures and the way in which we report them.

5.4.1 Performance measures
Asset performance can be measured using different methods and technology. A key task for the network manager is to ascertain which measures are useful and will inform decision-making.

Figure 12: Measures that describe the surface condition. (Diagram labels: cracking – longitudinal, lateral, alligator; ravelling; potholes; pothole patches; flushing.)

Figure 13: Measures that describe the pavement make-up and condition
Note: SNP is the modified structural number; curvature is the shape of the deflection bowl as measured by the falling weight deflectometer
Adapted from https://kids.britannica.com/students/assembly/view/19289
5.4.2 Composite indices
While performance measures are usually the quantified values of individual distress mechanisms, indices are often used to combine these measures into a single number. Composite indices are very useful for describing an overall outcome that aggregates a number of different individual indicators or performance measures. The most widely used example in New Zealand is the Surface Condition Index (SCI), which combines a number of surfacing defects into an overall surface health indicator. The factors included in the SCI are:
• cracking (from alligator cracking in RAMM)
• ravelling (from scabbing in RAMM)
• potholes
• pothole patches
• flushing
• surface age in years
• expected surface (design) life in years

Figure 14 provides a typical urban network surface condition distribution as expressed by the SCI. Figure 15 gives the suggested condition ranges for the SCI.

Figure 14: A Typical Distribution of SCI for an Urban Authority

The figure shows that the worst surface condition is observed at the lower-volume end of the road spectrum (Primary collector and below). A higher SCI is observed for Arterial and Regional roads, suggesting that these surfaces (predominantly asphalt) display a high degree of defects (mostly cracking). Nationwide performance reporting suggests that asphalt surfaces have significantly more cracking than chip seals.
Figure 15: Suggested condition ranges for SCI. (Table: SCI condition bands with lower (>=) and upper threshold bounds.)
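The guideline lists the SCI's input factors but not NZTA's actual formula, so the sketch below shows only the general shape of a composite index: a weighted combination of normalised defect measures. The weights, normalisation and example values are invented and should not be read as the NZTA SCI calculation.

```python
# Illustrative composite surface condition index: a weighted combination
# of normalised defect measures, in the general spirit of the SCI. The
# weights and normalisation are invented and are NOT the NZTA SCI formula.

def composite_index(measures, weights):
    """Weighted average of defect measures, each pre-normalised to 0-100."""
    return sum(weights[k] * measures[k] for k in weights) / sum(weights.values())

measures = {            # each normalised: 0 = no defect, 100 = fully defective
    "cracking": 15.0,   # alligator cracking (from RAMM, scaled)
    "ravelling": 5.0,   # scabbing in RAMM
    "potholes": 2.0,
    "pothole_patches": 8.0,
    "flushing": 1.0,
    "age_ratio": 80.0,  # surface age / expected design life, as a percentage
}
weights = {"cracking": 3, "ravelling": 2, "potholes": 3,
           "pothole_patches": 1, "flushing": 2, "age_ratio": 1}
print(f"Composite surface index: {composite_index(measures, weights):.1f}")
```

The design choice here is the usual trade-off of composite indices noted in Table 9: one number is easy to communicate and trend, but it cannot tell you which defect is driving the result.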
Table 10: Potential additional composite indices

Composite index: Safety Index
Description: A safety index that combines road features into an overall assessment of safety / crash risks. Some examples include KiwiRAP and iRAP (Figure 16).

Composite index: Structural Index
Description: An index that describes specific strength features of a road. Some examples are the World Bank modified structural number concept (SNP) and structural indices for rutting, flexure and shear (Henning et al., 2010).

Figure 16: iRAP Star Rating System for Roads (source: https://www.irap.org/). (Diagram: road survey and data preparation, road coding, processing, star ratings, investment plan and refined road design; target: optimise star ratings, reduce deaths and serious injuries.)

5.4.3 Future performance areas / measures
The development of performance measures and composite indices is an ongoing process, driven by continuing developments in data collection and analysis techniques. Furthermore, there are still some known issues associated with some current indices. Table 11 lists some aspects that may require further work.

Table 11: Future Performance Monitoring Areas / Measures

Monitoring area / measure: Roughness
Need / description: The International Roughness Index (IRI) has some known limitations, as highlighted in NZ Transport Agency Research Report 430 (Brown et al., 2010). At this point in time, it is perhaps sufficient if road controlling authorities are aware of the limitations.
Development status: We understand the issues, yet more research is needed to develop something better. Roughness is required as part of the DIA reporting suite.
Monitoring area / measure: Pavement preservation / sustainability measures
Need / description: We use surrogate measures such as rutting and strength as indicators of whether a network renewal rate is adequate. More work is needed in this area.
Development status: Most road controlling authorities are using roughness for this, which is the worst measure to use: roughness can stem from construction or geological causes, which does not mean the road needs maintenance. Other measures such as rutting are more useful for indicating pavement deterioration. This is perhaps an area for priority development.

Monitoring area / measure: Pavement failure risk
Need / description: Most measures we currently use report on performance under prevailing conditions. Vulnerability to changing traffic loading and / or increasing moisture levels currently causes difficulties in planning. A failure risk model / index would be useful for this purpose.
Development status: The initial concept has been developed; a PhD was completed on developing such a model (Schlotjes et al., 2014). Auckland Transport is trialling this concept.

Monitoring area / measure: Unsealed roads
Need / description: Most unsealed road reporting aspects are covered in the ONRC, yet some additional measures would be helpful for more effective investment planning for these roads. These include more traceable customer complaint information, reporting on dust emissions, and effective monitoring of aggregate source performance and use.
Development status: Refer to NZ Transport Agency research report 652 (Henning et al., 2018). More pragmatic ways of measuring dust are being researched.

Monitoring area / measure: Bridges
Need / description: There are some strong performance frameworks available, but they are not readily used in NZ. This area simply needs more focus.
Development status: Refer to the RIMS Data Collection Framework for Bridges.