Historically, loss reserving systems and processes have gotten by with few, if any, strong internal controls. Far too often, a company’s loss reserving process relies on manual procedures such as click-through data reconciliation, drag-and-drop spreadsheet modifications, and awkward journaling routines. These outdated approaches typically lack basic security controls, leaving important reserve and loss cost estimates vulnerable to human error and substantial inefficiency.
In addition, many carriers continue to get by with the bare minimum of process control necessary to meet their SOX or Model Audit Rule certifications and, of course, to obtain clean audit opinions. Given that reserves can be the largest (and most uncertain) item on an insurance company’s balance sheet, just scraping by can hardly be considered best practice. Today, technology and process automation can offer a governance framework that ensures greater confidence in management’s decision-making with much less effort.
Effective governance provides many benefits in addition to regulatory compliance. Reliable controls make the reserving process more efficient and less susceptible to disruption from staff turnover or rotation programs. Controls can provide more consistent analysis from period to period and across analysts, ensuring a better understanding of the overall reserving process on the part of senior management as well as more reliable and understandable output.
Naturally, introducing controls results in fewer surprises in the reserve analysis, which helps build confidence in the numbers at all levels. A reserving process that builds in early warning systems can help identify trouble spots to be communicated to management well before typical reviews. The added transparency also makes communicating results to the board or audit committee a little less stressful for the lead actuary or the chief financial officer (CFO). Finally, a well-governed reserving process makes the external audit more effective and efficient, shortening a costly and time-consuming audit window. Each of these benefits, quantifiable or not, makes controls throughout the reserving process well worth the investment.
These four benefits (increased efficiency, more involved management, fewer reserve surprises, and streamlined audits) are all intertwined, and come from robust controls implemented in four facets of the reserving process:
- Management involvement
- Reserving process
- Data quality
- Reserving analysis
The modern reserving department needs senior management to be strongly committed to and involved in the reserving process and in the determination of the final reported numbers. Management generally will be responsible for implementing most of the controls described herein, but also stands to gain the most from them. The reserving process itself needs to be well established, documented, and formally approved.
A modern actuarial reserving tool allows managers to quickly identify trouble spots and areas of concern within their data, via automated diagnostic summaries, actual-versus-expected reports, or analytic tools connected directly to the data underlying their analyses. Knowing where adverse development is coming from early in the process allows key personnel to focus on these areas, while allowing the simpler coverages developing as expected to be handled by junior analysts or the automated process itself.
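An actual-versus-expected diagnostic of the kind described above can be sketched in a few lines. The segments, amounts, development factors, and tolerance below are purely illustrative, not taken from any particular reserving tool:

```python
# Illustrative actual-versus-expected diagnostic: compare the incurred-loss
# position observed this period against the position implied by the prior
# review's expected development factors, and flag segments whose deviation
# breaches a tolerance. Segment names, amounts, and factors are hypothetical.

def actual_vs_expected(prior_incurred, current_incurred, expected_factors,
                       tolerance=0.05):
    """Return {segment: relative deviation} for segments breaching tolerance."""
    flags = {}
    for segment, prior in prior_incurred.items():
        expected = prior * expected_factors[segment]
        deviation = (current_incurred[segment] - expected) / expected
        if abs(deviation) > tolerance:
            flags[segment] = round(deviation, 4)
    return flags

flags = actual_vs_expected(
    prior_incurred={"auto": 1000.0, "property": 500.0},
    current_incurred={"auto": 1100.0, "property": 530.0},
    expected_factors={"auto": 1.02, "property": 1.05},
)
# "auto" developed about 7.8% above expectation and is flagged for review;
# "property" is within tolerance.
```

A report built on a check like this lets senior staff concentrate on the flagged segments while routine segments flow through the standard process.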
A clear workflow must be established within the reserving project. From retrieving the data to booking management’s selected estimates, all steps in this process benefit from management controls. These steps can vary in complexity, but at a minimum they must limit access to various components of the reserving process to specific employees, and track user activity throughout the system, identifying and timestamping all material changes.
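As a minimal sketch of those baseline controls (steps restricted to named roles, plus a timestamped activity log), assuming hypothetical role, step, and user names:

```python
from datetime import datetime, timezone

# Minimal sketch of the workflow controls above: each step is restricted to
# named roles, and every action is appended to a timestamped audit trail.
# The roles, steps, and user names here are purely illustrative.

PERMISSIONS = {
    "extract_data": {"data_analyst", "reserving_actuary"},
    "select_factors": {"reserving_actuary"},
    "book_reserves": {"chief_actuary"},
}

audit_trail = []  # append-only log of material changes

def perform_step(user, role, step, detail):
    """Run a workflow step only if the role permits it, logging the action."""
    if role not in PERMISSIONS.get(step, set()):
        raise PermissionError(f"role {role!r} may not perform {step!r}")
    audit_trail.append({
        "user": user,
        "step": step,
        "detail": detail,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })

perform_step("ajones", "reserving_actuary", "select_factors",
             "updated auto development factor selections")
```

A production system would enforce these permissions at the platform level rather than in application code, but the shape of the control is the same.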
A well-governed reserving process is a consistent reserving process. Codifying the roles of all team members, from accessing the data to the senior management signoff, results in everyone knowing precisely what their role is, when they are expected to perform each of their tasks, and what gets sent on to the next step in the process. This helps avoid surprises in the reserving timeline. It also helps allocate sometimes-scarce actuarial resources, and eases challenges from staffing shortages and rotation programs by shortening onboarding windows with consistent training and workflows.
Data quality controls
The single most important element within an actuarial analysis of reserves is the underlying data. Whether the input data comes directly from the claim records or is intercepted and aggregated prior to the actuaries’ access, controls are essential to assure the quality of the underlying data. For years, actuaries have relied on pivot tables for triangle generation and to summarize analyses; modern technology now provides more robust and automated extract, transform, and load (ETL) capabilities, which significantly mitigate the process and human risks inherent in using spreadsheets for such large data manipulation.
Reserving departments must ensure that the data used in the actuarial analysis is reconcilable to source data, which can be handled automatically with modern reserving tools. In addition, the tools used must be capable of querying data at the level sufficient for a range of actuarial reviews, from the standard triangles analyzed every quarter to ad hoc triangles that filter out specific subsets of claims. Best-in-class reserving departments take advantage of automated data input feeds to limit human intervention between the raw claim data and the reserving tools, substantially reducing numerous potential risks.
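At its core, an automated reconciliation utility compares totals from the source extract against the totals carried into the reserving tool. The following sketch uses made-up segment names, field names, and amounts:

```python
from collections import defaultdict

# Hedged sketch of an automated reconciliation check: total the claim-level
# source extract by segment and compare against the totals carried into the
# reserving tool. Segment names, field names, and amounts are made up.

source = [
    {"segment": "auto", "paid": 100.0},
    {"segment": "auto", "paid": 250.0},
    {"segment": "property", "paid": 400.0},
]

def reconcile(records, tool_totals, tol=0.01):
    """Return {segment: (source total, tool total)} wherever they disagree."""
    source_totals = defaultdict(float)
    for rec in records:
        source_totals[rec["segment"]] += rec["paid"]
    return {
        seg: (source_totals[seg], tool_totals.get(seg, 0.0))
        for seg in set(source_totals) | set(tool_totals)
        if abs(source_totals[seg] - tool_totals.get(seg, 0.0)) > tol
    }

mismatches = reconcile(source, {"auto": 350.0, "property": 400.0})
# mismatches == {}: the analysis data ties back to the source extract
```

Any nonempty result becomes an exception report for the data team to resolve before the analysis proceeds.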
Adjustments to the data for the purposes of analysis will always be required, from simple segmentation or pivoting of the data set to more detailed manipulations such as claim limiting, excluding specific claims, or aggregating based on specific variables. A proper reserving tool will be capable of performing these adjustments, as well as providing logging and messaging regarding when manual steps have been inserted into the process.
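In outline, such logged adjustments might look like the sketch below; the claim IDs, per-claim limit, and field names are hypothetical:

```python
# Sketch of routine, logged data adjustments: cap each claim at a per-claim
# limit, exclude specific claims, and aggregate the result by accident year.
# The claim IDs, limit, and field names are hypothetical.

adjustment_log = []  # records every manual intervention for later review

def adjust_claims(claims, limit, excluded_ids):
    """Return incurred totals by accident year after limiting and exclusions."""
    by_year = {}
    for claim in claims:
        if claim["id"] in excluded_ids:
            adjustment_log.append(f"excluded claim {claim['id']}")
            continue
        amount = min(claim["incurred"], limit)
        if amount < claim["incurred"]:
            adjustment_log.append(f"capped claim {claim['id']} at {limit}")
        year = claim["accident_year"]
        by_year[year] = by_year.get(year, 0.0) + amount
    return by_year

claims = [
    {"id": "C1", "accident_year": 2020, "incurred": 1_500_000.0},
    {"id": "C2", "accident_year": 2020, "incurred": 200_000.0},
    {"id": "C3", "accident_year": 2021, "incurred": 300_000.0},
]
by_year = adjust_claims(claims, limit=1_000_000.0, excluded_ids={"C3"})
# by_year == {2020: 1_200_000.0}; the log shows one cap and one exclusion
```

The point of the log is that every departure from the raw data is visible to reviewers, rather than buried in a spreadsheet formula.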
Modern actuarial tools should automatically provide all these controls without getting in the way of the timely access to the data that reserving actuaries need to meet their deadlines.
Reserving analysis controls
Within the analysis itself, certain controls will help make reserve reviews more efficient and better protected from human error. First, building on the data controls from above, data feeds into a reserving project should be locked, and analysts should not be able to modify the underlying data, intentionally or otherwise.
Second, a reserving process that effectively manages assumptions and selections makes rolling projects forward much more straightforward, and potentially automated. Detailed information on selection logic from prior analyses should help inform current selections, without forcing the analyst to continually look up the previous analysis.
Modern reserving software provides an analysis that is consistent and standardized, one that is well-controlled but which still offers sufficient flexibility for users to create and apply new methods if desired. It’s well documented that spreadsheet-based models have significant potential for human error (almost regardless of how well they are checked). Specialized reserving software virtually eliminates this spreadsheet risk.
Whenever changes are made, they must be documented and version-controlled. The reserving tool should automatically log each change with a timestamp and the user who made the change. These logs can help managers at multiple levels identify material changes to the analyses without having to rely on inconsistent journaling. In addition, a report that logs all professionals who reviewed and signed off on the reserves should be automatically generated and archived with the analysis files.
All reserving projects should be backed up frequently, if not continuously, to avoid the loss of critical information. This essential process should be a part of any automated solution, not reliant upon the memory of humans who also have numerous other time pressures. More robust systems will include the capability to roll back the entire reserve analysis to a specific point in time, as either a recovery mechanism or perhaps to account for a material restatement of input data.
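Point-in-time rollback can be sketched as a series of timestamped snapshots of the analysis state. The dates and state below are illustrative, and a production tool would persist snapshots durably rather than hold them in memory:

```python
import copy
from datetime import datetime, timezone

# Minimal in-memory sketch of point-in-time recovery: snapshot the full
# analysis state on each save, then roll back by restoring the most recent
# snapshot at or before a requested time. Dates and state are illustrative.

snapshots = []

def save_snapshot(state, taken_at):
    """Record a deep copy of the analysis state at the given time."""
    snapshots.append((taken_at, copy.deepcopy(state)))

def roll_back(as_of):
    """Restore the analysis state as it stood at the requested time."""
    candidates = [(t, s) for t, s in snapshots if t <= as_of]
    if not candidates:
        raise ValueError("no snapshot exists at or before the requested time")
    return copy.deepcopy(max(candidates, key=lambda pair: pair[0])[1])

save_snapshot({"ibnr": 100.0}, datetime(2022, 3, 31, tzinfo=timezone.utc))
save_snapshot({"ibnr": 120.0}, datetime(2022, 6, 30, tzinfo=timezone.utc))
restored = roll_back(datetime(2022, 5, 1, tzinfo=timezone.utc))
# restored == {"ibnr": 100.0}: the state as saved at 31 March
```

The deep copies matter: they keep later edits to the live analysis from silently altering earlier snapshots.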
The increased scrutiny placed on reserving managers effectively requires that the decisions made today are supportable far into the future. Management needs to be able to recreate the thought process that went into today’s reserve estimate decisions (from raw data to final report) perhaps two or more years down the road. Having a reserving tool that automatically archives projects when completed, together with detailed version control notes, makes this arduous-sounding task simply a collateral benefit of working in a competent reserving solution.
The bottom line
Unlike spreadsheet-based reserving, a modern governance system can consolidate all the individual controls involved in data protection, data reconciliation, and data management into a unified platform that manages these controls, in large part without human intervention.
For small companies where work processes may not be as messy as at their larger counterparts, strong governance may not seem like an immediate concern. However, it can rapidly escalate in priority if the company goes public, expands through acquisition, or even experiences reserving difficulties. Having robust controls in place can set the stage for easier transitions.
No matter what the size or structure of a carrier, technology can streamline the governance process, transforming what often seems like a tedious obligation with little intrinsic reward into an integrated component of the loss reserving process. Governance is no longer a matter of mere compliance with regulatory requirements. Executed efficiently, it can increase confidence in management’s decision-making, capital allocation, and financial stability.
This article is part of a longer series on rethinking the loss reserving process. Read the rest of the series here.
Governance at a glance
Many of the core requirements of the National Association of Insurance Commissioners (NAIC) Model Audit Rule can be automated or greatly streamlined under a modern governance system, reducing the possibility of error and improving efficiency in the loss reserving process. Below are some of the most salient requirements and the ways in which a modern, well-controlled governance process can manage them.
| Requirement | Re-engineered governance platform |
| --- | --- |
| Secure user access | Data is stored in a secure location so only individuals with permission can save or delete it, and user logins allow activity tracking. |
| Role-based user management | Access to data, exhibits, and analyses is governed by role-based permissions that let individuals interact with the data and project according to their specific responsibilities. |
| Data management | Data management utilities include features such as automated data input feeds, data assumption management, and well-controlled model output for financial reporting purposes. |
| Data reconciliation | Reconciliation is automated by utilities that provide easy comparison of data and totals. |
| Data protection | Data controls may include locked input data arrays and read-only files that lock projects at different points in the loss reserving process to prevent inadvertent errors. |
| Change and version control/journaling | Key journaling capabilities include systems that automatically log the date, time, and individual for each change, prevent users from disabling tracking features, and make the log easily searchable. |
| Backups | Applications are backed up automatically and with sufficient frequency to avoid loss of critical information. |
| Archiving | Final approved versions are archived in protected files. |
This article was originally published on April 25, 2022.