The loss reserving process, reimagined
Gaining access to the data actuaries need can be a huge hurdle in the loss reserving process.
Revolutionize your loss reserving processes with Milliman Arius.
Estimating loss reserves is a core function in assessing an insurer’s financial stability. Yet manual processes and outmoded technology typically hamper the process at nearly every juncture. Stalled by delays in accessing data, tripped up by cumbersome spreadsheets, and burdened with time-consuming reviews, current approaches often deliver “good enough” estimates at a time when “good enough” may soon be insufficient, if not risky.
Ever-narrowing deadlines and increasing reporting requirements from agencies around the world have placed mounting pressure on scarce actuarial resources and cut into the time that actuaries can spend studying and interpreting the data. The result is an endless cycle of catch-up and temporary fixes that have become ingrained in the reserving process and often prevent organic process improvement from taking place.
Moreover, these patchwork fixes may increase operational risk to the business by producing inconsistent or inaccurate analyses. At some point very soon this make-do approach will become untenable, and the long-outdated analysis process will need to be modernized.
Understanding loss and claim trends earlier
So as we look to that future solution, what should such a reengineered loss reserving process deliver?
For starters, a more efficient and reliable reserving solution provides quicker indications as to how losses and reserves are trending. The system should help identify adverse development early in the analysis process, even allowing users to automatically estimate the current reserves using previous assumptions to see how prior picks are performing. This would allow the team to get senior management up to speed as early in the analysis process as possible.
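The “prior picks” check described above can be sketched in a few lines. This is a hypothetical illustration, not the Arius implementation: it rolls each accident year’s prior cumulative losses forward using last evaluation’s selected age-to-age factors, compares the result to the newly observed diagonal, and flags material adverse development (the data, factor values, and 2% tolerance are all assumptions for the example).

```python
# Hypothetical actual-vs-expected diagnostic: roll prior selections forward
# one period and flag accident years developing worse than expected.
prior_cumulative = {        # accident year -> cumulative paid loss at prior eval
    2019: 9_500, 2020: 8_200, 2021: 6_000,
}
prior_selected_ldfs = {     # accident year -> age-to-age factor selected last time
    2019: 1.02, 2020: 1.10, 2021: 1.35,
}
actual_new_diagonal = {     # newly observed cumulative losses this evaluation
    2019: 9_750, 2020: 9_300, 2021: 7_900,
}

def actual_vs_expected(prior, ldfs, actual, tolerance=0.02):
    """Return per-year AvE results; 'adverse' means the actual exceeds the
    expected value by more than the tolerance (2% of expected by default)."""
    results = {}
    for year, cum in prior.items():
        expected = cum * ldfs[year]
        ave = actual[year] - expected
        results[year] = {
            "expected": round(expected, 2),
            "actual": actual[year],
            "ave": round(ave, 2),
            "adverse": ave > tolerance * expected,
        }
    return results

for year, r in sorted(actual_vs_expected(
        prior_cumulative, prior_selected_ldfs, actual_new_diagonal).items()):
    print(year, r)
```

Running a check like this at the start of the cycle, before any new selections are made, is what lets the team brief senior management on emerging adverse development as early as possible.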
At the analyst level, increased efficiency will come from removing obstacles to data access, automating repetitive tasks that bog down the process, increasing confidence in the reliability of analysis, rejuvenating a static reporting system, streamlining project management, strengthening governance, and employing a platform that meets the growing needs of the organization. Obviously, getting there will require some structural changes.
A power tool for senior actuaries
If purpose-built reserving tools add significant efficiency to the work of reserve analysts, they are power tools for senior actuaries responsible for project review. Specialized collections of exhibits and reports show reviewers exactly the information they’re looking for, providing direct access to the diagnostics they need to quickly understand what's going on in a specific analysis. Templates ensure that reviewers know exactly where to look for whatever they want to see, consistently from analysis to analysis, so they can spend their time understanding the business rather than hunting for specific calculations.
A central data repository
Under the current process, reserving actuaries often start from a time-deficit position because they rely on IT or Claims functions to provide fundamental elements like triangulated data. This challenge typically continues through much of the reserving process as modifications to data are required, and requests and re-requests for data wait in queues along with other requests for IT’s help from across the company.
These persistent delays point to the need for a central repository from which actuaries can directly access data at the granular level that meets their needs. This shift gives actuaries ownership of the data, reduces time delays, and enables actuaries to focus their expertise on the areas that can provide the most value.
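To make the “granular access” point concrete, here is a minimal sketch of what direct access to transaction-level claim data enables: the reserving team can aggregate raw payment records into a cumulative loss triangle on demand, rather than waiting in an IT queue for a pre-triangulated extract. The record layout and figures are hypothetical.

```python
# Hypothetical sketch: build a cumulative loss triangle directly from
# transaction-level claim data held in a central repository.
from collections import defaultdict

# (accident_year, transaction_year, paid_amount) records from a claims store
transactions = [
    (2021, 2021, 1_000), (2021, 2022, 600), (2021, 2023, 250),
    (2022, 2022, 1_200), (2022, 2023, 700),
    (2023, 2023, 1_500),
]

def build_cumulative_triangle(records):
    """Aggregate transactions into a cumulative triangle keyed by
    (accident year, development age in years)."""
    incremental = defaultdict(float)
    for acc_year, txn_year, amount in records:
        age = txn_year - acc_year + 1      # development age 1, 2, 3, ...
        incremental[(acc_year, age)] += amount
    triangle = {}
    for (acc_year, age) in sorted(incremental):   # ages in order per year
        prior = triangle.get((acc_year, age - 1), 0.0)
        triangle[(acc_year, age)] = prior + incremental[(acc_year, age)]
    return triangle

triangle = build_cumulative_triangle(transactions)
print(triangle[(2021, 3)])   # 1850.0 = 1000 + 600 + 250
```

Because the same query can be rerun at any grain (line of business, segment, coverage), actuaries own the data definition rather than inheriting whatever cut IT last produced.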
Automating low-value tasks
Today, far too many tasks remain manual, requiring actuaries to spend time adding diagonals to templates, relinking spreadsheets, adjusting methods, and then checking and rechecking the work. Instead, these tasks should be part of an automated process that systematically updates data and templates across all projects, from the creation of the data set and the generation of loss triangles, through initial selections, and even to reporting.
Eliminating as much of the perfunctory repetition as possible from the process frees up time for actuaries to drill down into loss triangles, examine potential outliers, and conduct ad hoc analyses, better aligning the reserving process with the demands of senior management and regulators.
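One of the repetitive tasks described above, refreshing development factors every time a new diagonal lands, can be scripted rather than relinked by hand. The sketch below uses volume-weighted chain-ladder age-to-age factors as one common default; the triangle values are hypothetical, and a production process would layer selection judgment on top of the systematic calculation.

```python
# Hypothetical sketch: recompute age-to-age development factors
# systematically after each data refresh, instead of relinking spreadsheets.
triangle = {  # (accident year, development age) -> cumulative paid losses
    (2021, 1): 1_000, (2021, 2): 1_600, (2021, 3): 1_850,
    (2022, 1): 1_200, (2022, 2): 1_900,
    (2023, 1): 1_500,
}

def volume_weighted_ldfs(tri):
    """Volume-weighted age-to-age factors: total losses at age a+1 divided by
    total losses at age a, using only accident years observed at both ages."""
    ages = sorted({age for (_, age) in tri})
    factors = {}
    for age in ages[:-1]:
        num = sum(v for (y, a), v in tri.items()
                  if a == age + 1 and (y, age) in tri)
        den = sum(v for (y, a), v in tri.items()
                  if a == age and (y, age + 1) in tri)
        if den:
            factors[age] = num / den
    return factors

print(volume_weighted_ldfs(triangle))
```

With the mechanical step automated across every project, the analyst’s time shifts to the part that actually needs judgment: reviewing outliers in the factor history and deciding where the systematic selection should be overridden.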
Much of the manual intervention that goes into quality control needs to be replaced with a technology framework that instills reliability in reserve analysis by guaranteeing the integrity of all methods, exhibits, and diagnostics, and the accuracy of all calculations. This is no longer a place for a spreadsheet-based process.
The need for dynamic interactive reporting
The static reporting that characterizes so much communication of loss estimates and other metrics needs to give way to a more dynamic, interactive approach: reporting where graphics and dashboards tell a loss reserving story that engages the company’s business leaders. At the same time, it should have the capacity to involve other internal stakeholders in a way that encourages them to look beyond top-level results and investigate the factors that drive those results.
A process reengineered with an eye toward better governance can use role-based permissions to grant access to only the parts of the analysis in which an individual is directly involved, while systematic logging and journaling of activities automatically addresses today’s sophisticated regulatory requirements. Such a solution should work in the background, seamlessly replacing the cumbersome manual logs and limited controls that too often dominate governance in many of today’s reserving teams.
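The pairing of role-based permissions with background journaling can be sketched as follows. The roles, actions, and record layout are hypothetical; the point is that every attempt, whether allowed or denied, is written to an append-only journal without the user doing anything.

```python
# Hypothetical sketch: role-based access control with automatic journaling.
from datetime import datetime, timezone

PERMISSIONS = {            # role -> actions that role may perform
    "analyst":  {"view_assigned", "edit_assigned"},
    "reviewer": {"view_assigned", "approve"},
    "auditor":  {"view_assigned", "read_journal"},
}
journal = []               # append-only activity log

def perform(user, role, action, project):
    """Allow the action only if the role permits it; journal every attempt."""
    allowed = action in PERMISSIONS.get(role, set())
    journal.append({
        "ts": datetime.now(timezone.utc).isoformat(),
        "user": user, "role": role, "action": action,
        "project": project, "allowed": allowed,
    })
    if not allowed:
        raise PermissionError(f"{role} may not {action} on {project}")
    return f"{action} on {project} by {user}"

perform("ana", "analyst", "edit_assigned", "GL-2023")
try:
    perform("ana", "analyst", "approve", "GL-2023")   # denied, but journaled
except PermissionError:
    pass
print(len(journal))   # 2 entries: one allowed, one denied
```

Because denied attempts are captured alongside permitted ones, the journal itself becomes the audit trail that today’s manual logs try, and often fail, to approximate.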
And reliance on scarce IT resources forced to balance multiple priorities can and should give way to a platform whose scalability and elasticity respond more cost-effectively, and in many ways automatically, to the fluctuating demands of the reserving process.
Implementing the enhancements described above appears to be a tall order, though perhaps not as tall as it looks. It will, however, require that actuaries seriously reconsider everything in their current workflow, starting with the right cloud platform on which to build a new vision of the reserving process. We will consider the individual components of the analysis process in more detail as part of a series of upcoming blogs.
In many ways, reserving departments have come to a crossroads. They can continue to labor under a patchworked process that is short on analysis and whose estimates just slip in under deadlines. Or they can adopt an approach that can reduce working hours, drive efficiency and reliability, and deliver insightful results. The choice will set the course for the department’s ability to understand risk, manage growing regulatory requirements, and fulfill internal stakeholders’ needs. It is likely to determine whether the reserving team is viewed as a forward-looking participant in assessing and helping to ensure an insurer’s financial stability or as a reactive cost center.
This article is part of a longer series on reimagining the loss reserving process. Read the next article here, or view the rest of the series here.