By Claire Wilkinson, Vice-President - Global Issues
Catastrophe modeling is a risk management tool that uses computer technology to help insurers and reinsurers, as well as businesses and government agencies, better assess the potential losses caused by natural and man-made catastrophes.
The modeling process evolved in the late 1980s as companies became increasingly aware of their exposure to catastrophic risks. After Hurricane Andrew in 1992 and the Northridge earthquake in 1994, the use of catastrophe models took off as companies sought to more accurately analyze, write and price for natural catastrophe risk.(1) The advancement of catastrophe modeling has also been made possible by dramatic gains in computing power.
In the wake of the record hurricane seasons of 2004 and 2005 and amid predictions of increased storm activity over the next 15 to 20 years, catastrophe models are coming under increasing scrutiny by insurance regulators. Some consumer advocates have also suggested that greater transparency is needed to understand what goes into the models.(2)
While many companies make use of catastrophe models today, as with any financial or meteorological model, there are no guarantees. The hurricane seasons of 2004 and 2005 and the record dollar value and number of claims paid to policyholders by insurers underscore the point that catastrophe models are not an absolute predictor of risk.
Rather, catastrophe modeling is one of many tools in the risk management toolbox available to insurers and reinsurers as they look to predict future losses and better manage and prepare for disasters in the years to come.
Natural catastrophe models combine historical disaster information with current demographic, building (age, type and usage), scientific and financial data to determine the potential cost of catastrophes for a specified geographic area. The models use these vast databases of information to simulate the physical characteristics of thousands of potential catastrophes and project their effects on both residential and commercial property.
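The simulation step described above can be sketched in a few lines of Python. Everything here is hypothetical and for illustration only: the portfolio values, the wind-speed distribution and the toy vulnerability curve stand in for the proprietary science a real model contains.

```python
import random

random.seed(42)  # reproducible illustration

# Inventory: insured value of each property (hypothetical, in dollars)
portfolio = [250_000, 400_000, 1_200_000]

def damage_ratio(wind_speed_mph: float) -> float:
    """Toy vulnerability curve: fraction of insured value destroyed."""
    if wind_speed_mph < 74:  # below hurricane strength
        return 0.0
    return min(1.0, ((wind_speed_mph - 74) / 100) ** 2)

def simulate_event_losses(n_events: int) -> list[float]:
    """Draw n_events storms and return the total portfolio loss for each."""
    losses = []
    for _ in range(n_events):
        wind = random.gauss(mu=95, sigma=25)  # hypothetical wind-speed draw
        ratio = damage_ratio(wind)
        losses.append(sum(value * ratio for value in portfolio))
    return losses

event_losses = simulate_event_losses(10_000)
print(f"average event loss: ${sum(event_losses) / len(event_losses):,.0f}")
```

A real model replaces the single wind-speed draw with a full physical simulation of each storm's track and intensity, and replaces the toy damage curve with engineering-based vulnerability functions per construction type.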
Natural catastrophe models have been developed for a wide range of catastrophic risks and geographic territories worldwide, including specific industry types. All major natural hazards are modeled, including hurricanes, earthquakes, winter storms, tornadoes, hailstorms and floods. A number of catastrophe modeling firms have also developed man-made models that help quantify the potential financial impact from emerging risks such as terrorism.
Today, the three main proprietary catastrophe modeling firms are: AIR Worldwide, Risk Management Solutions (RMS) and EQECAT (also known as EQE). Insurers, reinsurers, rating agencies, risk managers and major insurance brokers license models from these firms. Some also develop their own models. In 2007, Karen Clark, widely regarded as a pioneer of catastrophe modeling, launched a new firm, Karen Clark & Co., to help companies better use catastrophe models.
It is important to recognize that many insurers and reinsurers engage multiple catastrophe models when assessing their exposures. Because different models may produce different outcomes for the same portfolio, using more than one enables companies to better understand their books of business and make more targeted underwriting decisions.
The process of developing sophisticated catastrophe models is complex and draws on expertise from a broad range of technical and financial disciplines. The models utilize the skills of many experts, including meteorologists, seismologists, geologists, engineers, mathematicians, actuaries, decision scientists and statisticians.
Over the years, catastrophe models have been constantly updated and fine-tuned to incorporate the latest technologies, data and research findings. For example, some of the latest models increasingly incorporate scientific data, such as information on developing weather systems.
Catastrophe models help insurers, reinsurers, risk managers and other industry participants develop more accurate risk management and pricing strategies. In this way, catastrophe models help ensure that individual companies are resilient enough to withstand a major disaster affecting their insured properties.
Insurers use catastrophe modeling as a tool for both underwriting and pricing.(3) Models are used to assess the risk in a portfolio of exposures. This helps guide an insurer’s underwriting strategy, and can help the insurer decide how much reinsurance to purchase. In addition, some insurance regulators allow insurers to use catastrophe modeling in their rate filings to help determine how to price the insurance product.
Reinsurers and reinsurance brokers also use catastrophe modeling to help them price and structure reinsurance contracts, while participants in the capital markets, such as catastrophe bond investors and investment banks, use these models in the pricing and structuring of catastrophe bonds.
Model outputs are an increasingly important component of the catastrophe analysis conducted by rating agencies as they assess the financial strength of insurance companies.
Catastrophe models identify and quantify the likelihood of occurrence of specific natural disasters in a region and estimate the extent of incurred losses. There are four key components of catastrophe models: hazard, inventory, vulnerability and loss.(4)
The hazard component of a catastrophe model characterizes the peril itself, whether that is an earthquake, hurricane, tornado or flood. For a hurricane, for example, the hazard is characterized by projected path and wind speed, as well as landfall locations and track angles at landfall.
The inventory portion of the model characterizes the portfolio of properties at risk as accurately as possible. Geographic coordinates such as latitude and longitude are assigned to a property based on its street address, ZIP code, or another location descriptor. Then the model calculates the number of individual structures in the insurer’s portfolio (location of the insured properties) that are at risk from hurricanes of different projected paths and wind speeds. Other factors that characterize an individual property are the construction and occupancy types, building height and age.
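As a rough sketch, an inventory record of this kind might be represented as follows; the field names and values are hypothetical, not drawn from any particular commercial model.

```python
from dataclasses import dataclass

@dataclass
class PropertyRecord:
    """One insured property in a model's inventory (illustrative fields)."""
    latitude: float        # geocoded from street address or ZIP code
    longitude: float
    insured_value: float   # replacement value in dollars
    construction: str      # e.g. "wood frame", "masonry"
    occupancy: str         # e.g. "residential", "commercial"
    stories: int           # building height
    year_built: int        # building age

# A hypothetical single-story masonry home in Miami
home = PropertyRecord(25.77, -80.19, 350_000.0, "masonry", "residential", 1, 1995)
```

With records like these, the model can count how many structures in a portfolio lie in the path of each simulated event and adjust damage estimates for construction type, height and age.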
The first two components capture the physical impact of the hurricane, earthquake or other hazard on the properties at risk; the vulnerability component quantifies how susceptible each structure is to damage from that hazard. Based on this measure of vulnerability, the loss component then evaluates the dollar value of loss to the property inventory of an individual insurer. In addition to direct physical losses, catastrophe models include indirect losses such as the impact of business interruption and allocated loss adjustment expenses (ALAE).
Models also incorporate policy and financial data from insurers and reinsurers, such as coverage value, deductibles or attachment points (the point at which reinsurance begins to apply), and limits, to create a profile of the probability of loss from different event scenarios exceeding certain levels. Companies can evaluate the correlation of expected losses from a single event or combination of events affecting more than one territory.
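The application of deductibles and limits, and the resulting exceedance probabilities, can be sketched as follows. All figures are hypothetical, and the per-event loss numbers stand in for the output of a model's hazard and vulnerability components.

```python
def insured_loss(ground_up: float, deductible: float, limit: float) -> float:
    """Loss retained by the insurer after the deductible, capped at the limit."""
    return min(max(ground_up - deductible, 0.0), limit)

def exceedance_probability(losses: list[float], threshold: float) -> float:
    """Fraction of simulated events whose loss exceeds the threshold."""
    return sum(1 for loss in losses if loss > threshold) / len(losses)

# Hypothetical ground-up losses from four simulated events (dollars)
ground_up_losses = [0.0, 40_000.0, 150_000.0, 600_000.0]
net = [insured_loss(g, deductible=25_000, limit=250_000) for g in ground_up_losses]
# net == [0.0, 15_000.0, 125_000.0, 250_000.0]

print(exceedance_probability(net, 100_000))  # 2 of 4 events exceed $100,000
```

Tabulating these exceedance probabilities across many thresholds yields the loss-exceedance profile the passage above describes; a reinsurance attachment point plays the same role as the deductible in this sketch.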
With these model outputs, underwriters are better equipped to analyze and evaluate the overall level of capital required to pay potential claims and to allocate capital between the individual risks within that portfolio.
It is important to recognize that actuarial standards exist for the appropriate application of catastrophe models. Actuaries are guided in their use of catastrophe models by Actuarial Standard of Practice No. 38, Using Models Outside the Actuary's Area of Expertise, which details the review actuaries must perform before using such tools in their work product.(5)
Following the unprecedented frequency and severity of storms during the 2004 and 2005 hurricane seasons, catastrophe models have faced some criticism within the industry for underestimating losses. Additionally, many insurance regulators remain reluctant to permit their widespread use, often on the grounds that they are "black boxes."
However, it is important to recognize that there is no one-size fits all approach when it comes to catastrophe modeling. There are a number of different methodological approaches to catastrophe modeling, each using somewhat different assumptions, data inputs and computational algorithms.
Models are used not just by the insurance community, but also by weather forecasters and government agencies. Because competing models rest on different assumptions and methods, their outputs, while generally consistent, will vary, and estimated losses will deviate somewhat from actual observed losses. Additional uncertainty arises from the inherent randomness of complex meteorological and geological processes.
As with any financial model, there are no guarantees. Individual catastrophe models are only as good as the quality and integrity of the data they incorporate at a given point in time, and the way their outputs are then interpreted and used by individual companies and underwriters.
Just as insurers and reinsurers are constantly reassessing their risks, so catastrophe modeling companies, along with other stakeholders in the business, are continually reviewing their models to reflect changing loss experience and any increase in projected losses.
In the wake of Hurricane Katrina, certain catastrophe modelers have developed new hurricane models that incorporate near-term projections of loss. These near-term models reflect a five-year outlook, providing probable maximum loss estimates based on projections of expected hurricane activity over that period.
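A probable maximum loss (PML) estimate of this kind can be illustrated with a simple empirical calculation over simulated annual losses. The figures and the ranking convention below are hypothetical simplifications of what a real model produces: the 1-in-100-year PML, for instance, is the annual loss exceeded with a 1 percent probability.

```python
import math

def pml(annual_losses: list[float], return_period_years: float) -> float:
    """Empirical PML: the loss reached or exceeded in 1/return_period of simulated years."""
    ranked = sorted(annual_losses, reverse=True)
    # Index of the loss exceeded with probability 1/return_period
    k = max(0, math.ceil(len(ranked) / return_period_years) - 1)
    return ranked[k]

# Ten hypothetical simulated annual losses ($ millions)
annual = [0, 0, 1, 2, 2, 5, 8, 12, 30, 90]
print(pml(annual, return_period_years=5))   # the 1-in-5-year loss -> 30
print(pml(annual, return_period_years=10))  # the worst simulated year -> 90
```

A near-term model changes the inputs to this calculation, not the calculation itself: it simulates annual losses under elevated hurricane activity for the coming five years rather than the long-term historical average.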
The introduction of these new hurricane models has prompted much discussion between insurance regulators and the industry. A criticism of the new models is that they have facilitated insurance rate increases.(6) However, catastrophe modelers maintain that the new models simply give insurers another option as they look to manage their future catastrophe exposures.
Insurers cannot arbitrarily raise rates based on catastrophe model outputs. By law, the rates charged by insurers may not be excessive, inadequate or unfairly discriminatory. When included as part of a rating plan by insurers, catastrophe models are subject to approval by state insurance regulators.
Different states take different approaches when it comes to the use of catastrophe models by insurers in the rate-making process.
For example, in 1995, the Florida legislature created the Florida Commission on Hurricane Loss Projection Methodology to “consider any actuarial methods, principles, standards, models, or output ranges that have the potential for improving the accuracy of or reliability of the hurricane loss projections used in residential property insurance rate filings.”(7)
Under the Commission's review process, modelers submit documentation on their models based on the requirements of the Commission's annual Report of Activities. These requirements include compliance with various standards, disclosures regarding aspects of how the model operates, forms for data and results reporting, and an on-site audit by a group of experts. Based on this review, the Commission decides to accept, accept subject to modifications, or reject a particular model, model specifications or output ranges.
While the Commission’s findings are not binding on the Florida Department of Insurance, they are admissible and relevant when a rate filing is being considered by the Department or in any arbitration, administrative, or judicial proceeding.
In Louisiana, insurers are permitted to use catastrophe computer modeling in formulating rates.(8) There are no specific laws regarding use or approval of the models. However, for insurers that use the models the Department of Insurance (DOI) does require the modeling company to complete a form/questionnaire and file the model with the DOI. Insurers are also required to file a form/questionnaire identifying the model and its direct effect in the calculation of the insurer’s rates.
To help regulators evaluate the use of the models in the rate-making process the Catastrophe Insurance Working Group of the National Association of Insurance Commissioners (NAIC) published the Catastrophe Computer Modeling Handbook in January 2001.(9)
The purpose of the handbook is to explore catastrophe computer models and to discuss issues that have arisen or can be expected to arise from their use. The handbook is a tool that regulators can use in evaluating the appropriateness of the use of catastrophe models in establishing rates. The guidance offered in the handbook is advisory only and is not intended to prescribe mandatory regulatory procedures.
Background information on catastrophe models from the perspective of insurers, modelers, consumers and regulators is provided in the handbook. It also includes a general overview of catastrophe models, a discussion of model input and output and a section on evaluating the models.
The handbook also suggests areas and concepts regulators should consider and explore to become informed about catastrophe models.
The challenge of managing catastrophic risk – both natural and man-made disasters – is one that confronts insurers, reinsurers and risk managers on a daily basis. Rating agencies consider catastrophic loss to be the number one threat to the financial strength and credit quality of the property/casualty industry.
The widely reported forecasts of more frequent and severe hurricanes in the Atlantic Ocean, Caribbean Sea and Gulf of Mexico over the next decade or more are also increasing the importance that insurers, reinsurers, corporations and government entities place on sophisticated risk management tools such as catastrophe models. At the same time, demographic shifts such as growing coastal populations and rising coastal property values are having a profound impact on insurers' ability to manage and absorb risk.
According to the National Oceanic and Atmospheric Administration (NOAA), in 2003, 53 percent of the nation's population – 153 million people – lived in coastal counties (including those that abut the Great Lakes).(10) Between 1980 and 2003 the population of coastal counties grew by 33 million people, or 28 percent. Another study puts the value of insured coastal property in hurricane-prone states – states bordering on the Atlantic Ocean and Gulf of Mexico – at $6.86 trillion in 2004.(11)
Amid this changing risk landscape, catastrophe models combine scientific and engineering principles with technical and financial expertise to help a wide range of industry participants and other agencies better understand and manage their catastrophic exposures.
While they are just one tool in the risk management toolbox, in the course of the last 20 years catastrophe models have become an integral part of the catastrophe risk management process in both the private and public sectors.
(1) New York Times Magazine, In Nature’s Casino, by Michael Lewis, August 26, 2007; http://www.nytimes.com/2007/08/26/magazine/26neworleans-t.html
(2) The Orlando Sentinel, State panel to review insurers’ hurricane-risk models, by Anika Myers Palm, January 28, 2008.
(3) American Insurance Association, Testimony for the National Association of Insurance Commissioners (NAIC) 9/28/2007 Public Hearing on Catastrophe Modeling.
(4) Managing Large-Scale Risks in a New Era of Catastrophes, A Wharton Risk Center Extreme Events Project, in conjunction with Georgia State University and the Insurance Information Institute, Report on Phase 1 of the Study, February 2007.
(5) Catastrophe Exposures and Insurance Industry Catastrophe Management Practices, American Academy of Actuaries Catastrophe Management Working Group, June 10, 2001.
(6) NAIC Sets Hearings To Examine Cat Models, Steve Tuckey, National Underwriter Online News Service, June 5, 2007.
(7) Florida Statute s. 627.0628., http://www.flsenate.gov/Statutes/
(8) The Louisiana Property and Casualty Insurance Commission, A Summary of Legislative Considerations Annual Report 2006-2007, Louisiana Department of Insurance.
(9) NAIC, Catastrophe Computer Modeling Handbook, January 2001, http://www.naic.org
(10) National Oceanic and Atmospheric Administration (NOAA), Population Trends Along the Coastal United States: 1980-2003.
(11) AIR Worldwide, The Coastline at Risk, September 2005.