A-Z OF… CATASTROPHE IMPACT
Derek Thrumble takes an alphabetised look at nat cat analysis for corporate insurance buyers…
In the past, access to modelling software that evaluates exposures to major natural catastrophes or terrorism attacks was primarily limited to large re/insurers. Their sizeable portfolios of risk provide premium volumes that are sufficient to support the associated startup costs, ongoing annual licence fees and data subscriptions.
More recently, large corporate insurance buyers have increased their focus on exposure to floods, windstorms, earthquakes, and even wildfires. It has become the norm for brokers and consultants to deploy risk modelling tools that allow them to present their clients’ risk to underwriters in a professional way. These tools:
- ‘Cleanse’ schedules of values to bring data submissions into a consistent format, clearly showing values at risk, and risk/property types;
- Geocode data to identify clearly the precise risk locations by latitude and longitude;
- Display schedules graphically to illustrate any ‘hotspots’ or accumulations of values;
- Show the proximity to areas of high hazard;
- Overlay historical natural catastrophe events or risk-zoning factors (such as flood zones).
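The geocoding and proximity checks described above can be sketched in a few lines of code. The example below is a minimal illustration, not a representation of any particular broker's toolset: it computes great-circle distances from geocoded insured locations to known high-hazard points and flags any location within a chosen radius. All names, coordinates and the 50km threshold are hypothetical.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two latitude/longitude points, in kilometres
    earth_radius_km = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * earth_radius_km * math.asin(math.sqrt(a))

def flag_hotspots(locations, hazard_points, radius_km=50.0):
    # Flag each insured location that lies within radius_km of any hazard point
    flagged = []
    for name, lat, lon in locations:
        near = any(haversine_km(lat, lon, hlat, hlon) <= radius_km
                   for hlat, hlon in hazard_points)
        flagged.append((name, near))
    return flagged

# Hypothetical geocoded schedule of values and one hazard point
schedule = [("Warehouse A", 51.5, -0.1), ("Depot B", 40.7, -74.0)]
flood_zones = [(51.6, -0.2)]
print(flag_hotspots(schedule, flood_zones))
```

In practice, commercial tools would run this kind of check against full hazard-zone polygons and overlay the results on a map, but the underlying proximity logic is as simple as shown here.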
Specialist software tools are now available that produce this level of analysis cost-effectively and in ‘real time’ as part of the insurance renewal process. Many sources of historical event data can be obtained to support the analysis; some are free of charge, while others are available at relatively low cost. For larger clients, applying this data to a geocoded schedule of values, combined with an assumption of the potential damage, makes it possible to develop a bespoke natural catastrophe risk rating.
This combines an assumption of the likely frequency of future natural catastrophe losses with a modelled severity impact, from which the likely recovery from insurance can be assessed. Only when this has been completed is a broker in a position to negotiate meaningfully with underwriters over premium rates and coverage conditions such as deductibles, sub-limits, annual aggregate limits and co-insurance clauses. We have carried out more than 50 such analyses during the past year for corporate buyers, using the EigenPrism software tool.
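The frequency-and-severity calculation and the insurance recovery assessment can be illustrated with a simple sketch. This is not the EigenPrism methodology, just the basic arithmetic: expected annual loss is the assumed event frequency multiplied by the modelled loss, and the insured recovery is the gross loss net of the deductible, capped at the sub-limit and scaled by the insurer's co-insurance share. All figures are hypothetical.

```python
def expected_annual_loss(annual_frequency, ground_up_loss):
    # Simple frequency x severity expected loss for a single peril
    return annual_frequency * ground_up_loss

def insured_recovery(gross_loss, deductible, sub_limit, coinsurance_share=1.0):
    # Recovery after the deductible, capped at the sub-limit,
    # scaled by the insurer's share under any co-insurance clause
    net_of_deductible = max(gross_loss - deductible, 0.0)
    return min(net_of_deductible, sub_limit) * coinsurance_share

# Hypothetical example: a 1-in-50-year flood causing a 10m gross loss,
# with a 1m deductible, a 5m flood sub-limit and an 80% insurer share
print(expected_annual_loss(0.02, 10_000_000))          # expected annual loss
print(insured_recovery(10_000_000, 1_000_000, 5_000_000, 0.8))  # recovery
```

Even this toy calculation shows why deductibles, sub-limits and co-insurance matter: in the example, only 4m of a 10m gross loss would be recovered, which is exactly the kind of gap a bespoke risk rating is designed to expose before renewal negotiations begin.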
Analysis with such models also helps brokers to understand the underwriting parameters being applied, for example when a specific risk ‘doesn’t model well’. This may simply be a function of the locations of the insured assets, the type or year of construction, or the default ‘damage function’ being applied through the underwriter’s catastrophe modelling software.
Such analysis additionally enhances risk managers’ ability to understand the threat from events in real time. For some clients, we have set up a series of alerts that will indicate potential damage and allow them to put emergency plans into place, as well as advise senior management of response actions.
It is important for insurers and brokers to continue to streamline the way in which data is transferred, to avoid duplication in the underwriting process. Agreeing standard data templates improves accuracy, reduces the time required to quote and lowers costs. All these factors are attracting much closer attention and scrutiny across the London insurance market.
While ‘big data’ has not yet had a significant impact on corporate insurance buyers, we believe advances in modelling and the availability of data will drive some significant changes during the next few years. Buyers who embrace these new tools will be able to enhance and optimise their insurance and risk transfer programmes, leading to valuable risk insights and tangible cost savings.
KNOWING YOUR A-Z
- A analysis
- B big data
- C corporate buyers
- D deductibles
- E evaluate
- F floods
- G geo-code
- H hotspots
- I impact
- J maJor
- K bespoKe
- L latitude
- M modelling
- N norm
- O obtained
- P portfolios
- Q quakes
- R real time
- S software
- T tools
- U underwriters
- V volumes
- W wildfires
- X eXposures
- Y year
- Z zoning
Derek Thrumble is managing partner – risk consulting at Alesco