Drop down Class 3 bonds: a bandaid for TWIA

A lot of ink has been spilled on this blog about fixing TWIA in the long run. Having attended the hearing this past week in Austin and looking at my calendar, which shows 41 days until hurricane season, I am becoming less hopeful that a good long-run fix is in the works. Moreover, two of the leading bills (S.B. 18 and H.B. 2352) do nothing to address the desperate situation for 2013. I thus offer up the following as a minimalist bandaid for TWIA. It will not by any means solve TWIA’s problems. If, however, a solid solution can not be found, what I offer here may at least provide some assistance and, in my naive view, should be politically feasible. The Executive Summary: the legislature needs to repeal the provisions prohibiting the Class 3 bonds from dropping down and instead permit them to drop down in the event the Class 2 Alternative Bonds fail to sell, offering insurers a premium tax credit to the extent the drop-down Class 3 bonds increase their subsidization of tropical cyclone losses along the Texas Gulf Coast.


I start with some history to explain the current problem.

In 2011, the legislature recognized that the system of post-event bonds it had established in 2009 as the means of recapitalizing TWIA following a significant storm was extremely vulnerable to a cascade of failures. Lenders could refuse to purchase the Class 1 bonds on whose sale higher levels of bonds legally depended and thus leave TWIA with only the money it had in its Catastrophe Reserve Trust Fund to pay the claims of its policyholders. And lenders might very well refuse because repayment of the Class 1 bonds depended on TWIA policyholders remaining with TWIA even after it raised its premiums (perhaps 25%) to pay off the bonds. So, the legislature developed this complex scheme now codified in section 2210.6136 of the Insurance Code.

Unfortunately, the fix, which appears to have been developed deep into the legislative session, risks the same infirmity as the legislative provisions it attempted to supplement. Class 1 bonds remained theoretically available, but a contingency plan was developed: the Class 2 Alternative Bond (my name). This Class 2 Alternative Bond could be sold in the event that the $1 billion authorized in Class 1 bonds failed to sell in whole or in part. But, as with the Class 1 bonds, the Class 2 Alternative Bonds contained in the fix depend for their repayment in significant part on extracting large sums of money from a TWIA pool of insureds (a) after a significant hurricane has struck and (b) who can and may leave the pool if insurance premiums get too high. And while coastal residents and insurers share partial responsibility for the repayment and thus reduce the size of the TWIA premium increase, it is unclear whether that contribution will be enough to persuade lenders that TWIA policyholders will remain in the pool and pay enough to amortize the bonds. Moreover, the legislation provided that Class 3 bonds, which provide an additional $500 million of borrowing capacity to pay for windstorm damages, can not be sold — repeat, can not be sold — unless every dime of borrowing capacity under the Class 2 Alternative Bonds is exhausted.

The current situation

The result of all this is a potential catastrophe. If, as many observers, including the Texas Insurance Commissioner, expect, the Class 1 bonds fail in whole or in part because the market won’t accept them, the Class 2 Alternative Bonds may fail too. Why? Because their repayment source is infected — not as badly, but still infected — with the same problem as the Class 1 bonds. And if the Class 2 bonds fail even a little bit, the Class 3 bonds fail. And if the Class 3 bonds fail, there may well not be any reinsurance protection either. This is so because, unless TWIA is careful and purchases reinsurance — at a higher price — that drops down in the event the Class 2 and/or Class 3 bonds don’t sell, the reinsurer isn’t obligated to pay a dime. The $100 million of policyholder money dumped into reinsurance will have been 100% wasted. (I sure hope TWIA’s lawyers and reinsurance brokers understand this last point.) And so, TWIA will have only the $180 million or so in its Ike-depleted, failure-to-properly-assess-depleted Catastrophe Reserve Trust Fund to pay claims. As my friend David Crump has pointed out, it may not even take a named tropical storm to generate damages of that magnitude to the $72 billion TWIA pool.

We thus end up with a short run problem in addition to a long run problem with TWIA. The long run problem is that the system of post-event bonds on top of a thin Catastrophe Reserve Trust Fund is extremely unstable and potentially depends on massive subsidization by people other than policyholders to prop it all up. That is a hard problem to fix. Perhaps, as has been suggested here, an assigned risk plan would be a better alternative. Perhaps, as others believe, the funding structure can be made more stable with yet greater subsidization. Those are hard and politically contentious issues. I am not certain they will be ironed out this legislative session before hurricane season begins in 41 days. And, sorry to say, it is a bit irksome to have to bail TWIA out yet again when doing so also rescues from humiliation the legislators who have shortsightedly engineered a system that beautifully served the short run interests of their constituents by underfunding their insurer but that has predictably betrayed those same constituents’ long run interests. Still, one can not help feeling a bit sorry for those on the coast who may have been fooled, perhaps eagerly so, by these false heroes.

The bandaid

What to do? Triple the minimum amount available for this summer. How?

1. Permit the Class 3 bonds to drop down. Repeal section 2210.6136(c), which currently prohibits the issuance of Class 3 bonds until all the Class 2 or Class 2 Alternative Bonds are sold. Instead, permit the Texas Insurance Commissioner to authorize sale of Class 3 bonds notwithstanding the failure of all Class 2 or Class 2 Alternative Bonds to sell if, in the opinion of the Commissioner, the failure to do so would reduce the amount available to pay claims of TWIA policyholders.

2. To the extent that Class 3 bonds drop down, make the assessments that are required to repay them simply a no-interest loan from insurers to the state rather than an outright payment. This can be, and has been, done by providing a premium tax credit for the assessments. I dislike this philosophically because it is less transparent than simply taxing Texans and potentially reduces the amount available for government programs, but it is one way to raise money. To do this will require repeal of section 2210.6135(c) of the Insurance Code and perhaps some other statutory tinkering. The idea, however, is that to the extent an extra obligation has been imposed on the insurers of the state, it is one they should bear only as a vehicle for fronting money rather than in any ultimate sense. I believe sensible insurers should be willing to go along with this alteration. Moreover, as the state bears actual responsibility for up to $500 million, the costs of having the rest of the state subsidize TWIA will be more apparent to the electorate. It will thus be a great — albeit costly — learning opportunity.

Will this solve the TWIA problem for 2013? Absolutely not. This is a bandaid on a gaping wound. $680 million ($180 million in CRTF plus $500 million in dropped down Class 3) is not nearly enough to protect TWIA policyholders from even a minor tropical cyclone. Even $1.68 billion ($180 million in CRTF plus $500 million in Class 2 Alternative plus $500 million in dropped down Class 3 plus maybe $500 million in incredibly costly reinsurance) is not enough. At its current $72 billion girth, TWIA at a minimum needs a $5 billion stack. But if you don’t have the time, will or ability to do major surgery, a bandaid is better than watching the patient bleed dry in front of you. So, if the long run problem can not be solved before the start of hurricane season, or if the long run fix starts only in 2014, the extra money this bandaid creates for 2013 will be sorely needed when the wind and water start roiling in the Gulf.

Could the guaranty fund help TWIA policyholders?

We’ll know more in the coming days, but, if I am right in believing that the Texas Windstorm Insurance Association is about to be placed into receivership, there is a substantial chance that the claims of at least some TWIA policyholders and other TWIA creditors will not be paid in full. The Texas Department of Insurance admits as much in paragraph 10 of its recently issued FAQ. Now, ordinarily when a Texas property insurer goes insolvent and is placed in receivership, the Texas Property and Casualty Insurance Guaranty Association comes in and pays at least part of the unpaid portion of legitimate policyholder claims. So, the question is, could TPCIGA help out TWIA policyholders? And thereby hangs a complex statutory thriller.

We start with the statute creating and regulating TPCIGA, the exciting Chapter 462 of the Texas Insurance Code. Let’s take a look at what claims are protected by TPCIGA.  That’s found in section 462.201.  Here it is.

Sec. 462.201.  COVERED CLAIMS IN GENERAL. A claim is a covered claim if:

(1)  the claim is an unpaid claim;

(2)  the claim is made under an insurance policy to which this chapter applies that is:

(A)  issued by an insurer authorized to engage in business in this state; or

(B)  assumed by an insurer authorized to engage in business in this state that issues an assumption certificate to the insured;

(3)  the claim arises out of the policy and is within the coverage and applicable limits of the policy;

(4)  the insurer that issued the policy or assumed the policy under an assumption certificate issued to the insured is an impaired insurer; and

(5)  the claim:

(A)  is made by a liability claimant or insured who is a resident of this state at the time of the insured event; or

(B)  is a first-party claim for damage to property that is permanently located in this state.

Most of this statute should be easily satisfied by TWIA policyholders. The claim will relate to property permanently located in Texas. It will be unpaid (or they wouldn’t be complaining). It has to actually be covered by and within the limits of the policy. It’s not as if you get a better insurance policy from TPCIGA than you got from your impaired insurer. But there are two tricky bits: (1) the claim has to relate to an insurance policy to which this chapter (462) applies and (2) the insurer that issued the policy has to be an impaired insurer. Let’s take each in turn.

Leaf to section 462.004 of the statute.  It reads, in excerpted form,

Sec. 462.004.  GENERAL DEFINITIONS. In this chapter:

(5)  “Impaired insurer” means a member insurer that is:

(A)  placed in:

(i)  temporary or permanent receivership or liquidation under a court order, including a court order of another state, based on a finding of insolvency; or

(ii)  conservatorship after the commissioner determines that the insurer is insolvent; and

(B)  designated by the commissioner as an impaired insurer.

(6)  “Member insurer” means an insurer, including a stock insurance company, a mutual insurance company, a Lloyd’s plan, a reciprocal or interinsurance exchange, and a county mutual insurance company, that:

(A)  writes any kind of insurance to which this chapter applies under Sections 462.007 and 462.008, including reciprocal or interinsurance exchange contracts; and

(B)  holds a certificate of authority to engage in the business of insurance in this state.


So, in order to be an impaired insurer you have to be a “member insurer” (and be in receivership). But what’s a member insurer? If you were a stock insurance company, a mutual insurance company, a Lloyd’s plan or some other listed things, the matter would be easy. You’d be in. But TWIA isn’t one of those things. It’s just TWIA. But that’s not the only way to qualify. See the word “including.” That generally means that the items listed are not the exclusive way to qualify. If TWIA meets the conditions in parts (A) and (B) it should qualify as a “member insurer,” which, you will recall, is what we need before TWIA policyholders can seek protection from TPCIGA.

Now we are deep into the plot. Does TWIA write “any kind of insurance to which this chapter [462] applies under Sections 462.007 and 462.008”?

We’re going to skip section 462.008.  Trust me, it has absolutely no relevance.  So, let’s focus instead on section 462.007.  It reads:


(a) Except as provided by Subsection (b), this chapter applies to each kind of direct insurance.

(b)  Except as provided by Subchapter F, this chapter does not apply to:

(1)  life, annuity, health, or disability insurance;

(2)  mortgage guaranty, financial guaranty, or other kinds of insurance offering protection against investment risks;

[stuff that clearly does not apply omitted]

(8)  a transaction or combination of transactions between a person, including an affiliate of the person, and an insurer, including an affiliate of the insurer, that involves the transfer of investment or credit risk unaccompanied by the transfer of insurance risk, including transactions, except for workers’ compensation insurance, involving captive insurers, policies in which deductible or self-insured retention is substantially equal in amount to the limit of the liability under the policy, and transactions in which the insured retains a substantial portion of the risk; or

(9)  insurance provided by or guaranteed by government.

Assume for the moment that TWIA policies are “direct insurance.” If so, then Chapter 462 applies unless there’s an exception in subsection (b) of 462.007. It’s quite clear that exceptions (1)-(8) do not apply. TWIA is not selling ocean marine insurance. But there is exception (9). It says that the chapter does not apply to “insurance provided by or guaranteed by government.” Is TWIA provided by or guaranteed by government? Protestations of some notwithstanding, it is abundantly clear that TWIA policies are not (except conceivably for TPCIGA itself!) guaranteed by government. But might TWIA policies be insurance provided by government?!

And we have now reached the crucial moment in our mystery thriller.  Is TWIA insurance government provided insurance?  If it is, TWIA is not a member insurer and, as such, is not an impaired insurer, and, as such, is not the sort of insurer with respect to which TPCIGA offers policyholders any protection.

I would not laugh at someone who suggested that TWIA was not a government insurer. It is, to be sure, a government-chartered insurer. Unlike Allstate, State Farm and the rest of the gang, TWIA does business not by satisfying general incorporation and licensure statutes but as a result of a special act of the legislature. But the federal or state government is not itself acting as an insurer. Moreover, a court that recently confronted the issue of whether TWIA was entitled to sovereign immunity appears to have left the question open.

Unfortunately, this argument, though not frivolous, runs into several obstacles. First, I believe TWIA has treated itself, and TPCIGA has treated it, as if TWIA were a government insurer. That’s because being a member insurer creates many duties. Chief among those duties is to pay assessments when other insurers go insolvent and TPCIGA has to pay claims. (Check out section 462.151). Non-members don’t have to pay assessments. TPCIGA issued assessments to members in 2001, 2002, 2003 and 2006. (see here). My belief — and if TWIA in fact paid, that would be strong evidence that I am wrong — is that TWIA did not pay any of these assessments and was not asked to do so. Here, for example, is the TWIA financial statement for 2006/2007. I don’t see anywhere that it shows a TPCIGA assessment.

Second, I’m not aware of instances where Texas itself acts as an insurer other than on its own property. So if “insurance provided by government” meant only instances where the state itself is the insurer, the exception in the statute would have virtually no application. There are canons of statutory interpretation that say you try not to construe statutory provisions so that they have no purpose. Instead, I suspect, the term “insurance provided by government” means insurance provided by government-created entities such as TWIA, the Texas FAIR Plan and entities such as TPCIGA itself.

At the end of the day, then, if you asked me, I would say TPCIGA is unlikely to come to the rescue of TWIA policyholders. I would not say, however, that this is an open and shut case. I do suspect that the exclusive protection of TWIA policyholders is instead the funding mechanism set up in Chapter 2210 of the Insurance Code and whatever amendments may come thereto. Unfortunately, that’s not looking very good right now and, absent legislative rescue, is going to look abysmal in the event our Texas coast is socked with a significant storm this rapidly approaching hurricane season.

A love letter to Texas Legislature Online

Dear Texas Legislature Online,

Even though you are only a website, I love you so much. You are cute, simple, easy on the eyes and you have so much to say when I ask you for information. And you’re a cheap date; I don’t have to pay, except for some teeny portion of my tax dollars.  I can search you all over for bills or for statutes and you don’t mind a bit.  Plus, when I find what I want, you are so giving.  I can get a printout in PDF, Word or even plain text that I can edit, display or mash up to my heart’s content. I can’t imagine doing what I do without you. You’re the best thing for Texas democracy ever. I’ll confess that I’m not totally faithful. I look at many legal information websites around the country. And, to be honest, there are many that are good. But you’re my hometown sweetheart. I am looking forward to being with you for years to come.

Seth J. Chandler (catrisk)

Senator Carona calls for insurers to be more constructive on windstorm legislation

Far more important, frankly, than my testimony yesterday before the Texas Senate Business & Commerce Committee was the colloquy between influential members of that committee and representatives of the insurance industry, notably Beamon Floyd, director of the Texas Coalition for Affordable Insurance Solutions (big Texas insurers such as Allstate, State Farm, Farmers, USAA), and Jay Thompson of the Association of Fire and Casualty Companies of Texas. You can watch it all here from 1:49 to 2:00 and 2:22 to 2:25 on the video of the hearing. John Carona (R-Dallas), chair of the committee, castigated the insurance industry for acting in bad faith, dragging its heels and apparently stonewalling on the issue of TWIA reform. While such criticism might be expected from members along the coast or from those predisposed to criticize whatever the insurance industry does, this critique

State Senator John Carona


came directly from Senator Carona, a man who described himself as a friend of the insurance industry and, indirectly, from Governor Rick Perry, likewise seldom confused with an insurance basher.

The problem, basically, is that the insurance industry is resisting a bill that would likely compel it to shoulder more expense for risk along the Texas coast than it does now, even if it can pass many of those expenses on, but it has not been bold enough at this stage of the legislative process to come forward with support for specific solutions to the short and long term problems facing TWIA and its insureds. Nor, to my knowledge, has the industry to date presented facts showing the extent of the burden that would be created by the assigned risk plan embodied in SB 18. This silence places legislators such as Senator Carona in a difficult position. They do not wish to create crushing burdens on the insurance industry that will make insurance in Texas yet more expensive or difficult to obtain, particularly in their districts, but they are also not willing to create a situation in which a significant storm forces an insurer for which they bear responsibility to undergo a difficult forced recapitalization or, worse, leaves it unable to pay claims promptly and fully. My sense is that Senator Carona and perhaps others felt much the way I do when confronted with a student, even one who has done well in the past, who is long on generalized rhetoric but doesn’t show that they have actually done the needed homework.

Here’s what I bet Senator Carona and others would like to see. With respect to all of these numbers, it would be best if they came from certified actuaries using contemporary storm models and it would be helpful if the figures were provided in both absolute dollars and as a percentage of industry premium revenue.  Some of these numbers may well be difficult to develop, but if figures could be brought forth even on an order of magnitude basis, it might separate out real threats to the Texas insurance industry from reflexive rhetoric.

Numbers Relevant to SB 18

(a) Evidence as to the expected costs of the 2210.0561 potential for assessment; this figure might be either a measure of expected losses or an explanation of why this assessment responsibility needs to be reinsured along with the costs thereof.

(b) Evidence as to the costs of the 2210.0561 assessment to help TWIA buy up to $2 billion in reinsurance. My wild guess is that we are looking at $150 million per year in the immediate future but ramping down substantially as the take out in the assigned risk plan decreases the expected amount reinsurers would pay.

(c) Evidence as to what it will cost to set up and maintain a clearinghouse that will migrate coastal residents, and perhaps others, either into a private take-out policy or into the assigned risk pool.  Perhaps I am naive, but I believe the clearinghouse could be operated for less than $10 million per year.

(d) Evidence as to what the shortfall between “market rates” and transition premiums will cost insurers AFTER premium tax credits and recoupment are taken into account.

(e) At least an order of magnitude guess as to what it will cost, net of premiums, to write policies on the riskiest policies as to which SB 18 caps the premium at 25% higher than market. Such an estimate will require at least three figures: (1) an estimate of how many policies there will be in this category; (2) an estimate of actual expected losses among the purchasers; and, importantly, (3) an estimate of the incremental costs of capital that insurers need to stockpile in order to bear this correlated risk.

(f) An estimate of the cost of servicing TWIA policyholders even for windstorm claims pursuant to section 2210.5725 of the bill.

I also suspect Senator Carona and others in the legislature would like to see at least a bargaining position from the insurance industry on how much of these costs should be transferred either to TWIA policyholders or more directly to statewide insureds.

Numbers Relevant to An Alternative Plan

For any alternative plan submitted by the insurance industry, we ought to see numbers on the following:

(a) what are the rates that will be paid for risks currently covered by TWIA policies

(b) how will it address the 2013 hurricane season — the Carona bill is weak here

(c) how does it get the stack of protection up to an amount sufficient to cover at least a 1 in 100 years storm, preferably a 1 in 500 years storm

(d) who bears the financial burden of such a stack

So, I know this is a lot of work and there isn’t much time in which to do it. But my sense is that one outcome of yesterday’s hearing is going to be a greater sense of urgency on many sides, including from those who will try to scuttle the assigned risk alternative.

P.S. For those who would rather (or also) like to see my testimony, you can find it at 1:36 to 1:44 of the hearing.

Exciting stuff coming

I know it’s been two weeks since I’ve posted, but I’ve been cooking up a good model to help understand the likely effects of changing deductible and coinsurance requirements on catastrophe insurance policies.  I’ve made big progress and should have some genuinely interesting posts coming in the week ahead.  Catrisk fans hang in there!


Note from November 12, 2012.  You can now find the model here.  With yet more to come.

It’s (close to) a Weibull — again!

You recall that in my last post, I went through an involved process of showing how one could generate storm losses for individuals over years.  That process, which underlies a project to examine the effect of legal change on the sustainability of a catastrophe insurer, involved the copulas of beta distributions and a parameter mixture distribution in which the underlying distribution was also a beta distribution. It was not for the faint of heart.

One purpose of this effort was to generate a histogram that looks like the one below, showing the distribution of scaled claim sizes for non-negligible claims. This histogram was obtained by taking one draw from the copula distribution for each of the [latex]y[/latex] years in the simulation and using it to constrain the distribution of losses suffered by each of the [latex]n[/latex] policyholders in each of those [latex]y[/latex] years. Thus, although the underlying process created a [latex]y \times n[/latex] matrix, the histogram below is for a single “flattened” vector of those [latex]y \times n[/latex] values.

Histogram of individual scaled non-negligible claim sizes


But, if we stare at that histogram for a while, we recognize the possibility that it might be approximated by a simple statistical distribution. If that were the case, we could simply use the simple statistical distribution rather than the elaborate process for generating individual storm loss distributions. In other words, there might be a computational shortcut that could approximate the elaborate process. To get the experience of all [latex]n[/latex] policyholders — including those who did not have a claim at all — we could just upsample random variates drawn from our hypothesized simple distribution and add zeros; alternatively, we could create a mixture distribution in which most of the time one drew from a distribution that was always zero and, when there was a positive claim, one drew from this hypothesized simple distribution.
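To make the second idea concrete, here is a minimal Python sketch (NumPy and SciPy assumed) of such a zero-inflated approximation. The Weibull is just the candidate simple distribution hinted at by this post’s title; its shape and scale parameters here are illustrative placeholders, not fitted values, and the prevalence figure is borrowed from the copula post.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def zero_inflated_claims(n, prevalence, shape, scale):
    """Most draws are zero (no claim); with probability `prevalence`,
    a draw comes from a Weibull standing in for the elaborate
    copula-based process."""
    has_claim = rng.random(n) < prevalence
    sizes = stats.weibull_min.rvs(shape, scale=scale, size=n, random_state=rng)
    return np.where(has_claim, sizes, 0.0)

# Illustrative parameters only: roughly 2.44% of policyholders claim per year.
scaled_claims = zero_inflated_claims(250_000, prevalence=0.0244,
                                     shape=0.8, scale=0.1)
```

The point of the shortcut is speed: one vectorized draw per policyholder replaces the full copula-and-parameter-mixture machinery.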

Continue reading

Copulas and insurance law reform

Storm models are crucial to law reform. One needs them to get a sense of whether premiums are reasonable. And, as I want to show in a series of blog posts, they can also help figure out the effect of legally mandated changes to the insurance contract. To do that, you need to tie behavior at the level of the individual policyholder to the long term finances of the insurer. How would, for example, changing the required deductible on windstorm policies issued by the Texas Windstorm Insurance Association affect the precautions taken by policyholders to avoid storm damage? That’s important for many reasons, among them that it affects the sustainability of TWIA. Might the imposition of coinsurance into the insurance contract do a better job of making TWIA sustainable? These are the kinds of questions for which a decent storm model is useful.

So, over the past few weeks I’ve been thinking again about ways in which one could, without access (yet) to gigabytes of needed data, develop approximations of the windstorm damage events likely to be suffered by policyholders.  And I’ve been thinking about ways in which one could parameterize those individual damages as a function of the level of precautions taken by policyholders to avoid damage.

What I’m going to present here is a model of storm damage that attempts to strike a reasonable balance between simplicity and fidelity. I’m afraid there’s a good bit of math involved, but I’m going to do my best to clarify the underlying ideas and prevent your eyes from glazing over. If you’ll stick with me, the reward is that, at the end of the day, we’re going to have a model that in some ways is better than what the professionals use. It not only explains what is currently going on but can make predictions about the effect of legal change.

Let’s begin with two concepts: (1) “claim prevalence” and (2) “mean scaled claim size.” By “claim prevalence,” which I’m going to signify with the Greek letter [latex]\nu[/latex] (nu), I mean the likelihood that, in any given year, a policyholder will file a claim based on an insured event. Thus, if in a given year 10,000 of TWIA’s 250,000 policyholders file a storm damage claim, that year’s prevalence is 0.04. “Mean scaled claim size,” which I’m going to signify with the Greek letter [latex]\zeta[/latex] (zeta), is a little more complicated. It refers to the mean, over all properties on which claims are filed during a year, of the size of the claim divided by the value of the property insured. To take a simple example, if TWIA were to insure 10 houses and, in a particular year, 2 of them filed claims ([latex]\nu =0.2[/latex]) for $50,000 and for $280,000, and the insured values of the properties were $150,000 and $600,000 respectively, the mean scaled claim size [latex]\zeta[/latex] would be 0.4. That’s because: [latex]0.4=\frac{50000}{2\cdot 150000}+\frac{280000}{2\cdot 600000}[/latex].

Notice, by the way, that [latex]\zeta \times \nu[/latex] is equal to aggregate claims in a year as a fraction of total insured value.  Thus, if [latex]\zeta \times \nu = 0.005[/latex] and the total insured value is, say, $71 billion, one would expect $355 million in claims in a year. I’ll abbreviate this ratio of aggregate claims in a year to total insured value as [latex]\psi[/latex] (psi).  In this example, then,  [latex]\psi=0.005[/latex].[1]
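These definitions are easy to check numerically. Here is a minimal Python sketch using only the numbers given in the text (the variable names are mine):

```python
# Worked example from the text: TWIA insures 10 houses; 2 file claims.
claims = [50_000, 280_000]            # claim sizes in dollars
insured_values = [150_000, 600_000]   # insured value of each claiming property
policies = 10

# Claim prevalence (nu): fraction of policyholders filing a claim.
nu = len(claims) / policies

# Mean scaled claim size (zeta): mean of claim / insured value
# over the properties with claims.
zeta = sum(c / v for c, v in zip(claims, insured_values)) / len(claims)

# Aggregate claims as a fraction of total insured value (psi).
psi = nu * zeta

# With psi = 0.005 and $71 billion of total insured value,
# one would expect $355 million in claims.
expected_losses = 0.005 * 71e9

print(nu, zeta, psi, expected_losses)
```

Note that in the worked example [latex]\psi = 0.08[/latex], far above the long-run 0.005 used in the aggregate illustration; that is what a bad storm year looks like.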

The central idea underlying my model is that claim prevalence and mean scaled claim size are positively correlated. That’s because both are likely to correlate positively with the destructive power of the storms that occurred during that year. The correlation won’t be perfect. A tornado, for example, may cause very high mean scaled claim sizes (total destruction of the homes it hits) but have a narrow path and hit just a few insured properties. And a low grade tropical storm may cause modest levels of wind damage among a large number of insureds. Still, most of the time, I suspect, bigger storms not only cause more claims, but they also increase the mean scaled claim size.

A copula distribution provides a relatively simple way of blending correlated random variables together. There are lots of explanations: Wikipedia, a nice paper on the Social Science Research Network, and the Mathematica documentation on the function that creates copula distributions. There are lots of ways of doing this blending, each with a different name. I’m going to stick with a simple copula, however: the so-called “Binormal Copula” (a/k/a the “Gaussian Copula”) with a correlation coefficient of 0.5.[2]

To simulate the underlying distributions, I’m going to use a two-parameter beta distribution for both claim prevalence and mean scaled claim size. My experimentation suggests that, although there are probably many alternatives, both these distributions perform well in predicting the limited data available to me on these variables. They also benefit from modest analytic tractability. For people trying to recreate the math here, the distribution function of the beta distribution is [latex]I_x\left(\left(\frac{1}{\kappa ^2}-1\right) \mu ,\frac{\left(\kappa ^2-1\right) (\mu -1)}{\kappa ^2}\right)[/latex], where [latex]\mu[/latex] is the mean of the distribution and [latex]\kappa[/latex] is the fraction (0,1) of the maximum standard deviation of the distribution possible given the value of [latex]\mu[/latex]. What I have found works well is to set [latex]\mu _{\nu }=0.0244[/latex], [latex]\kappa _{\nu }=0.274[/latex] for the claim prevalence distribution and [latex]\mu _{\zeta }=0.097[/latex], [latex]\kappa _{\zeta }=0.229[/latex] for the mean scaled claim size distribution. This means that the average policyholder will file a claim about once every 41 years and that claims, when they occur, will on average equal 9.7% of the insured value of the claiming property.[3]
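For readers who would rather simulate than recreate the math, here is a rough Python sketch of the same construction (NumPy and SciPy assumed; the function names are mine). It converts the (μ, κ) parameterization above into the beta distribution’s conventional (α, β) parameters and then draws correlated (ν, ζ) pairs through a Gaussian copula with ρ = 0.5:

```python
import numpy as np
from scipy import stats

def beta_params(mu, kappa):
    """Convert (mean, fraction of maximum standard deviation) into the
    beta distribution's conventional (alpha, beta) parameters."""
    common = 1.0 / kappa**2 - 1.0
    return mu * common, (1.0 - mu) * common

def binormal_copula_sample(n_years, rho, nu_params, zeta_params, rng):
    """Draw correlated (claim prevalence, mean scaled claim size) pairs,
    one per simulated year, via a Gaussian (binormal) copula."""
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n_years)
    u = stats.norm.cdf(z)                         # correlated uniforms
    nu = stats.beta.ppf(u[:, 0], *nu_params)      # beta marginal: prevalence
    zeta = stats.beta.ppf(u[:, 1], *zeta_params)  # beta marginal: claim size
    return nu, zeta

rng = np.random.default_rng(42)
nu, zeta = binormal_copula_sample(
    10_000, rho=0.5,
    nu_params=beta_params(0.0244, 0.274),
    zeta_params=beta_params(0.097, 0.229),
    rng=rng)
```

A scatter plot of these 10,000 simulated (ν, ζ) pairs reproduces, up to sampling noise, the annotated cloud of points from the 10,000 year simulation discussed in this post.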

We can visualize this distribution in a couple of ways.  The first is to show a probability density function of the distribution but to scale the probability logarithmically.  This is shown below.

PDF of sample copula distribution


The second is to simulate 10,000 years’ worth of experience and to place a dot for each year showing claim prevalence and mean scaled claim size.  That is done below. I’ve annotated the graphic with labels showing what might represent a year in which there was a tornado outbreak, a catastrophic hurricane, or a tropical storm, as well as the large cluster of points representing years in which there was minimal storm damage.

Claim prevalence and mean scaled claim size for 10,000 year simulation


Equipped with our copula, we can now generate losses at the individual policyholder level for any given year.  The idea is to create a “parameter mixture distribution” using the copula. As it turns out, one component of this parameter mixture distribution is itself a mixture distribution.

Dear reader, you now have a choice.  If you like details, have a little bit of a mathematical background and want to understand better how this model works, just keep reading at “A Mini-Course on Mixture and Parameter Mixture Distributions.”  If you just want the big picture, skip to “Simulating at the Policyholder Level” below.

A Mini-Course on Mixture and Parameter Mixture Distributions

To fully understand this model, we need some understanding of a mixture distribution and a parameter mixture distribution.  Let’s start with the mixture distribution, since that is easier.  Imagine a distribution in which you first randomly determine which underlying component distribution you are going to use and then you take a draw from the selected underlying component distribution.  You might, for example, roll a conventional six-sided die, which is a physical representation of what statisticians call a “discrete uniform distribution.”  If the die came up 5 or 6, you then draw from a beta distribution with a mean of 0.7 and a standard deviation of 0.3 times the maximum.  But if the die came up 1 through 4, you would draw from a uniform distribution on the interval [0,0.1].  The diagram below shows the probability density function of the resulting mixture distribution (in red) and the underlying components in blue.
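A minimal Python sketch of this die-based mixture (my own illustration, not code from the post) shows the mechanics: pick the component first, then draw from it.

```python
import random

def mixture_draw(rng):
    """One draw from the mixture: a die roll picks the component distribution."""
    if rng.randint(1, 6) >= 5:
        # 5 or 6: beta with mean 0.7 and std dev 0.3 of the maximum possible,
        # which means alpha + beta = 1/0.3**2 - 1
        s = 1.0 / 0.3**2 - 1.0
        return rng.betavariate(0.7 * s, 0.3 * s)
    # 1 through 4: uniform on [0, 0.1]
    return rng.uniform(0.0, 0.1)

rng = random.Random(0)
mean_draw = sum(mixture_draw(rng) for _ in range(100_000)) / 100_000
# The mixture mean should be near (2/6)*0.7 + (4/6)*0.05, about 0.267
```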

Mixture Distribution with beta and uniform components


The mixture distribution has a finite number of underlying component distributions and discrete weights that you select. The parameter mixture distribution generalizes this idea: it can handle infinitely many underlying component distributions, and its weights are themselves draws from a statistical distribution. Suppose we create a continuous function [latex]f[/latex] that takes a parameter [latex]x[/latex] and creates a triangular distribution with mean [latex]x[/latex], extending 1/4 in each direction from that mean.  We will call this triangular distribution family the underlying distribution of the parameter mixture distribution; the particular member of the family used is determined by the value of the parameter. And, now, we want to create a “meta distribution” — a parameter mixture distribution — in which the probability of drawing a particular parameter [latex]x[/latex], and in turn getting the triangular distribution with mean [latex]x[/latex], is itself determined by another distribution, which I will call [latex]w[/latex]. The distribution [latex]w[/latex] is the weighting distribution of the parameter mixture distribution. To make this concrete, suppose [latex]w[/latex] is a uniform distribution on the interval [0,1].
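Under these assumptions (uniform weighting distribution w on [0,1]; symmetric triangular underlying distributions of half-width 1/4), a parameter mixture draw is just a two-stage draw. The Python sketch below is my own illustration:

```python
import random

def pmd_draw(rng):
    """Two-stage draw: first the parameter x from the weighting distribution w,
    then from the triangular distribution with mean x and half-width 1/4."""
    x = rng.uniform(0.0, 1.0)                      # weighting distribution w
    return rng.triangular(x - 0.25, x + 0.25, x)   # underlying distribution

rng = random.Random(1)
mean_pmd = sum(pmd_draw(rng) for _ in range(100_000)) / 100_000
# By the law of total expectation, the overall mean equals the mean of w, i.e. 0.5
```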

The diagram below shows the result.  The blue triangular underlying distributions represent a sample of the probability density functions of triangular distributions.  There are actually an infinite number of these triangular distributions, but obviously I can’t draw them all here. Notice that some of the density functions are more opaque than others. The opacity of each probability density function is based on the probability that such a distribution would be drawn from [latex]w[/latex].  The red line shows the probability density function of the resulting parameter mixture distribution.  It is kind of an envelope of these triangular distributions.

Parameter mixture distribution for triangular distributions where mean of triangular distributions is drawn from a uniform distribution


We can combine mixture distributions and parameter mixture distributions.  We can have a mixture distribution in which one or more of the underlying functions is a parameter mixture distribution.  And, we can have a parameter mixture distribution in which either the underlying function and/or the weighting function is a mixture distribution.

It’s that combination — a parameter mixture distribution in which the underlying function is a mixture distribution — that we’re going to need to get a good simulation of the damages caused by storms. The weighting distribution of this parameter mixture distribution is our copula. It throws out two parameters:  (1) [latex]\nu[/latex], the likelihood that in any given year the policyholder has a non-zero claim, and (2) [latex]\zeta[/latex], the mean scaled claim size assuming that the policyholder has a non-zero claim.  Those two parameters are going to weight members of the underlying distribution, which is a mixture distribution. The weights of the mixture distribution are the likelihood that the policyholder has no claim and the likelihood that the policyholder has a non-zero claim (claim prevalence). The component distributions of the mixture distribution are (1) a distribution that always produces zero and (2) any distribution satisfying the constraint that its mean is equal to the mean scaled claim size.  I’m going to use another beta distribution for this latter purpose, with a standard deviation equal to 0.2 of the maximum possible; I’ll denote this distribution as B. Some examination of data from Hurricane Ike is not inconsistent with the use of this distribution, and the distribution has the virtue of being analytically tractable and relatively easy to compute.
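Putting the pieces together for a single policyholder: given a year’s (ν, ζ) pair from the copula, a draw from the underlying mixture is zero with probability 1 − ν and otherwise comes from the beta distribution B with mean ζ and standard deviation 0.2 of the maximum. A Python sketch of that draw (names mine):

```python
import random

def beta_ab(mu, kappa):
    # (mean, fraction-of-max-std-dev) -> Beta(alpha, beta), alpha + beta = 1/kappa**2 - 1
    s = 1.0 / kappa**2 - 1.0
    return mu * s, (1.0 - mu) * s

def policyholder_loss(nu, zeta, rng):
    """Scaled loss for one policyholder in a year with copula parameters (nu, zeta)."""
    if rng.random() >= nu:
        return 0.0                         # component (1): no claim
    a, b = beta_ab(zeta, 0.2)              # component (2): the distribution B
    return rng.betavariate(a, b)

rng = random.Random(2)
mean_loss = sum(policyholder_loss(0.03, 0.1, rng) for _ in range(200_000)) / 200_000
# The unconditional mean scaled loss should be near nu * zeta = 0.003
```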

This diagram may help understand what is going on.

The idea behind the parameter mixture distribution


Simulating at the Policyholder Level

So, we can now simulate a large insurance pool over the course of years by making, say, 10,000 draws from our copula.  And from each draw of the copula, we can determine the claim size for each of the policyholders insured in that sample year. Here’s an example.  Suppose our copula produces a year with some serious damage: a claim prevalence of 0.03 and a mean scaled claim size of 0.1.  If we simulate the fate of 250,000 policyholders, we find that about 242,500 have no claim.  The graphic below shows the distribution of scaled claim sizes among those who did have a non-zero claim.

Scaled claim sizes for sample year


Fortunately, however, we don’t need to sample 250,000 policyholders each year for 10,000 years to get a good picture of what is going on.  We can simulate things quite nicely by looking at the condition of just 2,500 policyholders and then multiplying aggregate losses by 100.  The graphic below shows a logarithmic plot of aggregate losses assuming a total insured value in the pool of $71 billion (which is about what TWIA has had recently).

Aggregate losses (simulated) on $71 billion of insured property

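To check the 100x shortcut, here is a rough sketch of my own (hypothetical seed; year parameters taken from the example above) comparing a full 250,000-policy year against 2,500 policies scaled up by 100:

```python
import random

def beta_ab(mu, kappa):
    # (mean, fraction-of-max-std-dev) -> standard Beta(alpha, beta)
    s = 1.0 / kappa**2 - 1.0
    return mu * s, (1.0 - mu) * s

def year_loss(nu, zeta, n_policies, tiv_per_policy, rng):
    """Aggregate dollar losses across a pool for one simulated year."""
    a, b = beta_ab(zeta, 0.2)
    total = 0.0
    for _ in range(n_policies):
        if rng.random() < nu:              # this policyholder has a claim
            total += rng.betavariate(a, b) * tiv_per_policy
    return total

rng = random.Random(3)
tiv_per_policy = 71e9 / 250_000            # $71 billion over 250,000 policies

full = year_loss(0.03, 0.1, 250_000, tiv_per_policy, rng)
est = 100 * year_loss(0.03, 0.1, 2_500, tiv_per_policy, rng)
# Both should land near nu * zeta * $71e9, about $213 million; est is noisier
```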

We can also show a classical “exceedance curve” for our model.  The graphic below varies the aggregate losses on $71 billion of insured property and shows, for each value, the probability (on a logarithmic scale) that losses would exceed that amount.  One can thus get a sense of the damage caused by the 100-year storm and the 1,000-year storm.  The figures don’t perfectly match TWIA’s internal models, but that’s simply because our parameters have not been tweaked at this point to accomplish that goal.

Exceedance curve (logarithmic) for sample 10,000 year run

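An empirical exceedance curve is simple to compute from simulated annual losses. This sketch (my own) also reads off the N-year storm as the loss level exceeded with empirical probability 1/N:

```python
def exceedance_curve(losses):
    """Empirical P(annual loss > x), evaluated at each simulated loss x."""
    xs = sorted(losses)
    n = len(xs)
    return [(x, (n - i - 1) / n) for i, x in enumerate(xs)]

def n_year_loss(losses, n_years):
    """Loss level exceeded with empirical probability 1/n_years
    (e.g., n_years=100 gives the 100-year storm)."""
    xs = sorted(losses)
    return xs[int(len(xs) * (1 - 1 / n_years))]

curve = exceedance_curve([0.0, 1.0, 2.0, 5.0, 10.0])
# curve == [(0.0, 0.8), (1.0, 0.6), (2.0, 0.4), (5.0, 0.2), (10.0, 0.0)]
```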

The final step is to model how extra precautions by a policyholder might alter these losses.  Presumably, precautions are like most economic things: there is a diminishing marginal return on investment.  So, I can roughly model matters by saying that a precaution level of [latex]x[/latex] results in the insured drawing from a new beta distribution with a mean equal to [latex]\ell \times 2^{-x}[/latex], where [latex]\ell[/latex] is the amount of damage they would have suffered had they taken no extra precautions. (I’ll keep the standard deviation of this beta distribution equal to 0.2 of its maximum possible value.) I have thus calibrated extra precautions such that each unit of extra precautions cuts the mean losses in half. That doesn’t mean precautions can’t sometimes result in greater or lesser savings; it just means that, on average, each unit of precautions cuts the losses in half.
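The calibration of precautions is just exponential decay in the mean loss; a one-line sketch (function name mine):

```python
def mean_loss_with_precautions(ell, x):
    """Mean scaled loss after x units of extra precautions, where ell is the
    no-precaution mean loss. Each unit of precautions halves the mean."""
    return ell * 2.0 ** (-x)

# One unit of precautions: 0.1 -> 0.05; two units: 0.1 -> 0.025
```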

And, we’re done!  We’ve now got a storm model that, when combined with the model of policyholder behavior that I will present in a future blog entry, should give us respectable predictions on the ability of insurance contract features such as deductibles and coinsurance to alter aggregate storm losses. Stay tuned!


[1] As I recognized a bit belatedly in this project, if one makes multiple draws from a copula distribution, it is not the case that the mean of the product of the two drawn values [latex]\nu[/latex] and [latex]\zeta[/latex] is equal to the product of their individual means. You can see why this might be by imagining a copula distribution in which the two values were perfectly correlated, in which case one would effectively be drawing from a distribution transformed by squaring.  And the mean of such a squared distribution is not equal to the square of the mean of the underlying distribution.

[2] Copulas got a bad name over the past 10 years for bearing some responsibility for the financial crisis.  This infamy, however, has nothing to do with the mathematics of copulas, which remains quite brilliant, but with their abuse and with the fact that incorrect distributions were inserted into the copula.

[3] We thus end up with a copula distribution whose probability density function takes on this rather ghastly closed form.  (It won’t be on the exam.)

[latex]\frac{(1-\zeta )^{\frac{1-\mu _{\zeta }}{\kappa _{\zeta }^2}+\mu _{\zeta }-2} \zeta ^{\left(\frac{1}{\kappa _{\zeta }^2}-1\right) \mu _{\zeta }-1} (1-\nu )^{\frac{1-\mu _{\nu }}{\kappa _{\nu }^2}+\mu _{\nu }-2} \nu ^{\left(\frac{1}{\kappa _{\nu }^2}-1\right) \mu_{\nu }-1} \exp \left(\frac{\left(\text{erfc}^{-1}\left(2 I_{\zeta }\left(\left(\frac{1}{\kappa _{\zeta }^2}-1\right) \mu _{\zeta },\frac{\left(\kappa _{\zeta }^2-1\right) \left(\mu _{\zeta }-1\right)}{\kappa _{\zeta }^2}\right)\right)-\rho \text{erfc}^{-1}\left(2 I_{\nu }\left(\left(\frac{1}{\kappa _{\nu }^2}-1\right) \mu _{\nu },\frac{\left(\kappa _{\nu }^2-1\right) \left(\mu _{\nu }-1\right)}{\kappa _{\nu }^2}\right)\right)\right){}^2}{\rho ^2-1}+\text{erfc}^{-1}\left(2 I_{\zeta }\left(\left(\frac{1}{\kappa _{\zeta }^2}-1\right) \mu _{\zeta },\frac{\left(\kappa _{\zeta }^2-1\right) \left(\mu _{\zeta }-1\right)}{\kappa _{\zeta }^2}\right)\right){}^2\right)}{\sqrt{1-\rho ^2} B\left(\left(\frac{1}{\kappa _{\zeta }^2}-1\right) \mu _{\zeta },\frac{\left(\kappa _{\zeta }^2-1\right) \left(\mu _{\zeta }-1\right)}{\kappa _{\zeta }^2}\right) B\left(\left(\frac{1}{\kappa _{\nu }^2}-1\right) \mu _{\nu },\frac{\left(\kappa _{\nu }^2-1\right) \left(\mu _{\nu }-1\right)}{\kappa _{\nu }^2}\right)}[/latex]



Some free market principles for addressing the TWIA issue

The R Street Institute has published a list of six principles for addressing the TWIA solvency issue.  You can find them here.  I think you’ll find that they are pretty consistent with some of the ideas I have been advancing.  As always, however, the devil is in the details and in making a transition away from a long-standing government subsidized program upon which some people and businesses have come to depend.

The still-mysterious variation in private market penetration along the Texas coast

There’s this fact that stares you in the face as you try to figure out whether, as I have hoped, private insurers might significantly displace the system of coastal windstorm insurance in Texas currently dominated by TWIA. It’s that the private market appears to be alive and well in parts of the Texas coast. In Cameron County, for example, TWIA has only 31% of the residential market. And in Kleberg County, the figure is 27%. On the other hand, TWIA holds 77% of the residential market in Galveston County and 81% in Aransas County. What accounts for this variation? Maybe if we could figure it out, we could engineer some policies likely to induce the private market to re-enter to a greater extent throughout the coast.

I will save you the trouble of reading ahead. I didn’t find much. The variation remains pretty much of a mystery. I look forward to suggestions for further experimentation or someone who will just reveal the obvious answer.

If anyone, by the way, has data on the proportion of property or the population that is located within some distance of the actual coastline within each county, I’d be very interested in seeing that.  Maybe the reason the geographic data isn’t showing anything is that the county divisions are too coarse.  If, for example, Galveston County has a higher proportion of its population living close to the ocean than does, say, Kleberg County and if insurers don’t feel, for some reason, that they can underwrite within counties, that might provide some better explanation for the variation in private insurer participation in the Texas coastal windstorm market.

For those who care how I came by my “negative result” — just the sort of thing many academic journals tend to disdain — I offer the following brief synopsis.

If you just look at a map, no particular pattern appears.

What if we look at some data? I grabbed data on the TWIA counties I thought might possibly be relevant from the United States census. Maybe population density is important, on the thought that the denser the county, the more correlated the risk and the less private insurers would want to write there. Or maybe private insurers have greater (or lesser) marketing power in densely populated counties. I grabbed median income data on the thought that private insurers might prefer to write policies in wealthier counties. I grabbed ethnicity data on the thought that race and ethnicity often matter in modern America — not necessarily causally, but because race and ethnicity end up correlating with things that matter. We end up with 14 data points and three independent variables. There’s not a huge amount one can do with data sets this small, but I thought I’d give it a try.

If one does a simple-minded logit regression, one ends up with the following somewhat unusual result. With these three variables, we end up accounting for about 72% of the variation in the data, but no single variable is statistically significant, or even close.

Logit Model Fit of TWIA Market Share

We can also try something more sophisticated. Instead of just assuming a logistic linear relationship between the independent variables and the dependent one (TWIA penetration), we can ask the computer to explore a huge space of potential models and see if anything turns up. Such statistical work used to be impracticable without supercomputers due to the amount of computation involved and the custom programming required. It’s now eminently possible on an average desktop with software such as DataModeler from Evolved Analytics.  Although this process can yield remarkable gains in understanding a system, such was not the case here.  For this small dataset, exploring a much larger model space leaves us with a number of models that have somewhat higher R-squared values than our logistic regression, but nothing to truly brag about and none that clearly point toward one or another of the variables in our model as being critical.

Sample results of Data Modeler predicting TWIA penetration

I thus end up saying that, for now, the mystery of varying market penetration remains unsolved.

Why TWIA policyholders should be warned of the insolvency risk, even in Corpus Christi

One of the proposals I have made is that TWIA policyholders be warned of the risk of insolvency and the risk of post-event assessments.  At yesterday’s hearing, Representative Todd Hunter indicated that such a warning was not needed and might needlessly scare policyholders or lenders.  If you go to 1:57:50 to 2:02:18 of the recording of the hearing, you can hear the exchange.

There is indeed a balance between warning people about risk and unduly scaring them.  And there is a wealth of evidence showing people aren’t very good at assessing infrequent risks.  But, to my mind, the exchange missed the point in several ways.

1.  Even though there may have been but one storm in a simulated 10,000 years that resulted in (a) a hit on Corpus Christi that (b) bankrupted TWIA, that is not the most relevant risk.  The more relevant risk — even from a selfish Corpus-Christi-only perspective —  is the probability that there will be a hit somewhere on the Texas coast that disables TWIA from paying claims to people in Corpus Christi either that year or in subsequent years.  And there, the relevant annual risk is probably at least 1 in 60.

2.  Why?  First, hurricanes can and do cluster.  So, when a significant strike in Freeport wipes out TWIA that means that even a little tropical storm or category 1 that hits Corpus the same year will leave no money in the TWIA piggybank from which to pay claims.  Second, when a significant strike wipes out TWIA that means that even a modest hurricane that hits Corpus in subsequent years will be very, very difficult to finance.  TWIA won’t have a catastrophe reserve fund, it won’t be realistic to surcharge TWIA policyholders a second time, and the bond market is unlikely to accept a second round of post-event bonding.  All that TWIA will have is a little buffer of premium revenues.

3. I believe that’s what TWIA head John Polak was attempting to communicate in his sparring with Representative Hunter, though he didn’t do the clearest job of it when put on the spot.

4.  So, is the risk of a Corpus Christi policyholder holding a claim against an insolvent TWIA less than the risk for a Galveston policyholder?  Yes.  But the risk IS NOT 1 in 10,000.  It is much higher.

5. Should TWIA warn policyholders? At least until TWIA is fixed, I think it definitely should.  Policyholders don’t need to be scared about every unlikely event, but they have a right as adults to know of a substantial risk.  Losing your house and facing an insolvent insurer qualifies. We warn holders of surplus lines policies of lesser protections against insurer insolvency with a great big stamp on the policy.  Why not the same for an equally unguaranteed and often far riskier insurer? And while we’re warning, let’s also warn them of the potential for post-event Class 1 assessments, for which the risk is far higher still and uniform throughout the TWIA territory.

6. Practice pointer for insurance agents and their errors & omissions insurers: consider disclosing the risk (and requiring disclosure) even if there is currently no regulatory obligation that you do so. With all the risk information floating about, do you really want a jury facing a plaintiff with a lost house being your test case for silence?

7. The better question, I think, is the utility of the warning.  As Public Insurance Counsel Deeia Beck queried at the hearing, what are you supposed to do once you are warned?  If my hopes are realized, we will have set up an environment in which persons on the coast do have private alternatives to TWIA.  If they know that the Allstate policy costs 10% more but is more likely to pay, maybe they will opt for Allstate, and TWIA will be voluntarily depopulated without government coercion.  And if those now-warned persons realize they don’t have a choice, maybe they will wake up and realize that the current system, though providing some benefit, is actually victimizing them.  Maybe they’ll stop listening to those who offer false comfort now and demand a market or alternative that provides better protection.