Understanding Risk and Uncertainty in Insurance Markets
Explore the concept of risk and uncertainty in insurance markets, with insights on the definitions, expected value calculations, and practical examples like daily number bets and roulette. Gain a deeper understanding of how risk and uncertainty play crucial roles in decision-making and financial outcomes.
Chapter 9: Risk, Uncertainty and the Market for Insurance
"The policy of being too cautious is the greatest risk of all." (attributed to Jawaharlal Nehru, former Prime Minister of India)
"Risk comes from not knowing what you're doing." (Warren Buffett)
Concepts: Risk and Uncertainty Definitions: Risk exists when there is a known probability of random variance in the outcome of a given action. For example, we know that the flip of a fair coin involves an equal chance of seeing a heads or a tails, but risk implies that the outcome will vary randomly from one flip to the next. Uncertainty is defined as imperfect information about the outcome of a given action, whether or not the outcome involves risk.
Expected Value The expected value of a single outcome equals the probability that the outcome will occur times the payoff received if it does occur. If the probability of outcome i equals pi and the payoff equals Mi, then the expected value of that outcome equals pi × Mi. The Expected Value (EV) of a risky decision equals the sum of the expected values of all possible outcomes of the decision. In symbols, EV = Σi pi Mi, where Σi denotes the sum over all i possible outcomes, pi equals the probability of outcome i, and Mi equals the money payoff (positive or negative) from outcome i. For a case with three possible outcomes (i = 1 through 3), this formula equals p1M1 + p2M2 + p3M3, where p1 + p2 + p3 = 1.
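The formula is easy to check with a few lines of code. The following is a minimal Python sketch, not from the text; the function name and the example probabilities and payoffs are illustrative assumptions.

```python
def expected_value(outcomes):
    """Expected value of a risky decision.

    `outcomes` is a list of (probability, payoff) pairs whose
    probabilities are assumed to sum to 1.
    """
    return sum(p * m for p, m in outcomes)

# Three-outcome case: p1*M1 + p2*M2 + p3*M3
print(expected_value([(0.5, 10), (0.3, 0), (0.2, -20)]))  # 0.5*10 + 0.3*0 + 0.2*(-20) = 1.0
```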
Example: The daily number Assume you buy a one-dollar ticket with a three-digit number. Each digit is chosen separately and can equal 0 through 9, so the resulting number can be anything from 000 to 999, giving 1,000 equally likely outcomes. If you win, the payoff is $500 minus the one-dollar cost of your ticket; otherwise you lose $1. The expected value of this daily number bet equals 1/1,000 × ($500 - $1) + 999/1,000 × (-$1) = -$0.50.
Example: Roulette Roulette involves rolling a ball around a moving wheel and betting on where the ball lands. A U.S. roulette wheel includes the numbers 1-36, half black and half red, plus 0 and 00 slots, which are green. You lose if the ball lands on a green slot. A simple bet could be high/low, red/black, or even/odd. The payoff for winning equals the bet, so someone who bets $10 ends up with $10 more. Let's use this information to answer the following questions:
Example: Roulette 1. What are the odds of winning a single roll for betting on red? 2. What is the expected value of a $1 bet on red?
Example: Roulette 1. What are the odds of winning a single roll for betting on red? 18/38, or .4737. 2. What is the expected value of a $1 bet on red? 18/38 × $1 + 20/38 × (-$1) = -$0.0526, or about -5 cents.
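Both answers can be verified with a quick calculation; this Python sketch (variable names are mine) reproduces the daily-number and roulette expected values:

```python
# Daily number: a 1/1,000 chance of winning $500 on a $1 ticket
ev_daily = (1 / 1000) * (500 - 1) + (999 / 1000) * (-1)
print(round(ev_daily, 4))   # -0.5, i.e. an expected loss of 50 cents per ticket

# Roulette: a $1 bet on red wins on 18 of the 38 slots
ev_red = (18 / 38) * 1 + (20 / 38) * (-1)
print(round(ev_red, 4))     # -0.0526, about -5 cents per dollar bet
```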
Concept: A Fair Gamble A fair gamble is a risky decision with an expected value of zero. This concept relates to the analysis of attitudes toward risk that follows. Example: Roll a six-sided die. You win $5 if it lands on your number and lose $1 if it doesn't. EV of rolling the die: [1/6 × $5] + [5/6 × (-$1)] = $5/6 - $5/6 = 0.
Decision Trees A decision tree is a chart of the possible outcomes of one or more related risky decisions, arranged in a branching pattern. A diagram of a risky decision must include the odds and the payoffs for each possible outcome, and a decision tree must also identify any places where decisions are made along with the points where risk is resolved.
Decision Tree Concepts The square at the left of the tree in Figure 9-2 represents a decision point and the circles represent points where either probabilities or payoffs are identified. Payoffs are given as dollar amounts on the right, and overall the tree provides a visual representation of an expected value problem.
Reading a decision tree The tree may be read from left to right. The choice presented is whether or not to flip a coin. If the person does not flip, the payoff is zero with complete certainty. If she does flip the coin, there is a 1/2 chance of winning $1.10 and a 1/2 chance of losing $1.
Solving a decision tree problem Solving a decision tree involves starting from the right and working backwards, keeping only those choices with the highest expected values or utilities, until an answer is reached. The expected value of the coin flip is 1/2 × $1.10 + 1/2 × (-$1), or $0.05, while not flipping has a payoff of $0. Replace the risky choice with its expected value, then choose whether to flip. Since $0.05 > $0, this person should flip.
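The backward-induction rule can also be expressed as a small Python sketch; the function names and the way the tree is encoded are my own illustration of the coin-flip example, not the text's notation.

```python
def chance_node(branches):
    """Expected value of a chance node, given (probability, value) branches."""
    return sum(p * v for p, v in branches)

def decision_node(options):
    """Keep the option with the highest expected value."""
    return max(options, key=options.get)

# Work from the right: value the coin flip first, then make the decision on the left.
flip_ev = chance_node([(0.5, 1.10), (0.5, -1.00)])
best = decision_node({"flip": flip_ev, "do not flip": 0.0})
print(best, round(flip_ev, 2))   # flip 0.05
```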
Example: The Iraq Game The decision is whether or not to invade Iraq in order to find and destroy weapons of mass destruction (WMD). Assume a 50 percent chance that these weapons exist. To solve, start at the right of the tree, compute any expected values, and resolve the decisions found there.
Solving the Iraq Game After solving the set of decisions on the right, the problem reduces to a smaller tree. Now calculate the final expected values for invading and not invading Iraq. Based on these numbers, what should be done?
Example: Party or Study? Your Turn 9-3: Assume that you have a choice of two study strategies in the three nights before your next exam: (1) study seriously for all three nights, or (2) cram the night before the test. If you study seriously, you have a 75 percent chance of an A grade and a 25 percent chance of a B. Assign your own payoff values to these alternatives. If you cram on the last night, you will have two extra nights to party, but there is a 50 percent chance you will get a B grade and a 50 percent chance you will freeze and get a D. An A will be out of the question. A. Draw a decision tree for this problem. B. Using the values you assigned for A, B, and D grades, find the expected value for each of the two choices. Add a value for the two nights of partying to the cramming choice if you wish. C. Which choice is best for you? Discuss how different values for partying affect your choice.
The Limits of Expected Value Expected value ignores people's tastes regarding risk. A person's wealth also affects his/her choice of whether to accept a risky decision.
Two fair gambles (9-2) EV ($1 bet) = 1/2 × $1 + 1/2 × (-$1) = $0 (9-3) EV ($1 million bet) = 1/2 × $1,000,000 + 1/2 × (-$1,000,000) = $0 Would you accept the first gamble? The second? If you have different answers, why?
Expected Utility The expected utility model adds two components to the expected value model. First, the value of each payoff is measured in terms of utility rather than dollars. Second, payoffs are measured in terms of a person's total wealth after an outcome occurs, rather than the value of the winning or losing payoff itself, so an initial level of wealth (W0) is added.
Modeling risk attitudes A person's preferences toward risk may fit into one of three categories: risk averse, risk neutral, and risk preferring. Each will be represented by a relatively basic utility function.
Risk Aversion In Figure 9-6, a risk averse person prefers a fixed outcome such as W0 over a fair gamble even though the expected values are equal. A risk averse utility function for wealth is U(W) = √Wealth.
Rhonda: A risk averse person Rhonda has $1,000 of initial wealth, and may place a $100 bet that a fair coin flip will come up heads. The odds are 1/2 of winning and 1/2 of losing, and the payoffs are +$100 and -$100. Her utility function is U(W) = √Wealth. If she wins, she ends up with $1,100, and if she loses she ends up with $900. Putting all the pieces of the expected utility model together, the expected utility of this fair gamble is calculated below.
Rhonda's Decision (9-5) EU = 1/2 × √($1,000 + $100) + 1/2 × √($1,000 - $100) = 1/2 × √$1,100 + 1/2 × √$900 = 16.583 + 15 = 31.583 utils (this outcome is in terms of utility, not dollars). The utility of the original $1,000 equals √$1,000, or 31.62 utils. Since the utility of the initial wealth is higher, she will refuse the gamble and keep her initial wealth.
Risk Preferring Decisions For a risk preferring person the utility of the initial wealth is less than the expected utility of the gamble, because the added utility from winning (the thrill of victory?) is greater than the reduction in utility from losing. A risk-loving utility of wealth is U(W) = (Wealth)².
Randy: A risk lover Randy has initial wealth of $1,000, and is considering a $100 fair gamble on a coin flip. His expected utility for this bet is EU = 1/2 × ($1,000 + $100)² + 1/2 × ($1,000 - $100)² = 605,000 + 405,000 = 1,010,000 utils. For comparison, the utility of his initial wealth is (1,000)², or 1,000,000 utils, less than the expected utility of the gamble. So Randy will prefer a fair gamble over not gambling.
Risk Neutrality: Indifferent to accepting a fair gamble A risk neutral utility function is a linear function of wealth, as in U = a × W, where a is any constant. Example: Nat Neutral is considering the same $100 bet with even odds as Randy and Rhonda did above, and Nat has the same $1,000 in initial wealth. If Nat's utility function is U = W, his expected utility problem is (9-9) EU = 1/2 × (1,000 + 100) + 1/2 × (1,000 - 100) = 1/2 × (1,100) + 1/2 × (900) = 1,000 utils. The utility of the initial wealth is also 1,000 utils (check), so Nat will be indifferent between accepting or refusing the fair gamble.
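The three risk attitudes can be put side by side in one sketch. The Python code below (names are mine) evaluates the same $100 fair gamble from $1,000 of initial wealth under the square-root, squared, and linear utility functions used in the last three slides:

```python
import math

W0, BET, P_WIN = 1000, 100, 0.5

def expected_utility(u):
    """Expected utility of the fair gamble under utility function u."""
    return P_WIN * u(W0 + BET) + (1 - P_WIN) * u(W0 - BET)

utility_functions = {
    "risk averse, U = sqrt(W)": math.sqrt,
    "risk preferring, U = W^2": lambda w: w ** 2,
    "risk neutral, U = W":      lambda w: w,
}

for label, u in utility_functions.items():
    eu, u0 = expected_utility(u), u(W0)
    verdict = "accepts" if eu > u0 else "refuses" if eu < u0 else "is indifferent"
    print(f"{label}: EU = {eu:,.3f} vs U(W0) = {u0:,.3f} -> {verdict}")
```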
Your Turn 9-4: Some gambles Assume you have $100 in spending money for the next week. A classmate offers a straight $20 bet on a coin flip. The expected value of the flip is zero. A. Would you take the bet? Does this mean you are risk averse, risk preferring, or risk neutral? B. Assume that you are risk averse and that the same classmate offers $22 if you win the bet. You would still lose $20. Calculate the expected utility using the square root utility function. Would you take the bet now according to the expected utility model? Would you take the bet in reality? C. Now assume that you are risk preferring, with a utility function of U = (Wealth)². If the same classmate offered you $18 if you win, would you still take the bet according to the model? In reality?
Loss Aversion: Yet another utility function regarding risk Under loss aversion an individual is risk averse toward gains from an initial starting point, but has a risk preferring function for losses (see Figure 9-8). Therefore individuals lose significantly more utility from a small loss than they gain from a gain of the same size.
The Market For Insurance For a market to exist, consumers must be willing to pay somewhat more than the cost of claims to the insurance companies.
Expected Loss to the Insurance Company Expected loss equals the odds of a loss times its dollar value. If a group of reckless drivers has a 1/10 chance of suffering a $2,000 fender-bender, the company's expected loss would equal 1/10 × $2,000, or $200. Expected loss is equal to expected value for a gamble where only losses occur. In this case the EV would equal EV = 1/10 × (-$2,000) + 9/10 × $0 = -$200. Only the sign differs between the expected loss and the expected value, since the word loss already implies a negative value.
The individual's maximum willingness to pay for insurance A risk averse consumer will be willing to pay more to reduce risk than the expected cost of the risk itself. The concept of expected utility will be used to demonstrate this.
Rudy's Insurance Demand For Rudy Averse (Rhonda's father), this problem includes initial wealth of $10,000, a 1/10 chance of suffering a $2,000 accident, and a risk averse (square root) utility function. His expected utility without insurance would be (9-9) EU = 9/10 × √$10,000 + 1/10 × √($10,000 - $2,000) = 90 + 8.944 = 98.944 utils.
Certainty Equivalent Concept: A Certainty Equivalent is the amount of money with no risk which produces the same amount of utility as the expected utility of a gamble. Since Rudy's utility equals the square root of his wealth or income, one can find the certainty equivalent by squaring his expected utility. (9-10) √W = 98.944 utils. Squaring both sides, W = (98.944)² = $9,789.92. If Rudy has $9,789.92 with no risk of loss from an accident, he would be just as happy as he is with his current risk.
Maximum Willingness to Pay for Insurance Assuming that Rudy is able to buy insurance that will cover 100% of his loss, the most he would be willing to pay is the difference between his initial wealth and his certainty equivalent. Concept: Maximum Willingness to Pay for Insurance: A person's maximum willingness to pay for insurance is the difference between her initial wealth and her certainty equivalent. The maximum Rudy would be willing to pay for insurance is $10,000 - $9,789.92, or $210.08. A premium of that size would leave him indifferent between buying the insurance and facing the financial risk of an accident.
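Rudy's numbers can be reproduced by chaining the steps above. In this Python sketch (variable names are mine) the exact certainty equivalent comes out near $9,790; the text's $9,789.92 reflects squaring the rounded figure of 98.944 utils:

```python
import math

W0, LOSS, P_LOSS = 10_000, 2_000, 0.1

expected_loss = P_LOSS * LOSS                                   # $200
eu_no_insurance = (1 - P_LOSS) * math.sqrt(W0) + P_LOSS * math.sqrt(W0 - LOSS)
certainty_equivalent = eu_no_insurance ** 2    # invert U(W) = sqrt(W) by squaring
max_willingness_to_pay = W0 - certainty_equivalent

print(round(eu_no_insurance, 3))         # 98.944 utils
print(round(certainty_equivalent, 2))    # about $9,790
print(round(max_willingness_to_pay, 2))  # about $210
```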
Your Turn Rhonda has a 1/10 chance of being in a wreck, and her financial loss would be $1,900. Assume that her initial wealth equals $10,000, and that she is also risk averse with a utility function of U = √Wealth. A. Find Rhonda's expected loss. B. Find her expected utility without insurance. C. Find her certainty equivalent. D. Find her maximum willingness to pay for insurance.
Concepts and Issues in Insurance Markets Concepts: Community Rating is the requirement that insurance providers charge the same rate to all clients, regardless of the client's level of risk. Adjusted Community Rating limits rate differences to selected categories, and also sometimes limits the size of the rate differences that are allowed. Adverse Selection: Given equal insurance prices for individuals, those with low expected losses will often choose not to buy insurance coverage. Given perfect information, only the less healthy or less safe are likely to buy insurance.
An example using two different groups All persons have initial wealth of $40,000 and a utility function U = √Wealth. Members of the older group have a 1/5 (.2) chance of suffering $3,900 in health care costs per year, while the young group on average has a 1/20 (.05) chance of suffering a $1,975 health cost. STEPS: Find the maximum willingness to pay and expected loss for each group. For community rating, take an average of the expected losses (or perhaps of the maximums), and assume both groups will have to pay this average. If insurance is optional, compare the average cost to each group's maximum willingness to pay. Which group will buy and which will not? This is an example of adverse selection.
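A possible Python sketch of these steps is shown below. The group labels and function names are mine, and it assumes the community rate is the simple average of the two groups' expected losses, which is one of the two options the slide mentions:

```python
import math

W0 = 40_000    # initial wealth, the same for both groups

# (probability of a health loss, size of the loss) for each group
groups = {"older": (0.20, 3_900), "young": (0.05, 1_975)}

def max_willingness_to_pay(p, loss):
    """Initial wealth minus the certainty equivalent, with U(W) = sqrt(W)."""
    eu = (1 - p) * math.sqrt(W0) + p * math.sqrt(W0 - loss)
    return W0 - eu ** 2

expected_loss = {name: p * loss for name, (p, loss) in groups.items()}
community_rate = sum(expected_loss.values()) / len(expected_loss)
print(f"community rate: {community_rate:.2f}")

for name, (p, loss) in groups.items():
    wtp = max_willingness_to_pay(p, loss)
    decision = "buys" if wtp >= community_rate else "opts out"
    print(f"{name}: expected loss {expected_loss[name]:.2f}, max WTP {wtp:.2f} -> {decision}")
```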
Mandated (required) insurance If both groups must buy insurance at the community (average) rate you calculated earlier, they will now face no risk but will also have less wealth because of the purchase. Find the utility of their remaining wealth. Since both groups have the same initial wealth in this case, their new utility will be equal. Compare this utility to their initial expected utility without insurance to see who is better off with the community rated insurance and who is worse off. If you are having trouble, the answers are all in the text.
Expected Net Benefits Concept: Expected Net Benefits = the net benefits of every possible policy outcome weighted by the odds of that outcome. Formula: Expected Net Benefits = Σi pi PVi, where Σi denotes the sum over all of the i possible outcomes, pi equals the probability of outcome i, and PVi equals the present value of the net benefits from outcome i. For a policy with three possible outcomes (i = 1 through 3), this formula would equal p1PV1 + p2PV2 + p3PV3.
Option Price and Option Value Option Price = the maximum amount a person is willing to pay for a risky policy or product before knowing the outcome that will actually occur. This equals the certainty equivalent of the risky decision. Option Value = the difference between the option price and the expected value of the risky policy or product. This is also the maximum a person would pay to reduce risk. Option value will be positive for a risk averse person and negative for a risk preferring person.
Your Turn 9-9: Assume that your campus has a crime problem, particularly at night. Your college or university is considering expanding its security force to provide regular foot patrols between 9 PM and 3 AM. How much would you be willing to pay for this service? What is the average willingness to pay for your class? Assuming that your class's answers are typical, use the average willingness to pay and the total number of students on campus to find the aggregate student willingness to pay for this expanded security service. If each new security officer costs $30,000 per year, how many officers could your college afford to hire for this program?
Concepts for the Capital Asset Pricing Model Diversifiable risk: Risk that is specific to a firm, industry, or locality which can be eliminated through a diversified portfolio of assets. Non-diversifiable or systematic risk: System-wide or market risk that cannot be eliminated through diversification. Beta (β): The variance in an individual rate of return divided by the variance in the market rate of return.
Values of Beta A beta greater than one means that the variance of the individual stock is greater than that of the market as a whole, while a stock with a beta less than one has a lower variance than the market. If an asset is negatively related to the movement of the market, the beta will be negative. A risk free asset will have a beta of zero.
The Capital Asset Pricing Formula Capital Asset Pricing Formula: Required rate of return = risk-free rate + [β × (market rate - risk-free rate)] Your Turn 9-11: Use the capital asset pricing model to find the required return for the following assets:
Table 9-1: Required Rates of Return Example
ASSET | RISK-FREE RATE | MARKET RATE OF RETURN | BETA | REQUIRED RATE
Tech | 2 | 8 | 1.8 |
Food | 2 | 8 | 0.8 |
Gold | 4 | 9 | -0.2 |
Telephone | 3 | 6 | 0.6 | 4.8
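A short Python sketch of the exercise (the function name is mine; the inputs simply restate Table 9-1) applies the formula to each row:

```python
def required_return(risk_free, market, beta):
    """Capital asset pricing formula: risk-free rate + beta * (market - risk-free)."""
    return risk_free + beta * (market - risk_free)

# (risk-free rate, market rate of return, beta), all rates in percent
assets = {
    "Tech":      (2, 8, 1.8),
    "Food":      (2, 8, 0.8),
    "Gold":      (4, 9, -0.2),
    "Telephone": (3, 6, 0.6),
}

for name, (rf, rm, beta) in assets.items():
    print(f"{name}: {required_return(rf, rm, beta):.1f}%")  # Telephone works out to 4.8%
```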
Uncertainty and Policy Analysis
Two methods of including uncertainty in the analysis Sensitivity analysis involves assigning a set of different values to each uncertain variable and then calculating whether changes in these values change the policy decision. Quasi option value is the maximum a person would pay for new information that reduces uncertainty, or the minimum a person would accept to face added uncertainty.
Sensitivity Analysis Example: The Job Corps Program Assume that the Job Corps program will provide $1,100 in benefits per year for either 10 years or 40 years. Also assume that the discount rate is either 3% or 7%. The present value of each benefit stream can be calculated using the following annuity formula from Chapter 7: (8-6) PVannuity = $1,100 × [1 - 1/(1 + r)^n] / r
Job Corps Example The results for the 7 percent interest rate are given below. Your job is to find the net benefits for each benefit stream using the 3 percent discount rate. Under which assumptions is this program worth the money? If the odds are 1/2 for each of the two benefit time periods, what is the expected value of the 40-year and 10-year benefit streams for each interest rate? Is the expected value of the benefits greater than the cost for either interest rate?
DISCOUNT RATE | 40 YEARS OF BENEFITS | 10 YEARS OF BENEFITS | EXPECTED VALUE
7% | $14,665 | $7,726 |
3% | | |
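A Python sketch of the sensitivity table is below. The function name is mine; only the $14,665 and $7,726 figures are given in the text, and the remaining cells are computed from the same annuity formula under equal (1/2) odds for the two benefit streams:

```python
def pv_annuity(payment, r, n):
    """Present value of `payment` received each year for n years at discount rate r."""
    return payment * (1 - 1 / (1 + r) ** n) / r

for r in (0.07, 0.03):
    pv40 = pv_annuity(1_100, r, 40)
    pv10 = pv_annuity(1_100, r, 10)
    expected = 0.5 * pv40 + 0.5 * pv10     # equal odds on the two benefit horizons
    print(f"r = {r:.0%}: 40 yrs = ${pv40:,.0f}, 10 yrs = ${pv10:,.0f}, EV = ${expected:,.0f}")
```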