Why is car insurance required when other types of insurance in the US aren't?

The federal government doesn't require you to carry health insurance. So why do state governments require you to have car insurance?

Apr 28, 2021
Answer provided by Eric Schad
“All states except Virginia and New Hampshire require liability insurance to get behind the wheel.
It’s mandatory because it covers damage to other people’s property and injuries to other people. If you don’t have health insurance, the consequences fall on you alone.
But if you hit someone while uninsured, their only recourse is to sue you. From that perspective, car insurance is a legal requirement because it protects other motorists from the financial consequences of your actions.”