Why is car insurance required when other types of insurance in the US aren't?

Health insurance isn't required in the United States. So why do state governments require you to have car insurance?

Eric Schad
Reviewed by Shannon Martin, Licensed Insurance Agent
“All states but Virginia and New Hampshire require liability insurance to get behind the wheel. It’s mandatory because it covers damage to other people’s property and health. If you don’t have health insurance, the consequences affect only you. But if you hit someone while uninsured, their only recourse is to sue you. From that perspective, car insurance is a legal requirement because it protects other motorists from the consequences of your actions.”
