Why is car insurance required when other types of insurance in the US aren't?
Health insurance isn't required in the United States. So why do state governments require you to have car insurance?
Apr 28, 2021
“All states but Virginia and New Hampshire require liability insurance to get behind the wheel.
It's mandatory because it covers damage to other people's property and health. If you don't have health insurance, the consequences fall only on you.
But if you hit someone while driving without car insurance, they have no recourse other than to sue you. From that perspective, car insurance is a legal requirement because it protects other people on the road from your actions.”
What others are asking
Can I take a car that I cosigned for after a breakup?
I cosigned a car loan for my then-boyfriend two years ago and I've made all the payments. After our breakup, I asked him to give the car to me but he refused. Is there a way for me to get the car in my name only?
Mar 12, 2021
If my home is ruined, can I fight to get the maximum payout from my insurer?
"A huge tree fell on my house and the property's been deemed a complete loss. My home was insured for $210,000, but my insurance company is only willing to pay me $160,000. Do I have any grounds to fight them for the maximum payout of $210,000? "
Apr 07, 2021