
Do car dealerships sell your loan to other lenders?

Is it true that car dealerships sell your loan? I've heard that the reason car dealers want you to finance with them is that they sell your loan afterward.

Eric Schad
Reviewed by Shannon Martin, Licensed Insurance Agent.
“They don’t always sell loans, but they can. Typically, a car dealer wants you to finance through them so they can charge you a higher APR on your new car. Dealers can prey on buyers who aren’t familiar with the financing process, as well as those with bad credit. However, there are also dealerships that offer fair rates, so don’t assume every dealer is trying to scam you.

A dealership can also sell your loan to make extra money on the deal, especially if you negotiated hard and got the price drastically reduced. Dealers often work with multiple lenders, too, so you might receive several different offers. That may be the truth behind the rumors you heard.”
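To put that APR markup in perspective, here is a minimal sketch in Python of how a two-point markup changes a car payment. The loan amount, term, and both rates are illustrative assumptions rather than actual offers, and the calculation uses the standard fixed-rate amortization formula.

def monthly_payment(principal: float, apr: float, months: int) -> float:
    # Standard fixed-rate amortized payment: P * r / (1 - (1 + r)^-n)
    r = apr / 12  # monthly interest rate
    return principal * r / (1 - (1 + r) ** -months)

principal = 30_000   # hypothetical loan amount
months = 60          # five-year term
bank_apr = 0.06      # rate the lender actually approved (assumed)
dealer_apr = 0.08    # same loan with a two-point dealer markup (assumed)

for label, apr in (("bank", bank_apr), ("dealer", dealer_apr)):
    payment = monthly_payment(principal, apr, months)
    interest = payment * months - principal
    print(f"{label}: ${payment:,.2f}/mo, ${interest:,.2f} total interest")

With these assumed numbers, the marked-up loan costs about $28 more per month and roughly $1,700 more in interest over the five-year term.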