Do car dealerships sell your loan to other lenders?

Is it true that car dealerships sell your loan? I've heard the reason car dealers want you to finance with them is that they plan to sell your loan.

Answer provided by
Eric Schad
Answered on Jul 03, 2021
“They don’t always sell loans, but they can.
Typically, a car dealer will want you to finance through them because they can mark up the APR on your new car loan and keep the difference.
Dealers can prey on buyers who aren't familiar with the financing process, as well as those with bad credit. However, plenty of dealerships offer fair rates, so don't assume every dealer is trying to scam you.
Car dealerships can also sell your loan to another lender to recoup some profit on the deal, especially if you negotiated hard and got the purchase price significantly reduced.
Dealers may also work with multiple lenders, so you might get several different offers. That might be the truth behind the rumors that you heard.”
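To put the APR markup in concrete terms, here is a small sketch using the standard loan-amortization formula. The dollar amounts and rates below are hypothetical examples, not figures from the answer, chosen only to show how a couple of percentage points changes the monthly payment:

```python
def monthly_payment(principal, annual_rate, months):
    """Fixed monthly payment from the standard amortization formula."""
    r = annual_rate / 12  # monthly interest rate
    if r == 0:
        return principal / months
    return principal * r / (1 - (1 + r) ** -months)

# Hypothetical example: $30,000 financed over 60 months.
# Suppose a bank would offer 5% APR, but the dealer quotes 7%.
base = monthly_payment(30_000, 0.05, 60)
marked_up = monthly_payment(30_000, 0.07, 60)

print(f"Payment at 5% APR: ${base:,.2f}/mo")
print(f"Payment at 7% APR: ${marked_up:,.2f}/mo")
print(f"Extra paid over the loan: ${(marked_up - base) * 60:,.2f}")
```

In this hypothetical, the two-point markup costs the buyer roughly $1,700 over the life of the loan, which is why getting a pre-approved rate from your own bank before visiting the dealership gives you a useful benchmark.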
