When should you take your car to the dealership?

Are there times it's better to take your car to the dealership rather than a private mechanic? I know that the dealership is far more expensive for repairs, so I don't take it there.

“Perhaps the main reason to take your car to the dealership is to have work done that’s covered by your free maintenance plan or your manufacturer’s warranty.
You may also take your car to the dealer if you have issues with airbags or your car has been subject to a recall. Dealerships often work with manufacturers to replace recalled parts.
Other than that, there’s really no reason to take your car to the dealer unless you don’t have a trusted mechanic. Just be warned: you’re going to pay far more by going to the dealer, even if your car insurance is paying for repairs.
While you’re shopping for mechanics, it’s also a good idea to review your car insurance policy to make sure you’re getting the best deal. Try using the Jerry app to automate this process. Every six months, Jerry will review your car insurance policy and compare it to other companies to ensure you’re getting the best rate. If you’re not, Jerry will send the best deals to your phone in minutes. And the best part: Jerry does all of this for free.”
Eric Schad
Answered on Aug 31, 2021
Eric Schad has been a freelance writer for nearly a decade, as well as an SEO specialist and editor for the past five years. Before getting behind the keyboard, he worked in the finance and music industries (the perfect combo). With a wide array of professional and personal experiences, he’s developed a knack for tone and branding across many different verticals. Away from the computer, Schad is a blues guitar shredder, crazed sports fan, and always down for a spontaneous trip anywhere around the globe.
