If Insurance Companies Told The Truth


What would it be like if insurance companies actually had to tell the truth in their advertising?

Take car insurance, for example: you're legally required to have it, yet insurers will fight tooth and nail to avoid paying out when you have a crash, which is ultimately what they're supposed to do.

(©2015 CBS Local Media, a division of CBS Radio Inc. All Rights Reserved. This material may not be published, broadcast, rewritten, or redistributed.)

