American car makers no longer believe Americans are buying their cars the way they once did.
The car industry was born in the United States, and American car companies dominated it until recently. As American car buyers' tastes change, so do the cars they choose to buy…
