American car makers no longer believe Americans are buying their cars the way they used to.
The car industry was born in the United States, and American car companies dominated it until recently. As American car buyers’ tastes change, so do the cars they choose to buy, and it now seems that American car companies are losing their hold on their own turf. So, does that mean it’s the end of the American car in…