A recent Forbes article talks about "the American dream of home ownership." Housingwire.com asserts that "most view becoming a homeowner as...a chance to put down roots and fulfill the American Dream." Whitehouse.gov dedicates an entire blog post to how Obama is "Promoting the American Dream of Homeownership."
So when did the American Dream become all about home ownership? The whole concept must have shifted over the decades. It used to be about things like freedom, and opportunity, and equality. Yeah, I know there was never an exact definition. But I also know that when I was a boy, back in the seventies, it had nothing to do with home ownership. (Although I seem to recall it had a lot to do with Farrah Fawcett.)