Monday, March 16, 2015

Idea of US as Christian Nation Created by Capitalists

During the years following the Depression, Americans were told, time and time again, not just that the country should be a Christian nation, but that it always had been one. They soon came to think of the United States as “one nation under God.” They’ve believed it ever since. But it is not true.

Back in the 1930s, business leaders found themselves on the defensive. Their public prestige had plummeted with the Great Crash; their private businesses were under attack by Franklin D. Roosevelt’s New Deal from above and labor from below. To regain the upper hand, corporate leaders fought back on all fronts. They waged a figurative war in statehouses and, occasionally, a literal one in the streets; their campaigns extended from courts of law to the court of public opinion. But nothing worked particularly well until they began an inspired public relations offensive that cast capitalism as the handmaiden of Christianity.

Read all about it in the New York Times. We've been manipulated by Big Business and hierarchy builders again.