
Obama: Hollywood Makes America Look Good


In his desperate efforts to distract the public from Obamacare’s disasters, the president appealed to his friends in Hollywood. “Entertainment is part of our American diplomacy,” he told a crowd at DreamWorks studios. “It’s part of what makes us exceptional, part of what makes us such a world power.”

That was true 60 years ago, when Hollywood conveyed a wholesome, heroic vision of America, but consider the images in movies and TV today: popular entertainment portrays a world far more violent and sexualized, and far less friendly to family and faith, than the real world where most Americans live. The season's top movie, the Hunger Games sequel Catching Fire, shows an impoverished, oppressive future dominated by cruel elites.

While Barack Obama flatters his Hollywood fans, their often dark and disturbing diversions hardly improve America’s image.
