The historical actions of our Christian nation are considered.
From its very founding,
America has always considered
itself to be a Christian nation.
It has been governed by white men
full of proper Christian fervor.
Each Sunday the churches
were full of pious people
rejoicing to God, while
they prospered on the backs
of Colored slaves, property
to be bought and sold at will.
After the War of Yankee Aggression,
the white men turned to settling
the west. To the prairies they brought
prayer, while decimating Native
Americans and stealing their land.
In the first half of the twentieth
century, they wore their hoods
and robes to burn their fiery crosses,
confident they were being good
Christians while keeping the Jews,
Coloreds, and other lesser folk in line.
At the start of World War Two,
in a display of Christian charity,
America forced its Japanese-American
citizens behind barbed wire fences
guarded by armed soldiers, after
confiscating their homes and possessions.
In past decades, America
has invaded other countries
with impunity, reinforcing
its reputation with the world
as a war-mongering nation.
Recently, good Christians
in America have sought to ban
the building of mosques
by followers of the Muslim faith
in several American cities.
Today rancor thrives in our politics,
as racism toward our black president,
Hispanics, and Muslims abounds.
Yes, throughout America’s history
Americans have always considered
this to be a great Christian nation.
Unfortunately, all too often
in our country’s long history
we have failed to behave as such.