Now that Democrats have won the White House and widened their majorities in both houses of Congress, have American voters moved to the left? Many Republicans doubt it. Research from the Pew Research Center suggests that America remains a right-of-center country, but the details are murky.
Saturday, November 29, 2008