The image of the United States has improved markedly in most parts of the world, reflecting global confidence in Barack Obama. In many countries, opinions of the U.S. are now about as positive as they were at the beginning of the decade, before George W. Bush took office.