Confidence in Obama Lifts U.S. Image Around the World
In many countries, opinions of the United States are now about as positive as they were at the beginning of the decade, before George W. Bush took office. Improvements in the U.S. image have been most pronounced in Western Europe, where favorable ratings for both the nation and the American people have soared. But opinions of America have also become more positive in key countries in Latin America, Africa and Asia. Signs of improvement in views of America are seen even in some predominantly Muslim countries.