The Politics of Climate

Methodology

This report is drawn from a survey conducted as part of the American Trends Panel (ATP), a nationally representative panel of randomly selected U.S. adults living in households, created by Pew Research Center. Respondents who self-identify as internet users and who provide an email address participate in the panel via monthly self-administered web surveys; those who do not use the internet or decline to provide an email address participate via mail. The panel is managed by Abt SRBI.

Data in this report are from the May wave of the panel, conducted May 10-June 6, 2016. Most findings in this report are based on the 1,534 respondents (1,385 by web and 149 by mail) who were randomly assigned to complete one of three forms, or sets of questions, on the survey. The margin of sampling error for the sample of 1,534 respondents is plus or minus 4.0 percentage points.
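
The reported margin is larger than the simple random sampling margin for 1,534 interviews because weighting inflates the variance of estimates. A minimal sketch of a design-effect-adjusted calculation is below; the specific design effect value used is an illustrative assumption, not a figure from the report.

    import math

    def margin_of_error(n, deff=1.0, p=0.5, z=1.96):
        """Design-effect-adjusted margin of sampling error, in percentage
        points, for a proportion p at the 95% confidence level."""
        return 100 * z * math.sqrt(deff * p * (1 - p) / n)

    # Simple-random-sampling margin for n = 1,534: about 2.5 points.
    print(round(margin_of_error(1534), 1))
    # A design effect of roughly 2.5 from weighting would bring the margin
    # near the reported 4.0 points (the design effect value is an assumption).
    print(round(margin_of_error(1534, deff=2.5), 1))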

Members of the American Trends Panel were recruited from two large, national landline and cellphone random-digit-dial (RDD) surveys conducted in English and Spanish. At the end of each survey, respondents were invited to join the panel. The first group of panelists was recruited from the 2014 Political Polarization and Typology Survey, conducted Jan. 23 to March 16, 2014. Of the 10,013 adults interviewed, 9,809 were invited to take part in the panel and a total of 5,338 agreed to participate.18 The second group of panelists was recruited from the 2015 Survey on Government, conducted Aug. 27 to Oct. 4, 2015. Of the 6,004 adults interviewed, all were invited to join the panel, and 2,976 agreed to participate.19

Participating panelists provided either a mailing address or an email address to which a welcome packet, a monetary incentive and future survey invitations could be sent. Panelists also receive a small monetary incentive after participating in each wave of the survey.

The ATP data were weighted in a multistep process that begins with a base weight incorporating the respondents’ original survey selection probability and the fact that in 2014 some panelists were subsampled for invitation to the panel. Next, an adjustment was made for the fact that the propensity to join the panel and remain an active panelist varied across different groups in the sample. The final step in the weighting uses an iterative technique that matches gender, age, education, race, Hispanic origin and region to parameters from the U.S. Census Bureau’s 2014 American Community Survey. Population density is weighted to match the 2010 U.S. Decennial Census. Telephone service is weighted to estimates of telephone coverage for 2016 that were projected from the July-December 2015 National Health Interview Survey. Volunteerism is weighted to match the 2013 Current Population Survey Volunteer Supplement. The weighting also adjusts for party affiliation using an average of the three most recent Pew Research Center general public telephone surveys. Internet access is adjusted using a measure from the 2015 Survey on Government. Frequency of internet use is weighted to an estimate of daily internet use projected to 2016 from the 2013 Current Population Survey Computer and Internet Use Supplement. Sampling errors and statistical tests of significance take into account the effect of weighting. Interviews are conducted in both English and Spanish, but the Hispanic sample in the American Trends Panel is predominantly native born and English speaking.
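
The iterative matching technique described above is commonly implemented as raking (iterative proportional fitting). The sketch below illustrates the general idea on simulated data with two made-up margins; the actual ATP weighting uses the survey-based targets and additional adjustment steps listed above, and none of the target values here come from the report.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000
    dims = {
        "gender": rng.integers(0, 2, n),   # 0 = male, 1 = female
        "agegrp": rng.integers(0, 3, n),   # three illustrative age groups
    }
    # Hypothetical population targets (proportions) for each margin
    targets = {
        "gender": np.array([0.48, 0.52]),
        "agegrp": np.array([0.30, 0.40, 0.30]),
    }

    w = np.ones(n)                         # start from a flat base weight
    for _ in range(25):                    # iterate until weighted margins converge
        for name, x in dims.items():
            total = w.sum()                # snapshot before adjusting this margin
            for k, target in enumerate(targets[name]):
                mask = x == k
                w[mask] *= target * total / w[mask].sum()

    for name, x in dims.items():
        shares = [w[x == k].sum() / w.sum() for k in range(len(targets[name]))]
        print(name, np.round(shares, 3))   # weighted shares now match the targets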

The margin of error table shows the unweighted sample sizes and the error attributable to sampling that would be expected at the 95% level of confidence for different groups in the survey. Sample sizes and sampling errors for other subgroups are available upon request.

In addition to sampling error, one should bear in mind that question wording and practical difficulties in conducting surveys can introduce error or bias into the findings of opinion polls.

The web component of the May wave had a response rate of 81% (4,091 responses among 5,053 web-based individuals in the panel); the mail component had a response rate of 77% (472 responses among 617 non-web individuals in the panel). Taking account of the combined, weighted response rate for the recruitment surveys (10.0%) and attrition from panel members who were removed at their request or for inactivity, the cumulative response rate for the May ATP wave is 2.9%.20
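
The cumulative figure can be roughly reconciled with the numbers above under the assumption that it is the product of the recruitment response rate, panel retention and the wave-level response rate; the retention share printed below is implied by that assumption rather than stated in the report.

    # A back-of-the-envelope check, assuming the cumulative response rate
    # decomposes as recruitment rate x panel retention x wave response rate.
    web_responses, web_panelists = 4091, 5053
    mail_responses, mail_panelists = 472, 617

    wave_rr = (web_responses + mail_responses) / (web_panelists + mail_panelists)
    recruit_rr = 0.100      # combined, weighted recruitment response rate
    cumulative_rr = 0.029   # reported cumulative response rate for the May wave

    implied_retention = cumulative_rr / (recruit_rr * wave_rr)
    print(f"combined wave response rate: {wave_rr:.1%}")            # about 80%
    print(f"implied panel retention: {implied_retention:.0%}")      # about 36%, under this assumption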

Questionnaire development and testing

Pew Research Center developed the questionnaire for this study. The design of the questionnaire was informed by the results of nine separate pretests with a non-probability sample, as well as input from Pew Research Center staff and five external advisers on the project.

Outside advisers: Pew Research Center consulted with a number of expert advisers, listed in the acknowledgements section above, to inform the development of the questionnaire. We are grateful to this group for their input, though Pew Research Center bears full responsibility for the questionnaire design and analysis.

Measurement properties of the science knowledge index

The Pew Research Center survey included a set of nine questions to tap public knowledge of science across a range of principles and topics. The set of questions is evaluated here for the degree to which responses are internally consistent, reflect a single underlying factor or dimension, and differentiate people with higher and lower knowledge scores.

As shown in the accompanying table, the internal reliability or consistency of the scale as measured by Cronbach’s alpha is 0.74. Each of the items in the scale is at least moderately correlated with the other items.
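
For readers who want to reproduce this kind of reliability check on their own data, a minimal sketch of Cronbach’s alpha on a simulated nine-item binary battery is shown below; the data are synthetic, so the printed value will not match the reported 0.74.

    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents x k_items) array of 0/1 scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Simulate a 9-item binary knowledge battery (synthetic data only).
    rng = np.random.default_rng(1)
    ability = rng.normal(size=2000)              # latent knowledge level
    difficulty = np.linspace(-1.5, 1.5, 9)       # item difficulties
    probs = 1 / (1 + np.exp(-(ability[:, None] - difficulty[None, :])))
    responses = (rng.random((2000, 9)) < probs).astype(int)

    print(round(cronbach_alpha(responses), 2))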

An exploratory factor analysis finds one common factor explaining 77% of the shared variance in the items. The factor loadings show that each of the nine questions is moderately correlated with this single common factor. These indicators suggest that the set of items is measuring a single underlying dimension.
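
As a rough illustration of a one-factor check, the sketch below extracts the first factor from the Pearson correlation matrix of the same simulated binary items used in the alpha sketch. The extraction method shown here (eigen-decomposition of the correlation matrix) and the printed proportion are illustrative assumptions; the 77% figure above refers to shared variance in the actual survey data.

    import numpy as np

    # Simulated 9-item binary battery (same setup as the alpha sketch above).
    rng = np.random.default_rng(1)
    ability = rng.normal(size=2000)
    difficulty = np.linspace(-1.5, 1.5, 9)
    probs = 1 / (1 + np.exp(-(ability[:, None] - difficulty[None, :])))
    responses = (rng.random((2000, 9)) < probs).astype(int)

    corr = np.corrcoef(responses, rowvar=False)       # 9x9 Pearson correlation matrix
    eigvals, eigvecs = np.linalg.eigh(corr)
    order = np.argsort(eigvals)[::-1]                 # sort factors by eigenvalue
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    loadings = eigvecs[:, 0] * np.sqrt(eigvals[0])    # loadings on the first factor
    print("loadings:", np.round(np.abs(loadings), 2))
    print("first factor / total variance:", round(eigvals[0] / eigvals.sum(), 2))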

Note that all of the science knowledge questions are coded as binary variables. Both the Cronbach’s alpha reliability analysis and the factor analysis are based on a Pearson correlation matrix. Pearson correlations between binary variables are restricted to a limited range and therefore understate the association between two variables relative to tetrachoric correlations. We do not anticipate that the use of a Pearson correlation matrix affects the unidimensional factor solution for the scale, however.

We also ran an item response theory (IRT) analysis to check how well each question distinguishes between those who know relatively more or less on the scale. This analysis fits a two-parameter logistic (2PL) model, allowing discrimination and difficulty to vary across items. Discrimination shows the ability of a question to distinguish between those with higher and lower science knowledge. Difficulty shows how easy or hard each question is for the average respondent. We did not include a guessing parameter in the model because the survey offered respondents an explicit “not sure” response option.
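
To make the discrimination and difficulty parameters concrete, the sketch below evaluates the 2PL response function for two hypothetical items; the parameter values are illustrative, since the report does not publish the fitted estimates.

    import numpy as np

    def p_correct(theta, a, b):
        """Two-parameter logistic (2PL) response function: probability that a
        respondent with knowledge level theta answers correctly, given item
        discrimination a and difficulty b."""
        return 1 / (1 + np.exp(-a * (theta - b)))

    theta = np.linspace(-3, 3, 7)   # knowledge levels from low to high
    # Hypothetical parameters: a well-discriminating, moderately hard item
    # versus a weakly discriminating, easy one (illustrative values only).
    for a, b, label in [(2.0, 0.3, "high discrimination"),
                        (0.7, -1.0, "low discrimination")]:
        print(label, np.round(p_correct(theta, a, b), 2))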

As desired, the results show variation in both difficulty and discrimination across the nine questions. The questions with the strongest ability to discriminate between those who hold more or less science knowledge are 1) the question about which gas is made as a consequence of burning fossil fuels and 2) the question asking respondents to calculate the conditional probability of an old bridge collapsing over time. The question with the weakest ability to discriminate between those with higher and lower science knowledge is the one about antibiotics being effective against bacterial infections but not other kinds of infections.

The test information curve mirrors a normal curve centered around zero, suggesting that the science knowledge index provides the most information about Americans near the mean level of knowledge.
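
Under the 2PL model, test information at a given knowledge level is the sum of the item information functions. The sketch below uses hypothetical item parameters to show how a battery with difficulties spread around zero produces an information curve that peaks near the mean, which is what the description above implies.

    import numpy as np

    def item_info(theta, a, b):
        """Item information under the 2PL model: a^2 * P(theta) * (1 - P(theta))."""
        p = 1 / (1 + np.exp(-a * (theta - b)))
        return a**2 * p * (1 - p)

    # Hypothetical discrimination and difficulty values for nine items.
    a = np.array([1.5, 1.2, 1.0, 0.9, 1.3, 1.1, 0.8, 1.4, 1.0])
    b = np.linspace(-1.5, 1.5, 9)
    theta = np.linspace(-3, 3, 13)

    # Test information is the sum of the item information functions; with
    # difficulties spread around zero, it peaks near the average knowledge level.
    test_info = sum(item_info(theta, ai, bi) for ai, bi in zip(a, b))
    print(np.round(test_info, 2))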

  18. When data collection for the 2014 Political Polarization and Typology Survey began, non-internet users were subsampled at a rate of 25%, but a decision was made shortly thereafter to invite all non-internet users to join. In total, 83% of non-internet users were invited to join the panel.
  19. Respondents to the 2014 Political Polarization and Typology Survey who indicated that they are internet users but refused to provide an email address were initially permitted to participate in the American Trends Panel by mail, but were no longer permitted to join the panel after Feb. 6, 2014. Internet users from the 2015 Survey on Government who refused to provide an email address were not permitted to join the panel.
  20. Approximately once per year, panelists who have not participated in multiple consecutive waves are removed from the panel. These cases are counted in the denominator of cumulative response rates.