
Why Do Some Americans Leave Their Religion While Others Stay?

Methodology

Overview

Much of this report is based on data from Wave 170 of the American Trends Panel (ATP), Pew Research Center’s nationally representative panel of randomly selected U.S. adults. The survey was conducted from May 5 to 11, 2025. A total of 8,937 panelists responded out of 9,531 who were sampled, for a survey-level response rate of 94%.

The cumulative response rate accounting for nonresponse to the recruitment surveys and attrition is 3%. The break-off rate among panelists who logged on to the survey and completed at least one item is less than 1%. The margin of sampling error for the full sample of 8,937 respondents is plus or minus 1.4 percentage points.
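The reported margin of error is larger than the textbook simple-random-sample value for n=8,937 because weighting inflates variance. The sketch below (an illustration, not Pew's published computation) relates the two figures; the design effect shown is inferred from the reported numbers.

```python
import math

def srs_moe(n, p=0.5, z=1.96):
    """95% margin of error, in percentage points, for a simple random sample."""
    return z * math.sqrt(p * (1 - p) / n) * 100

n = 8937
moe_srs = srs_moe(n)                          # about 1.04 points under SRS
moe_reported = 1.4                            # Pew's reported, weight-adjusted MOE
design_effect = (moe_reported / moe_srs) ** 2 # implied variance inflation from weighting
```

The ratio implies a design effect of roughly 1.8, which is typical for a heavily weighted panel survey.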

SSRS conducted the survey for Pew Research Center via online (n=8,720) and live telephone (n=217) interviewing. Interviews were conducted in both English and Spanish.

To learn more about the ATP, read “About the American Trends Panel.”

Panel recruitment

Since 2018, the ATP has used address-based sampling (ABS) for recruitment. A study cover letter and a pre-incentive are mailed to a stratified, random sample of households selected from the U.S. Postal Service’s Computerized Delivery Sequence File. This Postal Service file has been estimated to cover 90% to 98% of the population.9 Within each sampled household, the adult with the next birthday is selected to participate. Other details of the ABS recruitment protocol have changed over time but are available upon request.10 Prior to 2018, the ATP was recruited using landline and cellphone random-digit-dial surveys administered in English and Spanish.

A national sample of U.S. adults has been recruited to the ATP approximately once per year since 2014. In some years, the recruitment has included additional efforts (known as an “oversample”) to improve the accuracy of data for underrepresented groups. For example, Hispanic adults, Black adults and Asian adults were oversampled in 2019, 2022 and 2023, respectively.

Sample design

The overall target population for this survey was noninstitutionalized persons ages 18 and older living in the United States. All active ATP members who previously completed ATP Wave 162 were invited to participate in this wave. Respondent weights are adjusted to account for differential probabilities of selection as described in the Weighting section below.

Questionnaire development and testing

The questionnaire was developed by the Center in consultation with SSRS. The web program used for online respondents was rigorously tested on both PC and mobile devices by the SSRS project team and Center researchers. The SSRS project team also populated test data that was analyzed in SPSS to ensure the logic and randomizations were working as intended before launching the survey.

Incentives

All respondents were offered a post-paid incentive for their participation. Respondents could choose to receive the post-paid incentive in the form of a check or gift code to Amazon.com, Target.com, or Walmart.com. Incentive amounts ranged from $5 to $20 depending on whether the respondent belongs to a part of the population that is harder or easier to reach. Differential incentive amounts were designed to increase panel survey participation among groups that traditionally have low survey response propensities.

Data collection protocol

The data collection field period for this survey was May 5 to 11, 2025. Surveys were conducted via self-administered web survey or by live telephone interviewing. 

For panelists who take surveys online: Postcard notifications were mailed to a subset on May 5.11 Survey invitations were sent out in two separate launches: soft launch and full launch. Sixty panelists were included in the soft launch, which began with an initial invitation sent on May 5. All remaining English- and Spanish-speaking sampled online panelists were included in the full launch and were sent an invitation on May 6.

Table showing the invitation and reminder dates for web respondents for ATP Wave 170

Panelists participating online were sent an email invitation and up to two email reminders if they did not respond to the survey. ATP panelists who consented to SMS messages were sent an SMS invitation with a link to the survey and up to two SMS reminders.

For panelists who take surveys over the phone with a live interviewer: Prenotification postcards were mailed on May 2. The soft launch took place on May 5 and involved dialing until a total of five interviews had been completed. All remaining English- and Spanish-speaking sampled phone panelists’ numbers were dialed throughout the remaining field period. Panelists who take surveys via phone can receive up to six calls from trained SSRS interviewers.

Data quality checks

To ensure high-quality data, Center researchers performed data quality checks to identify respondents showing patterns of satisficing. This included checking whether respondents left questions blank at very high rates or always selected the first or last answer presented. As a result of these checks, two ATP respondents were removed from the survey dataset prior to weighting and analysis.
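Checks of this kind can be sketched as follows; the data layout and the 50% blank-rate cutoff here are hypothetical illustrations, not Pew's actual screening rules.

```python
# Illustrative satisficing checks: high item nonresponse and straightlining
# (always choosing the first or last option). Thresholds are hypothetical.

def is_satisficer(answers, blank_cutoff=0.5):
    """answers: list of (chosen_index, num_options) tuples, one per question,
    with chosen_index None when the question was left blank."""
    answered = [(i, k) for i, k in answers if i is not None]
    blank_rate = 1 - len(answered) / len(answers) if answers else 0.0
    always_first = bool(answered) and all(i == 0 for i, _ in answered)
    always_last = bool(answered) and all(i == k - 1 for i, k in answered)
    return blank_rate > blank_cutoff or always_first or always_last
```

A respondent who picks the first option on every item, or skips more than half the questionnaire, would be flagged for review under these illustrative rules.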

Weighting

The ATP data is weighted in a process that accounts for multiple stages of sampling and nonresponse that occur at different points in the panel survey process. First, each panelist begins with a base weight that reflects their probability of recruitment into the panel. These weights are then calibrated to align with the population benchmarks in the accompanying table to correct for nonresponse to recruitment surveys and panel attrition. If only a subsample of panelists was invited to participate in the wave, this weight is adjusted to account for any differential probabilities of selection.

Table showing the American Trends Panel weighting dimensions

Among the panelists who completed the survey, this weight is then calibrated again to align with the population benchmarks identified in the accompanying table and trimmed at the 1st and 99th percentiles to reduce the loss in precision stemming from variance in the weights. Sampling errors and tests of statistical significance take into account the effect of weighting.
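Calibration to marginal population benchmarks is commonly done by raking (iterative proportional fitting). The sketch below is a simplified illustration with toy data and hypothetical benchmarks, not the Center's production weighting code.

```python
import numpy as np

def rake(base_weights, dims, targets, iters=50):
    """Adjust weights so the weighted share of each category in each
    dimension matches its population benchmark (simplified raking sketch).

    dims:    dict name -> array of category labels, one per respondent
    targets: dict name -> dict category -> target population share
    """
    w = base_weights.astype(float).copy()
    for _ in range(iters):
        for name, codes in dims.items():
            total = w.sum()
            for cat, share in targets[name].items():
                mask = codes == cat
                current = w[mask].sum() / total
                if current > 0:
                    w[mask] *= share / current
    return w / w.sum()

# Toy example: two weighting dimensions with hypothetical benchmarks
sex = np.array(["M", "M", "F", "F"])
age = np.array(["18-49", "50+", "18-49", "50+"])
w = rake(np.ones(4),
         {"sex": sex, "age": age},
         {"sex": {"M": 0.48, "F": 0.52}, "age": {"18-49": 0.6, "50+": 0.4}})

# Trimming at the 1st and 99th percentiles, as described above, limits the
# precision loss caused by a few extreme weights
w_trimmed = np.clip(w, np.percentile(w, 1), np.percentile(w, 99))
```

After raking, the weighted share of each category matches its benchmark, at the cost of increased variance in the weights, which the trimming step then limits.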

Confirming childhood and current religious identity

All of the respondents in Wave 170 also participated in Wave 162, which we conducted in February 2025. Wave 162 asked respondents about their current religion (“What is your present religion, if any?”). Wave 162 also asked respondents about their childhood religion (“Thinking about when you were a child, in what religion were you raised, if any?”). Each question had the following response options:

  • Protestant (for example, Baptist, Methodist, non-denominational, Lutheran, Presbyterian, Pentecostal, Episcopalian, Church of Christ, Congregational/United Church of Christ, Holiness, Reformed, Church of God, etc.)
  • Roman Catholic
  • Mormon (Church of Jesus Christ of Latter-day Saints or LDS)
  • Orthodox (such as Greek, Russian, or some other Orthodox church)
  • Jewish
  • Muslim
  • Buddhist
  • Hindu
  • Atheist
  • Agnostic
  • Something else (please specify)
  • Nothing in particular

In Wave 170, most respondents were asked to confirm both their current religious identity and their childhood religious identity from Wave 162.12 For instance, people who said in Wave 162 that they are Catholic were asked in Wave 170, “In a previous survey we asked what your current religion is, and you indicated you are Catholic. Do we have that right?” And people who said in Wave 162 that they were raised Catholic were asked in Wave 170, “And you also indicated that as a child, you were raised Catholic. Do we have that right?”

Asking respondents to confirm their religion was necessary because Wave 170 included many questions about religious switching and other topics whose wording was tailored to the respondent’s religious background. For instance, people who were raised Catholic but now identify as atheist were asked why they are “atheist” today. Conversely, people who were raised atheist but are now Catholic were asked why they are “Catholic” today.

Overall, 92% of respondents who were asked to confirm their religious identity did so, while 8% did not.

Table showing the sample sizes and margins of error in ATP Wave 170

Throughout this report, analysis of the Wave 170 data that looks at people within religious categories (e.g., analysis of Protestants, Catholics or religiously unaffiliated people) is based on respondents who confirmed both their current religion and their childhood religion from Wave 162. Similarly, all analyses that look at religious trajectories (i.e., people who have switched religions or people who still identify with their childhood religion) are based on respondents who confirmed both their current religion and their childhood religion from Wave 162.13

The table titled “Sample sizes and margins of error, ATP Wave 170” shows the unweighted sample sizes and the error attributable to sampling that would be expected at the 95% level of confidence for different groups in the survey.

Sample sizes and sampling errors for other subgroups are available upon request. In addition to sampling error, one should bear in mind that question wording and practical difficulties in conducting surveys can introduce error or bias into the findings of opinion polls.

Dispositions and response rates

Table showing the final dispositions in ATP Wave 170
Table showing the cumulative response rate in ATP Wave 170
RECOMMENDED CITATION:

Alper, Becka A., Patricia Tevington, Asta Kallo and Jeff Diamant. 2025. “Why Do Some Americans Leave Their Religion While Others Stay?” Pew Research Center. doi: 10.58094/52kn-8828.

  9. AAPOR Task Force on Address-based Sampling. 2016. “AAPOR Report: Address-based Sampling.”
  10. Email pewsurveys@pewresearch.org.
  11. The ATP does not use routers or chains in any part of its online data collection protocol, nor are they used to direct respondents to additional surveys. Postcard notifications for web panelists are sent to 1) panelists who were recruited within the last two years and 2) panelists recruited prior to the last two years who opt to continue receiving postcard notifications.
  12. Respondents who indicated their current or childhood religion was “Something else” or who didn’t answer one or both questions were not asked to confirm their religion.
  13. We checked to see what difference it would make if our analysis was based on all respondents, and not just on those who confirmed their religion. The results reported here would not change meaningfully if they were based on all respondents rather than restricted to those who confirmed their religion.