The data in this report are based on a nationally representative survey of 1,085 American adults, aged 18 and older. Results are reported for the subset of 938 registered voters who participated in the survey. The survey was conducted December 2–12, 2022. All questionnaires were self-administered by respondents in a web-based environment. The median completion time for the survey was 19 minutes.
The sample was drawn from the Ipsos KnowledgePanel®, an online panel of members drawn using probability sampling methods. Prospective members are recruited using a combination of random digit dial and address-based sampling techniques that cover virtually all (non-institutional) residential phone numbers and addresses in the United States. Those contacted who choose to join the panel but do not have Internet access are loaned computers and given Internet access so they may participate.
The sample therefore includes a representative cross-section of American adults—irrespective of whether they have Internet access, use only a cell phone, etc. Key demographic variables were weighted, post-survey, to match U.S. Census Bureau norms.
From November 2008 to December 2018, no KnowledgePanel® member participated in more than one Climate Change in the American Mind (CCAM) survey. Beginning with the April 2019 survey, panel members who have participated in CCAM surveys in the past, excluding the most recent two surveys, may be randomly selected for participation. In the current survey, 308 respondents, 268 of whom are registered voters included in this report, participated in a previous CCAM survey.
The survey instrument was designed by Anthony Leiserowitz, Seth Rosenthal, Jennifer Carman, Marija Verner, Sanguk Lee, Matthew Goldberg, and Jennifer Marlon of Yale University, and Edward Maibach, John Kotcher, and Teresa Myers of George Mason University. The categories for the content analysis of the open-ended responses about the Inflation Reduction Act (IRA) were developed by John Kotcher of George Mason University, and open-ended responses were coded by Patrick Ansah and Nicholas Badullovich of George Mason University. The figures and tables were designed by Sanguk Lee, Marija Verner, and Liz Neyens of Yale University.
Margins of error
All samples are subject to some degree of sampling error—that is, statistical results obtained from a sample can be expected to differ somewhat from results that would be obtained if every member of the target population were interviewed. Average margins of error are calculated at the 95% confidence level.
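Such margins depend primarily on sample size. As a rough illustration, the sketch below computes a margin of error for a proportion using the standard simple-random-sampling approximation at the 95% confidence level; the design_effect parameter is a hypothetical placeholder for any design-specific adjustment, and the published figures may be calculated differently.

```python
import math

def margin_of_error(n, p=0.5, z=1.96, design_effect=1.0):
    """Approximate margin of error for a proportion p from a sample of size n.

    Standard simple-random-sampling formula at the 95% confidence level (z = 1.96),
    optionally inflated by a design effect; a sketch, not the survey's exact method.
    """
    return z * math.sqrt(design_effect * p * (1 - p) / n)

# Illustrative calculations for the stated sample sizes (worst case, p = 0.5)
print(round(100 * margin_of_error(1085), 1))  # full sample: ~3.0 percentage points
print(round(100 * margin_of_error(938), 1))   # registered voters: ~3.2 percentage points
```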
Rounding error and tabulation
In data tables, bases specified are unweighted, while percentages are weighted to match national population parameters.
For tabulation purposes, percentages are rounded to the nearest whole number. As a result, percentages in a given chart may total slightly more or less than 100%. Summed response categories (e.g., “strongly support” + “somewhat support”) are rounded after the sums are calculated. For example, the sum of two categories displayed as 25% and 25% might be reported as 51% (e.g., 25.3% + 25.3% = 50.6%, which rounds to 51%, while each individual category rounds to 25%).
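A minimal sketch of this rounding convention, using the illustrative values above:

```python
def display(pct):
    """Round a weighted percentage to the nearest whole number for reporting."""
    return round(pct)

strongly_support = 25.3  # illustrative weighted percentages
somewhat_support = 25.3

# Individual categories are rounded for display...
print(display(strongly_support), display(somewhat_support))  # 25 25
# ...but the summed category is rounded only after summing, so the
# displayed sum can differ from the sum of the displayed values.
print(display(strongly_support + somewhat_support))  # 51
```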
Instructions for coding Section 4.2: Open-ended responses about the Inflation Reduction Act (IRA)
A doctoral student and a postdoctoral fellow coded the open-ended responses using instructions and categories developed by one of the Principal Investigators. Percent agreement ranged from 93% to 99% across the coded categories. Differences between the two coders were resolved via discussion between them and the Principal Investigator. The “Haven’t heard of IRA” classification was determined by a “nothing at all” response to the preceding question, “How much, if anything, have you heard about the Inflation Reduction Act of 2022 (also known as the IRA), a bill that was passed by the U.S. Congress and signed by President Biden?” Participants who gave that response were not shown this open-ended question. Definitions of the other categories used by the coders are listed below.
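For reference, percent agreement for a given category is simply the share of responses to which both coders assigned the same code; a minimal sketch with hypothetical codes:

```python
def percent_agreement(codes_a, codes_b):
    """Share of responses (as a percentage) on which two coders agree."""
    assert len(codes_a) == len(codes_b)
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return 100 * matches / len(codes_a)

# Hypothetical 0/1 codes (category absent/present) from two coders
coder_1 = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]
coder_2 = [1, 0, 0, 1, 0, 0, 0, 0, 1, 0]
print(percent_agreement(coder_1, coder_2))  # 90.0
```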
For the following variables, we code each survey response for the presence or absence (0 = absent; 1 = present) of the categories listed below. The order in which the categories are mentioned in a survey response does not matter for the purposes of coding; only the presence or absence of each category is recorded.
A survey response can be coded positive for multiple content variables. For example, the response, “Green energy scam” would be coded positive for both climate/clean energy (for the reference to green energy) and skepticism (for referring to it as a scam). Definitions for each content variable are provided below.
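To make the scheme concrete, the sketch below shows how such a response might be represented as a set of 0/1 flags; the category names here are illustrative stand-ins, not the exact variable names used in the dataset.

```python
# Hypothetical category names; each response receives a 0/1 flag per category.
CATEGORIES = ["climate_clean_energy", "inflation_economy", "skepticism", "dont_know"]

def code_response(categories_present):
    """Return a 0/1 flag for every category, given the set judged present."""
    return {category: int(category in categories_present) for category in CATEGORIES}

# "Green energy scam" references clean energy and expresses skepticism,
# so it is coded positive for both of those categories.
print(code_response({"climate_clean_energy", "skepticism"}))
# {'climate_clean_energy': 1, 'inflation_economy': 0, 'skepticism': 1, 'dont_know': 0}
```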