How Are the Statistics of Political Polls Interpreted?


At any given time during a political campaign, the media may want to know what the public at large thinks about policies or candidates. One solution would be to ask every voter whom they would vote for, but this would be costly, time-consuming, and infeasible. Another way to determine voter preference is to use a statistical sample.

Rather than asking every voter to state a preference, polling organizations ask a relatively small number of people which candidate they favor. The members of this statistical sample are used to estimate the preferences of the entire population. There are good polls and not-so-good polls, so it is important to ask the following questions when reading any results.

Who Was Polled?

A candidate makes their appeal to the voters because the voters are the ones who cast ballots. Consider the following groups of people:

  • Registered voters
  • Likely voters

To discern the mood of the public, either of these groups may be sampled. However, if the intent of the poll is to predict the winner of an election, the sample should be composed of registered voters or likely voters.

The political composition of the sample also plays a role in interpreting poll results. A sample consisting entirely of registered Republicans would not be appropriate for a question about the electorate at large. And because the electorate rarely splits into exactly 50% registered Republicans and 50% registered Democrats, even a sample divided evenly between the two parties may not be the best to use.

When Was the Poll Conducted?

Politics can be fast-paced. Within a matter of days, an issue arises, alters the political landscape, and then is forgotten by most when some new issue surfaces. What people were talking about on Monday sometimes seems to be a distant memory when Friday comes. News runs faster than ever, but good polling takes time. Major events can take several days to show up in poll results. The dates when a poll was conducted should be noted to determine if current events have had time to affect the numbers in the results.

What Methods Were Used?

Suppose that Congress is considering a bill that deals with gun control. Read the following two scenarios and ask which is more likely to accurately determine the public sentiment.

  • A blog asks its readers to click on a box to show their support of the bill. A total of 5,000 people participate and there is overwhelming rejection of the bill.
  • A polling firm randomly calls 1,000 registered voters and asks them about their support of the bill. The firm finds that its respondents are more or less evenly split for and against the bill.

Although the first poll has far more respondents, they are self-selected: the people most likely to participate are those with strong opinions, and the blog's readers may be very like-minded (perhaps it is a blog about hunting). The second sample is random and was selected by an independent party. Despite its smaller size, the second poll would give a better picture of public sentiment.
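To make the contrast concrete, here is a minimal simulation sketch in Python with invented numbers: it assumes the public is actually split 50/50 on the bill and that opponents are five times as likely as supporters to click the blog's button. The random poll of 1,000 lands near the true split, while the much larger self-selected poll does not.

```python
import random

random.seed(0)

TRUE_SUPPORT = 0.50  # assumed true share of voters who support the bill

def random_poll(n):
    """Every voter is equally likely to be sampled."""
    return sum(random.random() < TRUE_SUPPORT for _ in range(n)) / n

def self_selected_poll(n, opponent_click_ratio=5):
    """Opponents are (by assumption) five times as likely to click the
    blog's button, so the respondent pool over-represents them."""
    share_supporters = TRUE_SUPPORT / (TRUE_SUPPORT + (1 - TRUE_SUPPORT) * opponent_click_ratio)
    return sum(random.random() < share_supporters for _ in range(n)) / n

print(f"True support:            {TRUE_SUPPORT:.0%}")
print(f"Random poll (n=1,000):   {random_poll(1_000):.1%}")
print(f"Self-selected (n=5,000): {self_selected_poll(5_000):.1%}")
```

With these assumed numbers, the random poll reports roughly 50% support while the self-selected poll reports only about 17%, which is why sample size alone does not make a poll trustworthy.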

How Large Is the Sample?

As the discussion above shows, a poll with a larger sample size is not necessarily the better poll. On the other hand, a sample size may be too small to state anything meaningful about public opinion. A random sample of 20 likely voters is too small to determine the direction that the entire U.S. population is leaning on an issue. But how large should the sample be?

Associated with the size of the sample is the margin of error. The larger the sample size, the smaller the margin of error. Surprisingly, sample sizes as small as 1,500 are typically used for polls such as presidential approval, and the resulting margin of error is within a couple of percentage points. The margin of error could be made as small as desired by using a larger sample, but a larger sample makes the poll more costly to conduct.
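As a rough check on that claim, the sketch below (Python, assuming simple random sampling and the conservative case of a 50/50 split) computes the 95% margin of error for several sample sizes; around 1,500 respondents it comes out to roughly plus or minus 2.5 percentage points.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random sample of size n.
    p = 0.5 is the conservative (worst-case) proportion."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (100, 500, 1000, 1500, 2500):
    print(f"n = {n:>4}: +/- {margin_of_error(n):.1%}")
# n =  100: +/- 9.8%
# n =  500: +/- 4.4%
# n = 1000: +/- 3.1%
# n = 1500: +/- 2.5%
# n = 2500: +/- 2.0%
```

The table also shows why pollsters rarely go far beyond a few thousand respondents: quadrupling the sample only halves the margin of error.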

Bringing It All Together

The answers to the above questions should help in assessing the accuracy of results in political polls. Not all polls are created equal, and details are often buried in footnotes or omitted entirely from news articles that quote the poll. That's why it's important to be informed about how a poll was designed.

“Our Survey Methodology in Detail.” Pew Research Center.


Understanding and Evaluating Survey Research

A variety of methodologic approaches exist for individuals interested in conducting research. Selection of a research approach depends on a number of factors, including the purpose of the research, the type of research questions to be answered, and the availability of resources. The purpose of this article is to describe survey research as one approach to the conduct of research so that the reader can critically evaluate the appropriateness of the conclusions from studies employing survey research.

SURVEY RESEARCH

Survey research is defined as "the collection of information from a sample of individuals through their responses to questions" ( Check & Schutt, 2012, p. 160 ). This type of research allows for a variety of methods to recruit participants, collect data, and utilize various methods of instrumentation. Survey research can use quantitative research strategies (e.g., using questionnaires with numerically rated items), qualitative research strategies (e.g., using open-ended questions), or both strategies (i.e., mixed methods). Because it is often used to describe and explore human behavior, survey research is frequently employed in social and psychological research ( Singleton & Straits, 2009 ).

Information has been obtained from individuals and groups through the use of survey research for decades. It can range from asking a few targeted questions of individuals on a street corner to obtain information related to behaviors and preferences, to a more rigorous study using multiple valid and reliable instruments. Common examples of less rigorous surveys include marketing or political surveys of consumer patterns and public opinion polls.

Survey research has historically included large population-based data collection. The primary purpose of this type of survey research was to obtain information describing characteristics of a large sample of individuals of interest relatively quickly. Large census surveys obtaining information reflecting demographic and personal characteristics and consumer feedback surveys are prime examples. These surveys were often provided through the mail and were intended to describe demographic characteristics of individuals or obtain opinions on which to base programs or products for a population or group.

More recently, survey research has developed into a rigorous approach to research, with scientifically tested strategies detailing who to include (representative sample), what and how to distribute (survey method), and when to initiate the survey and follow up with nonresponders (reducing nonresponse error), in order to ensure a high-quality research process and outcome. Currently, the term "survey" can reflect a range of research aims, sampling and recruitment strategies, data collection instruments, and methods of survey administration.

Given this range of options in the conduct of survey research, it is imperative for the consumer/reader of survey research to understand the potential for bias in survey research as well as the tested techniques for reducing bias, in order to draw appropriate conclusions about the information reported in this manner. Common types of error in research, along with the sources of error and strategies for reducing error as described throughout this article, are summarized in the Table .

Table. Sources of Error in Survey Research and Strategies to Reduce Error

The goal of sampling strategies in survey research is to obtain a sufficient sample that is representative of the population of interest. It is often not feasible to collect data from an entire population of interest (e.g., all individuals with lung cancer); therefore, a subset of the population or sample is used to estimate the population responses (e.g., individuals with lung cancer currently receiving treatment). A large random sample increases the likelihood that the responses from the sample will accurately reflect the entire population. In order to accurately draw conclusions about the population, the sample must include individuals with characteristics similar to the population.
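As an illustration of why a large random sample tends to mirror the population, the short Python sketch below builds a hypothetical population with known characteristics (the 60% and 35% figures are invented) and shows how simple random samples of increasing size converge on those population values.

```python
import random

random.seed(1)

# Hypothetical population of 100,000 people; 60% women, 35% rural
# (invented figures, for illustration only).
population = [{"woman": random.random() < 0.60,
               "rural": random.random() < 0.35} for _ in range(100_000)]

def describe(group, label):
    n = len(group)
    pct_women = sum(p["woman"] for p in group) / n
    pct_rural = sum(p["rural"] for p in group) / n
    print(f"{label:>12}: n={n:>6}  women={pct_women:.1%}  rural={pct_rural:.1%}")

describe(population, "population")
for n in (20, 200, 2_000, 20_000):
    describe(random.sample(population, n), f"sample {n}")
```

Small samples bounce around the true percentages, while larger random samples land very close to them, which is the intuition behind the sampling discussion that follows.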

It is therefore necessary to correctly identify the population of interest (e.g., individuals with lung cancer currently receiving treatment vs. all individuals with lung cancer). The sample will ideally include individuals who reflect the intended population in terms of all characteristics of the population (e.g., sex, socioeconomic characteristics, symptom experience) and contain a similar distribution of individuals with those characteristics. As discussed by Mady Stovall beginning on page 162, Fujimori et al. ( 2014 ), for example, were interested in the population of oncologists. The authors obtained a sample of oncologists from two hospitals in Japan. These participants may or may not have similar characteristics to all oncologists in Japan.

Participant recruitment strategies can affect the adequacy and representativeness of the sample obtained. Using diverse recruitment strategies can help improve the size of the sample and help ensure adequate coverage of the intended population. For example, if a survey researcher intends to obtain a sample of individuals with breast cancer representative of all individuals with breast cancer in the United States, the researcher would want to use recruitment strategies that would recruit both women and men, individuals from rural and urban settings, individuals receiving and not receiving active treatment, and so on. Because of the difficulty in obtaining samples representative of a large population, researchers may focus the population of interest to a subset of individuals (e.g., women with stage III or IV breast cancer). Large census surveys require extremely large samples to adequately represent the characteristics of the population because they are intended to represent the entire population.

DATA COLLECTION METHODS

Survey research may use a variety of data collection methods with the most common being questionnaires and interviews. Questionnaires may be self-administered or administered by a professional, may be administered individually or in a group, and typically include a series of items reflecting the research aims. Questionnaires may include demographic questions in addition to valid and reliable research instruments ( Costanzo, Stawski, Ryff, Coe, & Almeida, 2012 ; DuBenske et al., 2014 ; Ponto, Ellington, Mellon, & Beck, 2010 ). It is helpful to the reader when authors describe the contents of the survey questionnaire so that the reader can interpret and evaluate the potential for errors of validity (e.g., items or instruments that do not measure what they are intended to measure) and reliability (e.g., items or instruments that do not measure a construct consistently). Helpful examples of articles that describe the survey instruments exist in the literature ( Buerhaus et al., 2012 ).

Questionnaires may be in paper form and mailed to participants, delivered in an electronic format via email or an Internet-based program such as SurveyMonkey, or a combination of both, giving the participant the option to choose which method is preferred ( Ponto et al., 2010 ). Using a combination of methods of survey administration can help to ensure better sample coverage (i.e., all individuals in the population having a chance of inclusion in the sample) therefore reducing coverage error ( Dillman, Smyth, & Christian, 2014 ; Singleton & Straits, 2009 ). For example, if a researcher were to only use an Internet-delivered questionnaire, individuals without access to a computer would be excluded from participation. Self-administered mailed, group, or Internet-based questionnaires are relatively low cost and practical for a large sample ( Check & Schutt, 2012 ).

Dillman et al. ( 2014 ) have described and tested a tailored design method for survey research. Improving the visual appeal and graphics of surveys by using a font size appropriate for the respondents, ordering items logically without creating unintended response bias, and arranging items clearly on each page can increase the response rate to electronic questionnaires. Attending to these and other issues in electronic questionnaires can help reduce measurement error (i.e., lack of validity or reliability) and help ensure a better response rate.

Conducting interviews is another approach to data collection used in survey research. Interviews may be conducted by phone, computer, or in person and have the benefit of visually identifying the nonverbal response(s) of the interviewee and subsequently being able to clarify the intended question. An interviewer can use probing comments to obtain more information about a question or topic and can request clarification of an unclear response ( Singleton & Straits, 2009 ). Interviews can be costly and time intensive, and therefore are relatively impractical for large samples.

Some authors advocate for using mixed methods for survey research when no one method is adequate to address the planned research aims, to reduce the potential for measurement and non-response error, and to better tailor the study methods to the intended sample ( Dillman et al., 2014 ; Singleton & Straits, 2009 ). For example, a mixed methods survey research approach may begin with distributing a questionnaire and following up with telephone interviews to clarify unclear survey responses ( Singleton & Straits, 2009 ). Mixed methods might also be used when visual or auditory deficits preclude an individual from completing a questionnaire or participating in an interview.

FUJIMORI ET AL.: SURVEY RESEARCH

Fujimori et al. ( 2014 ) described the use of survey research in a study of the effect of communication skills training for oncologists on oncologist and patient outcomes (e.g., oncologist’s performance and confidence and patient’s distress, satisfaction, and trust). A sample of 30 oncologists from two hospitals was obtained. Although the authors provided a power analysis concluding that this number of oncologist participants was adequate to detect differences between baseline and follow-up scores, the conclusions of the study may not be generalizable to a broader population of oncologists. Oncologists were randomized to either an intervention group (i.e., communication skills training) or a control group (i.e., no training).

Fujimori et al. ( 2014 ) chose a quantitative approach to collect data from oncologist and patient participants regarding the study outcome variables. Self-report numeric ratings were used to measure oncologist confidence and patient distress, satisfaction, and trust. Oncologist confidence was measured using two instruments, each using a 10-point Likert rating scale. The Hospital Anxiety and Depression Scale (HADS) was used to measure patient distress and has demonstrated validity and reliability in a number of populations, including individuals with cancer ( Bjelland, Dahl, Haug, & Neckelmann, 2002 ). Patient satisfaction and trust were measured using 0 to 10 numeric rating scales. Numeric observer ratings were used to measure oncologist performance of communication skills based on a videotaped interaction with a standardized patient. Participants completed the same questionnaires at baseline and follow-up.

The authors clearly describe what data were collected from all participants. Providing additional information about the manner in which questionnaires were distributed (i.e., electronic, mail), the setting in which data were collected (e.g., home, clinic), and the design of the survey instruments (e.g., visual appeal, format, content, arrangement of items) would assist the reader in drawing conclusions about the potential for measurement and nonresponse error. The authors describe conducting a follow-up phone call or mail inquiry for nonresponders; using the Dillman et al. ( 2014 ) tailored design for survey research follow-up may have reduced nonresponse error.

CONCLUSIONS

Survey research is a useful and legitimate approach to research that has clear benefits in helping to describe and explore variables and constructs of interest. Survey research, like all research, has the potential for a variety of sources of error, but several strategies exist to reduce the potential for error. Advanced practitioners aware of the potential sources of error and strategies to improve survey research can better determine how and whether the conclusions from a survey research study apply to practice.

The author has no potential conflicts of interest to disclose.

Polling Fundamentals

The Dissection of a Telephone Number

All four components are assigned by the telephone company. The first three components are based on location, and the final component is randomly generated.
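A rough sketch of how that plays out in random-digit dialing: the location-based components (here assumed to be the area code, exchange, and block) are held fixed, and only the final digits are drawn at random. The prefix below is hypothetical (555 exchanges are reserved for fictional use).

```python
import random

random.seed(2)

def rdd_numbers(area_code, exchange, block, k=5):
    """Generate phone numbers that share a location-based prefix
    (area code, exchange, block) and end in randomly drawn digits."""
    return [f"({area_code}) {exchange}-{block}{random.randint(0, 99):02d}"
            for _ in range(k)]

for number in rdd_numbers("607", "555", "01"):
    print(number)
# prints five numbers of the form (607) 555-01XX
```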

The interviewer will then randomly select a person in the household to be interviewed. One common method is to ask for the adult in the household who had the most recent birthday. This is done because it is an easy way to obtain a random respondent from the household, rather than the first person to answer the phone. In addition, certain parts of the population, such as young males, are more difficult to get on the phone than others, such as the elderly. Because of this, interviewers often ask to speak with the youngest male in a household first.

Because nearly a quarter of the US population (as of 2014) has a cell phone but no landline telephone, true scientific samples should include a subsample of cell phone users. Cell phone sampling comes with its own unique challenges, such as higher cost and lower response rates. FCC regulations require that cell phone lines be dialed by hand, rather than computer, increasing time and manpower requirements. Geographic information is also different for cell phones, with the area code offering the only geographic information for the cell phone user, and the exchange and block numbers offering information on the service provider.  In addition, the portability of cell phones means that users can keep their numbers if they move. Despite these added complications, however, cell phone sampling methods are similar to those used for landline telephones.

Sampling Error is the calculated statistical imprecision due to interviewing a random sample instead of the entire population. The margin of error provides an estimate of how much the results of the sample may differ due to chance when compared to what would have been found if the entire population was interviewed.

An annotated example:

There are close to 200 million adult U.S. residents. For comparison, let’s say you have a giant jar of 200 million jelly beans. The president has commissioned you to find out how many jelly beans are red, how many are purple, and how many are some other color. Since you have limited funds and time, you opt against counting and sorting all 200 million jelly beans. Instead you randomly select 500 jelly beans of which 30% are red, 10% are purple and 60% are some other color.

Looking at the table below, you find that with a sample of 500 jelly beans you can report that 30 percent of the jelly beans in the jar are red, +/- 4%. To further elaborate, you can say, with 95% confidence, that red jelly beans make up 30% (+/- 4%, or the range of 26-34%) of the beans in the jar. Likewise, you can report that purple jelly beans make up 10% (+/- 3%, or the range of 7-13%) of the beans in the jar.

Table: Recommended allowance for sampling error of a percentage, in percentage points, at the 95 in 100 confidence level, by sample size.
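The jelly bean figures above can be reproduced with the standard formula for a proportion's 95% confidence interval; a minimal Python check (using z = 1.96) follows. The +/- 3% quoted for purple is the computed 2.6% rounded to the nearest point.

```python
import math

def moe_95(p_hat, n, z=1.96):
    """95% margin of error for an observed sample proportion p_hat."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

n = 500
for color, p_hat in [("red", 0.30), ("purple", 0.10)]:
    moe = moe_95(p_hat, n)
    print(f"{color:>6}: {p_hat:.0%} +/- {moe:.1%} "
          f"(95% CI {p_hat - moe:.0%} to {p_hat + moe:.0%})")
#    red: 30% +/- 4.0% (95% CI 26% to 34%)
# purple: 10% +/- 2.6% (95% CI 7% to 13%)
```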

Survey firms apply a technique called weighting to adjust the poll results to account for possible sample biases caused by specific groups of individuals not responding. The weighting uses known estimates of the total population provided by the Census to adjust the final results.

It’s not uncommon to weight data by age, gender, education, race, etc. in order to achieve the correct demographic proportions.
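A minimal post-stratification sketch, with invented numbers, of how such weights work: each respondent's weight is the population share of their demographic group divided by that group's share of the sample, and the weighted result is then a weighted average of the responses.

```python
# Minimal post-stratification sketch (invented numbers, for illustration only).
respondents = (
    [{"age": "18-34", "approve": 1}] * 10 + [{"age": "18-34", "approve": 0}] * 10 +
    [{"age": "35+",   "approve": 1}] * 30 + [{"age": "35+",   "approve": 0}] * 50
)

population_share = {"18-34": 0.30, "35+": 0.70}      # e.g., from Census estimates
sample_share = {g: sum(r["age"] == g for r in respondents) / len(respondents)
                for g in population_share}
weights = {g: population_share[g] / sample_share[g] for g in population_share}

unweighted = sum(r["approve"] for r in respondents) / len(respondents)
weighted = (sum(weights[r["age"]] * r["approve"] for r in respondents)
            / sum(weights[r["age"]] for r in respondents))

print(f"Unweighted approval: {unweighted:.1%}")  # 40.0%: young adults under-represented
print(f"Weighted approval:   {weighted:.1%}")    # 41.3%: after weighting up the 18-34 group
```

Here the sample is 20% young adults where the assumed population is 30%, so young respondents receive a weight of 1.5 and older respondents a weight of 0.875, nudging the approval estimate upward.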

Data Collection Mode is the phrase used to describe the method by which the selected participants complete the survey. Survey modes include telephone, mail, in-person, or online.

Sampling procedure refers to the process by which researchers choose the respondents for a poll. The methods used can be probability or non-probability-based. A probability-based poll utilizes a randomized selection process where every person in the target population theoretically has an equal chance to be selected as a respondent. In a nonprobability poll, not every member of the target population could be selected to participate, which can introduce bias.

In the earliest days of polling, most polls were conducted using nonprobability quota methods. After roughly 1950, most U.S. polling organizations shifted to probability methods.  

Probability-based sampling can be used with any of these modes of data collection. With telephone polling, Random Digit Dialing (RDD) of active U.S. phone exchanges provides nearly comprehensive coverage of the national population and therefore provides a practical way to build a probability-based sample by randomly selecting telephone numbers. Address-Based Sampling (ABS) relies on random selection from the U.S. Postal Service (USPS) Computerized Delivery Sequence File, which also provides nearly complete coverage of residential mailing addresses in the U.S.

But using a probability-based sample to conduct an online survey presents a challenge. There is no comprehensive database of email addresses like there is of mailing addresses or active phone exchanges from which to make a random selection. Email addresses are not tied to residency, and not all Americans have an email address, while many have more than one, making it impossible to create a probability-based sampling frame to represent the total population.

With response rates for telephone polls decreasing dramatically as costs skyrocket, many organizations have decided that polling needs to move online, where contacting people is cheaper. The easiest and cheapest way to conduct online polling is to use a nonprobability-based online panel. Polls using this sampling approach are numerous and popular but risk bias as a result of surveying only that subset of the population who are online and oversampling those who are heavier internet users. These relatively inexpensive polls are based on large panels of respondents who agree to answer surveys, usually in return for small rewards like points that can be exchanged for gift certificates. Panelists can be recruited through email lists, online ads, or other methods. Samples for specific polls are often built using quotas for different demographic groups, and weighting is used to try to make samples representative of the target population.  

Researchers who want to retain the advantages of probability-based sampling have a few online options. Online probability panels – which are newer, less common, and more expensive than nonprobability online polls – use traditional probability-based samples, such as ABS, to make the first contact with a respondent or household. People who do not have web access are provided with it. A large number of respondents are selected to be part of the panel, and random selections are then made within the panel, or within subpopulations of the panel, to invite members to answer particular surveys.

Several organizations with studies in iPoll conduct polling via online probability panels, including KnowledgePanel (formerly KnowledgeNetworks), NORC AmeriSpeak, the Pew American Trends Panel, the RAND American Life Panel, and the SSRS Probability Panel.

Learn more:

https://aapor.org/publications-resources/education-resources/election-polling-resources/

Balancing samples

Once a sample has been selected and respondents contacted, pollsters implement methods intended to improve sample balance. Callbacks in telephone and in-person polls, or reminders or incentives to participate in online polls, can increase participation among those who are less likely to answer the phone or respond to emails. After the fieldwork ends, pollsters use weighting to bring the final sample in line with the national population in terms of sex, race, education, and other characteristics, using Census demographics or other benchmarks. These efforts improve polling accuracy, but nothing can ensure perfect representation.

Question wording

The most perfectly representative sample in the world can still misrepresent public opinion if the question wording is leading, unbalanced, or simply too confusing. The presence of other questions in the survey instrument and the order in which they are presented can also affect responses. For example, asking political questions before policy questions may prime the respondent to give answers more aligned with their party’s position.

Mode effects

The method used to contact the respondents – telephone calls, mail, web, text messages, in person interviewers, etc. – is called the survey mode. Different modes can lead to different results due to the sampling issues described above, but also because different types of interactions can have different effects on respondents. Questions about highly sensitive topics, like sexual or substance use experiences, might be answered more truthfully when the respondent feels more anonymous. But a well-trained interviewer may be effective at encouraging participation in a poll for a reluctant respondent, thereby reducing overall bias.

House effects

In some cases, different field organizations can get different results for the same survey questions using similar polling methods. This can be attributed to several factors, like choice of weighting demographics, number of callbacks or reminders used, visual design of an online polling instrument, or interviewer training. One example of interviewer training differences can be found in accepting “don’t know” as a volunteered response. Some organizations may ask their interviewers to probe for an answer before accepting “don’t know,” while others allow interviewers to accept a don’t know response immediately. The same question of the same population might therefore result in differing levels of “don’t know” response.

Some tables are straightforward. The Roper Center’s iPoll database offers the topline results to survey questions – toplines show how the full aggregated sample answered the questions.

iPoll example: You might say that the public is evenly split on judging the integrity of pollsters, according to this November 2002 telephone poll conducted by Harris Interactive and obtained from the Roper Center at Cornell University.

Cross-tabulation tables can be more complicated. Crosstabs offer a look at how different groups within the sample answered the question. In other words, the table below can be summarized in this manner:

A New York Times poll in June 2000 found that among whites, 81% thought race relations in their community were “good”, while 72% of black respondents found this to be the case. Conversely, 14% of whites and 22% of blacks identified their community race relations as “bad”. Among those who identified with the “other” race category, 79% responded good and 18% bad to the question of race relations in their community. There were too few Asians in the sample to be able to statistically rely upon the percentages. These data were provided by the Roper Center at Cornell University.

Source: New York Times Poll, Race Relations in America, June 2000. Data provided by the Roper Center at Cornell University.
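For readers who work with respondent-level data rather than published tables, the sketch below shows how a topline and a crosstab like the one summarized above are typically computed; the tiny dataset is invented and the example uses the pandas library.

```python
import pandas as pd

# Invented respondent-level data, for illustration only.
df = pd.DataFrame({
    "race":   ["White", "White", "White", "White", "White", "White",
               "Black", "Black", "Black", "Other", "Other", "Other"],
    "rating": ["Good", "Good", "Good", "Good", "Good", "Bad",
               "Good", "Good", "Bad", "Good", "Good", "Bad"],
})

# Topline: how the full sample answered the question.
print(df["rating"].value_counts(normalize=True).mul(100).round(1))

# Crosstab: how each group within the sample answered (row percentages).
print(pd.crosstab(df["race"], df["rating"], normalize="index").mul(100).round(1))
```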



How Polls Influence Behavior

New research says polling data helps voters get the information they need to make decisions.

October 30, 2012


A woman casts her ballot at a polling station in Sierra Leone. (Associated Press photo by Rebecca Blackwell)

With the presidential race in the home stretch, there’s little doubt that the unending barrage of polling data influences voters. But the conventional wisdom that they mindlessly follow the herd misses a critical point: Voters also look at polling results as a way to garner information they need to make up their minds.

A new working paper by Neil Malhotra of Stanford’s Graduate School of Business, and David Rothschild of Microsoft Research, shows that some voters do, in fact, switch sides in an effort to feel accepted and to be part of a winning team. But the paper also concludes that greater numbers of voters are searching for the “wisdom of crowds” when they evaluate poll results, and that the opinion of experts matters more to them than that of their peers.

The researchers asked a selected group of voters to state their opinions on a variety of real public policy questions, and then presented them with fabricated poll results on the same topics. When the test subjects learned that a large number of experts favored a position, opinions shifted by 11.3%. But the “opinions of people like me” changed opinions by just 6.2%, while a general poll saying that a majority of people favored one side or the other moved the needle by 8.1%.

Although the focus of the research was on how polls affect voting on policy questions, the results, says Malhotra, shed light on elections for office as well.

Voters, the researchers found, are looking for and responding to new information. As a result, polls that state what’s already well known — an extremely popular incumbent is likely to win, for example — are likely to have far less effect than those that reveal something unexpected.

“The main reason why people conform to majority opinion in the political domain is that they perceive there to be information about the quality of the policies in learning about mass support,” the paper concludes.

The researchers also investigated an intriguing possibility: Do voters switch sides to “resolve cognitive dissonance”; that is, to soothe themselves over the fact that a policy they don’t support is likely to win? But they found little evidence to support that theory.

The study offers some comfort to those who fear that polling is dangerous to the political process. “There are two ways to look at these results,” says Malhotra. “Majorities can cascade, which is not good if we want to preserve minority rights or worry about herding. But we see that people are also looking for information and try to learn from the wisdom of crowds.”

Their paper, “Why Are Polls Self-Fulfilling Prophecies? A Test of Three Mechanisms,” is available online.

Polling and Public Opinion Sources Online: Home


US Polls and Polling Data   ~ International Polls ~ Social Surveys

For print sources, see Polling and Public Opinion: A Research Guide , compiled by Michael Engle.

US Public Opinion and Polling Data Online

  • Pew Research Center "Pew Research Center is a nonpartisan fact tank that informs the public about the issues, attitudes and trends shaping the world. We conduct public opinion polling, demographic research, content analysis and other data-driven social science research." [ About Pew Research Center ]
  • Gallup.Com Gallup.com provides articles that summarize the results of recent, selected polling data. Includes some international polling. The full reports are available by "subscribing" (filling out a request form). See also: Gallup Analytics, below.
  • Gallup Analytics Allows users to access Gallup's U.S. Daily tracking, World Poll, and Gallup Social Series data to compare residents' responses region by region and nation by nation to questions on topics such as economic conditions, government and business, health and wellbeing, infrastructure, and education.
  • Roper Center @ Cornell Datasets from opinion polls conducted around the world, plus iPoll (below), an online database of public opinion, organized at the question level, from a wide variety of polling organizations.
  • Roper Center -- iPoll A full-text retrieval system, the iPOLL online database is organized at the question level, providing the tools to sift through nearly a half million questions asked on national public opinion surveys since 1935 and updated daily.
  • Exit Poll datasets Roper Center Polling Data > US Elections Exit polling data is collected by Edison Research, a market research and polling firm, and purchased by major news networks for their forecasts. The Roper Center’s collection holds all U.S. major media exit polls dating back to the first in 1972, spearheaded by innovator Warren Mitofsky, as well as thousands of datasets from major pre- and post-election polls in the U.S. See also: Exit Polls: Surveying the American Electorate, 1972-2010. Washington, DC: SAGE Publications Ltd. 2012 (Olin Reference, JK 2007.B47 2012+). Authors Samuel J. Best and Brian S. Krueger - both election commentators for CBS News and statistical experts - present more than 100 tables and 100 figures showing the changes in the electorate and its voting patterns over time.
  • Louis Harris Data Center (Dataverse) "Since 1965, the Odum Institute has operated the Louis Harris Data Center, the national depository for publicly available survey data collected by Louis Harris and Associates, Inc. More than 1,000 Harris polls from as early as 1958 are archived at the Center and contain over 160,000 questions asked of more than 1,200,000 respondents. Respondent groups range from national cross-section samples to such special populations as Vietnam veterans, Hispanics, teenagers, and the elderly. The surveys also cover a diverse range of topics, such as aging, environmental issues, leisure and the arts, business, foreign affairs, presidential ratings, health care, attitudes toward government, and crime. Many questions have been repeated over time, allowing researchers to track changes in opinions and attitudes."

International Polling Data

  • Pew Global Indicators database Note: The data is drawn from some of the data in the Pew Global Values Survey . Select the "Custom Analysis" tab. U.S. Image: Opinion of the United States, Confidence in the U.S. President , Drone Strikes, U.S. Personal Freedoms ~ World Economy: Satisfaction with National Conditions, Country's Economic Situation, Views on Trade, Future Economic Situation ~ China Image: Opinion of China, Views of Chinese President, China's Growing Economy, China Personal Freedoms ~ Balance of Power: World's Leading Economic Power, World's Leading Superpower ~ Rating Countries: Opinion of Iran, Opinion of Russia, Opinion of Brazil. Note: Not every country is surveyed every year.
  • Polling the Nations Silver Spring, MD: ORS. Polls taken on a variety of subjects all over the world from 1986 to the present. Each record in the database consists of one poll question and the participants' responses. Records are indexed in one of more than 5,000 topics and six search fields: subject matter, question text, universe, date, polling organization and response categories. Other information provided includes: source name and contact information, sample size, and notes on the sample population. Partially supersedes American Public Opinion Index and American Public Opinion Data [microfiche].
  • Roper Center -- Datasets Datasets from 100+ countries. Datasets can be downloaded in STATA, SPSS, CSV and ASCII formats.
  • World Public Opinion Poll Reports that summarize opinion data. In partnership with the International Institutions and Global Governance program at the Council on Foreign Relations, WorldOpinion.org consolidates global and U.S. public opinion covering ten major issue areas: elements of world order, international institutions, violent conflict, terrorism, nuclear proliferation, climate change, energy security, the global economy, economic development, and human rights.

Social Surveys

  • General Social Survey (United States) "The General Social Survey (GSS) is a nationally representative survey of adults in the United States conducted since 1972. The GSS collects data on contemporary American society in order to monitor and explain trends in opinions, attitudes and behaviors. The GSS has adapted questions from earlier surveys, thereby allowing researchers to conduct comparisons for up to 80 years." [ About the GSS ] Download the data to explore with statistical analysis software (SPSS, STATA, SAS), use the online data explorer, or view reports based on the survey in Key Trends.
  • Afrobarometer Note: Works best in Google Chrome. Offers reports summarizing the survey results and an online analysis tool. Select the round (currently 1-5), then select a variable; this opens the analysis tool. Choose a variable to cross-tabulate and select additional variables as needed.
  • East Asia Barometer 2 waves -- 2001-2003, 2005-2008. Select the study, then select a single country or multiple countries. Select a question (variable) by clicking on its link (for example, gender of respondent); this opens the online analysis tool. Select Cross-tabs and use the arrows to page through the different variables that can be analyzed.
  • AmericasBarometer Latin America Public Opinion Project Requires the use of data analysis software (Stata or SPSS) and some understanding of statistical analysis methods. Datasets of surveys of democratic public opinion and behavior in the Americas using national probability samples of voting-age adults. Note: You can find some prepared reports under the Spotlights series.
  • European Commission Public Opinion Offers data sets for analysis. Also offers prepared reports and fact sheets based on the survey results. To find the prepared reports, go to the "Standard Eurobarometer" page and scroll down under "Attachments."
  • European Social Survey -- Online Analysis Tool The ESS Data Download Wizard gives access to cumulative data from countries that have been included in the integrated ESS files in two or more rounds, and all variables from questions that have been asked in more than one round. Note: You must register to use the wizard.
  • 2007 - Leisure Time and Sports
  • 2005 - Work Orientations III
  • 2004 - Citizenship
  • 2003 - National Identity II
  • 2002 - Family III
  • 2001 - Social networks II
  • 2000 - Environment II
  • 1999 - Social Inequality III
  • 1998 - Religion II
  • World Values Survey Beginning in 1981, conducted in successive waves. Survey sections: A. Perceptions of life; B. Environment; C. Work; D. Family; E. Politics and Society; F. Religion and Morale; G. National Identity; S. Structure of the file; X. Socio-demographics; Y. Special Indexes. Search tips: On the left-hand menu, select Online Data Analysis. "You may browse the variable index and see question texts, frequencies for each answer, crosstabs of each question by country or any other variable, and even graphics." You can also check under Findings & Insights for summaries of some of the findings.


SSRS Opinion Panel



Forward-thinking research.

SSRS is your trusted partner for complex and challenging research problems. With the latest data collection best practices, cutting-edge survey methodologies, and an industry leading team, SSRS is breaking the mold on what research companies can do.


Providing answers you can stand behind.

SSRS Opinion Panel Products are widely renowned tools for providing accurate, multi-modal survey responses that deliver on how people think and how their opinions guide behavior. Fast and Flexible, our panels are at the forefront of our work and the foundation on which we have built our value.


Rooted in truth.

Our Encipher suite of calibration services utilizes innovative and experimentally-validated methods to reduce bias and improve data quality. A rigorously tested approach to calibrating nonprobability and hybrid data, our expansive bank of customizable calibration items offers a refreshing alternative to “one size fits all” weighting.
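
Encipher itself is proprietary, so the following is only a generic illustration of what calibration weighting involves, not SSRS's method. Raking (iterative proportional fitting) adjusts respondent weights until the weighted sample margins match known population benchmarks; the sample and benchmark shares below are invented for the sketch.

```python
# Generic raking (iterative proportional fitting) sketch. The sample and the
# benchmark shares are invented for illustration; this is not SSRS's Encipher.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 1000
sample = pd.DataFrame({
    "age_group": rng.choice(["18-34", "35-64", "65+"], n, p=[0.20, 0.55, 0.25]),
    "educ": rng.choice(["no_degree", "degree"], n, p=[0.40, 0.60]),
})
targets = {
    "age_group": {"18-34": 0.29, "35-64": 0.49, "65+": 0.22},
    "educ": {"no_degree": 0.62, "degree": 0.38},
}

weights = np.ones(n)
for _ in range(25):  # cycle through the margins until they stop moving
    for var, shares in targets.items():
        totals = pd.Series(weights).groupby(sample[var]).sum()
        current = totals / totals.sum()          # weighted sample margin
        factors = pd.Series(shares) / current    # per-category adjustment
        weights *= sample[var].map(factors).to_numpy()

# The weighted margins now match the benchmarks (up to rounding error).
for var in targets:
    print(pd.Series(weights).groupby(sample[var]).sum() / weights.sum())
```

Production calibration adds refinements such as weight trimming and attitudinal calibration items, but the underlying adjustment has this shape.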

SSRS in the News

KFF Survey of Medicaid Unwinding

Data Collection Conducted by SSRS

Women’s Views of Abortion Access and Policies in the Dobbs Era: Insights From the KFF Health Tracking Poll

New Marquette Law School Poll National Survey Finds Upturn in Approval of U.S. Supreme Court

Survey Conducted by SSRS

America Needs a New Approach on Affordable Housing

The Urban Institute engaged SSRS to conduct the 2022 Survey of Housing Instability

How Canada Compares: Results From The Commonwealth Fund’s 2023 International Health Policy Survey of the General Population Age 18+ in 10 Countries

Majority of Voters in Michigan and Pennsylvania Are Dissatisfied With Their Presidential Choices

CNN Politics Poll Conducted by SSRS

Majority of New Jerseyans Say Teachers Should Keep Transgender Students’ Identity Confidential as Matter of Safety

Findings from the Rutgers-Eagleton/SSRS Garden State Panel

More than 6 in 10 SOTU viewers had a positive reaction to Biden’s speech

Instant Reaction CNN Poll Conducted on the SSRS Text Message Panel

New Journal of Survey Statistics and Methodology Article features CHIS data

Authored by Mickey Jackson of SSRS and Todd Hughes and Jiangzhou Fu of the UCLA Center for Health Policy Research

Medical Debt in Missouri

Eight focus groups were conducted by SSRS from December 6-21, 2023

SSRS provides answers you can trust through rigorous research and relevant insights. Our focus, resolve, and passion for solving problems is relentless. We apply independent thinking to custom research solutions, combined with agile and steadfast problem-solving. When you work with the SSRS team, you have confidence in the reliability of data rooted in truth.

Our Research Areas

Public Opinion & Policy

Leading the way to a well-informed society through customized research.

  • Physicians and other health care providers
  • Business owners, benefits managers, and other company health care decision makers
  • Patients with a variety of health conditions and experiences
  • Older adults

KFF Vaccine Monitor

Health Care & Health Policy

Using patient and provider experiences to build healthier communities and improve health equity.

International Health Policy Surveys

Consumer & Lifestyle

SSRS gets to the heart of what makes people tick by digging into consumer attitudes and behaviors.

  • Hear the voice of the consumer
  • Explore product and service development and usage
  • Understand customer experience
  • Discover people’s lifestyles and choices

Canada Lawn & Garden Marketplace Study

Sports & Entertainment

Snapshots into the consumer mindset through the lens of lifetime, entertainment and sports with a historical perspective.

Sports Poll

Luker Lens Poll, Sports Poll Kids, 2023 Tournament Trends

The latest SSRS/LoT Sports Poll infographic explores trends in college basketball fandom just in time for March Madness.

Political & Election Polling

SSRS conducts non-partisan political and election-related research using a mix of samples, modes, and methods.

  • SSRS utilizes mixed methods tailored to research needs
  • SSRS links a variety of data sources to improve survey data quality
  • SSRS applies advanced weighting adjustments (a generic margin-of-error sketch follows this list)
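
Weighting adjustments reduce bias at the cost of added variance, and that cost is usually summarized as a design effect when a poll's margin of error is reported. The sketch below uses the textbook Kish approximation and invented weights; it illustrates the general concept, not SSRS's internal reporting.

```python
# Generic margin-of-error sketch with a weighting design effect.
# Kish approximation: deff ~= n * sum(w^2) / (sum(w))^2 for final weights w.
import math
import numpy as np

def kish_design_effect(weights) -> float:
    w = np.asarray(weights, dtype=float)
    return len(w) * float(np.sum(w**2)) / float(np.sum(w)) ** 2

def margin_of_error(p: float, n: int, deff: float = 1.0, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion p from n interviews."""
    return z * math.sqrt(deff * p * (1 - p) / n)

# Illustrative example: 1,000 interviews, candidate at 52%, moderately
# dispersed final weights (the weight values below are made up).
rng = np.random.default_rng(1)
w = rng.lognormal(sigma=0.4, size=1000)
deff = kish_design_effect(w)
print(f"design effect ~ {deff:.2f}")
print(f"margin of error ~ +/- {100 * margin_of_error(0.52, 1000, deff=deff):.1f} points")
```

With 1,000 interviews and a modest design effect, the reported margin of error comes out somewhat wider than the unweighted +/- 3.1 points.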

2022 Collaborative Midterm Survey

Events and Speaking Engagements

You can find the SSRS team at industry conferences and events all year long! Whether we are attending, speaking, or presenting, we look forward to connecting with you and sharing insights from our research.

Recent Insights

Baseball: A Universal Language

Melissa's Story

Athletes Make it Look Easy – We Know it's Not

Tom's Story

A Hockey Family Bond

Jake's Story

Lifelong Passion and Community in Sports

Rich's Story

UVA Football: A Generational Legacy

Sports as a Social Catalyst

Rob's Story

From Courtside Dreams to Real Life Passion

Gabrielle's Story

We Practice Sports to Win the Game, and We Play the Game to Practice for Life

Alex's Story

SSRS Solutions

Providing answers you can trust through rigorous research and relevant insights. SSRS tackles the most complex and challenging research questions to impact positive change.

Proprietary. Innovative. Fit for purpose. SSRS offers tailor-made research solutions designed to gather hard-to-get data and answer complex problems.

Rigorous Standards Combine Accuracy, Flexibility, and Affordability

SSRS Opinion Panel Omnibus

Perfect if You Want to Field a Short Survey or…

SSRS Virtual Insights

A Full-service Solution for Conducting Virtual Qualitative Research

SSRS Business Insights

An Innovative Source of B2B Sample

A Suite of Calibration Services Utilizing Innovative and Experimentally Validated…

Snapshots into the Consumer Mindset Through the Lens of Lifetime,…

The First and Longest Running Tracking Study Focusing on Sports…

Building the Next Generation of Sports Fans

SSRS Text Message Panel

Probability-based, Quick Turnaround Polling

SSRS ABS Innovations

Elevating Address-based Sampling to New Levels

SSRS Proprietary Panels

Highly Scalable Cost-efficient Data Collection Platform Tailored to Meet Client…

Rutgers-Eagleton/SSRS Garden State Panel

Measuring Public Opinion in New Jersey

The People Behind the Data

We know research is all about people, and it’s time to get personal. Our team of critical thinkers has genuine enthusiasm for our work and a shared goal to connect people through research. We get excited about the inherent problem solving involved in survey-based research, and our process fosters transparency through collaboration.

“Being a part of a team that genuinely enjoys working together on thought-provoking research was my goal as we built SSRS. I want to nurture a creative, innovative, supportive environment. As new challenges arise so do new opportunities for creative problem solving. It is so exciting! ” Melissa Herrmann President
“We are proud to work with clients who share our main goal: Making sure that public opinion includes the voices of people that are oftentimes under-represented in research. ” Eran Ben-Porath Chief Research Director
“We all work very hard, but we also know how to have fun together. The culture here is amazing, and I have not experienced another company like SSRS in my 20+ year market research career.” Jen Schmidt VP of Online Research and Strategy
“I enjoy helping clients use data to support their ‘why’ so they can sustainably continue to implement and/or alter their own services or offerings.” Gabi Salomon Senior Project Director
“Conducting reliable public opinion research is critical not only to meet specific client objectives, but to ensure a high functioning democracy overall. I am proud to be part of that enterprise!” Jordon Peugh Chief Business Officer


Our passion for solving problems is relentless.

When you work with the SSRS team, you gain a thoughtful, accountable partner for unpacking complex challenges. Let's start a conversation about your next project.


Research Co.

Public Opinion Polls and Analysis

Perception of Crime Highest in Manitoba, Lowest in Saskatchewan

More than half of British Columbians and Albertans believe criminal activity has increased in the past four years.

Wobbly Commitment to Net-Zero Goal in British Columbia

Three-in-five residents are in favour of the initiative, but support wanes if average energy costs increase.

Purported Return of the Death Penalty Splits Views in Canada

Support for capital punishment is up, but most Canadians pick life imprisonment without parole for murder convictions.

Four Years Later, British Columbians Assess the Pandemic

More than half of residents are satisfied with the way their family, their friends and all levels of government tackled COVID-19.

Almost Half of Canadians Would Prefer an Elected Head of State

The favourability ratings for the Prince and Princess of Wales are vastly superior to those posted by the King and Queen Consort.

Analysis

A deeper dive into the latest demographic and public opinion trends.

What we do

We shed light on issues no one else has thought about.

In the media

How news organizations around the world feature our findings.

Featured Content

Featured Podcast

Why are British Columbians worried about crime?

On This is VANCOLOUR, host Mo Amir sits down with Research Co. President Mario Canseco to discuss whether rising anxieties about crime in B.C. actually reflect crime statistics. Watch on YouTube or listen on Spotify.

Featured Polls


Confidence in Health Care Down 10 Points in Canada

Just over a third of Canadians think a shortage of doctors and nurses is the biggest problem facing the system right now…


Fewer than One-in-Five Canadians Want Monarchy to Continue

Positive perceptions of six senior members of the Royal Family—including King Charles III—are lower now than…

Robe shipwreck researchers look for clues to bust myths and build Dutch connections

A painting of an old wooden sailing ship

Residents of a town in South Australia's south-east are being enlisted in research into a shipwreck that killed 16 Dutch sailors in 1857 soon after offloading 400 gold miners from China.

Scrutiny is also falling on objects thought to have come from the wreck of the Koning Willem de Tweede but that may actually have come from other shipwrecks.

On June 30, 1857, the Koning Willem de Tweede (King William II) was run aground on Robe's Long Beach during a storm by a captain trying to save it.

Instead, the Dutch ship was "beaten to matchsticks", according to James Hunter, who is leading a project researching the disaster.

A beach with a car and dog on it

Dr Hunter, curator of naval heritage and archaeology at the Australian National Maritime Museum in Sydney, said the waves that hit it might have been up to six metres high.

"These waves were smashing into this ship repeatedly and it was a timber vessel," he said.

"It was old — by this point it had been around for several years — so it didn't take long for it to start to break apart."

Dr Hunter and conservator Heather Berry were part of a team that found what appears to be the site of the shipwreck off Long Beach in 2022.

It is buried in metres of sand and has not yet been reached by divers.

Objects washed up from ship

The ship had come to Robe from Hong Kong so the Chinese gold miners on board could avoid a tax on immigrants arriving by sea in Victoria .

Because the wreck happened so quickly — in perhaps about half an hour — very little was removed from the ship, with the focus instead on saving the crew.

However, Dr Hunter and Ms Berry, from the Silentworld Foundation which funds research into maritime archaeology and heritage, believe many items from the ship would have washed up on Long Beach since.

"There's a lot of ceramics that have washed up over the years, glass and of course bits of the ship itself – maybe parts that were being used or maybe other parts that were being stored in case of need for repairs," Ms Berry said.

Advice on conservation

Ms Berry has put together a booklet about how to conserve items people find from shipwrecks that applies particularly to Robe but also to other places around Australia.

A small old building with three people standing at the front, one of them holding a book

Crockery is commonly found but Ms Berry said it could disintegrate if it was not looked after properly after being under water for sometimes hundreds of years.

"If you don't desalinate them properly then you will lose the glaze because, as the salt recrystallises, it just pops off that glaze," she said.

"It's quite impressive to actually look at – I've seen a few examples of it – but it is ultimately sad because you've lost that information."

The booklet is being launched in Robe today with a workshop on the same topic planned for tomorrow at the Robe Customs House Maritime Museum.

People can bring in objects they have found for advice on conserving, which Dr Hunter and Ms Berry can examine to determine historical value.

A yellow bell with placards around it

Connections to today

A bell believed to have come from the ship was rung at Robe Primary School from its founding in 1886 until about 15 years ago to bring students in from lunch and recess.

A replica now stands in its place with the original in the local maritime museum.

If Dr Hunter is not completely sure the bell came from the Koning Willem de Tweede, he is very sceptical about a cannon near the museum overlooking Town Beach and an arch commemorating the Chinese migrants of the 1850s.

Four primary school students ringing a bell under a shelter

Because of its British design, he thinks it may have come from another ship wrecked near Robe in 1857, the Sultana.  

"There's all sorts of things that suggest it could well have been — but maybe not — so the first thing I'd really like to do is nail down where that stuff came from — the things that we already know about," he said.

A man with a black cannon overlooking a bay with a sign with Chinese writing on it

Building Dutch relations

Most of the funding for the project comes from the Dutch government as part of a push by the Netherlands to strengthen ties with Australia. 

The Dutch embassy's cultural attaché, Xenia Hanusiak, said the ship was one of five or six Dutch commercial ships that brought Chinese migrants to Australia at the time.

She said many Australians knew about the Dutch East India Company ships that wrecked in Western Australia in the 1600s but the later history was less well known.

"Even though the Dutch had been travelling by happenstance in the 17th century to the shores and some of the boats were shipwrecked, this we would see as an example of when the Dutch perhaps became interested in settling in Australia for mercantile benefits," Dr Hanusiak said.

A woman stands in front of a large Chinese monument situated on a beach.

While in Robe, Dr Hunter and Ms Berry also visited the old police stables handed over to the National Trust six years ago to look at how objects there could be preserved.

Robe National Trust branch secretary Valerie Monaghan said it was good to have them in town.

"People with metal detectors still pick up things on Long Beach and it’s really important that we keep track of what’s found and keep the things that are really worth keeping," she said.

Researchers plan to return to Robe next summer to investigate the wreck further, using techniques including probing, metal detectors and fanning (waving sand away with the hands).


Related Stories

Chinese migrants walked a gruelling 500km to Victoria's goldfields in the 19th century. Some wouldn't survive.

A drawing of Chinese migrants carrying belongings at Black Hill Ballarat in September 1857.

After more than 180 years, the Indigenous side of a notorious shipwreck story will be told

A painting of an old sailing ship

Tombstone tourists: Stories of the dead come alive

A woman dressed in black among gravestones in a cemetery




9 Facts About Americans and Marijuana

People smell a cannabis plant on April 20, 2023, at Washington Square Park in New York City. (Leonardo Munoz/VIEWpress)

The use and possession of marijuana is illegal under U.S. federal law, but about three-quarters of states have legalized the drug for medical or recreational purposes. The changing legal landscape has coincided with a decades-long rise in public support for legalization, which a majority of Americans now favor.

Here are nine facts about Americans’ views of and experiences with marijuana, based on Pew Research Center surveys and other sources.

As more states legalize marijuana, Pew Research Center looked at Americans’ opinions on legalization and how these views have changed over time.

Data comes from surveys by the Center,  Gallup , and the  2022 National Survey on Drug Use and Health  from the U.S. Substance Abuse and Mental Health Services Administration. Information about the jurisdictions where marijuana is legal at the state level comes from the  National Organization for the Reform of Marijuana Laws .

More information about the Center surveys cited in the analysis, including the questions asked and their methodologies, can be found at the links in the text.

Around nine-in-ten Americans say marijuana should be legal for medical or recreational use,  according to a January 2024 Pew Research Center survey . An overwhelming majority of U.S. adults (88%) say either that marijuana should be legal for medical use only (32%) or that it should be legal for medical  and  recreational use (57%). Just 11% say the drug should not be legal in any form. These views have held relatively steady over the past five years.

A pie chart showing that only about 1 in 10 U.S. adults say marijuana should not be legal at all.

Views on marijuana legalization differ widely by age, political party, and race and ethnicity, the January survey shows.

A horizontal stacked bar chart showing that views about legalizing marijuana differ by race and ethnicity, age and partisanship.

While small shares across demographic groups say marijuana should not be legal at all, those least likely to favor it for both medical and recreational use include the following (a code sketch after this list shows how such weighted subgroup shares are computed):

  • Older adults: 31% of adults ages 75 and older support marijuana legalization for medical and recreational purposes, compared with half of those ages 65 to 74, the next youngest age category. By contrast, 71% of adults under 30 support legalization for both uses.
  • Republicans and GOP-leaning independents: 42% of Republicans favor legalizing marijuana for both uses, compared with 72% of Democrats and Democratic leaners. Ideological differences exist as well: Within both parties, those who are more conservative are less likely to support legalization.
  • Hispanic and Asian Americans: 45% in each group support legalizing the drug for medical and recreational use. Larger shares of Black (65%) and White (59%) adults hold this view.
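
Subgroup figures like these come from cross-tabulating the legalization question against demographics in the weighted respondent-level file. The sketch below shows the shape of that computation; the file name, column names, and weight variable are placeholders, not Pew Research Center's actual microdata.

```python
# Hypothetical sketch: weighted share of each age group favoring legalization
# for both medical and recreational use. "poll.csv" and its columns are
# placeholders, not Pew Research Center's actual dataset.
import pandas as pd

df = pd.read_csv("poll.csv")  # columns: age_group, view, weight

favors_both = df["view"].eq("legal for medical and recreational use")
weighted_support = (
    (df["weight"] * favors_both).groupby(df["age_group"]).sum()
    / df["weight"].groupby(df["age_group"]).sum()
)
print((100 * weighted_support).round(0))
```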

Support for marijuana legalization has increased dramatically over the last two decades. In addition to asking specifically about medical and recreational use of the drug, both the Center and Gallup have asked Americans about legalizing marijuana use in a general way. Gallup asked this question most recently, in 2023. That year, 70% of adults expressed support for legalization, more than double the share who said they favored it in 2000.

A line chart showing U.S. public opinion on legalizing marijuana, 1969-2023.

Half of U.S. adults (50.3%) say they have ever used marijuana, according to the 2022 National Survey on Drug Use and Health . That is a smaller share than the 84.1% who say they have ever consumed alcohol and the 64.8% who have ever used tobacco products or vaped nicotine.

While many Americans say they have used marijuana in their lifetime, far fewer are current users, according to the same survey. In 2022, 23.0% of adults said they had used the drug in the past year, while 15.9% said they had used it in the past month.

While many Americans say legalizing recreational marijuana has economic and criminal justice benefits, views on these and other impacts vary, the Center’s January survey shows.

  • Economic benefits: About half of adults (52%) say that legalizing recreational marijuana is good for local economies, while 17% say it is bad. Another 29% say it has no impact.

A horizontal stacked bar chart showing how Americans view the effects of legalizing recreational marijuana.

  • Criminal justice system fairness: 42% of Americans say legalizing marijuana for recreational use makes the criminal justice system fairer, compared with 18% who say it makes the system less fair. About four-in-ten (38%) say it has no impact.
  • Use of other drugs: 27% say this policy decreases the use of other drugs like heroin, fentanyl and cocaine, and 29% say it increases it. But the largest share (42%) say it has no effect on other drug use.
  • Community safety: 21% say recreational legalization makes communities safer and 34% say it makes them less safe. Another 44% say it doesn’t impact safety.

Democrats and adults under 50 are more likely than Republicans and those in older age groups to say legalizing marijuana has positive impacts in each of these areas.

Most Americans support easing penalties for people with marijuana convictions, an October 2021 Center survey found . Two-thirds of adults say they favor releasing people from prison who are being held for marijuana-related offenses only, including 41% who strongly favor this. And 61% support removing or expunging marijuana-related offenses from people’s criminal records.

Younger adults, Democrats and Black Americans are especially likely to support these changes. For instance, 74% of Black adults  favor releasing people from prison  who are being held only for marijuana-related offenses, and just as many favor removing or expunging marijuana-related offenses from criminal records.

Twenty-four states and the District of Columbia have legalized small amounts of marijuana for both medical and recreational use as of March 2024,  according to the  National Organization for the Reform of Marijuana Laws  (NORML), an advocacy group that tracks state-level legislation on the issue. Another 14 states have legalized the drug for medical use only.

A map of the U.S. showing that nearly half of states have legalized the recreational use of marijuana.

Of the remaining 12 states, all allow limited access to products such as CBD oil that contain little to no THC – the main psychoactive substance in cannabis. And 26 states overall have at least partially  decriminalized recreational marijuana use , as has the District of Columbia.

In addition to 24 states and D.C.,  the U.S. Virgin Islands ,  Guam  and  the Northern Mariana Islands  have legalized marijuana for medical and recreational use.

More than half of Americans (54%) live in a state where both recreational and medical marijuana are legal, and 74% live in a state where it’s legal either for both purposes or medical use only, according to a February Center analysis of data from the Census Bureau and other outside sources. This analysis looked at state-level legislation in all 50 states and the District of Columbia.
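
Figures like "54% live in a state where both are legal" are population-weighted rather than simple counts of states: each state's legal status is weighted by its resident population. The sketch below shows the arithmetic with an invented three-state table; it does not use actual legal statuses or Census figures.

```python
# Sketch of a population-weighted share: the fraction of residents living in a
# state where recreational marijuana is legal. The table is invented purely to
# show the arithmetic; it is not actual legal status or Census population data.
import pandas as pd

states = pd.DataFrame({
    "state": ["A", "B", "C"],
    "population": [10_000_000, 5_000_000, 25_000_000],
    "recreational_legal": [True, False, True],
})

share = (
    states.loc[states["recreational_legal"], "population"].sum()
    / states["population"].sum()
)
print(f"{100 * share:.1f}% of residents live in a state with legal recreational use")
```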

In 2012, Colorado and Washington became the first states to pass legislation legalizing recreational marijuana.

About eight-in-ten Americans (79%) live in a county with at least one cannabis dispensary, according to the February analysis. There are nearly 15,000 marijuana dispensaries nationwide, and 76% are in states (including D.C.) where recreational use is legal. Another 23% are in medical marijuana-only states, and 1% are in states that have made legal allowances for low-percentage THC or CBD-only products.

The states with the largest number of dispensaries include California, Oklahoma, Florida, Colorado and Michigan.

A map of the U.S. showing that cannabis dispensaries are common along the coasts and in a few specific states.

Note: This is an update of a post originally published April 26, 2021, and updated April 13, 2023.  


IMAGES

  1. Solved When a research company polls residents about their

  2. Sharing poll and survey results with your residents

  3. Poll vs Surveys: What’s the Difference?

  4. Confronting 2016 and 2020 Polling Limitations

  5. Using Nextdoor Polls to gather feedback from your residents

  6. The Growth of Pew Research Center's American Trends Panel

COMMENTS

  1. Math 227 statistics activity 7 Flashcards

    Study with Quizlet and memorize flashcards containing terms like Which question is unbiased?, When a research company polls residents about their voting intentions, new Canadians are under-represented. This is an example of, A researcher is conducting a survey among students to determine their mean age. Data is collected by asking the age of a simple random sample of 150 students.

  2. Bias and sampling Flashcards

    Research company polls residents about voting intentions; new Canadians are under-represented. Biased (example): Do you prefer news or a mindless sitcom? Unbiased: Do you prefer daytime or evening television programming? Stratified: more than one layer. Simple random sample.

  3. Stats quiz chapter 4

    When a research company polls residents about their voting intentions, new Canadians are under-represented. This is an example of. Student government at a high school surveys 100 students at the school to get the overall opinion on the change in the bell schedule.

  4. Public Opinion Polling Basics

    The term "poll" is usually applied to a survey designed to measure people's opinions. But not all surveys examine opinions. In fact, even our opinion polls often include questions about people's experiences, habits or future intentions. Let's look at the variety of ways polls and surveys are used.

  5. Interpreting Statistics and Political Polls

    Rather than ask every voter to state their preference in candidates, polling research companies poll a relatively small number of people who their favorite candidate is. The members of the statistical sample help to determine the preferences of the entire population. There are good polls and not so good polls, so it is important to ask the ...

  6. Understanding and Evaluating Survey Research

    Survey research is defined as "the collection of information from a sample of individuals through their responses to questions" ( Check & Schutt, 2012, p. 160 ). This type of research allows for a variety of methods to recruit participants, collect data, and utilize various methods of instrumentation. Survey research can use quantitative ...

  7. Frequently Asked Questions

    Why am I never asked to take a poll? You have roughly the same chance of being polled as anyone else living in the United States. This chance, however, is only about 1 in 170,000 for a typical Pew Research Center survey. To obtain that rough estimate, we divide the current adult population of the U.S. (about 255 million) by the typical number ...

  8. Polling Fundamentals

    A poll is a type of survey or inquiry into public opinion conducted by interviewing a random sample of people. What's a random sample? A random sample is the result of a process whereby a selection of participants is made from a larger population and each subject is chosen entirely by chance.

  9. How Polls Influence Behavior

    When the test subjects learned that a large number of experts favored a position, opinions shifted by 11.3%. But the "opinions of people like me" changed opinions by just 6.2%, while a general poll saying that a majority of people favored one side or the other moved the needle by 8.1%. Although the focus of the research was on how polls ...

  10. U.S. Surveys

    Pew Research Center has deep roots in U.S. public opinion research. Launched initially as a project focused primarily on U.S. policy and politics in the early 1990s, the Center has grown over time to study a wide range of topics vital to explaining America to itself and to the world. Our hallmarks: a rigorous approach to methodological quality, complete transparency as to our methods, and a ...

  11. Solved 15. Identify the type of bias present in the

    15. Identify the type of bias present in the following: A. When a research company polls residents about their voting intentions, new Canadians are under-represented. B. A radio station asks its listeners to call in to answer a survey question on spending by politicians. C.

  12. Polling and Public Opinion Sources Online: Home

    A full-text retrieval system, the iPOLL online database is organized at the question level, providing the tools to sift through nearly a half million questions asked on national public opinion surveys since 1935 and updated daily. Exit Poll datasets. Roper Center Polling Data > US Elections. Exit polling data is collected by Edison Research, a ...

  13. Solved When a research company polls residents about their

    You'll get a detailed solution from a subject matter expert that helps you learn core concepts. Question: When a research company polls residents about their voting intentions, new Canadians are under-represented. This is an example of non-response bias response bias sampling bias measurement bias. There's just one step to solve this.

  14. PDF Sampling Methods for Political Polling

    For more than five decades probability sampling was the standard method for polls. But in recent years, as fewer people respond to polls and the costs of polls have gone up, researchers have turned to non-probability based sampling methods. For example, they may collect data on-line from volunteers who have joined an Internet panel.

  15. Home

    When you work with the SSRS team, you gain accountability-driven thoughtfulness into unpacking complex challenges. Let's start a conversation about your next project. Let's Talk. SSRS. (484) 840.4300. SSRS is redefining research to impact positive change. We provide answers you can trust through rigorous research and relevant insights.

  16. How do people in the U.S. take Pew Research Center surveys, anyway?

    These days, we mostly recruit people to take our online surveys by mailing them printed invitations - again at random - with the help of a residential address file kept by the U.S. Postal Service. This approach gives everyone living at a U.S. residential address a chance to be surveyed (though it does exclude some people, such as those who ...

  17. Is research-polls.com Legit?

    Is research-polls.com legit? It has a low trust score, according to our website Validator. The site is a little bit suspicious, so let's take a closer look at this business and its Data Analysis industry. Our in-depth review is based on the aggregation of 53 powerful factors to expose high-risk activity and see if research-polls.com is a scam. We also recommend ways to detect and block scam ...

  18. Research Co.

    Research Co. is a public opinion firm offering public opinion polls and analysis. ... Three-in-five residents are in favour of the initiative, but support wanes if average energy costs increase. ... Featured Polls: Confidence in Health Care Down 10 Points in Canada.

  19. consider a following scenario: suppose a pew research center (prc

    Consider the following scenario: suppose the Pew Research Center (PRC), a popular research company, polls residents in Dallas about their voting intentions for the next election, and during the process of collecting the sample data PRC ignores some counties. The above scenario is an example of: a. measurement bias; b. sampling bias; c. no sampling bias; d. all of the above are correct.

  20. When a research company polls residents

    When a research company polls residents about their voting intentions, the fact that new Canadians are under-represented is an example of undercoverage bias. Undercoverage bias occurs when certain groups are not represented proportionally in a sample. In this case, the research company is not capturing the opinions and voting intentions of new ...

  21. Robe shipwreck researchers look for clues to bust myths and build Dutch

    Residents of a town in South Australia's south-east are being enlisted in research into a shipwreck that killed 16 Dutch sailors in 1857 soon after offloading 400 gold miners from China.

  22. Quiz: Sec. 4.1 Review Flashcards

    When a research company polls residents about their voting intentions, new Canadians are under-represented. Undercoverage. Examine each of the following questions for possible bias. Horizon Wireless is thinking of entering the satellite TV business. Their planning department decided to survey their existing cell phone customers regarding their ...

  23. Key things to know about election polls in the U.S.

    Key things to know about election polling in the United States. A robust public polling industry is a marker of a free society. It's a testament to the ability of organizations outside the government to gather and publish information about the well-being of the public and citizens' views on major issues. In nations without robust polling ...

  24. Rand Plunges as Poll Shows Zuma Eating Into ANC's Support

    The South African currency depreciated as much as 0.9% to 18.6231 per dollar after Bloomberg reported the result of the poll by the Social Research Foundation Wednesday. Prior to this, the rand ...

  25. 9 facts about Americans and marijuana

    While many Americans say they have used marijuana in their lifetime, far fewer are current users, according to the same survey. In 2022, 23.0% of adults said they had used the drug in the past year, while 15.9% said they had used it in the past month. While many Americans say legalizing recreational marijuana has economic and criminal justice ...