Contract with America Survey

Leah Drury, Matthew Paymar, Konstantin Zakashanskiy, and Courtenay Anderson

Statement of the Problem

The purpose of our project was twofold: we wished (1) to investigate public opinion among University of Minnesota students concerning the Republican "Contract With America," and (2) to design and execute our own survey so as to better understand the difficulties inherent in developing reliable data from any survey study.

We had originally, and somewhat idealistically, planned to focus primarily upon the following four problems related to our first goal: (1) Do people agree or disagree with the Contract's principles and bills as they are worded? (2) Have people heard of the Contract, and if so, do they adequately understand what it entails (this gauges what people think they know or understand)? (3) Is there a disparity between people's answers on the survey and their own stated knowledge of the Contract and/or stated ideological stance? In other words, do people think they understand the Contract better or worse than they actually do? Do people think they are more liberal or conservative than they actually are? Does the rhetoric of the Contract obscure its ideology such that survey respondents accept agendas that they do not support in principle, or reject agendas that they do support in principle? (4) How does the data gathered in the above questions correlate with demographics?

As we became more involved with the logistics of the survey, however, we came to understand that the project would be shaped more by the myriad of largely unforeseen problems related to our second goal than by the four issues relating to the first. In particular, we found that addressing all of our questions would require a prohibitively long survey. As a result, we had to limit ourselves to only certain (usually publicly controversial) issues in the Contract, and we were unable to study at all the effect that rhetoric had on people's answers, since we could not repeat questions using different wording.

Background

Our group came together in an odd manner. Not one of us had a substantive idea at first (indeed, this was almost an exposé on gambling), but that soon changed. Matthew had envisioned a project concerning the Republican "Contract With America," and that sounded interesting and topical to the rest of us. We obtained a copy of the Contract, Congressional reports, and essays and critiques from many sources in order to ground ourselves in the topic. Before long, we had decided on a survey format, as that seemed like the only realistic way to approach the subject matter.

As there are four of us, we prepared for the project both together and individually. As individuals, we left each meeting with an assignment: come up with new ideas on a particular subject, formulate questions for the survey, gather data, etc. These activities required the use of the textbook, especially chapters 1, 3, 4, and 5. This review helped to guide us in specific procedural matters.

Group preparation consisted of brainstorming, planning a schedule, and modifying our individual ideas until they fit into a cohesive whole. Probably the most difficult part of the project, this group work forced each of us to examine our priorities as they related to the survey: arguments arose over whether to ask several demographic questions, what sample size we wanted, and where to collect data.

Through all of these uncertainties, we were guided by our collective experience with surveys (as we discussed in class, it seems as if everyone has responded to some sort of poll), information from the textbook, and finally, Leah's experience working with the Minnesota Poll. All of these factors contributed strongly to the final product.

In retrospect, it was hard to decide on a topic and even harder to assimilate all of our ideas into an unbiased survey, but the rest was enjoyable. We learned that working in such a large group necessarily entails logistical as well as theoretical planning; the schedule was constantly changing, but we finished tabulating our data on time. From the beginning, we had doubted our streamlined, week-by-week strategy, and we all learned that it helps to have a realistic idea of the time demands of gathering data.

Method

We decided that a survey would be the most appropriate method of collecting data to find out whether students' political leanings were associated with their socioeconomic backgrounds. We divided the survey into three parts: demographics; general issue-oriented questions asking whether students supported some of the basic principles relating to the tenets outlined by the Republican Contract With America; and, finally, pointed questions asking students whether they had heard of the Contract and whether they supported it.

The placement of questions was deliberate: by placing the demographic questions first, we were able to "lure" students into completing the rest of the survey, since the first questions they saw were innocuous. The next section, covering the Contract's principles, asked straightforward yes/no questions about some of the top issues facing the country today. We tried not to bias students by identifying the issue questions as part of the Contract; we simply asked whether they favored or opposed the issue, such as an increase in the minimum wage. Then we asked whether they had heard of the Contract; based on their response, we attempted to gauge their knowledge of the Contract's content as well as their general support for its principles.

Once we had our survey formulated and revised, each of us took twenty-five copies and spread out over campus. One hundred surveys is, unfortunately, not representative of the some 60,000 students on campus, but in the context of this class it seemed appropriate. Typical survey spots included Coffman Memorial Union, a few area coffee shops, classes, and outside on the Mall. Two members of the group even conducted a few random telephone surveys using the student directory! As we soon realized, this method of distributing surveys in no way garners a randomized sample of the student population. We did not have the time or resources to attempt to cover all of the popular student hangouts.
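For a rough sense of what a sample of this size can support, assume (counterfactually) that the 100 respondents had been a simple random sample; the worst-case 95% sampling error then works out to about ten percentage points. A minimal sketch of that arithmetic follows; it says nothing about the further bias introduced by our non-random selection.

```python
import math

# Worst-case 95% margin of error for a simple random sample of size n,
# using the conservative assumption p = 0.5. Our sample was not random,
# so this only bounds the pure sampling error.
def margin_of_error(n, p=0.5, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

print(f"n = 100:  +/- {margin_of_error(100):.1%}")   # about +/- 9.8 points
print(f"n = 1000: +/- {margin_of_error(1000):.1%}")  # about +/- 3.1 points
```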

Results

In order to study the data we collected, it was necessary to find meaningful correlations among the responses of the sample group. These relationships are demonstrated in the various graphs that accompany this paper, along with pie-chart-style graphs that show the number of yes/no/undecided responses. The responses may be easily matched with the questions on the survey (see attached copy).
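The tabulation itself amounts to counting the yes/no/undecided answers for each question and then cross-tabulating those answers against a demographic field. A minimal sketch is given below; the field names and records are made up for illustration and do not match our actual survey or data.

```python
from collections import Counter

# Hypothetical respondent records; field names are illustrative only.
responses = [
    {"college": "CLA", "q_min_wage": "yes"},
    {"college": "IT",  "q_min_wage": "no"},
    {"college": "CLA", "q_min_wage": "undecided"},
    {"college": "IT",  "q_min_wage": "no"},
]

# Overall yes/no/undecided counts for one question (the pie-chart totals).
totals = Counter(r["q_min_wage"] for r in responses)
print(totals)  # e.g. Counter({'no': 2, 'yes': 1, 'undecided': 1})

# Cross-tabulation of the same question against a demographic field.
crosstab = Counter((r["college"], r["q_min_wage"]) for r in responses)
for (college, answer), count in sorted(crosstab.items()):
    print(f"{college:>4} {answer:>10}: {count}")
```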

Conclusions

As illustrated by the accompanying charts and tables, we found that, for the most part, students do not support the Contract. One of our goals was to see whether we could positively correlate students' demographics with their support of the Contract. National figures show that supporters (Republicans/conservatives) tend to be older, whiter, and more financially well-off than the detractors (Democrats/liberals), who tend not to fall into the above categories in large numbers. Do these same assumptions hold true on our campus?

Curiously, students whose parents are funding their college education clearly do not support the Contract, whereas students who support themselves tend to fall into the moderate-support category. Students depending on loans mostly showed the least support for the Contract, though some showed considerable support for it. We had assumed that students supported by their parents might be more likely to demonstrate conservative leanings, since they would appear to be more financially stable and perhaps not yet out in the "real world"; our survey clearly proved us wrong!

Students of color tended to show little support for the Contract, in keeping with national trends. On a scale of 1-9, with 1 being least supportive, Asian-American students surveyed averaged a 5 in support, African-American students 2.5, and Caucasian students a 3.

College of enrollment seemed to have much sway in how students felt about the Contract. Popular lore at the University would have us believe that College of Liberal Arts students do indeed tend to be more liberal, whereas the Institute of Technology, the College of Biological Sciences, and other more career-minded schools enroll more conservative students. Our survey findings suggest this is true; then again, the pattern could be due to chance.

We discovered an interesting nuance in correlating the last two questions on the survey. The more informed a student was about the content of the Contract with America, the less likely s/he was to support it. Conversely, students who identified themselves as having little to some knowledge of the Contract's principles tended to demonstrate moderate support despite their admitted ignorance. Selecting 5, the midpoint of the given continuum of 1-9, may also have been respondents' way of expressing no real feeling either way about the Contract, since they had little knowledge of its content.
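One simple way to check a relationship like this is to compute a correlation between the self-rated knowledge answer and the 1-9 support score. The sketch below uses made-up numbers, and it assumes for illustration that both answers are numeric scores; it is not a reproduction of our actual responses.

```python
# Requires Python 3.10+ for statistics.correlation (Pearson's r).
from statistics import correlation

# Hypothetical paired answers: self-rated knowledge and 1-9 support.
knowledge = [1, 2, 3, 5, 6, 7, 8, 9]
support   = [5, 5, 6, 5, 4, 3, 2, 2]

r = correlation(knowledge, support)
print(f"Pearson r = {r:.2f}")  # a negative r would match the pattern described above
```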

Again, the complete findings of the survey, too long to go into in this brief summary of important conclusions, may be found in the appendix.

Critique

We believe that we learned firsthand the difficulty of conducting a truly unbiased survey! The wording of the questions proved a source of much debate; one example is the race/ethnic background question. We also forgot to include a gender question; and even if we had included one, there would have been the question of whether "sex" or "gender" was the more appropriate wording!

One specific question that garnered some debate was #5, which asks the respondent to place him/herself on a political spectrum. We feared that this question might lead respondents to answer the following questions in a corresponding fashion. In retrospect, we feel that the question should have been at the end of the survey, along with 23a and 23b. Furthermore, at least six respondents complained that an ordinal scale ranging from "Liberal" to "Conservative" is too limiting, since there are political positions left of liberal (e.g., "progressive," "socialist," "anarchist") and political positions right of conservative (e.g., "reactionary," "fascist"). It probably would have been more appropriate for the scale to run from "Left" to "Right" instead of "Liberal" to "Conservative," although we might then have had to use a larger range of numbers and label the points at which a respondent could identify him/herself as a "liberal" or a "conservative," lest people assume that a liberal is the far left or a conservative the far right. At any rate, question #5 posed many problems.

Similarly, after we had finished our polling we found that on many modern surveys the wording of questions concerning race (our question #4) was different. Where we used the phrase "Native American," for instance, most modern polls seem to use "American Indian or Alaskan Native"; where we said "Asian-American," other polls say "Asian or Pacific Islander"; where we said "Caucasian," other surveys have said "White"; and, most importantly, we lacked a category for those from "East India or the Middle East" altogether.

We also detected one minor error in question #19. As it stands now, the question reads "Should the United Nations have control over US troops?" The spirit of the question, however, is "Should the United Nations have control over some US troops?" This principle was taken directly out of the Contract, believe it or not, but it is obvious that the Republicans are not proposing that the UN have control over all US troops, only some of them.

We also wanted to find out students' economic backgrounds. Initially we had planned to ask about the parents' estimated annual income; however, we decided that might generate too many answers of "don't know," such that we would not be able to make any correlations between students' financial status and their political leanings. Also, we realized many students are not supported by their parents, and asking such a question would not include them. We then changed the question to one that would, we hoped, better reflect the students' current financial state: how was the majority of their college education being funded? We anticipated that this question would distinguish which students were financially independent (self-job); which were not (parents); and which had need of financial aid (self-loan, need-based scholarship). The category of merit-based scholarships was one that we felt compelled to include in the interest of thoroughness, but we also realized that it would not reveal the financial status of the students in question.

The question format also proved problematic. We discovered during the polling process that if a student identified her/himself as liberal, her/his answers would usually fall into the "no" category; likewise, a conservative student would usually answer "yes." This overall design flaw was practically unforeseeable until we started the polling. Much like the "question #5 problem," we feared that such a uniform line of questioning would "hypnotize" the student into answering the same way without considering each question individually.

In a related issue, we wondered if we should have used different types of questions, not just the "yes/no/undecided" variety. Presumably, a greater mixture of question types would keep people on their proverbial toes. In addition, it seems as though the sometimes obscure subject matter called for a "don't know" category to differentiate between the "undecideds" and the truly ignorant: very few students, it seems, understand the intricacies of the capital-gains tax cut issue, and, no doubt, several of them parlayed this lack of knowledge into an "undecided" response.

Finally, we considered whether we could have avoided some of the above problems by utilizing less of the Contract's language verbatim. Some of our questions were exactly representative of Contract issues, and perhaps this borrowing influenced some of the respondents. Furthermore, it was probably not judicious to state that we were asking about the Republican Contract with America; presumably, a strongly liberal respondent would be biased by such partisan language.

Our sample group was also the cause of some concern. Polling methods on a campus of one's peers are necessarily suspect: the urge to poll one's friends and classmates is overwhelming, first because they are familiar, and second because we desire to learn what they think. All the members of the group noted that after a day of polling, we had each polled at least a few friends/classmates. The classmates, presumably, do not threaten the randomness of the poll as much as the friends do (one generally knows the political ideology of a close friend), but nonetheless the classmates are not selected randomly. Phone interviewing, which constituted 14% of the polling, dealt with the randomness issue handsomely. Nevertheless, the extraordinary length of time it took to conduct a phone interview made it prohibitive. Furthermore, the interviewer was likely to call numbers with the 624 or 625 prefixes, thereby selecting a limited age group of University students (specifically, those in the dormitories).

Each of us discovered that when we were in the role of the surveyor, we had ultimate power over how the results could turn out. As we discussed earlier, we realized early in the project that it was nearly impossible for us to complete a truly randomized survey considering the scope of both the subject matter and the selected population: students at the University of Minnesota. Since we were distributing the surveys in person, we could pick and choose whom to ask to complete the survey. Hence, our own biases played into the results of the survey. For instance, if we (individually) felt we needed more ethnic representation in the sample, we could intentionally not ask random Caucasian passersby to fill out the survey but instead seek out a person of color.

For future students wishing to embark on the arduous survey journey, beware: a political theme is potentially dangerous for numerous reasons. It might seem best to ascertain whether or not you are all of the same political persuasion, but agreement is not necessarily an advantage. If you are all, for example, liberal, you run the risk of leading others with your questions and biasing your survey without knowing that you are doing so. Conversely, if your group is scattered over the political spectrum, your project may collapse due to ideologically based differences over how to proceed.

Next, take heed of your ambitions: a sample size of 100, such as we analyzed, is difficult enough to gather, but probably not large enough to represent the University community. You should ask yourself and your group members how much time you are willing to commit, and whether that allowance is large enough to accomplish your goals. The most labor-intensive parts of the process are planning the project and reporting/analyzing the data; the rest is relatively painless.