Research Assignment #25: Survey Research & Exercise. For the Departments of English & Media Studies, by Prof Dr Sohail Ansari
Survey Research
‘Survey research is one of the most
important areas of measurement in applied social research. The broad area of
survey research encompasses any measurement procedures that involve asking
questions of respondents. A "survey" can be anything from a short paper-and-pencil
feedback form to an intensive one-on-one in-depth interview.’
Types of Surveys
‘Surveys
can be divided into two broad categories: the questionnaire and the interview. Questionnaires are usually paper-and-pencil instruments
that the respondent completes. Interviews are completed by the interviewer
based on what the respondent says. Sometimes, it's hard to tell the difference
between a questionnaire and an interview. For instance, some people think that
questionnaires always ask short closed-ended questions while interviews always
ask broad open-ended ones. But you will see questionnaires with open-ended
questions (although they do tend to be shorter than in interviews) and there
will often be a series of closed-ended questions asked in an interview’.
Exercise:
Write down open-ended questions for a questionnaire and then expand them for an interview.
Selecting the Survey Method
Selecting the type of survey you are
going to use is one of the most critical decisions in many social research
contexts. You'll see that there are very few simple rules that will make the
decision for you -- you have to use your judgment to balance the advantages and
disadvantages of different survey types. Here, all I want to do is give you a
number of questions you might ask that can help guide your decision.
Population Issues
The first set of considerations has to do with
the population and its accessibility.
Can the population be enumerated?
For some populations, you have a complete listing of the units
that will be sampled. For others, such a list is difficult or impossible to
compile. For instance, there are complete listings of registered voters or
persons with active driver's licenses. But no one keeps a complete list of
homeless people. If you are doing a study that requires input from homeless
persons, you are very likely going to need to go and find the respondents
personally. In such contexts, you can pretty much rule out the idea of mail
surveys or telephone interviews.
Is the population literate?
Questionnaires require that your respondents can read. While this
might seem initially like a reasonable assumption for many adult populations,
we know from recent research that the incidence of adult illiteracy is
alarmingly high. And, even if your respondents can read to some degree, your
questionnaire may contain difficult or technical vocabulary. Clearly, there are
some populations that you would expect to be illiterate. Young children would
not be good targets for questionnaires.
Exercise:
Write down questions for educated people and then adapt the same questions for illiterate respondents.
Are there language issues?
We live in a multilingual world. Virtually every society has members who speak
a language other than the predominant one. Some countries (like
Canada) are officially multilingual. And, our increasingly global economy
requires us to do research that spans countries and language groups. Can you
produce multiple versions of your questionnaire? For mail instruments, can you
know in advance the language your respondent speaks, or do you send multiple
translations of your instrument? Can you be confident that important
connotations in your instrument are not culturally specific? Could some of the
important nuances get lost in the process of translating your questions?
Will the population cooperate?
People who do research on immigration issues have a difficult
methodological problem. They often need to speak with undocumented immigrants
or people who may be able to identify others who are. Why would we expect those
respondents to cooperate? Although the researcher may mean no harm, the
respondents are at considerable risk legally if information they divulge should
get into the hands of the authorities. The same can be said for any target group
that is engaging in illegal or unpopular activities.
Exercise:
Do you think the people you target will speak against a king? If not, what will you do?
What are the geographic restrictions?
Is your population of interest dispersed over too broad a
geographic range for you to study feasibly with a personal interview? It may be
possible for you to send a mail instrument to a nationwide sample. You may be
able to conduct phone interviews with them. But it will almost certainly be
less feasible to do research that requires interviewers to visit directly with
respondents if they are widely dispersed.
Sampling Issues
The sample is the actual group you will have to contact in
some way. There are several important sampling issues you need to consider when
doing survey research.
What data is available?
What
information do you have about your sample? Do you know their current addresses?
Their current phone numbers? Are your contact lists up to date?
Can respondents be found?
Can
your respondents be located? Some people are very busy. Some travel a lot. Some
work the night shift. Even if you have an accurate phone or address, you may
not be able to locate or make contact with your sample.
Who is the respondent?
Who
is the respondent in your study? Let's say you draw a sample of households in a
small city. A household is not a respondent. Do you want to interview a
specific individual? Do you want to talk only to the "head of
household" (and how is that person defined)? Are you willing to talk to
any member of the household? Do you state that you will speak to the first
adult member of the household who opens the door? What if that person is
unwilling to be interviewed but someone else in the house is willing? How do
you deal with multi-family households? Similar problems arise when you sample
groups, agencies, or companies. Can you survey any member of the organization?
Or, do you only want to speak to the Director of Human Resources? What if the
person you would like to interview is unwilling or unable to participate? Do
you use another member of the organization?
Can all members of the population be sampled?
If
you have an incomplete list of the population (i.e., sampling frame) you may
not be able to sample every member of the population. Lists of various groups
are extremely hard to keep up to date. People move or change their names. Even
though they are on your sampling frame listing, you may not be able to get to
them. And, it's possible they are not even on the list.
Are response rates likely to be a problem?
Even
if you are able to solve all of the other population and sampling problems, you
still have to deal with the issue of response rates. Some members of your
sample will simply refuse to respond. Others have the best of intentions, but
can't seem to find the time to send in your questionnaire by the due date.
Still others misplace the instrument or forget about the appointment for an
interview. Low response rates are among the most difficult of problems in
survey research. They can ruin an otherwise well-designed survey effort.
Question Issues
Sometimes
the nature of what you want to ask respondents will determine the type of
survey you select.
What types of questions can be asked?
Are
you going to be asking personal questions? Are you going to need to get lots of
detail in the responses? Can you anticipate the most frequent or important
types of responses and develop reasonable closed-ended questions?
How complex will the questions be?
Sometimes
you are dealing with a complex subject or topic. The questions you want to ask
are going to have multiple parts. You may need to branch to sub-questions.
Will screening questions be needed?
A
screening question may be needed to determine whether the respondent is
qualified to answer your question of interest. For instance, you wouldn't want
to ask someone their opinions about a specific computer program without first
"screening" them to find out whether they have any experience using
the program. Sometimes you have to screen on several variables (e.g., age,
gender, experience). The more complicated the screening, the less likely it is
that you can rely on paper-and-pencil instruments without confusing the
respondent.
Can question sequence be controlled?
Is
your survey one where you can construct in advance a reasonable sequence of
questions? Or, are you doing an initial exploratory study where you may need to
ask lots of follow-up questions that you can't easily anticipate?
Will lengthy questions be asked?
If
your subject matter is complicated, you may need to give the respondent some
detailed background for a question. Can you reasonably expect your respondent
to sit still long enough in a phone interview for you to ask your question?
Will long response scales be used?
If
you are asking people about the different computer equipment they use, you may
have to have a lengthy response list (CD-ROM drive, floppy drive, mouse, touch
pad, modem, network connection, external speakers, etc.). Clearly, it may be
difficult to ask about each of these in a short phone interview.
Content Issues
The content of your study can also pose
challenges for the different survey types you might utilize.
Can the respondents be expected to know about the issue?
If
the respondent does not keep up with the news (e.g., by reading the newspaper,
watching television news, or talking with others), they may not even know about
the news issue you want to ask them about. Or, if you want to do a study of
family finances and you are talking to the spouse who doesn't pay the bills on
a regular basis, they may not have the information to answer your questions.
Exercise:
The inhabitants of a kingdom most likely do not keep up with the news; so what will you do?
Will respondent need to consult records?
Even
if the respondent understands what you're asking about, you may need to allow
them to consult their records in order to get an accurate answer. For instance,
if you ask them how much money they spent on food in the past month, they may
need to look up their personal check and credit card records. In this case, you
don't want to be involved in an interview where they would have to go look
things up while they keep you waiting (they wouldn't be comfortable with that).
Bias Issues
People
come to the research endeavor with their own sets of biases and prejudices.
Sometimes, these biases will be less of a problem with certain types of survey
approaches.
Exercise:
Identify your own sets
of biases and prejudices
Can social desirability be avoided?
Respondents
generally want to "look good" in the eyes of others. None of us likes
to look like we don't know an answer. We don't want to say anything that would
be embarrassing. If you ask people about information that may put them in this
kind of position, they may not tell you the truth, or they may "spin"
the response so that it makes them look better. This may be more of a problem
in an interview situation where they are face-to-face or on the phone with a
live interviewer.
Can interviewer distortion and subversion be controlled?
Interviewers
may distort an interview as well. They may not ask questions that make them
uncomfortable. They may not listen carefully to respondents on topics for which
they have strong opinions. They may make the judgment that they already know
what the respondent would say to a question based on their prior responses,
even though that may not be true.
Can false respondents be avoided?
With
mail surveys it may be difficult to know who actually responded. Did the head
of household complete the survey or someone else? Did the CEO actually give the
responses or instead pass the task off to a subordinate? Is the person you're
speaking with on the phone actually who they say they are? At least with
personal interviews, you have a reasonable chance of knowing who you are
speaking with. In mail surveys or phone interviews, this may not be the case.
Administrative Issues
Last, but certainly not least, you have
to consider the feasibility of the survey method for your study.
Costs
Cost
is often the major determining factor in selecting survey type. You might
prefer to do personal interviews, but can't justify the high cost of training
and paying for the interviewers. You may prefer to send out an extensive
mailing but can't afford the postage to do so.
Facilities
Do
you have the facilities (or access to them) to process and manage your study?
In phone interviews, do you have well-equipped phone surveying facilities? For
focus groups, do you have a comfortable and accessible room to host the group?
Do you have the equipment needed to record and transcribe responses?
Time
Some
types of surveys take longer than others. Do you need responses immediately (as
in an overnight public opinion poll)? Have you budgeted enough time for your
study to send out mail surveys and follow-up reminders, and to get the
responses back by mail? Have you allowed for enough time to get enough personal
interviews to justify that approach?
Personnel
Different
types of surveys make different demands of personnel. Interviews require
interviewers who are motivated and well-trained. Group administered surveys
require people who are trained in group facilitation. Some studies may be in a
technical area that requires some degree of expertise in the interviewer.
Types Of Questions
Survey questions can be divided into two broad
types: structured and unstructured. From an
instrument design point of view, the structured questions pose the greater
difficulties. From a content perspective, it may actually be more difficult to
write good unstructured questions. Here, I'll discuss the variety of structured
questions you can consider for your survey.
Dichotomous Questions
When
a question has two possible responses, we consider it dichotomous. Surveys
often use dichotomous questions that ask for a Yes/No, True/False or
Agree/Disagree response. There are a variety of ways to lay these questions out
on a questionnaire.
Questions Based on Level Of Measurement
We
can also classify questions in terms of their level of measurement. For instance, we might
measure occupation using a nominal question.
Here, the number next to each response has no meaning except as a placeholder
for that response. The choice of a "2" for a lawyer and a
"1" for a truck driver is arbitrary -- from the numbering system used
we can't infer that a lawyer is "twice" something that a truck driver
is.
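To see why these numbers are only placeholders, here is a minimal Python sketch (the occupation categories, codes, and responses are hypothetical, not taken from the text): counting how many respondents fall into each category is legitimate, but doing arithmetic on the codes themselves is not.

import collections

# Hypothetical nominal coding of occupation: each number is only a label.
occupation_codes = {1: "truck driver", 2: "lawyer", 3: "teacher"}

# Hypothetical coded responses from five respondents.
responses = [2, 1, 1, 3, 2]

# Counting respondents per category is meaningful...
print(collections.Counter(occupation_codes[r] for r in responses))

# ...but averaging the codes is not: a "mean occupation" of 1.8 says nothing,
# because a lawyer (2) is not "twice" a truck driver (1).
print(sum(responses) / len(responses))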
Filter or Contingency Questions
Sometimes
you have to ask the respondent one question in order to determine if they are
qualified or experienced enough to answer a subsequent one. This requires using
a filter or
contingency question. For instance, you may want to ask one
question if the respondent has ever smoked marijuana and a different question
if they have not. In this case, you would have to construct a filter question
to determine whether they've ever smoked marijuana.
Filter questions can get very complex. Sometimes, you have to have
multiple filter questions in order to direct your respondents to the correct
subsequent questions. There are a few conventions you should keep in mind when
using filters:
- Try to avoid having more than three levels (two jumps) for any question. Too
many jumps will confuse the respondent and may discourage them from continuing
with the survey.
- If there are only two levels, use a graphic to indicate the jump (e.g., an
arrow and box).
- If possible, jump to a new page. If you can't fit the response to a filter on
a single page, it's probably best to be able to say something like "If YES,
please turn to page 4" rather than "If YES, please go to Question 38" because
the respondent will generally have an easier time finding a page than a
specific question.
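To make the branching in a filter question concrete, here is a minimal sketch, written as a simple interactive script rather than a printed questionnaire; the ask helper and the follow-up wordings are illustrative only, not taken from the text.

def ask(prompt, choices):
    # Repeat the prompt until the respondent gives one of the allowed answers.
    answer = ""
    while answer not in choices:
        answer = input(f"{prompt} ({'/'.join(choices)}): ").strip().lower()
    return answer

# Filter question: decides which contingency question is asked next.
if ask("Have you ever smoked marijuana?", ("yes", "no")) == "yes":
    # Contingency question for respondents who pass the filter.
    ask("About how many times have you smoked marijuana?", ("once", "2-5", "6+"))
else:
    # A different follow-up for respondents screened out by the filter.
    ask("Have you ever been offered marijuana?", ("yes", "no"))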
Question Content
For each question in your survey, you
should ask yourself how well it addresses the content you are trying to get at.
Here are some content-related questions you can ask about your survey
questions.
Is the Question Necessary/Useful?
Examine each question to see if you
need to ask it at all and if you need to ask it at the level of detail you
currently have.
- Do
you need the age of each child
or just the number
of children under 16?
- Do
you need to ask
income or can you estimate?
Are Several Questions Needed?
This is the classic problem of
the double-barreled question. You should think about
splitting each of the following questions into two separate ones. You can often
spot these kinds of problems by looking for the conjunction "and" in
your question.
-
What are your feelings towards African-Americans and Hispanic-Americans?
- What do you think of proposed
changes in benefits and hours?
Another reason you might need more than
one question is that the question you ask does not cover all
possibilities. For instance, if you ask about earnings, the respondent
might not mention all income (e.g., dividends, gifts). Or, if you ask the
respondents if they're in favor of public TV, they might not understand that
you're asking generally. They may not be in favor of public TV for themselves
(they never watch it), but might favor it very much for their children (who
watch Sesame Street regularly). You might be better off asking
two questions, one for their own viewing and one for other members of their
household.
Sometimes
you need to ask additional questions because your question does not give you enough context to interpret
the answer. For instance, if you ask about attitudes towards Catholics, can you
interpret this without finding out about their attitudes towards religion in
general, or other religious groups?
At times, you need to ask additional
questions because your question does not determine the intensity of the
respondent's attitude or belief. For example, if they say they support public
TV, you probably should also ask them whether they ever watch it or if they
would be willing to have their tax dollars spent on it. It's one thing for a
respondent to tell you they support something. But the intensity of that
response is greater if they are willing to back their sentiment of support with
their behavior.
Exercise:
What additional questions do you think you need to ask to determine the intensity of a respondent's belief about the king and his governance?
Do Respondents Have the Needed Information?
Look
at each question in your survey to see whether the respondent is likely to have
the necessary information to be able to answer the question. For example, let's
say you want to ask the question:
Do you think Dean Rusk acted correctly in the Bay of Pigs
crisis?
The
respondent won't be able to answer this question if they have no idea who Dean
Rusk was or what the Bay of Pigs crisis was. In surveys of television viewing,
you cannot expect that the respondent can answer questions about shows they
have never watched. You should ask a filter question first (e.g., Have you ever
watched the show ER?)
before asking them their opinions about it.
Does the Question Need to be More Specific?
Sometimes
we ask our questions too generally and the information we obtain is more
difficult to interpret. For example, let's say you want to find out
respondent's opinions about a specific book. You could ask them
How well did you like the book?
on
some scale ranging from "Not At All" to "Extremely Well."
But what would their response mean? What does it mean to say you liked a
book very well?
Instead, you might ask questions designed to be more specific, like:
Did you recommend the book to others?
or
Did you look for other books by that author?
Is Question Sufficiently General?
You
can err in the other direction as well by being too specific. For instance, if
you ask someone to list the television program they liked best in the past
week, you could get a very different answer than if you asked them which show
they've enjoyed most over the past year. Perhaps a show they don't usually like
had a great episode in the past week, or their show was preempted by another
program.
Is Question Biased or Loaded?
One
danger in question-writing is that your own biases and blind-spots may affect
the wording (see Decisions About Question Wording). For
instance, you might generally be in favor of tax cuts. If you ask a question
like:
What
do you see as the benefits of a tax cut?
You’re only asking about one side
of the issue. You might get a very different picture of the respondents'
positions if you also asked about the disadvantages of tax cuts. The same thing
could occur if you are in favor of public welfare and you ask:
What do you see as the disadvantages of eliminating welfare?
without
also asking about the potential benefits.
Will Respondent Answer Truthfully?
For each question on your survey, ask
yourself whether the respondent will have any difficulty answering the question
truthfully. If there is some reason why they may not, consider rewording the
question. For instance, some people are sensitive about answering questions
about their exact age or income. In this case, you might give them response brackets to choose from (e.g., between 30 and 40 years old, between
$50,000 and $100,000 annual income). Sometimes even bracketed responses won't
be enough. Some people do not like to share how much money they give to
charitable causes (they may be afraid of being solicited even more). No matter
how you word the question, they would not be likely to tell you their
contribution rate. But sometimes you can get at this information by posing the
question in terms of a hypothetical, projective respondent (a little bit like a projective test). In this case,
you might get reasonable estimates if you ask the respondent how much money
"people you know" typically give in a year to charitable causes.
Finally, you can sometimes dispense with asking a question at all if you can
obtain the answer unobtrusively (see Unobtrusive Measures). If you are interested
in finding out what magazines the respondent reads, you might instead tell them
you are collecting magazines for a recycling drive and ask if they have any old
ones to donate (of course, you have to consider the ethical implications of
such deception!).
Question Wording
One of the major difficulties in writing
good survey questions is getting the wording right. Even slight wording
differences can confuse the respondent or lead to incorrect interpretations of
the question. Here, I outline some questions you can ask about how you worded
each of your survey questions.
Can the Question be Misunderstood?
The
survey author has to always be on the lookout for questions that could be
misunderstood or confusing. For instance, if you ask a person for their
nationality, it might not be clear what you want (Do you want someone from
Malaysia to say Malaysian, Asian, or Pacific Islander?). Or, if you ask for
marital status, do you want someone to say simply that they are either married
or not married? Or, do you want more detail (like divorced, widow/widower,
etc.)?
Some terms are just too vague to be
useful. For instance, if you ask a question about the "mass media,"
what do you mean? The newspapers? Radio? Television?
What kind
of headache remedy do you use?
Do
you want to know what brand name medicine they take? Do you want to know about
"home" remedies? Are you asking whether they prefer a pill, capsule
or caplet?
What Assumptions Does the Question Make?
Sometimes
we don't stop to consider how a question will appear from the respondent's
point-of-view. We don't think about the assumptions behind our questions. For
instance, if you ask what social class someone's in, you assume that they know
what social class is and that they think of themselves as being in one. In this
kind of case, you may need to use a filter question first to determine whether
either of these assumptions is true.
Is the time frame specified?
Whenever
you use the words "will", "could", "might", or
"may" in a question, you might suspect that the question asks a
time-related question. Be sure that, if it does, you have specified the time
frame precisely. For instance, you might ask:
Do you think Congress will cut taxes?
or something like
Do you think Congress could successfully resist tax cuts?
Neither
of these questions specifies a time frame.
How personal is the wording?
With
a change of just a few words, a question can go from being relatively
impersonal to probing into your private perspectives. Consider the following
three questions, each of which asks about the respondent's satisfaction with
working conditions:
- Are
working conditions satisfactory or not satisfactory in the plant where you
work?
- Do
you feel that working conditions are satisfactory or not satisfactory in
the plant where you work?
- Are
you personally satisfied with working conditions in the plant where you
work?
The
first question is stated from a fairly detached, objective viewpoint. The
second asks how you "feel." The last asks whether you are
"personally satisfied." Be sure the questions in your survey are at
an appropriate level for your context. And, be sure there is consistency in
this across questions in your survey.
Is the wording too direct?
There
are times when asking a question too directly may be too threatening or
disturbing for respondents. For instance, consider a study where you want to
discuss battlefield experiences with former soldiers who experienced trauma.
Examine the following three question options:
- How
did you feel about being in the war?
- How
well did the equipment hold up in the field?
- How
well were new recruits trained?
The
first question may be too direct. For this population it may elicit powerful
negative emotions based on their recollections. The second question is a less
direct one. It asks about equipment in the field, but, for this population, may
also lead the discussion toward more difficult issues to discuss directly. The
last question is probably the least direct and least threatening. Bashing the
new recruits is standard protocol in almost any social context. The question is
likely to get the respondent talking, recounting anecdotes, without eliciting
much stress. Of course, all of this may simply be begging the question. If you
are doing a study where the respondents may experience high levels of stress
because of the questions you ask, you should reconsider the ethics of doing the
study.
Other Wording Issues
The
nuances of language guarantee that the task of the question writer will be
endlessly complex. Without trying to generate an exhaustive list, here are a
few other questions to keep in mind:
- Does
the question contain difficult or unclear terminology?
- Does
the question make each alternative explicit?
- Is
the wording objectionable?
- Is
the wording loaded or slanted?
Response Format
The
response format is how you collect the answer from the respondent. Let's start
with a simple distinction between what we'll call unstructured response
formats and structured response formats.
Structured Response Formats
Structured
formats help the respondent to respond more easily and help the researcher to
accumulate and summarize responses more efficiently. But, they can also
constrain the respondent and limit the researcher's ability to understand what
the respondent really means. There are many different structured response
formats, each with its own strengths and weaknesses. We'll review the major
ones here.
Fill-In-The-Blank. One
of the simplest response formats is a blank line. A blank line can be used for
a number of different response types. For instance:
Please enter your gender:
_____ Male
_____ Female
Here,
the respondent would probably put a check mark or an X next to the response.
This is also an example of a dichotomous response,
because it only has two possible values. Other common dichotomous responses are
True/False and Yes/No. Here's another common use of a fill-in-the-blank
response format:
Please enter your preference
for the following candidates where '1' = your first choice, '2' = your second
choice, and so on.
_____ Robert Dole
_____ Colin Powell
_____ Bill Clinton
_____ Al Gore
In
this example, the respondent writes a number in each blank. Notice that here,
we expect the respondent to place a number on every blank, whereas in the
previous example, we expect the respondent to choose only one. Then, of course,
there's the classic:
NAME: ________________________
Of
course, there's always the classic fill-in-the-blank test item:
One of President Lincoln's most famous speeches, the _______________ Address, only lasted a few minutes when delivered.
Check The Answer. The
respondent places a check next to the response(s). The simplest form would be
the example given above where we ask the person to indicate their gender.
Sometimes, we supply a box that the person can fill in with an 'X' (which is
sort of a variation on the check mark). For example:
Please check if you have
the following items on the computer you use most:
modem
printer
CD-ROM drive
joystick
scanner
Notice
that in this example, it is possible for you to check more than one response.
By convention, we usually use the checkmark format when we want to allow the
respondent to select multiple items.
We
sometimes refer to this as a multi-option variable. You
have to be careful when you analyze data from a multi-option variable. Because
the respondent can select any of the options, you have to treat this type of
variable in your analysis as
though each option is a separate variable. For instance, for each
option we would normally enter either a '0' if the respondent did not check it
or a '1' if the respondent did check it. For the example above, if the
respondent had only a modem and CD-ROM drive, we would enter the sequence 1, 0,
1, 0, 0. There is a very important reason why you should code this variable as
either 0 or 1 when you enter the data. If you do, and you want to determine
what percent of your sample has a modem, all you have to do is compute the
average of the 0's and 1's for the modem variable. For instance, if you have 10
respondents and only 3 have a modem, the average would be 3/10 = .30 or 30%,
which is the percent who checked that item.
The
example above is also a good example of a checklist item. Whenever you use a
checklist, you want to be sure that you ask the following questions:
- Are
all of the alternatives covered?
- Is
the list of reasonable length?
- Is
the wording impartial?
- Is
the form of the response easy, uniform?
Sometimes
you may not be sure that you have covered all of the possible responses in a
checklist. If that is the case, you should probably allow the respondent to
write in any other options that may apply.
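To make the 0/1 coding of a multi-option (checklist) variable described a few paragraphs above concrete, here is a minimal Python sketch; the respondent data are hypothetical, arranged so that, as in that example, 3 of 10 respondents have a modem.

# Each row is one respondent; each column is one checklist option, coded
# 1 if checked and 0 if not, in the order:
# (modem, printer, CD-ROM drive, joystick, scanner).
respondents = [
    (1, 0, 1, 0, 0),  # the respondent from the text: modem and CD-ROM drive only
    (1, 1, 1, 0, 1),
    (1, 1, 0, 0, 0),
    (0, 1, 1, 0, 0),
    (0, 1, 0, 1, 0),
    (0, 0, 1, 0, 1),
    (0, 1, 0, 0, 0),
    (0, 1, 1, 0, 0),
    (0, 0, 0, 0, 0),
    (0, 1, 1, 0, 1),
]

options = ["modem", "printer", "CD-ROM drive", "joystick", "scanner"]

# Because each option is its own 0/1 variable, its mean is the proportion of
# respondents who checked it: for the modem column, 3/10 = .30, or 30%.
for i, name in enumerate(options):
    column = [row[i] for row in respondents]
    print(f"{name}: {sum(column) / len(column):.0%}")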
Circle The Answer. Sometimes the respondent is
asked to circle an item to indicate their response. Usually we are asking them
to circle a number.
Unstructured Response Formats
While
there is a wide variety of structured response formats, there are relatively
few unstructured ones. What is an unstructured response format? Generally, it's
written text. If the respondent (or interviewer) writes down text as the
response, you've got an unstructured response format. These can vary from short
comment boxes to the transcript of an interview.