Describe the strengths and weaknesses of using surveys as a research methodology. Differentiate between open-ended and closed-ended questions.

504 Week 09 Overview (4.00) – Survey – Theory

Writer, there are 3 parts to this order. I will complete the survey media ASAP and share the results, so you can address discussion 1 (part 1) and discussion 2 (part 2). The transcript of the survey is pasted below. Part 3 is the most important: Survey Paper Preparation (the template/format has been provided below). Please scroll down to find instructions for parts 1-3.

Introduction

Week 09 Video Transcript

Welcome to the survey module. Now, you are likely familiar with the survey method. Probably, you have taken more surveys than you care to remember. This is normal. Surveys are all around us, on the bottom of receipts, on websites we visit, and indeed, even in the courses in this program.

So you may understandably be wondering, what more do I need to know about surveys? Anyone can throw together a survey. And yes, that is true. So let me ask you this question. How many bad surveys have you taken? Surveys which irritated you. Surveys which confused you. How many good surveys have you taken?

How many surveys have you taken that made you think the survey designer truly was interested in your opinion? Did these surveys also make you like and trust the company behind the survey more? Surveys can be good PR. It is likely you have also used some of the weekly feedback surveys in this course to vent and release frustrations.

I truly do value the feedback you give in these weekly feedback surveys. And I hope they fulfill one of their purposes which is to make you believe I care about your experiences, and I really do. It’s also far better for me and the program if you vent on the survey and not to each other or other human beings.

If you release some frustration in the survey, it allows a fresh start to next week’s material. Back to this week though and the survey module. This survey module will replicate the structure of the focus group module. This first week of surveys is all about the basics, learning how to craft effective questions and avoiding those survey questions that annoy people.

In other words, some dos and don’ts for survey design. Just as in the focus group module, you will work in your teams to design and deploy the data collection method, the survey. This week’s materials primarily focus on this: the design of the survey.

Learning Objectives

 

By the end of this week, you will be able to:

  • Recognize when surveys should be used.
  • Describe the strengths and weaknesses of using surveys as a research methodology.
  • Differentiate between open-ended and closed-ended questions.

 

Reading

Textbook:

Creswell, J. W., & Creswell, J. D. (2018). Research design: Qualitative, quantitative, and mixed methods approaches (5th ed.). Thousand Oaks, CA: Sage Publications, Inc. Chapter 8 (p. 147, start of chapter, through p. 159, stop at “Example 8.1”).

 

Journal Articles:

Brown, A. G., Weingart, S., Johnson, J. R. J., & Dance, B. (2004). Librarians don’t bite: Assessing library orientation for freshmen. Reference Services Review, 32, 394-403. (PDF)

Ibraheem, A. I., & Devine, C. (2013). A survey of the experiences of African librarians in American academic libraries. College & Research Libraries, 74, 288-306. (PDF)

Ismail, L. (2010). What Net generation students really want: Determining library help-seeking preferences of undergraduates. Reference Services Review, 38, 10-27. (PDF)

Kennedy, M. R., & Brancolini, K. R. (2012). Academic librarian research: A survey of attitudes, involvement, and perceived capabilities. College & Research Libraries, 73, 431-448. (PDF)

Week 09 Instructional Materials

Survey – Theory (To be completed by me; I will share the results so the writer can finish the discussion 1 & 2 assignments)

Please click on the Survey Overview image below for an overview of the survey as a research methodology.

Important things to note:

  1. Allow yourself sixty minutes to complete this exercise.
  2. You will be asked to respond throughout the piece; unless otherwise specified, your answers will be tabulated and shown to the class as a basis for the discussions this week.
  3. Once you have finished, click on the Results button to view your classmates’ responses before participating in the weekly discussion. If few responses have been tendered, please check back. Please note: This activity requires Flash. Please enable Flash in your browser, when prompted.
  4. Please complete this as early in the week as you can, so that there is adequate time to appraise the responses before the discussions close.
  5. If you have any technical difficulties entering your response within the media piece, please use the affiliated discussion board as an alternate method for replying to these questions.

Survey Media Transcript

Screen 1

Audio Script:

The survey is another type of research methodology, and surveys are widely used. Surveys search for correlations indicating possible cause and effect. Often the researcher can use an existing questionnaire and does not need to design his or her own.

Screen 2

Audio Script:

Surveys can assess a wide range of things, from factual knowledge, to beliefs or perceptions, to affective feelings or emotions, to behavior reports, to traits or states. Now let’s look at some examples to illustrate some of these points.

Exams are an example of a survey assessing factual knowledge.

A question such as “I feel that…” is a survey question assessing beliefs and perceptions.

“How much do you like cooking?” and “How sad are you?” are two concrete examples of survey questions assessing affective feelings and emotions.

“How often do you use the database?” is a survey question assessing a behavior.

Finally, “How happy are you generally?” assesses the general trait of happiness, whereas “How happy are you about graduation?” assesses a specific state.

Screen Visual:

Surveys can assess…

  • Factual Knowledge
    • Exams
  • Beliefs/perceptions
    • “I feel that…”
  • Affective feelings/emotions
    • How much do you like cooking? How sad are you?
  • Behavior reports
    • How often do you use the database?
  • Traits/states
    • How happy are you generally? (trait)
    • How happy are you about graduation? (state)

Audio Script:

What follows on the next slide is an example to illustrate a specific point about survey questions. In this media piece a number of extreme examples are intentionally used with the hope these examples will be more memorable. It is not that I believe you might make the same extreme error but you might make a similar type of error. In the next slide you will see how the question is too difficult or impossible for the participant to answer. Remember this when you write your questions – make sure participants are able to answer them.

Here, aside from the question being patronizing, a participant who is blind could not respond to it, regardless of how much they wanted to help. So, the key point here is to design your questions so participants can provide you with the answer. Do not make questions impossible or extremely difficult to answer.

Screen Visual:

Dear Disabled Friend

Are you blind? Yes or No

(please tick appropriate box)

Audio Script:

Surveys may have statements, such as “I like ice cream” or questions such as “Do you like ice cream?”

Screen Visual:

Surveys may have statements

A row of numbers 1 through 7 appears horizontally across the screen with the word “Disagree” under the 1 and the word “Agree” under the 7. Above this row of numbers is the statement, “I like ice cream.”

Surveys may have questions

A second row of numbers appears horizontally across the screen with the word “Yes” under the number 1 and the word “No” under the number 7. Above this row of numbers is the question, “Do you like ice cream?”

Audio Script:

Both approaches to survey questions, having statements or questions, can be equally effective. Some businesses, industries, or bosses may have a preference but again both types of survey approaches work.

Screen Visual:

Asking questions is not better or worse than statements.

Audio Script:

Another way survey questions can be classified is by whether they are closed-ended or open-ended.

An example of a closed-ended question is, “How much do you like USC?” with a response option provided, such as a response scale.

An example of an open-ended question is, “Why do you like USC?” with no response options provided other than a text box to type/write answers.

Screen Visual:

Survey questions may be closed or open.

Closed-ended

A row of numbers 1 through 7 appears horizontally across the screen with the words “Not at all” under the 1 and the words “Very much” under the 7. Above this row of numbers is the question, “How much do you like USC?”

Open-ended

Why do you like USC?

Audio Script:

Here you can compare and contrast the two question types, closed-ended and open-ended questions.

Screen Visual:

Open-ended Questions

  • Positives
    • Allows the participant to write what they think (greater freedom)
    • Reveal reasoning
    • Chance to discover something not anticipated
    • Often good for a pilot study or a small sample
  • Negatives
    • Hard to code/interpret content
    • Researcher bias in coding
    • More work for participant
    • You may misunderstand the answer
    • Irrelevant answers increase (e.g. because…)

Closed-ended Questions

  • Positives
    • Easier to code/analyze
    • Less work for participant
    • Many of us, on seeing lots of open-ended questions, give up and quit the survey because typing responses takes much more time. Closed-ended questions, just clicking a number, are quick and easy to answer.
  • Negatives
    • Limits responses
    • Restricts response options
    • You may miss something important due to the constraints you impose

Audio Script:

Within a survey you can also have some closed-ended questions and some open-ended questions; the question types are not mutually exclusive within a survey.

Screen 4 – Activity

Audio Script:

We will assume everyone in the class likes USC. In real research, this is a dangerous assumption. Maybe some students don’t like USC, or liked USC before they took this class, or liked USC until they met their classmates. For reasons of simplicity, though, we will assume everyone here likes USC.

Screen Visual:

Open-ended questions: class exercise

List five reasons you like USC. If you cannot think of reasons why you personally like USC, list some reasons why you think others might like USC.

[Write these reasons and submit.]

Note: these responses will be shown to the class for evaluation.

Screen 5

Audio Script:

Let’s look at double-barreled questions: questions that ask about two things at once.

Screen Visual: [Question on screen]

I am happy and hard working.

Audio Script:

Here, two things, happiness and being hard working, are assessed. What do you do if you are happy, but happy because you are not hard working? Or unhappy because you are hard working?

Screen Visual: [Question on screen]

I am in favor of cutting back on evening classes and that each class should last no longer than 2 hours.

Audio Script:

This question has the same type of problem. Two things are being assessed: being in favor of cutting back on evening classes, and classes lasting no longer than two hours. How do you respond if you are in favor of cutting back on evening classes but are fine with classes lasting longer than two hours?

There are occasions when it is appropriate to ask two things in one question, but most of the time, this type of double-barreled question should be avoided.

Now let’s look at how bias can affect your questions.

Screen Visual:

Two rows of numbers 1 through 7 appear horizontally across the screen with the words “Strongly Disagree” under the 1 and the words “Strongly Agree” under the 7. Above each row of numbers are the questions, “I believe killing unborn innocent babies is acceptable.” and, “I believe women should be forced to give birth to unwanted babies.”

Audio Script:

Here are some extreme examples that show how the wording of a question can be phrased to push participants to respond in a pro-life or a pro-choice fashion. Frequently in media reports you do not see the question wording that was given to participants. Now, most respectable opinion polling companies or media organizations would not use wording as biased as shown here. However, even slight bias in question wording can make certain responses more or less likely.

Screen 6 – Activity

Audio Script:

Create your own two questions, question A and question B, similar to the examples but on a different topic than pro-choice and pro-life.

Your question A should generate a typical response from participants that suggests participants have a certain viewpoint. Your question B though should generate a typical response from participants that suggests participants have a different viewpoint.

These fictitious participants would never see both question A and question B. Rather some participants respond to question A, and different participants respond to question B.

Screen Visual:

Bias: class exercise

Create your own two questions, question A and question B, similar to the examples but on a different topic than pro-choice and pro-life.

[Write these reasons and submit.]

Note: these responses will be shown to the class for evaluation.

Screen 7

Audio Script:

I am now going to show you another extreme example showing you the importance of question order. I am going to show you four pictures and ask you to respond to the same question for each picture.

Screen Visual:

Perceptions of self:

Please rate how similar you think each picture is to how you see yourself. Indicate your response by choosing a number on the scale of 1 to 7, the number 1 designating, “Not at all like me” and the number 7 designating, “Very much like me.”

Audio Script:

In this instance you will not be asked to share your specific answers with anyone else in the class.

Screen 8- Activity

Screen Visual:

[Choose a number on the scale of 1 to 7, the number 1 designating, “Not at all like me” and the number 7 designating, “Very much like me.”]

Picture 1

This picture is a portrait of a young, smiling, Caucasian woman, in her twenties. She has brown hair and brown eyes. Her hair is long and straight and tied back. She is wearing a dark sweater and a lower-cut blouse.

Picture 2

This picture is a portrait of a young smiling, Caucasian woman. She has brown hair and brown eyes. Her hair falls to her shoulders. She is wearing a dark jacket and a white collared shirt.

Picture 3

This picture is a portrait of a young, smiling, Latina. She has dark brown hair and dark brown eyes. Her hair is wavy and falls to her shoulders. She is wearing glasses, a blue collared blouse and dangling silver earrings.

Picture 4

This picture is a portrait of a young, smiling, African-American woman. She has black hair and dark brown eyes. Her hair is in an afro style. She is wearing a dark, short-sleeved, collared blouse.

Screen 9

Audio Script:

Now I will show you another four pictures and again ask you to respond to the same question as before.

Screen 10

Screen Visual:

[Choose a number on the scale of 1 to 7, the number 1 designating, “Not at all like me” and the number 7 designating, “Very much like me.”]

Picture 1

This picture is a portrait of a space alien. The face is green. The eyes are large, black, and almond shaped. There is no nose.

Picture 2

This picture is a portrait of a chimpanzee. The face is covered with fur. The eyes are dark brown and round. The nose is flat.

Picture 3

This picture is a portrait of a young, smiling, Latina. She has dark brown hair and dark brown eyes. Her hair is wavy and falls to her shoulders. She is wearing glasses, a blue collared blouse and dangling silver earrings. [This is the same portrait as picture 3 in the previous set of four pictures.]

Picture 4

This picture is a portrait of a young, smiling, African-American woman. She has black hair and dark brown eyes. Her hair is in an afro style. She is wearing a dark, short-sleeved, collared blouse. [This is the same portrait as picture 4 in the previous set of four pictures.]

Screen 11

Audio Script:

As you noticed, two of the pictures from the second set were also shown in the first set: the African-American woman and the Latina.

However, it is likely your responses to these two pictures were very different the second time. This is because in the first set of pictures you were primed to think about your skin color. In the second set of pictures you were primed to think about being human.

These differences in responses are caused by what is known as an order effect. In your surveys it’s unlikely you will come across such a strong order effect.

However, whatever question is first in your survey causes people to start thinking a certain way. This way of thinking then influences responses to the second question. Question 1 and Question 2 combine to influence thinking about Question 3 and so on throughout the survey.

Screen 12 – Activity

Screen Visual:

Question order: class exercise

Create your own example of a scenario where question order affects responses. Create 2-3 questions, activities, or other tasks where completing them will affect responses to a later task (question). Explain how and why this order effect happens in your scenario.

[Write this list and submit.]

Note: these responses will be shown to the class for evaluation.

Screen 13 – Activity

Screen Visual:

Counter-balancing: class exercise

One way to address and reduce the effects of question ordering is counter-balancing. Look up counter-balancing on the internet. [Write] a short 4-5 sentence summary of what counter-balancing is and why you think it is a good or bad solution [and submit].

Note: these responses will be shown to the class for evaluation.
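If your team ends up deploying its survey programmatically, the core idea of counter-balancing can be sketched in a few lines. This is a minimal illustration under my own assumptions (the function name and the three-question survey are hypothetical, not part of the course materials): it rotates the question list so that, across participants, each question appears in each position equally often.

```python
def counterbalanced_orders(questions):
    """Build one rotated ordering per position (a simple Latin-square-style
    counterbalance): across participants, each question appears in each
    position equally often, so no single question always primes the rest."""
    n = len(questions)
    return [questions[i:] + questions[:i] for i in range(n)]

# Hypothetical three-question survey; participant k gets ordering k % 3.
orders = counterbalanced_orders(["Q1", "Q2", "Q3"])
for k, order in enumerate(orders):
    print(k, order)
```

Note that rotation only balances each question's *position*; fully counter-balancing every possible sequence would require all n! orderings, which is why rotation-style designs are the common compromise.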

Screen 14

Audio Script:

Most of us know what the socially appropriate answer is to most questions most of the time. Many of us will hide or temper what we really believe to indicate what we think society thinks is the appropriate answer. For many students here on campus at USC, there is the perception that the general culture is liberal, and so, to fit in, people act in a liberal manner and restrict or even hide their possibly more non-liberal beliefs and behaviors. All institutions have some sort of culture and definition of what is appropriate.

You experience something similar at your workplace. Perhaps even though the day officially ends at 5:30pm, you should not leave until your boss leaves. Or perhaps, technically, you can leave the building for lunch, but no one does. Or maybe it is that you should not use all of your vacation days. Even though you want to leave before your boss, go out for lunch, or use all of your vacation days, you don’t, because no one else does, and because of the unspoken, maybe even unconfirmed, belief that if you do, something bad might happen.

Social norms dictate what is socially desirable. As human beings, we are social, and the social norms affect us. Consider your views of affirmative action or gay marriage; you likely have an opinion on this but might not express your true opinion in all situations. This is also a problem in other types of research, for instance, focus groups and experiments.

In these examples, society has taught us that it is socially appropriate to indicate we do consider the results of important actions we take. But many times we make instinctive decisions or go with a gut feeling, for example, about what we majored in as undergraduates or whether to apply to this degree program. It is also socially appropriate to say we help others less fortunate, but many of us might be hard pressed to say when we last actually helped someone less fortunate.

Screen 15 – Activity

Screen Visual:

Social desirability: class exercise

[Write] your own question where social desirability is an issue. Explain why social desirability would influence a participant’s response to this question. [Submit.]

Note: these responses will be shown to the class for evaluation.

Screen 16 – Activity

Screen Visual:

Options: class exercise

Please indicate your political beliefs: Democrat or Republican

What is wrong with the above question and how might you correct it? [Write your response and submit.]

Note: these responses will be shown to the class for evaluation.

Screen 17

Audio Script:

Now let’s look at how the length and difficulty of your questions can affect the responses/results.

Regardless of how much participants want to help you and provide correct answers, they are limited by their own ability. This slide shows another concrete extreme example of this.

How often did you go out to eat in the last week? OK, not too difficult.

How often did you go out to eat in the last month? OK, much harder. It took you more time and effort to come up with an answer, and it is likely you are not sure your answer is truly correct.

How often did you go out to eat in junior high? Way too difficult.

Similarly, make the time frame in questions helpful and participant friendly. Asking, “How often did you go out to eat in the last week?” is helpful, but “How often did you go out to eat in the last nine days?” or “six days” or “thirteen days” is not, as we don’t think in such time frames. Similarly, “How often did you go out to eat in the last 35 days?” or “26 days” is also hard. We think in terms of weeks or months, not “13 days” or “35 days.”

Make your questions friendly to participants. Make life easy for them when possible. They are doing you a favor by giving you their time in answering your survey, so be nice to them.

Screen Visual:

Question length

Question 1: My overall feelings and thoughts about myself are predominantly favorable most of the time, leading me to feel pretty satisfied about who I am.

Question 2: On the whole, I am satisfied with myself.

Audio Script:

Question length is also a factor to consider. Short, easy questions are good.

Both these questions (above) ask essentially the same thing. It takes me about eight seconds to read the bad question (Question 1) and two seconds to read the good question (Question 2). I can answer four good questions in the time it takes me to answer one bad question. Participants value their time and will give you only a certain amount of time on a survey. Typically you want to ask as many questions as possible to get accurate information, but you must balance this with participant patience. Once participants get bored, their responses stop (they quit the survey), or they no longer give accurate answers (they select answers randomly). With short, good questions, you can ask more questions before you lose the participant’s attention.

Now let’s look at question wording. In most surveys there are key words, and understanding of these key words is essential to the survey.

Screen 18

Screen Visual:

Key words: class exercise

Describe your definitions of the following:

  • Usually means XX times per day. What number does XX represent here?
  • Generally means XX times per week. What number does XX represent here?
  • How close (in miles) to the sea do you need to live to agree you live near the sea?
  • How often do you need to speak to your parents to say you speak to them “often”?

[Describe your definitions and submit.]

Note: these responses will be shown to the class for evaluation.

Screen 19

Audio Script:

Interpretations of “usually,” “generally,” how close to the coast you need to live to claim you live near the sea, and how often counts as speaking “often” to your parents are very diverse. You also need to consider whether posting on the wall of a parent’s Facebook page counts as talking, and other such interpretations.

Definitions of important words in a survey are therefore key.

Define key words for participants. For example, instead of “usually” or “generally,” specify a time.

Cultural interpretations are especially important. Different cultures can have very different interpretations of the same words. These cultural groups might be based on skin color, nationality, gender, or even political or sexual orientation. When research is conducted with a diverse range of participants (for example, an online survey, where people all over the country or world might respond, or a survey in a diverse urban environment), cultural interpretations can have a large influence on results and their interpretation.

Look at the wording in your questions. Does the question wording suggest a certain viewpoint is correct? As an educated person you might feel more comfortable than many in disagreeing with a scientific expert, but for many people if an expert says it, they believe it is probably right.

Screen Visual:

Avoid biased wording

Example 1: Don’t you agree that…?

Example 2: Scientific experts think… do you agree?

Audio Script:

Here we see examples of biased wording in questions (above).

Also, in simply conducting research you have reached an elevated position in the eyes of many participants. You have the USC name/logo behind you if you are conducting research in this course, or the brand and fame of your company behind you if you are conducting research at work. This prestige gives you a certain authority, which again can make many participants agree with you because they think you must be correct due to where you are from (USC or your company).

Screen Visual:

Things to think about…

  • If asking how often something happens, specify a time period.
  • Try to predict participant confusion and remove it before it happens.
  • Avoid slang, acronyms, abbreviations, and jargon.

Audio Script:

If I ask Jessica how often she goes to the gym, she might say four because she goes to the gym four times a month. If I ask Brett how often he goes to the gym he might say three because he goes three times a week. However, all I have are the numbers three and four, so I conclude that Jessica goes to the gym more often than Brett. Interpretations of questions by participants are therefore key. It is your job as a survey designer to make sure participant interpretation is as you intend it to be.
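One defensive fix for the Jessica-and-Brett problem, if you do collect free-form frequency answers, is to also ask which time period the respondent had in mind and normalize everything to a single unit before comparing. A hypothetical sketch (the function name, the dictionary, and the 4.345-weeks-per-month constant are my own assumptions, not from the module):

```python
# Average weeks per month (365.25 / 12 / 7), an assumed conversion constant.
WEEKS_PER_MONTH = 4.345

# Length of each reporting period, expressed in months.
PERIOD_IN_MONTHS = {"week": 1 / WEEKS_PER_MONTH, "month": 1.0, "year": 12.0}

def times_per_month(count, period):
    """Convert 'count times per <period>' into times per month."""
    return count / PERIOD_IN_MONTHS[period]

jessica = times_per_month(4, "month")  # 4 times a month -> 4.0
brett = times_per_month(3, "week")     # 3 times a week  -> about 13 a month
assert brett > jessica                 # Brett actually goes more often
```

With both answers on the same scale, the raw numbers three and four no longer mislead: Brett's three-per-week is roughly thirteen times per month.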

When in doubt, ask more than one question to assess something in as many ways as possible, then average the responses to form a composite. A composite is intended to give you a more accurate assessment by taking the average response across questions (indicators). For example, a GPA is a composite of grades in a variety of courses: your undergraduate GPA is a much better measure of your academic performance than your grade in any one specific undergraduate class. Similarly, you might ask 12 questions to assess anxiety and form a composite of the scores to measure anxiety.
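The composite idea can be shown concretely. Assuming hypothetical responses on a 1-7 scale (the item values below are invented for illustration), a composite is simply the mean of the indicator scores:

```python
def composite(responses):
    """Average a participant's responses to several indicators of the same
    construct (e.g. a set of anxiety items) into a single composite score."""
    return sum(responses) / len(responses)

anxiety_items = [3, 4, 2, 5, 3]        # hypothetical 1-7 Likert answers
print(composite(anxiety_items))        # 3.4
```

Because random noise in individual answers tends to cancel out in the average, the composite is a more stable measure than any single item, which is exactly the GPA logic above.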

Screen 20

Audio Script:

Now, let’s look at the problems encountered with surveys.

Participants lie. “How often do you break the speed limit?” “How often do you help others?” “Did you ever cheat in undergraduate?” Most of us break the speed limit more often than we care to admit and help others less than we wish, and research shows that 40-80% of students cheat in undergraduate. We lie to present ourselves in a more desirable manner.

Participants forget. Remember the example from earlier about how often you went out to eat in junior high? Or what did you eat for breakfast a week ago? Or what was the exact time you logged in to the class today? These are all things you knew at some point but have now forgotten.

Intentions do not equal behavior. I intended to go to the gym five times a week this year. I think I made it to about the second or third week before my behavior no longer matched my intention. Many of us intended to go on a diet and lose weight but then that same day found ourselves eating cake or ice-cream or engaging in some other behavior that does not match our dieting intention.

Some people just say yes. For example, you probably have a friend like this. Suppose you are talking with your friend about where to eat tonight. You suggest Chinese, they say yes; you suggest sushi, they say yes; you suggest Ethiopian, they say yes; you suggest Thai, they say yes, and so on. When your friend takes surveys, he or she says yes to everything too.

Some people just say no. The stereotypical example of this is teenage males. Teenage males like to say no to everything. Teenage males take surveys too.

Unstable and changing: suppose when you logged in I asked you how tired, excited, and interested in the course material you were. If I asked you those same questions again right now, it is likely your answers have changed. You answered both times accurately and truthfully, but some things change quickly or a lot with time.

What happens when you get bored on a survey? You quit, or just start clicking the middle response (there is a perception being neutral is safe), or start clicking random responses. None of these responses are good for your survey data collection. So balance lots of questions (best measure) against participant patience.

If you ask people in your survey for their opinions on trade relations between Australia and East Timor, they will have opinions, even if they don’t know where East Timor is, or even where Australia is. We don’t like to admit we don’t know.

Screen 21

Audio Script:

With a Likert scale, you can reverse a question to check whether a participant is paying attention.

Screen Visual:

Likert scale

A row of numbers 1 through 7 appears horizontally across the screen with the words “Not at all” under the 1 and the words “Very much” under the 7. Above this row of numbers is the question, “How much do you like chocolate?”

A second row of numbers 1 through 7 appears horizontally across the screen with the words “Not at all” under the 1 and the words “Very much” under the 7. Above this row of numbers is the question, “How much do you dislike chocolate?”

Audio Script:

Someone who likes chocolate would answer a 6 or 7 to the first question but a 1 or 2 to the second question. The problem, though, is that reverse-worded questions annoy people who are paying attention. If they get annoyed, they quit the survey, which is not what we want. So reverse score a small number of questions, but not too many.
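At analysis time, the reverse-worded item has to be recoded so that high numbers mean the same thing on every item. On a bounded scale the usual recode is (scale max + scale min) minus the raw score; a minimal sketch (the function name and defaults are my own, assuming the 1-7 scale shown above):

```python
def reverse_score(raw, scale_min=1, scale_max=7):
    """Recode a reverse-worded Likert item: on a 1-7 scale, 7 -> 1 and 1 -> 7,
    so high scores mean the same thing as on the normally worded items."""
    return scale_max + scale_min - raw

# A chocolate lover: "like chocolate?" = 7, "dislike chocolate?" = 1 or 2.
# After recoding, the reversed item agrees with the normal item.
assert reverse_score(1) == 7
assert reverse_score(2) == 6
assert reverse_score(7) == 1
```

An attentive chocolate lover's pair (7 on the normal item, 7 after recoding the reversed item) now lines up; a participant clicking the same number on both items stands out as inattentive.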

Once people answer a few questions they are likely to continue. Getting them to answer the first few questions is key, so make the first questions interesting. Don’t have the first question be “Have you ever engaged in unsafe sex?” or “Have you been put in jail?” Such questions are likely to scare participants away.

They think, if this is the first question, what might the other questions be about? If you put sensitive questions at the end, participants are often more likely to answer, as they think, “Well, I’ve already answered the other questions; I might as well answer this one as well.”

Don’t use an 8-point font; don’t use a red font on a green background.

Screen Visual:

Possible introduction

In this survey, I will be asking you questions about your thoughts and opinions on traveling. When the survey talks about traveling, it means going somewhere more than 100 miles away from where you live. It does not matter if you go there for work, pleasure or some other reason. There are no right or wrong answers to these questions. I am only interested in what you personally think. All of your responses will be kept confidential and only researchers working under my supervision will see your answers. Please answer the questions below using the scale provided. Write a number next to each question in the space provided to indicate your answer.

Audio Script:

Here is an example introduction (above).

The first sentence tells the reader what the survey is about.

The second sentence recognizes that traveling needs defining. We all know what traveling is but might have different definitions.

The third sentence defines travelling further.

The fourth sentence reduces participants' evaluation apprehension, the worry that they are being judged.

The fifth sentence lets participants know their responses will not be seen by others.

Screen Visual:

Possible introduction

In this survey, I will be asking you questions about your thoughts and opinions on traveling. When the survey talks about traveling, it means going somewhere more than 100 miles away from where you live. It does not matter if you go there for work, pleasure or some other reason. There are no right or wrong answers to these questions. I am only interested in what you personally think. All of your responses will be kept confidential and only researchers working under my supervision will see your answers. Please write two sentences about why you like to travel.

Audio Script:

If you are using an open-ended question, let participants know how much you want them to write.

A Note About Scale Scoring

 

 

Part 1 Week 09 Discussion 01: Question Order

Due: Post your response and reply to at least two of your classmates' posts. (I will provide the survey results and responses to the writer. The writer needs to complete the Discussion 1 post and 2 reply posts.)

 

Instructions

In the Survey Overview media piece in this week's instructional materials, you were asked to respond to two sets of questions showing images. In both sets, some images were the same, others were different, and the order of the images also differed. In this discussion, post your thoughts on what this exercise illustrated regarding the importance of question order.

 

Part 2 Week 09 Discussion 02: Open-Ended Questions

Due: Post your response and reply to at least two of your classmates' posts. (I will provide the survey results and responses to the writer. The writer needs to complete the Discussion 2 post and 2 reply posts.)

 

Instructions

Refer to the Survey Overview media piece in the instructional materials for this week, and your classmates' responses, by clicking on the Results button below the media piece. Look at all the responses that the class has generated for reasons why they like USC. There are five responses for each person in the class, so there are a lot of responses. However, with this number of responses it is just about possible to eyeball them – glance through them and see what is going on, what patterns there are, and how many people said similar things.

However, we want to impose numbers and talk about percentages to make these results in some ways more meaningful and easier to communicate to others. If we wanted to present these results to someone else (e.g., a reader of our report) we could just show them all the responses, but in many ways it is more helpful to the reader to say “X % indicated Y was a reason they liked USC.”

This means we need to look for patterns in the data. The approach we will use is a basic form of content analysis – a research method we will cover in more detail later in the course.

Post your answers to the following:

  1. In the class generated responses as to why people liked USC what patterns do you see? Did more than one person say the same thing or something similar?
  2. Identify three types of responses and work out how many people indicated that type of response, and what percentage that number reflects.

As you can see, coding the open-ended responses takes time. Also, consider how time-consuming this would be if there were 100 participants responding to the question, or 1,000, or 10,000, or even more. Such data sets are not uncommon.
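The tally-and-percentage step described above can be sketched in a few lines of code. This is only an illustration; the code labels and counts below are invented, not drawn from the actual class responses:

```python
from collections import Counter

# Hypothetical codes assigned to open-ended "why I like USC" responses
# during content analysis; each string is one coded response.
coded_responses = [
    "academics", "location", "sports", "academics", "location",
    "alumni network", "academics", "location", "sports", "academics",
]

counts = Counter(coded_responses)  # tally how often each code appears
total = len(coded_responses)

# Report each code as "X% indicated Y was a reason they liked USC."
for code, n in counts.most_common():
    print(f"{n / total:.0%} indicated '{code}' as a reason they liked USC ({n} of {total})")
```

With larger data sets the coding itself (deciding which code each response gets) remains the slow, human part; the counting and percentaging is trivial once the codes exist.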

 

Part 3 Week 09 Survey Paper Preparation


 

Instructions

Similar to the focus group module, you will work in your teams to collect the data for the Survey Paper. Unlike the Focus Group Paper, for this paper you will also work with your team to design your survey (for the FG module, you designed your own FG moderator guide).

You can refer to the details for preparing for the Survey Paper here: Survey Paper Preparation Information (PDF) – see below

 
