Teaching Students to Use iPads for Survey Data Collection (2 of 2)

In my last post, I wrote about a Comm Research project where students use iPads for survey data collection. This is my favorite of the three projects we do in my Communication Research class (see all posts on Comm 435; see syllabus).

This week, I want to follow up by discussing how to program the surveys to work on the iPads. I’ll talk through how I teach all of this in class and through activities.

Lastly, I’ll explain how I prepare the data for use in SPSS.

Once students have created their surveys, we need to get them onto ONA.io.

Programming surveys to work on ONA.io – the free, open-source tool used by my class and researchers around the world – is a little tricky at first. Surveys follow the XLSForm format, and once you get the hang of it, it is super easy. It is also quick to teach and learn.

In class, I walk through an online Lab Guide (http://bit.ly/435_lab_digitalsurvey) that I created on how to program XLSForms. I then give students a practice activity: create a survey in Excel or Google Spreadsheets. The activity asks students to create:

1) A question asking how many years they have been in school

2) A check-all-that-apply question – I usually pick something fun like their favorite movies from a list

3) A Likert-style question. Ex: How much do they like binge-watching on Netflix?

In sum, they practice creating an integer, select_multiple, and select_one question.
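For reference, here's a minimal sketch of what those three practice questions might look like in XLSForm format. The specific question names, list names, and answer choices below are my own illustration, not taken from the lab guide; see the guide for the full conventions.

```
survey sheet:
type                    name        label
integer                 years       How many years have you been in school?
select_multiple movies  fav_movies  Which of these movies do you like? (Check all that apply)
select_one netflix      binge       How much do you like binge-watching on Netflix?

choices sheet:
list_name  name  label
movies     1     The Princess Bride
movies     2     The Avengers
netflix    1     Not at all
netflix    2     Somewhat
netflix    3     A lot
```

Note how list_name ties each select question to its set of answer choices.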

Once students get the hang of it, they log into an ONA.io account I create for the class. Next, they upload their practice survey to test in class using our department’s iPads. But, this could be done on a phone or even a computer itself (Instructions on how to do this are in the lab guide).

The most important thing is that everything must follow the formatting exactly. Small errors, like using a space instead of an underscore in "list_name", will result in ONA.io kicking the survey back and telling you there is an error. If a mistake is made, no problem: just fix your form and re-upload.

I check to make sure everything is done correctly. This saves time when they program their own surveys. If everything is good, I give students lab time to work on formatting their surveys and help out as needed.

After everything has been uploaded successfully – this usually takes time outside of class, so I make it due the following class – students are ready to go out into the field. This is where the fun happens!

Students always get great feedback when they use iPads to collect survey data. People tend to be interested in what they’re doing and happy to participate. Some students this year told me that people came up to them around campus and asked if they could participate. That is much different than the usual online survey where we often struggle to get respondents! I can’t express how rewarding it is to see students go out into the field, collect data, and come back having gathered data no one else has before. For most of them, this is their first time doing data collection of any kind. And so while the class is tough and a lot of work, it is rewarding. You can see the ‘aha’ moments the students have when they start drawing inferences from their data.

Preparing Data for Analysis in SPSS

If you only want to look at summaries of responses, you can check those out in ONA.io. But if you want to analyze the data, you've got to convert the labels students used into the numbers SPSS expects.

For example, suppose a question asks participants their favorite ice cream flavor, and the 'choices' sheet in our XLSForm names the options icecream1, icecream2, and so on, with "Vanilla" as the label for icecream2. If a participant answers "Vanilla," the data collected would be icecream2.

But SPSS can't analyze "icecream2." It can only analyze a number. So we need every instance where a participant selected Vanilla to be recorded as simply "2" in SPSS.

Here’s how to quickly do this:

Download the Excel file of the completed surveys and open it in Excel. Use Find & Replace to replace "icecream" with "" (that is, with nothing – no spaces; just leave the replace field blank). Excel removes "icecream" from the file and you're left with the number for each response, so "icecream2" becomes "2". Repeat this step for each question.

For check-all-that-apply questions, ONA.io records "FALSE" for answer choices left blank and "TRUE" when the participant checked the answer choice. For example, if the question was "Check all your favorite ice cream flavors" and the participant checked "Vanilla," ONA would record "TRUE"; if they left it blank, ONA would record "FALSE." These can be easily prepared for SPSS by replacing FALSE with "0" and TRUE with "1".
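If you'd rather script this cleanup than repeat Find & Replace by hand, here's a minimal Python sketch of the same recoding. The "icecream" prefix mirrors the example above; adjust the prefixes for your own question names.

```python
import csv
import io

def clean_value(value, prefixes=("icecream",)):
    """Recode one ONA.io cell into an SPSS-friendly number string."""
    if value == "TRUE":        # checked box on a check-all-that-apply question
        return "1"
    if value == "FALSE":       # unchecked box
        return "0"
    for prefix in prefixes:    # e.g. "icecream2" -> "2"
        if value.startswith(prefix):
            return value[len(prefix):]
    return value

def clean_rows(csv_text, prefixes=("icecream",)):
    """Clean every data row of a CSV export, leaving the header intact."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, data = rows[0], rows[1:]
    return [header] + [[clean_value(v, prefixes) for v in row] for row in data]
```

Export the data from ONA.io as CSV, run it through clean_rows, and the result is ready to paste into SPSS.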

Admittedly, this step is the drawback of using XLS forms. While a little tedious, it is quick and easy to do. Considering the advantages, I don’t mind taking 20 minutes of my time cleaning the data for my students.

When done, I send the student teams their data and we work on analyzing them in class.

 

Well that’s all for now! I hope you enjoyed this tutorial and consider using iPads for survey data collection in your research class, or other classes where surveys could prove valuable!

Here at Shepherd, finals week starts this week. I hope everyone has a great end to the semester!

Using iPads for Survey Data Collection in the Communication Research Class

Surveys are a common method used in communication research class projects. Since I started teaching this class at Shepherd University, I've added a fun, cool feature that really brings the survey data collection process to life!

Students in my Comm 435 Communication Research class (see all posts on Comm 435; see syllabus) now use iPads for data collection in the field. My students grab a department iPad and go around campus recruiting participants. The participants complete the surveys on the iPads, and the data is synced to the cloud, where it can be downloaded and analyzed.


Overview

For the final of three hands-on projects in my class, student teams identify a problem or question they have pertaining to Shepherd University or the local community. They design a study to research that problem. In my first two hands-on projects, students don’t design the methods or the measurements. They are based on scenarios I set up and materials I provide. For example, here’s a discussion of my computer-assisted content analysis assignment.

As part of the assignment for today's post, students are required to conduct 1) surveys and 2) either focus groups or interviews. Let's talk about the surveys:

After discussing surveys as a method, with a particular focus on survey design and considerations, each team designs a brief survey.

Before they create the survey, I lecture on important considerations in survey design, and students do an in-class activity to practice putting these concepts into motion using a mock scenario. I then provide feedback on their survey design and help them make improvements.

Our next class meeting is dedicated to helping students design measurements that meet the research objective and research questions they've developed – measurements that will get them the answers they want. The day is also dedicated to helping them write effective survey questions (as well as interview or focus group questions, for that part of the assignment). I started dedicating an entire class period to measurement design after spotting it as a major weakness in last semester's projects.

Next, rather than using paper and pen or surveymonkey.com (which limits students to only 10 questions), teams program their surveys into ONA.io, a free, open-access web survey tool designed by folks at Columbia University. So we spend the third day learning how to use ONA.io to program the surveys. I'll talk in detail about that in the next post.

During data collection week, students check out department iPads, load the survey onto their iPad, and go out into the field to collect data. A group of students will check out several iPads and hit up the student union, library, or campus quads and collect data fairly quickly. The data syncs automatically over our campus-wide wifi! That means when all students get back to the computer lab, their data – from each iPad used – is already synced to ONA.io, where it can be downloaded and analyzed.

Pretty cool, huh? It is my favorite project that we do in my communication research class and the students seem to really enjoy using the iPads for surveys.

There are a few caveats.

  1. After the data is collected, it has to be cleaned before it can be analyzed in SPSS. You'll notice that the data you get doesn't quite fit the format SPSS needs. So I spend a few hours before we meet as a class cleaning the collected data so it is ready to analyze.
  2. This year, Formhub.org has been moving painfully slow. I had trouble last week getting the website to work, and I'm still having trouble this week. With data collection set to start tomorrow, I'm stressing that it may not work! – Update: I've read in several places about ongoing stability issues with Formhub. I'm now using ONA.io instead, which works the exact same way! I've updated the verbiage above to reflect that.

I’ve provided a copy of the assignment below. Enjoy!

In my next post, I will provide info on programming surveys in the XLSForm format, which is a bit tricky. I spend a day in class teaching this. I'll also show you how to load the surveys onto the iPads and get them synced up to the computer if you aren't on WiFi when you collect the data.

photo: CC by Sean MacEntee

Syllabi Spring 2015: Communication Research and Writing Across Platforms classes

The semester is underway!

I have shared select syllabi every semester since I started this blog. A lot of people contact me asking me for my syllabi and for class assignments. And thus I am happy to continue the trend. You can find all past syllabi from the menu on the left! I’m so glad that folks enjoy these and find them useful!

This semester, two classes I will discuss are my Writing Across Platforms and my applied Communication Research class. I’ve talked about assignments, activities, and perspectives on both in the past (see posts about them under the menu on the left. Blog Topics->Teaching Social Media->Classes).

I have not changed either class all that much since I last taught them, so I'll spare you an in-depth review of each. But here are a few changes and little things worth mentioning:

Writing Across Platforms

– Facebook – In the last post I wrote about my decision to continue to teach Facebook in this class.

– Mobile – This is a packed class and it is hard to add without taking something else away. I’ve squeezed in a little time to focus on mobile and writing for mobile. I have an exercise planned where I will bring in our department iPads and have students explore the look and feel of their writing as read from mobile devices. As more and more people rely on mobile devices to read, it is important that we emphasize the medium, its affordances, and its limitations.

– Concise Writing – I am placing more emphasis on conciseness in writing. This is something we all struggle with; I know I do. While it has always been important, shorter attention spans, mobile and digital platforms, and the high-stakes competition for reader attention necessitate saying more with less. We'll do exercises where students help one another find the shortest, most powerful way to communicate. There's also a fun website I am incorporating that can help with writing. I will give it its own post in the future.

– PitchEngine – I used PitchEngine the past two years for my social news release. I haven’t blogged about PitchEngine much. But I’ll be sure to do so this semester. I always try to bring in industry software when possible. And the awesome people at PitchEngine have been very helpful. I’m excited we’ll be using PitchEngine again this year. PitchEngine has undergone exciting changes since last year. And I’ll be adapting my social news release assignment accordingly. Note: Dr. Gallicano and Dr. Sweetser have a great guideline for teaching the social media release.

Communication Research

I made some minor tweaks and improvements to how I’ll present content, and streamlined a few assignments. In a tough class like this, I provide a lot of handouts – such as for how to structure a literature review, methods, results, and discussion section. I worked hard to simplify and clarify those.

I’ve been chatting with colleagues about changes and advancement in social data analysis. I’m hoping to incorporate them into this class in a future semester. To do so, I will need time to dedicate to exploring these options this semester. Thus, I’m presently sticking with my same 3-project model I wrote about last year. Hopefully I’ll have a brand new social media analysis assignment for Spring 2016.

This semester I promise to do something I failed to do last year – blog about our final project in the Comm Research class where students use iPads to collect survey data around campus. I love this project and hope you do too.

Below are the syllabi. A happy start to the semester to all! – Matt

Writing Across Platforms:

Communication Research

Sentiment Analysis using Content Analysis Software: Project Assignment

In the last two posts, I’ve been discussing the Yoshikoder sentiment analysis project in my Communication Research class here at Shepherd University.

My first post looked at the project in general. And the second, most recent post, looked at how to teach computer-assisted content analysis using the Yoshikoder computer-assisted content analysis software and the activities I provide my students to prepare them for the project.

I encourage you to check out those posts for background and set up! Ok, now on to sharing the assignment itself and providing a brief overview of it.

As I’ve stated elsewhere, the purpose of this assignment is to

1) Give students a hands-on look under the hood of sentiment analysis – that is, to understand HOW it works and its flaws.

2) Teach students, via hands-on experience, about quantitative content analysis, particularly computer-assisted content analysis.

3) Teach them how to conduct a computer-assisted content analysis using software (Yoshikoder).

So here’s the set up to the assignment (which you can see below). This hands-on learning project is based on a real brand and a realistic but made up scenario. I do this with both this assignment, and my first project in this class.  Specifically, I provide The Situation or Problem / Campaign goals and objectives (of an imaginary campaign that is ongoing or happened) / benchmarks / KPIs.

In this case, the situation had to do with a popular online retail brand and rising customer complaints and dissatisfaction as the brand has grown beyond its core base of loyal customers in recent years. I've redacted the brand and the situation from the below assignment, but you can fill in your own.

I rely on Stacks' (2011) model for writing the problem, goals, and objectives. While I provide the research objective(s) in my first project, in this project students must come up with the research objective(s) and RQ(s) themselves.

I then provide some benchmarks. In this scenario, at a certain point in time sentiment was strong (let's say, 70% positive). Then, after the hypothetical situation, it dropped (say, to 50%). The students have recently been introduced to the concepts of benchmarks and KPIs via a brief lecture, so this is their first experience applying them. They are given one KPI (let's say 65% positive sentiment) against which to measure their success. Keep in mind that the scenario assumes a campaign already took place aimed at addressing decreased customer satisfaction and negative comments on Twitter directed at the brand. We are now seeking to assess whether that campaign successfully increased sentiment toward the brand (and, at a deeper level, repaired relationships and the brand's image among the online community).

There are other important considerations students must make:

1) Since we've discussed sentiment and its flaws, they need to think about the valence of sentiment (the AFINN dictionary scores terms from -5 to +5), and they need to research and understand how AFINN was designed and works (I provide some sources to get them started). If you're not familiar with the AFINN dictionary, it was designed for sentiment analysis of microblogs. It is a free sentiment dictionary of terms you can download and use in Yoshikoder.

For more details on the assignment, check out the assignment embedded below and the requirements for what must be turned in.

As I've noted in a previous post, this project isn't perfect. But it is a fairly straightforward and accessible learning experience for students in their first semester of seeing how research can be conducted. It covers a wide array of experiences and learning opportunities – from discussing what sentiment is, to understanding its flaws, to understanding the flaws of quantitative content analysis, to learning to apply a number of key research terms, to gaining exposure to writing research reports. The project is bolstered by several lectures, comes about halfway through the semester, and takes several days of hands-on learning in the classroom. Students of course finish the write-up outside of class, but we do the analysis all in class to ensure students are getting my help as the "guide on the side."

My previous post covers some activities we do to build up to this assignment.

So that's all for now! Please feel free to use this assignment, modify it, and improve it. If you do, come back and share how you modified or improved it in the comments below!

If you want to know more about my Communication Research class, please see this post which includes the syllabus.

Teaching Computer-Assisted Content Analysis with Yoshikoder

Last blog post I discussed the second project in my applied research class, a sentiment analysis of Tweets using Yoshikoder – a free computer-assisted content analysis program from Harvard.

As promised, I want to share my assignment, and my handout for students that teaches them how to use Yoshikoder. Before we do the project, however, I do a brief in class activity to get students learning how to use Yoshikoder. So let’s start there for today’s post. And next post, I’ll share the assignment itself.

PART 1: THE SET UP

What I like to do is present the problem to the students via the project assignment. Then we go back and start learning what we'd need to do to solve the problem. So, after lecturing about what sentiment analysis is and why it is important, I first introduce students to the idea of constructing a coding sheet for keywords by taking a list of keywords and adding them to categories.

First, we talk about the idea in class, and I show them some simple examples, like: If I wanted to code a sample for the presence of “sunshine” – what words would I need? Students brainstorm things like  start, sun, sunny, sunshine, etc., etc.

We discuss the importance of mutual exclusivity, being exhaustive, etc.

I show an example from my dissertation which looked at agenda setting topics on Twitter.

On the class day before I introduce Yoshikoder, students do a practice assignment where I give them a list of random terms related to politics and elections. They then have to create "positive" and "negative" content categories using the terms. The terms aren't necessarily well suited for this exercise, which gets them thinking a bit… They then hand code a sample of Tweets I provide about two different politicians. I tend to use the most recent election – so, in this case, Obama and Romney. They are frustrated by having to hand code these Tweets, but a little trick is to search for the exact phrases in the Tweet files on the computer, and then they are done fairly quickly. Ok, so on to the next class period:

1) Practice with Yoshikoder. We do the same basic task, but this time students learn to program their "positive" and "negative" categories into Yoshikoder. They then load the Tweets (which I have saved as a .txt file) and analyze them for the presence of their positive and negative content categories. This is a great point to stop and have students assess the reliability between what they hand coded and what the computer coded. Often there will be discrepancies, and this makes for a great opportunity for discussion.
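For instructors curious what this dictionary-coding step looks like under the hood, it can be sketched in a few lines of Python. The category terms here are illustrative stand-ins, not my actual class lists, and this is a simplification of what Yoshikoder does rather than its actual implementation:

```python
# Illustrative content categories (stand-ins for the class's term lists).
POSITIVE = {"win", "great", "hope", "strong"}
NEGATIVE = {"lose", "bad", "fear", "weak"}

def code_tweet(tweet):
    """Count how many words in a tweet fall into each content category."""
    words = tweet.lower().split()
    return {
        "positive": sum(w in POSITIVE for w in words),
        "negative": sum(w in NEGATIVE for w in words),
    }
```

Running each tweet through code_tweet and comparing the totals against the students' hand codes makes the reliability discussion concrete.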

Here is the activity that I use in class. I also provide Tweets that I’ve downloaded using the search terms for the politician/candidate I’m using in the activity (e.g., Obama; Romney) in plain text format so Yoshikoder can read it. Also, see the below handout which I provide students to show them how to use Yoshikoder and how to program, and run the analyses I just described.

As I mentioned above, I create a handout that I like to give students that explains the different functionalities of Yoshikoder and how to run the analyses. As I’ve discussed elsewhere, I like to provide handouts. And the one below isn’t one of my more elaborate handouts. But it provides a quick overview with some screen shots to show what buttons need to be clicked. This is super helpful if you are trying to learn Yoshikoder, or want to use it alongside the activity (discussed in this post or the project discussed in my last post, and which I will provide in my next blog post).


Enjoy!

EDIT: The assignment is now up. See the post.

If you’d like to learn more about using Yoshikoder, I found this great tutorial:

– Cheers! Matt

Applied Research Class: Sentiment Analysis Project Reflection

I began this semester with the intention of blogging a bit about my applied research class. I provided an overview of it and a copy of the syllabus on an earlier post. But since writing that post, I’ve yet to do a follow up… until now.

Edit: There are 2 follow up posts to this post. 1) Looks at activities for this assignment, and 2) provides the assignment itself.

First, let me say that more and more I am trying to decrease my lecturing and spend more time in class on hands-on learning, having my students learn by doing rather than just listening – sort of like the flipped classroom Gary Schirr has been discussing recently on his blog. So this class really pushes in-class projects and experiential learning. Following this approach, in order to introduce students to research, I provided students with the instructions and a lot of structure for their first two projects.

I want to use our second research project as an example. Then, I’ll talk about the pros and cons. The second project was a sentiment analysis of Tweets about a brand I chose and a (realistic but not necessarily real) scenario.

My goals with this project were to teach students:

  1. About computer-assisted content analysis: how it differs from a hand-coded quantitative content analysis (the focus of our first project), and its strengths and weaknesses.
  2. How to do a basic computer-assisted content analysis using Yoshikoder, an easy-to-use, free app that works on Mac and PC, so my students can use it at home if needed!
  3. About sentiment analysis: what it is, why organizations use it to evaluate the online conversation about their brand, and its strengths and weaknesses.
  4. How to write up a research report. (In the first project, I provided the project overview and requested results and discussion sections. In the second project, I added a literature review and methods section, and had them write the research objective and research question.)

Why I chose to do this project this way: A number of social media analytics tools today are offering sentiment analysis.  There are also sites like socialmention.com that will provide you with a free sentiment analysis of a search term. But how are these analyses conducted? What are their strengths and weaknesses? Are they reliable? Do they mean anything at all? And what do we need to be careful of before accepting them, and thus drawing inferences from them?

So what I wanted my students to do was to SEE how a sentiment analysis would be conducted by some of those high-priced (or no-price!) analytics tools. In other words, I want my students to get their hands dirty as opposed to letting some distant and hidden algorithm do the analysis for them. I believe gaining hands-on experience with this project gives students a more critical lens through which to see and evaluate a sentiment analysis of social media messages.

The Set Up: I provide in the assignment the Situation or Problem, the campaign goals and objectives (of an imaginary campaign that is ongoing or has happened), benchmarks, and KPIs. In this case, the situation had to do with a popular online retail brand and rising customer complaints and dissatisfaction as the brand has grown beyond its core base of loyal customers in recent years.

I provide students with a sample of about 1,000 Tweets that I downloaded and formatted to play nicely with Yoshikoder. The sample consists of mentions of the brand. This ensures students are all looking at the same dataset and streamlines (or rather, eliminates) the data collection process so students can focus on other elements of the assignment.

For the sentiment analysis, I rely on the AFINN dictionary, which was designed for sentiment analysis of microblogs. Students learn what AFINN is and a little about how linguistic analysis dictionaries are created through research. Students then analyze the Twitter dataset using the AFINN dictionary to determine sentiment scores. There are no fancy stats being done here. By checking the sentiment analysis output, they simply determine whether their KPI (a percentage of positive Tweets about the brand) was met. In this case, the result they are looking for is a percentage, so it's simple division. Not scary at all, and no SPSS training needed (that comes with a later project).
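That simple division can be sketched in Python with a toy AFINN-style scoring, where each term carries a valence from -5 to +5. The few term scores below are illustrative; the real AFINN lexicon contains thousands of scored terms:

```python
# A toy AFINN-style dictionary: a handful of terms scored from -5 to +5.
# (Illustrative only; the real AFINN lexicon has thousands of entries.)
AFINN_SAMPLE = {"love": 3, "great": 3, "awful": -3, "terrible": -3}

def tweet_score(tweet):
    """Sum the valence of every dictionary term appearing in the tweet."""
    return sum(AFINN_SAMPLE.get(w, 0) for w in tweet.lower().split())

def percent_positive(tweets):
    """Percentage of tweets whose total valence is above zero."""
    positive = sum(tweet_score(t) > 0 for t in tweets)
    return 100 * positive / len(tweets)
```

Checking the KPI is then just comparing percent_positive(sample) against the target (say, 65).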

They also look at the valence of the sentiment (with a range of -5 to +5) and explore its meaning. The students use this information, along with class lecture, other exercises on how to write research reports, etc., to produce their Project #2 report.

Again, to reiterate an important point, we discuss the benefits of this analysis as well as its real weaknesses. Students always bring up the fact that the results lack context – what if someone used the word "bad" to mean good? What about sarcasm? I show them how to use Yoshikoder to look at keywords in context as a way of addressing this.
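A keyword-in-context view is simple to mimic, too. Here's a rough Python sketch of the idea (not Yoshikoder's actual implementation):

```python
def kwic(text, keyword, window=3):
    """List each hit of `keyword` with `window` words of context on either side."""
    words = text.lower().split()
    hits = []
    for i, w in enumerate(words):
        if w == keyword:
            left = " ".join(words[max(0, i - window):i])
            right = " ".join(words[i + 1:i + 1 + window])
            hits.append((left, keyword, right))
    return hits
```

Scanning the context around each "bad" or "terrible" is what lets students spot sarcasm or negation the raw counts miss.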

The Benefits and Drawbacks of This (and These Types of) Projects

As I said above, I am really trying to move away from lecture in favor of experiential learning. Here are some things I've noticed. Some may be benefits, others drawbacks, and others a bit of both…

  • The focus on this project is not on the stats or the analysis and I provide a lot of the needed information – so it makes for a good ‘getting your feet wet’ project that teaches students other important elements of research.
  • It would be nice to teach them more advanced methods of analysis – but I do cover that a bit more later in the semester.
  • Students learn through their mistakes and from my feedback as opposed to me paving the way for them and simply asking them to drive down the smooth road.
  • I provide a LOT of handouts on how to write different sections of a research report, etc. They are detailed… sometimes too detailed and I fear students don’t read them because it is information shock.
  • Sometimes, I wish I had more time to teach them how to avoid the simple mistakes I see in their work, particularly their research reports. I say to myself, “oh man, I thought I told them how to do that.” Or, “Why didn’t you read the handout that explains how to structure this!?”
  • They likely won't do sentiment analysis like this ever again – but at least they'll understand it!
  • They get to see the results for themselves and get a sense that they discovered the results.
  • Class time is busy – our class rushes by and we don’t always get to cover everything I want to. As a person who likes order and time management, I am having to “let go a little” and let things happen. This is helping me grow. I wonder if it is helping my students though…
  • I know I enjoy doing these sorts of projects a lot more than standing and lecturing, lecturing, lecturing about research. I feel it has made research a lot more “real” and hands on to them.

So that is my overview of the project in general, and some thoughts. It isn’t perfect but it seems to have gone well and I really enjoyed doing it. I’d love any feedback or suggestions you may have to make this the best possible experience for my students. And of course, feel free to adapt, modify, or improve upon this idea.

In an upcoming post(s), I’ll share the assignments (I want to move my documents over to SlideShare due to the pay wall on Scribd). And I will provide some basic info on how to use the Yoshikoder software.

Cheers! -Matt

Just a reminder: There are 2 follow up posts to this post. 1) Looks at activities for this assignment, and 2) provides the assignment itself.

photo CC by netzkobold

Here Are My Spring 2014 Syllabi: Writing and Research

The snow is coming down here in West Virginia! Classes are canceled today so I will be catching up on research and some other things. But let’s talk classes and syllabi!

In addition to the applied Communication Research class I am teaching this semester (discussed in the previous post) I’m also teaching a few other classes. 🙂 I want to quickly share some of my syllabi for the semester. I’ve uploaded syllabi for these classes to my Scribd account, which is where I host past syllabi and class assignments. Click the link below to see the syllabus. (You can also see all the below-described syllabi as well as past syllabi via the menu on the left, by mousing over “syllabi.”)

Comm 435: Communication Research – This class is discussed in depth in my previous post. Please read it to learn more about that class.

Comm 335: Writing Across Platforms – Changes from Fall '13 include: a lab day for greater access to press release examples and for working with peers on the first press release assignment; re-organized and updated social media and blog writing assignments; and a few lectures shifted around to deliver material more effectively, plus other minor changes to keep content up to date. I'm also super excited that for our PitchEngine assignment this semester, all of our students will be temporarily upgraded from the free version of PitchEngine to the paid level thanks to the awesome people at PitchEngine! So students will get experience with advanced functionality.

Hope you find these new syllabi helpful! If you share your syllabi online, please share in the comments below!