Hootsuite University and social media education research to be presented at #AEJMC15

#AEJMC15 is just around the corner! This year I am truly thrilled to be traveling to San Francisco to co-present a study about social media education in the college classroom.


Our study, titled “Hootsuite University: Equipping Academics and Future PR Professionals for Social Media Success”, investigated student, faculty, and professional perceptions of Hootsuite University, a social media certification program, when used as part of a college social media course (I’ve written a bit about my own use of Hootsuite University in my social media class in the past).

The paper will be presented at the Top Teaching Papers session @ 9:15am, Sunday August 9 in Salon 15 (Conference program).

On this project, I had the pleasure of working with some truly awesome social media professors (Emily Kinsky, Karen Freberg, Carolyn Mae Kim, and William Ward). If you do not follow these folks, I strongly recommend it. They are great educators and inspiring resources for social media education.

Come see our presentation to learn more about our study and our findings. Tweet at me @mjkushin and please come say hello in person. I always love to meet friends and colleagues from the web.

Also, this year I’m excited to have been recruited to join the Public Relations Division Social Media Team. I’ve always loved the social media sharing the PRD does and their yearly coverage of the AEJMC conference leads the field. I’m looking forward to meeting the fellow team members and helping plan some great content for the upcoming year.

Hope to see you @ #AEJMC15!

-Matt

photo: CC

Teaching Students to Use iPads for Survey Data Collection (2 of 2)

In my last post, I wrote about a Comm Research project where students use iPads for survey data collection. This is my favorite of the three projects we do in my Communication Research class (see all posts on Comm 435; see syllabus).

This week, I want to follow up by discussing how to program the surveys to work on the iPads. I’ll talk through how I teach all of this in class and through activities.

Lastly, I’ll explain how I prepare the data for use in SPSS.

Once students have created their surveys, we need to get them onto ONA.io.

Programming surveys to work on ONA.io – the free, open-source tool used by my class and researchers around the world – is a little tricky at first. It follows XLS formatting. But once you get the hang of it, it is super easy, and it is quick to teach and learn.

In class, I go over an online Lab Guide (http://bit.ly/435_lab_digitalsurvey) that I created on how to program XLS forms. I then provide students with a practice activity to create a survey in Excel or Google Spreadsheets. The activity asks students to create:

1) A question asking how many years they have been in school

2) A check-all-that-apply question – I usually pick something fun, like their favorite movies from a list

3) A Likert-style question. Ex: How much do they like binge-watching on Netflix?

In sum, they practice creating an integer, select_multiple, and select_one question.
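To make the formatting concrete, here is a rough sketch of what the practice survey’s two XLSForm sheets contain, written out in Python just to show the column layout. The question names, choice lists, and labels below are my own stand-ins, not the actual class materials:

```python
import csv

# Rows of the "survey" sheet: one row per question.
# The type column is where integer / select_one / select_multiple live;
# select_* types also name the choice list they draw from.
survey_rows = [
    ("integer", "years_in_school", "How many years have you been in school?"),
    ("select_multiple movies", "fav_movies", "Which movies do you like? Check all that apply."),
    ("select_one binge", "binge_netflix", "How much do you like binge-watching on Netflix?"),
]

# Rows of the "choices" sheet. Note the underscore in "list_name" --
# a space here is exactly the kind of small error ONA.io rejects.
choices_rows = [
    ("movies", "movie1", "The Princess Bride"),
    ("movies", "movie2", "Back to the Future"),
    ("binge", "1", "Not at all"),
    ("binge", "2", "Somewhat"),
    ("binge", "3", "A lot"),
]

# Write each sheet to CSV just to show the layout; in class the same
# rows live on two tabs of one Excel or Google Sheets workbook.
with open("survey.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["type", "name", "label"])
    writer.writerows(survey_rows)

with open("choices.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["list_name", "name", "label"])
    writer.writerows(choices_rows)
```

The key thing to internalize from the layout is that every select_one or select_multiple question on the survey sheet must point at a list_name that actually exists on the choices sheet.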

Once students get the hang of it, they log into an ONA.io account I create for the class. Next, they upload their practice survey to test in class using our department’s iPads. But this could also be done on a phone or even a computer (instructions on how to do this are in the lab guide).

The #1 thing is that the formatting has to be exact. Little errors, like putting a space instead of an underscore in “list_name”, will result in ONA.io kicking the survey back and telling you there is an error. If a mistake is made, no problem. Just fix your form and re-upload.

I check to make sure everything is done correctly. This saves time when they program their own surveys. If everything is good, I give students lab time to work on formatting their surveys and help out as needed.

After everything has been uploaded successfully – this usually takes time outside of class, so I make it due the following class – students are ready to go out into the field. This is where the fun happens!

Students always get great feedback when they use iPads to collect survey data. People tend to be interested in what they’re doing and happy to participate. Some students this year told me that people came up to them around campus and asked if they could participate. That is much different from the usual online survey, where we often struggle to get respondents! I can’t express how rewarding it is to see students go out into the field and come back having gathered data no one has before. For most of them, this is their first time doing data collection of any kind. And so while the class is tough and a lot of work, it is rewarding. You can see the ‘aha’ moments the students have when they start drawing inferences from their data.

Preparing Data for Analysis in SPSS

If you only want to look at summaries of responses, you can check those out in ONA.io. But if you want to analyze the data, you’ve got to convert it from the labels students created into the numbers SPSS needs.

For example, suppose a question asks participants their favorite ice cream, and in the ‘choices’ sheet of our XLS form the answer “Vanilla” is named icecream2. If a participant answers “Vanilla,” the data collected would be icecream2.

But SPSS can’t analyze “icecream2.” It can only analyze a number. So we need every instance when a participant selected Vanilla to be recorded as simply “2” in SPSS.

Here’s how to quickly do this:

  1. Download the Excel file of the completed surveys and open it in Excel.
  2. Replace “icecream” with “” (that is, with nothing – no spaces; just leave the replace field blank). Excel will remove “icecream” from the file, leaving you with the number for each response, so that “icecream2” becomes “2”.
  3. Repeat this step for each question.
  4. For check-all-that-apply questions, ONA.io records “FALSE” for answer choices left blank and “TRUE” when the participant checked the choice. For example, if the question was “Check all your favorite ice cream flavors” and the participant checked “Vanilla,” ONA would record “TRUE”; if they left it blank, “FALSE.” These can easily be prepared for SPSS by replacing FALSE with “0” and TRUE with “1”.
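For anyone who prefers to script this instead of doing repeated find-and-replace passes, here is a minimal Python sketch of the same recoding steps. The column name and “icecream” prefix are just illustrations:

```python
# Minimal sketch of the same cleanup done in code instead of Excel
# find-and-replace. Column and prefix names are illustrative.

def strip_prefix(value, prefix):
    """Turn a coded label like 'icecream2' into the number 2 for SPSS."""
    if isinstance(value, str) and value.startswith(prefix):
        return int(value[len(prefix):])
    return value

def truefalse_to_num(value):
    """Recode check-all-that-apply cells: TRUE -> 1, FALSE -> 0."""
    return {"TRUE": 1, "FALSE": 0}.get(value, value)

# Two pretend survey responses, shaped like the downloaded spreadsheet.
rows = [
    {"icecream": "icecream2", "likes_vanilla": "TRUE"},
    {"icecream": "icecream1", "likes_vanilla": "FALSE"},
]

for row in rows:
    row["icecream"] = strip_prefix(row["icecream"], "icecream")
    row["likes_vanilla"] = truefalse_to_num(row["likes_vanilla"])

print(rows)  # [{'icecream': 2, 'likes_vanilla': 1}, {'icecream': 1, 'likes_vanilla': 0}]
```

Either way, the end result is the same: a spreadsheet of plain numbers that SPSS can read.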

Admittedly, this step is the drawback of using XLS forms. While a little tedious, it is quick and easy to do. Considering the advantages, I don’t mind taking 20 minutes of my time cleaning the data for my students.

When done, I send the student teams their data and we work on analyzing them in class.

 

Well that’s all for now! I hope you enjoyed this tutorial and consider using iPads for survey data collection in your research class, or other classes where surveys could prove valuable!

Here at Shepherd, finals week starts this week. I hope everyone has a great end to the semester!

Using iPads for Survey Data Collection in the Communication Research Class

Surveys are a common method used in communication research class projects. Since I started teaching this class at Shepherd University, I’ve added a fun, cool feature that really brings the survey data collection process to life!

Students in my Comm 435 Communication Research class (see all posts on Comm 435; see syllabus) now use iPads for data collection in the field. My students grab a department iPad and go around campus to recruit participants. The participants complete the surveys on the iPads, and the data is synced to the cloud, where it can be downloaded and analyzed.


Overview

For the final of three hands-on projects in my class, student teams identify a problem or question they have pertaining to Shepherd University or the local community. They design a study to research that problem. In my first two hands-on projects, students don’t design the methods or the measurements. They are based on scenarios I set up and materials I provide. For example, here’s a discussion of my computer-assisted content analysis assignment.

In the assignment discussed in today’s post, students are required to conduct 1) surveys, and 2) either focus groups or interviews. Let’s talk about the surveys:

After discussing surveys as a method, with a particular focus on survey design and considerations, each team designs a brief survey.

Before they create the survey, I lecture on important considerations in survey design. Then students do an in-class activity to practice putting these concepts into motion using a mock scenario. I provide feedback on their survey designs and help them make improvements.

Our next class meeting is dedicated to helping students design measurements that meet the research objective and research questions they’ve developed. The day is also dedicated to helping them write effective survey questions (as well as interview or focus group questions, for that part of the assignment). I started dedicating an entire class period to measurement design after spotting it as a major weakness in last semester’s projects.

Next, rather than using pen and paper or surveymonkey.com (which limits students to only 10 questions), teams program their surveys into ONA.io, a free, open-access web survey tool designed by folks at Columbia University. So we spend the third day learning how to use ONA.io to program the surveys. I’ll talk about that in detail in the next post.

During data collection week, students check out department iPads, load the survey onto their iPad, and go out into the field to collect data. A group of students will check out several iPads, hit up the student union, library, or campus quads, and collect data fairly quickly. The data syncs automatically over our campus-wide wifi! That means that when all students get back to the computer lab, their data – from each iPad used – is already synced to ONA.io, where it can be downloaded and analyzed.

Pretty cool, huh? It is my favorite project that we do in my communication research class and the students seem to really enjoy using the iPads for surveys.

There are a few caveats.

  1. After the data is collected, it has to be cleaned before it can be analyzed in SPSS. If you use ONA.io, you’ll notice that the data you get doesn’t quite fit the format SPSS needs. So I spend a few hours before we meet as a class cleaning the data that was collected.
  2. This year, Formhub.org seems to be moving painfully slowly. I had trouble last week getting the website to work, and am still having trouble this week. With data collection set to start tomorrow, I am stressing that it may not work! – Update: I’ve read in several places about ongoing stability issues with Formhub. I’m now using ONA.io instead, which works the exact same way! I’ve updated the verbiage above to reflect that.

I’ve provided a copy of the assignment below. Enjoy!

In my next post, I will provide info on programming surveys in the XLS forms format, which is a bit tricky. I spend a day in class teaching this. I’ll also show you how to load the surveys onto the iPads and get them synced up to the computer if you aren’t on WiFi when you collect the data.

photo: CC by Sean MacEntee

Syllabi Spring 2015: Communication Research and Writing Across Platforms classes

The semester is underway!

I have shared select syllabi every semester since I started this blog. A lot of people contact me asking for my syllabi and class assignments, so I am happy to continue the trend. You can find all past syllabi from the menu on the left! I’m so glad that folks enjoy these and find them useful!

This semester, two classes I will discuss are my Writing Across Platforms and my applied Communication Research class. I’ve talked about assignments, activities, and perspectives on both in the past (see posts about them under the menu on the left. Blog Topics->Teaching Social Media->Classes).

I have not changed either class all that much since last teaching them, so I’ll spare you an in-depth review of each. But here are a few changes and little things worth mentioning:

Writing Across Platforms

– Facebook – In the last post I wrote about my decision to continue to teach Facebook in this class.

– Mobile – This is a packed class, and it is hard to add anything without taking something else away. I’ve squeezed in a little time to focus on mobile and writing for mobile. I have an exercise planned where I will bring in our department iPads and have students explore the look and feel of their writing as read from mobile devices. As more and more people rely on mobile devices to read, it is important that we emphasize the medium, its affordances, and its limitations.

– Concise Writing – I am placing more emphasis on conciseness in writing. This is something we all struggle with; I know I do. While it has always been important, shorter attention spans, mobile and digital platforms, and the high-stakes competition for reader attention necessitate saying more with less. We’ll do exercises where students help one another find the shortest, most powerful way to communicate. There’s also a fun website I am incorporating that can help with writing. I will give it its own post in the future.

– PitchEngine – I used PitchEngine the past two years for my social news release. I haven’t blogged about PitchEngine much. But I’ll be sure to do so this semester. I always try to bring in industry software when possible. And the awesome people at PitchEngine have been very helpful. I’m excited we’ll be using PitchEngine again this year. PitchEngine has undergone exciting changes since last year. And I’ll be adapting my social news release assignment accordingly. Note: Dr. Gallicano and Dr. Sweetser have a great guideline for teaching the social media release.

Communication Research

I made some minor tweaks and improvements to how I’ll present content, and streamlined a few assignments. In a tough class like this, I provide a lot of handouts – such as for how to structure a literature review, methods, results, and discussion section. I worked hard to simplify and clarify those.

I’ve been chatting with colleagues about changes and advancements in social data analysis. I’m hoping to incorporate them into this class in a future semester. To do so, I will need time this semester to dedicate to exploring these options. Thus, I’m presently sticking with the same three-project model I wrote about last year. Hopefully I’ll have a brand new social media analysis assignment for Spring 2016.

This semester I promise to do something I failed to do last year – blog about our final project in the Comm Research class where students use iPads to collect survey data around campus. I love this project and hope you do too.

Below are the syllabi. A happy start to the semester to all! – Matt

Writing Across Platforms:

Communication Research

Applied Research Class: Sentiment Analysis Project Reflection

I began this semester with the intention of blogging a bit about my applied research class. I provided an overview of it and a copy of the syllabus on an earlier post. But since writing that post, I’ve yet to do a follow up… until now.

Edit: There are 2 follow up posts to this post. 1) Looks at activities for this assignment, and 2) provides the assignment itself.

First, let me say that more and more I am trying to decrease my lecturing and spend more class time on hands-on learning, having my students learn by doing rather than just listening – sort of like the flipped classroom Gary Schirr has been discussing recently on his blog. So this class really pushes in-class projects and experiential learning. Following this approach, in order to introduce students to research, I provided students with the instructions and a lot of structure for their first two projects.

I want to use our second research project as an example. Then, I’ll talk about the pros and cons. The second project was a sentiment analysis of Tweets about a brand I chose and a (realistic but not necessarily real) scenario.

My goals with this project were to teach students:

  1. About computer-assisted content analysis. We focused on how it differs from a hand-coded quantitative content analysis (which was the focus of our first project), and on its strengths and weaknesses.
  2. How to do a basic computer-assisted content analysis using Yoshikoder, an easy-to-use, free app that works on Mac and PC – so my students can use it at home if needed!
  3. About sentiment analysis – what it is, why it is used by organizations to evaluate the online conversation about their brand, and its strengths and weaknesses.
  4. How to write up a research report (In the first project, I provided the project overview and requested results and discussion. In the second project, I added a literature review and methods section, and had them write the research objective and research question).

Why I chose to do this project this way: A number of social media analytics tools today are offering sentiment analysis.  There are also sites like socialmention.com that will provide you with a free sentiment analysis of a search term. But how are these analyses conducted? What are their strengths and weaknesses? Are they reliable? Do they mean anything at all? And what do we need to be careful of before accepting them, and thus drawing inferences from them?

So what I wanted my students to do was to SEE how a sentiment analysis would be conducted by some of those high-price (or no-price!) analytic tools. In other words, I want my students to get their hands dirty as opposed to allowing some distant and hidden algorithm to do the analysis for them. I believe gaining hands-on experience with this project provides students a more critical lens through which to see and evaluate a sentiment analysis of social media messages.

The Set Up: In the assignment, I provide: the situation or problem / campaign goals and objectives (of an imaginary campaign that is ongoing or has happened) / benchmarks / KPIs. In this case, the situation had to do with a popular online retail brand and rising customer complaints and dissatisfaction as the brand has grown beyond its core base of loyal customers in recent years.

I provide students with a sample of about 1,000 Tweets that I downloaded and formatted to play nicely with Yoshikoder. The sample consists of mentions of the brand. This ensures students are all looking at the same dataset and streamlines (or rather, eliminates) the data collection process, helping students focus on other elements of the assignment.

For the sentiment analysis, I rely on the AFINN dictionary, which was designed for sentiment analysis of microblogs. Students learn what AFINN is and a little about how linguistic analysis dictionaries are created through research. Students then analyze the Twitter dataset using the AFINN dictionary to determine sentiment scores. There are no fancy stats being done here. By checking the sentiment analysis output, they simply determine whether their KPI (a % of positive Tweets about the brand) was met. In this case, the result they are looking for is a percentage – so simple division. Not scary at all, and no SPSS training needed (that comes with a later project).
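To show the mechanics, here is a toy Python version of what a dictionary-based sentiment analysis does under the hood. The word values mimic AFINN’s -5 to +5 scale, but this handful of dictionary entries and the example tweets are made up for illustration:

```python
# Toy version of the dictionary approach: score each tweet by summing
# word values from a sentiment lexicon, then compute the KPI (% of
# tweets scoring positive). Lexicon excerpt and tweets are illustrative.

afinn_sample = {"love": 3, "great": 3, "good": 3,
                "bad": -3, "hate": -3, "terrible": -3}

tweets = [
    "love this brand great service",
    "terrible shipping bad support",
    "order arrived on time",
]

def score(text, lexicon):
    """Sum the lexicon values of each word; unknown words count as 0."""
    return sum(lexicon.get(word, 0) for word in text.lower().split())

scores = [score(t, afinn_sample) for t in tweets]
positive_pct = 100 * sum(s > 0 for s in scores) / len(scores)

print(scores)        # [6, -6, 0]
print(positive_pct)  # 33.33... -- the % of positive tweets, i.e. the KPI
```

That really is all the math involved: word lookups, sums, and one division, which is why no SPSS training is needed for this project.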

They also look at the valence of the sentiment (with a range of + or -5) and explore the meaning of that. The students use this information, along with class lecture, other exercises on how to write research reports, etc., to produce their project #2 report.

Again, to reiterate an important point, we discuss the benefits of this analysis as well as its real weaknesses. Students always bring up the fact that the results lack context – what if someone used the word “bad” to mean good? What about sarcasm? I show them how to use Yoshikoder to look at keywords in context as a way of addressing this.
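For the curious, a keyword-in-context view is simple enough to sketch in a few lines of Python. This is a simplified stand-in for illustration, not Yoshikoder’s actual implementation:

```python
# Keyword-in-context (KWIC): show a few words on either side of each
# hit so a human coder can judge sarcasm or reversed meaning.

def kwic(text, keyword, window=3):
    """Return each occurrence of keyword with `window` words of context."""
    words = text.lower().split()
    hits = []
    for i, word in enumerate(words):
        if word == keyword:
            left = " ".join(words[max(0, i - window):i])
            right = " ".join(words[i + 1:i + 1 + window])
            hits.append(f"{left} [{word}] {right}")
    return hits

print(kwic("the service was so bad it was good honestly", "bad"))
# ['service was so [bad] it was good']
```

Seeing “bad” surrounded by “it was good” is exactly the kind of context cue that a raw dictionary score misses.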

The Benefits and Drawbacks of This (and These Types of) Projects

As I said above, I am really trying to move away from lecture in favor of experiential learning. Here are some things I’ve noticed. Some may be benefits, others drawbacks, and others a bit of both…

  • The focus of this project is not on the stats or the analysis, and I provide a lot of the needed information – so it makes for a good ‘getting your feet wet’ project that teaches students other important elements of research.
  • It would be nice to teach them more advanced methods of analysis – but I do cover that a bit more later in the semester.
  • Students learn through their mistakes and from my feedback as opposed to me paving the way for them and simply asking them to drive down the smooth road.
  • I provide a LOT of handouts on how to write different sections of a research report, etc. They are detailed… sometimes too detailed, and I fear students don’t read them because of information overload.
  • Sometimes, I wish I had more time to teach them how to avoid the simple mistakes I see in their work, particularly their research reports. I say to myself, “oh man, I thought I told them how to do that.” Or, “Why didn’t you read the handout that explains how to structure this!?”
  • They likely won’t ever do sentiment analysis like this again – but at least they’ll understand it!
  • They get to see the results for themselves and get a sense that they discovered the results.
  • Class time is busy – our class rushes by and we don’t always get to cover everything I want to. As a person who likes order and time management, I am having to “let go a little” and let things happen. This is helping me grow. I wonder if it is helping my students though…
  • I know I enjoy doing these sorts of projects a lot more than standing and lecturing, lecturing, lecturing about research. I feel it has made research a lot more “real” and hands on to them.

So that is my overview of the project in general, and some thoughts. It isn’t perfect but it seems to have gone well and I really enjoyed doing it. I’d love any feedback or suggestions you may have to make this the best possible experience for my students. And of course, feel free to adapt, modify, or improve upon this idea.

In an upcoming post(s), I’ll share the assignments (I want to move my documents over to SlideShare due to the pay wall on Scribd). And I will provide some basic info on how to use the Yoshikoder software.

Cheers! -Matt

Just a reminder: There are 2 follow up posts to this post. 1) Looks at activities for this assignment, and 2) provides the assignment itself.

photo CC by netzkobold

Here Are My Spring 2014 Syllabi: Writing and Research

The snow is coming down here in West Virginia! Classes are canceled today so I will be catching up on research and some other things. But let’s talk classes and syllabi!

In addition to the applied Communication Research class I am teaching this semester (discussed in the previous post) I’m also teaching a few other classes. 🙂 I want to quickly share some of my syllabi for the semester. I’ve uploaded syllabi for these classes to my Scribd account, which is where I host past syllabi and class assignments. Click the link below to see the syllabus. (You can also see all the below-described syllabi as well as past syllabi via the menu on the left, by mousing over “syllabi.”)

Comm 435: Communication Research – This class is discussed in depth in my previous post. Please read it to learn more about that class.

Comm 335: Writing Across Platforms – Changes from Fall 13 include: a lab day for greater access to press release examples and for working with peers on the first press release assignment; re-organized and updated social media and blog writing assignments; and a few lectures shifted around to deliver material more effectively, along with other minor changes to keep content up to date. I’m also super excited that for our PitchEngine assignment this semester, all of our students will be temporarily upgraded from the free version of PitchEngine to the paid level, thanks to the awesome people at PitchEngine! So students will get experience with advanced functionality.

Hope you find these new syllabi helpful! If you share your syllabi online, please share in the comments below!