Student success and engagement are inextricably linked. Engaged students learn more, persist more, and participate more fully in the school community — both as current students and later as alumni. In fact, one report found that students who describe their university experience as great are 31x more likely to be proud alumni and 51% more likely to recommend their institution to family and friends. Conversely, disengaged students tend to have poorer academic performance and are more likely to stop out — and often, schools don’t see it coming. It’s no wonder that increasing engagement is top of mind for many higher ed leaders.
The best strategy for improving engagement is to track student sentiment at scale and act on what you learn. To be as effective as possible, this should be an ongoing process with frequent student contact.
So what does this look like in practice for an innovative — but time-strapped — higher ed leader?
Designing a student engagement pulse check that actually works
First and foremost, getting the cadence right is key. You want to be able to track any ebbs and flows in student engagement throughout the year. If you’re relying on annual surveys, you’re likely to react too late or miss an opportunity to act entirely. And if you only survey your students at the beginning and end of the academic year, you risk only hearing from them when they’re riding the high of starting a new educational journey or closing out a chapter that brings them a year closer to their goals. Meanwhile, weekly surveys may lead to survey fatigue, and they leave too little time between check-ins to show students you’re acting on the feedback.
Feeling a bit like Goldilocks trying to figure out what’s just right? We’ve got you. We’ve found that monthly pulse checks are an excellent balance, allowing you to regularly check in on how students feel without overwhelming them.
Timing is only part of it. You also need to know what to ask. Your pulse check questions should be targeted and relevant — you may need alternative versions for different student groups depending on the time of year. But you need to ensure that the pulse checks aren’t so granular that distributing and analyzing them becomes too much of an administrative burden.
Again, it’s all about finding that balance. At Mainstay, we’ve modeled our new Pulse Checks on three research-backed factors which contribute to student success:
- Sense of belonging: Students’ perception that they are connected to and supported by the school and community. (Not-so-fun fact… did you know that only 12% of students report feeling like they totally belong?)
- Self-efficacy: Students’ belief in their ability to accomplish a task or achieve a goal.
- Engagement: Students’ enthusiasm for and engagement in their studies, perception of the quality of their learning environment, and degree to which they feel the school is invested in their success.
Taking your Pulse Check strategy to the next level
The goal of tracking student engagement is to truly understand sentiment across the student body, not just from students who are willing to set aside time to complete a long and involved questionnaire.
Here are a few things to keep in mind to maximize your Pulse Check’s reach and hear from as many students as possible.
1. Communication and follow-up
People are more likely to respond if they understand how their responses will be used. Your students want to know that their voice will be heard — no one wants to send feedback into a black hole. Let your students know to expect periodic surveys and explain their purpose: to help the school learn how to most effectively support them. And most importantly, make sure students see you following up on what you learn. If you start a new initiative due to feedback from the survey, tell them!
2. Survey length
Quick, convenient check-ins are more likely to be completed. If you are reaching students monthly, our recommendation is to ask no more than 5 questions. Resist the temptation to stuff pulse surveys with tons of questions — you’ll see diminishing returns.
3. Question type
Open text prompts are invaluable for gathering rich feedback, but they usually take more time to complete — which means they’re less likely to be answered. You can get away with more multiple-choice questions than open-ended ones. Plus, multiple-choice questions are easier to analyze at scale.
4. Tools and distribution
Where you build and distribute questions influences how students will respond. One option is to create web-based surveys with tools like Qualtrics or SurveyMonkey and distribute the link through email. Or, you can use Mainstay to directly survey students through text messages — instead of needing to click through to a survey, students can receive and respond to each question as a text. We’ve written before about the benefits of texting over emailing students, but to summarize: students are much more likely to respond to texts and answer more quickly.
Closing the loop with listening-based leadership
Picture it – you’ve gained a nearly real-time pulse on student sentiment. You’ve measured student engagement and gathered feedback. You have an initial internal benchmark against which you can track your progress.
It’s time to close the loop and act on the insights you’ve uncovered.
Let's talk engagement
Mainstay takes the guesswork out of student engagement. Like everything we do, our Pulse Checks are built on a foundation of research into what drives student outcomes. We’ve done the legwork so you can move beyond theory and onto systematically listening to and acting on student sentiment at scale.
If we’ve piqued your interest, let’s chat about student engagement at your institution.
Not ready to chat yet, but want to learn more about student engagement? Check out our white paper, Defining Student Engagement.