Event app surveys have streamlined the survey process, boosted response rates and given event planners and organisers more insightful reporting. While technology improves the distribution and collection of surveys and polls, it’s worth bearing in mind that they’re viewed primarily on mobile devices. This means smaller screens and more distractions.
It is therefore critical to pose the right questions in the right format to optimise response rates and data quality.
The question length, number of questions, and the detail in each question all have an impact on the feedback you’re gathering. If the survey is too long, attendees are less likely to complete it. If the survey questions are too long, attendees are less likely to read them. And if there isn’t enough detail in a question, the answers won’t be useful anyway.
The following guidelines are worth keeping in mind:
- Keep surveys short and to the point.
- Keep survey questions as concise and specific as possible without excluding anyone.
- Keep to one topic per survey or poll.
Poor Questions
Quality responses from your attendees are a valuable commodity, so getting the most out of every question is important. In addition to avoiding long and vague questions, also watch out for:
- Poor question phrasing or jargon (e.g. What did you think about the use of native event apps at the conference?)
- Leading survey questions (e.g. How remarkable did you find the keynote speech?)
- Asking questions that don’t matter or are less of a priority. Understanding your audience is integral to understanding their feedback. You may like to have feedback on the catering, but is it more important to find out which sessions provided the best technical information?
Good Questions
In addition to keeping surveys and polls simple and limited to one subject, make sure your questions:
- Focus on things that you have control of at the event (e.g. a multiple choice question asking whether a particular session was run at an appropriate time).
- Provide attendees with the opportunity to qualify their answers. A simple follow-up question with an open-text response asking why they answered the way they did will give valuable insight into yes/no or multiple choice responses.
- Have a balanced and full range of responses so nobody feels forced into an answer that doesn’t accurately represent their feelings (which will compromise your data). If it’s possible they don’t know, let that be an answer. ‘Don’t know’ responses can be useful in themselves, e.g. has information been properly communicated? Was the agenda clear enough? Were session descriptions accurate? A rough sketch of how such a question might be structured follows this list.
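To make the last two points concrete, here is a minimal sketch of how a single question of this kind might be modelled inside an event app. It is an illustration only: the `SurveyQuestion` type, its field names and the `questionsToShow` helper are hypothetical and not taken from any particular survey platform. The idea it shows is a short prompt, a balanced option set that includes ‘Don’t know’, and an open-text follow-up that is only asked when the answer warrants it, so the survey stays short for everyone else.

```typescript
// Hypothetical data model for one survey question in an event app.
// Type and field names are illustrative only, not any vendor's API.

interface SurveyQuestion {
  id: string;
  prompt: string;                  // keep the wording short and jargon-free
  options: string[];               // balanced scale, including "Don't know"
  followUp?: {
    prompt: string;                // open-text "why?" to qualify the answer
    askWhen: (answer: string) => boolean;
  };
}

const sessionTimingQuestion: SurveyQuestion = {
  id: "session-timing",
  prompt: "Was the keynote scheduled at a convenient time for you?",
  options: ["Yes", "No", "Don't know"],
  followUp: {
    prompt: "What time would have worked better?",
    // Only ask the follow-up when the attendee answered "No",
    // so everyone else gets a shorter survey.
    askWhen: (answer) => answer === "No",
  },
};

// Returns the prompts an attendee should see, given their first answer.
function questionsToShow(q: SurveyQuestion, answer: string): string[] {
  const prompts = [q.prompt];
  if (q.followUp && q.followUp.askWhen(answer)) {
    prompts.push(q.followUp.prompt);
  }
  return prompts;
}

console.log(questionsToShow(sessionTimingQuestion, "No"));
// ["Was the keynote scheduled at a convenient time for you?",
//  "What time would have worked better?"]
```

Showing the follow-up only to attendees who answered ‘No’ keeps the survey quick for most people while still capturing the ‘why’ behind negative responses.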