Post-training surveys are an indispensable part of running a successful training course. Feedback from participants helps you evaluate the effectiveness of your training, make improvements, and plan future courses. A well-designed post-training survey is one of the best ways of enhancing your training and ensuring that it fulfills its goals and offers value for money.
But post-training surveys are only as good as the questions they contain. In this post, we’ll explain how to write effective post-training survey questions and share examples and question types to draw inspiration from.
Regardless of the type of training you offer, all courses have the same basic goal: to help learners develop new skills and implement them as soon as possible. Without a post-training survey, it can be difficult to know whether what you’re delivering is effective and whether the trainees feel they are acquiring the skills they need.
Post-training surveys are especially effective when you use them as part of a model for assessing training effectiveness. One of the most well-known models is the Kirkpatrick model.
This popular training evaluation model has four levels of learning evaluation: Level 1 (Reaction), Level 2 (Learning), Level 3 (Behavior), and Level 4 (Results).
Post-training surveys can help you meet the Kirkpatrick Level 1 evaluation goal, Reaction, by measuring the participant’s reaction to the training they received. In this way, the surveys can help you evaluate the overall effectiveness of any training you offer.
There are two types of survey questions: objective and subjective. Knowing the difference between these two types of questions is the first step in writing effective post-training survey questions of your own.
Objective questions are factual by nature and their answers can, in theory, be independently verified. An example of an objective question would be:
“Over the past year, roughly how many training sessions have you attended?”
The answer doesn’t need to be precise; you could offer a range of rough answers, such as 0, 1-2, 3 or more. However, the answers should be factually accurate.
The other type is the subjective question. The answers to these questions are based on what the survey-taker thinks or feels. A good example of a subjective question comes from a Kirkpatrick Level 1 ‘Reaction’ post-training survey:
“Did the training meet your expectations?”
Even if you pose this as a single-choice or multiple choice question with a range of answers (such as below expectations, met expectations, exceeded expectations), the respondent’s answer couldn’t be factually verified. It’s their opinion, based on what they think, feel or believe.
There are a number of possible ways to have respondents answer questions: single-choice, multiple choice, Likert scale, ratings, closed-ended, and open-ended questions.
Here are some examples of each of these types of questions that you may find in a post-training survey.
Here at Kodo Survey, we define single-choice questions as those that ask the respondent to choose one answer from a number of alternatives. For example, a question in a post-training survey for a GROW coaching course might be:
“Which characteristic should employee goals NOT have?”
A. Trying
B. Specific
C. Attainable
D. Realistic
E. Don’t know
We define this question as single-choice because the respondent is asked to choose just one answer. You may notice that Kodo Survey includes a “Don’t know” option by default on all single-choice and multiple choice questions to dissuade respondents from guessing the correct answer.
By contrast, a multiple choice question would ask the respondents to select a number of answers. For instance:
“Employees should be guided to pick goals that have which of the following characteristics?”
A. Specific
B. Attainable
C. Trying
D. Realistic
E. Don’t know
To answer this correctly, respondents would need to select A, B, and D, making it a multiple choice question. Single-choice and multiple choice questions are a common format for post-training surveys because they limit the number of possible answers, which makes the survey data easier to analyze and work with.
Likert scale questions are a popular alternative to single-choice and multiple choice questions. This format asks respondents to rate each statement on a numbered scale, usually from one to five.
Assign a number between one and five to answer the following questions, with one being the lowest score and five being the highest.
Post-training Ratings question examples
Ratings questions are similar to Likert scale questions, except the format uses smiley faces, stars or other graphics. If you’ve ever been handed a ‘smile sheet’ after a training session, you’ll be familiar with rating questions.
Assign a smiley face to answer each of the following questions. One face is the lowest score and five faces are the highest.
Closed-ended questions include yes/no questions and any question with one definite right answer. They help you gain specific data about the participants or certain areas of the course. They work well if you want to know specific things about the course and the trainees.
Open-ended questions are an important part of any post-training survey. They let you gain valuable feedback from the participants and understand whether the training was successful.
Now that we’ve looked at the types of post-training survey questions and the range of responses, you’re ready to start writing your own. The problem is, it can be tricky to know which questions to include and how to phrase them.
We recommend following these three simple rules:
1. Be concise.
State your questions in the simplest possible way, with the fewest number of words. This reduces the chance that respondents will interpret the questions in different ways.
2. Write in a simple, conversational tone.
Imagine you are chatting with a colleague or coworker; how would you phrase the questions?
3. Test your survey.
After writing your questions, test them out on friends or colleagues to make sure they are easy to understand. This helps you identify and address any problem areas.
Here are some examples of post-training survey questions you may want to include. Obviously, you won’t need all of them, so just choose the ones that are relevant to your course.
Expectations
These sample post-training survey questions will help you identify whether the training met learners’ expectations.
“Did you have clear expectations for the course?”
(Closed question: Yes/No)
“If yes, what were your expectations of the course?”
(Open-ended: Provide space for a written answer)
“Did the course meet your expectations?”
(Single-choice: No, To some extent, Yes, Don’t know)
“How would you rate the pace of learning on the course?”
(Single-choice: Slow, Medium, Fast, Don’t know)
“Did the training cover the content you were expecting?”
(Single-choice: No, To some extent, Very much so, Don’t know)
Most training instructors run pre- and post-training assessments to measure whether learning took place on the course. This is especially true for companies following the Kirkpatrick training evaluation model as Level 2, Learning, explicitly looks at this area.
As we mentioned earlier, Kodo Survey includes a “Don’t know” alternative by default for all single-choice and multiple choice questions. This encourages respondents to select that option instead of guessing the answer. Therefore, it is important to choose alternative answers that look equally likely to be correct to the ‘untrained eye’.
We suggest two ways to do this. First, make the alternative answers contain roughly the same number of characters or words as the correct answer. Second, make their content look equally plausible so that the correct answer isn’t obvious.
A poorly written question would be:
“What does the ‘R’ in the acronym GROW stand for?”
A. Row
B. Rap
C. Realistic
D. Red
E. Don’t know
A better question would be:
“What does the ‘R’ in the acronym GROW stand for?”
A. Realistic
B. Reality
C. Real-time
D. Rational
E. Don’t know
In this question, options A-D look equally likely to be correct for someone who hasn’t taken the training.
You may wish to include questions about participants’ experience of working with others during the training. This will help you gain insights into how trainees felt, and whether they found the course collaborative and interactive. Likert 5-scale questions are a good way to gather feedback about experiences. For example:
“Please rate your overall experience on the course”
(Likert 5-scale)
“Please rate the collaborative working opportunities on the course”
(Likert 5-scale)
One of the most important areas of post-training survey feedback is asking participants what they thought of the course content. This gives you ideas about ways to improve the content for future training sessions.
“Overall, did the course content meet your expectations?”
(Rating question, out of 5 stars)
“Why did the course content meet or not meet your expectations?”
(Open-ended question)
“Rate the quality of the course content”
(Rating question, out of 5)
“Did you find the content easy to understand?”
(Likert 5-scale)
“Was there any content you were expecting that was missing? If so, please elaborate.”
(Open-ended question)
Some courses are built with solid training materials but suffer from a confusing structure or poor delivery. These questions can help you determine whether the delivery and structure were an issue for your training.
“How engaging was the course instructor?”
(Rating question, out of 5 stars)
“Did the course delivery meet your expectations?”
(Rating question)
“Was the content clearly presented?”
(Closed question, yes/no)
“Did the course have a clear structure?”
(Closed question, yes/no)
“Did you find any aspect of the course unclear or confusing? Please elaborate.”
(Open-ended question)
If you are using a Learning Management System (LMS) to deliver the training, it’s worth asking participants for feedback on this aspect of the course. A confusing, poorly designed, or faulty LMS can ruin an otherwise sound training course.
“How would you rate your experience with multimedia on the course?”
(Likert 5-scale)
“How helpful did you find the multimedia on this course?”
(Likert 5-scale)
“Did you experience any technical issues? If so, please explain.”
(Open-ended question, provide space for an answer)
“Please rate the quality of the multimedia (audio, video, and animation)”
(Rating question)
“In what ways could the multimedia used on this course be improved? Please explain.”
(Open-ended question, provide space for an answer)
We hope that our guide to writing effective post-training survey questions will help you write some of your own. Overall, the most important tip is to continually test your questions on friends and colleagues to ensure that everyone interprets them in the same way. This will give you high-quality data that you can use to enhance the training.
To find out more, download our white paper, Determining and optimizing the impact of your training and development, or book a meeting with one of our experts.