If you’ve attended a conference, participated in a workshop, or taken a class, you’ve probably filled out a feedback form. Typically, this happens right after the end of the event.

But for speakers, trainers, and teachers who are serious about change, I think it’s time to put an end to the single, long feedback form. Here’s an alternative approach.

Change when you ask

Many learning models rely on what Paul Pimsleur called “graduated interval recall” in a 1967 paper. Also known as spaced repetition, the approach suggests that “a teacher should recall the item very frequently right after it is first presented, though interspersed with other activities which take the student’s mind off of it between recalls. Then he should continue recalling it with decreasing frequency during the succeeding days and weeks.”

Unfortunately, most trainers seek feedback just once. A one-time evaluation may be fine for a teacher who only shares info-snacks (i.e., little tidbits of information, such as "tips and tricks" or "30 apps you can use"). But trainers who want deeper learning, the kind of sustained change that shifts mental models or introduces new ways to work, will want to measure impact over a longer period of time.

So, replace your long feedback form with a series of shorter surveys. Ask a few questions immediately after a session, but also get permission to send participants a series of short surveys at appropriate intervals later. For example, after a conference workshop session, you might send additional surveys five days later, 25 days later, and four months later.
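The interval arithmetic behind that cadence is easy to automate. Here is a minimal Python sketch; the intervals of 0, 5, 25, and roughly 120 days mirror the example cadence above, and the session date used is hypothetical:

```python
from datetime import date, timedelta

# Graduated follow-up intervals, in days after the session
# (0 = immediately after; 5, 25, and ~120 days mirror the
# "five days, 25 days, four months" cadence described above).
SURVEY_INTERVALS = [0, 5, 25, 120]

def survey_schedule(session_date, intervals=SURVEY_INTERVALS):
    """Return the dates on which each short survey should be sent."""
    return [session_date + timedelta(days=d) for d in intervals]

# Example: a workshop held on a hypothetical date, June 1, 2018
for send_date in survey_schedule(date(2018, 6, 1)):
    print(send_date.isoformat())
```

A scheduler or calendar reminder can then pick up these dates, so the later surveys actually go out instead of being forgotten.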


Change what you ask

In the 1950s, Donald Kirkpatrick developed a four-level model for evaluating training. The widely used Kirkpatrick Model assesses training on four levels: Reaction (level 1), Learning (level 2), Behavior (level 3), and Results (level 4). Each successive level measures a more meaningful impact. Reaction, for example, measures learner satisfaction, engagement, and relevance, while Behavior looks for actions changed as a result of the training.

Soon after a training session, focus on questions that assess Reaction and Learning. For example:

  • Did you like the learning experience?
  • Did you find the training relevant to your needs?
  • Did you gain the knowledge or skill desired?
  • Do you believe you can use the knowledge or skill acquired?

Ask questions that measure Behavior and Results in surveys sent later in the sequence. For example:

  • What changes did you make after the training?
  • What have you stopped (or started) doing?
  • How would you measure the impact of the training?
  • Have other people noticed these changes? Why? How?
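One way to keep this organized is to write the plan down as data. The sketch below, in Python, uses the question wording from the lists above; the wave assignments (which survey in the series carries which questions) are one reasonable mapping, not a fixed rule:

```python
# Kirkpatrick level -> (survey wave, sample questions from above).
# Wave 0 is the survey sent immediately after the session; later
# waves are the follow-ups sent days or months afterward.
QUESTION_PLAN = {
    "Reaction": {
        "wave": 0,
        "questions": [
            "Did you like the learning experience?",
            "Did you find the training relevant to your needs?",
        ],
    },
    "Learning": {
        "wave": 0,
        "questions": [
            "Did you gain the knowledge or skill desired?",
            "Do you believe you can use the knowledge or skill acquired?",
        ],
    },
    "Behavior": {
        "wave": 2,
        "questions": [
            "What changes did you make after the training?",
            "What have you stopped (or started) doing?",
        ],
    },
    "Results": {
        "wave": 3,
        "questions": [
            "How would you measure the impact of the training?",
            "Have other people noticed these changes? Why? How?",
        ],
    },
}

def questions_for_wave(wave):
    """Collect the questions that belong on a given survey in the series."""
    return [
        q
        for level in QUESTION_PLAN.values()
        if level["wave"] == wave
        for q in level["questions"]
    ]
```

With the plan in one place, building each short survey is just a lookup, and it is easy to see at a glance that the deeper levels are measured later, not immediately.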

You can use a variety of Google Form fields to track responses. A Reaction question (e.g., "Did you like the learning experience?") works well on a multi-point linear scale, but not every response needs to be numeric. An open-ended Behavior question (e.g., "What changes did you make?") might be best captured in a paragraph text field.
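If you document or script your form layout, the pairing of question kind and field type can be made explicit. A minimal sketch, using hypothetical question-kind names; "Linear scale" and "Paragraph" match field types offered in the Google Forms editor, and the pairings are suggestions, not rules:

```python
# Suggested Google Forms field type for each kind of feedback question.
# The question-kind keys are hypothetical labels for this sketch.
FIELD_TYPE_FOR = {
    "reaction_rating": "Linear scale",   # e.g., "Did you like the learning experience?"
    "relevance_rating": "Linear scale",  # e.g., "Did you find the training relevant?"
    "behavior_change": "Paragraph",      # e.g., "What changes did you make?"
    "impact_story": "Paragraph",         # e.g., "How would you measure the impact?"
}

def field_type(question_kind):
    """Look up a suggested field type, defaulting to freeform text."""
    return FIELD_TYPE_FOR.get(question_kind, "Paragraph")
```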


Change how you ask

Think through your form response options. For example, if you request an email address on your form so you can email additional survey links later, people will know that responses aren’t anonymous. You may want to gather participant email addresses elsewhere, off the form, if there’s a chance that anonymity might yield more honest responses.

Similarly, a small change to a field can have a big impact. A four-point linear scale forces a response that "leans" one way or the other: positive or negative, pro or con. A five-point linear scale leaves a "neutral" middle option for respondents with no feeling either way. Choose thoughtfully.

You can also include images in both questions and answer options. Somehow, choosing a "smiling face with heart-shaped eyes" seems a little more fun than ranking a session as a "5" on a 1-5 scale. (If you need images for your forms, Twitter's emoji collection and Google's Material Design icons are both good sources.)

Finally, always include a freeform text response field. People can use it to explain or clarify a response, and it lets them give feedback on questions you didn't ask. In the end, if you want people to learn, and want to improve how your training helps them learn, being open to unexpected comments may be the most useful behavior of all.