Post-Session Audience Evaluation (Draft)


I tried to keep this pretty short, since I think most people won't want to fill out anything too involved after every DrupalCon session. While I agree with gdemet's thought that passing out evaluation cards and pencils would result in higher participation, sadly it's probably not practical. My suggestion would be to include a short form on each session page; a rough sketch of how that might be built follows the draft questions below. Most folks have laptops at these conferences, so a few minutes at the end of each session would let attendees fill out the evaluation online.

Please rate the quality of the speaker:
1 - Poor
2 - Okay
3 - Average
4 - Great
5 - Awesome

Please rate the usefulness of the content:
1 - Poor
2 - Okay
3 - Average
4 - Great
5 - Awesome

Please indicate who you think was the intended audience for this session (check all that apply):

__ Beginners (New to Drupal)
__ Developers
__ Designers
__ Project Managers
__ Potential Clients
__ Other _________________

Other Comments: ___________________________________________________________________
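
If we go the online route, this could live as a small form on each session page. Here's a rough sketch using Drupal's Form API (Drupal 6 style); the module name session_eval and all of the element names are hypothetical, just to show the shape of it:

<?php
// Sketch of a per-session evaluation form using Drupal's Form API
// (Drupal 6 style). "session_eval" and the element names are made up.
function session_eval_form(&$form_state, $session_nid) {
  $scale = array(
    1 => t('1 - Poor'),
    2 => t('2 - Okay'),
    3 => t('3 - Average'),
    4 => t('4 - Great'),
    5 => t('5 - Awesome'),
  );
  // Remember which session this evaluation belongs to.
  $form['session_nid'] = array('#type' => 'value', '#value' => $session_nid);
  $form['speaker_quality'] = array(
    '#type' => 'radios',
    '#title' => t('Please rate the quality of the speaker'),
    '#options' => $scale,
    '#required' => TRUE,
  );
  $form['content_usefulness'] = array(
    '#type' => 'radios',
    '#title' => t('Please rate the usefulness of the content'),
    '#options' => $scale,
    '#required' => TRUE,
  );
  $form['audience'] = array(
    '#type' => 'checkboxes',
    '#title' => t('Who do you think was the intended audience? (check all that apply)'),
    '#options' => array(
      'beginners' => t('Beginners (new to Drupal)'),
      'developers' => t('Developers'),
      'designers' => t('Designers'),
      'project_managers' => t('Project Managers'),
      'clients' => t('Potential Clients'),
      'other' => t('Other'),
    ),
  );
  $form['comments'] = array(
    '#type' => 'textarea',
    '#title' => t('Other comments'),
  );
  $form['submit'] = array('#type' => 'submit', '#value' => t('Submit evaluation'));
  return $form;
}

A matching submit handler would just save the values keyed by the session's node ID, so results could be rolled up per session or per speaker afterwards.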

Comments

Various thoughts

Crell

I disagree that paper forms are not practical. I've been to conferences before where they handed out paper forms as you entered the room and collected them on the way out. I was the usher responsible for said passing and collecting. :-) The response rate was fairly good, IIRC. Of course, if you want to crunch numbers on them you then need to do a fair bit of data entry, which I agree could be a challenge for 3000 people * however many sessions.

OK, so maybe they're impractical at scale precisely because we'd get a good response rate. :-)

As a speaker, I'd also want to see "how could this session be made better?" There's always room for improvement, even if only 5 people bother to answer that question.

At a recent training workshop, we gave out a survey with these questions:

This workshop covered what I expected:
This workshop was well organized:
This workshop will be useful to me in my work:
The presenters were active and engaging:
The presenters were knowledgeable in the subject:

All of which had a 5-part "strongly agree ... strongly disagree" answer set. We also had:

This workshop was...:
Much too advanced
Too advanced
On target
Too basic
Much too basic

That one is probably not applicable here, given the mixed audience; the "check all that apply" approach above is the better way to capture intended audience.

If we're going online, I don't think five "agree/disagree" scales are a burden.

I agree, the format suggested

jjkd

I agree: the suggested format (a 5-part "strongly agree ... strongly disagree" answer set), also known as a Likert scale, is generally accepted as a useful measure in this kind of analysis.
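
One nice property of storing the raw 1-5 responses is that rolling them up is trivial. Something along these lines (the helper name is made up):

<?php
// Hypothetical helper: summarize the responses to one Likert item.
// $responses is an array of integers 1-5; returns the mean plus a
// count for each point on the scale.
function session_eval_likert_summary(array $responses) {
  $counts = array_fill(1, 5, 0);
  foreach ($responses as $score) {
    $counts[$score]++;
  }
  $mean = count($responses) ? array_sum($responses) / count($responses) : 0;
  return array('mean' => $mean, 'counts' => $counts);
}

// Example: twelve responses to "The presenters were knowledgeable".
$summary = session_eval_likert_summary(array(5, 4, 4, 5, 3, 4, 5, 5, 2, 4, 4, 5));
// $summary['mean'] is roughly 4.17; $summary['counts'][5] is 5.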

Note that "This workshop covered what I expected" can end up saying more about the match (or the lack thereof) between the session and the promotion for the session, than it does about the session itself. That isn't necessarily a bad thing to measure, as long as you keep that in mind when interpreting the results.

If we were to come up with a standardized evaluation that could be reused across (many) DrupalCons, camps and so forth, it might be worth creating mark-sense or otherwise machine-readable paper forms. If the form wouldn't be reused across a number of events, or if the number of free-text responses is high, there wouldn't be much advantage to going machine-readable.

If an online evaluation were kept simple and themed well, it could be practical to complete it from an iPhone, Android, or other smartphone as well as from a laptop. I'm not sure what smartphone penetration will be among attendees, but I'd guess it's significantly higher in our target population than in the general population. Offering both online and paper forms is also an option.