Improving Seminar Results: How to Use Speaker Evaluations
I recently attended and spoke at a conference. The specific conference is irrelevant. At the event there were a number of speakers. When the event began, the attendees were given a packet which included an “evaluation” for the event.
The questions on that evaluation sheet were vague. One of the three or four questions asked who the attendees would like to hear speak at future conferences. Nothing asked them to rate the speakers at THAT event. The other questions weren’t relevant to this discussion.
NOWHERE, and at NO TIME, did the promoters of this event ask for evaluations of the individual speakers.
Given that one of the major indicators of the success of any conference, seminar or event is how well the attendees felt each speaker performed, this was CONFUSING, to say the least.
Most events I’ve attended, and ALL events that I give, ask participants to rate each of the speakers individually. This is done for two reasons. First, you want to see which topics and which speakers you may want to invite back. Second, you want to give the speakers feedback on their topics so they can change and improve.
To NOT do individual evaluations for the speakers at a conference or event is, in my opinion, negligent. I’m being kind here.
I can only come to three conclusions as to why the promoters would not want to have objective audience measurements related to the quality of their speakers and their content.
1. They don’t care. They are making enough money that they don’t give a crap about whether people liked the speaker or what they had to say. The promoters have a TAKE IT OR LEAVE IT attitude about their content and if the attendees don’t like it: TOO BAD!
2. They are lazy. They might have wanted to do objective evaluations of each speaker but felt they had more and better things to do. What? I don’t know. But in their minds, this was NOT an essential element of giving a workshop.
3. They don’t WANT to know what people thought. For a variety of unknown reasons, a promoter of an event simply doesn’t want to know. They had their agenda and frankly don’t care what their attendees thought of each of the speakers. The reason for this attitude is well beyond my pay grade, so I won’t try to play amateur psychologist.
I have a sign up in my office. It says: “Measurement Eliminates Argument”
Other business experts have said similar things. Tom Peters (the well-known business writer) used to say: “What gets measured gets done.”
Without objective measurement of your speakers at an event you know VERY little about what people thought. Sure, you can ask folks what they thought of the event face to face, but the answers you get will be skewed. It’s not anonymous.
Very few participants feel comfortable telling the promoter, or their minions, what they REALLY felt about things face to face.
If YOU do an event, I would ask you NOT to follow the example of the seminar promoter I’ve described above.
Many things in the seminar business CAN’T be measured. The performance of the speakers CAN and SHOULD be measured.
For many years I worked with a seminar company called CareerTrack. They were one of the biggest seminar producers in the 1990s. At the end of every event, every attendee was given a form to fill out. It asked for feedback in two areas: speaker content and speaker delivery (speaking skills).
Those who filled out the forms could choose to include their names or remain anonymous.
These forms were collected and sent back to the home office in Boulder, Colorado. They were scanned and the results were posted for all the other speakers to see. WHY? So that each speaker would know where they stood vis-à-vis their speaking counterparts.
Net result? Speakers were always trying to improve themselves. ALL speakers got better. Both in terms of their content AND their delivery. Why? Because they were being measured and they were competing.
Net result for the company? The quality of the speakers improved, and with it the profitability of CareerTrack itself.
As for this recent event I was at? I have NO CLUE why a seminar promoter would choose NOT to have their audience rate the speakers. It doesn’t matter if you don’t do LOTS of events. At EVERY event you should be using speaker evaluations.
All I know is that it makes NO LOGICAL SENSE.
Your action point? Always make sure your audience members at ANY event evaluate your speakers. All it takes is a one-page, one-sided, 10-question evaluation form. Have them “fill in the bubble” with their answers on a 1–10 scale. Make the form scannable if you’ll be doing a lot of them. Add some lines at the bottom of the form for “additional comments.” If the comments are good, and people sign the form and check the box allowing you to use them, you’ve now got some great written testimonials.
See you at your next event!