My blog has always been about sharing my experiences as a nonprofit consultant – which includes revealing the occasional wart, so others might avoid it. In keeping with that pledge – welcome to the tale of my first reader survey. Spoiler alert: it has a happy ending. But the unexpected challenges and snafus – as well as insights gained along the way – prompt me, as a service of sorts, to tell the full story.
What Sounds Easy … Isn’t
A little background: after over a year of delivering my monthly dashboard, I’m seeking feedback on topics of most interest to readers. That everyone’s busy is a given, so I want my survey to be quick and easy – just five questions plus the option to add comments.
My initial insight: hats off to professional survey developers. Deciding what to ask and how to phrase each item is complicated. It requires clarity of thought and wording to match, particularly with a short survey, because you’re looking for as much information as possible from only a few questions. For nonprofits considering a survey, I recommend devoting plenty of time and staff discussion to the nitty-gritty of determining what goes in the survey.
Confusion in the Survey Rankings
With questions set, the next challenge: how should readers respond? Rate topics by level of interest? Give a number value to each? I’ve taken surveys – I’m sure you have too – where the ranking system seemed more than a little arbitrary. On a scale of 1 to 10, what is the difference between a 3 and a 4, or a 6 and a 7? I want to discourage guesswork. So what will make sense to those responding and produce answers that make sense to me?
Once I decide, I step back and try to view my survey from a taker’s perspective. Within an organization, ask staff or volunteers who haven’t helped draft the survey to test it, to ensure questions and directions are clear.
Getting Past Glitches
At last, with questions finessed and ranking system resolved, my dashboard with the survey link launches at 6 a.m. I wait a respectable hour or two and begin checking responses. Good open rate on the dashboard email. That’s encouraging. But the survey … one lone set of answers! Yet I can see where many others have tried … and failed to submit the survey.
The dreaded glitch! A setup mistake. What to do? I resend the survey with apologies. Although there is another technological hiccup – quickly corrected – I’m cheered by the number of readers who retake the survey.
The moral of my survey story? Think, rethink; check, double-check; test, retest – it is those devilish details that produce a useful survey. And when necessary, ask your target audience to try again. Despite my trials and errors, I’m pleased with the results to date and the guidance they will give me. My thanks to all who stuck with me and responded!
My survey is still open, so please follow this link and tell me what you think.