Online Survey Response Rates
Response rate refers to the proportion of people you invite to take a survey who actually complete it. For example, if you sent out 200 invitations, and 60 people completed your survey, that would be a response rate of 30% (60/200 = .30).
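The arithmetic above can be sketched in a few lines of Python; the variable names and the 200/60 figures are just the example from this paragraph, not part of any standard:

```python
# Response rate: completed surveys divided by invitations sent.
invitations_sent = 200
completed = 60

response_rate = completed / invitations_sent
print(f"Response rate: {response_rate:.0%}")  # prints "Response rate: 30%"
```

The same calculation applies whatever the raw counts are; only the interpretation of the resulting percentage changes.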
All researchers prefer higher response rates to lower ones. The reason is that most surveys are conducted to learn the opinions of, or other information about, a larger (usually much larger) group of people than the number you invite to participate. When only some of the people you invite complete your questionnaire, you should always be concerned that the answers given by those who participated may differ from the answers that would have been given by those who chose not to participate. If such differences exist, the answers you received may not reflect the larger group whose opinions you are trying to learn.
There are no hard and fast rules for how low a response rate must be before it signals a problem. Response rates for all types of surveys have been falling for years. Happily, the best available evidence suggests this is not as big a problem as researchers fear, but it is always a potential problem in any particular project.
Since a very large number of online surveys are conducted every year, by researchers of varying levels of skill and with tools of variable quality, there is no single number that fairly reflects average response rates. Estimates of those averages typically range from roughly 8% to 24%. Of course, some projects will achieve much higher rates and some will achieve lower ones.
Several factors influence response rates: how interested the people you invite are likely to be in the topic of your questionnaire; whether you provide an incentive (reward) for participating; how visually engaging the questionnaire is; how skillfully it is written (if it is difficult to understand, contains spelling errors, or doesn't seem logical, many people will drop out); and whether your online survey tool can adapt to different devices (many people now read email on their smartphones and won't complete a questionnaire designed only for desktop PCs, which is difficult to read on a phone). Yet another factor is the kind of people whose opinions you wish to learn. If you are researching people in their 20s and 30s, almost all of them will have Internet access; if you are researching people in their 80s, that will not be the case. Language can be another issue. For example, if you wish to learn specifically about the opinions of Hispanic people and your questionnaire is only in English, some will not be able to participate.
The takeaway from all this is that you should do what you can to maximize response rates: pay careful attention to question wording, use an online survey tool with responsive design for different devices, consider whether to offer an incentive, and make sure you have a good list of people to contact. If you have all of these covered, don't be too worried about a low response rate, but do be suspicious if you get a seemingly odd result.