A mail survey is a type of research frequently used as a relatively inexpensive means of collecting data for marketing purposes.
Mail surveys are especially helpful due to their comparatively low data collection costs and ease of administration.
Mail surveys are a good strategy for obtaining feedback from respondents who are dissatisfied with a service or who have strong concerns.
Market Street Research has conducted mail surveys for many types of businesses and organizations, such as chambers of commerce, retail and manufacturing companies, banks, hospitals and educational institutions.
Mail surveys are a quantitative marketing research data collection method in which respondents complete questionnaires on paper and return them via the mail. Market Street Research handles all aspects of mail surveys, including questionnaire design, data collection, and analysis of results.
Thanks for the invitation
Excellent feedback from the subject experts about this marketing survey methodology.
In the name of God, the Most Gracious, the Most Merciful.
What I want from any person or employer is to answer my questions; with a survey, the analysis is fast.
I completely agree with your answer, Mohammed Ashraf. Thanks.
It has already been answered.
If the mail survey you are referring to is an email survey, then that is a method of embedding a survey within an email to collect customer data regarding their experience and their likelihood of using the company for other services and/or products.
These types of surveys are cost-efficient, effective, and yield immediate results. The data is immediately stored in a database and collected for further analysis.
It's a quantitative research method in which the audience fills in a questionnaire sent to them by mail (or email). Surveys are used to conduct many types of research, not just marketing research. Here are some points to consider when conducting a survey:
1-Surveys are not used to study phenomena in depth, because they are not deep research. They deal with visible things, which is why researchers need to employ qualitative methods if they want to understand HOW things happen.
2-Surveys deal with trends and they aim to make generalizations; therefore, it's important to use bigger samples.
3-Sampling is the most important thing to care about, because if you fail to choose a representative sample, the results won't be reliable. How you choose the population and which sampling methods you use are crucial steps here.
4-Always give respondents the choice to answer ''I don't know'' or ''I don't understand''; otherwise you force them to give a wrong answer, which would of course affect the reliability of your research.
5-Make sure to conduct a pilot study in the beginning to ensure the reliability of your study.
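The sampling point above can be illustrated in code. This is a minimal sketch, assuming you have a sampling frame (a list of everyone in the target population); the frame contents and sample size are hypothetical, not from the answer.

```python
import random

def simple_random_sample(frame, n, seed=None):
    """Draw a simple random sample of n members from a sampling frame
    (a list of everyone in the target population), without replacement."""
    if n > len(frame):
        raise ValueError("Sample size cannot exceed the frame size")
    rng = random.Random(seed)  # seed for reproducibility of the draw
    return rng.sample(frame, n)

# Hypothetical frame: 1,000 members of an organization's mailing list
frame = [f"Member {i}" for i in range(1, 1001)]
sample = simple_random_sample(frame, 200, seed=42)
print(len(sample))  # 200 questionnaires to mail out
```

Because every member of the frame has an equal chance of selection, results can be generalized to the population; the frame must actually cover that population for this to hold.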
Sometimes the most appropriate way to do a survey is by mail. If most of the following conditions apply, a mail survey could be the best type to use:
Any sets of questions which take the form of "Which radio stations do you listen to?" followed a little later by "What's your opinion of FM99?" are likely to produce biased results, as many people read through a questionnaire before beginning to answer it. They'll realize that FM99 is sponsoring the survey, and many people will reward the sponsor by not criticizing it.
It's generally not worthwhile to do a mail survey with the general public. Most people simply won't answer, so you won't be able to determine how representative your results are. But when small specialized populations are to be surveyed, mail surveys can be very effective.
The biggest problem with mail surveys is a low response rate. In my experience, the minimum response rate for producing valid results is about 60%, but many mail surveys achieve less than 30% return. To overcome this, you need to make it easier and more rewarding for people to respond.
Most of this chapter also applies to other types of self-completion questionnaires - such as those distributed at events (see the chapter on event surveys), and questionnaires left by interviewers for respondents to fill in, to be collected when the interviewer returns.
People who are willing to return a mail questionnaire may not get around to doing so without some prompting. For this reason it's normal to offer respondents some encouragement to mail their questionnaires back.
1. Include a return envelope
The first method of encouragement is an easy way to get the questionnaire back: a business reply or freepost envelope, addressed to the originator of the survey. Freepost licences are easy to obtain (in most countries), and the only costs involved are associated with printing envelopes (in a format strictly specified by the postal authority). In Australia, freepost letters cost 2 cents for each letter returned, on top of the 45 cents postage. In several other countries, the proportions are much the same. The freepost charge is paid only when an envelope goes through the mail, so if you send out 1000 questionnaires and only get 500 back, you are only charged for 500.
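The Australian figures above (45 cents postage plus a 2-cent freepost fee, charged only on returned envelopes) work out as follows; a small sketch using the mail-out example from the text:

```python
def freepost_cost_cents(returned, postage_cents=45, freepost_fee_cents=2):
    """Freepost is charged only per envelope actually returned:
    standard postage plus the small freepost fee."""
    return returned * (postage_cents + freepost_fee_cents)

# 1,000 questionnaires sent, 500 returned: pay return postage on 500 only
cost = freepost_cost_cents(500)
print(cost / 100)  # 235.0 (Australian dollars)
```

Stamping every return envelope instead would cost 45 cents on all 1,000 envelopes up front, returned or not, which is why freepost is usually cheaper despite the slightly lower response rate.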
If you put stamps on all the return envelopes, this normally produces a slightly higher response rate than freepost (because some people don't like a stamp to be wasted), but it will cost you a lot more than using freepost.
Another freepost method is for respondents to supply their own envelopes and address them themselves. This saves only the cost of a stamp. However, it is much more convenient for a return envelope to be included with the questionnaire — otherwise respondents use many shapes and sizes of envelope, and it can cost more than the price of an envelope to unfold and flatten the questionnaires. Also, if you supply a pre-addressed freepost envelope with the questionnaire, that is one less reason for respondents to delay returning their questionnaires.
2. Give a deadline
The second incentive seems trivial, but we have found it to be surprisingly effective. Simply print, near the beginning of the questionnaire, something like this:
Please try to return this questionnaire within 7 days
=====================================================
The shorter the request, the better it seems to work. Though some people ignore such appeals, many take notice of this one, and it will help you get more questionnaires back, sooner.
3. Offer an incentive
Surveys that don't use interviewers tend to have much lower response rates than surveys where the interviewer speaks to the respondent. It's much easier to ignore a questionnaire that comes in the mail than to ignore a real interviewer. Therefore, mail surveys need to use incentives, to boost the response rate. There are two types of incentive, which I call psychological and financial.
A psychological incentive is a way of making people feel good about filling in a questionnaire - e.g. "If you like to use our products, please help us improve them by completing this questionnaire."
A financial incentive is money or goods given to the respondent. This can be in two forms: either every respondent is given a small gift, or every respondent is given what amounts to a lottery ticket, and a chance to win a large prize. In some countries, the lottery incentive is illegal. In others, a special licence must be obtained from the authorities - which may take months.
After experimenting with incentives of various types and sizes, I have reached two conclusions:
1. A small chance of winning a large amount works better than the certainty of a small amount. It's also much less work to give out one large prize than lots of tiny ones.
Judging the size of the incentive is something of an art: if the incentive is too small, many people won't bother to respond. But if you're offering a huge prize for one lucky respondent, some respondents will send in multiple questionnaires - probably not with identical answers, some even answered at random - simply to try to win the prize. Once I worked on a project with a prize of a holiday in Europe. When we sorted the file in order of respondents' addresses, we found that some had made multiple entries with slightly different versions of their address. In other households, improbably large numbers of people had returned questionnaires - and some of their names looked suspiciously like cats and dogs!
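The multiple-entry problem described above can be caught with a simple check: normalize each respondent's address so slightly different spellings compare equal, then flag households that returned improbably many questionnaires. A minimal sketch; the normalization rule and the threshold of four returns per household are my own assumptions, not from the text.

```python
from collections import Counter
import re

def normalize_address(addr):
    """Collapse case, punctuation and spacing so slightly different
    versions of the same address compare equal."""
    return re.sub(r"[^a-z0-9]", "", addr.lower())

def suspicious_households(addresses, max_per_household=4):
    """Return normalized addresses with more returns than plausible."""
    counts = Counter(normalize_address(a) for a in addresses)
    return {a: n for a, n in counts.items() if n > max_per_household}

# Hypothetical return addresses: five variants of one address, one other
returns = ["12 High St.", "12 high st", "12 High St",
           "12-High St.", "12 HIGH ST", "3 Low Rd"]
print(suspicious_households(returns))  # {'12highst': 5}
```

Sorting the data file by normalized address, as the project above did by hand, makes the clusters of near-duplicate entries easy to inspect before deciding which questionnaires to discard.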
Very large prizes produce diminishing returns: when the prize is doubled, the response rate rises only a few percent. An amount that I've found effective is the value of about a day's wages for the average respondent. For most people, this would be a pleasant amount to win, but would not make them distort their answers.
Offering several small prizes doesn't work as well as one large prize - unless respondents can clearly see that they have a higher chance with a small prize - for example, one prize in each village surveyed.
Take care that the prize offered is something which won't discourage potential respondents who already have one. A poorly chosen prize can affect the accuracy of a survey. For example, a survey in the late 1980s set out to measure the musical taste of the Australian population. A compact disc player was offered as a prize. As the people who were most interested in music probably owned a CD player already, the survey would have underestimated Australians' interest in music.
In wealthy countries, the most effective kinds of prizes are often small luxury items that cannot be stored up. Vouchers for restaurant meals are often very effective. An incentive that worked well in a survey of rich businessmen was the chance to meet others of their kind: we offered a meal in a high-priced restaurant for 10 respondents. They were so keen to meet each other that some even offered to pay for their meals if they didn't win a prize!
Don't begrudge spending money on rewards: the cost is usually more than offset by the smaller number of questionnaires you need to print and mail out to achieve the same number of responses.
2. It's best to use two different kinds of reward at the same time: psychological incentives as well as financial ones. By psychological incentives, I mean reasoned appeals to complete a questionnaire. These arguments can appeal to either self-interest or philanthropy - sometimes both. For example:
Because people who don't use a service much also tend not to respond to surveys about it, it's a good idea to add another emotional appeal, perhaps something like this:
Another type of psychological incentive is a promise to tell respondents about the survey's results. This is simplest to fulfil when you have regular contact with respondents, and notifying them of results could be as simple as putting an article in their next newsletter. One point in favour of this type of incentive is that it can work with people who are unaffected by other types of incentive.
Psychological incentives work well with regular users of a media service, but non-users don't usually have any emotional attachment. Non-users usually respond well to financial incentives, but regular users respond little to financial incentives - unless the prize is very large. That's why using both types of incentive will produce a better balanced sample than using only one type.
Compared with questionnaires for telephone surveys (which have to be read and completed only by interviewers), self-completion questionnaires need much more care taken with their design.
Before having a questionnaire printed and mailed out, it's essential to test it thoroughly with 5 to 10 respondents: some old, some young, some not very bright. Don't use your office staff, or your friends or relatives: they know too much about your intentions. Go to strangers, who have never heard of your survey before - but if the questionnaire is only for listeners to your station, obviously the strangers should also be listeners. Sit beside them while they fill in the draft version, and ask them to think aloud.
You need to make sure that:
You'll find that even after you have produced many mail questionnaires, occasional problems still occur. Having 5 to 10 testers fill in your questionnaire is an excellent way of improving the survey - as long as you take notice of any problems they have, and any misunderstandings. If the testing results in extensive changes to the questionnaire, find a new lot of testers - 5 is usually enough, the second time.
Tell them exactly how to answer
In every question, make it clear to respondents how many answers are expected, using instructions like these, placed after the question but before the possible answers:
Please tick one box.
Please tick all answers that apply to you.
Please give at least one answer, but no more than three.
Use a consistent method for answering. If you ask respondents to answer some questions by ticking, and others by circling code numbers, you'll find that some put ticks where they should have circled codes — and vice versa. To make it easy for most respondents, use ticks wherever possible.
In some countries (including Australia, Japan, and Ethiopia) a tick means "Yes" and a cross means "No." In other countries, such as the USA, a cross means "This applies" - which is almost the same as Yes. People from countries where a cross means No can get confused and give the opposite answers to the questionnaire writer's intention - so before printing a questionnaire for a country you don't know well, be certain which convention applies there.
I agree with Mr Mohammed's answers. Thanks for the invitation.