Tag Archives: Online research

Talking Data

July 2016

At i-Link, we realise that the delivery of reliable data, on time, is of the utmost importance to our clients. For this reason, we don’t use automated systems to clean the data; instead, that responsibility sits with our Project Managers, all based in Sydney, who conduct the checks manually. These checks, detailed below, are done during the pilot stage to identify the expected rate of poor responses, and then again before delivering the final data file.
  1. Speeders – This involves comparing each respondent’s completion time against the average completion time. Rather than applying a fixed formula, we do this manually so we can look further into the responses of potential speeders and determine whether they have made a legitimate attempt to answer the questions in the survey.
  2. Flat liners – This involves looking at lists of statements rated on a scale; if 95% or more of a respondent’s answers carry the same rating, they are considered not to have made a legitimate attempt at the survey. The context of the question is also considered, since a flat-line response may occasionally be legitimate – for example, answering ‘don’t know’ across a series of statements the respondent has no opinion on.
  3. Bad verbatims – This involves reading each response to the open-ended questions in the survey and determining whether it is a legitimate answer to the question. Again, the context of the question is considered: for example, if a respondent states they are an avid basketball fan but cannot name any basketball teams, their response would be considered junk.
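While i-Link performs these checks manually, the logic behind the first two can be sketched in code. The thresholds, field names and data shapes below are purely illustrative, not i-Link’s actual criteria:

```python
def flag_speeders(durations, threshold=0.5):
    """Flag respondents whose completion time falls below a fraction
    of the median completion time (threshold is illustrative)."""
    median = sorted(durations.values())[len(durations) // 2]
    return [rid for rid, secs in durations.items() if secs < threshold * median]

def flag_flatliners(grid_answers, cutoff=0.95):
    """Flag respondents who gave the same rating to at least `cutoff`
    (e.g. 95%) of the statements in a rating grid."""
    flagged = []
    for rid, ratings in grid_answers.items():
        most_common = max(ratings.count(r) for r in set(ratings))
        if most_common / len(ratings) >= cutoff:
            flagged.append(rid)
    return flagged
```

In practice a Project Manager would treat these flags as candidates for review rather than automatic removals, which is exactly why the checks are done by hand.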
You can ask Patrick or Scott at the below address about anything relating to data quality or i-Link’s services in general! sales@i-linkresearch.com  

Are you using i-Link minipolls?

October 2016

Struggling to determine an incidence rate for a potential project? Looking to quickly pre-target hard to reach respondents? Need to throw in some quick stats for a last minute pitch to a new client?


Book in an i-Link minipoll today and for a small fee we can usually set the poll up and launch within 2 hours. Specify your demographics, question(s), length of time in field and number of respondents and we take it from there!


Contact Patrick via patrick@i-linkresearch.com and he will get the ball rolling.

Related October Stories:

i-Link Members Poll
i-Link Noble Causes
i-Link Streaming Poll 2016


ANZAC Day 2015 Poll

April 2015

ANZAC Day has come and gone, and we polled the members of LiveTribe to find out what they had planned for the day as well as what they thought of not having a public holiday on the Monday. Here is what they told us…




For more information about this poll, or to discuss your online consumer research needs, email our Client Services team: cs@i-linkresearch.com



Client Testimonials


Environmetrics has used the internet surveying services of i-Link on a number of projects over the last few years. On each occasion they brought a professionalism and attention to detail to the project that allowed us to deliver accurate and reliable research results to our clients. They provide a complete service, from creative research design at the outset, to the quick set-up of the survey site and regular contact and data updates as the survey progresses. We have used them for a range of clients and they have been very flexible in adapting to the different survey topics and samples that have been involved. We would not hesitate to use i-Link on future projects and would recommend them to anyone who requires high quality and cost effective internet research services. Pete Wilson, Environmetrics

Inside Story

INSIDE STORY has been using i-Link’s services since 2002. We have always found the i-Link staff willing to please and provide excellent service, going beyond a simple supplier relationship to more of a partnership. Most of the projects we commission from i-Link differ from the usual online questionnaire style and have offered unusual challenges. i-Link has always risen to these and the technicians have never failed to solve any issues that occur. We continue to be impressed with i-Link and are happy to advocate for them. Catherine Anderson, INSIDE STORY

Footprints Market Research

We have been working with i-Link for almost six years now and have found them to be superb in terms of responsiveness, turn-around speed, accuracy and presentation of our questionnaires. Footprints Market Research is pleased to use i-Link for its online survey needs. Time and again, the company has accommodated tight turnarounds, provided high quality output and delivered a professional service. We have been particularly impressed by the ‘can-do’ attitude of Scott, Chris and all the team at i-Link. Nicola Pringle, Footprints Market Research

Wallis Consulting Group

Wallis Consulting Group has used i-Link Research for most of its internet studies since the company formed. We have found all our dealings to be extremely professional and pleasant. We find i-Link’s services to be fast and dependable and the quality of work is exceptional. Our clients have also been extremely happy with the results. So good is the service we receive that we have delayed our own plans to install internet software and continue to outsource this function, safe in the knowledge it is in the hands of experts. Perhaps the highest recommendation is that, on occasion, i-Link staff have dealt directly with our clients. Our clients have been impressed with the level of knowledge and professionalism displayed. We have no hesitation in recommending i-Link as a provider of internet research services. Jayne van Souwe, Wallis Consulting Group

Survey Invitation Routing

March 2013


As demand for online data collection grows, it has become increasingly challenging to maintain research participation rates. Survey routing is a sampling technique that matches a respondent to a number of survey invitations, any of which he or she may qualify for. By contacting a respondent once, it offers multiple opportunities to qualify for a research exercise, rather than an individual invitation to each individual exercise. A panel member’s active response is a valuable commodity, and data collection agencies therefore recognise the value of matching that response to a suitable research activity if and when the member decides to participate.

Matching Willing Participants with Available Surveys

Using a common screening platform, survey routing directs willing participants to a range of available surveys that they could potentially qualify for. It makes efficient use of their willingness to participate and can yield the following benefits:
  • Better engagement – Respondents do not suffer repeated rejections and screen-outs, as the number of opportunities to qualify is significantly increased. When they do not fit the criteria of one study, they are seamlessly directed to the next survey and its specific qualifiers, until they fit the criteria of one of the other surveys in the qualifier pool at that time. They may well not qualify for any, but the odds are greatly improved once they have taken the time to accept an invitation to participate.
  • Improved response rates – Respondents with a diverse but regular participation history will be retained longer as panel members and thus contribute to a higher number of research exercises.
  • Supports the fielding of surveys to low-incidence populations – With a large and consistent stream of willing participants piping into the router, it is easier to screen and match low-incidence targets to available studies.

Randomisation to Avoid Potential Survey Selection Bias

A survey router needs to be carefully programmed with the appropriate sampling considerations. Routers can introduce sampling bias if priority routers are used instead of random routers.
  • A priority router first directs willing participants to one particular survey over other available surveys to prioritise studies with a lower incidence rate or projects with more challenging quota controls.
  • On the other hand, a random router exposes the participant to the whole pool of surveys, for which he or she is screened at random.
A well-designed router implements multiple points of randomisation to eliminate bias. Below are some important considerations before a survey is run through a router:
  • It requires a regular supply of live surveys with a diverse range and healthy number of projects running within the system at any one time.
  • Simultaneous screening – Willing participants are exposed to common screening questions from all of the projects that are currently running, before then randomly allocating the respondents to one of the available surveys.
  • Automatic checking of a member’s participation history, duly limiting the number and category of surveys they are exposed to within any single period of time.
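The random allocation described above can be sketched as follows. The survey records, qualification predicates and quota fields are simplified illustrations, not i-Link’s actual router:

```python
import random

def route(respondent, surveys, rng=random):
    """Randomly allocate a willing participant to one of the live
    surveys whose screening criteria they satisfy (simplified sketch).

    Each survey is a dict with an id, a remaining quota count, and a
    `qualifies` predicate standing in for its screening questions.
    """
    eligible = [s for s in surveys
                if s["quota_remaining"] > 0 and s["qualifies"](respondent)]
    if not eligible:
        return None  # screened out of every live survey
    chosen = rng.choice(eligible)  # random, not priority, allocation
    chosen["quota_remaining"] -= 1
    return chosen["id"]
```

A priority router would replace the `rng.choice` line with a deterministic pick (say, the lowest-incidence survey first), which is precisely where the selection bias discussed above creeps in.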
In conclusion, a survey router can enhance the online sampling process by optimising the use of an immediately available population across a group of simultaneously available surveys. It improves response rates because it gives willing participants numerous chances to qualify for a survey, compared with single direct email invitation samples. In the online environment, competition for people’s time and attention is intense; by offering an enjoyable online research experience, we make respondents more willing to express their genuine opinions and increase their engagement.

Designing Effective Online Surveys

June 2012


Arguably the key benefit of conducting online quantitative surveys is that they are a highly effective way of collecting valuable data from a targeted number of respondents in a relatively short period of time.

However, when designing an online survey, one needs to keep in mind that it is a self-administered interview from the respondent’s perspective. It therefore follows that a well-structured questionnaire will help you obtain the robust measures and actionable insights you are seeking.

Let’s check out some good housekeeping tips when it comes to online survey design!

Define your objectives and keep your focus

First and foremost, you need to clearly define your research objectives. Well-defined objectives will help you determine which questions should make the cut and, more importantly, which should be discarded, keeping the survey as concise as possible. Define your exact hypotheses, then develop a question set to test and confirm them. When designing the questionnaire, it is also important to visualise the eventual data outputs and results in advance; this allows you to hit the ground running with your reporting as soon as you receive your data file.

Diversity of the sample selection

The current penetration rate for internet usage in Australia stands at 89.8% (source: Internet World Stats); however, it’s safe to assume the vast majority have yet to register as members of a market-research-only panel. So, to get the most out of your online sampling, it is important to understand the online behaviour of your target audience. In countries with high internet penetration rates, the general demographics of internet users are likely to align closely with those of the general “offline” population. Use of a larger sample is wise if a suspected divergence exists between the general and online populations. To ensure your findings are statistically sound, specific quota controls should also be applied so that the final sample accurately mirrors the key demographics of your target audience.
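Quota control of this kind amounts to capping completes per demographic cell. A minimal sketch, where the cells (gender by age band) and targets are illustrative assumptions:

```python
def accept(respondent, counts, targets):
    """Accept a complete only while its demographic cell is under target,
    so the final sample mirrors the target audience.

    `counts` tracks completes per cell; `targets` holds the quota per cell.
    Cells not listed in `targets` default to a quota of zero.
    """
    cell = (respondent["gender"], respondent["age_band"])
    if counts.get(cell, 0) >= targets.get(cell, 0):
        return False  # cell full: screen the respondent out as 'quota full'
    counts[cell] = counts.get(cell, 0) + 1
    return True
```

A real fielding platform would interleave this with screening and soft-launch monitoring, but the core check is just this per-cell counter.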

A courteous welcome and thank-you page

“What’s in it for me?” is the question many respondents will ask before deciding whether to accept and complete your survey. Therefore, from the very first page, we need to engage respondents and make them feel appreciated. Start by thanking them for their time, identify who you are, and then clearly set out why the research is being conducted, including the expected interview length. Although appropriate incentives might entice them initially, they will be more motivated by knowing that their opinions are being listened to and could ultimately make a difference to how a product or service develops. Don’t forget a simple “Thank you!” message upon completion of the survey, including for those who are screened out or arrive at a full quota.

Getting your respondents into the flow of the survey

Maximise respondent engagement and stimulate genuine interest by asking interesting questions early in the survey. Ease respondents in by starting with some broad, generalised questions as a form of ‘warm-up’, while always ensuring that any screener questions are placed at the very beginning; from here, funnel down to the more specific qualifying questions. Front-load any important questions that require more considered responses, as respondents may become disengaged or fatigued in the later part of the survey if it proves too long.

The sequence of research topics is equally important. A logical flow across all questions and topics prevents the respondent becoming confused about the direction of the interview and thus less engaged. It is also best to place profiling demographic questions at the back end of the survey, as these questions require little effort to answer.

Keep your survey as concise as possible to avoid respondent fatigue and increase completion rates

The average online attention span of an adult is considered to be less than 15 minutes. It is therefore always advisable to be honest about how long the survey will take to complete, stating it clearly in the email invitation or on the welcome page. Stating a range (for example, 10–15 minutes) is advisable when there are questions that involve routing or piping. If you find yourself with a long survey (25+ minutes) due to the inclusion of multiple research objectives, it is worth considering several shorter surveys as an alternative; by evaluating each subject separately, you will optimise respondents’ focus on that topic alone and reduce the known issues associated with respondent fatigue.

It is considered best practice to use open-ended questions only to capture spontaneous feedback where it matters most. Although open responses can elicit extremely valuable insights, it is best to limit their number, as they always take more time and energy for respondents to express their opinion – so use them only where they are truly needed.

Simplify your survey – brevity, clarity and consistency

Make your survey easier to understand

  • Construct clear and direct questions by using the language that respondents will understand.
  • Use shorter sentences and make your questions as specific as possible. The more text displayed on the screen, the fewer words respondents actually read.
  • If you wish to focus respondent attention on certain key word(s), consider highlighting them in bold or in a different font or colour.

Provide a self-explanatory, clear and jargon free answer list for easy completion

When pre-defining the relevant response options for each question, avoid providing too many choices, ambiguous or overlapping categories (such as “1-2 units only” and “2 – 5 units”), or ‘double-barrelled’ answers (such as “friendly and knowledgeable customer service staff”), as these can confuse respondents and slow down completion.

If relevant, leave open the possibility of other response options, such as “other (please specify)”, to find out what you don’t know, or allow an opt-out response (e.g. “prefer not to answer”, “don’t know” or “does not apply”) where respondents are unable to respond in the way that has been prescribed. Forced answers introduce false agreement, dissatisfaction and therefore inaccuracies into your final data set.

Use consistent rating scales throughout the survey

Another key point is that unbalanced scales are inappropriate and almost always lead to biased results – for example, presenting rating scales in different directions and of different sizes (positive to negative or vice versa) within the same survey:

  • In one question, using a 10-point liking scale running from ‘Do not like it at all’ as code 1 to ‘Like it very much’ as code 10; and
  • In another question, using a 5-point rating scale running from positive to negative, i.e. from ‘Extremely appealing’ as code 1 to ‘Very unappealing’ as code 5.

With rating scales, it’s also best to avoid neutral response options where possible, and to avoid denying respondents a “not applicable” or “don’t know” option. Design a good question with valid and thorough response options; this ensures your data is valid and representative at the back end.

Avoid cumbersome and complex question types

Although grid questions are commonly used to ask a series of repetitive questions, they can be tedious and complex for respondents to fill out. Respondents can easily become disengaged when presented with a long battery of questions and response options. This can lead to contrived agreement, as they will simply click through as quickly as possible to get it finished, resulting in straight-lined responses and ultimately junk data.

There are ways to make it easy for respondents to complete a survey:

  • Brand logos and product images can be used to make questions more visually engaging and intuitive.

The use of dropdown questions is also ideal for a very long choice list, such as countries or postcodes, and works well for displaying correlated questions together on the same screen. For example:

  • Asking respondents to indicate the make, model, year and type of their car
  • Determining respondents’ socio-economic status by asking them their highest level of education, income level and occupation

These dropdowns give the impression that the survey will take less time to complete, as they reduce the number of ‘clicks’ needed to continue to the next survey screen, and they are more visually appealing because they remove clutter from the screen.

Survey Engagement – Enhancing the Survey Experience

Visual layout and formatting

The physical layout of a question, and the number of questions displayed on screen at one time, are also important considerations, as they can affect interview completion rates and ultimately data collection. One question per screen is always recommended, as it prevents respondents from reading ahead and removes the possibility of them altering their responses after seeing subsequent questions but before submitting.

Depending on the respondent’s browser and screen resolution, they may need to use the scroll bars (vertical or horizontal) to see everything. It is essential to choose the question structure carefully, limit the amount of text and/or images per question, and avoid requiring respondents to scroll horizontally (left to right) within a survey screen. Otherwise they can become confused or frustrated and drop out, seeing the survey as a daunting task to complete.

Use a variety of question types

Like it or not, there is a high potential for respondents to become bored when taking your survey; they then begin to respond in a repetitive manner, their responses become ill-considered, and data quality is the first thing to suffer. Keep your respondents engaged by utilising a variety of question types, presented in a visually engaging way where possible.

Minimise potential bias in your research data

Identifying potential sources of bias enables you to eliminate them before launching the survey.

  • Online surveys allow you to set randomisation rules with ease across questions, response listings and concepts, thus eliminating order bias.
  • Use skip logic and conditional routing to ensure the relevancy of your questions; this can be programmed easily and efficiently when using an online survey to collect your data.
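As a sketch of the first point, response options can be randomised per respondent, seeded from the respondent and question IDs so the order stays stable if the respondent resumes the survey. The seeding scheme here is an assumption for illustration, not any particular platform’s behaviour:

```python
import random

def shuffled_options(respondent_id, question_id, options):
    """Return the response options in a per-respondent random order.

    Seeding the generator from (respondent, question) means the same
    respondent always sees the same order for a given question, while
    order bias still averages out across the whole sample.
    """
    rng = random.Random(f"{respondent_id}:{question_id}")
    shuffled = options[:]  # copy so the master option list is untouched
    rng.shuffle(shuffled)
    return shuffled
```

The same seeded-shuffle idea extends to rotating question blocks or concept order in a monadic test.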


Creating a great survey is a skill that requires attention to many details. How questions are asked, and how the survey is laid out, may influence how people understand and answer them. Engaged respondents will answer your questions fully and accurately, resulting in better insights and higher-quality data, because they feel they are genuinely contributing to something meaningful and sharing their opinions in a constructive way, not just trading their personal time for a quick reward.

To conclude, below is a summary of some important considerations to keep respondents engaged throughout the interview process:

  • Communicate to respondents that their answers matter
  • Keep the language plain and simple, jargon free and avoid ambiguous questions and instructions
  • Make the survey as short as possible, consistent with meeting the clearly understood research objectives
  • Relevancy and accuracy, avoid the temptation to ask ‘nice to know’ questions
  • Wording style, question type and a logical question sequence are important to get right
  • Asking questions in an unbiased way will give you reliable data to analyse
  • Check for bias by randomising the order of your questions and response choices
  • Remember, keeping your respondent engaged is the key to better research data
  • For more advice simply get in touch with our Client Services Team (cs@i-linkresearch.com) who can provide you with consultation and support on constructing an effective online questionnaire and research design.

    Author: Megan Kuek – i-Link Research Solutions


i-Link Quick Stat – QR Technology

May 2012


At i-Link we regularly profile our members via quick polls to determine their usage of everyday products and services. Recently we polled our members on the very topical subject of smartphones and the emerging use of QR codes, or Quick Response technology.

To briefly explain, a QR code is a two-dimensional barcode which smartphone users can scan via a specially downloaded reader. The benefit is that it allows the user to navigate to a specific website without having to manually type in an often lengthy URL. It is becoming quite popular for a variety of reasons, with marketers especially using it to communicate with their customers in store, or via print placements on brochures, tickets and receipts. The custom sites customers are directed to will often provide special offers on new product lines, advance ticket sales, the client’s social media sites, directions to the nearest store, or even opportunities to join a retailer’s online shopper community.

QR codes are also being actively used to connect with customers from a research perspective. Activities such as “live event measurement” are now made easy by positioning QR codes at specific locations around a venue to measure how customers are actually experiencing a game, concert or cinema visit, to name a few. QR-assisted research is operational and spontaneous; it essentially facilitates live, “in the moment” feedback. Respondents are provided with a facility to quickly and easily access an online poll or mini survey to record their experience at the very time of engagement – undoubtedly a very powerful measure to be able to capture.

Example QR Code:

Recognising the rapid integration of QR codes into our daily lives, we recently polled 338 of our LiveTribe Research Panel members to understand, at a top line:
  • Their level of and type of smart-phone ownership
  • Their use of QR readers and willingness to use in-store or at live events.
Of the 338 members we randomly polled, 55.3% said they currently owned a smartphone. Of those who owned a smartphone, Apple and Samsung were cited as the most popular brands.

Do you own a ‘Smart Phone’? (i.e. a phone with internet access, a camera and a music player, which can run apps)

Which brand is your current smart phone?

Of those who currently owned a smartphone, approximately 28% were aware of QR technology, and of these a further 60% stated they had already downloaded a QR reader. Over 70% of those who have downloaded a QR reader have gone on to actually use it, with the majority also indicating they are open to using it again in a spontaneous capacity, be it in store, at a sporting match, at a concert or even the cinema.

For both researchers and marketers, this new, easy-to-use and flexible technology presents another way to capture and measure consumers at the point of interaction; the possibilities from both a marketing and a research perspective are genuinely endless. Needless to say, i-Link’s advanced survey system i-Question can easily be tailored to operate via a QR access code, and our clients have already begun using this new medium to position invitations to short, concise measurement surveys at live events, at functions and on in-store signage. Given that it will be accessed via a smartphone, the interview must be extremely concise (5 questions maximum), or respondents will opt to close the survey. Used properly, however, it can capture those invaluable point-of-interaction measurements, sign-ups to social media, or membership of an organisation’s online marketing and research community.

For further details on how this technology is being used by our clients, call the i-Link Client Services Team on +61 2 9262 7171 or +61 3 9863 7144, or email us at cs@i-linkresearch.com
