January 11, 2016
We recently had a good question from a CEO who had just heard about Adaptive Survey® Technology: “Why should I use Adaptive instead of SurveyMonkey or any other tool my people already know?”
One of our advisors, Professor Raghu Santanam from Arizona State University, jumped right in with a short definition of the various uses of research, “There are three basic uses for research…”
- Descriptive surveys are backward looking and capture what happened to which demographic in the past. CX (Customer Experience) surveys are in this category.
- Confirmatory surveys confirm information you already know by taking a current reading. Some of these are tracking over time to see if anything changes – NPS or other tracking surveys for example.
- Discovery surveys are forward looking; you want actionable insights that lead you to do something.
Adaptive Survey® technology falls primarily into the Discovery category. It is used to generate new ideas, innovations, or simply unexpected opportunities to delight customers. If you are looking for new and actionable insights in priority order, Adaptive is the right type of tool for you. Adaptive generates new ideas not conceived in a traditional design.
The first and only tool for doing Adaptive Surveys® is at GroupInsight.com.
Once you see some results from your discovery survey, you’ll find that Adaptive is also useful in descriptive and confirmatory surveys, replacing open-ended questions and collapsing multiple ratings into one Adaptive Question™. Many surveys can be reduced by 80% this way; 30 questions become 5 or 6. Adaptive surveys require dramatically fewer questions, yet provide more business insights.
March 1, 2013
An Adaptive Survey® is a market research method that combines qualitative and quantitative research features. This unique combination allows researchers to speed up the research process by gathering ideas and prioritizing them in the same research project.
Adaptive Surveys are offered by CloudMR, Inc. and this blog is related to that company. The benefits of this technique…
- Systematically gather and prioritize open-ended text in a single project
- Replace dozens of traditional market research rating scales with a single Adaptive Question™
- Answer questions you didn’t even know to ask
- Get higher response rates since Adaptive Surveys® are short and conversational
- Add structure to unstructured data
- Prioritize ideas using any representative sample you choose
December 3, 2011
The likely-to-recommend question was designed to work at the brand level – usually a company…
How likely are you to recommend ABC Corporation to your friends, family and colleagues?
It also seems to work well at business unit and product levels if customers recognize those as a brand.
I see companies asking this question at the functional level, which makes no sense to me. I saw one survey recently where the question was asked at the company level, the division level, the business unit level, the product level, and at every functional level imaginable.
How likely are you to recommend ABC Corporation’s tech support representative to your friends, family and colleagues? Really? My answer…
Well, it depends. Does my friend own ABC Corporation’s product? If they own the product, do they have a need for technical support? Does my friend have a support contract? Does my friend get to choose the tech support representative when they call ABC support? Since I don’t have the answer to any of these questions, it makes no sense to ask this question at this level, and my answer is likely to be 5 – a detractor. Even better, I might just quit the survey at this point. Why would you ask such a weird question?
If you wonder why your most specific questions don’t make sense when you try to roll them up to an overall company NPS, this is probably why.
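For context on why a 5 counts against you: NPS on the standard 0–10 likely-to-recommend scale counts 9–10 as promoters, 0–6 as detractors, and drops 7–8 as passives. A minimal sketch of that calculation (the sample scores are made up for illustration):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Three enthusiastic customers plus one confused "5" response:
scores = [9, 10, 9, 5]
print(nps(scores))  # prints 50.0
```

The point is that a single baffled respondent answering 5 to an unanswerable question is scored exactly like a genuinely unhappy customer, which is why rolling such questions up distorts the company-level number.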
May 29, 2011
The loyalty world has an ultimate question, so why shouldn’t word-of-mouth marketing? The key issue for any word-of-mouth program is the message. I think we want to know what message resonates most with members of our target group. Among that group, what would they say to a friend to convince them to try abc company? Something like, “If you were going to recommend abc company to a friend, what would you tell them?”
What do you think is the ultimate word-of-mouth marketing question?
May 1, 2011
Assuming we can get past the user experience with Questions, exactly who do the responses represent? The questions are viral, and I like that feature a bit because I can always pull a representative sample from the total. The only problem is that Facebook’s website interface doesn’t return some of the demographics, even when respondents agree to let you see their information. We have had trouble getting anything more than gender from Facebook. Other data that would be useful to our clients when drawing a sample…
- Location (at least country)
- Social graphics such as number of friends, activities, etc.
May 1, 2011
I tried Questions a few days ago and wanted to give myself a little time to think before responding. It seems like a good step on the surface but it seems like market research issues are always in the details.
Facebook limits me to 10 answers, which cuts out my favorite eleven-point scale (zero to ten). That is not even the bad news. The user experience is probably something that most researchers won’t like.
My test using a ten-point scale showed only part of the question on my wall, and only three of the ten answers appeared. I think that is a problem when we want responses based on people’s initial reaction. Questions requires respondents to do something before they can even read the entire question or see the entire answer list. I’m just not comfortable with those results. How do you feel about that?
February 20, 2011
Lots of engineers around me are talking about sentiment analysis. Most of the market researchers I know are more than skeptical about it. I can see the allure of some sort of magical box that will automatically make sense of all of these verbatim comments, but for me it doesn’t really matter. Just give me the handful of comments that resonate with most of the respondents and I’ll read them myself – sentiment and all. That is what is so exciting about some of CloudMR’s early testing of their proprietary algorithm. It doesn’t include fancy text analyzers or extra complexity. It quickly generates a score for each comment and sorts them. Interestingly, the early models stratify the comments and group like ideas together. Since I can easily see that grouping, the algorithm is clearly doing something right and producing the top ideas. It will be fun to see this implemented over the next few weeks. Jeffrey Henning has an interesting post about this same issue from last summer.
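The general idea — score each comment, sort, and read the top few yourself — can be sketched simply. This is only an illustration of that workflow, not CloudMR’s proprietary scoring; here the score is a hypothetical count of how many respondents endorsed each comment, and the data is made up:

```python
def top_comments(comments, k=3):
    """Rank verbatim comments by a per-comment score and return the top k texts.

    comments: list of (text, score) tuples, where score is assumed to be
    how many respondents endorsed that comment (hypothetical input).
    """
    ranked = sorted(comments, key=lambda c: c[1], reverse=True)
    return [text for text, _ in ranked[:k]]

# Made-up endorsement counts for illustration:
votes = [
    ("Checkout is confusing", 42),
    ("Love the new dashboard", 37),
    ("Support response is slow", 29),
    ("Add dark mode", 12),
]
print(top_comments(votes))
```

No text analytics required: the researcher reads the handful of top-scoring comments directly, sentiment and all.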