by Thania Farrar
The term "crowdsourcing" is used loosely in market research circles. In its most general sense, it is an attempt to gather input from "a crowd" in the form of ideas, votes, feedback, or even funding to help an entity or solicitor make decisions. In market research, crowdsourcing means different things to different people, and there is a continuing struggle to understand its role in our industry. Does it replace or enhance traditional research methods? Is it the same or different, better or worse than traditional methods? Let's examine crowdsourcing a bit more closely to help answer these questions.
Although not an entirely new concept, crowdsourcing has received a significant boost in recent years due to the advent of technology, voting shows like American Idol, and the new culture of social media interactions. It is seen today as more of a natural proposition for both solicitors and crowds. Crowdsourcing provides an opportunity for real engagement and goes beyond a one-sided interaction. Crowd participants don't only respond or vote; they engage with the solicitor and with other participants, and there is power in their collaboration. Examples of successful applications exist in the tech sector: Apple's open source initiatives (which invite outside developers to contribute to its open source software), Dell's Social Innovation Challenge (an opportunity to help solve world social issues and problems), and Kickstarter (which helps fund startups with new creative ideas that people want to bring to market).
When it comes to the areas where companies tend to leverage market research inputs (product development, marketing, customer service feedback, etc.), crowdsourcing has been applied with varying levels of success. A quick online search yields more examples of crowdsourcing gone awry than successful attempts. Some of the worst, yet amusing, outcomes involve naming a product or selecting product options. My favorite is the search for a new mascot for Ole Miss, which resulted in the student body selecting Admiral Ackbar from Star Wars. Sadly, Lucasfilm was not aligned, and the crowd's vote was vetoed. In another example, NASA requested help in naming a new module of the International Space Station and the crowd selected "Colbert" (influenced by Stephen Colbert himself asking viewers of his popular TV show to vote). That didn't go over too well with NASA, and that vote was also rejected. Lastly, Kraft's popular Australian spread, Vegemite, solicited a name for a new version of the product, which yielded the head-scratching name "iSnack 2.0". Surprisingly, this one actually made it to market.
It seems to me that many of these crowdsourcing attempts have something in common. They are short-term, broad-reaching efforts with more of an awareness-building objective than a problem-solving objective. They seek to attract attention and generate discussion, but do not deliberately seek the meaningful output that results from true crowd collaboration. In the failed attempts, even if the crowd's choice is rejected, the solicitor has built awareness for the endeavor, which some see as a success in and of itself. If the end goal is an improved output from new ideas, I would argue market research does a better job of identifying winning propositions, albeit with less pomp and circumstance.
However, not all crowdsourcing attempts have gone wrong. LEGO Ideas and My Starbucks Idea are two successful applications of crowdsourcing that recently caught my eye. Both sites support and encourage people to submit ideas that aim to help the company create new products and services. In the case of LEGO, participants contribute ideas for new sets and models. For Starbucks, people offer suggestions for new products and service experiences. Both platforms encourage engagement among participants, allowing them to vote on others' ideas and provide feedback so that ideas can be improved upon. The most interesting aspect of these sites is that participants are truly helping shape what makes it to market, and they can see which ideas LEGO and Starbucks chose to pursue. With the objective of learning, these two companies have created platforms for ideation, obtaining feedback, and gaining an understanding of how successful an idea might be in the market. These platforms are managed carefully and have clear rules of engagement, which helps avoid the silliness that can result if things are left unchecked.
In my opinion, LEGO and Starbucks have been successful in their crowdsourcing efforts because of the following:
- The effort is ongoing – The platforms are always open and they provide the crowd ample time to generate discussion and vote.
- The rules of engagement are clear – LEGO, in particular, has very specific guidelines that each idea must meet in order to be considered.
- Others are encouraged to vote and provide feedback – Both platforms employ simple voting buttons, comment sections, leaderboards, indicators of the time remaining to vote, etc.
- They provide idea status information – Both let participants know which ideas reach a threshold (if one exists), which ideas are more popular, which ideas are under review, and which products hit the market.
- Winning ideas are recognized…and the companies make a big deal out of it – LEGO, in particular, does a great job of recognizing winners by creating profiles of the people behind the ideas and sharing their stories. The contributor also receives a royalty from sales of the product.
- Both companies make it clear that they have the final say – Based on the amount of interaction on these sites, the crowd clearly has a vote, but the company has the final say.
Done well, crowdsourcing can offer a different way to gather ideas from a variety of people into a pipeline and, at the same time, build brand equity through engagement. Crowdsourcing is another way to collect input for product development and customer experience improvements that can then be refined and tested via more traditional research methods. There is no guarantee that an idea selected by a crowd will automatically succeed in market, but market research can help compensate for the fact that the crowd is not a scientific sample and thus cannot be projected to a population with confidence. A downside to crowdsourcing is the tremendous effort required to set up and manage a platform like LEGO's or Starbucks'. It also takes a significant commitment of resources to evaluate the many ideas submitted and then connect them to the company's strategy, something the crowd is usually unaware of when submitting ideas. Lastly, a solicitor must have the ability to deliver multiple products to market, in a reasonable timeframe, that it can directly attribute to inputs from the crowd. The solicitor has to demonstrate results to keep people engaged.
In closing, crowdsourcing is one more tool in the kit for generating new ideas that help companies solve problems, meet the needs of their consumers, and grow the bottom line. I believe crowdsourcing can work in concert with existing processes and research methods. The engagement aspect is unique and something traditional market research doesn't provide; however, crowdsourcing isn't something you simply dabble in – it's a commitment. It must be done well in order to harness the benefits of engagement for the company or brand.
As Vice President of Research Innovation at Burke, Inc., Thania Farrar is constantly challenging the ways researchers gain insight into what people do and why to give our clients a competitive advantage.