Once seen as an experimental technique, crowdsourcing has found acceptance as a mode of technology innovation and implementation across organizations. Gartner has estimated that by year-end 2018, crowdsourcing (the practice of tapping a distributed labor force of mainly freelance workers, typically through digital channels, to complete a specific piece of work) will constitute 20 percent of all enterprise software application-development initiatives.
Enterprises and technology providers are increasingly turning to crowdsourcing because of the benefits it offers:
- Talent: As demand for increasingly niche and sophisticated skills increases, crowdsourcing is becoming a flexible way to access new talent. Organizations are also using internal crowdsourcing marketplaces to help employees hone new skills and keep them engaged.
- Speed: Huge projects that might otherwise take months to complete can be broken down, parceled out, and completed by multiple teams working in parallel. At one company, crowdsourcing reduced what would normally be a nine-month-long research project to three months.
- Cost: The average cost of a crowdsourced project is usually significantly lower than what most organizations would spend internally to develop the same solution. Analysts estimate that within IT-services businesses, crowdsourcing can increase productivity by up to 9 percent and reduce costs by up to 7 percent.
One technology major sourced three different analytical models through crowdsourcing and developed a best-of-breed solution at approximately 5 percent lower cost than deploying a full-time team.
That shift in demand has been matched by a change on the supply side, as many skilled workers are now looking for opportunities to work on diverse sets of problems with no long-term employment commitment. Crowdsourcing platforms give them the flexibility of choosing the when, what, and how of their work. This development has now made it easier than ever to hire niche talent from around the globe to work on specific problems on short notice.
Despite the clear benefits of crowdsourcing, we find that companies are consistently running into pitfalls:
- Poor understanding of what tasks and platforms are best suited for crowdsourcing. Companies are often tempted to crowdsource a wide range of tasks without understanding which activities are suitable and which are not. That lack of clarity often extends to platform selection. Different crowdsourcing platforms are optimized for different activities.
- A management apparatus that can’t address the specific demands of crowdsourcing. Enterprises have developed management roles and practices to manage traditional IT projects, but those capabilities cannot address the specific circumstances of crowdsourced projects. New capabilities are needed to focus on process, integration, and management to support crowdsourcing.
- Underestimating the complexity of working with established architecture. Standard IT structures tend to be monolithic and made up of a mix of large and often-outdated systems. That lack of flexibility can radically increase the complexity of crowdsourcing as IT managers try to address an often-baffling set of system dependencies and configurations for any given project.
- Insufficient attention to client confidentiality. The remote nature of crowdsourced delivery means that enterprises need to enforce high levels of client confidentiality. But the complexities of sharing, allowing access to files, and validating solutions require a clear approach to confidentiality protections before a project even begins.
- Misaligned pricing incentives. Too often, enterprises default to traditional pricing practices that lead to poor outcomes, do not attract the best talent, and have a low ROI.
Overcoming these challenges requires revisiting existing development-process, governance, and program-management protocols. Based on our experience with some of the largest IT firms experimenting with crowdsourcing, we have found the following five guiding principles to be the most important:
1. Focus on a clear and narrow set of use cases supported by the right crowdsourcing platform
We have found that discrete, well-defined problems with clear input and output criteria (whether low-end tasks such as data entry or sophisticated tasks such as analytics modeling) are the most appropriate for crowdsourcing. Testing, for example, is well suited to crowdsourcing, particularly when the testing needs to be performed on multiple platforms and across multiple markets and geographies (for more options, see Exhibit 1).
To increase the probability of success, it pays to choose projects that require minimal knowledge of the organization’s processes, technology, and data, and that do not require access to specific licensed software. One IT-services company working with a financial-services client, for example, was tasked with developing a customer-profiling model for a fraud-detection algorithm. The IT-services firm didn’t have the requisite talent available in the specified time frame, but since the work was very specific and didn’t require broader knowledge of the financial company’s systems, it crowdsourced the work. The firm found the talent more than 100 percent faster than it could have through hiring, and the work led to 10 to 15 percent greater accuracy for key customer segments in the profiling model.
Projects that come with very strict service-level agreements (SLAs) are tougher to execute in a crowdsourcing environment. Crowdsourcing is also less well suited to projects or tasks that require significant direct interaction with customers, since distributed contributors tend to create confusion alongside the organization’s other customer-facing functions. In general, we have found that cloud apps are better suited to crowdsourcing than noncloud apps (see Exhibit 2).
Project suitability also extends to platform selection. Some crowdsourcing platforms focus on specific types of work, such as coding and testing, while others focus on specific features, such as target audiences. At one IT-services client, we found that niche platforms such as Applause and Kaggle were better suited for crowd testing and crowd analytics, respectively, versus a “generalized” platform like Freelancer. In addition to considering the nature of the task, organizations should also assess the maturity of the platform offering, the size and diversity of the crowd available on the platform, pricing-model flexibility, and support for activities such as project or architecture management.
2. Set up robust team roles and responsibilities to manage crowdsourcing
The standard set of roles and tasks that IT-services businesses have in place for standard programs doesn’t work for crowdsourced projects. A captive unit of a global bank realized this quickly as it tried to crowdsource an application as part of its systems-modularization effort. So it created dedicated roles to manage the project, hiring a crowdsourcing pilot manager and an integration specialist to support the crowdsourced application.
Crowdsourcing requires a specific set of management approaches and roles:
- Lead manager: Prioritizes crowdsourcing efforts, liaises with internal business and technology leads to validate needs, manages budgets, and allocates resources as needed.
- Integration specialist: Works directly with the project lead to smooth the transition of the element back into the business by prepping the people and environments, shares feedback on the product with the crowdsourcing team, does quality control, and oversees testing. We found that this person plays a large role in reconfiguring the anonymized crowdsourced element to incorporate the organization’s actual data and in ensuring its smooth integration into the rest of the project.
- Project lead: Defines the input data and interaction model to be used up front; then, working with the planning team, finalizes the functional and technical specs of the task. The project lead also codifies the design principles, validations, and desired outputs at the outset so that all parties have a clear sense of the governing guidelines (without getting so detailed as to constrain innovation and creative problem solving), manages the project (timelines, milestones, validation), and works with the integration specialist to integrate the completed work back into the business.
- Planning team: Develops detailed functional specifications for each module of the project, using standardized use-case modeling frameworks such as Unified Modeling Language (UML). This task is crucial, we found, because accurate and detailed specifications must be in place before reaching out to the crowdsourcing community so contributors are clear on what is needed. Particularly challenging is the need to create these requirements without exposing proprietary client data.
- Review panel: Reviews and evaluates crowdsourced elements when they are submitted, a process that helps ensure alignment among all stakeholders. The panel should mix people with the relevant skill sets to do the evaluation quickly: for example, a solution architect, a project manager, a quality-assurance lead, a business analyst, and the end client.
3. Combine modular design with automated agile delivery to overcome complexities
Modular design using microservices or a service-oriented architecture (SOA) gives the organization maximum flexibility, so that short, self-contained tasks don’t require contributors to understand the complexity of the whole. This approach simplifies the work for contributors, allows the services team to slot in different elements as they are developed and refined, and reduces the back-and-forth between developers and the organization that can delay delivery.
Distributed agile (iterative development) and DevOps (continuous integration and automated testing) practices help teams accelerate and automate quality assurance and code integration, especially when the volume of submissions from the “crowd” is large.
We have found that automatic integration of all developer working copies of the source code into a shared mainline several times a day is crucial. Automating unit tests ensures that no new submission breaks or interferes with existing features, while greatly accelerating the process. One IT-services team used this approach so that the client could see progress and provide feedback more quickly.
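The integration-plus-automated-testing loop described above can be sketched in a few lines. This is an illustrative Python sketch, not a specific CI product's API; the names (`run_unit_tests`, `merge_gate`) and the sample submission are hypothetical.

```python
# Minimal sketch of an automated merge gate for crowdsourced submissions:
# a working copy is folded into the shared mainline only if its unit
# tests pass. All names here are illustrative, not a real CI tool's API.

def run_unit_tests(submission):
    """Run the module's unit tests; here, simple input/output checks."""
    failures = []
    for case in submission["test_cases"]:
        actual = submission["function"](*case["args"])
        if actual != case["expected"]:
            failures.append((case["args"], case["expected"], actual))
    return failures

def merge_gate(mainline, submission):
    """Integrate a submission into the mainline only if all tests pass,
    mirroring the several-times-a-day automation described above."""
    failures = run_unit_tests(submission)
    if failures:
        return mainline, failures  # reject; feedback goes back to the crowd
    return mainline + [submission["name"]], []  # accept into mainline

# Example: a crowdsourced module that normalizes phone numbers.
submission = {
    "name": "normalize_phone",
    "function": lambda s: "".join(ch for ch in s if ch.isdigit()),
    "test_cases": [
        {"args": ("(555) 123-4567",), "expected": "5551234567"},
        {"args": ("555.123.4567",), "expected": "5551234567"},
    ],
}
mainline, failures = merge_gate([], submission)
```

In a real pipeline the gate would invoke the build server rather than call the function directly, but the control flow (test, then merge or reject with feedback) is the same.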
The use of end-to-end automated functional testing helps teams ensure that the project meets the required stability, reliability, cybersecurity, and user-experience criteria. In another example, a different IT-services team provided a set of functional test scripts for tasks it was crowdsourcing. Automated testing allowed the team to review submissions quickly and see that many had too many defects. On examining the test cases, the team realized it needed to make them much more specific about the outcomes expected from each unit of code. This fix drastically reduced the number of defects in subsequent submissions.
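The difference between a vague test and an outcome-specific one is easy to see in code. The following Python sketch uses a hypothetical function (`compute_discount`) purely for illustration; the point is that the second test pins exact expected outcomes, so defective submissions are rejected automatically.

```python
# Illustrative contrast: a vague functional check versus an
# outcome-specific one. compute_discount is a hypothetical
# crowdsourced module under review.

def compute_discount(order_total, is_loyalty_member):
    """Candidate implementation submitted by the crowd."""
    rate = 0.10 if order_total >= 100 else 0.0  # volume discount
    if is_loyalty_member:
        rate += 0.05                            # loyalty discount
    return round(order_total * rate, 2)

def vague_test():
    """Passes for many wrong implementations; defects slip through."""
    return compute_discount(120, True) > 0

def specific_test():
    """Pins the exact expected outcome for each input, including
    edge cases around the volume threshold."""
    cases = [
        ((120, True), 18.0),   # 10% volume + 5% loyalty on 120
        ((120, False), 12.0),  # 10% volume only
        ((80, True), 4.0),     # loyalty only, below threshold
        ((80, False), 0.0),    # no discount applies
    ]
    return all(compute_discount(*args) == expected for args, expected in cases)
```

A submission that mishandles the threshold case would still pass `vague_test` but fail `specific_test`, which is exactly the gap the team above closed.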
To help facilitate the approaches above, leading organizations typically create a “sandbox” environment that recreates the application development stack inside the business, often using the same tools (JIRA for bug tracking, Jenkins for build). This allows the teams to have a high degree of certainty that the crowdsourced elements they test will be accepted in the main environment. Some businesses we have worked with have even begun to experiment with using virtual machines to create test environments in cases where access to proprietary software is required.
4. Anonymize data and establish strong intellectual-property (IP) standards
Organizations that intend to rely heavily on crowdsourcing must integrate an IP strategy into the technology roadmap and system architecture.
Fortunately, the rapid growth and maturity of the sector means that the best crowdsourcing platforms have strong IP management protections in place, and businesses can negotiate for even stricter protections. For more innovative projects, there needs to be a balance between flexibility and confidentiality to allow for creative thinking and freedom to operate.
Likewise, leading companies take special care that tasks placed on external crowdsourcing platforms do not bear the client name or actual data containing personally identifiable information. Organizations can negotiate with platforms to include more stringent privacy protections. For example, one leading IT-services provider negotiated with its crowdsourcing teams to remove any traces of its use cases from the crowdsourced platform after the project was completed.
It is crucial that shared data be anonymized and secured before exposing it to the crowd. Compartmentalizing system components based on the nature of the data allows less-confidential data to be exposed easily to the crowd, while keeping highly confidential data secure.
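A minimal version of this anonymization step can be sketched in Python. The field names, salt, and record below are illustrative; a real deployment would manage the salt as a secret and follow the organization's privacy and compliance policies.

```python
# Sketch of anonymizing records before exposing them to the crowd:
# confidential fields are replaced with stable, irreversible tokens,
# while less-confidential fields pass through. All names illustrative.
import hashlib

PII_FIELDS = {"name", "email", "account_id"}  # assumed confidential fields
SALT = "org-secret-salt"                      # placeholder, not a real key

def pseudonymize(value):
    """Replace a PII value with a salted hash token so records can
    still be joined on the token without revealing the original."""
    return hashlib.sha256((SALT + value).encode()).hexdigest()[:12]

def anonymize_record(record):
    """Hash confidential fields; pass the rest through for the crowd."""
    return {
        k: pseudonymize(str(v)) if k in PII_FIELDS else v
        for k, v in record.items()
    }

record = {"name": "Jane Doe", "email": "jane@example.com",
          "account_id": "AC-991", "txn_amount": 42.50, "txn_country": "DE"}
safe = anonymize_record(record)
```

Because the token is deterministic, the same customer maps to the same token across records, so the crowd can build models on relationships in the data without ever seeing the underlying identities.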
5. Price based on value
IT-services companies are accustomed to pricing work through standardized dollars-per-hour rate cards. This approach can work for standardized low- or medium-complexity projects such as basic manual testing. For more innovative tasks, however, more-advanced organizations use premium pricing or even outcome-based pricing.
Successful pricing approaches require new KPIs that IT-services firms generally don’t use. For example, one company paid crowdsourcing talent for the number of tests conducted but quickly found this to be very expensive. So they changed the KPI to the number of bugs reported and then shifted the pricing to pay more if the reported bug was severe. This approach led the crowdsourcing talent to focus on finding critical bugs.
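The severity-weighted KPI described above amounts to a simple payout formula. The rates and bug records in this Python sketch are illustrative placeholders, not figures from the company in the example.

```python
# Hypothetical sketch of severity-weighted, per-bug pricing: testers
# are paid per accepted bug report, with higher rates for severe bugs.
# Rates are illustrative placeholders.

SEVERITY_RATES = {"critical": 200, "major": 80, "minor": 20}  # $ per bug

def payout(bugs):
    """Total payment for a tester's accepted bug reports."""
    return sum(SEVERITY_RATES[b["severity"]] for b in bugs)

bugs = [
    {"id": "BUG-1", "severity": "critical"},
    {"id": "BUG-2", "severity": "minor"},
    {"id": "BUG-3", "severity": "major"},
]
total = payout(bugs)  # 200 + 20 + 80 = 300
```

Under a per-test KPI, all three reports would have earned the same amount; under this schedule, one critical bug pays ten times a minor one, which is what steers the crowd toward hunting for critical defects.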
Some companies have learned to embrace pricing flexibility and creativity. One IT-services firm looking for user experience (UX) work priced the submitted elements based on the value created for the client. To encourage a range of good ideas, they agreed to pay for the top three entries. Borrowing the best ideas from each, they created a single optimal solution for the client. This approach, paying on a project basis, also tends to attract the most innovative people, who want to be paid by the complexity of the work rather than by the hour.
In other cases, companies will add bonus pricing to incentivize teams with a particular background or skill set. Examples include a loyalty bonus to encourage developers who have successfully worked with the enterprise before, a reliability bonus that rewards teams for projects that consistently pass a predefined performance threshold, and an open-source bonus that encourages teams to use open-source libraries, which facilitate reuse.
With freelance or independent workers making up an average of 14 percent of the business-services industry, businesses have increasing access to a flexible pool of talent. Managed well, crowdsourcing has the potential to be a significant competitive advantage for IT-services companies and enterprises.