This post was written by Innovation Program Manager Dan Kuster, who designed and developed much of the next-generation Prodigy online scoring system.
When we ask Solvers and Seekers how to improve the open innovation experience, we consistently hear about a need for quick and useful feedback. After all, Solvers commit time and resources for a chance to win an award, and it can be hard to know whether the effort will be worth it. Seekers feel the burden of uncertainty too: they expose valuable problems to the world in the hope that someone will have a solution. If it were possible to pre-evaluate a submission, Solvers could estimate the risk of developing a full solution, and Seekers could manage an open innovation project with greater certainty about the results.
The Prodigy online scoring system was developed to provide that quick, objective feedback: each Solver receives a single score indicating the quality of their submission relative to the Challenge requirements and to other Solvers. The first incarnation of Prodigy compared a Solver's predictions to a known answer and reported how well the Solver predicted it. The next generation of Prodigy takes online feedback further by allowing Solvers to upload native R code. A Solver's code is evaluated dynamically, in real time, on our standardized server hardware, where performance can be measured objectively on an independent set of data. For Solvers, this means you can spend your effort developing good code, and when your score is good, you know it is worthwhile to invest the time in making a full submission. For Seekers, this means submissions are guaranteed to work, because performance has already been demonstrated on an independent system with independent data.
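To make the evaluation loop concrete, here is a minimal sketch in R of the kind of scoring step described above: run a Solver's predictive code on held-out data the Solver never sees, then reduce the result to a single number. The function names and the RMSE metric are assumptions for illustration only; the actual Prodigy metric depends on the Challenge.

```r
# Hypothetical sketch of a Prodigy-style scoring step (all names illustrative).
# A Solver submits a prediction function; the server evaluates it against a
# held-out answer set and reports one number (here, lower is better).
score_submission <- function(predict_fn, holdout_inputs, holdout_answers) {
  preds <- predict_fn(holdout_inputs)               # run Solver code on independent data
  sqrt(mean((preds - holdout_answers)^2))           # root-mean-square error as an example metric
}

# A trivial example "Solver" model: predict the mean of the input column
example_solver <- function(newdata) rep(mean(newdata$x), nrow(newdata))

holdout <- data.frame(x = c(1, 2, 3, 4))            # inputs the Solver's code sees at scoring time
answers <- c(1.1, 1.9, 3.2, 3.8)                    # known answers, hidden from the Solver

score_submission(example_solver, holdout, answers)  # single score reported back to the Solver
```

In a real Challenge the held-out answers stay on the server, so a good score demonstrates genuine predictive performance rather than a fit to data the Solver already had.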
If you are interested in trying out the Prodigy scoring system for yourself, check out this Challenge: https://www.innocentive.com/ar/challenge/9932794. An upcoming blog post will show “How To” write R code for this Challenge, make a submission to the Prodigy online scoring system, and see how you stack up against other Solvers.
While the Prodigy scoring system begins to address the need for quick and useful feedback early in the open innovation experience, we realize there are even more opportunities to help Solvers focus on the most interesting and valuable solutions, especially beyond the domain of computational or analytical Challenges. Do you have an idea that would make the open innovation experience better? Leave a comment to let us know!