
The ODNI-OUSD(I) Xamine Challenge: Machine Verification of Collected Information

Award:  $25,000 USD
Deadline: Jul 02 2018 23:59 EDT
Active Solvers: 85
Posted: May 03 2018
Challenge ID: 9934079
 
Abstract

Machine-based approaches to generating and evaluating analytic products from disparate structured and unstructured data types are emerging areas of research for the U.S. Intelligence Community (IC). As these approaches mature beyond demonstration systems with controlled data sources, such IC systems will require a means for inspecting and ensuring the integrity of the data that are ingested by these systems.  These considerations will become particularly critical as the information available to the IC’s analytic community continues to outpace the capacity of traditional, human vetting.  Accordingly, the ODNI and OUSD(I) are seeking ideas and descriptions of a viable technical approach for enabling the automated validation of information prior to the dissemination of machine-generated intelligence products. A total award pool of $75,000 is available for this Challenge with a guaranteed payout of $25,000.

This is an Ideation Challenge with a guaranteed award for at least one submitted solution.

Overview

Machine-driven approaches to generating and evaluating analytic products from disparate structured and unstructured data types are emerging areas of research for the U.S. Intelligence Community (IC). As these approaches mature beyond demonstration systems with controlled data sources, such IC systems will require a means for inspecting and ensuring the integrity of the data that are ingested by these systems.  These considerations will become particularly critical as the information available to the IC’s analytic community continues to outpace the capacity of traditional, human vetting.

Removing humans from the IC’s information assurance processes will naturally raise additional concerns about the veracity of any resulting machine-generated products.  The success of any automated information-ingestion approach will therefore require ensuring that the quality of the results obtained by such systems is comparable to, and preferably better than, that achieved by trained humans. Doing so will require statistical methods that connect to underlying data, establish confidence hierarchies, model uncertainty and error propagation, and manage risk as a function of time and complexity.
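To make the statistical machinery above concrete, here is a minimal, hypothetical sketch of two standard confidence-propagation rules (the function names and the choice of rules are illustrative assumptions, not part of the Challenge requirements): a noisy-OR combination for independent sources corroborating the same claim, and multiplicative chaining for a conclusion that depends on every link of an inference chain, so that uncertainty compounds.

```python
def combine_confidences(confidences):
    """Combine confidences from independent sources corroborating the
    SAME claim (noisy-OR): the claim fails only if every source is wrong."""
    p_all_wrong = 1.0
    for c in confidences:
        p_all_wrong *= (1.0 - c)
    return 1.0 - p_all_wrong

def chain_confidence(confidences):
    """Confidence in a conclusion that depends on ALL supporting claims:
    multiply, so uncertainty compounds along the inference chain."""
    result = 1.0
    for c in confidences:
        result *= c
    return result

# Two independent 60%- and 70%-confidence sources corroborating one claim:
corroborated = combine_confidences([0.6, 0.7])   # 0.88
# A conclusion resting on a 90%- and an 80%-confidence premise:
chained = chain_confidence([0.9, 0.8])           # 0.72
```

The asymmetry between the two rules is the essential point: corroboration raises confidence above any single source, while each additional inferential step can only lower it.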

Determining the veracity of ingested information is often challenging for trained humans, but additional problems exist for systems employing artificial intelligence (AI) methods.  Machine learning (ML) techniques, for example, can be undermined by intelligent and adaptive adversaries.  Such actors have demonstrated the ability to manipulate input data, thereby exploiting specific vulnerabilities of ML algorithms and compromising the system’s integrity.  These concerns regarding “spoofing” are generally focused on ML-based applications, such as voice and image recognition, but analogous approaches can be used for additional artificial intelligence-based techniques, such as natural language processing.
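One rudimentary guard against the input-manipulation concern described above is to screen incoming data for statistical anomalies relative to historical observations before ingestion. The sketch below (a hypothetical illustration, not a proposed solution; real adversarial-input detection requires far more sophisticated methods) flags a new data point whose z-score against prior values exceeds a threshold.

```python
import statistics

def flag_anomalous(history, new_value, z_threshold=3.0):
    """Flag an incoming data point that deviates sharply from historical
    observations -- a crude screen for manipulated or spoofed input."""
    mean = statistics.fmean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        # No historical variation: any departure from the mean is suspect.
        return new_value != mean
    z = abs(new_value - mean) / stdev
    return z > z_threshold

history = [10, 11, 9, 10, 10, 11, 9]
flag_anomalous(history, 50)   # True: far outside historical variation
flag_anomalous(history, 10)   # False: consistent with prior observations
```

A check of this kind catches only crude manipulation; adversarial perturbations are typically crafted to remain within normal statistical bounds, which is precisely why the Challenge seeks more robust validation approaches.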

What is missing from the IC’s analytic toolset, then, is an objective means for ensuring the accuracy and veracity of input information as intelligence products develop—an IC capability that will be essential should approaches such as the Office of the Director of National Intelligence’s (ODNI’s) and the Office of the Under Secretary of Defense for Intelligence’s (OUSD[I]’s) previous Challenges[1] to craft machine-generated intelligence products bear fruit.

The ODNI and OUSD(I) are seeking ideas and descriptions of a viable technical approach for enabling the automated inspection and validation of uncertain information prior to the dissemination of machine-generated intelligence products.  For this Ideation Challenge, Solvers are asked to submit their ideas along with a well-supported, technology-based justification for how the proposed approach could rapidly and objectively determine the trustworthiness of input information.  An additional award pool of $50,000 is available for Solvers who are able to provide, upon request from the Seekers, more detailed information such as a pseudo-code implementation of their proposed solution.

This is an Ideation Challenge, which has the following unique features:

  • There is a guaranteed award.  The award(s) will be paid to the best submission(s) as solely determined by the Seekers.  The total payout will be $25,000, with at least one award being no smaller than $5,000 and no award being smaller than $1,000.
  • The Solvers are not required to transfer exclusive intellectual property rights to the Seekers.  Rather, by submitting a proposal, the Solver grants to the Seekers a royalty-free, perpetual, and non-exclusive license to use any information included in this proposal, including for promotional purposes.
  • After initial review of submissions, Solvers with highly rated submissions may be asked to provide additional detailed information including, but not limited to, a pseudo-code or better implementation of their proposed solution.  An additional award pool of $50,000 is available for submissions of this additional information that meet additional criteria specified in the Seekers’ request.  Such additional requested information is not subject to the standard Ideation licensing provision and Solvers will be asked to grant to the Seekers a non-exclusive license for US government use purposes only if chosen for an additional award.

Submissions to this Challenge must be received by 11:59 PM (US Eastern Time) on July 2, 2018. 

Late submissions will not be considered.

After the Challenge deadline, the Seekers will complete the review process and make a decision with regard to the Winning Solution(s).  All Solvers that submit a proposal will be notified of the status of their submissions; however, no detailed evaluation of individual submissions will be provided.

 

ELIGIBILITY

Federal entities or Federal employees acting within the scope of their employment are eligible to compete but are NOT eligible to receive a monetary award for this Challenge.

Please note that winners will have to submit an Academic Institution Acknowledgement Letter acknowledging the role of ODNI in this Challenge if they are:  (i) a U.S. Academic Institution at the college or university level, (ii) an employee of such an institution who is participating on behalf of that institution, or (iii) an employee of such an institution who is participating in their personal capacity but using the resources of that institution to respond to this Challenge.  A template for this letter is included as an attachment to this Challenge and will be available after accepting the Challenge-Specific Agreement (CSA).  Click the “View Challenge Details” button to access the CSA for details.

Entities or employees of entities from the following countries subject to U.S. economic sanctions are not eligible to participate in this Challenge: Iran, Syria, Sudan, Cuba, and North Korea.  In addition, individuals and entities listed on the U.S. Government’s Consolidated Screening List (available at http://export.gov/ecr/eg_main_023148.asp) are not eligible to participate in this Challenge.

This Challenge is open to all others (18 years of age and older) not excluded above.  Only one submission per team should be submitted.

    

ABOUT THE SEEKERS

This Challenge is sponsored by ODNI’s Office of the Director of Science and Technology (DS&T), in partnership with the Office of the Under Secretary of Defense for Intelligence (OUSD[I]) and in collaboration with the Air Force Research Laboratory (AFRL).  DS&T leads the Intelligence Community’s (IC’s) efforts to enhance the returns on investments in technology—its mission is to deliver innovative, technology-based capabilities which solve intelligence challenges today and in the future.  OUSD(I) serves as advisor to the Secretary and Deputy Secretary of Defense for intelligence, counterintelligence, security, sensitive activities and other intelligence-related matters.  AFRL is the Air Force's only organization wholly dedicated to leading the discovery, development, and integration of warfighting technologies for our air, space and cyberspace forces.

________________________________

[1] Earlier jointly-released ODNI and OUSD(I) challenges—the Xpress Challenge and the Xtend Challenge—explored the potential for machine generation and machine evaluation, respectively, of analytic products. These research thrust areas will be critical for a new model for intelligence production.  As outlined in this proposal, the Xamine Challenge will be an essential complement to the operationalization of Xpress- and Xtend-derived results.

What is InnoCentive?
InnoCentive is the global innovation marketplace where creative minds solve some of the world's most important problems for cash awards up to $1 million. Commercial, governmental and humanitarian organizations engage with InnoCentive to solve problems that can impact humankind in areas ranging from the environment to medical advancements.

What is an InnoCentive Ideation™ Challenge?

An InnoCentive Ideation™ Challenge is a broad question formulated to obtain access to new ideas, similar to a global brainstorm for producing a breakthrough idea or market survey which may include ideas for a new product line, a new commercial application for a current product, or even a viral marketing idea to recruit new customers. Ideation™ Challenge submissions are typically about two written pages, and Seekers receive a non-exclusive, perpetual license to use all submissions.

In an Ideation™ Challenge, Solvers may:

  • Submit ideas of their own
  • Submit third-party information that they have the right to use, along with the authority to convey to the Seekers that right of use and the right to develop derivative works
  • Submit information considered in the public domain without any limitations on use

Solvers should not reveal any confidential information in their submissions. Often the Ideation™ Challenge will be followed by one or more of the other three Challenge types to further develop the ideas and gain Intellectual Property protection when the concept has been well-defined.
