Judging Criteria

Learn How Submissions Will Be Evaluated for the mFIT Challenge  

Phase 1: Judging Criteria

Concept Paper

Concept Paper submissions will be evaluated based on the following criteria:​

Compliance Check (Pass/Fail)

Criterion 0, the Compliance Check, is scored pass/fail. Submissions that pass Criterion 0 will be evaluated on the following criteria:

Strategic Alignment & Technical Outcomes
  • Strategic Alignment – The extent to which the proposed approach meets the objectives listed in the mFIT Challenge goals; the likelihood the contestant’s solution, if successfully implemented, will have a significant real-world impact on mobile fingerprint capture capabilities.
    • Proposed Technology & Justification – The contestant’s strategy for solving the problem statement and making technological improvements, and the justification for those improvements, aligns with the mFIT Challenge goals.
    • Methodology & Data Acquisition – The contestant’s demonstrated understanding of law enforcement requirements, limitations, and use cases.
    • Proposed Use of Sensors, Add-on Components, and Software – The contestant’s hardware or software plan (types of hardware/software platform) and how that plan aligns with the mFIT Challenge goals.
 
  • Technical Outcome – The extent to which the proposed approach will result in a significant improvement in commercially available technology and will potentially result in a technical outcome that enables considerable progress toward the mFIT Challenge goals.
    • Accuracy of the Technological Solution – Demonstrates knowledge of the technical details of mobile fingerprint capture.
 
Feasibility & Team
  • Team – The extent to which the contestant’s capabilities can address all aspects of the proposed project with a high chance of success, including, but not limited to, the qualifications, relevant expertise, and time commitment of the contestant. Reviewers will assess (a) the relevance of the qualifications and experience of the key staff, leadership, and technical experts, and (b) the extent of the contestant’s prior experience and the quality of results in similar projects related to the purpose, scope, or tasks of this challenge.

 

  • Approach – The contestant’s plan to manage the limited schedule, resources, project risks, and other challenges, and to produce high-quality project outcomes that meet the mFIT Challenge goals.

Phase 2 Walk-On: Judging Criteria

All Phase 2 submissions will be evaluated based on the following criteria:

Basic Compliance (Pass/Fail)

  • Verification of completeness according to the “How to Enter” Phase 2 requirements found within the Official Rules.

 

In the event there are more than five compliant Walk-on submissions, NIST will further evaluate the submissions using the following criterion.

Demonstration Video Review
  • The SME panel will review the demonstration videos for a clear depiction of all components of the solution, including an explanation of the solution, the technical gaps addressed, and the technical improvements made. The SME panel will provide a summary analysis to the judges.

 

The five highest-ranked submissions, as determined by the judges, will advance to the All Phase 2 Submissions Evaluation.

Phase 2: Judging Criteria

Virtual Demonstration and Evaluation of Prototypes

Following the evaluation of Walk-on contestants’ submissions, the SMEs and judges will continue the review and evaluation process of all Phase 2 submissions using the following criteria:

Compliance Testing (Pass/Fail)

This compliance testing includes: 


  • Verification of completeness according to “How to Enter” Phase 2 requirements.
  • Verification that the devices and mobile applications are responsive to the contest.
  • Evaluation of devices and mobile applications for potential safety and security risks to NIST staff, SMEs, judges, and government facilities.
  • Verification that devices and mobile applications can be operated by SMEs and judges with minimal installation and assembly.
  • Verification that the solution is capable of capturing quality digital images of at least two fingers (including the index and middle fingers) on the left and right hands, and that the resulting digital images are in PNG file format and have a resolution of 500 pixels per inch (ppi) ± 5 ppi. The optical resolution of devices may be higher than 500 ppi, but output images must be downsampled to 500 ppi ± 5 ppi (see the sketch after this list).
  • Verification that the prototypes store multiple fingerprint files, which can be downloaded as a batch or compressed file package (e.g., a zip file) with file names that identify the finger(s), hand, and subject (codes may be used).
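
As an illustration of the resolution and packaging requirements above, the following is a minimal verification sketch, not part of the official rules. It assumes Pillow is available, that captures are saved locally as PNG files, and that the file-naming scheme (e.g., S001_left_index.png, encoding subject, hand, and finger) is purely illustrative rather than prescribed by the challenge.

```python
# Hypothetical helper: checks PNG format and 500 ppi ± 5 ppi, then packages
# compliant captures into a single zip for batch download.
# Assumptions: Pillow is installed; the PNG pHYs chunk carries the resolution;
# the "subject_hand_finger.png" naming convention is illustrative only.
import zipfile
from pathlib import Path

from PIL import Image

TARGET_PPI = 500
TOLERANCE = 5

def check_capture(path: Path) -> bool:
    """Return True if the file is a PNG whose embedded resolution is 500 ppi ± 5 ppi."""
    with Image.open(path) as img:
        if img.format != "PNG":
            return False
        x_dpi, y_dpi = img.info.get("dpi", (0, 0))  # read from the PNG pHYs chunk, if present
        return all(abs(d - TARGET_PPI) <= TOLERANCE for d in (x_dpi, y_dpi))

def package_captures(capture_dir: Path, archive: Path) -> None:
    """Bundle compliant captures into one zip; file names carry subject/hand/finger codes."""
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        for png in sorted(capture_dir.glob("*.png")):  # e.g., S001_left_index.png
            if check_capture(png):
                zf.write(png, arcname=png.name)

package_captures(Path("captures"), Path("submission.zip"))
```
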
Efficiency and Effectiveness

In scoring for Efficiency and Effectiveness, each solution will be evaluated on how well it captures a quality digital fingerprint image and how that image compares to an image captured by benchmark peripheral fingerprint devices. The benchmark devices may be commercial or non-commercial. Images captured by benchmark devices will be anonymized with respect to the device, and the names of benchmark device manufacturers will not be released.


The Efficiency and Effectiveness evaluation includes:

 

  • Comprehensiveness of the contestant’s solution in capturing a quality digital fingerprint image.
  • The contestant’s solution will be compared against the results from FBI EBTS Appendix F certified hardware peripheral devices. Digital fingerprint image files collected from contestants’ prototypes will be compared to corresponding digital fingerprint image files collected from Appendix F certified devices.
  • The images generated by a contestant’s solution will be scored by a semi-automated algorithm that evaluates the image quality (e.g., entropy and contrast) and suitability (e.g., successful automated database registration) of the image for use against an identity database (see the sketch after this list).
  • Solutions that are fully self-contained will score higher than solutions that require a remote server for image processing.
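
The scoring algorithm itself is not published here; as a rough, hedged sketch of the kind of quality proxies named above, entropy could be read as the Shannon entropy of the grayscale histogram and contrast as RMS contrast. The actual semi-automated algorithm used by the SMEs and judges may compute these quantities differently, and the capture path shown is hypothetical.

```python
# Illustrative quality proxies only; NOT the official scoring algorithm.
# Assumptions: NumPy and Pillow are installed; the capture path is hypothetical.
import numpy as np
from PIL import Image

def quality_metrics(path: str) -> dict:
    """Compute simple quality proxies for a grayscale fingerprint image."""
    gray = np.asarray(Image.open(path).convert("L"), dtype=np.float64)

    # Shannon entropy of the intensity histogram (bits per pixel).
    hist, _ = np.histogram(gray, bins=256, range=(0, 255))
    p = hist / hist.sum()
    p = p[p > 0]
    entropy = -np.sum(p * np.log2(p))

    # RMS contrast: standard deviation of intensities normalized to [0, 1].
    contrast = (gray / 255.0).std()

    return {"entropy": float(entropy), "contrast": float(contrast)}

print(quality_metrics("captures/S001_left_index.png"))
```
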
Feasibility

In scoring for the Feasibility criterion, the judges will evaluate each solution for the ease with which a public safety agency could implement it.

 

The Feasibility evaluation includes:

 

  • Cost per deployment
  • Ability to deploy the system within four years
User Satisfaction and Ease of Use

An evaluation of the solution’s responsiveness to public safety user needs, including:

 

  • Ease of use
  • Speed of collection and processing
  • Convenience of form factor
  • User satisfaction
Innovation & Creativity

An evaluation of the novelty of the contestant’s approach: whether it is an outstanding technological innovation or has the potential for exceptional impact on the public safety community. Examples include, but are not limited to, the unique use of existing device sensors, the development of new image-rendering algorithms, and the creation and justification of an add-on component.
