ITS Challenges Creators to Enhance Computer Vision for Public Safety

Author: Sheryl Genco, Director of the Institute for Telecommunication Sciences

For decades, NTIA’s research laboratory, the Institute for Telecommunication Sciences (ITS), has been working alongside the National Institute of Standards and Technology’s (NIST) Public Safety Communications Research program (PSCR) and the public safety community to enhance mission-critical communications. At first, that work centered on intelligible voice communications. But first responders now have access to equipment and networks that support video and image transmission as well.

Streaming video could help an incident commander coordinate firefighters, for example. However, the challenge is that fire response is fast—about 3 minutes for a single-family dwelling. Deploying and coordinating helmet-cam feeds from 25 firefighters will only be practical with the aid of computer vision and video analytics, which are currently just out of reach.

Today’s off-the-shelf cameras come with many image-quality problems. Think of a door’s peephole: you look through it and see a backlit person or just the arm of someone standing too far to the side. It would be great to have something between you and the peephole that fixes these problems.

The missing component is a metric that assesses the quality of an image or video. These metrics are called no-reference (NR), because they cannot refer to a pristine original to understand what the image or video should look like. The NR metric would detect image-quality problems such as excessive compression, jerky motion, blur, mosquito noise, slicing, freezing, flickering, and sensor artifacts. When the NR metric identifies quality problems, the computer vision system would employ diverse strategies to fix the issue, such as panning the camera, increasing the bit-rate, or switching among computer vision algorithms.
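
To make this concrete, here is a minimal, illustrative sketch (in Python with OpenCV) of one NR-style check: estimating blur from the variance of the Laplacian and gating frames before they reach a computer vision algorithm. The threshold, the file name, and the simple pass/fail gate are assumptions for illustration only; they are not the metrics or methods sought by the challenge.

```python
# Illustrative no-reference (NR) check: estimate blur via the variance of the
# Laplacian. A real NR metric would combine many indicators (compression
# artifacts, noise, flicker, etc.); this shows only the basic pattern.
import cv2  # OpenCV: pip install opencv-python

BLUR_THRESHOLD = 100.0  # hypothetical cutoff; would need tuning per camera and task


def blur_score(frame_bgr) -> float:
    """Variance of the Laplacian; lower values suggest a blurrier frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())


def frame_usable_for_cv(frame_bgr) -> bool:
    """Crude NR gate: flag frames likely too blurry for downstream analysis."""
    return blur_score(frame_bgr) >= BLUR_THRESHOLD


if __name__ == "__main__":
    frame = cv2.imread("helmet_cam_frame.jpg")  # hypothetical example frame
    if frame is not None:
        print(f"blur score: {blur_score(frame):.1f}  usable: {frame_usable_for_cv(frame)}")
```

In a real system, a low score might trigger one of the corrective strategies described above, such as requesting a higher bit-rate or switching to a different computer vision algorithm.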

Today’s NR metrics emulate human perception, to help industry optimize video transmission services. To help first responders, we need NR metrics that understand what “good quality” means to a computer vision algorithm. To establish this new line of research, ITS helped NIST create the Enhancing Computer Vision for Public Safety Challenge, with FirstNet providing technical support.

This prize challenge has three goals:

  • Create training data—images and videos depicting camera impairments that hinder computer vision algorithms
  • Measure failure rate—brainstorm the best methods to assess the likelihood that the computer vision algorithms can make reliable decisions (a minimal sketch of this idea follows the list)
  • Open data—inspire new research on NR metrics for computer vision applications 
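
As a minimal sketch of the failure-rate idea, and assuming a hypothetical detect_people function and a simple labeling of frames as clean or impaired (neither is specified by the challenge), one might compare how often a detector makes the wrong call on each group:

```python
# Hypothetical failure-rate measurement: how often does a detector make the
# wrong decision on impaired frames versus clean ones? The (image, label,
# impaired) format and the detector itself are assumptions for illustration.
from typing import Callable, Dict, Iterable, Tuple


def failure_rates(frames: Iterable[Tuple[object, bool, bool]],
                  detect_people: Callable[[object], bool]) -> Dict[str, float]:
    """frames yields (image, contains_person, is_impaired) triples."""
    counts = {"clean": [0, 0], "impaired": [0, 0]}  # [failures, total]
    for image, contains_person, is_impaired in frames:
        bucket = "impaired" if is_impaired else "clean"
        counts[bucket][1] += 1
        if detect_people(image) != contains_person:  # unreliable decision
            counts[bucket][0] += 1
    return {k: (fails / total if total else 0.0)
            for k, (fails, total) in counts.items()}
```

Comparing the two rates indicates how much a given impairment degrades the algorithm’s reliability, which is the quantity an NR metric for computer vision would need to predict.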

NIST funded this challenge with a total prize purse of up to $240,000. Each contestant team will compete for up to $28,000 in cash prizes, with the opportunity for its dataset to be distributed on the ITS-hosted Consumer Digital Video Library for future research and development. The challenge is open for submissions beginning September 8, with concept papers due October 20.

Funds for this and other Open Innovation Challenges come from spectrum auction proceeds deposited in the Public Safety Trust Fund established by the Middle Class Tax Relief and Job Creation Act of 2012. The Act sought to address communications problems that hindered public safety response during the 9/11 tragedy and ensure that our first responders have the equipment and operational capabilities to do their jobs.

Visit the challenge website for more information. This prize challenge was designed and funded by NIST PSCR. The technical work described in this blog was conducted by the ITS video quality research program and funded by NIST PSCR.