Bad pixel correction on GPU

Modern image sensors usually have resolutions well beyond one megapixel, and we could expect all of these pixels to be fully functional. Unfortunately, this is not the case: among millions of pixels there are always some defective ones. Manufacturing an image sensor without such pixels is an extremely complicated task. These pixels are usually called "bad pixels", and they can be classified as follows:

  • Dead pixels don't give any feedback to incoming light (usually output a constant value)
  • Hot pixels overreact to incoming light (output is significantly higher than it should be)
  • Bright pixels overreact to incoming light (output is slightly higher than it should be)
  • Black pixels have a weak response (output a lower value than it should be)

Here we are discussing pixels with predictable behaviour. There can also be some "crazy pixels" with unpredictable output. We don't discuss them here, though it would be a good idea to correct them as well; we just need to classify them as bad pixels during calibration or test procedures.

If we have a look at a detailed image sensor specification from any manufacturer, we will find a confidential document describing bad pixels and their positions. Yes, every image sensor manufacturer knows that there are some bad pixels, and that their number will grow over the sensor's lifetime. To consider a particular image sensor good, we can compare the actual number and positions of bad pixels with the specification. Usually bad pixels are allowed closer to the borders of the sensor, but not in the vicinity of its center.

Camera manufacturers usually sort image sensors before building cameras, because different camera applications have different requirements for image quality. If we compare a surveillance camera with a camera for a microscope, they simply can't tolerate the same number of bad pixels.

Image sensor manufacturers grade their devices according to the number and type of defects present.

Bad pixels are really harmful if we don't remove them before image segmentation, image recognition, stitching, 3D modelling, etc. They don't do anything good for image viewing either. So there are more than enough reasons to suppress them.

Since bad pixels are quite common, there are several ways to remove them:

  • we can fix them inside the camera, on the FPGA, according to a known list of coordinates
  • they can be fixed in RAW data in software
  • bad pixels can be fixed in RGB data in software

Bad pixel suppression on the FPGA is a very convenient and very fast approach, but we need to keep the list of bad pixels up to date. To catch newly appeared bad pixels, we have to run a calibration test and update that list. Unfortunately, that is not always convenient, so this approach is not 100% applicable.

The most flexible approach is to remove bad pixels from RAW data in software. Here we can either interpolate bad pixels according to a known list of coordinates, or implement a special filter which recognizes bad pixels on its own and interpolates them in the RAW data.
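To illustrate the list-based variant, here is a minimal CUDA sketch; the kernel name and the defect-list layout are our own assumptions, not Fastvideo SDK code. It replaces each listed pixel with the average of its four nearest neighbours of the same Bayer color, which sit two pixels away in each direction:

```
// Minimal sketch of list-based bad pixel interpolation in Bayer RAW data.
// Kernel name and defect-list layout are assumptions, not SDK code.
#include <cuda_runtime.h>
#include <stdint.h>

struct BadPixel { int x, y; };  // coordinates from the calibration list

__global__ void fixListedPixels(uint16_t* raw, int width, int height,
                                int pitch, const BadPixel* list, int count)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= count) return;  // one thread per defective pixel

    int x = list[i].x, y = list[i].y;

    // Same-color Bayer neighbours are 2 pixels away; clamp at the borders.
    int xl = max(x - 2, 0), xr = min(x + 2, width  - 1);
    int yt = max(y - 2, 0), yb = min(y + 2, height - 1);

    uint32_t sum = raw[y * pitch + xl] + raw[y * pitch + xr] +
                   raw[yt * pitch + x] + raw[yb * pitch + x];
    raw[y * pitch + x] = (uint16_t)(sum / 4);
}
```

Since the defect list is tiny compared to the frame, one thread per listed pixel is enough and the kernel cost is negligible.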

If we haven't removed bad pixels from RAW, we can still remove them from RGB. In that case the removal happens after demosaicing, so some "traces" of each bad pixel can be found in the neighbouring pixels. The method is viable, but it's not the best one.

Bad pixel correction filter

We've implemented a high performance filter for bad pixel correction (BPC). Essentially, we need to remove impulse noise, and such removal is almost always good for the picture. The main idea is to apply the filter not to all pixels, but only to those which look like bad/hot/dead pixels. We analyze the vicinity of each RAW pixel and apply a set of criteria to decide whether the pixel is bad. This is done in the RAW domain, so we use different criteria for Green and for Red/Blue pixels. The algorithm should work over a wide range of illumination intensities, and we've run tests to investigate such situations.
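The actual criteria in our filter are more elaborate, but the general idea can be illustrated with a simplified CUDA sketch (our own illustration, not the production algorithm): compare each pixel with the median of its same-color Bayer neighbourhood and replace it only when the deviation exceeds a threshold:

```
// Simplified detector sketch: flag a RAW pixel as bad when it deviates
// too much from the median of its 3x3 same-color Bayer neighbourhood
// (center included). This is an illustration, not the production criteria.
#include <cuda_runtime.h>
#include <stdint.h>

__device__ uint16_t median9(uint16_t v[9])
{
    // Tiny insertion sort; fine for 9 values per thread.
    for (int i = 1; i < 9; i++) {
        uint16_t key = v[i];
        int j = i - 1;
        while (j >= 0 && v[j] > key) { v[j + 1] = v[j]; j--; }
        v[j + 1] = key;
    }
    return v[4];
}

__global__ void detectAndFix(const uint16_t* in, uint16_t* out,
                             int width, int height, int pitch,
                             uint16_t threshold)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    // Skip a 2-pixel border for simplicity.
    if (x < 2 || y < 2 || x >= width - 2 || y >= height - 2) return;

    // Offsets of +-2 keep the Bayer color, so the same loop works for
    // R, G and B pixels; a production filter would use the closer
    // diagonal neighbours for Green and separate per-channel thresholds.
    uint16_t v[9];
    int k = 0;
    for (int dy = -2; dy <= 2; dy += 2)
        for (int dx = -2; dx <= 2; dx += 2)
            v[k++] = in[(y + dy) * pitch + (x + dx)];

    uint16_t center = in[y * pitch + x];
    uint16_t med = median9(v);

    // Replace only the outliers; ordinary pixels pass through untouched.
    out[y * pitch + x] = (abs((int)center - (int)med) > threshold) ? med : center;
}
```

The threshold has to be tuned per sensor: too tight a value starts to smooth real fine detail, while too loose a value lets bright pixels through.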

We face almost the same situation when we calibrate a camera to get a Flat-Field Correction (FFC) frame. Hot pixels can influence the FFC results, so we need to interpolate them before doing any computations.
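As an illustration of such a calibration pass, a sketch could scan a flat (uniformly lit) RAW frame and record the coordinates of pixels that stand out against their same-color neighbours; the factor parameter and the list layout here are our assumptions:

```
// Calibration-time sketch: collect hot pixel coordinates from a flat
// (uniformly lit) RAW frame. The factor and list layout are assumptions.
#include <cuda_runtime.h>
#include <stdint.h>

struct BadPixel { int x, y; };

__global__ void collectHotPixels(const uint16_t* flat, int width, int height,
                                 int pitch, float factor,
                                 BadPixel* list, int* count, int capacity)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x < 2 || y < 2 || x >= width - 2 || y >= height - 2) return;

    // Average of the four same-color Bayer neighbours (2 pixels away).
    float avg = 0.25f * (flat[y * pitch + x - 2] + flat[y * pitch + x + 2] +
                         flat[(y - 2) * pitch + x] + flat[(y + 2) * pitch + x]);

    // On a flat field a healthy pixel sits close to its neighbours;
    // a hot pixel exceeds them by a large margin.
    if (flat[y * pitch + x] > avg * factor) {
        int i = atomicAdd(count, 1);
        if (i < capacity) { list[i].x = x; list[i].y = y; }
    }
}
```

The resulting list can then be fed to a list-based interpolation kernel like the one sketched above, so the FFC frame is cleaned before any further computations.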

 

[Figure: bad pixel correction]

 

Apart from bad pixel removal, we always keep an eye on the total number of pixels processed by the filter. This is an indirect means of checking image sensor degradation. The solution is handy and can be applied to RAW data from any camera. Soon we will extend the filter to work with RGB data as well, because such a demand does exist.
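One simple way to obtain that number is to count corrections right inside the kernel. The sketch below (again our own illustration with an assumed interface) uses a global atomic counter which the host reads back after each frame:

```
// Sketch: count how many pixels the filter corrects per frame, as an
// indirect sensor degradation metric. The interface is an assumption.
#include <cuda_runtime.h>
#include <stdint.h>
#include <stdio.h>

__global__ void fixAndCount(const uint16_t* in, uint16_t* out,
                            int width, int height, int pitch,
                            uint16_t threshold, unsigned int* corrected)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x < 2 || y < 2 || x >= width - 2 || y >= height - 2) return;

    // Average of four same-color Bayer neighbours as a cheap reference.
    int ref = (in[y * pitch + x - 2] + in[y * pitch + x + 2] +
               in[(y - 2) * pitch + x] + in[(y + 2) * pitch + x]) / 4;
    uint16_t center = in[y * pitch + x];

    if (abs((int)center - ref) > (int)threshold) {
        out[y * pitch + x] = (uint16_t)ref;
        atomicAdd(corrected, 1u);  // one more pixel looked bad this frame
    } else {
        out[y * pitch + x] = center;
    }
}

// Host side: read the counter back after each frame. A per-frame count
// that steadily grows over months of operation hints at degradation.
void reportCorrectedCount(const unsigned int* dCorrected)
{
    unsigned int n = 0;
    cudaMemcpy(&n, dCorrected, sizeof(n), cudaMemcpyDeviceToHost);
    printf("pixels corrected this frame: %u\n", n);
}
```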

Bad pixel correction on NVIDIA GPU

We've implemented the BPC filter on GPU to offer good performance and quality even for cameras with high resolution image sensors in realtime applications. We process bad pixels in the RAW domain, and we do that really fast: one frame from a 12 MPix 12-bit image sensor can be fixed in ~1 ms on an NVIDIA GeForce GTX 1080. The feature is now a constituent part of the Fastvideo SDK and is available for mobile, laptop, desktop and server GPUs from NVIDIA.

  • Input data: 8/12/16-bit RAW from HDD/SSD/RAID in PGM/DNG/CinemaDNG format
  • Output format: 8/12/16-bit RAW with corrected bad pixels
  • Maximum image resolution: 16K×16K pixels and higher
  • OS support: Windows 7/8/10, Linux Ubuntu/CentOS, L4T for Jetson

To check the performance and quality of the BPC algorithm, you can download the Fast CinemaDNG Processor software for evaluation.
