Image Processing Framework on Jetson
Many image sensor and camera applications today rely on a centralized computing architecture for image processing. Only a minor part of the processing is implemented on the image sensor itself; the rest is done on a CPU, GPU, DSP, or FPGA that can reside very close to the sensor. The latest hardware and software advances deliver higher computational performance and enlarge the scope of tasks that can be solved.
From that point of view, the NVIDIA Jetson series is well suited for high performance image processing from RAW to YUV. An image sensor or camera module can be connected directly to any Jetson via MIPI CSI-2 (2-lane/4-lane), USB3, or PCIe. Jetson offers high performance computation either on the ISP or on the GPU. Below we show what can be done on the GPU. We believe that raw image processing on the GPU offers more flexibility, better performance and quality, and easier management than a hardware-based ISP for many applications.
What is Image Processing Framework for Jetson?
To get high quality and maximum performance in image processing tasks on Jetson, we've implemented a GPU-based SDK for raw processing. Now we are expanding that approach by creating an effective framework to control all system components, both hardware and software. This means, for example, that image sensor control is included in the workflow in real time and becomes part of the overall control algorithm.
Image processing framework components
Additional features for the framework
Implementing image sensor control in the workflow brings essential additional features. For example, integrated exposure and gain control allows us to get better quality under varying illumination. Apart from that, calibration data usually depend on exposure/gain, so we can apply the correct processing parameters at any moment, for any viewing conditions.
In general, the standard RAW concept lacks internal camera parameters and full calibration data. We solve that problem by including image sensor control in both calibration and image processing, and we use an image sensor abstraction layer to carry full metadata for each frame.
Such a solution depends on the image sensor used and the task to be solved, so we can configure and optimize the Image Processing Framework for a particular image sensor from SONY, Gpixel, or CMOSIS. These solutions on Jetson have already been implemented by teams at Fastvideo and MRTech.
Integrated Image Sensor Control
Full image sensor control also includes bit depth, FPS (frames per second), raw image format, bit packing, mode of operation, etc.
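As a rough sketch, the control state listed above can be grouped into a single configuration object. The parameter names and defaults here are generic illustrations, not any specific sensor's register map; the bit-packing helper shows why the framework has to know the packing mode to compute raw line sizes:

```python
from dataclasses import dataclass

@dataclass
class SensorConfig:
    bit_depth: int = 12           # bits per pixel: 8/10/12/14/16
    fps: float = 60.0             # frame rate, frames per second
    raw_format: str = "RGGB"      # Bayer pattern of the raw output
    packed: bool = True           # e.g. two 12-bit pixels packed into 3 bytes
    mode: str = "global_shutter"  # mode of operation

    def bytes_per_line(self, width):
        """Raw line size in bytes, accounting for bit packing."""
        if self.packed:
            return (width * self.bit_depth + 7) // 8
        # Unpacked: each pixel padded to 1 or 2 bytes.
        return width * 2 if self.bit_depth > 8 else width
```

For instance, a packed 12-bit line of 4096 pixels occupies 6144 bytes, while the same line unpacked to 16-bit containers takes 8192 bytes, a difference that matters for MIPI bandwidth and memory throughput.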
GPU image processing modules on Jetson for 16/32-bit pipeline
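A high-bit-depth pipeline is naturally expressed as a chain of composable stages. The toy stages below are typical raw processing steps (black level subtraction, white balance, gamma output), used purely to illustrate the structure; they are not the SDK's actual module list, and real modules run on the GPU rather than on Python lists:

```python
def black_level_subtract(pixels, black=256):
    """Remove the sensor's black level offset, clamping at zero."""
    return [max(p - black, 0) for p in pixels]

def white_balance(pixels, gain=1.25, max_val=65535):
    """Apply a per-channel gain (single gain here for simplicity)."""
    return [min(int(p * gain), max_val) for p in pixels]

def gamma_8bit(pixels, max_val=65535):
    """Convert 16-bit linear data to 8-bit gamma-encoded output."""
    return [round(255 * (p / max_val) ** (1 / 2.2)) for p in pixels]

def run_pipeline(pixels, stages):
    """Run the frame through each stage in order."""
    for stage in stages:
        pixels = stage(pixels)
    return pixels
```

Keeping intermediate data at 16/32-bit precision, as the heading above indicates, means rounding to 8 bits happens only once, at the very end of the chain.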
Image/Video Encoding modules on GPU
Is it better or faster than NVIDIA ISP for Jetson?
There are many situations where the answer is YES. NVIDIA ISP for Jetson is a great product: it's free, versatile, reliable, and it places less load on the Jetson, but we have our own advantages which are also of great importance for our customers:
We've built that software from scratch and have been working in this field for more than 10 years, so we have the experience to offer reliable solutions and support. Apart from that, we offer custom software design to solve almost any problem in a timely manner.
What are the benefits of that approach?
That approach allows us to create embedded image processing solutions on Jetson with high quality, exceptional performance, low latency, and full image sensor control. A software-based solution combined with GPU image processing on NVIDIA Jetson helps our customers create their imaging products with minimum effort and maximum quality and performance.