
Abstract
Trademarks
1 Introduction
2 Tuning Overview
3 Hardware Requirement
4 Software Requirement
    4.1 Processor SDK Linux
    4.2 TI's Reference Imaging Software
    4.3 ISP Tuning Tool
5 Sensor Software Integration
    5.1 Overview of Image Pipeline Software Architecture
    5.2 Adding Sensor Driver to SDK
    5.3 Updating TIOVX Modules
        5.3.1 Source Code Change
        5.3.2 Rebuild Modules
    5.4 Update GStreamer Plug-in for VISS
        5.4.1 Update VISS Plug-in Property
        5.4.2 Add Exposure Setting for 2A Algorithm
            5.4.2.1 Gain
            5.4.2.2 Exposure Time
            5.4.2.3 Other Parameters
        5.4.3 Rebuild Plug-ins
        5.4.4 Verify New Sensor in GStreamer Plug-in
6 Tuning Procedure
    6.1 Verify Functional Operation of Camera Capturing
    6.2 Enable Camera Streaming With Initial VPAC Configuration
        6.2.1 Generate Configuration Files
        6.2.2 Generate DCC Binary Files
        6.2.3 Stream Video With the Initial Configuration
    6.3 Adjust Camera Mounting
7 Perform Basic Tuning
    7.1 Launch the Tuning Tool and Create a Project
    7.2 Tuning Order
    7.3 Black Level Subtraction
    7.4 Hardware 3A (H3A)
    7.5 PCID
    7.6 Auto White Balance (AWB)
        7.6.1 Capture Raw Images for Different Lighting Conditions
        7.6.2 Tuning AWB
    7.7 Color Correction
8 Perform Fine Tuning
    8.1 Edge Enhancement (EE)
    8.2 Noise Filter 4 (NSF4)
9 Live Tuning
    9.1 Requirements
    9.2 Supported Features
        9.2.1 RAW Capture
        9.2.2 YUV Capture
        9.2.3 Live DCC Update
        9.2.4 Exposure Control
        9.2.5 White Balance Control
        9.2.6 Sensor Register Read/Write
10 Summary
11 Revision History

6.1 Verify Functional Operation of Camera Capturing

This procedure assumes that the camera driver has been integrated into the SDK and that the AM62A SK EVM boots to Linux and can probe the camera. Verify that both the v4l2-ctl and media-ctl commands show the expected output, as shown below (using IMX219 as an example):

root@am62axx-evm:~# v4l2-ctl --list-devices
j721e-csi2rx (platform:30102000.ticsi2rx):
        /dev/video3
        /dev/video4
...
        /dev/media0                                                 
 
root@am62axx-evm:~# media-ctl -d /dev/media0 -p | grep imx219
                <- "imx219 4-0010":0 [ENABLED,IMMUTABLE]             
- entity 13: imx219 4-0010 (1 pad, 1 link, 0 route)

Note: 4-0010 in the media-ctl output is the I2C bus number and device address of the sensor; this value can differ between SDK releases.
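
Because the bus-address pair changes from system to system, a small shell sketch such as the one below can look up the sensor entity name before configuring the format. This sketch is not part of the SDK; the grep pattern and the SENSOR variable are illustrative assumptions.

# Look up the full imx219 entity name (for example, "imx219 4-0010") from the
# media graph so that later media-ctl commands do not hard-code the bus-address pair.
SENSOR=$(media-ctl -d /dev/media0 -p | grep -o 'imx219 [0-9]*-[0-9a-f]*' | head -n 1)
echo "Sensor entity: ${SENSOR}"
media-ctl -d /dev/media0 -V "\"${SENSOR}\":0 [fmt:SRGGB10_1X10/1920x1080 field:none]"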

Next, verify that the camera can be configured to a specific format and that raw images can be captured using a GStreamer pipeline. Below is an example, assuming 4-0010 is the address reported by the media-ctl command above:

root@am62axx-evm:~# media-ctl -V '"imx219 4-0010":0 [fmt:SRGGB10_1X10/1920x1080 field:none]'
root@am62axx-evm:~# gst-launch-1.0 -v v4l2src num-buffers=5 device=/dev/video3 io-mode=dmabuf ! \
video/x-bayer, width=1920, height=1080, framerate=30/1, format=rggb10 ! \
multifilesink location="imx219-image-%d.raw"
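
If these first captures come out strongly over- or under-exposed, the sensor's exposure and gain can optionally be adjusted through its V4L2 sub-device before capturing again. The sketch below is illustrative rather than part of this procedure: the sub-device node (/dev/v4l-subdev2 here) is system-specific, and the control names shown (exposure, analogue_gain) are the ones exposed by the mainline imx219 driver.

# List the controls exposed by the sensor sub-device; the node number is
# system-specific and can be found from the media graph (media-ctl -p).
v4l2-ctl -d /dev/v4l-subdev2 --list-ctrls
# Example values only; valid ranges are reported by --list-ctrls.
v4l2-ctl -d /dev/v4l-subdev2 --set-ctrl exposure=800,analogue_gain=150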

The captured raw images are in pure Bayer pattern format (RGGB for the IMX219 sensor) without any header or compression. These raw images can be displayed with a raw image viewer or other tools such as ffmpeg. At this stage, the raw images can be either overexposed or underexposed, since the default exposure time and gain in the sensor do not necessarily match the lighting environment where the images are captured.
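
As a concrete example of viewing a capture with ffmpeg, the command below is a sketch that assumes each pixel is stored as a 16-bit little-endian word with 10 valid bits (the usual in-memory layout for V4L2 SRGGB10). Because only 10 of the 16 bits carry data, the demosaiced preview looks dark by roughly a factor of 64 unless the data is scaled or the preview is brightened.

# Demosaic the first captured frame into a viewable PNG; the pixel format and
# frame size are assumptions and must match the actual capture settings.
ffmpeg -f rawvideo -pix_fmt bayer_rggb16le -video_size 1920x1080 \
       -i imx219-image-0.raw imx219-image-0.png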