
SK-TDA4VM: Problem with Perspective transform integration in LDC mesh

Part Number: SK-TDA4VM


Hello, 

I have followed the MATLAB code given in [FAQ] TDA4VM: How to create a LDC mesh LUT for fisheye distortion correction on TDA4? - Processors forum - Processors - TI E2E support forums (ported to Python) to generate the bin file, using my own camera parameters and transformation matrix:

pitch_in_mm= 0.003 
f_in_mm= 3.0 
w= 1920  
h= 1080 
s=2
m=4
tr_front = np.array([[1.29623404e+00, -3.33760860e-01, 0],
                 [3.44805409e-02, 2.91188168e+00, 0],
                 [1.24291410e-04, -1.15834076e-04, 1.00000000e+00]])
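As a sanity check on these numbers (derived only from the parameters above; the mesh node counts follow the down-sampling convention of the linked FAQ code):

```python
# Focal length in pixels and the effective focal length after the
# zoom factor s; values taken from the parameters in this post.
pitch_in_mm = 0.003   # sensor pixel pitch in mm
f_in_mm = 3.0         # lens focal length in mm
s = 2                 # zoom/scale factor
m = 4                 # mesh down-sampling exponent (stride = 2**m)
W, H = 1920, 1080

f_pixels = f_in_mm / pitch_in_mm   # 1000.0 px
f_effective = f_pixels / s         # 500.0 px -> wider (zoomed-out) view

# Mesh LUT node counts after down-sampling the (W+1) x (H+1) grid by 2**m
mesh_w = W // 2**m + 1             # 121 nodes horizontally
mesh_h = H // 2**m + 1             # 68 nodes vertically
print(f_pixels, f_effective, mesh_w, mesh_h)
```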

With these, I get a correctly undistorted image.

But when I multiply the transformation matrix (tr_front) with xt, yt, zt, I get the following image:

 

My settings in the DCC tool look like this:

I have tried changing the value of 'm', but it does not seem to make any difference. However, when I reduce the value of 's' to 0.39, I get a zoomed-in image.

I have also tried changing the SL2 Size (bytes) value to 30,000, but that does not make any difference either.

Can you please tell me what I am doing wrong here?

The MATLAB code is already in the link, so I am not pasting it again. But if you want to see my Python code, let me know.

Thank you very much in advance.

  • Hi Vivek,

    I get correct undistorted image. It is looking correct. 

    Do you mean you get the correct output image in MATLAB, or somewhere in your Python code?

    Could you please show how you applied the matrix in the MATLAB code?

    But when I multiply transformation matrix (tr_front) with xt, yt, zt. I get following image

    It looks like this is what you get in the tuning tool with your mesh LUT from MATLAB.
    It means that your perspective transform is not done properly in the MATLAB code.

    I have tried changing the values of 'm', it does not seem to make any difference.

    Yes, m is for down-sampling the mesh LUT.
    It does not change the mapping.
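    As a quick illustration with dummy data (not the actual FAQ code): down-sampling keeps every 2**m-th mesh node unchanged, which is why changing m does not alter the mapping, only the LUT density:

```python
import numpy as np

m = 4
# Dummy per-pixel offset table standing in for h_delta
# (a (H+1) x (W+1) grid for a 1920x1080 frame)
h_delta = np.arange(1081 * 1921).reshape(1081, 1921)

# Keep every 2**m-th row and column; the kept nodes are unchanged
mh = h_delta[::2**m, ::2**m]
print(mh.shape)   # (68, 121)
```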

    But when I reduce the value of 's' to 0.39 I get a zoomed in image.

    Yes, "s" is part of the perspective change (zooming factor) and it will change your output image.

    I have also tried changing SL2 Size (bytes) value to 30,000

    This is the H/W cache size for LDC output buffering, and it does not change your output image.

  • Hello Gang Hua,

    Thank you very much for your quick response. 

    All the pictures I have sent you are the output of DCC tool.

    Here is my python code, where I applied matrix transformation.

     

        # Centered coordinates
        xt = x - hc
        yt = y - vc
        zt = z * np.ones_like(xt)  # equivalent to z * ones(size(xt)) in Octave
        tr_front = np.array([[1.29623404e+00, -3.33760860e-01, 0],
                             [3.44805409e-02, 2.91188168e+00, 0],
                             [1.24291410e-04, -1.15834076e-04, 1.00000000e+00]])

        # Apply the perspective transform
        rotation_points = np.column_stack((xt.ravel(), yt.ravel(), zt.ravel()))
        xt, yt, zt = np.dot(tr_front, rotation_points.T)

        # Convert to polar coordinates
        phi, r = np.arctan2(yt, xt), np.hypot(xt, yt)  # np.hypot for a consistent norm

        # Calculate the undistorted angle
        theta = np.arctan2(r, zt)

        # Read distortion specification
        lut = read_spec(spec_file, pitch_in_mm)

        # Interpolate distorted radius using the LUT
        lut_func = interp1d(lut[:, 0], lut[:, 1], kind='linear', fill_value='extrapolate')
        r_distorted = lut_func(theta)

        # Convert back to distorted Cartesian coordinates
        h_d, v_d = r_distorted * np.cos(phi), r_distorted * np.sin(phi)

        # Shift back to the original center
        h_d += hc
        v_d += vc
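    One thing that may be worth checking in the snippet above (a guess, not a confirmed diagnosis): tr_front has a non-trivial third row, so it is a projective homography. If it was estimated on pixel coordinates, it expects homogeneous points (x, y, 1) and a division by the output w, whereas the mesh code feeds it ray coordinates with zt = z (the focal length). A sketch of the homogeneous-normalization variant, under that assumption, could look like:

```python
import numpy as np

def apply_homography(xt, yt, tr):
    """Apply a 3x3 homography to centered pixel coordinates.

    Assumes tr was estimated on pixel coordinates, so the third
    (homogeneous) coordinate is 1, and the result is divided by the
    output w. The division matters whenever the bottom row of tr is
    not [0, 0, 1].
    """
    pts = np.stack([xt.ravel(), yt.ravel(), np.ones(xt.size)])
    xw, yw, w = tr @ pts
    return (xw / w).reshape(xt.shape), (yw / w).reshape(xt.shape)

# With the identity homography the coordinates are unchanged.
xt, yt = np.meshgrid(np.arange(-2.0, 3.0), np.arange(-2.0, 3.0))
xh, yh = apply_homography(xt, yt, np.eye(3))
```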

    Thank you

  • Hi Vivek,

    Here is my python code, where I applied matrix transformation.

    I cannot tell for sure what is really wrong.

    Most likely, it is a difference between the (x,y) coordinate range in your Python code and the MATLAB code.

    If you have everything working in Python, we can get the input image coordinates in Python.
    Then, you may simply create the mesh LUT in Python following the MATLAB code for mesh LUT encoding and down-sampling.
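    For instance, the encoding and down-sampling step from the MATLAB code could be sketched in Python like this (encode_mesh is a hypothetical helper name; the * 8 is the S16Q3 offset format used in the FAQ code):

```python
import numpy as np

def encode_mesh(h_d, v_d, h_p, v_p, m, path):
    """Encode LDC mesh offsets and down-sample by 2**m.

    Mirrors the MATLAB code in this thread: offsets are stored in
    S16Q3 fixed point (hence * 8). NumPy's row-major ravel() on the
    down-sampled array reproduces MATLAB's transpose followed by
    column-major flattening.
    """
    h_delta = np.round((h_d - h_p) * 8).astype(int)
    v_delta = np.round((v_d - v_p) * 8).astype(int)
    mh = h_delta[::2**m, ::2**m]
    mv = v_delta[::2**m, ::2**m]
    np.savetxt(path, np.column_stack([mh.ravel(), mv.ravel()]),
               fmt='%d', delimiter=' ')
```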

  • Yes, I am able to create a mesh Look-Up Table (LUT) and it is working fine. Initially, I used the input data from [FAQ] TDA4VM: How to create a LDC mesh LUT for fisheye distortion correction on TDA4? - Processors forum - Processors - TI E2E support forums and obtained the correct output mesh LUT file. However, when I used the file in the DCC tool and added a transformation matrix, it stopped working.

    Is there an example with an affine/warp transformation in MATLAB that you could share? This would allow me to compare it with my full code and potentially resolve the issue.

    If you want to recreate the same issue on your side, then here is all the required info:

    * spec_file

    0.0 0.0
    1.6 0.086054
    3.2 0.17209275
    4.8 0.25810099
    6.4 0.34406343
    8.0 0.42996477
    9.6 0.51578965
    11.2 0.60152269
    12.8 0.68714843
    14.4 0.77265134
    16.0 0.85801581
    17.6 0.94322615
    19.2 1.02826654
    20.8 1.11312104
    22.4 1.19777357
    24.0 1.2822079
    25.6 1.36640761
    27.2 1.45035607
    28.8 1.53403646
    30.4 1.61743167
    32.0 1.70052432
    33.6 1.78329673
    35.2 1.86573087
    36.8 1.94780829
    38.4 2.02951013
    40.0 2.11081705
    41.6 2.19170915
    43.2 2.27216594
    44.8 2.35216624
    46.4 2.43168814
    48.0 2.51070888
    49.6 2.58920475
    51.2 2.66715102
    52.8 2.74452178
    54.4 2.82128981
    56.0 2.89742643
    57.6 2.97290135
    59.2 3.04768244
    60.8 3.12173555
    62.4 3.19502425
    64.0 3.26750955
    65.6 3.33914964
    67.2 3.40989952
    68.8 3.47971064
    70.4 3.54853049
    72.0 3.61630213
    73.6 3.68296368
    75.2 3.74844774
    76.8 3.81268081
    78.4 3.87558257
    80.0 3.93706514
    1.6 0.08606435
    3.2 0.17211312
    4.8 0.25813074
    6.4 0.34410161
    8.0 0.43001012
    9.6 0.51584059
    11.2 0.60157735
    12.8 0.68720464
    14.4 0.77270666
    16.0 0.85806753
    17.6 0.94327128
    19.2 1.02830187
    20.8 1.11314313
    22.4 1.19777876
    24.0 1.28219235
    25.6 1.3663673
    27.2 1.45028686
    28.8 1.53393406
    30.4 1.61729173
    32.0 1.70034244
    33.6 1.78306846
    35.2 1.86545179
    36.8 1.94747404
    38.4 2.02911647
    40.0 2.11035986
    41.6 2.19118453
    43.2 2.27157025
    44.8 2.35149618
    46.4 2.43094078
    48.0 2.50988177
    49.6 2.588296
    51.2 2.66615936
    52.8 2.74344668
    54.4 2.82013159
    56.0 2.89618636
    57.6 2.97158178
    59.2 3.04628693
    60.8 3.12026901
    62.4 3.19349311
    64.0 3.26592194
    65.6 3.33751555
    67.2 3.40823101
    68.8 3.47802206
    70.4 3.54683869
    72.0 3.61462671
    73.6 3.68132723
    75.2 3.74687613
    76.8 3.81120342
    78.4 3.87423259
    80.0 3.93587982
    1.6 0.08615605
    3.2 0.1722964
    4.8 0.25840536
    6.4 0.34446723
    8.0 0.43046625
    9.6 0.51638667
    11.2 0.60221268
    12.8 0.68792842
    14.4 0.77351799
    16.0 0.8589654
    17.6 0.9442546
    19.2 1.02936945
    20.8 1.1142937
    22.4 1.19901098
    24.0 1.2835048
    25.6 1.36775851
    27.2 1.4517553
    28.8 1.53547817
    30.4 1.61890991
    32.0 1.70203307
    33.6 1.78482993
    35.2 1.86728249
    36.8 1.9493724
    38.4 2.03108093
    40.0 2.11238896
    41.6 2.19327688
    43.2 2.27372455
    44.8 2.35371127
    46.4 2.43321567
    48.0 2.51221563
    49.6 2.59068823
    51.2 2.66860961
    52.8 2.74595489
    54.4 2.82269801
    56.0 2.89881164
    57.6 2.97426697
    59.2 3.04903355
    60.8 3.12307913
    62.4 3.19636938
    64.0 3.26886766
    65.6 3.34053476
    67.2 3.41132856
    68.8 3.48120368
    70.4 3.55011109
    72.0 3.61799766
    73.6 3.68480569
    75.2 3.75047231
    76.8 3.81492892
    78.4 3.87810048
    80.0 3.93990477
    

    * my input parameters, 

    s = 2;
    m = 4;
    pitch_in_mm = 0.003;
    f_in_mm = 3.0;
    W = 1920;
    H = 1080;

    * input image - Unfortunately, I cannot share the image; the file is too big (~267 MB).

    Thank you

  • Hello 

    I have also tried running your code in Octave:

    function [] = gen_lut(spec_file, pitch_in_mm,f_in_mm, W, H, hc, vc,s ,m)
    f = f_in_mm/pitch_in_mm ;
    [h_p , v_p] = meshgrid( 0:W, 0:H);
    [h_d,v_d] = xyz2distorted(h_p,v_p, f/s, hc, vc,spec_file, pitch_in_mm);
    h_delta = round((h_d-h_p) * 8);
    v_delta = round((v_d-v_p) * 8);
    mh = h_delta(1:2^m:end, 1:2^m:end)';
    mv = v_delta(1:2^m:end, 1:2^m:end)';
    dlmwrite('new_mesh2.txt', [mh(:), mv(:)],  'delimiter', ' ');
    
    function [h_d, v_d] = xyz2distorted(x, y, z, hc, vc, spec_file, pitch_in_mm)
    xt = x - hc;
    yt = y - vc;
    zt = z * ones(size(xt));
    
    %%
    T = [1 0 0;
         0 1 0;
         0 0 1]; % Example transformation matrix (Identity matrix)
    % Apply the transformation matrix
    xt = xt(:)';
    yt = yt(:)';
    zt = zt(:)';
    homogeneous_coords =[xt; yt; zt];
    rotated_points  = T* homogeneous_coords;
    xt = rotated_points(1,:);
    yt = rotated_points(2,:);
    zt = rotated_points(3,:);
    %%
    
    [phi, r] = cart2pol(x-hc, y-vc);
    theta = atan2(r, z);
    lut = read_spec(spec_file, pitch_in_mm);
    r = interp1(lut(:,1), lut(:,2), theta);
    [h_d, v_d] = pol2cart(phi, r);
    h_d = h_d + hc;
    v_d = v_d + vc;
    
    
    function lut = read_spec(spec_file, pitch_in_mm)
    lut0 = dlmread(spec_file);
    theta = lut0(:,1)/180*pi;
    lut = [theta, lut0(:,2)/pitch_in_mm];
    s = 2;
    m = 4;
    pitch_in_mm = 0.003;
    f_in_mm = 3;
    W = 1920;
    H = 1080;
    hc = W/2;
    vc = H/2;
    Wmesh = ceil(W / 2^m) * 2^m;
    Hmesh = ceil(H / 2^m) * 2^m;
    gen_lut("my_spec_file.txt", pitch_in_mm, f_in_mm, Wmesh, Hmesh, hc, vc, s, m)

    I am still getting the following image:

    output

    Thank you

  • Hi Vivek,

    However, when I tried using the file in the DCC tool and added a transformation matrix, it stopped working.

    Do you use the matrix in the LDC H/W, or do you simply add the perspective change into the mesh LUT?

    My image size is too big (~267mb). 

    VPAC image size is limited to 4K, which is about 12MB for 8MP.
    How do you get an LDC input image of 267MB?

  • I am still getting following image 

    Your output view is still too large.
    You may think of "s" as part of the projective transform, i.e., a scaling factor.

  • Hello Gang Hua, 

    Thank you for your reply. 

    I am not using any hardware for lens distortion correction (LDC). My goal is to create a bin file (using the DCC tool) that includes lens distortion correction and perspective transformation. This file will be used with the tiovxldc GStreamer plugin to obtain a live camera stream.

    I am using a NeduCAM25_CUTDA4 camera with 1920x1080 resolution. I am capturing an image using this GStreamer pipeline:

    DEVICE="/dev/video3" 
    WIDTH=1920
    HEIGHT=1080
    GST_DEBUG_COLOR_MODE=off GST_TRACERS="latency(flags=pipeline+element)" GST_DEBUG=GST_TRACER:7 GST_DEBUG_FILE=traces.log gst-launch-1.0 \
    v4l2src device=$DEVICE ! video/x-raw,width=$WIDTH,height=$HEIGHT,format=UYVY,framerate=30/1 ! filesink location="/opt/edgeai-gst-apps/chessboard_1920x1080.yuv"

    I hope I am capturing the image correctly.
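    (A note on the ~267 MB figure mentioned earlier: one UYVY frame at 1920x1080 is only about 4 MB, and the pipeline above has no frame limit, so filesink keeps appending frames until the pipeline is stopped; limiting v4l2src with num-buffers=1 would capture a single frame. The arithmetic, as a quick check:)

```python
# UYVY packs 4 bytes per 2 pixels, i.e. 2 bytes per pixel.
width, height = 1920, 1080
bytes_per_pixel = 2
frame_bytes = width * height * bytes_per_pixel   # 4,147,200 bytes (~4 MB)

# A ~267 MB file therefore holds dozens of frames, not one
# (assuming 267 MiB; the exact count depends on the reported unit).
frames_in_file = 267 * 1024 * 1024 // frame_bytes
print(frame_bytes, frames_in_file)
```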

  • OK, but if I manipulate the 's' value, then it also crops the image on all sides, so I lose visual information from the left and right. If I try to get the full view of the camera, then I get distortion (stretched lines) at the top and bottom of the image. Do I have to make this FOV compromise?

  • If try to get full view of the camera, then I get distortion (that stretch lines) on top and on bottom of the image.

    I am not sure if I understand you correctly.
    Do you have a fisheye lens or wide FOV lens?

    Correcting distortion will always lose some FOV.
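    As an illustration of this trade-off (a rectilinear-model sketch, not the exact LDC math): treating s as a scaling of the effective focal length f/s, the output half-FOV grows with s, which matches the zoomed-in image seen at s = 0.39 and the overly wide view at s = 2:

```python
import math

f = 3.0 / 0.003      # focal length in pixels (f_in_mm / pitch_in_mm)
half_width = 1920 / 2

# An output pixel at distance r from the center maps to a ray at
# theta = atan(r / (f / s)) in the rectilinear (pinhole) model.
for s in (0.39, 1.0, 2.0):
    half_fov = math.degrees(math.atan(half_width * s / f))
    print(f"s = {s}: horizontal half-FOV ~ {half_fov:.1f} deg")
```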