
TDA2PXEVM: VLIB Lucas-Kanade-Tracker detects no movement at all

Part Number: TDA2PXEVM

Hello,

I am currently working on a feature-tracking algorithm using the VLIB_trackFeaturesLucasKanade_7x7 function. For testing purposes I created a pixel grid like this:

I then shift the white pixels 3 steps to the side (I also tried larger movements).

I use VLIB_xyGradients to compute the gradients and also tried letting the Lucas-Kanade tracker compute them itself. For real images I use a minDist filter, but it doesn't matter for this test scenario.

I copy the coordinates from the unshifted pattern to the output coordinates as an initial guess. The coordinates are shifted left by 4 bits so they fit the UQ12.4 format.
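For reference, the UQ12.4 handling described above boils down to a couple of shifts. This is a minimal sketch with hypothetical helper names (not VLIB API): 12 integer bits, 4 fractional bits, so one LSB is 1/16 of a pixel.

```c
#include <stdint.h>

/* UQ12.4: unsigned 16-bit fixed point, 12 integer bits and 4 fractional
 * bits. An integer pixel coordinate is converted by a left shift of 4. */
static uint16_t to_uq12_4(uint16_t px)  { return (uint16_t)(px << 4); }
static uint16_t uq12_4_int(uint16_t q)  { return (uint16_t)(q >> 4); }   /* integer part  */
static uint16_t uq12_4_frac(uint16_t q) { return (uint16_t)(q & 0xFu); } /* fraction, in 1/16 px */
```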

Then I run the Tracker. I tried different Parameters:

+ Use the xyGradients output
+ Let the Lucas-Kanade function compute the gradients itself
+ More and fewer iterations
+ Use and don't use error calculation
+ Different thresholds for early exit

But nothing seems to change the result.

I think I allocated the buffers correctly. The variable "good" holds the number of features to track. I started with a scratch buffer of 839 bytes. Later I tried double the size, but it had no effect.

I don't know what I am missing here, or whether the tracker even works well with a generated pattern like this.

I hope somebody can help me out here.

My code looks like this:

<...>

	// *************************************************************************************************
	//						Track Features with Lucas-Kanade Tracker
	// *************************************************************************************************
	x = (UInt16*)Utils_memAlloc(UTILS_HEAPID_DDR_CACHED_SR, good*sizeof(UInt16), ALGORITHMLINK_FRAME_ALIGN);
	y = (UInt16*)Utils_memAlloc(UTILS_HEAPID_DDR_CACHED_SR, good*sizeof(UInt16), ALGORITHMLINK_FRAME_ALIGN);
	err = (UInt16*)Utils_memAlloc(UTILS_HEAPID_DDR_CACHED_SR, good*sizeof(UInt16), ALGORITHMLINK_FRAME_ALIGN);
	outx = (UInt16*)Utils_memAlloc(UTILS_HEAPID_DDR_CACHED_SR, good*sizeof(UInt16), ALGORITHMLINK_FRAME_ALIGN);
	outy = (UInt16*)Utils_memAlloc(UTILS_HEAPID_DDR_CACHED_SR, good*sizeof(UInt16), ALGORITHMLINK_FRAME_ALIGN);
	scratch = (UInt8*)malloc(839*sizeof(UInt16));	/* double the original 839 bytes */

	memset(x, 0, good*sizeof(UInt16));
	memset(y, 0, good*sizeof(UInt16));

	for(i = 0; i < good; i++)
	{
		x[i] = (UInt16)(minDistFeatures[i].x << 4);
		y[i] = (UInt16)(minDistFeatures[i].y << 4);
	}

	memcpy(outx, x, good*sizeof(UInt16));
	memcpy(outy, y, good*sizeof(UInt16));

	Int32 ret;

	ret = VLIB_trackFeaturesLucasKanade_7x7(img1, img2, NULL, NULL, width, height, good, x, y, outx, outy, NULL, 10, 0, scratch);

	if(ret != 0)	/* VLIB returns VLIB_NO_ERROR (0) on success */
	{
		Vps_printf("********************************************************");
		Vps_printf("error: VLIB_trackFeaturesLucasKanade_7x7 returned %d", ret);
		Vps_printf("********************************************************");
	}

	UInt8 *mask = (UInt8*) Utils_memAlloc(UTILS_HEAPID_DDR_CACHED_SR, width*height*sizeof(UInt8), ALGORITHMLINK_FRAME_ALIGN);
	memset(mask, 0, width*height);	/* clear the mask before drawing into it */

	for(i = 0; i < good; i++)
	{
		if(outx[i] != 0xFFFFU && outy[i] != 0xFFFFU)	/* skip features marked invalid */
		{
			if(outx[i] != x[i] || outy[i] != y[i])	/* movement along either axis */
			{
				Vps_printf("(%d, %d) --> (%d, %d)", x[i] >> 4 , y[i] >> 4, outx[i] >> 4, outy[i] >> 4);

				DrawLine(mask, width, x[i] >> 4, y[i] >> 4, outx[i] >> 4, outy[i] >> 4);
			}
		}
	}

<...>

  • Have you tried this on real images?  The algorithm works on a 7x7 neighborhood and was designed for tracking features computed from something like Harris corners in real images, so I'm not sure whether something about the algorithm works well with this kind of artificial pattern.  Also, the offset may be too far.  Generally this is run on each level of a Gaussian image pyramid to account for different amounts of motion of features between frames.

    I suggest trying this on real images, using Harris corners to detect features, and seeing if it works in that setup.

    Jesse
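The coarse-to-fine pyramid idea above can be sketched as follows. This is a minimal illustration, not VLIB API: downsample2() builds one pyramid level with a plain 2x2 box filter (a real Gaussian pyramid low-pass filters before decimating), and seed_from_coarse() doubles a UQ12.4 coordinate so the estimate from a coarser level can seed the tracker call on the next finer level.

```c
#include <stdint.h>

/* One pyramid level: halve an 8-bit image with a 2x2 box filter.
 * (A proper Gaussian pyramid smooths before decimating.) */
static void downsample2(const uint8_t *src, int w, int h, uint8_t *dst)
{
    int x, y;
    for (y = 0; y < h / 2; y++)
        for (x = 0; x < w / 2; x++)
            dst[y * (w / 2) + x] = (uint8_t)
                ((src[(2 * y) * w + 2 * x]     + src[(2 * y) * w + 2 * x + 1] +
                  src[(2 * y + 1) * w + 2 * x] + src[(2 * y + 1) * w + 2 * x + 1]) / 4);
}

/* A UQ12.4 coordinate found at a coarser level seeds the next finer
 * level by doubling (one left shift), since resolution doubles per
 * level. The tracker is then run again at that level to refine it. */
static uint16_t seed_from_coarse(uint16_t q) { return (uint16_t)(q << 1); }
```

The tracker itself would be called once per level, coarsest first, copying the doubled estimates into the initial-guess arrays each time.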

  • Hello Jesse,

    thanks for your quick reply. I tried the algorithm on real images first, but when it didn't work I wanted to test it under controlled circumstances, so I created this test pattern.

    I don't think I can use the Harris scores in the Lucas-Kanade tracker, because it takes feature coordinates and not a mask. But I think I get your idea: don't use single-pixel features. Instead, let the pixel values fade out into the neighborhood, so the tracker gets more "structure" to track.

    This indeed solved my problem. The features are tracked very well now.
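For anyone who hits the same issue, a feature with a fading neighborhood can be stamped into a test image like this. A minimal sketch with an arbitrary made-up fall-off, not the exact pattern used above:

```c
#include <stdint.h>

/* Stamp a feature with local structure: a bright centre whose intensity
 * falls off over a (2r+1)x(2r+1) neighbourhood, instead of one isolated
 * white pixel. The fall-off rate chosen here is arbitrary. */
static void stamp_blob(uint8_t *img, int w, int h, int cx, int cy, int r)
{
    int dx, dy;
    for (dy = -r; dy <= r; dy++)
        for (dx = -r; dx <= r; dx++) {
            int x = cx + dx, y = cy + dy;
            int ax = dx < 0 ? -dx : dx, ay = dy < 0 ? -dy : dy;
            int d = ax > ay ? ax : ay;   /* Chebyshev distance from centre */
            if (x >= 0 && x < w && y >= 0 && y < h)
                img[y * w + x] = (uint8_t)(255 - d * (200 / (r + 1)));
        }
}
```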

    Switching back to real images showed no tracking again. Then I found my mistake. It is kind of embarrassing, but maybe it helps somebody in the future.

    In every iteration of my algorithm I copy the new frame inPtr[0] into the variable img1 and the old frame (which is in img1) into img2.

    I did it this way:

    memcpy(pAlgHandle->img1, inPtr[0], width*height+1);
    memcpy(img2, pAlgHandle->img1, width*height+1);
    			

    In this case img1 and img2 hold the same frame, and between identical frames there is no movement to track. After switching the order of the memcpy calls, the tracking works fine.
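The corrected order can be sketched like this (buffer names mirror the post; npix stands for width*height):

```c
#include <stdint.h>
#include <string.h>

/* Save the previous frame BEFORE overwriting it with the new one:
 * after the call, img1 holds the newest frame, img2 the one before. */
static void update_frames(uint8_t *img1, uint8_t *img2,
                          const uint8_t *newFrame, size_t npix)
{
    memcpy(img2, img1, npix);      /* old frame (still in img1) to img2 */
    memcpy(img1, newFrame, npix);  /* then the new frame into img1 */
}
```

Keeping two persistent buffers and swapping their pointers each iteration would avoid one full-frame copy altogether.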

    Thanks Jesse, you've been a great help!