We are developing a custom board using the C6746.
I have had the NDK working for some time now, but when streaming data we lose occasional blocks. I have seen references to the L2 cache being an issue. Our board has SDRAM rather than DDR. The default setup in the NDK examples I copied has L2 configured as all cache and MAR192-223 set to 0xffffffff. I set the MAR128 bit to 1, and the MAR64 and MAR65 bits to 1, to enable caching of the SDRAM. After doing this the host PC can no longer make a socket connection to the server on the DSP board; I can't ping it either. Have I set this up correctly? My tcf file is attached. The usual messages appear in the console window at boot-up indicating that the EMAC is initialized, the appropriate (hard-coded) IP address has been added, and the PHY link is good (100 Mbit/s).
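In case it helps, this is what the change amounts to in C (a minimal sketch that pokes the C674x megamodule MAR registers directly rather than going through the tcf; the register base address and the region comments reflect my understanding of the memory map, so treat them as assumptions):

#include <stdint.h>

/* C64x+/C674x megamodule: MARn lives at 0x01848000 + 4*n and covers the
 * 16 MB region starting at n * 16 MB. Bit 0 (PC) makes the region cacheable. */
#define MAR(n)  (*(volatile uint32_t *)(0x01848000u + 4u * (n)))

void enableExternalCaching(void)
{
    MAR(128) = 1;  /* 0x80000000: L3 CBA RAM                 */
    MAR(64)  = 1;  /* 0x40000000: EMIFA SDRAM, first 16 MB   */
    MAR(65)  = 1;  /* 0x41000000: EMIFA SDRAM, second 16 MB  */
    /* MAR192-223 (the 0xC0000000 DDR window) are already cacheable
     * via the 0xffffffff default from the NDK example. */
}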
I am using NDK 2.20.03.24, NSP 1.00.00.09, BIOS 5.41.10.36, EDMA LLD 01.11.02.05, and CCS 4.2.
I have since fixed the lost-block issue by buffering up more data before calling send() on the socket. But I would still like to know why caching the SDRAM causes the network operations to fail, since I would like to be able to enable caching for the SDRAM. I thought I read in one of the documents that the EDMA handles cache coherence automatically, and I would expect the NDK stack to handle it if the DMA doesn't. If I leave caching enabled for the L3 CBA RAM (MAR128) but disable caching for the SDRAM (MAR64, MAR65), everything works fine.
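If the EMAC driver really does have to maintain coherence by hand, I would expect something like the following around each buffer handoff (a sketch using the DSP/BIOS 5 BCACHE calls; txBuf, rxBuf, and the two wrapper functions are made-up names for illustration, not anything from the NSP source):

#include <std.h>
#include <bcache.h>

/* Hypothetical packet buffers in cached SDRAM, shared with the EMAC DMA. */
extern char txBuf[1514];
extern char rxBuf[1514];

void beforeDmaReadsTxBuf(int len)
{
    /* Write dirty L1D/L2 lines back to SDRAM so the DMA engine sees
     * the bytes the CPU actually wrote; wait for completion. */
    BCACHE_wb(txBuf, len, TRUE);
}

void afterDmaWritesRxBuf(int len)
{
    /* Discard the stale cached copy so the CPU re-reads the bytes
     * the DMA engine just deposited in SDRAM. */
    BCACHE_inv(rxBuf, len, TRUE);
}

Without calls like these (or with buffers not aligned to cache lines), the CPU and the EMAC could each see stale data, which would explain why the stack goes quiet as soon as I make the SDRAM cacheable.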
Mary