Hi,
I cannot get a DMA transfer to happen on the DSP 0 core, neither with the EDMA3 LLD nor with CSL functions.
With the LLD I do the following:
#include <stdint.h>
#include <stdlib.h>
#include <string.h>
#include <xdc/std.h>
#include <xdc/runtime/Error.h>
#include <xdc/runtime/System.h>
#include <ti/sysbios/knl/Semaphore.h>
#include <ti/sdo/edma3/drv/edma3_drv.h>
Uint32 edma3InstanceId = 0;
Uint32 edma3RegionId = 4;
Error_Block eb;
Semaphore_Params semParams;
EDMA3_DRV_Result edma3Result;
EDMA3_DRV_Handle hEdma;
// these are linked from the EDMA3 LLD package samples:
extern EDMA3_DRV_InstanceInitConfig sampleInstInitConfig[][EDMA3_MAX_REGIONS];
extern EDMA3_RM_InstanceInitConfig defInstInitConfig[][EDMA3_MAX_REGIONS];
extern EDMA3_DRV_GblConfigParams sampleEdma3GblCfgParams[];
// Allocate memory for the test
Uint8* dmaSrc = (Uint8 *)malloc(1024); // becomes 0x00823800
Uint8* dmaDst = (Uint8 *)malloc(1024); // becomes 0x00823C08
// Configure Resource Manager, semaphore and create the driver instance
EDMA3_DRV_GblConfigParams *globalConfig = &sampleEdma3GblCfgParams[edma3InstanceId];
EDMA3_DRV_MiscParam miscParam;
miscParam.isSlave = FALSE; // ARM+DSP: specify whether this CPU is the master; single CPU: FALSE
EDMA3_DRV_create(edma3InstanceId, globalConfig, (void *)&miscParam);
EDMA3_DRV_InstanceInitConfig *instanceConfig = &sampleInstInitConfig[edma3InstanceId][edma3RegionId];
EDMA3_DRV_InitConfig initCfg;
initCfg.isMaster = TRUE; // Single-CPU processor, choose TRUE
initCfg.regionId = edma3RegionId;
initCfg.drvInstInitConfig = instanceConfig;
initCfg.gblerrCb = NULL;
initCfg.gblerrData = NULL;
Error_init(&eb); // Error_Block must be initialized before use
Semaphore_Params_init(&semParams);
initCfg.drvSemHandle = Semaphore_create(1, &semParams, &eb);
// open driver
hEdma = MY_EDMA3_DRV_open(edma3InstanceId, (void*)&initCfg, &edma3Result);
// request DMA channel
Uint32 LCh = EDMA3_DRV_HW_CHANNEL_EVENT_32; // GPIO 0 event
EDMA3_RM_EventQueue evtQueue = 0;
Uint32 Tcc = EDMA3_DRV_TCC_ANY; // becomes 32
edma3Result = MY_EDMA3_DRV_requestChannel(hEdma, &LCh, &Tcc, evtQueue, (EDMA3_RM_TccCallback)&edma_isr,
(void*)&g_isrArg);
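For reference, edma_isr and g_isrArg are my own; simplified, the callback looks roughly like this (matching the EDMA3_RM_TccCallback signature from the LLD headers):

volatile int g_transferDone = 0;
Uint32 g_isrArg = 0;
// Completion callback: the LLD invokes it from its completion ISR with the
// TCC number and a status code once the transfer-complete code is set
void edma_isr(uint32_t tcc, EDMA3_RM_TccStatus status, void *appData)
{
    (void)tcc;
    (void)appData;
    if (status == EDMA3_RM_XFER_COMPLETE)
        g_transferDone = 1;
}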
Uint32 paramPhyAddr = 0; // becomes 0x02706400
edma3Result = EDMA3_DRV_getPaRAMPhyAddr(hEdma, LCh, &paramPhyAddr);
// set up the transfer of 256 bytes from dmaSrc ---> dmaDst
EDMA3_DRV_PaRAMRegs paramSet;
memset(&paramSet, 0, sizeof(EDMA3_DRV_PaRAMRegs));
paramSet.srcAddr = (uint32_t)(dmaSrc);
paramSet.destAddr = (uint32_t)(dmaDst);
paramSet.srcBIdx = 0;
paramSet.destBIdx = 0;
paramSet.srcCIdx = 0;
paramSet.destCIdx = 0;
paramSet.aCnt = 0x0100; // 256 bytes
paramSet.bCnt = 1;
paramSet.cCnt = 1;
paramSet.bCntReload = 0;
paramSet.linkAddr = 0xFFFF; // no linking
paramSet.opt = 0x00100008 | (Tcc << 12); // TCINTEN | STATIC, TCC field in bits 12-17
edma3Result = EDMA3_DRV_setPaRAM(hEdma, LCh, &paramSet);
memset((void*)dmaSrc, 0xff, 1024); // fill source test data with 0xff
memset((void*)dmaDst, 0, 1024); // fill the destination test data with zeros
So far, all of the above succeeds without errors; I have checked the return value at each step, although the checks are not included in the code above.
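(The check is simply this pattern after every call:)

if (edma3Result != EDMA3_DRV_SOK)
{
    System_printf("EDMA3 call failed with code %d\n", edma3Result);
    return;
}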
Now the contents of PaRAM at 0x02706400 is:
00120004 00823800
00010100 00823C08
00000000 0000FFFF
00000000 00000001
The event enable registers EER and EERH are both zero, also for the shadow region. I don't know why, but forcing them to 1 makes no difference.
The interrupt enable register IERH = 1 (bit 0, i.e. TCC 32) and the interrupt pending register IPRH = 0.
// trigger the transfer manually
edma3Result = MY_EDMA3_DRV_enableTransfer(hEdma, LCh, EDMA3_DRV_TRIG_MODE_MANUAL);
The result is OK, and the PaRAM set changes to:
00000000 00000000
00000000 00000000
00000000 0000FFFF
00000000 00000000
as expected.
The interrupt pending register IPRH becomes 1. (Tcc = 32, so bit 0 of IPRH.)
But the contents of dmaDst have not changed to 0xff as expected. The contents of dmaSrc are unchanged as well.
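In case my way of checking is wrong: would blocking on the TCC like this, before inspecting the buffers, be the correct approach? A minimal sketch using the LLD's blocking wait:

// block until the transfer completion code for our TCC is set, then clear it
edma3Result = EDMA3_DRV_waitAndClearTcc(hEdma, Tcc);
if (edma3Result == EDMA3_DRV_SOK)
{
    // transfer reported complete; dmaDst should now hold 0xff
}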
Have I forgotten something? I have heard about memory protection. Do I need to deal with it somehow?
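Or could it be a cache coherency issue rather than memory protection? The buffers are written by the CPU before triggering, so I wonder whether I should add cache maintenance like this sketch (assuming the SYS/BIOS Cache module):

#include <ti/sysbios/hal/Cache.h>

// write back the CPU-written source data and invalidate stale destination
// lines, so the EDMA3 controller and the CPU see the same memory contents
Cache_wbInv((Ptr)dmaSrc, 1024, Cache_Type_ALL, TRUE);
Cache_wbInv((Ptr)dmaDst, 1024, Cache_Type_ALL, TRUE);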
A similar transfer succeeds on the ARM core without problems, but not on the DSP. What can be the difference between these two environments? The underlying hardware is the same, after all.
The test data arrays seem to be allocated from the L2 SRAM. Can this cause problems? When I tested with the ARM core, I used DDR3 memory.
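One more thing I noticed: 0x00823800 is a core-local L2 address, and the EDMA3 controller is a system-level master. If that is the problem, would converting the buffer addresses to their global aliases before programming the PaRAM be the right fix? A sketch of what I mean, assuming a KeyStone-style memory map where core n's local L2 appears globally at 0x10800000 + n * 0x01000000 (the helper below is hypothetical):

#include <c6x.h> // for the DNUM core-number register

// translate a core-local L2 address (0x00xxxxxx) to its global alias, which
// is what system masters such as the EDMA3 transfer controllers must use
static uint32_t localToGlobalAddr(void *addr)
{
    uint32_t a = (uint32_t)addr;
    if ((a & 0xFF000000u) == 0u) // core-local alias
        a |= 0x10000000u + ((uint32_t)DNUM << 24);
    return a;
}

paramSet.srcAddr = localToGlobalAddr(dmaSrc);  // 0x00823800 -> 0x10823800 on core 0
paramSet.destAddr = localToGlobalAddr(dmaDst);

That would also be consistent with the ARM test working: there the buffers were in DDR3, where the addresses are already global.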
Best regards,
Ari