Hello!
I'm trying to encrypt data using the crypto module; here is a simple example I'm using to test it out:
// Initialize the crypto-drivers
CryptoCC32XX_init();

CryptoCC32XX_Handle handle = NULL;
handle = CryptoCC32XX_open(0, CryptoCC32XX_AES | CryptoCC32XX_DES | CryptoCC32XX_HMAC);
if (!handle)
{
    Display_printf(dispHandle, 0, 0, "CryptoCC32XX did not open");
}
else
{
    // Space for input and data
    unsigned char plainData[16] = "whatsoever123456";
    unsigned int plainDataLen = sizeof(plainData);
    unsigned char cipherData[16] = {0};
    unsigned int cipherDataLen = 0;

    // 32 byte key (256 bits)
    uint8_t key[32] =
    {
        0xE0, 0xE1, 0xE2, 0xE3, 0xE4, 0xE5, 0xE6, 0xE7,
        0xE0, 0xE1, 0xE2, 0xE3, 0xE4, 0xE5, 0xE6, 0xE7,
        0xE0, 0xE1, 0xE2, 0xE3, 0xE4, 0xE5, 0xE6, 0xE7,
        0xE0, 0xE1, 0xE2, 0xE3, 0xE4, 0xE5, 0xE6, 0xE7,
    };

    // 12 byte nonce + 4 byte counter (for AES_CTR)
    uint8_t nonce[16] =
    {
        0xF0, 0xF1, 0xF2, 0xF3, 0xF4, 0xF5, 0xF6, 0xF7,
        0xF8, 0xF9, 0xFA, 0xFB, 0x00, 0x00, 0x00, 0x00,
    };

    CryptoCC32XX_EncryptParams aesParams;
    aesParams.aes.keySize = CryptoCC32XX_AES_KEY_SIZE_256BIT;
    aesParams.aes.pKey = &key[0];
    aesParams.aes.pIV = (void *)&nonce[0];

    int32_t status = CryptoCC32XX_encrypt(handle, CryptoCC32XX_AES_CTR,
                                          plainData, plainDataLen,
                                          cipherData, &cipherDataLen,
                                          &aesParams);

    Display_printf(dispHandle, 0, 0, "Encrypting Data: status = %d", status);
}
The "CryptoCC32XX_encrypt()" returns -1, and when I step into the function, it correctly calls "CryptoCC32XX_aesProcess()".
But then it gets to this part
int32_t count = CryptoCC32XX_CONTEXT_READY_MAX_COUNTER;

/*
    Step1: Enable Interrupts
    Step2: Wait for Context Ready Interrupt
    Step3: Set the Configuration Parameters (Direction, AES Mode and Key Size)
    Step4: Set the Initialization Vector
    Step5: Write Key
    Step6: Start the Crypt Process
*/

/* Clear the flag. */
g_bAESReadyFlag = false;

/* Enable all interrupts. */
MAP_AESIntEnable(AES_BASE, AES_INT_CONTEXT_IN | AES_INT_CONTEXT_OUT |
                 AES_INT_DATA_IN | AES_INT_DATA_OUT);

/* Wait for the context in flag, the flag will be set in the Interrupt handler. */
while((!g_bAESReadyFlag) && (count > 0))
{
    count--;
}
if (count == 0)
{
    return CryptoCC32XX_STATUS_ERROR;
}
where the counter always counts down to 0, so the function returns "CryptoCC32XX_STATUS_ERROR" (the -1 value).
So "g_bAESReadyFlag" is never set to true within the 1000 tries.
From what I can tell, "g_bAESReadyFlag" should be set in "CryptoCC32XX_aesIntHandler()", here:
void CryptoCC32XX_aesIntHandler(void)
{
    uint32_t uiIntStatus;

    /* Read the AES masked interrupt status. */
    uiIntStatus = MAP_AESIntStatus(AES_BASE, true);

    /* Set Different flags depending on the interrupt source. */
    if(uiIntStatus & AES_INT_CONTEXT_IN)
    {
        MAP_AESIntDisable(AES_BASE, AES_INT_CONTEXT_IN);
        g_bAESReadyFlag = true;
    }
    if(uiIntStatus & AES_INT_DATA_IN)
    {
        MAP_AESIntDisable(AES_BASE, AES_INT_DATA_IN);
    }
    if(uiIntStatus & AES_INT_CONTEXT_OUT)
    {
        MAP_AESIntDisable(AES_BASE, AES_INT_CONTEXT_OUT);
    }
    if(uiIntStatus & AES_INT_DATA_OUT)
    {
        MAP_AESIntDisable(AES_BASE, AES_INT_DATA_OUT);
    }
}
if the "AES_INT_CONTEXT_IN" flag is set. (Although it looks like it disables the flag where "g_bAESReadyFlag" is set to true.)
From here on it is a little harder to follow, but my best guess is that:
// This is a macro
MAP_AESIntEnable(AES_BASE, AES_INT_CONTEXT_IN | AES_INT_CONTEXT_OUT |
                 AES_INT_DATA_IN | AES_INT_DATA_OUT);

// ... which expands to
AESIntEnable(AES_BASE, AES_INT_CONTEXT_IN | AES_INT_CONTEXT_OUT |
             AES_INT_DATA_IN | AES_INT_DATA_OUT);

// The function looks for these flags and sets them (using some magic) if the type is AES:
AES_INT_CONTEXT_IN
AES_INT_CONTEXT_OUT
AES_INT_DATA_IN
AES_INT_DATA_OUT
AES_INT_DMA_CONTEXT_IN
AES_INT_DMA_CONTEXT_OUT
AES_INT_DMA_DATA_IN
AES_INT_DMA_DATA_OUT
So, from all I can see, it should really be working. Please tell me if I'm missing something, or if I have misunderstood things.
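In case it is useful, I was also considering polling the raw (unmasked) interrupt status from my own code instead of relying on the handler, just to see whether the hardware ever raises "AES_INT_CONTEXT_IN" at all. This is only a rough, untested sketch, and I am assuming that "AESIntStatus()" with false as the second argument returns the raw status bits:

/* Untested debugging sketch: spin on the raw interrupt status instead of
 * g_bAESReadyFlag, to separate "the hardware never signals context-in"
 * from "the interrupt never reaches the handler".
 * Assumes AESIntStatus(base, false) returns the raw (unmasked) status. */
int32_t count = CryptoCC32XX_CONTEXT_READY_MAX_COUNTER;
while (((MAP_AESIntStatus(AES_BASE, false) & AES_INT_CONTEXT_IN) == 0) && (count > 0))
{
    count--;
}
Display_printf(dispHandle, 0, 0, "Raw AES interrupt status: 0x%08x, count = %d",
               (unsigned int)MAP_AESIntStatus(AES_BASE, false), (int)count);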
Thanks in advance for the help!
Kind regards
David