Can you please clarify?
- Using CkCrypt2W
- AES block size for IV is 16 bytes
- Hex encoded block (from sample): 000102030405060708090A0B0C0D0E0F
Is it okay to pass the IV as a hex string inside a Unicode (wide) string? Does the library strip out the interleaved zero bytes?
For example, if my "block" were 3 bytes of ASCII 'A' (0x41), the hex encoding is "414141".
Do I pass L"414141", whose in-memory UTF-16LE representation is actually the 12 bytes 34 00 31 00 34 00 31 00 34 00 31 00? If the IV block size were 2, would the library consume the first raw bytes of the wide string as-is, or would it internally narrow the string first and hex-decode 414141 -> AAA?
Hope this makes sense. Basically I'm asking: if the AES block is 16 bytes, do I pass the 32 hex digits as a wide string (e.g. L"000102030405060708090A0B0C0D0E0F") and trust the library to handle the wide encoding, or do I need to account for the UTF-16 zero bytes myself?
Internally, maybe it converts the wide L"414141" to the narrow UTF-8 string "414141" before hex-decoding?
Just wanted to verify.