Corrupted data structures with CRU DMA transfers

Hello,

we took data with the two TPC FECs and the CRU. When decoding the data, I see that the data structure layout (packets, superpages and the interleaving of superpages) seems to be broken.

As an example, I see the following. Similar behavior can be observed with just one link in the data (it doesn’t matter whether it is Link 0 or Link 1).

Link 0 - 128 packets
Link 1 - 128 packets
Link 0 - 128 packets
Link 1 - 128 packets
Link 0 - 128 packets
Link 1 - 128 packets
Link 0 - 128 packets
Link 1 - 128 packets
Link 0 - 128 packets
Link 1 - 128 packets
Link 0 - 128 packets
Link 1 - 128 packets

Link 0 - 1158 packets
Link X - 6266 packets (meaning NO header, just 0x0)
Link 0 - 640 packets
Link 1 - 4625 packets
Link X - 2799 packets (meaning NO header, just 0x0)
Link 0 - 640 packets.

Since the header information is not correct yet, I assume a header (the RDH of a packet) at every 8 kB position. Looks a bit scrambled to me…
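
For reference, this is roughly how I scan the dump (a minimal sketch, not a real RDH decoder; the fixed 8 kB packet size, the "all zeros means no header" check, and the file name are assumptions on my side):

```python
# Minimal sketch: walk a roc-bench-dma dump in 8 KiB steps and classify each slot.
# Assumptions: fixed 8 KiB packets, little-endian 64-bit words, "all zeros" == no header.
# This is not a real RDH decoder, just a quick sanity check of the layout.
import struct
import sys

PACKET_SIZE = 8 * 1024  # assumed size of one packet (RDH + payload) in the dump

def scan(path):
    with open(path, "rb") as f:
        index = 0
        while True:
            packet = f.read(PACKET_SIZE)
            if len(packet) < PACKET_SIZE:
                break  # end of file (or a truncated last slot)
            w0, w1 = struct.unpack_from("<QQ", packet, 0)  # first two 64-bit words
            if w0 == 0 and w1 == 0:
                print(f"packet {index}: no header, just 0x0")
            else:
                print(f"packet {index}: header words 0x{w0:016x} 0x{w1:016x}")
            index += 1

if __name__ == "__main__":
    scan(sys.argv[1] if len(sys.argv) > 1 else "/tmp/rdo_link-0-1.bin")
```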

Cheers,
Torsten

Ciao,
I think it depends on how much data you stored in the file.

The bench-dma program dumps the content of the superpages it has allocated after you stop it (or after it has collected the requested amount of data). It is possible that the scrambled data is just leftover data in the superpages that was never written by the DMA.
There should be an option to clear the allocated memory (but I’ll have to check).

To be sure, I would like to look at the data. Could you share the location of the file so I can check it, and tell me how much data you wanted to collect?

All of this is fixed by readout.
I am working with Sylvain to update the code to work with the latest release of the firmware.

Thx

Yo!
Alright, I think I know what happened. I started roc-bench-dma with the following command:

```
roc-bench-dma --verbose --id=05:00.0 --buffer-size=128Mi --superpage-size=1Mi --links=0-1 --no-errorcheck --bytes=12Mi --to-file-bin=/tmp/rdo_link-0-1.bin --loopback=NONE --generator=0
```

So it took 12 MiB of data (which corresponds to the 6 × 2 blocks of 128 packets from the two links) but then also dumped the whole rest of the 128 MiB buffer to disk, which contained random or previous data.
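
Just to spell out the numbers from the options above (a quick sketch; the 8 kB packet size is the same assumption as before):

```python
# Quick arithmetic check of the numbers above (same 8 KiB packet-size assumption).
KiB, MiB = 1024, 1024 * 1024

packet_size    = 8 * KiB     # assumed packet size
superpage_size = 1 * MiB     # --superpage-size=1Mi
buffer_size    = 128 * MiB   # --buffer-size=128Mi
requested      = 12 * MiB    # --bytes=12Mi
links          = 2           # --links=0-1

packets_per_superpage = superpage_size // packet_size          # 128 packets
superpages_per_link   = requested // superpage_size // links   # 6 superpages
stale_in_dump         = buffer_size - requested                # never written by DMA

print(packets_per_superpage, superpages_per_link, stale_in_dump // MiB)
# prints: 128 6 116
```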

Will cross-check.

Yeah … I think you are right.
I just checked 16 GB of data with 4 links and all the headers looked fine so far.

I’ll keep checking, but it should be consistent with your observation.

Cheers