I am currently benchmarking the zcu104 platform and have set up the zcu104 following the Installation Guide:
- set up the ZynqReleases / git directory
- built / installed / deployed the zcu104 and xilinx19_2_aarch64 platforms
I have successfully run pattern_capture on the zcu104 in standalone, network, and server modes.
However, I am running into issues when I run testbias on the zcu104. Running testbias on the zcu104 (in standalone or network mode) produces the test.output file with the correct number of bytes, but every byte value is zero, as observed with hexdump. When benchmarking the zedboard, applying a bias value of 0 produced identical md5sums for test.input and test.output; this is not the case for the zcu104.
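For reference, this is roughly the check I am doing (a minimal sketch in Python, equivalent to running md5sum and hexdump by hand; it assumes test.input and test.output are in the current working directory):

#!/usr/bin/env python3
# Sketch of the verification: compare sizes and MD5 digests of test.input and
# test.output, and check whether the output file is all zero bytes.
import hashlib
from pathlib import Path

def md5(path):
    """Return the MD5 hex digest of a file, read in chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

inp, out = Path("test.input"), Path("test.output")
print("sizes:", inp.stat().st_size, out.stat().st_size)
print("md5s :", md5(inp), md5(out))
# On the zcu104 the output has the right size but every byte is zero:
print("output all zero?", all(b == 0 for b in out.read_bytes()))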
I have also noticed the following error when running testbias on the zcu104 in server mode; it does not occur in network or standalone mode:
Property 40: file_write.countData = "false"
Property 41: file_write.bytesPerSecond = "0x0"
OCPI( 2:793.0813): Exception during application shutdown: error reading from container server "": EOF on socket read
Exiting for exception: error reading from container server "": EOF on socket read
Both the md5sum and hexdump verification output can be seen in the attached text file, along with the standard output of testbias. Any help in this endeavor would be appreciated.
Thanks,
Joel Palmer