Forum Discussion

RubenPadial
Contributor
2 years ago

Intel FPGA AI Suite Inference Engine

Hello,

I'm using Intel FPGA AI Suite 2023.2 on an Ubuntu 20.04 host computer and trying to run inference for a custom CNN on an Intel Arria 10 SoC FPGA.

I have followed the Intel FPGA AI Suite SoC Design Example Guide and I'm able to compile the Intel FPGA AI Suite IP and run the M2M and S2M examples.

I have also compiled the graph for my custom NN and I'm trying to run it with the Intel FPGA AI Suite IP, but it is not clear to me how to do this. I'm trying to use the dla_benchmark app provided, but, for example, the input data of my NN must be float (the network was trained and the graph was compiled that way), whereas the input data of the IP must be int8, if I'm not mistaken.
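For illustration only, this is a minimal sketch (Python/NumPy) of how I imagine float32 input data could be linearly quantized to int8 before being handed to the IP; the scale factor, zero point, and file names below are just placeholders, and the real values would depend on how the graph was compiled:

    import numpy as np

    # Placeholder scale factor; the real value depends on how the graph
    # was quantized when it was compiled.
    SCALE = 0.05

    def quantize_to_int8(x_float32, scale=SCALE, zero_point=0):
        """Linearly map float32 values to int8, clamping to the int8 range."""
        q = np.round(x_float32 / scale) + zero_point
        return np.clip(q, -128, 127).astype(np.int8)

    # Example: load a float32 input tensor and write the quantized version
    # (file names are placeholders).
    x = np.fromfile("input_float32.bin", dtype=np.float32)
    quantize_to_int8(x).tofile("input_int8.bin")

Is something like this expected, or does the runtime handle the conversion itself?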

Another problem I have is with the ground truth file. I have a separate ground truth file for each input file, because each ground truth is a 225-element array.
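Just to make the layout concrete, this is roughly how my data is organized (the file names below are only examples, not the format dla_benchmark expects):

    import numpy as np

    # Example layout only: each input file has its own ground-truth file,
    # and every ground truth is a 225-element vector.
    pairs = {
        "sample_000.bin": "sample_000_gt.txt",
        "sample_001.bin": "sample_001_gt.txt",
    }

    for input_file, gt_file in pairs.items():
        gt = np.loadtxt(gt_file, dtype=np.float32)  # one 225-element array per input
        assert gt.shape == (225,)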

Is there any additional information or a guide on running a custom model with the Intel FPGA AI Suite?

Thank you in advance

31 Replies

  • JohnT_Altera
    Regular Contributor

    Hi,


    Please create a new forum thread to discuss the DLA runtime or inference engine documentation.