Forum Discussion

E-Hong
5 years ago

Getting OpenVINO Inference Time per Layer

Is there any way to obtain the per-layer inference time of a deep learning model in OpenVINO?

Thank you.

4 Replies

  • Hi,

    May I know which device you are targeting with OpenVINO? One of the CPUs, or the FPGA?


    -Hazlina


    • E-Hong

      Hi Hazlina,

      I am targeting both CPUs and FPGA.

    • E-Hong

      Hi Hazlina,

      I am using a VGG16/19 deep learning model, and my goal is to run inference on the Arria 10 PAC card. I can already measure the time taken for the whole inference, but I would also like to measure the inference time of each layer of the model.

      Thank you.
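For reference, OpenVINO exposes per-layer timings through performance counters, which the Python API returns via `InferRequest.get_perf_counts()`. Below is a minimal sketch assuming the 2020-era `openvino.inference_engine` API; the model paths (`model.xml`/`model.bin`), the `HETERO:FPGA,CPU` device string, and the `top_layers()` helper are illustrative, not from this thread.

```python
def top_layers(perf_counts, n=5):
    """Rank executed layers by real_time (microseconds), slowest first.

    perf_counts is the dict returned by get_perf_counts():
    {layer_name: {"status", "layer_type", "exec_type",
                  "real_time", "cpu_time"}}.
    """
    executed = {name: s for name, s in perf_counts.items()
                if s.get("status") == "EXECUTED"}
    return sorted(executed.items(),
                  key=lambda kv: kv[1]["real_time"],
                  reverse=True)[:n]

if __name__ == "__main__":
    import numpy as np
    from openvino.inference_engine import IECore

    ie = IECore()
    # Hypothetical IR file names -- substitute your converted model.
    net = ie.read_network(model="model.xml", weights="model.bin")
    # "HETERO:FPGA,CPU" runs FPGA-supported layers on the PAC card and
    # falls back to CPU for the rest; use "CPU" for CPU-only timing.
    # PERF_COUNT must be enabled for counters to be collected.
    exec_net = ie.load_network(network=net,
                               device_name="HETERO:FPGA,CPU",
                               config={"PERF_COUNT": "YES"})

    input_name = next(iter(net.input_info))
    shape = net.input_info[input_name].input_data.shape
    request = exec_net.requests[0]
    # Dummy input just to trigger one timed inference.
    request.infer({input_name: np.zeros(shape, dtype=np.float32)})

    for name, stats in top_layers(request.get_perf_counts()):
        print(f"{name:40s} {stats['layer_type']:15s} "
              f"{stats['real_time']:8d} us")
```

The bundled `benchmark_app` sample prints the same counters when run with the `-pc` flag, which may be enough if you do not need programmatic access.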