Forum Discussion
Aswathy_C_Intel
New Contributor
6 years ago
We can offer only limited answers to your questions; we assume they are all about Intel FPGA.
Intel FPGA supports Caffe and TensorFlow for inference through OpenVINO. For more details, please refer to: https://www.intel.com/content/dam/www/programmable/us/en/pdfs/literature/solution-sheets/intel-fpga-dl-acceleration-suite-solution-brief%E2%80%93en.pdf.
As of now, deep learning training is not supported on Intel FPGAs. Multiple FPGAs can be used to increase inference throughput.
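For illustration, the inference flow described above can be sketched with OpenVINO's classic Inference Engine Python API. This is a minimal sketch, not an official Intel example: the model paths are placeholders, and the `HETERO:FPGA,CPU` device string (which runs supported layers on the FPGA and falls back to the CPU for the rest) is an assumption based on how older OpenVINO releases exposed the FPGA plugin.

```python
# Hypothetical sketch: loading a converted IR model onto an Intel FPGA
# via OpenVINO's classic IECore API. Device name and paths are assumptions.
try:
    from openvino.inference_engine import IECore  # available when OpenVINO is installed
except ImportError:
    IECore = None  # allow the sketch to be imported without OpenVINO present


def load_fpga_network(model_xml, model_bin, device="HETERO:FPGA,CPU"):
    """Load an IR model for inference, preferring the FPGA with CPU fallback."""
    if IECore is None:
        raise RuntimeError("OpenVINO is not installed")
    ie = IECore()
    net = ie.read_network(model=model_xml, weights=model_bin)
    # Unsupported layers fall back to the CPU under the HETERO plugin.
    return ie.load_network(network=net, device_name=device)
```

Throughput scaling across multiple FPGAs, as mentioned above, would then amount to loading the same network on several devices and distributing requests among them.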
For further details, we will contact the core team and let you know if they can help.