Altera_Forum
Honored Contributor
10 years ago
Data streaming for image processing
Hello fellows,
I'm starting the implementation of an image processing pipeline on the Arrow SoCKit. The ARM will capture images from two fisheye cameras over USB, and the FPGA fabric should then do some processing. My goal is to achieve at least 10 fps. I had some trouble with the cameras: because of the ARM processor speed, the cameras were too fast, but I have now learnt how to configure them, and 10 fps is the fastest I can get. I still have to work on the synchronization between them, but that is a camera-specific issue.

My questions concern the FPGA side: what is the most efficient method to pass the data to the FPGA? I was thinking of a DMA here, since the images are too big to store in the internal memories of the FPGA. Is there some reference design I could learn from? Or maybe a shared memory? I'm lost...

The theoretical formulation of the processing pipeline is done and I'm now proceeding to the implementation, but the data transfer will be the stumbling block, as I have no experience in managing the external interfaces. Any help will be welcome!

Best regards!