Altera_Forum
Honored Contributor
16 years ago

Implement Phase Correlation algorithm
I'm implementing a Phase Correlation algorithm in an FPGA. Two images come in, each gets windowed with a 2D Hamming window, then a 2D FFT. The two spectra are complex-multiplied, with one conjugated first. The magnitude of that product is then used as a normalizing divisor for the complex-multiply output. The remaining steps are a 2D IFFT of the result, taking the magnitude of that, and searching for the peak. The 2D location of the peak indicates the amount of shift between the two images.
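For anyone following along, the pipeline above can be sketched as a floating-point golden model in NumPy; the image size, the epsilon guard on the divide, and the test shift are my own assumptions, not from my actual design:

```python
import numpy as np

def phase_correlate(img_a, img_b):
    # 2D Hamming window (outer product of two 1D Hamming windows)
    win = np.outer(np.hamming(img_a.shape[0]), np.hamming(img_a.shape[1]))

    # 2D FFT of each windowed image
    Fa = np.fft.fft2(img_a * win)
    Fb = np.fft.fft2(img_b * win)

    # Complex multiply with one spectrum conjugated, then normalize
    # by the magnitude of the product (epsilon avoids divide-by-zero)
    cross = Fb * np.conj(Fa)
    cross /= np.abs(cross) + 1e-12

    # 2D IFFT, magnitude, peak search
    surf = np.abs(np.fft.ifft2(cross))
    return np.unravel_index(np.argmax(surf), surf.shape)

# Usage: shift a random image by a known amount and recover the shift
rng = np.random.default_rng(0)
a = rng.random((64, 64))
b = np.roll(a, (5, 3), axis=(0, 1))   # circular shift by (5, 3)
print(phase_correlate(a, b))
```

With this conjugation order (second spectrum times the conjugate of the first), the peak lands at the (row, col) shift of the second image relative to the first.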
I've been trying to use fixed-point math for the cores I use. The FFTs are block floating point, but the magnitude step's square-root core and the divider core are fixed point. I'm also trying to trim the output widths of various stages so I'm not carrying all those bits around. I've been comparing Verilog sim runs to Matlab sim runs (using the FFT core's Matlab model and trying to match the trimmed output widths in Matlab). I get good agreement (and correct operation) until I put the square-root and divider cores into the Verilog sim; things go south at that point. Has anybody out there implemented this kind of algorithm? Am I wasting my time using fixed-point cores and trimming widths, since I'm just going to lose too much accuracy? I hate the thought of using floating-point cores and the resulting size increase, as there are a lot of other functions in the FPGA (an entire color camera path).
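To show the bit growth I'm fighting in the magnitude and normalize steps, here's a minimal integer-arithmetic sketch of what the fixed-point square-root and divider cores compute; the fraction width and scaling convention are assumptions for illustration, not my actual core settings:

```python
import math

def fixed_mag(re, im):
    """Magnitude of a complex sample, as a fixed-point sqrt core sees it.

    re, im: signed W-bit integers. The sum of squares needs 2W+1 bits,
    and the integer square root fits back into W+1 bits (unsigned).
    """
    return math.isqrt(re * re + im * im)

def fixed_normalize(re, im, frac_bits=8):
    """Divide a complex sample by its magnitude in fixed point.

    The quotient lies in roughly [-1, 1], so after normalization the
    divider output only needs a sign bit plus frac_bits fraction bits,
    regardless of how wide the inputs were.
    """
    mag = max(fixed_mag(re, im), 1)   # guard against divide-by-zero
    scale = 1 << frac_bits
    return (re * scale) // mag, (im * scale) // mag

# 3 + 4j has magnitude 5; normalized real part 0.6 -> 153/256
print(fixed_mag(3, 4))           # 5
print(fixed_normalize(3, 4))     # (153, 204)
```

The point being: the inputs to the divide can be wide, but the normalized output is bounded, so the quotient's fraction bits are where the accuracy actually lives, and trimming there directly corrupts the phase going into the IFFT.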