PLL Compensation Mode explanation
I have a design in which I am attempting to transfer data from clk1 to clk2. clk1 is the output of an upstream PLL; it drives core logic and is also the reference clock input to a downstream PLL, which generates clk2 at the same frequency. clk2 drives core logic as well.
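For reference, the topology above would look roughly like the following in Quartus-style SDC (the port and net names here, ref_clk / clk1 / clk2, are placeholders for illustration, not names from my actual design):

```tcl
# Reference clock at the device input pin (period is a placeholder).
create_clock -name ref_clk -period 10.000 [get_ports ref_clk]

# Have the Timing Analyzer derive generated clocks for both PLLs:
# clk1 = upstream PLL output (also the downstream PLL's ref input),
# clk2 = downstream PLL output, same frequency as clk1.
# Both are traced back to ref_clk at the pin.
derive_pll_clocks
derive_clock_uncertainty
```

Since both clocks derive from the same reference, the tool treats clk1 -> clk2 paths as synchronous and times them, which is where the hold violations described below show up.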
My understanding from the documentation is that using "normal mode" compensation on the downstream PLL (I can't edit the upstream PLL in this design) should minimize the skew between clk1 and clk2, which should then be phase-aligned by default.
However, based on experimentation and other questions on this forum, my current understanding is that when PLL compensation is enabled in any mode other than direct, the PLL output is compensated relative to the clock's origin (the clock input pin on the device), not relative to the PLL's own input. With normal-mode compensation I'm seeing significant clock skew and hold violations on transfers from clk1 to clk2.
Are there any settings or changes I can make to minimize the skew between clk1 and clk2? Is it possible to force the PLL IP to compensate its output clock against the clock at the PLL input, rather than against the original clock at the pin?
Thanks in advance!