Forum Discussion
Thanks, @Ash_R_Intel.
For experiment's sake, I modified my trivial design example to feed the PLL directly from the FPGA input pin: the PLL's refclk now arrives at 0.704 ns, and clk2 arrives at ff3 at -0.687 ns. To recap the original scenario, with a CLKCTRL upstream of the PLL, the PLL's refclk arrives at 3.784 ns, and clk2 arrives at ff3 at -1.334 ns.
With the addition of an upstream CLKCTRL / GCLK in the latter case, one would expect a later clk2 arrival time, not the earlier one observed. So one must ask: does this observed result even make sense on the face of it?
It's the compensation figure in the PLL that comes up vastly different between the two scenarios: -5.601 ns vs. -9.485 ns, respectively. That is what's responsible for clk2 arriving even earlier with the upstream CLKCTRL, rather than much later as one would expect. It is unclear where that difference in compensation arises, since the clock distribution downstream of the PLL is identical between the two scenarios, and I think we both agree that the PLL should not in any way be compensating for the added delay of a CLKCTRL / GCLK upstream of it. Right? The observed difference in compensation does not correspond to any difference in the clock network delays. So where does this difference in compensation come from?
A purely speculative interpretation of the observations above: it almost looks as though Quartus is adjusting the delay of the compensation loop to phase-align clk2 to the FPGA clock input pin in both cases, as though it IS trying to compensate for the added delay of the upstream CLKCTRL / GCLK when present (which is NOT what we expect or want). Could that be the case? Is that what it's trying to do? And if so, is there a way, via constraints or otherwise, to prevent Quartus from doing that?
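To make that speculation concrete, here is a quick sanity check over the numbers quoted above. It is only arithmetic on the reported arrival and compensation figures, under the assumption that if Quartus were phase-aligning clk2 to the input pin, the compensation shift should roughly track (with opposite sign) the refclk arrival shift introduced by the CLKCTRL:

```python
# All values in ns, taken from the two scenarios described above.
direct = {"refclk": 0.704, "clk2": -0.687, "comp": -5.601}    # PLL fed from pin
via_gclk = {"refclk": 3.784, "clk2": -1.334, "comp": -9.485}  # CLKCTRL upstream

# How much each quantity moved when the CLKCTRL was added upstream.
refclk_delta = via_gclk["refclk"] - direct["refclk"]  # +3.080 ns
comp_delta = via_gclk["comp"] - direct["comp"]        # -3.884 ns
clk2_delta = via_gclk["clk2"] - direct["clk2"]        # -0.647 ns

print(f"refclk arrival shift : {refclk_delta:+.3f} ns")
print(f"compensation shift   : {comp_delta:+.3f} ns")
print(f"clk2 arrival shift   : {clk2_delta:+.3f} ns")
```

The compensation moves by roughly the negative of the refclk arrival shift (-3.884 ns vs. +3.080 ns), while clk2's arrival stays within about 0.65 ns of where it was, which at least does not contradict the phase-align-to-pin hypothesis.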
Taking a step back:
The available documentation (UG-01155 and A10-HANDBOOK) does seem to indicate, both in the text and in the block diagrams, that the IOPLL can optionally receive its refclk input from a GCLK or RCLK network. Is that fully supported or not? And if so, what are the expected compensation characteristics when refclk is fed from a GCLK or RCLK network?
As to the suggestion of feeding the PLL directly from the FPGA input pin instead:
The trivial design example I presented for discussion is just that. In reality, I don't have the luxury of feeding the PLL directly from the FPGA's input pin. I'm developing a reusable IP block that will be integrated into numerous different FPGA designs, where other parties may own the top level and the other IP blocks residing in it. There will generally not be much visibility between parties and their respective IP, nor the opportunity to collaborate on the top-level clocking scheme. The top-level designs into which this IP block will be integrated may have different and unknown clocking schemes, so I can't make any assumptions about the ultimate origin of the system clock provided to my IP block, other than that it will already be on a GCLK when I receive it. Internally, my IP block then needs to generate one or more derived clocks at integer multiples of the incoming system clock frequency and in phase with it (possibly also with dynamic gating).
I hope that gives you some context and a better idea of what I'm ultimately trying to accomplish. And if you have other suggestions that can accomplish these requirements within these limitations, I'm all ears.
I should mention, just for background, that I've been doing this routinely in Xilinx devices, with which I am admittedly far more familiar. I naturally assumed that a similar capability exists in Altera devices, and the documentation seemed to suggest as much, though not clearly. I hope that was not an incorrect assumption or interpretation on my part.
Fundamentally, the capability I'm seeking is this: to take into my block what is already a global clock, and from it generate new global clocks that are in phase with it. Is that possible in the Arria 10's PLL / clocking architecture, or not? And if so, how?
I look forward to your input.
Thanks,
-Roee