There certainly are clock input requirements/specifications for jitter, duty cycle, etc. if you make use of the MAX10's PLL resources:
https://www.intel.com/content/www/us/en/docs/programmable/683794/current/pll-specifications.html
If you don't use a PLL and instead take your clock input straight onto a global clock tree, then the jitter and duty-cycle characteristics of the incoming clock are passed straight through the clock distribution network to its endpoints.

The different endpoint resource types (logic elements, DSP blocks, memory blocks, etc.) each have their own timing specs/requirements/limitations. But these are static resources, so they generally aren't sensitive to jitter, just to the minimum clock period. And they aren't sensitive to duty cycle per se, but they may have minimum clock high and low times that must be met, though this wouldn't typically become the limiting factor with anything close to a 50% duty cycle.

Having said that, if your input clock has a duty cycle that isn't nominally 50%, you should make sure to reflect that accurately in your timing constraints, so that static timing analysis in Quartus can properly account for it.
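As a sketch of that last point, a non-50% duty cycle is expressed in SDC with the `-waveform` option of `create_clock`, which gives the rise and fall edge times within one period. The port name `clk_in`, the 100 MHz frequency, and the 40/60 duty cycle below are placeholder assumptions; substitute your actual clock port and measured waveform:

```tcl
# Hypothetical example: 100 MHz input clock with a 40/60 duty cycle.
# -waveform {rise fall}: rising edge at 0 ns, falling edge at 4 ns,
# within the 10 ns period, i.e. the clock is high for only 40% of it.
create_clock -name clk_in -period 10.000 -waveform {0.000 4.000} [get_ports clk_in]
```

With this constraint in place, the Timing Analyzer checks half-cycle paths (e.g. rise-to-fall transfers and minimum-pulse-width requirements) against the actual 4 ns high time rather than an assumed 5 ns.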