--- Quote Start ---
create_clock -name SPI_clock -period 100 [get_ports {SPI_clock}]
create_clock -name virt_SPI_clock -period 100
We want to clock the data in the middle of a bit, what set_input_delay do we use?
--- Quote End ---
From the FPGA's perspective you don't "clock" an input; the FPGA receives the data as it arrives. The set_input_delay constraint simply describes that external timing relationship to the tool so it can verify it.
If the data transitions right in the middle of the clock period, then min = max = 50 ns.
It is only at the FPGA outputs that you have control over the data/clock relationship, but even there the tool stops once timing is met and does not give you the option of optimising it any further.
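To make that concrete, the constraints could be sketched as below, referencing the virtual clock from the quoted question (the data port name `SPI_data` is an assumption; substitute your actual port):

```tcl
# Data toggles in the middle of the 100 ns bit period, i.e. 50 ns after
# the launch edge of the virtual clock, so the external delay is fixed:
# min = max = 50 ns. (SPI_data is a placeholder port name.)
set_input_delay -clock virt_SPI_clock -max 50 [get_ports {SPI_data}]
set_input_delay -clock virt_SPI_clock -min 50 [get_ports {SPI_data}]
```

Because min and max are equal, the tool knows the data is stable everywhere except at the 50 ns transition point, which centres the capture window on the middle of the bit.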