--- Quote Start ---
I am a bit puzzled about the concept of lag, for instance. I understand how it seems to work from the second text, and in a static, statistical sense I can understand it, but in a streaming system I don't see why you would ever need to calculate it for anything other than a lag of zero (since the stream itself will slide across the reference signal as it is received)? Am I missing something?
--- Quote End ---
You have two patterns you want to compare, but you don't know when (in time) either one started; e.g., in your case, one is the received transmission and the other is the local copy of the known pattern. If both patterns started at the same time, i.e., you managed to start comparing them at exactly the right moment, then sure, the zeroth lag would be the peak. In general, though, you don't know when the pattern was transmitted, so you search for the peak in the lag response to determine that time, i.e., the relative offset between the two pattern start times.
Does that make more sense?
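Here's a rough Python sketch of the idea; the PRBS7 generator, the delay of 42, and the noise level are all just illustrative choices of mine, not anything specific to your setup:

```python
import numpy as np

def make_prbs7():
    """One period (127 bits) of a PRBS7 (x^7 + x^6 + 1), mapped to +/-1."""
    state = 0x7F                                  # any non-zero 7-bit seed
    bits = []
    for _ in range(127):
        new = ((state >> 6) ^ (state >> 5)) & 1   # feedback taps
        state = ((state << 1) | new) & 0x7F
        bits.append(2 * new - 1)                  # map {0,1} -> {-1,+1}
    return np.array(bits, dtype=float)

ref = make_prbs7()
true_delay = 42                                   # unknown to the receiver
rx = np.concatenate([np.zeros(true_delay), ref, np.zeros(30)])
rx += 0.3 * np.random.randn(len(rx))              # channel noise

# Correlate the known pattern against the stream at every candidate lag;
# the location of the peak is the relative start time between the two.
lags = np.arange(len(rx) - len(ref) + 1)
corr = np.array([np.dot(rx[k:k + len(ref)], ref) for k in lags])
print("estimated delay:", lags[np.argmax(corr)])  # prints 42 (with high probability)
```

If you only ever looked at lag zero you'd just get the first correlation value in that array, which tells you nothing unless you happened to start at exactly the right sample.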
When you're doing pattern detection with transmitters, you can cheat a little: measure just one lag, and keep adding delay to your local copy until you find a peak. If your transmitter pattern is a PRBS7, you should find the peak within 127 tries (since after that the pattern repeats).
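A quick sketch of that trick (again just illustrative: the random +/-1 sequence stands in for one period of a PRBS7, and the 0.8 threshold is an arbitrary choice of mine):

```python
import numpy as np

def find_offset_single_lag(stream, ref, threshold=0.8):
    """Test lag zero only, stepping the local copy's delay one sample per try.
    For a pattern that repeats every len(ref) samples, at most len(ref) tries."""
    n = len(ref)
    for delay in range(n):
        local = np.roll(ref, -delay)              # local copy advanced by 'delay'
        r = np.dot(stream[:n], local) / n         # one normalized correlation value
        if r > threshold:                         # ~1.0 when aligned
            return delay
    return None

rng = np.random.default_rng(0)
ref = rng.choice([-1.0, 1.0], size=127)           # stand-in for one PRBS7 period
stream = np.roll(np.tile(ref, 2), -42)[:200]      # repeating pattern, offset by 42
print(find_offset_single_lag(stream, ref))        # prints 42
```

The point is that each try only needs a single multiply-and-accumulate per sample, which is cheap to do in hardware as the stream arrives.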
Cheers,
Dave