Thanks, guys.
This is what I have done:
I defined a constant called DPERIOD that contains my one-bit time. Since in Verilog I cannot make an edge occur at a Gaussian random instant (it would be a floating-point value), and I cannot offset the edge by a floating-point fraction of a unit interval of simulation time (can I?), I made the occurrence of the edge depend on a Gaussian random decision.
After DPERIOD time, I generate a Gaussian random variable with mean 0 and variance 1. If this random variable is less than a certain threshold I specify, the edge occurs and the data changes level. If it is greater than the threshold, the generation repeats until the condition is met, and then the edge occurs.
This way, changing the threshold lets me control how much noise I want in my signal, and it also works around the fact that I can't make the transition occur at a variable floating-point simulation time. (Again, can I?)
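For reference, here is a minimal sketch of what I mean, assuming the standard Verilog system function $dist_normal (which returns an integer, so I scale by 1000 to approximate a unit-variance Gaussian). The module name, parameter values, and seed are all made up for illustration:

```verilog
// Hypothetical sketch of the threshold-gated edge scheme, not production code.
// $dist_normal(seed, mean, sd) is IEEE 1364; sd = 1000 gives roughly
// N(0,1) after dividing by 1000, so THRESHOLD is also scaled by 1000.
module edge_gate;
  parameter DPERIOD   = 10;    // one-bit time (units assumed)
  parameter THRESHOLD = 500;   // threshold of 0.5, scaled by 1000

  integer seed = 42;           // arbitrary seed
  integer sample;
  reg data = 1'b0;

  always begin
    #DPERIOD;
    // Draw Gaussian samples until one falls below the threshold,
    // then toggle the data line (the "edge occurs").
    sample = $dist_normal(seed, 0, 1000);
    while (sample >= THRESHOLD)
      sample = $dist_normal(seed, 0, 1000);
    data = ~data;
  end
endmodule
```

One thing to note: as written, the retries inside the while loop consume zero simulation time, so the edge still lands exactly at DPERIOD and the threshold only controls how many draws it takes. If each failed draw is meant to push the edge later (i.e., produce jitter), a small `#` delay would need to go inside the loop.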
Now I will be plotting my BER vs THRESHOLD.
Would this actually be a good replacement for BER vs Eb/No?
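Here is a rough way the two axes could be related, assuming BPSK over AWGN and assuming that a single above-threshold draw counted as a bit error (a big assumption, since with the retry loop the edge always occurs eventually). With $N \sim \mathcal{N}(0,1)$ and threshold $T$:

```latex
P(N \ge T) = Q(T),
\qquad
P_b^{\mathrm{BPSK}} = Q\!\left(\sqrt{2\,E_b/N_0}\right)
\;\Longrightarrow\;
\left.\frac{E_b}{N_0}\right|_{\mathrm{equiv}} = \frac{T^2}{2}
\quad (T > 0).
```

Under that reading, BER vs THRESHOLD would just be a reparameterized BER vs Eb/No. But because the scheme retries until the edge occurs, the threshold really controls the timing of transitions (jitter) rather than additive amplitude noise on the samples, so the two curves measure different impairments and are not interchangeable in general.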
What do you think of the above?