I am facing the following problem with Cadence:
I am running a transient noise simulation of a Super-Regenerative Receiver (SRR). The SRR is built around a VCO centered at roughly 1 GHz, and I've included noise components up to 50 GHz, which forces a maximum step size as small as 2 ps. Spectre itself actually takes step sizes of around 200 fs (in moderate accuracy mode), but I can tolerate the error introduced by increasing the step size to 2 ps.
On the other hand, I need to run this simulation for at least 1 ms (1k bits at 1 Mbps) to check for a 0.1% BER. At a 2 ps step that is more than 500 million time points, and at Spectre's actual ~200 fs step it is closer to 5 billion. Even though I have saved the voltages of only 2 differential output nodes, the result is as large as 130 GB of data.
The problem is that, although the simulation runs and completes without any error, the final results cannot be displayed, or even written to a *.dat or *.txt file using the ocnPrint command in the CIW. Cadence reports that the server has run out of memory (the machine has 40 GB of RAM).
My question is whether there is any way to run the simulation with the same fine 2 ps precision, but down-sample the results before writing them to the hard drive. I need the fine resolution during the simulation to capture the effect of noise, but in the end what I actually want to look at is the 1 Mbps data.
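To illustrate the kind of down-sampling I mean, here is a rough sketch in plain Python (nothing Cadence-specific; the 1 µs bit period and the (time, voltage) sample format are just assumptions for illustration). It streams through the fine-resolution samples and reduces them to one averaged value per bit period, so the full waveform never has to sit in memory at once:

```python
def downsample_to_bits(samples, bit_period=1e-6):
    """Reduce fine-grained (time, voltage) samples to one value per bit.

    `samples` is any iterable of (t, v) pairs sorted by time. Each sample is
    assigned to a bit slot by integer division of its time by `bit_period`,
    and the voltages in each slot are averaged. Yields one float per bit,
    so memory use stays constant regardless of how many samples stream in.
    """
    current = None   # index of the bit slot being accumulated
    acc = 0.0        # running sum of voltages in the current slot
    count = 0        # number of samples in the current slot
    for t, v in samples:
        idx = int(t // bit_period)
        if current is None:
            current = idx
        if idx != current:
            yield acc / count        # flush the finished bit slot
            current, acc, count = idx, 0.0, 0
        acc += v
        count += 1
    if count:
        yield acc / count            # flush the final slot
```

For example, with a 1-second bit period, samples at t = 0.0, 0.5 (values 1, 3) and t = 1.0, 1.5 (values 10, 20) come out as the two bit values 2.0 and 15.0. Something equivalent done inside Spectre/OCEAN at save time, rather than after the fact, is exactly what I'm hoping exists.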
Regards,
Vahid