From: Mike Monett
X-Mailer: Mozilla 2.02 (Win16; I)
Subject: Re: Binary Sampler
References: <3E244AFE.4F3E@sneakemail.com> <3E26E3E2.firstname.lastname@example.org>
Date: Thu, 16 Jan 2003 17:41:42 -0500
NNTP-Posting-Date: Thu, 16 Jan 2003 17:41:10 EST
Organization: Bell Sympatico
Mike Monett wrote:
> This is not quite true. The noise performance is shown in a series
> of simulations I did, but I took the web page down since it took too
> long to load. I plan to condense it and put it back on my web site
> sometime soon.
> The performance is improved by increasing the ripple amplitude. In
> the simulation, usable signals are recovered down to 60dB below the
> rms noise.
> I plan to do more simulations using band-limited Gaussian noise, but
> it will take a while to develop the software.
Mike, since you are one of the few people interested in noise analysis
of the sampler, I went ahead and uploaded the page on simulations. It is
not linked from the other files and takes a while to download. Here's the
link:
The most interesting graph is "Fig. 10. 640 units of noise, Ripple 40X".
It shows recovery of a 100MHz signal in 56 dB of noise. The increase in
the ripple value needed to obtain this result corresponds exactly with
your prediction. The recovered signal is quite usable.
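The mechanism behind that recovery can be sketched in a few lines. This is only an illustrative model, not the actual simulator: a 1-bit comparator samples a small sine buried in Gaussian noise, and averaging the +/-1 outputs over many sweeps recovers a scaled copy of the waveform, with the noise (or added ripple) acting as dither. All the parameters here (64 points, 8000 sweeps, 0.1 amplitude in unit-rms noise) are made up for the demo.

```python
import math
import random

def binary_sampler_average(n_points=64, freq_cycles=3, amp=0.1,
                           noise_rms=1.0, n_sweeps=8000, seed=1):
    """Average the +/-1 outputs of a 1-bit comparator over many sweeps.

    The noise acts as dither: the mean comparator output at each
    sample point is (for small signals) proportional to the signal
    level at that point, so averaging recovers the waveform shape.
    """
    rng = random.Random(seed)
    acc = [0.0] * n_points
    for _ in range(n_sweeps):
        for i in range(n_points):
            # One period of a small sine, sampled at n_points instants
            s = amp * math.sin(2.0 * math.pi * freq_cycles * i / n_points)
            # 1-bit comparator decision on signal + noise
            acc[i] += 1.0 if s + rng.gauss(0.0, noise_rms) > 0.0 else -1.0
    return [a / n_sweeps for a in acc]
```

For small signals the recovered amplitude is roughly sqrt(2/pi) * amp / noise_rms (about 0.08 with these numbers), and the residual noise on the average shrinks as 1/sqrt(n_sweeps).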
A conventional sampler would take too long to collect enough samples to
obtain this result by averaging, and the signal would probably drift
during the measurement, which would invalidate the data.
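The cost of doing it by averaging is easy to put a number on. Assuming the 56 dB figure above, and an illustrative 1 kHz sweep rate (a made-up number; real trigger rates vary widely):

```python
# Averaging M sweeps improves amplitude SNR by sqrt(M),
# i.e. 10*log10(M) dB, so to gain 56 dB:
snr_gain_db = 56.0
sweeps = 10 ** (snr_gain_db / 10.0)   # about 4e5 sweeps
sweep_rate_hz = 1000.0                # assumed trigger rate (illustrative)
seconds = sweeps / sweep_rate_hz      # several minutes of averaging per trace
```

At roughly 400,000 sweeps per trace, even a modest amount of signal drift during those minutes would smear the average.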
The simulation uses infinite-bandwidth Gaussian noise, and the results
may change with normal bandwidth-limited noise. I have the algorithms to
generate this type of noise, but it will take a while to write the code. The
process is simple: generate Gaussian noise, store it in a huge array,
then go through and filter each data point.
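That process can be sketched as follows, with an assumed single-pole low-pass standing in for whatever band-limiting filter is actually used; the filter choice and the alpha value are illustrative, not taken from the simulator.

```python
import random

def gaussian_noise(n, rms=1.0, seed=1):
    """n samples of zero-mean Gaussian noise (infinite-bandwidth model)."""
    rng = random.Random(seed)
    return [rng.gauss(0.0, rms) for _ in range(n)]

def band_limit(samples, alpha=0.1):
    """Single-pole IIR low-pass: y[i] = y[i-1] + alpha*(x[i] - y[i-1]).

    alpha sets the cutoff relative to the sample rate; smaller alpha
    means a lower cutoff and a smoother (more band-limited) output.
    """
    y = 0.0
    out = []
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out
```

For a single-pole filter the output variance drops to about alpha/(2 - alpha) of the input variance, so the band-limited noise is noticeably quieter than the raw sequence even though both are Gaussian.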
Fortunately, I have developed a method of debugging software that
addresses huge amounts of data, so it should not be too difficult; it
just takes time to do. The addressing method is shown here:
[Archived from the sci.electronics.design newsgroup]