User-Agent: Mozilla/5.0 (Windows; U; Win98; en-US; rv:0.9.4.1) Gecko/20020508 Netscape6/6.2.3
Subject: Re: Can clock jitter in a A/D D/A system cause aliasing?
Date: Wed, 11 Sep 2002 20:43:00 GMT
NNTP-Posting-Date: Wed, 11 Sep 2002 16:43:00 EDT
Here is my spin on this issue:
When energy is present on the clock signal in addition to the 'perfect'
conversion clock, it behaves in much the same way that a 'dirty' L.O.
signal introduces spurs when a mixer is used. If the added energy is
noise-like, the resulting 'spurs' take on the characteristics of noise.
If the added signal is highly correlated (for instance, a single spectral
line), then I would expect images of the input signal to be manufactured.
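A minimal sketch of that 'dirty L.O.' analogy (all parameter values here are illustrative assumptions, not from the original post): sampling a pure tone at instants perturbed by a single-line sinusoidal jitter produces sidebands at f0 +/- fj, i.e. manufactured images of the input.

```python
import numpy as np

fs = 1000.0   # sample rate, Hz (assumed for illustration)
f0 = 100.0    # input tone, Hz
fj = 30.0     # jitter modulation frequency -- a single spectral line
A = 2e-4      # jitter amplitude in seconds (exaggerated to make spurs visible)

n = np.arange(4096)
t_ideal = n / fs
# correlated (sinusoidal) jitter on the sampling instants
t = t_ideal + A * np.sin(2 * np.pi * fj * t_ideal)
x = np.sin(2 * np.pi * f0 * t)   # what the jittered sampler records

X = np.abs(np.fft.rfft(x * np.hanning(len(x))))
freqs = np.fft.rfftfreq(len(x), 1 / fs)

def bin_power(f):
    """Spectrum magnitude at the bin nearest frequency f."""
    return X[np.argmin(np.abs(freqs - f))]

# sidebands appear at f0 +/- fj, exactly like mixer spurs from a dirty LO
print(bin_power(f0 - fj) / X.max(), bin_power(f0 + fj) / X.max())
```

The sideband level scales with the phase-modulation index 2*pi*f0*A, so higher input frequencies suffer more from the same absolute jitter.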
If the power spectrum of the 'dither' signal has sufficiently wide
bandwidth, the resulting 'tails' can extend beyond the Nyquist frequency
of the converter and cause images of the fundamental signal to fold back
due to aliasing.
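The wideband case can be sketched the same way (again with assumed, deliberately exaggerated numbers): white jitter on the sampling instants smears the tone's energy into a broadband floor, part of which is energy folded back across the converter's Nyquist frequency.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, f0, N = 1000.0, 100.0, 4096   # illustrative values
t = np.arange(N) / fs

clean = np.sin(2 * np.pi * f0 * t)
# white (wideband) jitter, sigma = 100 us -- 10% of a sample period,
# far larger than any real converter, purely to make the effect obvious
jittered = np.sin(2 * np.pi * f0 * (t + 1e-4 * rng.standard_normal(N)))

def floor_db(x):
    """Average spectral floor, in dB relative to the peak, far from the tone."""
    X = np.abs(np.fft.rfft(x * np.hanning(len(x))))
    X /= X.max()
    band = X[1500:2000]   # bins around 366-488 Hz, well away from 100 Hz
    return 20 * np.log10(np.mean(band))

# the jittered capture shows a raised noise-like floor all the way to Nyquist
print(floor_db(clean), floor_db(jittered))
```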
I do not understand how dithering the conversion clock of the converter
can reduce aliasing per se.
In general I have found that adding a dither signal to the analog input
of the ADC works much better than dithering the clock itself: additive
dither can spread the quantization spurs of the intended signal without
modifying the signal's own spectral characteristics. If you dither the
clock, every signal passing through the converter is 'modulated' by the
dither.
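A minimal sketch of the additive-dither half of this claim (the quantizer resolution, amplitudes, and frequencies are all illustrative assumptions): +/-1 LSB triangular (TPDF) dither added at the input of a hypothetical coarse 4-bit quantizer turns its concentrated harmonic spurs into a flat noise-like floor while leaving the tone itself untouched.

```python
import numpy as np

rng = np.random.default_rng(1)
fs, N = 1000.0, 8192
f0 = fs * 101 / N                  # tone placed exactly on FFT bin 101
t = np.arange(N) / fs
x = 0.9 * np.sin(2 * np.pi * f0 * t)

lsb = 2 / 2**4                     # hypothetical coarse 4-bit quantizer step
def quantize(v):
    return np.round(v / lsb) * lsb

plain = quantize(x)
# +/-1 LSB triangular (TPDF) dither added to the analog input before quantizing
dithered = quantize(x + (rng.random(N) - rng.random(N)) * lsb)

def peak_spur_db(y):
    """Largest spectral line relative to the fundamental, fundamental excluded."""
    Y = np.abs(np.fft.rfft(y * np.hanning(len(y))))
    Y /= Y.max()
    k = int(np.argmax(Y))
    Y[max(k - 4, 0):k + 5] = 0     # notch out the fundamental
    return 20 * np.log10(Y.max())

# undithered quantization concentrates the error into harmonic spurs;
# additive dither spreads the same error power into a noise-like floor
print(peak_spur_db(plain), peak_spur_db(dithered))
```

The total error power actually goes up slightly with dither; the benefit is that it is decorrelated from the signal and spread evenly in frequency rather than piled into discrete spurs.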
A very interesting subject -- If you find a way to eliminate aliasing
via dither for an arbitrary signal I would be highly interested in
finding out how this works!
> Lately I have been reading literature on nonlinear sampling, with the idea
> that by jittering the sample clock on the A/D one can eliminate aliasing
> which can occur if there are frequency components of the input signal above
> the Nyquist frequency. This relaxes the input anti-aliasing requirements.
> On the other hand, I have heard it said that clock jitter in the D/A clock
> might cause aliasing. I can kind of see this but not quite. I imagine the
> output of the D/A before reconstruction filtering, where the aliases of the
> output spectrum are replicated at multiples of the sample rate. I would
> think that clock jitter on the D/A would cause some kind of distortion in
> the output spectrum which I would expect would be harmonically related to
> the sample frequency. I can see that some of this distortion might modulate
> down to the baseband part of the spectrum. However, I can't see how this
> distortion, even if it were in the baseband spectrum (below the corner of
> the output reconstruction filter), could be considered aliasing.
> Reference materials are welcome.