From: Phil Hobbs
Subject: Re: Can clock jitter in an A/D-D/A system cause aliasing?
Date: Fri, 13 Sep 2002 13:37:28 -0400
Organization: IBM T. J. Watson Research Center
NNTP-Posting-Date: 13 Sep 2002 17:37:30 GMT
X-Mailer: Mozilla 4.61 [en] (OS/2; U)
> Lately I have been reading literature on nonlinear sampling, with the idea
> that by jittering the sample clock on the A/D one can eliminate aliasing
> which can occur if there are frequency components of the input signal above
> the Nyquist frequency. This relaxes the input anti-aliasing requirements.
> On the other hand, I have heard it said that clock jitter in the D/A clock
> might cause aliasing. I can kind of see this but not quite. I imagine the
> output of the D/A before reconstruction filtering, where the aliases of the
> output spectrum are replicated at multiples of the sample rate. I would
> think that clock jitter on the D/A would cause some kind of distortion in
> the output spectrum which I would expect would be harmonically related to
> the sample frequency. I can see that some of this distortion might modulate
> down to the baseband part of the spectrum. However, I can't see how this
> distortion, even if it were in the baseband spectrum (below the corner of
> the output reconstruction filter), could be considered aliasing.
> Reference materials are welcome.
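The premise in the question can be checked numerically. Below is a minimal sketch (my own illustration, not from the original post) that samples a 700 Hz tone at a nominal 1 kHz rate, far above Nyquist, first on a uniform grid and then with heavy random timing jitter. The specific numbers (fs, f_in, jitter span) are arbitrary choices for the demo:

```python
import numpy as np

rng = np.random.default_rng(0)

fs = 1000.0    # nominal sample rate (Hz); Nyquist = 500 Hz
f_in = 700.0   # input tone above Nyquist; aliases to 300 Hz
n = 4000       # FFT bin spacing = 0.25 Hz, so 300 Hz lands on a bin

t_uniform = np.arange(n) / fs
# Random timing error up to +/- half a sample period (exaggerated for clarity)
t_jittered = t_uniform + rng.uniform(-0.5, 0.5, n) / fs

def spectrum(t):
    """Hann-windowed amplitude spectrum of the tone sampled at times t."""
    x = np.sin(2 * np.pi * f_in * t)
    return np.abs(np.fft.rfft(x * np.hanning(n))) / n

s_u = spectrum(t_uniform)
s_j = spectrum(t_jittered)

freqs = np.fft.rfftfreq(n, 1 / fs)
alias_bin = int(round(300.0 * n / fs))

# Uniform sampling: coherent alias spike at 300 Hz.
# Jittered sampling: the alias partially decorrelates into broadband noise.
print("alias at 300 Hz, uniform: ", s_u[alias_bin])
print("alias at 300 Hz, jittered:", s_j[alias_bin])
```

With this much jitter the 300 Hz alias line drops substantially and the lost energy reappears as a raised noise floor, which is exactly the trade the question describes: relaxed anti-alias filtering bought at the cost of broadband noise.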
If the jitter has a white frequency spectrum, it will just raise the
noise floor. On the other hand, if there are discrete spectral
components in the clock (e.g. a subharmonic due to an inadequate PLL
loop filter, or pickup, or anything of that sort), you can get mixing
products of the clock spurs and the input signal.
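The mixing-product case is easy to demonstrate as well. This sketch (again my own illustration; the spur frequency and jitter amplitude are made-up values) puts a single sinusoidal timing spur at 60 Hz on the sample clock and shows sidebands appearing at f_in +/- 60 Hz around an in-band tone:

```python
import numpy as np

fs = 1000.0        # sample rate (Hz)
f_in = 200.0       # in-band input tone
f_m = 60.0         # discrete spur on the clock (e.g. PLL subharmonic, pickup)
n = 4000           # 0.25 Hz bins, so all tones land exactly on bins
jitter_amp = 2e-4  # peak timing error, seconds (illustrative)

k = np.arange(n)
# Sample instants perturbed by a coherent sinusoidal spur, not random jitter
t = k / fs + jitter_amp * np.sin(2 * np.pi * f_m * k / fs)

x = np.sin(2 * np.pi * f_in * t)
spec = np.abs(np.fft.rfft(x * np.hanning(n))) / n

carrier = spec[int(round(f_in * n / fs))]
for f in (f_in - f_m, f_in + f_m):
    b = int(round(f * n / fs))
    print(f"sideband at {f:.0f} Hz: {20 * np.log10(spec[b] / carrier):.1f} dBc")
```

The sinusoidal timing error phase-modulates the sampled tone, so the sidebands follow the usual narrowband-PM result: with peak phase deviation 2*pi*f_in*jitter_amp of about 0.25 rad, they sit near -18 dBc. They are spurious mixing products at f_in +/- f_m, not images of the signal folded about fs/2, which is why "aliasing" is the wrong word for them.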
I wouldn't call this aliasing, because aliasing is a specific kind of
spurious signal, and the word is useful only as long as it keeps its
precise meaning.