R/E/P Community

R/E/P => R/E/P Archives => Reason In Audio => Topic started by: sdevino on April 28, 2004, 07:35:51 AM

Title: Jitter Specification Input Requested
Post by: sdevino on April 28, 2004, 07:35:51 AM
I am a member of the AES SC-02 committee, and some discussion has started within the group about developing some sort of standard for specifying the jitter performance of audio systems and/or components.

Most of the replies and discussion are either extreme electrical-designer specs (i.e., spec everything to CYA) or audiophile responses (from people who simply want a means of showing that their system is better than everyone else's  Rolling Eyes  )

While I have a background as a designer, a test engineer and a user, I am more interested in the experiences of actual users and would like to represent the "users'" concerns in this discussion. When I buy gear or spec gear for use, I want meaningful specs, not something that is intended to fill out a marketing campaign.

So folks, I am asking for your input to help identify the concerns of the user in terms of jitter. What information would you like the equipment vendors to supply? And why?

Thanks
Steve
Title: Re: Jitter Specification Input Requested
Post by: Ethan Winer on April 28, 2004, 11:51:47 AM
Steve,

> help identify concerns of the user in terms of jitter <

As far as I'm concerned jitter is a complete non-issue, popularized by gear makers who want an excuse to get you to buy yet more gear. As an audio pro, and a listener, jitter is the very last thing I'm concerned about. The specs I see typically put jitter 110 dB or more below the music. Who gives a flying you-know-what about anything that far below the program material? Especially since it's also masked by the program.

It amazes me when people obsess over minutiae like this, while blissfully ignoring numerous 30 dB nulls caused by standing waves in their control room.

--Ethan
Title: Re: Jitter Specification Input Requested
Post by: oudplayer on April 28, 2004, 01:00:05 PM
With all due respect, Ethan, though I acknowledge your concern about room acoustics, there are many who've discovered that there are very substantial sonic differences between AD converters and between DA converters, and that the result of using combinations of these has a large impact on the quality of the mixes we make. Since some of these differing units use identical analog stages, the difference lies somewhere in the clocking/filtering part of the equation, and regarding the former, jitter becomes an important issue.

I'm not sure if this gets you to a "spec" or not, but an increasing number of digital products are using what Lynx calls "steady clock" (I forget the other jargon terms in use), a technology that is able to correct drifting incoming clock signals. Perhaps there could be some measure of the ability to correct incoming clock sources?
Title: Re: Jitter Specification Input Requested
Post by: sdevino on April 28, 2004, 01:23:44 PM
oudplayer wrote on Wed, 28 April 2004 13:00


I'm not sure if this gets you to a "spec" or not, but an increasing number of digital products are using what Lynx calls "steady clock" (I forget the other jargon terms in use), a technology that is able to correct drifting incoming clock signals. Perhaps there could be some measure of the ability to correct incoming clock sources?



Sounds like marketing speak for technology that has been around since the 50's for Color TV. "Steady clock" is pure BS.

I also wonder if we really have any idea whether or not the "jitter" is the issue relative to improved audio performance of a converter. Many of us "think" it is, but none of us (that I am aware of) have shown any real proof.
Title: Re: Jitter Specification Input Requested
Post by: Ethan Winer on April 28, 2004, 01:43:29 PM
Oud,

> Since some of these differing units use the identical analog stage <

I agree with Steve that lots of things could account for the difference in the sound of one converter versus another. Again I ask you to think about the relevance of anything that's 110 dB or more below the music. How could that possibly be audible? Why must jitter be the explanation, as opposed to lower distortion? Or any of a number of more likely culprits.

All of this stuff is easily measured. And a double blind test is at least as useful for determining once and for all what really matters and what doesn't.

--Ethan
Title: Re: Jitter Specification Input Requested
Post by: oudplayer on April 28, 2004, 03:21:32 PM
Ethan Winer wrote on Wed, 28 April 2004 10:43

 Why must jitter be the explanation, as opposed to lower distortion? Or any of a number of more likely culprits.

All of this stuff is easily measured. And a double blind test is at least as useful for determining once and for all what really matters and what doesn't.



I apologize, as I wasn't precise enough in my first post - I'm not married to the jitter metric as the be-all and end-all in converter woes, yet I notice that the use of different clock sources dramatically changes the audible characteristics of a converter (for better or worse or indifferent-yet-different). I also have the understanding that this shouldn't be the case if converters were all done "right," yet it still persists in today's technologies. If it's not jitter that's being most affected by alternative clock sources, then what phenomenon/metric is it?

I'm curious about the "number of more likely culprits." What would these be, how would they be measured, and how would they map onto audible phenomena (i.e., a higher slew rate in op-amps produces xyz audible result)?

Hopefully my DSP naivete will assist (by provoking intelligent responses) in uncovering what is different and measurable between converters, what is significant, and the like.
Title: Re: Jitter Specification Input Requested
Post by: Johnny B on April 29, 2004, 12:59:15 AM
Steve,

May I make a few suggestions?

Have you talked to anyone at Burr-Brown
about A-to-D and D-to-A converters?

Have you talked to anyone at Audio Precision
about using their test gear?

Have you read any of Julian Dunn's white papers?
He did some work in this field and
worked for AP before his untimely demise,
but I think you may be able to download his papers
off the web for free.

Have you talked to anyone at Anagram Technologies?

That's the technology that Manley Labs used in the SLAM.
Speaking of Manley, have you talked with anyone there?

My own wild ass guess is that we will get better sound when the speeds are drastically bumped up and the bit/word length is increased to say 32 or 64 bits. I could be very wrong; look at Sony and SACD's specs: 1 bit at a million miles a second. LOL. Maybe I got it half right cuz with Sony, the speed is there.

One thing is sure, there is room for improvement, otherwise there would be no debate going on of "analog vs. digital." Although it is getting better all the time, I fear the debate will rage for some unknown amount of time.

Who knows, maybe it has something to do with the effect of heat on the sound or some other weird behavior that no one has yet noticed or measured. Science got us to the moon, but it has not yet cured the common cold nor ended the problems in the A-to-D and D-to-A debate.    

 
Title: Re: Jitter Specification Input Requested
Post by: Ethan Winer on April 29, 2004, 12:32:29 PM
OP,

> yet notice that the use of different clock sources dramatically changes the audible characteristics of a converter <

How dramatically? This is why I suggested that a true double-blind test is the only way to really know what matters and what doesn't. If you've ever tweaked an electric guitar track to perfection only to later discover you had your fingers on the snare EQ, you'll agree that human hearing and perception are flawed at best. I'm not saying you're imagining things! Only that logic and common sense rule out artifacts that are 110 dB or more below the music.

> What would these be, how would they be measured, and how would they map onto audible phenomena (i.e., a higher slew rate in op-amps produces xyz audible result)? <

Sure, slew rate limiting could be a factor, though you'd need an awful lot of full-level high frequency content to run into that on any gear made in the past many years. Everything in audio can be measured, and slew rate limiting simply shows up as distortion. Audio transformers are among the worst offenders, but for some reason their flaws are considered charming, warm, and "musical."

Go figure.

--Ethan
Title: Re: Jitter Specification Input Requested
Post by: Loco on April 29, 2004, 02:51:44 PM
sdevino wrote on Wed, 28 April 2004 07:35

So folks, I am asking for your input to help identify the concerns of the user in terms of jitter. What information would you like the equipment vendors to supply? And why?


Well, I'm not sure what you can do about a converter once you find out how much jitter it carries, really. However, it would be very useful in order to compare different brands. I would say it would be important to know the jitter decorrelation between channels, since it could be the source of sloppy low end and/or loss of stereo image stability. I don't know if all of them are actually words, but you know what I mean.
Title: Re: Jitter Specification Input Requested
Post by: miroslav on April 29, 2004, 02:56:01 PM
Ethan Winer wrote on Thu, 29 April 2004 12:32

OP,

If you've ever tweaked an electric guitar track to perfection only to later discover you had your fingers on the snare EQ, you'll agree that human hearing and perception are flawed at best.


That's choice Ethan...

...oh yeah...been there done that!  Very Happy
Title: Re: Jitter Specification Input Requested
Post by: Nika Aldrich on April 29, 2004, 03:09:43 PM
sdevino wrote on Wed, 28 April 2004 12:35

So folks, I am asking for your input to help identify the concerns of the user in terms of jitter. What information would you like the equipment vendors to supply? And why?


The most useful spec would be a jitter frequency chart, but it would have to be standardized such that it is taken at a given place in the system under given surrounding conditions.  In other words, it is far less useful to test a word clock master at its crystal source than to test it at the end of a 10' cable.  On the contrary, it is far more useful to test an A/D converter's clock at the chip itself.  These are some of the problems with deriving a spec.  If I am only trying to impress people with specs I can have a great crystal and PLL inside the box under test conditions, but allow the performance to erode quickly in the real world.  And unfortunately, jitter is so sensitive that small variations in very minor details between testing and the real world can produce significant variations.

The "spec" that is most useful is a frequency vs. amplitude chart of the jitter, but the testing method and protocol needs more attention.

One other minor detail - most people wouldn't know how to interpret the data anyway.

Nika.
Title: Re: Jitter Specification Input Requested
Post by: Nika Aldrich on April 29, 2004, 03:14:33 PM
Ethan Winer wrote on Wed, 28 April 2004 16:51

Steve,

As far as I'm concerned jitter is a complete non-issue, popularized by gear makers who want an excuse to get you to buy yet more gear. As an audio pro, and a listener, jitter is the very last thing I'm concerned about. The specs I see typically put jitter 110 dB or more below the music. Who gives a flying you-know-what about anything that far below the program material? Especially since it's also masked by the program.


But it isn't necessarily masked by the program material - that depends on the frequency of the jitter.  50Hz jitter?  Very definitely masked on a 1KHz waveform, but very noticeable on a 60Hz waveform.  

Quote:

It amazes me when people obsess over minutiae like this, while blissfully ignoring numerous 30 dB nulls caused by standing waves in their control room.


It's all important.  And inconsistencies in the room can help expose otherwise negligible problems like jitter - and any listening room in the real world has inconsistencies.

Acoustics is not the only problem...

Nika.
Title: Re: Jitter Specification Input Requested
Post by: Nika Aldrich on April 29, 2004, 03:18:16 PM
Ethan Winer wrote on Wed, 28 April 2004 18:43

Again I ask you to think about the relevance of anything that's 110 dB or more below the music. How could that possibly be audible? Why must jitter be the explanation, as opposed to lower distortion? Or any of a number of more likely culprits.


Well first of all, jitter causes distortion.  Second, how do you come up with your -110dB numbers, and at what frequencies?  And finally, do you not concede that it is possible for signals at less than -110dBFS to be audible?  (note, if you can have a 30dB null in a room, can you have a 30dB node in a room?  hint...)

Nika.
Title: Re: Jitter Specification Input Requested
Post by: Nika Aldrich on April 29, 2004, 03:26:08 PM
Johnny B wrote on Thu, 29 April 2004 05:59

My own wild ass guess is that we will get better sound when the speeds are drastically bumped up and the bit/word length is increased to say 32 or 64 bits. I could be very wrong; look at Sony and SACD's specs: 1 bit at a million miles a second. LOL. Maybe I got it half right cuz with Sony, the speed is there.


Nope.  The converters you use at 44.1KHz are actually clocking at 2.8224MHz on the converter chip and then being downsampled to 44.1KHz, so the rate we are using is the same as SACD.  And some converters do the conversion at higher rates than that, even.

Bit/word increased to 32 or 64?  What in the world?  Why on earth?

Quote:

One thing is sure, there is room for improvement, otherwise there would be no debate going on of "analog vs. digital."


This is a non sequitur.  The fact that a debate exists does not, ipso facto, mean that there is room for improvement.  People can debate about anything, whether real or not.

Quote:

Science got us to the moon, but it has not yet cured the common cold nor ended the problems in the A-to-D and D-to-A debate.


Why do you think there is still a debate in the scientific community about A/D and D/A converters?  This industry is SOOOOO far removed from actual science that we would likely never know what is known by the mathematicians and scientists.  This is NOT a rhetorical question.

Nika.
Title: Re: Jitter Specification Input Requested
Post by: Johnny B on April 29, 2004, 08:46:01 PM
Nika,

I know you are a very smart man and have given this area a lot of thought, but there really are a lot of respected people in the industry who believe that analog still sounds better.

I was lucky enough to spend some time in NYC
with Walter Sear, nearly everything he has is analog.
He hates the sound of CD's, but he says SACD
is getting closer to analog. I think quite
a few people would agree with him.

To me, most CD's sound brittle and there is something
to be said about the "behavior" of analog. I do not think
it is fully understood yet.

That's why I think we need more study, and more scientific
research and more experimentation. I do not feel that we are there yet. That's my take on it. I sense you will
disagree and that's fine.

When the staunch adherents to analog agree
there is no debate as to the sound quality,
that's when the debate will end.

       
Title: Re: Jitter Specification Input Requested
Post by: Nika Aldrich on April 29, 2004, 08:52:34 PM
Johnny B wrote on Fri, 30 April 2004 01:46

Nika,

I know you are a very smart man and have given this area a lot of thought, but there really are a lot of respected people in the industry who believe that analog still sounds better.


So are we talking sounding better or more accurate?

And again, I point out that just because people debate it does not make it real.  And what, exactly, is the "A/D D/A debate" you refer to, and how does the specific iteration you refer to relate to the difference in sonic character of analog and digital?

Finally, why are we again discussing analog vs. digital (tedium!) in a thread about jitter?  Smile

Nika.
Title: Re: Jitter Specification Input Requested
Post by: Johnny B on April 29, 2004, 09:15:24 PM
Because if the A-to-D and D-to-A debate had ended there would be
no need to even discuss jitter or filters or clocks.

If the stuff really sounded as good as analog, no one would even care about jitter. Ahh, but they do care about "jitter" and "time smear" and "imaging" and a whole host of other factors.

Digital ain't perfect, it might be perfectable,
but it ain't perfect. Not yet. YMMV

 
Title: Re: Jitter Specification Input Requested
Post by: natpub on April 29, 2004, 09:26:02 PM
Johnny B wrote on Thu, 29 April 2004 19:46


When the staunch adherents to analog agree
there is no debate as to the sound quality,
that's when the debate will end.

       


To me, isn't this like saying that when everyone agrees that bronze-wrap guitar strings sound best, then the debate is over? Would such a debate ever end, given that they are merely different sounds?

The discussion at hand seemed to start about jitter and distortion in the evaluation of converters. It keeps getting dragged into competition with analog, which indeed is loaded with distortion. Some distortion we may like, some we may dislike.

Many persons who grew up trained to hear certain tape and circuit distortions as warm and beautiful do indeed prefer such sounds. Quite naturally I believe, since we from that era were conditioned to do so; we associate it with fond memories.

Currently, a generation is being largely trained to digital and file-compressed sounds. For them, that "great old sound of so-and-so" will be all or mostly digital.

I am not sure accuracy is really what people want, when they explore converters, or any medium or tool. People started down the digital path trying to get away from noise. Once they got more silence and accuracy, many did not like it.

I must agree--when I was young I really liked my Muff-Pi fuzz box, because it hid all my little errors, and made me feel like I was a rock star Razz

I just feel that different media and tools can give us a variety of sounds, and with time and study, we can learn to use each of these things to create what we wish.

KT
Title: Re: Jitter Specification Input Requested
Post by: sdevino on April 30, 2004, 09:11:55 AM
Back on topic please.

There are many members representing the various semiconductor and equipment manufacturers on the SC-02 committee. I am looking to this group to attempt to identify anything in a jitter spec that could be useful to a recording engineer.

So far Nika has suggested a jitter spectrum in the frequency domain.

I also agree that the only jitter I care about is at the input to the converter, but I do not know enough to be able to translate what effect, if any, jitter on an external wordclock would have on converter performance by the time the wordclock gets passed through the PLL or other lockup circuits to the converter input.

As far as I can tell it might matter a lot or not at all. Certainly high frequency jitter that is outside the loop bandwidth of the PLL will not matter. So why do we think we hear a better soundstage?  I think Nathan has a point since it is pretty much impossible to do a true double blind comparison.

Steve
Title: Re: Jitter Specification Input Requested
Post by: Nika Aldrich on April 30, 2004, 10:55:24 AM
sdevino wrote on Fri, 30 April 2004 14:11

Certainly high frequency jitter that is outside the loop bandwidth of the PLL will not matter.


This is a myth.  Jitter, just like audio, aliases so that variations higher than the sample rate still fold back into the legal range.  Jitter at 45.1KHz will manifest itself as 1KHz jitter in a 44.1KS/s audio system.  For this reason, jitter specs that only show the jitter frequencies in the 1fs range are unhelpful.

Isn't a big issue here that it will be relatively easy, with just about any given spec, to have a clock source that does very well on the spec but has drastically poorer performance in certain real-world conditions?  Give me a spec that this wouldn't be the case for.  With this realization, specs are nearly useless, or no?

Nika
Title: Re: Jitter Specification Input Requested
Post by: Ethan Winer on April 30, 2004, 02:07:53 PM
Nika,

> that depends on the frequency of the jitter <

As far as I'm concerned, if jitter is 110 dB or more below the signal, then it's not a problem.

> It's all important <

We've been through this before, so the bottom line for me is: prove it. As Carl Sagan said, extraordinary claims require extraordinary proof. The notion that teensy amounts of noise or distortion at any frequency are a meaningful problem makes no sense to me. If you have to play extremely soft signals and then turn the monitor volume up by 40 dB past the usual speaker blowout level just to hear it, then it's not a problem.

> how do you come up with your -110dB numbers<

Pohlmann's book. Most of the jitter levels in his various charts are softer than -110 dB, but that's the loudest number I found in any of the graphs so I use it to be generous.

> do you not concede that it is possible for signals at less than -110dBFS to be audible? <

I do not concede this. To me -110 dB down is inaudible, period. Especially with a 16 bit system where anything below 96 dB is noise anyway.

> if you can have a 30dB null in a room, can you have a 30dB node in a room? hint <

I don't understand. A null occurs at a node point. What's this have to do with the audibility of noise/distortion/whatever that's 110 dB below the program?

Some day I'm going to fly out to your office, and give you the opportunity to show me the audibility of jitter in person! Very Happy

--Ethan
Title: Re: Jitter Specification Input Requested
Post by: Nika Aldrich on April 30, 2004, 02:43:33 PM
Ethan,

Rooms are inconsistent, and they have the opportunity to have nulls and nodes.  Where a node occurs, an "artificial" amplification of certain frequencies occurs.  Just the same, where a null occurs, an "artificial" gain reduction of certain frequencies occurs.  For this reason it is possible for jitter to exist at a frequency that may be 110dB below program, but if the program material happens to center around a null in the room, and the frequency of the sideband produced by the jitter falls where a node in the room is, then the manifestation of jitter in the room can be far less than 110dB below program.

The inconsistencies in rooms allow for the presentation of a lot of things that might otherwise be inaudible.  I have listened to music on highly specified systems in acoustically controlled rooms, and only upon hearing the same thing in my truck, driving on city streets, did I hear a particular riff or effect.  While in theory and in practice that "effect" or whatever was below the threshold of audibility in perfect situations, in the reality of acoustical spaces I was able to hear something otherwise inaudible.

For this reason, 110dB down can still be an issue, and as I and others have heard, jitter is.

Nika.
Title: Re: Jitter Specification Input Requested
Post by: sdevino on April 30, 2004, 03:11:22 PM
Nika Aldrich wrote on Fri, 30 April 2004 10:55

sdevino wrote on Fri, 30 April 2004 14:11

Certainly high frequency jitter that is outside the loop bandwidth of the PLL will not matter.


This is a myth.  Jitter, just like audio, aliases so that variations higher than the sample rate still fold back into the legal range.  Jitter at 45.1KHz will manifest itself as 1KHz jitter in a 44.1KS/s audio system.  For this reason, jitter specs that only show the jitter frequencies in the 1fs range are unhelpful.

Isn't a big issue here that it will be relatively easy, with just about any given spec, to have a clock source that does very well on the spec but has drastically poorer performance in certain real-world conditions?  Give me a spec that this wouldn't be the case for.  With this realization, specs are nearly useless, or no?

Nika



Nika, you misunderstood me. I am saying that HF jitter on an external clock will not make it to the ADC or DAC clock input in some applications. For instance, many converter clocks are regenerated using PLLs or DDS. In these cases the external clock is used as an error or locking reference. A stable clock design would LPF the external clock such that the HF jitter is not even detected.

You are correct in that HF jitter on the clock interacts with the audio, but I am not so sure what exact form it would take. I am sure that it is probably much more complex than simple aliasing. Since the jitter is probably not coherent and could be very complex phase/frequency modulation, it gets pretty dicey. It's also really, really hard to measure jitter (and expensive).

Steve
Title: Re: Jitter Specification Input Requested
Post by: Nika Aldrich on April 30, 2004, 04:05:41 PM
sdevino wrote on Fri, 30 April 2004 20:11

Nika, you misunderstood me. I am saying that HF jitter on an external clock will not make it to the ADC or DAC clock input in some applications. For instance, many converter clocks are regenerated using PLLs or DDS. In these cases the external clock is used as an error or locking reference. A stable clock design would LPF the external clock such that the HF jitter is not even detected.


Correct, and agreed.  This is part of what I was complaining about - a clock circuit is highly susceptible to the PLL on the device that follows it.  The specs on an Apogee Big Ben in relation to the jitter specs on my MOTU 2408 are highly irrelevant until I know the PLL specs on the 2408!

Quote:

You are correct in that HF jitter on the clock interacts with the audio, but I am not so sure what exact form it would take. I am sure that it is probably much more complex than simple aliasing.


Actually, no.  Jitter creates very simple distortion - it's all sideband distortion, with the amplitude of the distortion dependent upon three factors - the amplitude of the jitter, the amplitude of the signal, and the frequency of the signal.  The higher the frequency of the signal, the greater the amplitude of the sidebands.  The higher the amplitude of the signal, the higher the amplitude of the sidebands.  And the higher the amplitude of the jitter, the higher the amplitude of the sidebands.

The frequency of the sidebands is dependent on one thing - the frequency of the jitter.  1KHz jitter has sidebands removed from your signal by 1KHz.  5KHz jitter has sidebands removed from your signal by 5KHz.  1+3KHz jitter has sidebands removed by both 1KHz and 3KHz, the amplitude of each logically dependent on the variables above.  White noise jitter yields sidebands of white noise - or rather just more white noise.  Etc.

If the jitter frequency is above Nyquist then the sidebands appear in an aliased way.  If you have a 5KHz tone and you have 1KHz jitter you have sidebands at 4KHz and 6KHz.  If you have a 5KHz tone and you have 23.05KHz jitter (1KHz above Nyquist) the sidebands appear 1KHz removed from a false aliased tone.  The new "center frequency" is f(Nyquist) - f(tone), or in this case 17.05KHz.  So the sidebands appear at 16.05KHz and 18.05KHz.

If the jitter frequency is above the sampling frequency then it is as if the jitter is below Nyquist.  Jitter at 45.1KHz manifests itself exactly the same as 1KHz jitter.
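
If it helps, here is a minimal sketch of that frequency mapping (Python; the function names are made up purely for illustration, it assumes a single sinusoidal jitter component acting on a single tone, and it maps frequencies only, not amplitudes):

Code:

def fold_to_nyquist(f, fs):
    """Alias a frequency into the 0..fs/2 range."""
    f = abs(f) % fs                    # wrap into 0..fs
    return fs - f if f > fs / 2 else f

def jitter_sidebands(f_tone, f_jitter, fs):
    """Sideband frequencies produced by sinusoidal jitter at f_jitter
    acting on a tone at f_tone, sampled at fs."""
    return sorted(fold_to_nyquist(f_tone + s * f_jitter, fs) for s in (+1, -1))

fs = 44100.0
print(jitter_sidebands(5000.0, 1000.0, fs))    # [4000.0, 6000.0]
print(jitter_sidebands(5000.0, 23050.0, fs))   # [16050.0, 18050.0]
print(jitter_sidebands(5000.0, 45100.0, fs))   # same as 1KHz jitter: [4000.0, 6000.0]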

Make sense?  Jitter is indeed difficult to measure, but the manifestation of it is very mathematically determinable and rather simple.

Nika.
Title: Re: Jitter Specification Input Requested
Post by: Zoesch on May 03, 2004, 09:00:05 AM
This thread has me scratching my head... must be my Telco background/leanings, but why would anybody say that jitter is irrelevant? That's akin to a reel to reel manufacturer saying that tape flutter is irrelevant, and we've all heard what flutter sounds like right?

I'm with Nika, I'd like to see jitter as a measure of both parts per million as well as frequency, and I want to see the PLL specs... whether or not I want them measured at the end of a cable is kinda irrelevant; I can calculate that myself, and most people would be utterly confused if they saw such a measurement.
Title: Re: Jitter Specification Input Requested
Post by: sfdennis on May 03, 2004, 11:44:13 AM
Steve & All,

From an EE standpoint, I think that the jitter test suite offered by AP pretty much covers the bases, but for the everyday recording engineer such specs are pretty unusable.

As Nika summarized above, a device’s jitter profile (jitter spectrum) is going to add a certain amount of noise and distortion to the recorded signal. However, a widespread understanding of the exact amount and character of noise and distortion that will be produced by a given jitter profile is absent in the audio community. The interchange between Ethan and Nika (‘is jitter nominally -110dB FS and if so why should I care?’) bears witness to this. What people care about most is how, and by how much, a jitter spectrum will mess up a signal, and that is what is missing from available specs. That would be a useful presentation of jitter specs.

Luckily, there is a fairly direct relationship between jitter components of a clock signal and the resulting THD+N  produced by an ideal A/D converter at a given sampling rate and resolution. I don’t know if this is the place to get into it, but I’ll summarize for now and write more if somebody asks. You can model jitter as the sum of two kinds of components: noise and periodic errors—that’s what a jitter spectrum tells us. If you decompose the jitter spectrum into these components, then you can automatically produce a ‘worst-case’ THD+N graph over the audio band. That THD+N graph would tell you how much distortion and noise a given clock device would give you at each frequency. Using such a graph, it would be fairly easy to settle Ethan and Nika’s argument.

There are a number of wrinkles to iron out, and I don’t have all the details worked out. But I do think that THD+N graphs of jitter against an otherwise ideal A/D converter would be a lot more useful to recording engineers than today’s jitter specs.

-Dennis
Title: Re: Jitter Specification Input Requested
Post by: Nika Aldrich on May 03, 2004, 12:04:42 PM
sfdennis wrote on Mon, 03 May 2004 16:44

Luckily, there is a fairly direct relationship between jitter components of a clock signal and the resulting THD+N  produced by an ideal A/D converter at a given sampling rate and resolution. I don't know if this is the place to get into it, but I'll summarize for now and write more if somebody asks. You can model jitter as the sum of two kinds of components: noise and periodic errors - that's what a jitter spectrum tells us. If you decompose the jitter spectrum into these components, then you can automatically produce a 'worst-case' THD+N graph over the audio band. That THD+N graph would tell you how much distortion and noise a given clock device would give you at each frequency. Using such a graph, it would be fairly easy to settle Ethan and Nika's argument.

There are a number of wrinkles to iron out, and I don't have all the details worked out. But I do think that THD+N graphs of jitter against an otherwise ideal A/D converter would be a lot more useful to recording engineers than today's jitter specs.

-Dennis



Dennis,

The only problem with this is that the THD+N spec would have to be curved in some way to represent how it is going to audibly transfer.  Sideband distortion of 1Hz at extremely high amplitudes is going to be practically negligible, whereas sideband distortion of 1KHz at far lower amplitudes is going to significantly affect stereo image.

I like the idea of generating a "worst case" THD+N graph, but since it is sideband distortion any frequency can impart distortion in the audible band, so we'd have to check (virtually) all frequencies.  

Also, jitter only presents noise if it is random jitter, such as clock phase noise.  I think we are generally far more concerned about the THD spec resulting from the jitter than the noise?  

Nika.
Title: Re: Jitter Specification Input Requested
Post by: sfdennis on May 03, 2004, 12:21:20 PM
Nika,

I was actually suggesting a full audio-spectrum plot of THD+N, and not THD+N at a single nominal frequency. Though I expect that some folks will be interested (for silly reasons) in seeing THD+N all the way up to Nyquist and beyond. In any case, with a full audio-spectrum THD+N plot, you'd see the sidebands if they were important and you wouldn't see them if they weren't.

As to whether random jitter is important or not, well if it was bad enough, it would show up in the graph.

BTW, it is good to banter with you again, Nika. Feels like the old days in the other place.

-Dennis
Title: Re: Jitter Specification Input Requested
Post by: Nika Aldrich on May 03, 2004, 12:37:16 PM
sfdennis wrote on Mon, 03 May 2004 17:21

I was actually suggesting a full audio-spectrum plot of THD+N, and not THD+N at a single nominal frequency. Though I expect that some folks will be interested (for silly reasons) in seeing THD+N all the way up to Nyquist and beyond. In any case, with a full audio-spectrum THD+N plot, you'd see the sidebands if they were important and you wouldn't see them if they weren't.


Sure.  Makes sense.  So you recommend running a sine wave sweep (for example) through a box wherein jitter will be induced, and then measuring the THD+N that is presumed to come from the jitter alone, as opposed to the conversion process that occurs along with the jitter?  It seems like this will be heavily prone to error, as it does not actually measure the jitter but rather all of the distortions present INCLUDING jitter, many of those distortions being ones that the clock designer can't control.

I think we would instead need to measure the jitter at the clock and then deduce the amount of THD+N that clocking device would yield.  So how far out do we measure the jitter on the clock in order to ascertain how much THD+N it will yield?  Do we start at 1Hz or lower?  Do we measure up to 1MHz, or higher?  

Nika.
Title: Re: Jitter Specification Input Requested
Post by: sfdennis on May 03, 2004, 01:22:31 PM
Nika,
Well, you’ve identified my major concern with the suggestion. The way I thought about it was that you would take the jitter spectrum as you suggested early in the thread, and feed it into a program that would spit out the THD+N graph. By using a program as a ‘virtual converter’, you’ll be able to get around the idiosyncrasies of any real converter that might be used in a given test. After all, real converters have their own flaws and you wouldn’t want a clock’s jitter spec to be influenced by a particular choice of converter.

As usual, the devil is in the details. First you’d have to take the output of a real jitter spectrum measurement such as is available from the AP setup. That’s the easy part. Then a program would have to model the spectrum as a linear combination of noise and jitter frequency components. Steve’s committee will have to agree on basis functions for modeling jitter. There aren’t many reasonable choices and they are all roughly equivalent. For example, you might choose A*gaussian(mean, variance) as the template for a noise term and B*phi(omega t) as the template for a periodic term. In any case the general idea is that you would derive a function J(t), that behaved like the real device and whose terms were a linear combination of the basis functions. The value of J(t) would represent the expected error of the real clock at a given time, t.

So the error at a given sample would simply be sin(omega*t) – sin(omega*(t + J(t))).  To get the THD+N for a given frequency you’d just run the program for some nominal period (say 10 seconds) at the given sampling rate and resolution, and compute the RMS error.
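
As a rough sketch of that computation (Python/NumPy, with made-up jitter numbers purely for illustration: one periodic jitter term plus gaussian timing noise acting on an otherwise ideal converter):

Code:

import numpy as np

def thd_n_from_jitter(f_tone, fs, duration=10.0,
                      periodic=(1000.0, 2e-9),   # (jitter frequency Hz, peak timing error s) - illustrative
                      noise_rms=50e-12):          # random timing error, seconds RMS - illustrative
    t = np.arange(int(fs * duration)) / fs
    f_j, a_j = periodic
    # timing error J(t) at each nominal sample instant, in seconds
    jt = a_j * np.sin(2 * np.pi * f_j * t) + np.random.normal(0.0, noise_rms, t.size)
    ideal   = np.sin(2 * np.pi * f_tone * t)            # ideal samples
    jittery = np.sin(2 * np.pi * f_tone * (t + jt))     # samples taken at the perturbed instants
    err = ideal - jittery
    # THD+N expressed in dB relative to the signal, from the RMS error
    return 20 * np.log10(np.sqrt(np.mean(err ** 2)) / np.sqrt(np.mean(ideal ** 2)))

# sweeping the tone across the band gives the kind of graph described above
for f in (100.0, 1000.0, 5000.0, 10000.0, 20000.0):
    print(f, thd_n_from_jitter(f, 44100.0), "dB")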

Lots of details to work out, and after writing this, I’m not so sure that you could get any group to converge on the myriad decisions required for this to work. There are definitely compromises to be made, and of course, the programs that did all the computations would have to be open sourced to ensure that there wasn’t any cheating. Furthermore, after all this simulation and what-not, would you have a spec that truly represented a device’s capabilities and limitations?

In spite of all these obstacles, I really do think some sort of THD+N spectral plot would be infinitely more useful to audio engineers than a jitter spectrum.

-Dennis
Title: Re: Jitter Specification Input Requested
Post by: Nika Aldrich on May 03, 2004, 02:04:23 PM
sfdennis wrote on Mon, 03 May 2004 18:22

Nika,
Well, you've identified my major concern with the suggestion. The way I thought about it was that you would take the jitter spectrum as you suggested early in the thread, and feed it into a program that would spit out the THD+N graph. By using a program as a 'virtual converter', you'll be able to get around the idiosyncrasies of any real converter that might be used in a given test. After all, real converters have their own flaws and you wouldn't want a clock's jitter spec to be influenced by a particular choice of converter.


OK.  We're in agreement there.

Quote:

As usual, the devil is in the details.


OK.  We're in agreement there, too.  Smile

Quote:

First you'd have to take the output of a real jitter spectrum measurement such as is available from the AP setup. That's the easy part. Then a program would have to model the spectrum as a linear combination of noise and jitter frequency components.


Well really there is no noise component (I mentioned phase noise earlier, but that is so not the problem and so far below the noise floor of discussion that we should drop the noise spec).  As for reproducing the THD spec as a byproduct of noise, that is good, but it will change dependent upon sample frequency.

Quote:

Steve's committee will have to agree on basis functions for modeling jitter.


It doesn't need to be modeled, really.  It's a pretty straight ahead formula.  

The two issues that we run into are the range of the jitter spectrum we are going to require in the spec, and the deviance that any downline box can have on the results.  Again, much of the problem is the PLL, not the clock!

Nika.
Title: Re: Jitter Specification Input Requested
Post by: Ethan Winer on May 03, 2004, 02:50:16 PM
Stefan,

> why would anybody say that jitter is irrelevant? That's akin to a reel to reel manufacturer saying that tape flutter is irrelevant <

That's a great analogy, and I agree with it. The main difference between analog tape flutter and digital jitter is that one is typically present in amounts large enough to notice, and the other is not. You can easily hear 1 percent of flutter. You cannot hear 0.0001% flutter at all.

Hiss is a problem if it's not far enough below the signal, but it's not a problem if it's very soft. Same for distortion, hum, and every other audio artifact. It's all a matter of degree. If something is 80-90 dB or more below the music, it's not a terrible problem. And if it's below the noise floor of the medium, then it's not a problem at all.

--Ethan
Title: Re: Jitter Specification Input Requested
Post by: Ethan Winer on May 03, 2004, 02:54:06 PM
Dennis,

> Using such a graph, it would be fairly easy to settle Ethan and Nika’s argument. <

Yes, which is why it amazes me that people are still arguing about this. So somebody please give me a worst case number expressed in dB below the program. Then apply Fletcher-Munson to make sure 3 KHz gets the weight it deserves, while minimizing the contribution of low frequencies.

--Ethan
Title: Re: Jitter Specification Input Requested
Post by: Nika Aldrich on May 03, 2004, 03:47:40 PM
Ethan Winer wrote on Mon, 03 May 2004 19:54

Dennis,

> Using such a graph, it would be fairly easy to settle Ethan and Nika's argument. <

Yes, which is why it amazes me that people are still arguing about this. So somebody please give me a worst case number expressed in dB below the program. Then apply Fletcher-Munson to make sure 3 KHz gets the weight it deserves, while minimizing the contribution of low frequencies.

--Ethan


...and then put it in a real world room where nodes and nulls can expose things that otherwise would be buried below the thresholds of audibility...

Nika.
Title: Re: Jitter Specification Input Requested
Post by: sfdennis on May 03, 2004, 03:51:42 PM
Quote:

Well really there is no noise component (I mentioned phase noise earlier, but that is so not the problem and so far below the noise floor of discussion that we should drop the noise spec).


Well, jitter has noise in that you can’t tell in advance what the time-error will be at a given instant—it is random. Is it white noise? Well, as I understand it, close-in phase noise falls off at 1/f making it more pink than white. But the fact that it is random—read not deterministic—makes it noise. And so you have to model it as a random variable. While such noise may be so low as to be uninteresting, why throw it out? If it is not interesting, then it won’t show up in the graph.

In general, the test should be as comprehensive as possible; noise should be part of the spec. Otherwise manufacturers might claim that their device is ‘better’ because it is quieter and they won’t have to defend (1) that their device is indeed quieter and (2) that it matters. In that event, we'd be back to where we started.

Quote:

It doesn't need to be modeled, really. It's a pretty straight ahead formula.


The resulting amplitude error from a given timing error is a pretty straightforward formula, but the actual sampling time error produced by the device is not so easy. If the jitter spectrum does not have a trivial (flat) shape, then a function that produces a timing error is not trivial at all. As I mentioned, the jitter function will be some combination of perhaps many noise and periodic components.

Quote:

The two issues that we run into are the range of the jitter spectrum we are going to require in the spec, and the deviance that any downline box can have on the results. Again, much of the problem is the PLL, not the clock!


While I haven’t thought much about the range of the jitter spectrum, I do think that the clock interfaces of downstream devices should be considered separately. Clock manufacturers have no control over what folks might be downstream of their devices.

A clock receiver should not only specify the jitter of its internal clock, but also its ability to reject incoming jitter—that’s what a PLL is supposed to do. All PLLs do this to a greater or lesser extent, and that should be spec’d. I can offer no insights into how to make PLL specs more user-friendly.

-Dennis
Title: Re: Jitter Specification Input Requested
Post by: Nika Aldrich on May 03, 2004, 04:21:50 PM
sfdennis wrote on Mon, 03 May 2004 20:51

Well, jitter has noise in that you can't tell in advance what the time-error will be at a given instant


Not exactly.  For the sake of this conversation there are two types of jitter - phase noise and everything else.  Phase noise is virtually irrelevant in clocks.  It's the everything else - the electronic circuitry, filtering, power supply, etc. that serves to provide the jitter that is consequential for the audio industry.  Even more than that, however, the internal PLLs are pretty crucial.  Everything else IS deterministic, periodic, and therefore causes distortion.  

Quote:

In general, the test should be as comprehensive as possible; noise should be part of the spec. Otherwise manufacturers might claim that their device is 'better' because it is quieter and they won't have to defend (1) that their device is indeed quieter and (2) that it matters.


That's OK.  It doesn't actually matter.  Phase noise is so far below anything that can be audibly discernible that it truly doesn't matter as far as I can tell.

Quote:

The resulting amplitude error from a given timing error is a pretty straightforward formula, but the actual sampling time error produced by the device is not so easy. If the jitter spectrum does not have a trivial (flat) shape, then a function that produces a timing error is not trivial at all. As I mentioned, the jitter function will be some combination of perhaps many noise and periodic components.


Look at the jitter spectrum of any device in our industry and you'll find that the shape of the jitter spectrum is very static.

Quote:

A clock receiver should not only specify the jitter of its internal clock, but also its ability to reject incoming jitter - that's what a PLL is supposed to do. All PLLs do this to a greater or lesser extent, and that should be spec'd. I can offer no insights into how to make PLL specs more user-friendly.


But really, perhaps we should just start paying more attention to the PLLs than the clocks.  It's the PLLs that really make the difference.

Nika.
Title: Re: Jitter Specification Input Requested
Post by: sfdennis on May 03, 2004, 08:37:44 PM
Nika Aldrich wrote on Mon, 03 May 2004 13:21


Look at the jitter spectrum of any device in our industry and you'll find that the shape of the jitter spectrum is very static.



Not sure, Nika, that I know what you mean by ‘very static’. I should say that I haven’t seen many jitter spectra, but the very best among the devices I’ve seen have a fairly dense 1/f (6-20dB/decade) rolloff away from their clock frequency. That would represent the random noise. By ‘dense’ I mean that there is a lot of color under the curve. Many jitter spectra have one and sometimes two spurs in the tails. These would represent the periodic jitter components. None of the real jitter spectra I have seen consist of a few simple lines. Is this what you mean by ‘very static’?

Quote:


But really, perhaps we should just start paying more attention to the PLLs than the clocks.  It's the PLLs that really make the difference.



Sure, got any ideas?

-Dennis

ps. Why are my single quotes showing up as question marks when you quote me? It makes me think that you question everything I say? Very Happy -D
Title: Re: Jitter Specification Input Requested
Post by: Zoesch on May 03, 2004, 09:24:19 PM
Dennis... your quotes are changing because you are probably typing your responses in Word or some other word processor, and when you copy them they are copied as Unicode...

Anyway, your appreciation of jitter is correct; the reality is that jitter is random within a specific frequency band (or else clocks would be useless), and it's the width of that frequency band and the periodicity that concern us. Jitter in itself can only be measured as how many errored pulses there are within a specific time period, but if you were to compare two different samples of the clock signal spanning the same length in time, you'll realize that the errors don't happen exactly at the same pulses.

That's why measuring jitter in ppm is useless to me; were the manufacturers to provide the spectral content of that jitter, it would make a lot more sense.
Title: Re: Jitter Specification Input Requested
Post by: Nika Aldrich on May 04, 2004, 10:29:10 AM
Dennis Tabuena wrote on Tue, 04 May 2004 01:37

Not sure, Nika, that I know what you mean by 'very static'. I should say that I haven't seen many jitter spectra, but the very best among the devices I've seen have a fairly dense 1/f (6-20dB/decade) rolloff away from their clock frequency. That would represent the random noise. By 'dense' I mean that there is a lot of color under the curve. Many jitter spectra have one and sometimes two spurs in the tails. These would represent the periodic jitter components. None of the real jitter spectra I have seen consist of a few simple lines. Is this what you mean by 'very static'?


Dennis,

Are you looking at the crystal itself, or at the output of the box, on the BNC connector?

Also, a dense jitter spectrum that has a steep roll-off/octave will not manifest itself as noise on a signal.  It will manifest itself as distortion.  Feed a 1KHz signal through a converter with this signal and listen/look at the results.  Then do a 5KHz signal.  It would very clearly not manifest itself as noise.  Whereas a clock signal itself MAY be noisy, the way it manifests itself may not be.

Nika.
Title: Re: Jitter Specification Input Requested
Post by: Nika Aldrich on May 04, 2004, 10:42:51 AM
Zoesch wrote on Tue, 04 May 2004 02:24

Anyway, your appreciation of jitter is correct; the reality is that jitter is random within a specific frequency band (or else clocks would be useless), and it's the width of that frequency band and the periodicity that concern us. Jitter in itself can only be measured as how many errored pulses there are within a specific time period, but if you were to compare two different samples of the clock signal spanning the same length in time, you'll realize that the errors don't happen exactly at the same pulses.


Zoesch,

This is absolutely erroneous, or I completely misunderstand what you are saying.  We measure jitter by plotting the variations and amplitude of the clock irregularities.  If you measure the variation between every clock pulse and absolute time and plot it on a graph you will see predictable behavior in the clock's variation from accuracy.  We don't measure how many "errored pulses" there are over time.  Every pulse is erroneous, so we measure the amplitude of the error.  A 1KHz jitter signal will manifest itself as timing variations that cycle 1000 times per second, so on an 8KHz clock we'll see a complete cycle of error in 8 samples.  If A is the amplitude of the jitter then the pattern of the errors might be as follows:

Pulse 1 - On time
Pulse 2 - early by A*sin(pi/4), about 0.707A
Pulse 3 - early by A
Pulse 4 - early by A*sin(pi/4), about 0.707A
Pulse 5 - On time
Pulse 6 - late by A*sin(pi/4), about 0.707A
Pulse 7 - late by A
Pulse 8 - late by A*sin(pi/4), about 0.707A
Pulse 9 - repeat.

In order to determine the frequency content of the jitter you do an FFT of the plot of the errors over time.  The FFT of the jitter (the jitter spectrum) will tell you the amplitude and frequencies of the sideband distortion created by the jitter in the sampling process.  The specific amount is also affected by the frequencies and amplitude of the frequencies being sampled, all very similar to wow and flutter.
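
Here is a small sketch of that measurement idea (Python/NumPy, with illustrative numbers: 1KHz sinusoidal jitter of 1ns peak on an 8KHz clock, just to show the per-pulse error sequence and its FFT):

Code:

import numpy as np

f_clock = 8000.0          # nominal clock rate (Hz)
f_jitter = 1000.0         # jitter frequency (Hz) - illustrative
A = 1e-9                  # peak timing error (seconds) - illustrative

n = np.arange(4096)
nominal = n / f_clock                                             # where the edges should be
actual  = nominal + A * np.sin(2 * np.pi * f_jitter * nominal)   # where they actually are
error   = actual - nominal                                        # per-pulse timing error

# FFT of the error sequence gives the jitter spectrum
spectrum = np.abs(np.fft.rfft(error)) * 2 / len(error)
freqs = np.fft.rfftfreq(len(error), d=1.0 / f_clock)
print(freqs[np.argmax(spectrum)], "Hz jitter, peak amplitude ~", spectrum.max(), "s")
# prints roughly: 1000.0 Hz jitter, peak amplitude ~ 1e-09 s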

Nika.
Title: Re: Jitter Specification Input Requested
Post by: Bob Olhsson on May 04, 2004, 12:24:53 PM
Zoesch wrote on Mon, 03 May 2004 20:24

 
That's why measuring jitter in ppm is useless to me; were the manufacturers to provide the spectral content of that jitter, it would make a lot more sense.

Manufacturers just love to advertise meaningless specs. A spectral analysis of FM distortion is the only meaningful way to describe it.

The other problem with wow and flutter (consumer electronics industry hacks changed the names so they could make digital gear look better on paper) is that when you chain a series of devices, you get additional sum and difference sidebands.

Thankfully jitter in digital video looks like very obvious crap on the screen so chip-makers and manufacturers are finally being held to account and are no longer able to hide their crappy engineering behind claims of a need for double-blind testing.
Title: Re: Jitter Specification Input Requested
Post by: sfdennis on May 04, 2004, 12:38:29 PM
Nika Aldrich wrote on Tue, 04 May 2004 07:29

Are you looking at the crystal itself, or at the output of the box, on the BNC connector?

Looking at the output of the box using a very nice spectrum analyzer and occasionally a DSO scope. In all cases with a measurement-grade bnc connector. But, FYI: Can't look at the crystal itself w/o the supporting oscillator circuitry.

Quote:

 
Also, a dense jitter spectrum that has a steep roll-off/octave will not manifest itself as noise on a signal.  It will manifest itself as distortion.  



Sure if it is far enough away from the clock frequency and rises well above the absolute noise floor. The only things I've seen like that are the spurs I mentioned earlier.

Quote:


Feed a 1KHz signal through a converter with this signal and listen/look at the results.  Then do a 5KHz signal.  It would very clearly not manifest itself as noise.  Whereas a clock signal itself MAY be noisy, the way it manifests itself may not be.


Again it depends on where the peak is, how high it rises and how fast. But that's what this whole thing is about. Take a jitter spectrum and produce a THD+N graph from it.

-Dennis
Title: Re: Jitter Specification Input Requested
Post by: sdevino on May 04, 2004, 12:55:52 PM
I think we ought to jump up and take a little broader view.

The only noise or jitter that will matter in the end is whatever makes it to the actual input of the ADC or DAC circuit in the actual IC chip. So we need to understand how to determine this given a real world circuit.

It turns out this is pretty hard to do, very expensive and almost never really done. For one thing it's hard to measure the jitter at the input to the converter because you need to get inside the chip itself with a probe. For another thing, attaching a probe to the clock circuit changes the clock circuit and therefore introduces errors contributed by the measurement process.

So from a practical standpoint you are left with measurements that can be used to correlate one error to another. For instance, there is a direct correlation between jitter and noise, whether measured as a noise floor or in PPM. So one means of measurement would be to measure other noise sources, which is relatively easy, and then derive the jitter as an additive element.

Sweeping the input is a questionable approach as well since you are using  a very non-typical input signal to determine typical response. Applying a broadband signal or carefully designed noise source and subtracting its spectrum from the measured result would provide much more real world data (in less time as well).

The topic is Audio Engineering, though. We need to get back to what effect jitter has on the audio spectrum and how measuring jitter would provide useful input to a Recording Engineer. There are lots of EE's out there analyzing this topic from a design standpoint.

So far I like the full jitter+noise approach. It seems to offer the most useful insight into what would bother me as an audio engineer, just as THD+Noise has become a pretty standard performance gauge.

Title: Re: Jitter Specification Input Requested
Post by: Ethan Winer on May 04, 2004, 02:57:50 PM
Nika,

> and then put it in a real world room where nodes and nulls can expose things that otherwise would be buried <

Of course one could make that point for simple harmonic distortion and even hiss. But first let's start with a number that reflects the inherent audibility of jitter alone. Adding that extra complication just obfuscates the issue, and doesn't address the only important question: How many dB below the music is jitter typically?

--Ethan
Title: Re: Jitter Specification Input Requested
Post by: Zoesch on May 04, 2004, 08:46:58 PM
Nika,

That should've read Clock Stability (not jitter) can only be measured as a function of errored pulses... anyway... even my lowly Tektronix analyzer can give you the jitter power density down to central frequency, width, amplitude, etc. Better scopes can give you statistical distribution of jitter including, mean, average, standard deviation and higher order moments and some can even give you time/amplitude measurements... so it's not as if the tools to measure jitter don't exist, manufacturers choose not to use them.

As to how to report these results... why are we reinventing the wheel? ITU-T and ANSI have very descriptive standards for clocking, wander & jitter performance and synchronization. Failing that, the spectral power density of jitter should be sufficient for 99% of people.
Title: Re: Jitter Specification Input Requested
Post by: Nika Aldrich on May 04, 2004, 10:00:52 PM
Zoesch wrote on Wed, 05 May 2004 01:46

 so it's not as if the tools to measure jitter don't exist, manufacturers choose not to use them.


I think it's more complex.  The issues of what and where to measure it, and how to report the results, seem the bigger issue.  

Quote:

As to how to report these results... why are we reinventing the wheel? ITU-T and ANSI have very descriptive standards for clocking, wander & jitter performance and synchronization. Failing that, the spectral power density of jitter should be sufficient for 99% of people.


Then we get into the fact that most people can't look at a spectral distribution of jitter and figure out how it's going to manifest itself.  Is 50ns of jitter at 1Hz better or worse than 1ns of noise between 1KHz and 2KHz?  

Then again, most people can't figure out how it will manifest itself because the PLL in the next device will bastardize the clock signal far more than the usefulness of a jitter spec will allow.  Thus, spitting out jitter specs at the BNC output of the box is relatively pointless if the signal then gets upsampled through a series of PLLs in the next box in such a way as to significantly change or add jitter from the first box.  

Quite frankly, I'm not disappointed that the industry, as of yet, doesn't have a jitter spec, because adopting a spec creates an opportunity to abuse it.  Currently most clock designers (such as Apogee, Lavry, dCS, Aardvark, etc.) avoid specs and say, rather, "ours is good, specs are unpublishable, you'll have to listen."  And that is pretty much the truth, because the rest of a clock circuit weighs so heavily on the results that it really needs to be a case-by-case evaluation.

Take ANY wordclock source and feed it into an Apogee Big Ben or a Lavry DAC and check out the jitter at the output.  It will be FAR better than taking an Apogee Big Ben or Lavry clock and feeding them into an in-the-computer-PCI-sound-card and measuring the jitter on the converter chip there.  

Nika.
Title: Re: Jitter Specification Input Requested
Post by: sfdennis on May 05, 2004, 02:05:10 AM
Nika! When and how did you get so cynical?

Quote:

Quite frankly, I'm not disappointed that the industry, as of yet, doesn't have a jitter spec, because adopting a spec creates an opportunity to abuse it.


That's like saying we shouldn't have any laws because people are bound to break them.

Factually, I agree with you: All specs can be 'gamed', and there will always be gamers. But we can't cede the world to the barbarians.

Specs are a good thing provided they are (1) relevant, (2) easy to use, and (3) readily verifiable--whether for clocks, converters, or anything else. If the specifications for devices such as clocks aren't relevant beyond some minimum performance bar, then all the industry needs is a quick pass/fail test, and no specs are necessary at all. They're irrelevant. If they're not easy to use, no one will know what to do with them. If it takes a $40K piece of equipment and six months of training to validate the device, well, that's too hard and unscrupulous manufacturers will cheat.

Every one of these three aspects has been raised in this thread, and the discussion can be applied to specs for any device.

Clocks and PLLs are either important or they're not. Even if they are important, clocks might be a binary category with little room for differentiation: either a device is 'good enough' or it is crap. If this is so, then there will be no opportunity for competitive performance differentiation in the 'good enough' group. One good-enough clock will be just like any other. They will become commodities, prices will fall, and manufacturers will lose their margins. Devices in the crap group will be outed by the market and won't survive. You can see why manufacturers might not want specs. Don't give in.

It is sad that, in general, specs for audio equipment have a bad rap. I have a pet theory about this. Some of it results from the gamesmanship of godless manufacturers. But I think that the majority of it, and the resulting cynicism, stems from consumers' inability to take the specs they're given and translate them into a specific expectation of an audio experience.

When I see a THD+N curve for an amplifier, I have a pretty good feeling of how certain passages of my favorite recording of Madama Butterfly are going to sound through it. Over time, I've been less and less surprised. When I am surprised, I will take some time to figure out why. You'd be amazed at how often circuits in demo stations are improperly wired (usually both channels left or right) or speaker elements and channels are downright fried. Every now and then it is the actual amplifier. By being surprised time and again, my ability to 'hear the spec' has improved. The process isn't too distant from learning to read music: see the notes, hear 'em in your head, play 'em.

We should encourage this learning feedback loop by creating relevant, easy-to-use, readily verifiable specs. I think that it is important work that thoughtful folks like you, Zoesch, Ethan, and others here can contribute to. So don't be discouraged.

Sorry for the rant.

-Dennis

ps. Thanks, Zoesch, for the tip on NOT using quotes in Word. -D
Title: Re: Jitter Specification Input Requested
Post by: Nika Aldrich on May 05, 2004, 10:15:28 AM
Dennis Tabuena wrote on Wed, 05 May 2004 07:05

Clocks and PLLs are either important or they're not.


Precisely, but specs on one are relatively useless without specs on the other.  The clocking of a system is exactly that - it is a comprehensive system, and cherrypicking one device from the system to measure is unto itself relatively useless information.  Because it is useless, it inherently allows for a degree of bogosity - even formalizing a spec becomes bogus if it provides irrelevant information.  Until we have some sort of spec on a clocking system that indicates what jitter actually reaches the A/D chip, we don't have a useful specification at all.  I contend that so far the discussion on measuring jitter at the clock source does not get us closer to the jitter at the A/D chip, primarily because the eventual spec will be so misunderstood as to render it essentially useless - on top of the fact that it is relatively useless anyway, in that it isolates a single component of an interactive and comprehensive system.


Nika.
Title: Re: Jitter Specification Input Requested
Post by: sfdennis on May 05, 2004, 11:20:49 AM
Nika Aldrich wrote on Wed, 05 May 2004 07:15

...but specs on one are relatively useless without specs on the other.  The clocking of a system is exactly that - it is a comprehensive system, and cherrypicking one device from the system to measure is unto itself relatively useless information.


If you really believe that, then when do you stop?

One could say that the system is the entire recording chain, and that's clearly ridiculous. It is akin to saying mic specs aren't relevant because how they sound depends on the preamp you use with them. Well it is true that the preamp shapes the sound, but geez, those are spec'd too. Having specs for both gives us an opportunity to identify how these things are going to translate our recordings.

If you put in a crappy clock--say, one that has a propensity to randomly and abruptly jump from one frequency to another, faraway frequency--well, there ain't no PLL that will be able to deal with that.

Conversely, if you put a well-conditioned signal from a cesium clock into a converter with an improperly damped PLL, well, you're hosed anyway.

It is useful to spec both. Spec the stability of the clocks. Spec the jitter attenuation of the PLLs in their locking state as well as their lock-time. Also spec their stability in their flywheeling state (in this state, they're their own clock).
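
A minimal sketch of what the jitter-attenuation part of such a spec could look like, using a textbook second-order loop with made-up bandwidth and damping rather than any particular product:

    import numpy as np
    from scipy import signal

    # Assumed loop parameters for illustration only.
    f_n, zeta = 100.0, 0.7                 # loop natural frequency (Hz) and damping
    w_n = 2 * np.pi * f_n

    # Classic second-order PLL jitter transfer function: reference jitter below
    # the loop bandwidth passes straight through; jitter above it is attenuated
    # at roughly 20 dB per decade.
    pll = signal.TransferFunction([2 * zeta * w_n, w_n ** 2],
                                  [1.0, 2 * zeta * w_n, w_n ** 2])

    for f_jitter in (1.0, 10.0, 100.0, 1_000.0, 10_000.0):
        _, mag, _ = signal.bode(pll, w=[2 * np.pi * f_jitter])
        print(f"incoming jitter at {f_jitter:>6.0f} Hz -> {mag[0]:6.1f} dB at the output")

In a model like this, a narrower loop bandwidth buys more attenuation of incoming jitter at the cost of a longer lock time, which is why both numbers belong in the spec.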

And yes, you can't take a direct measurement of jitter inside the chip, but if you know the quality of the incoming clock--whether it is from an external source or inside the converter--then you have some basis for identifying the weak link in the chain. And that is very useful.

-Dennis
Title: Re: Jitter Specification Input Requested
Post by: Nika Aldrich on May 05, 2004, 11:31:16 AM
Dennis Tabuena wrote on Wed, 05 May 2004 16:20

If you really believe that, then when do you stop?


When the quality of a single device is not overwhelmingly obfuscated by the quality of another, unspecified device that it is dependent upon, as it is with clocks.

Quote:

It is useful to spec both. Spec the stability of the clocks. Spec the jitter attenuation of the PLLs in their locking state as well as their lock-time. Also spec their stability in their flywheeling state (in this state, they're their own clock).


And until we do both, and until the specs are meaningful for both, doing only one of them is relatively meaningless, yes?

Nika.