Test accuracy of audio clock

Is there software that does this?

Does anyone know of any software that will test your soundcard's clock and make sure it is accurate? It would also need to test each available sample rate.

Thanks
Dave T2

Hmmm…does the RightMark Audio Analyzer do what you want? Try 'em HERE.

TG

NO, he needs something to measure jitter. Read this article to get an idea as to the methods one uses. Personally, I don’t know of any software that measures jitter that is free. There are several that are “paid for”, but I don’t know how well they might work with a sound card. Most everything I know of requires an oscilloscope. It is hard to use a computer to measure jitter because the CPU has its own timing issues as well as the jitter from the sound card. So you need a stable source to use as your reference, and I don’t know if a PC and sound card on their own have any clock stable enough to use as a reference, thus the need for the external oscilloscope. But if anyone knows of anything, I wouldn’t mind seeing it myself.

After I posted, I got to thinking, it probably would need to be an external test device, like an oscilloscope. Otherwise, like you said, you might be testing bad with bad.

I might ask around here at work about testing my soundcards. Do you just need to test the crystal?

Dave T2

AH. Now I see…sounds like a lot of work. According to the article Bubba linked to, you’ll need a 2 channel 'scope, a powered signal splitter, a delay line and an attenuator. That will just give you a “rough guess”. You’d need a very high-end digital 'scope with some expensive processing software to get meaningful measurements.

There are other factors to consider when talking about the clock crystal and measuring jitter. I think most designs use one clock crystal and a multiplier to achieve different clocks for other sample rates. That introduces another “path” for error to occur. (Please, somebody correct me if I’m wrong!)

Why fret? If it sounds good…it is good. An intellectual exercise perhaps? If so please post your findings. I find this kind of stuff very interesting. Alas, I have no time to pursue such tinkering anymore. Sounds like fun!

Good luck!

TG

Quote (davet2 @ Oct. 14 2004,10:06)
Does anyone know of any software that will test your soundcard's clock and make sure it is accurate? It would also need to test each available sample rate.

What is it exactly that you're after - to find out how close your soundcard's clock is to the "standard" sample rates (44.1 kHz, 48 kHz), or to determine jitter?

Modern crystals even in cheap devices are bloody good (I have a $20 watch that doesn't gain/lose more than 30 sec per year), so there's not much to be learned there. If the device has a separate word clock out, you can view that with a frequency counter, but the measurement will be dependent on the accuracy of the frequency counter. Devices that accept or play digital audio signals will have a tolerance anyway, so if the nominal sample rate is say 0.1% off, it will still play no problem.

If it's jitter you're interested in, it's usually detected subjectively (eg "this sounds not quite as good as it should" and all other causes are eliminated) and measured by its artifacts - eg you play a known good WAV file of a 1kHz sine wave, and look at the audio distortion spectrum. Or you access the word-clock in the system and do a spectral analysis to see how much it varies.
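For what it's worth, here's a rough sketch of that "look at the distortion spectrum" check, assuming Python with numpy/scipy and a hypothetical loopback recording of the 1 kHz tone (play the known-good WAV out of the card and record it back in):

```python
# Rough sketch: inspect the spectrum of a loopback recording of a 1 kHz tone.
# "loopback_1khz.wav" is a hypothetical file; nothing here is calibrated,
# it just shows where jitter/distortion artifacts would appear as sidebands.
import numpy as np
from scipy.io import wavfile

rate, data = wavfile.read("loopback_1khz.wav")
if data.ndim > 1:                      # keep one channel if the file is stereo
    data = data[:, 0]
data = data.astype(np.float64)

window = np.hanning(len(data))
spectrum = np.abs(np.fft.rfft(data * window))
freqs = np.fft.rfftfreq(len(data), d=1.0 / rate)

peak = int(np.argmax(spectrum))        # the fundamental, nominally 1 kHz
# Worst component near the fundamental but not part of it (a crude proxy
# for jitter sidebands and close-in distortion).
near = (np.abs(freqs - freqs[peak]) < 100) & (np.abs(freqs - freqs[peak]) > 5)
sideband_db = 20 * np.log10(spectrum[near].max() / spectrum[peak])
print(f"fundamental at {freqs[peak]:.1f} Hz, worst nearby component {sideband_db:.1f} dB down")
```

If switching clock sources changes how far down those nearby components sit, that's at least a hint about relative jitter.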

I don't expect that you'd learn anything of value with just an oscilloscope. Maybe you could look at the test device's clock while triggering the scope from a known great clock, but the jitter would have to be fairly gross before you'd see it.

Here again, any device that's designed to accept an external clock or digital signal should have some level of re-clocking ability, so that low-level jitter will be ignored.

I work with a lot of video. On the forum for the software I use, a lot of people seem to have problems with audio going out of sync with the video. I have a vague theory that some soundcards' clocks are a bit off, and that this may be causing some of the out-of-sync (OOS) issues. I thought if there were a simple way to check the clock accuracy, that would either confirm or debunk my theory.

Another guy on a similar forum claims that his video and audio are running faster on his computer than on the original video. Same idea; I am thinking maybe audio clock.

It is just a theory, with no technical expertise behind it, but I thought I would try.

But a while back, I did have a friend with a cheap soundcard who wanted to do some recording. He could not use the internal MIDI because it was off-key, presumably due to a bad clock reference.

If a MIDI part plays on key (i.e., a middle C is a true middle C), is that any indication of the clock's accuracy? Are the two related?

Dave T2

Surely, the AD/DA clock and the clock source for the MIDI synthesizer chip would be separate? Maybe not…hmmm…interesting. Sorry I’m not offering a whole lot of help here. I just find the whole topic very fascinating. Are you synched to your video equipment via SMPTE or summat?

TG

Regardless of the accuracy of watches, it’s not unusual for soundcard clocks to be different. For most purposes, low jitter in a soundcard clock is WAY more important than an accurate rate (unlike a watch, where the opposite is true).

Many people that try to record with two soundcards and don’t take steps to synchronize them find that the audio is noticeably out of sync by the end of a 3 minute song. Note that even a 25 msec discrepancy can cause problems, and a 50 msec discrepancy can sound pretty bad, depending on what’s recorded.
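Just to put rough numbers on that (the ppm figures here are purely illustrative, not measurements of any particular card):

```python
# Back-of-the-envelope: drift between two free-running clocks over a 3-minute song.
song_seconds = 180
for offset_ppm in (20, 50, 100, 200):          # illustrative clock mismatches
    drift_ms = song_seconds * offset_ppm * 1e-6 * 1000
    print(f"{offset_ppm:>4} ppm difference -> {drift_ms:.1f} ms apart after {song_seconds} s")
```

So it only takes a mismatch on the order of 140 ppm to hit the 25 msec figure above.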

So the question about what you want to measure and why is significant.

If all you want to know is how good your soundcard is, then just use RightMark. It’s a good idea to do anyway; if there’s a problem in your setup, it’ll show up. If you have a jitter problem, it will show up as noise. You won’t be able to distinguish between jitter and analog circuitry noise using RMAA alone. However, if you can borrow a certified word clock, you can use that as the clock input and compare the results.

If you have two soundcards, you can measure 4 times: each soundcard by itself, and each soundcard being clocked by the other. If the results are the same, then all you know is that the two clocks are comparable, or both have lower jitter than the analog noise. But if the results change, you’ll know which soundcard’s clock to use: the one whose results are worse in the mixed trial.

The only way to measure a soundcard’s clock rate (not jitter, but overall rate) without using another clock is to record a perfectly calibrated tone, and then measure the frequency numerically. However, the pitch discrepancy will be so small that this probably isn’t practical – you won’t find a tone source that accurate. And to the technical folks, yes, this is essentially using another clock: the calibrated tone.
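If you did have a tone source you trusted, the numerical measurement could be sketched like this (assuming Python with numpy/scipy; the filename and the 1000 Hz reference are hypothetical, and the answer is only as good as that tone):

```python
# Sketch: estimate the card's true sample rate from a recording of a calibrated tone.
import numpy as np
from scipy.io import wavfile

REF_HZ = 1000.0                                          # what the tone source claims to be
rate_nominal, data = wavfile.read("reference_tone.wav")  # hypothetical recording
if data.ndim > 1:
    data = data[:, 0]
data = data.astype(np.float64)

window = np.hanning(len(data))
spectrum = np.abs(np.fft.rfft(data * window))
freqs = np.fft.rfftfreq(len(data), d=1.0 / rate_nominal)

k = int(np.argmax(spectrum))
# Parabolic interpolation around the peak bin for a finer frequency estimate.
a, b, c = np.log(spectrum[k - 1 : k + 2])
shift = 0.5 * (a - c) / (a - 2 * b + c)
measured_hz = freqs[k] + shift * (freqs[1] - freqs[0])

# A tone that reads low means the card's clock actually runs fast, and vice versa.
rate_true = rate_nominal * REF_HZ / measured_hz
error_ppm = (rate_true / rate_nominal - 1) * 1e6
print(f"tone reads {measured_hz:.3f} Hz -> true sample rate ~ {rate_true:.2f} Hz ({error_ppm:+.0f} ppm)")
```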

What are you worried about? Why do you want to measure it?

"What are you worried about? Why do you want to measure it? "

I thought I explained that.

Dave T2
???

Dave T2,

Out of curiosity… what video editing software are you using (that gives the sync problem), and what video and sound cards? Any extra strapping or configuration for ensuring sync between the sound & picture, or is the video software controlling that?

I don’t have the problem myself, but other people seem to.

This particular software is Pinnacle Studio 9. I have very good luck and stability with it.

Dave T2

Hrm, I would think everyone would sync to SMPTE for video work…

If you are talking about SMPTE timecode, edit applications at this level do not use timecode as a sync clock (like a MIDI setup or a linear tape editing situation). I assume they are using the clocks on the computer and some pseudo sync references.

The video capture device I use is an A/D device. The audio is captured on a separate path through the audio card, and then the signals are combined into an MJPEG AVI file.

This has always been problematic for A/V sync, but I am trying to find a reason that it works for many people and not for others. Obviously there are many variables in hardware, soundcards, video capture cards, etc. I am just pursuing a theory.

Thanks for your help and any other suggestions.

Dave T2

I don’t know if this will help, but my ATI video card has a feature to test the audio clock and tell if it is out of sync by more than +/- 2%. I can only guess whether that is a truly acceptable range, as I would suppose that over a long time it could really add up.
The card I have is ATI All-in-Wonder 9600. I don’t know if the software is available anywhere else or if it only works with the All-in-Wonder cards.
Bax

Thanks, I have an ATI Radeon 7000. I will see if there is a utility for that.

Dave T2

"I thought I explained that."
Oops, you sure did. I missed that post somehow.

So the issue is frequency accuracy, not jitter-free operation. I don't know about the detailed engineering of clock hardware, but usually in engineering, you can trade off one thing for another. Based on obvious clock discrepancies I've seen, soundcard manufacturers are sacrificing frequency accuracy for low jitter, and that makes perfect sense for most applications.

Of course, SMPTE is the correct solution for pro video.

For non-pro applications, you need to find a way to slave the audio clock to the video recorder (or vice versa). Otherwise, you're going to have this problem, or you have to measure clocks between video and audio gear until you find a pair that happen to match.

So, assuming you can't synch them electrically:

The good news is that audio/video sync isn't nearly as critical as audio/audio sync. Audio and video can be off by probably 250 msec before it becomes annoying. On the other hand, recording sessions might be long, exacerbating the problem.

Like I said above, you can minimize the problem by comparing clocks. The way to do that is obvious: you video something that ticks both audibly and visibly. Record for whatever your max recording time will be. Merge the two in whatever software you're using. Then check the timing in various places in the playback. If you can, measure the time between the visual and audio tick at the end of the recording, and divide by the recording time to that point, and you have your error term.
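As a tiny worked example of that error term (the numbers are made up):

```python
# Error term from the tick test; all numbers here are made up for illustration.
record_seconds = 20 * 60        # a 20-minute test recording
offset_seconds = 0.120          # audio tick lags the visual tick by 120 ms at the end
error_ratio = offset_seconds / record_seconds
print(f"error = {error_ratio:.6f}  (~{error_ratio * 1e6:.0f} ppm, "
      f"about {error_ratio * 3600:.2f} s per hour)")
```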

You could buy a bunch of cheap soundcards like Soundblasters on ebay, measure them all this way, and pick the best of the lot, and sell or give away (or keep ... more on this later) the rest. If the soundcard is close enough, then that's your baby!

However, clock rates drift with temperature, and also as the unit ages. Different clocks drift differently. So, the best soundblaster today may not be the best one later -- and ditto for hot days versus cold ones, etc.

Another option is to measure the sync discrepancy (as a ratio: seconds lost or gained per second of record time) and then compensate in an audio wave editor using its “resampling” feature. You’ll have a little simple math to work out, is all.
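As a rough sketch of that math (the rates and ratios below are illustrative; the actual resampling would still be done in your wave editor):

```python
# Work out what to tell the wave editor, given the error ratio from the tick test.
nominal_rate = 48000            # rate the audio was captured at (illustrative)
error_ratio = 0.0001            # audio ends up 0.1 ms long per second of video (made up)

# The audio is too long by a factor of (1 + error_ratio). One fix: tell the editor
# the file's "real" rate and let it resample back to nominal.
claimed_rate = nominal_rate * (1 + error_ratio)
print(f"treat the file as {claimed_rate:.1f} Hz and resample to {nominal_rate} Hz")

# Or, if the editor wants a target length instead of a rate:
original_samples = 57_600_000   # 20 minutes at 48 kHz
target_samples = round(original_samples / (1 + error_ratio))
print(f"resample {original_samples} samples down to {target_samples}")
```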

To do the measurement all you need is a clear audiovisual sync signal, like the clacker they use in the movies (for just that purpose). But you need it at the end of any long recording. If there isn’t an obvious one in the program, just walk in front of your camera and clap your hands (or get someone to do it). Then you can measure that later in your video editor.

See what a hassle? So it's best to slave the soundcard to the video, or vice versa. Ideally, just don't buy equipment where you can't do this. I bet cameras today have S/PDIF outputs -- that's all you'd need to slave a soundblaster.

Clocks are interesting. I did some work in NTP, "Network Time Protocol", and learned a lot from papers by Dave Mills, the genius behind it. If you're interested in learning more, his web page should be a good place to start. (It used to be, but I see it's been "cleaned up".)

Thanks for the response Learjeff. I will try the visual clock test and see what happens.

Dave T2