SPDIF vs. Analog: A Visual Comparison

  • I'm not saying that the Kemper clock worries me quality-wise, but for the paranoid users out there it might be worth mentioning that e.g. RME interfaces interpolate external/received clocks up to their own standards. The feature is called SteadyClock, IIRC.
    I'm not sure about other manufacturers, but I guess RME aren't the only folks who incorporate ideas like that into their devices.

    https://www.rme-audio.de/english/techinfo/steadyclock.htm

  • I'm not saying that the Kemper clock worries me quality-wise, but for the paranoid users out there it might be worth mentioning that e.g. RME interfaces interpolate external/received clocks up to their own standards. The feature is called SteadyClock, IIRC.
    I'm not sure about other manufacturers, but I guess RME aren't the only folks who incorporate ideas like that into their devices.

    Exactly. Digital clocks sync with each other; they don't replace one another. If your only choice is which of a poor clock and a quality clock to sync to, make the poor one the master: the better clock will track it far more accurately than the poor clock would track the good one.

  • There are other factors involved in this stuff too, such as delay or phase compensation on the card and so on.

    Don't assume that SPDIF will be better; check out the format details: https://en.wikipedia.org/wiki/S/PDIF. It's really targeted at a use case that barely anything follows any more, and it has significant downsides: using anything other than 48 kHz / 20-bit audio with a central word clock results in a compromise one way or the other. Its upsides are that it's cheap, relatively robust and ubiquitous.

    I use SPDIF because I find it convenient: it frees up inputs on my interface for mics, and at the same time it makes re-amping pretty easy. Latency mostly occurs at the computer and driver end of things rather than in the sound card. I've not measured the difference between analog and SPDIF here, but I imagine they're out by only a few samples, and the audio quality is pretty much indistinguishable. All in all it's just more convenient. For a sense of how small "a few samples" really is, see the sketch below.
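
    Here is a minimal Python sketch converting sample offsets to milliseconds at common rates; the sample counts and rates below are just illustrative assumptions, not measurements of any particular interface:

        # How much time does an N-sample offset represent at common sample rates?
        # (Illustrative numbers only; not measured from any particular interface.)
        def samples_to_ms(n_samples: int, sample_rate: int) -> float:
            """Convert a sample count to milliseconds at the given sample rate."""
            return 1000.0 * n_samples / sample_rate

        for rate in (44_100, 48_000):
            for n in (1, 4, 16, 64):
                print(f"{n:>3} samples @ {rate} Hz = {samples_to_ms(n, rate):.3f} ms")

    Even 64 samples at 48 kHz comes out to roughly 1.3 ms, well below the buffer/driver latency of a typical computer recording setup.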

  • ... and everyone knows you always need more cowbell!

    Great to see you back, Jen. <3

    I've been MIA also, doing "analog stuff" (see below), but I popped in occasionally to see if you were back.

    Great to see you back also!

    You know I'm born to lose, and gambling's for fools
    But that's the way I like it baby
    I don't wanna live forever

  • OTOH, like many audio professionals I have a high quality master clock on my system and it made a massive difference in the sound quality of my converters.
    So the chances of my using the Kemper as master clock instead are, lemme think... nil.

    I honestly wouldn't worry about this these days. In the days of Pro Tools mix systems I bought an Apogee Big Ben, and it did, as you say, make a difference. With more recent converters I cannot hear anything. In fact the Big Ben is a pain because you have to change sample rates manually.

    Karl

    Kemper Rack OS 10.2.2 - Mac Sonoma 14.5

  • Back in the mid-1970s, humankind found out that measuring audio circuits doesn't reveal or explain the audible sonic differences perceived by the human ear.

    I am amazed that some can't hear the difference between the digital out (pure, therefore the reference) and the analog out (two additional conversions, impure?), no matter what converters are used. I can only conclude the listener is using very distorted "amplifier/speaker rigs".

    As I've stated here previously, and this is still my experience:
    With a clean amplifier/speaker rig, and with the digital out to the DAW recording software as the reference:

    When comparing this reference digital-out recording to the analog-out recording, there is audible degradation in the analog-out recording that is blaringly obvious.
    The analog-out (dual-converted) track has a raspy hash or harshness (distortion) added to the upper mids/highs or, put another way, it lacks the clarity or purity that is there in the reference digital recording.

    I also completely understand that many people using the KPA don't hear, don't care, use it in a different type of application or have to use the analog outputs.

    I had to use the analog outs up until 48 kHz SPDIF was added.

    Will


  • "Um, what?"

    Further clarification, if needed:
    People realized that gear which measures well on the test bench does not necessarily sound good in the system.
    Example: transistor amps, whether for guitar amplification or in audio playback systems. They measure well, therefore they must sound good. Would you rather have a Fender Twin tube amp, or the Fender solid-state equivalent introduced as "better" in the 1970s?
    Think Vox, Marantz, McIntosh etc.
    In the context of this thread, the original poster had some "measurement" of waveform and frequency-spectrum snapshots, as if that proved something.

    Will

  • No. The reason we favour one over the other is that the human ear/brain loves non-linearities (distortions). This is why, for example, people love vinyl over CDs or streaming.

    If the two signals null (the OP would have to level- and time-match them) to any significant degree, then the outputs are identical in any practical sense. A rough way to run such a null test is sketched after this post.

    We observe and measure because we cannot trust our own sensory abilities alone, and because we want to reach an objective consensus about the world we share. This is not magic. The science works.
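
    As a concrete illustration, here is a minimal Python null-test sketch. It assumes two short mono WAV captures of the same performance; the filenames, and the use of numpy and soundfile, are assumptions for illustration, not anything prescribed in this thread:

        # Null-test sketch: time-align, level-match, subtract, and report
        # how deep the null is. Filenames are placeholders.
        import numpy as np
        import soundfile as sf  # pip install soundfile

        ref, sr_ref = sf.read("spdif.wav")     # digital capture, used as the reference
        test, sr_test = sf.read("analog.wav")  # analog capture of the same pass
        assert sr_ref == sr_test, "sample rates must match before nulling"

        # Trim to a common length, then time-match via cross-correlation.
        n = min(len(ref), len(test))
        ref, test = ref[:n], test[:n]
        lag = int(np.argmax(np.correlate(test, ref, mode="full"))) - (n - 1)
        test = np.roll(test, -lag)  # crude alignment; fine for a short clip

        # Level-match the test signal to the reference RMS.
        test *= np.sqrt(np.mean(ref**2) / np.mean(test**2))

        # Null: subtract and report the residual energy relative to the reference.
        residual = ref - test
        null_depth_db = 10 * np.log10(np.mean(residual**2) / np.mean(ref**2))
        print(f"null depth: {null_depth_db:.1f} dB (more negative = more alike)")

    The deeper (more negative) the reported figure, the less is left over once the two outputs are matched; loading the residual back into the DAW and listening to it is usually more telling than staring at the number.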

  • I am amazed that some can't hear the difference between the digital out (pure, therefore the reference) and the analog out (two additional conversions, impure?), no matter what converters are used. I can only conclude the listener is using very distorted "amplifier/speaker rigs".

    When comparing this reference digital-out recording to the analog-out recording, there is audible degradation in the analog-out recording that is blaringly obvious.
    The analog-out (dual-converted) track has a raspy hash or harshness (distortion) added to the upper mids/highs or, put another way, it lacks the clarity or purity that is there in the reference digital recording.

    Blaringly-obvious? :huh:


  • "Um, what?"

    Further clarification, if needed:
    People realized that gear which measures well on the test bench does not necessarily sound good in the system.
    Example: transistor amps, whether for guitar amplification or in audio playback systems. They measure well, therefore they must sound good. Would you rather have a Fender Twin tube amp, or the Fender solid-state equivalent introduced as "better" in the 1970s?


    You're confusing taste with physics. You said "measuring audio circuits doesn't reveal or explain the audible sonic differences" - yes, yes, it does. Everyone has known, since solid-state gear first appeared, that solid-state is capable of being "cleaner", "more pure", etc., every single time. It's the same with vinyl and CDs - vinyl simply cannot be as clean as a CD recording because there's distortion inherent in the LP-cutting and LP-playing equipment. That doesn't mean a CD is better than vinyl, because "better" is relative. Better at reproducing the exact recording, for pristine classical music with a huge dynamic range? 100%. Better at delivering a warm, pleasing sound? No, vinyl probably wins.


    Quote

    In the context of this thread, the original poster had some "measurement" of waveform and frequency-spectrum snapshots, as if that proved something.

    It does prove something: with decent gear, the difference between the SPDIF and analog outputs is minimal. And the OP is correct, it is. Depending on your interface you might see a bit of a low-pass effect, but it tends to sit above where many of us are low-passing anyway, so it's not worth worrying about. A rough way to check for that low-pass yourself is sketched after this post.

    You're using a computer in a box and transferring the signal, either digitally via SPDIF or electrically via analog, to another computer. There's no magical analog warmth in the three feet of cabling between your KPA and PC.
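
    To make that checkable, here is a minimal Python sketch that compares how much of each capture's energy sits above an arbitrary 10 kHz split point; the filenames, the split frequency, and the mono, same-length, already-aligned assumption are all mine, purely for illustration:

        # Rough spectral comparison: relative high-band energy of two captures.
        # Assumes mono, same-length, already-aligned WAV files; names are placeholders.
        import numpy as np
        import soundfile as sf  # pip install soundfile

        def high_band_share_db(path: str, split_hz: float = 10_000.0) -> float:
            """Energy above split_hz relative to total energy, in dB."""
            x, sr = sf.read(path)
            spectrum = np.abs(np.fft.rfft(x)) ** 2
            freqs = np.fft.rfftfreq(len(x), d=1.0 / sr)
            return 10 * np.log10(spectrum[freqs >= split_hz].sum() / spectrum.sum())

        for name in ("spdif.wav", "analog.wav"):
            print(f"{name}: energy above 10 kHz is {high_band_share_db(name):.1f} dB relative to total")

    If the analog capture is noticeably low-passed, its figure will sit a few dB below the SPDIF one; comparable figures suggest any roll-off is negligible up there.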

  • I am amazed that some can't hear the difference between the digital out (pure, therefore the reference) and the analog out (two additional conversions, impure?), no matter what converters are used. I can only conclude the listener is using very distorted "amplifier/speaker rigs".


    The analog-out (dual-converted) track has a raspy hash or harshness (distortion) added to the upper mids/highs or, put another way, it lacks the clarity or purity that is there in the reference digital recording.

    Does this all add up to a claim that the Kemper output converters are not up to the job? I will certainly have a go at comparing again though.

    Karl

    Kemper Rack OS 10.2.2 - Mac Sonoma 14.5

  • Does this all add up to a claim that the Kemper output converters are not up to the job? I will certainly have a go at comparing again though.

    More likely that the interface used is not up to the job.

    Also, keep in mind that the audio interface's converters themselves are not necessarily to blame. There's a whole analogue circuit before the signal hits the actual converters.