
the distortion of sound


techristian


The video wasn't about the loudness wars at all; it was about data compression (MP3 et al.).

 

But still, the average listener BOTH doesn't care AND can't hear it. Is 128kbps still the standard? I don't personally download much music these days, and when I do I get FLAC, but I'm assuming it is. As a long-time audio guy myself, I have to admit that while I CAN hear a difference between 128k and WAV, it's not huge. And the difference shrinks further on lower-quality playback gear like tiny iPods and earbuds. But I've been immersed in audio long enough to know what to listen for. The average listener? Even if you could make them care, few would hear it without extensive training.
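For what it's worth, one objective thing to listen for: at lower bitrates most encoders low-pass the signal somewhere around 16 kHz. Here's a rough Python sketch of how you could measure that yourself -- it assumes you've already decoded the MP3 back to WAV (with ffmpeg or similar), the filenames are placeholders, and the 16 kHz cutoff is just a ballpark:

```python
# Rough sketch: compare the fraction of spectral energy above ~16 kHz
# between an original WAV and an MP3 decoded back to WAV.
# Filenames and the 16 kHz cutoff are placeholders, not gospel.
import numpy as np
from scipy.io import wavfile

def hf_energy(path, cutoff_hz=16_000):
    rate, data = wavfile.read(path)
    if data.ndim > 1:                      # fold stereo down to mono
        data = data.mean(axis=1)
    spectrum = np.abs(np.fft.rfft(data.astype(np.float64)))
    freqs = np.fft.rfftfreq(len(data), d=1.0 / rate)
    total = spectrum.sum()
    return spectrum[freqs >= cutoff_hz].sum() / total if total else 0.0

print("original WAV :", hf_energy("original.wav"))
print("decoded 128k :", hf_energy("decoded_128k.wav"))
```

If the decoded file shows a much smaller fraction of energy above the cutoff, that's the encoder's low-pass at work.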

 

The video mentioned how better TV technology was immediately adopted. That's true, because everyone could immediately see how much better it was. Video was harder (engineering-wise) to make better; not so for audio. 16-bit/44.1kHz (through decent reproduction equipment) is close enough to perfect that no one cared if it was roughed up a little. Hell, people didn't really complain that much about vinyl for decades, did they? Despite tempo fluctuations, scratches, skips, degradation, etc. 128kbps MP3, while a compromised version of 16/44.1k, is still better and longer-lasting than vinyl (or god forbid cassette!)
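For anyone who wants the back-of-the-envelope math on why MP3 won: raw 16-bit/44.1kHz stereo runs at about 1,411 kbps, so the sizes work out like this (pure arithmetic, nothing assumed beyond a 4-minute track):

```python
# Back-of-the-envelope: raw CD-quality audio vs MP3 at common bitrates
# for a 4-minute track. Just arithmetic on the standard format numbers.
SECONDS = 4 * 60

def mb(bits_per_second):
    return bits_per_second * SECONDS / 8 / 1_000_000

pcm = 44_100 * 16 * 2            # 1,411,200 bits/s for 16-bit/44.1k stereo
print(f"WAV  : {mb(pcm):5.1f} MB")
for kbps in (128, 192, 256, 320):
    print(f"{kbps}k : {mb(kbps * 1000):5.1f} MB  ({pcm / (kbps * 1000):.0f}:1)")
```

That's roughly 42 MB down to under 4 MB at 128kbps -- an 11:1 shrink -- which is the whole ballgame when storage and bandwidth are scarce.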

 

The difference I hear between (approximately) 128kbps MP3 and WAV is subtle, and mostly in the low end (until the bit rate gets lower still). I usually rip CDs to 192kbps for my iPod. Below that the high end starts to fade, especially below 128. Around 128-192 I find the effect on the high end minimal, but the low end goes kind of 'flat'. I've struggled for years to clarify that statement even to myself, and I can't. It's like taking something that was once 3D and making it 2D. I'd prefer going 256 or 320, but I compromise at 192 so I can fit more on my iPod. My tastes are varied enough that I never know from one hour to the next what I'll want to listen to.
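That space compromise is easy to quantify. A quick sketch -- the 8 GB player capacity and the 4-minute average track length are just assumptions for illustration:

```python
# How many 4-minute tracks fit at each bitrate? The 8 GB capacity
# and the 4-minute average are assumptions, not anyone's real iPod.
CAPACITY_BYTES = 8 * 1_000_000_000
TRACK_SECONDS = 4 * 60

for label, bps in [("WAV", 1_411_200), ("128k", 128_000),
                   ("192k", 192_000), ("256k", 256_000), ("320k", 320_000)]:
    track_bytes = bps * TRACK_SECONDS / 8
    print(f"{label:>4}: ~{CAPACITY_BYTES / track_bytes:,.0f} tracks")
```

Under those assumptions, 192 fits roughly two thirds more tracks than 320, and over seven times what uncompressed WAV would.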

 

But when I'm listening at home from my computer, I play from WAV when available. I don't use any streaming services, and I've never bought non-physical media. The only time I ever stream music is when I'm checking out something new.



128 was the going rate in the late 90s -- but even then the move was toward less lossy bitrates. I was an old Emusic customer (during their all-you-can-eat period) and they started at 128, but midway through my time with them they began moving their content to 192. The 128s had struck me as listenable (roughly equivalent to a normal-bias cassette with no hiss but a definite lack of high-end detail), but the 192s struck me as a very noticeable improvement. I felt I could still tell a significant difference in critical listening situations -- but the point of an mp3 was to save space, after all, and 192 looked like a very good compromise.

 

But, you know, after a while I found myself ABXing 192 and 256 kbps files and thinking, gee, I can nail this pretty much all the time, so I moved up to 256 for a long time.

 

But then I switched subscription streaming services from Rhapsody (which always seemed to have a noticeable 'slur' or haze to my ear, even vis-a-vis other material of similar bitrate -- 160 kbps, I believe) to MOG, which streamed everything at 320 kbps. And those streams sounded great.

 

I did an informal ('sighted') listening comparison and thought I could hear the difference between a 256 and a 320 of the same file -- though I really figured that was expectation bias at work. I felt I could just barely tell a 256 from a full CD track.

 

But I did some ABX'ing with a very familiar CD track I'd used for bitrate comparisons in the past -- and though I was really quite sure I wouldn't be able to tell, I was able to differentiate the 256 from the 320 with statistical significance.
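For anyone who hasn't tried it, the ABX procedure is simple enough to script yourself. Here's a minimal sketch of the logic in Python -- actual audio playback is left out, the 16-trial count is just my choice, and the one-sided binomial test at the end is the usual way to check whether your hit rate beats coin-flipping:

```python
# Minimal ABX logic: each trial, X is secretly A or B; you guess which.
# Afterward, a one-sided binomial test estimates the odds of doing at
# least this well by pure guessing. Playback is left to the reader.
import random
from math import comb

def binomial_p(hits, trials, p=0.5):
    # P(at least `hits` correct out of `trials` by chance alone)
    return sum(comb(trials, k) * p**k * (1 - p)**(trials - k)
               for k in range(hits, trials + 1))

trials, hits = 16, 0
for _ in range(trials):
    x = random.choice(["A", "B"])            # hidden assignment
    guess = input("Is X the A or the B file? ").strip().upper()
    hits += (guess == x)

p = binomial_p(hits, trials)
print(f"{hits}/{trials} correct, p = {p:.4f}")
print("significant at 0.05" if p < 0.05 else "not significant at 0.05")
```

With 16 trials you need 12 or more correct before the result clears the usual 0.05 bar -- which is why a lucky run of 'sighted' guesses proves so little.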

 

With 320 and full CD, no way, not even close. Not even a hint of a 'sighted' differentiation.

 

(THAT all said, my ears are old. I'm in my early-to-mid 60s and my hearing goes way south above 10 kHz -- in fact, 10 kHz itself is on the faint side at this point. However, it appears from testing by others that those who can consistently tell a 320 from full CD are extremely rare, at best. There are, however, some tracks containing electronically/digitally created sounds -- sounds that could not pass through free air intact -- that are informally known as codec-killers: sonic information so 'unexpected', compared with real-world sounds, that it 'confuses' the perceptual encoding algorithms.)
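If you want to try to hear that effect yourself, a synthetic stress signal is easy to cook up. Below is a rough sketch using only Python's standard library; the click spacing, decay rate, and 15 kHz tone are arbitrary choices of mine, and whether any particular encoder actually stumbles on this exact signal is something you'd have to test. Encode the resulting WAV at 320 and compare by ear.

```python
# Generate a synthetic "encoder stress" signal: dense, sharp clicks
# with alternating polarity riding on a quiet 15 kHz tone. Sharp
# transients like these are a classic hard case for perceptual coders
# (pre-echo smearing). All the parameters here are arbitrary picks.
import math, struct, wave

RATE, SECONDS = 44_100, 5
PERIOD = RATE // 50                     # one click every 20 ms

samples = []
for n in range(RATE * SECONDS):
    phase = n % PERIOD
    polarity = 1 if (n // PERIOD) % 2 else -1
    click = 0.8 * polarity * math.exp(-phase / 30.0)          # sharp transient
    tone = 0.05 * math.sin(2 * math.pi * 15_000 * n / RATE)   # quiet HF tone
    value = max(-1.0, min(1.0, click + tone))
    samples.append(int(value * 32767))

with wave.open("stress_test.wav", "wb") as w:
    w.setnchannels(1)                   # mono
    w.setsampwidth(2)                   # 16-bit
    w.setframerate(RATE)
    w.writeframes(struct.pack(f"<{len(samples)}h", *samples))
```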


