
I just don't get how digital could ever reproduce an analog signal faithfully to the original information. Seems like the quantization process, no matter how many bits you throw at it, will always alter the original waveform/information in some way.

Attached: signal.png (755x521, 66.18K)

>I just don't get how digital could ever reproduce an analog signal faithfully to the original information
it can't

>Seems like the quantization process, no matter how many bits you throw at it, will always alter the original waveform/information in some way
yes it will

The issue is that anything that could ever receive, process, or “perceive” an analog signal, regardless of whether that thing is analog or digital at the end, is bandwidth limited by its receiver. There is a limit to the amount of information per second that can be conveyed over a medium with a given bandwidth, called the Shannon limit. Want to know what Shannon also determined was the fundamental unit of information? Bits. Once you sample with a resolution in time and a bit width exceeding the Shannon/Nyquist requirement for a medium, it doesn't matter anymore. You have captured that information. A lossless format that exceeds the Shannon limit of human hearing is all you need. The absurd bitrates some “audiophile” smoothbrains are pushing now lapped that long ago.
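To make the Nyquist part concrete, here's a minimal pure-Python sketch (illustrative numbers, nothing from this thread): sample a band-limited tone above twice its frequency, then rebuild it between the sample points with Whittaker-Shannon (sinc) interpolation. The only error left comes from truncating to finitely many samples.

```python
import math

FS = 100.0   # sample rate in Hz, so the Nyquist frequency is 50 Hz
F = 13.0     # test tone, safely below Nyquist
N = 2000     # number of samples (20 seconds of "recording")

def sinc(x):
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

# Sample the band-limited signal
samples = [math.sin(2 * math.pi * F * n / FS) for n in range(N)]

# Whittaker-Shannon reconstruction at an arbitrary instant t (seconds)
def reconstruct(t):
    return sum(s * sinc(t * FS - n) for n, s in enumerate(samples))

# Evaluate deliberately between sample points, far from the edges
t = 10.003
error = abs(reconstruct(t) - math.sin(2 * math.pi * F * t))
# error is tiny and shrinks as N grows: the information was captured
```

The same sampling grid at FS = 100 can't distinguish a 13 Hz tone from a 113 Hz one, which is exactly why you band-limit before sampling.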

You can have quantization noise in an ADC just as anyone's bajillion-dollar vacuum tube can also pick up the ham radio coomer's noise. If the sampling is not done correctly you can lose spurious-free dynamic range to aliasing artifacts, but that is the fault of the technician recording the sample, not the concept.

dude just make the delta go towards zero. God u engineers are so stupid.

The number of bits merely determines the noise floor; the signal itself is always perfect.
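That's the roughly 6 dB-per-bit rule: each extra bit halves the quantization step and drops the noise floor by about 6 dB while the signal stays intact. A rough numerical sketch (mid-tread quantizer and a full-scale sine are my own illustrative choices):

```python
import math

def quantize(x, bits):
    # Mid-tread uniform quantizer over [-1, 1)
    step = 2.0 / (2 ** bits)
    return max(-1.0, min(1.0 - step, round(x / step) * step))

def snr_db(bits, n=100000):
    # Quantize a full-scale sine and measure the signal-to-noise ratio
    sig = err = 0.0
    for i in range(n):
        x = math.sin(2 * math.pi * 0.123456 * i)
        q = quantize(x, bits)
        sig += x * x
        err += (x - q) ** 2
    return 10 * math.log10(sig / err)

# Theory: SNR ~= 6.02 * bits + 1.76 dB, so 16 bits gives roughly 98 dB
```

More bits never change the in-band signal, only how far down the noise sits.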

Did you know that square wave + low pass filter creates a perfect sine wave? Fourier is magic, man.
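That works because a square wave is just a sum of odd harmonics: the k-th odd harmonic has amplitude 4/(πk), and even harmonics are absent, so a low-pass filter that removes everything above the fundamental leaves a pure sine of amplitude 4/π. A quick sketch checking those Fourier amplitudes with a toy single-bin DFT (my own construction, nothing standard assumed):

```python
import math

N = 1024  # samples per period
square = [1.0 if n < N // 2 else -1.0 for n in range(N)]

def harmonic_amplitude(signal, k):
    # Amplitude of the k-th harmonic via a single DFT bin
    re = sum(s * math.cos(2 * math.pi * k * n / N) for n, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * k * n / N) for n, s in enumerate(signal))
    return 2.0 * math.hypot(re, im) / N

# Odd harmonics come out near 4/(pi*k); even harmonics come out near zero,
# so keeping only k=1 leaves a pure sine of amplitude 4/pi.
```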

Yes, here we go again. A ton of people with nothing but YouTube-video experience explaining how digital somehow perfectly duplicates a direct copy of audio.

It turns out humans aren't that good at noticing the differences

Especially when there's no difference.

humans can't see past 24 fps btw

Nothing can replicate a signal perfectly.
With analog, you have unavoidable thermal noise.
With digital, you have unavoidable quantization noise.
Of the two, digital is the easier one to work with and improve, but analog can't be disregarded as it is necessary for all physical I/O.
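For the analog side, the unavoidable floor is Johnson-Nyquist (thermal) noise, V_rms = sqrt(4·k_B·T·R·B). A quick sketch with illustrative values (a 1 kΩ source over the 20 kHz audio band at room temperature, numbers chosen by me):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def johnson_noise_vrms(r_ohms, bandwidth_hz, temp_k=300.0):
    # Thermal noise of a resistor: V_rms = sqrt(4 k T R B)
    return math.sqrt(4 * K_B * temp_k * r_ohms * bandwidth_hz)

# A 1 kOhm source resistance across the 20 kHz audio band:
vn = johnson_noise_vrms(1e3, 20e3)  # roughly half a microvolt RMS
```

No amount of money spent on tubes gets you below that floor; it's physics, not craftsmanship.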

TL DR: audiopedophiles get the rope.

Nah, applied math degree here and grad degrees in computer engineering. Enjoy your gold HDMI cables and $10k headphone setup, though!

It would be noise outside the process anyway. Digitizing the music actually improves the quality.

I can’t see past my own gut. I hope my dick is doing okay.

Analog can't even produce analog faithfully. Recording implements degrade over time and the output varies in quality by whatever analog circuits sit between you and the data.

It's why computers can reproduce a digital signal exactly but there are such things as "good speakers" and "bad speakers".

Sampling at 40 kHz with infinite precision = signals below 20 kHz reconstructed PERFECTLY.
Sampling with finite precision + dithering = perfectly reconstructed signal + some fixed amount of noise. That fixed amount of noise is lower than anything analog can possibly achieve.
It's all mathematically proven.
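The dithering claim is easy to demonstrate: without dither, a level smaller than one quantization step vanishes entirely; with TPDF dither it survives as unbiased noise and averages back out. A minimal sketch (step size and DC level are arbitrary illustrative choices):

```python
import random

random.seed(0)
STEP = 1.0 / 128      # quantizer step (one LSB of a toy converter)

def quantize(x):
    return round(x / STEP) * STEP

signal = 0.3 * STEP   # a DC level well below one LSB

# Without dither the quantizer outputs 0 every time: the detail is gone.
plain = sum(quantize(signal) for _ in range(1000)) / 1000

# TPDF dither: sum of two uniform variables, spanning +/- one LSB
def tpdf():
    return (random.random() - random.random()) * STEP

# With dither, averaging many samples recovers the sub-LSB level
dithered = sum(quantize(signal + tpdf()) for _ in range(200000)) / 200000
```

The trade is deliberate: you swap signal-correlated distortion for a fixed, benign noise floor.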

As if humans would be able to tell the difference anyways between analog and digital

yes but can you tell the difference? Probably not.

You can tell the difference, analog has a hiss.

>Seems like the quantization process, no matter how many bits you throw at it, will always alter the original waveform/information in some way.
Yes. You can't get the same signal out as you put in, but it can be so close that your little monkey ears can't hear the difference, or, more likely, that your audio equipment can't play back such subtle differences.

The analog signal is already not a perfect representation of the original sound. A microphone membrane has to capture it first.

This is why tube amps will always sound better. They're analog.