It's a synchronization thing... Clock information embedded in the signal ensures the slave (receiver) stays synchronized with the master (transmitter).
Without it, the slave can't find the bit boundaries; the receiver has no way to tell which part of the incoming stream is audio data and which is clock.
It's like having two stopwatches: it's next to impossible to keep them in sync unless something actively synchronizes them.
The S/PDIF protocol uses a clock embedded in the signal so it can support multiple sample rates (44.1 kHz, 48 kHz, etc.).
Different bit depths are supported too: the main audio word is 20 bits, with 24-bit carried by borrowing the four auxiliary bits in each subframe.
The line code that embeds the clock in the audio data is called biphase mark code (BMC).
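To make that concrete, here's a rough Python sketch (my own simplified illustration, not the actual S/PDIF framing) of how biphase mark coding works: the line level flips at the start of every bit cell, which is the embedded clock the receiver locks onto, and flips again mid-cell only when the data bit is a 1.

def bmc_encode(bits, initial_level=0):
    # Returns two half-cell line levels per data bit (simplified illustration).
    level = initial_level
    out = []
    for bit in bits:
        level ^= 1          # transition at every cell boundary -> clock recovery
        first_half = level
        if bit:
            level ^= 1      # extra mid-cell transition encodes a 1
        out.extend([first_half, level])
    return out

print(bmc_encode([1, 0, 1, 1, 0]))
# -> [1, 0, 1, 1, 0, 1, 0, 1, 0, 0]

Because there's a guaranteed transition in every cell regardless of the data, the receiver can recover the bit clock from the stream itself and then read the 0s and 1s off it.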