Most computers use digital codes built from only two symbols, 0 and 1 (binary digits), to perform all operations. Analog (continuous) signals must therefore be converted into digital codes before a computer can process them.
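The conversion step above can be sketched with a minimal uniform-quantization example. This is an illustrative model only, not a specific converter: it assumes a hypothetical 3-bit converter and a signal range of -1.0 to +1.0, and the function name `quantize` is invented for the sketch.

```python
import math

def quantize(sample, bits=3, vmin=-1.0, vmax=1.0):
    """Map an analog sample in [vmin, vmax] to an integer code of `bits` bits."""
    levels = 2 ** bits
    # Clamp the sample to the converter's input range.
    clamped = min(max(sample, vmin), vmax)
    # Divide the range into (levels - 1) steps and round to the nearest level.
    step = (vmax - vmin) / (levels - 1)
    return round((clamped - vmin) / step)

# Sample one cycle of a sine wave at 8 points and encode each sample
# as a 3-bit binary code (the two-symbol form the computer operates on).
samples = [math.sin(2 * math.pi * n / 8) for n in range(8)]
codes = [format(quantize(s), "03b") for s in samples]
```

Each analog sample is reduced to one of eight discrete levels, written as a string of 0s and 1s; real converters work the same way, only with more bits and finer steps.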
No closely related standards have been identified.