The bit, shorthand for "binary digit," is the fundamental element of modern digital technology. Bits are used in telecommunication, in the workings of computers, and in the digital representation of video, audio, and other signals.
The bit is also fundamental in another sense. Since Claude Shannon's founding of information theory in 1948, it has been understood that the most efficient achievable representations of data can be expressed in binary form [Shannon2014].
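As a small worked example of the unit involved (the entropy formula is standard Shannon information theory, stated here for illustration rather than drawn from the cited entry): the average information of a source is measured in bits, and a fair coin flip carries exactly one bit.

    % Shannon entropy in bits (base-2 logarithm).
    H(X) = -\sum_i p_i \log_2 p_i
    % For a fair coin, p_0 = p_1 = 1/2:
    H = -\left( \tfrac{1}{2}\log_2\tfrac{1}{2}
              + \tfrac{1}{2}\log_2\tfrac{1}{2} \right) = 1 \text{ bit}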
The prevalence of bitness, binary-ness, and just plain bits is exemplified by The Art of Computer Programming section "7.1. Zeros and Ones," which runs to 234 dense pages plus a further 145 pages of answers to its exercises [Knuth2011].
Bits, although not known as such at the time, also figure in Boolean logic, where the two values 0 and 1 are regarded as values of logical propositions cast in mathematical terms [Boole1854].
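To make that concrete, here is a small sketch (in Python, chosen purely for illustration; Boole worked algebraically, not in a programming language) of the propositional connectives as ordinary arithmetic over the two values 0 and 1:

    # Boole's algebra of logic, treated as arithmetic over {0, 1}:
    # NOT is subtraction from 1, AND is multiplication, and OR is
    # x + y - x*y so that the result stays within {0, 1}.

    def b_not(x): return 1 - x
    def b_and(x, y): return x * y
    def b_or(x, y): return x + y - x * y

    # Exhaustive check against Python's native Boolean operations:
    for x in (0, 1):
        for y in (0, 1):
            assert b_and(x, y) == int(bool(x) and bool(y))
            assert b_or(x, y) == int(bool(x) or bool(y))
        assert b_not(x) == int(not bool(x))

The de Morgan identity b_or(x, y) == b_not(b_and(b_not(x), b_not(y))) also holds, which is one way the arithmetic forms hang together as a logic.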
In this note, bits are examined not simply as the values 0 and 1 but as abstractions for which 0 and 1 are merely one representation. The idea is to dig into what constitutes bitness and how that is manifested in computational procedures, especially ones that come into play in Miser.
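As a sketch of that distinction (hypothetical code for illustration only, not anything from Miser): any two distinguishable values, together with operations that respect the distinction, exhibit bitness, and the familiar 0 and 1 are just one choice of carrier.

    # Hypothetical illustration (not Miser code): bitness as an
    # abstraction over any two distinguishable values. Only the two-way
    # distinction and the operations on it matter, not the carrier set.

    class Bit:
        def __init__(self, zero, one):
            assert zero != one          # any two distinct values will do
            self.zero, self.one = zero, one

        def negate(self, v):
            # The one nontrivial unary operation: swap the two values.
            return self.one if v == self.zero else self.zero

    # Three different carriers, identical bit behavior:
    for rep in (Bit(0, 1), Bit(False, True), Bit("no", "yes")):
        assert rep.negate(rep.zero) == rep.one
        assert rep.negate(rep.one) == rep.zero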
-- Dennis E. Hamilton
Seattle, Washington
2014-03-30
[Boole1854] George Boole. An Investigation of the Laws of Thought: on which are founded the mathematical theories of logic and probabilities. Macmillan (London: 1854). Dover (New York: 1958) reprint, ISBN 0-486-60028-9 pbk.
[Knuth2011] Donald E. Knuth. The Art of Computer Programming, volume 4A: Combinatorial Algorithms, Part 1. Addison-Wesley (New Jersey: 2011). ISBN 0-201-03804-8.
[Shannon2014] Claude Shannon. Wikipedia entry, accessed 2014-03-25 at <http://en.wikipedia.org/w/index.php?title=Claude_Shannon&oldid=601127268>.
n010202b: Homage to the Bit [latest]
n010202c: Homage to the Bit - Initial Notes
n010202a: Diary & Job Jar
created 2002-12-07-22:31 -0800 (pst) by orcmid