Sampling
It’s been said that a characteristic irony of this information age in which we live is that the more information we have, the less informed we seem to be. For indeed, it would seem that the presence of such vast and easily accessible stores of information should provide us with the data from which we could become both informed and knowledgeable, that is, able to form sensible and defensible opinions based on up-to-the-minute facts and figures. But hidden in the process of sifting all of this apparently indispensable information is a moment of digital error whose “lossiness” substitutes probability for truth. And without that grand narrative of truth, who’s to say what information truly means?
Data collection involves sampling, and sampling, as we all know, is only a snapshot at best and a horrible excuse for headlines at worst. Ordinary, everyday citizens, the subjects of all of this information, do not think in ones and zeroes, but rather in the ambiguous murk that lies at the edge of culture’s river of opinion, in the static and noise of our culture’s endless electric chatter and feedback. But for statisticians and observers, pundits and analysts, information comes in the form of recognizable patterns: patterns formed by the informed, curves drawn between points in a field of remarks. In many cases, of course, these patterns only resemble what the observers originally expected to find. But in all cases, they are rendered from data points that are themselves only snapshots of human opinion.
Quantization error is the lossy side of digitization, of sampling, known to audiophiles as the bane of digital sound: the approximation of an original sound wave within the finite (binary) precision required of digital media. A point on the original wave, as it is sampled, must be shifted to make it a number. While that number can be relatively large, the “last bit” still has only two existential opportunities: one or zero. In every digital sample there is an error, introduced at birth and carried by the last bit. To some (analog) audiophiles, what’s lost is the subtlety of truth: to get mystical about it, the part or remainder that cannot be forced through the digital gate of one or zero.
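To make that “last bit” concrete, here is a minimal sketch, not from the original post, that quantizes an idealized sine wave to an assumed 8-bit depth and measures the rounding error each sample carries; the 440 Hz tone, the sample rate, and the bit depth are illustrative assumptions only.

```python
import numpy as np

# An idealized "analog" signal: one second of a 440 Hz tone, held as
# high-resolution floats with amplitude in [-1, 1].
sample_rate = 8000                      # samples per second (assumed)
bits = 8                                # bit depth of the digital copy (assumed)
t = np.arange(sample_rate) / sample_rate
analog = np.sin(2 * np.pi * 440 * t)

# Quantize: shift each point to the nearest of 2**bits representable levels.
levels = 2 ** bits
step = 2.0 / levels                     # spacing between adjacent levels
digital = np.round(analog / step) * step

# The quantization error is whatever could not be forced through the gate:
# at most half a step (half the "last bit") per sample.
error = analog - digital
print("max error per sample:", np.abs(error).max())
print("half of one step:    ", step / 2)
```

Running the sketch shows the per-sample error never exceeds half of the last bit’s step, which is exactly the remainder the paragraph above is talking about.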
The very same process is involved in the production of information. Data points involve one-or-zero, yes-or-no, either/or decisions. The hope is that these admittedly crude representations, through the sheer volume of data collected and the error-correction techniques applied to the samples, will add up to a picture of accuracy and truth. In actuality, information says as much about itself as it does about what it represents. And it cannot speak. It cannot answer its questioners, but offers only hidden (or sometimes obvious) patterns and curves.
What makes truth, and what makes information uninformative, is the ambiguity left in the original statement or claim, an ambiguity that permits testing through conversation and interaction. It is in the resolution of ambiguity, through the interactions required of social reproduction, that we constitute the force of truth. Information has no force.
28 June 2002