
Entropy, in one sense, is concerned with measurements: the amount of useful energy available in a system, how efficient that system is, how much noise is present in it, how much we can know about any system at any given time. Information regarding entropy can itself become entropic. Dates and temperatures are measurements concerning time and energy: not how much there is but how much is useful and available. A date or a temperature may attempt to point to something but inevitably only self-signifies. 64 degrees Fahrenheit refers to itself no matter how hard it tries to refer to some state of affairs in the world, measured or felt. The transmission of such information is always inherently noisy, clouded by over-signification. A signal can become noise either by being consumed by connotation or by being drained of it, buried or untethered.
