The Entoptic Field Monitor [EFM] is a post-GDPR surveillance application that continuously generates synthetic image outputs. There is little to no apparent resemblance between input and output, and input images fade from the cache. The simple material connection of pixels to pixels remains; yet the line being crossed here may lie elsewhere.
The EFM is an inquisitive play on the prolific, but often subtle, role of "AI" technologies in everyday imaging scenarios, such as smartphone photography. Even there, the use of such technologies often seems benign, as in color correction or synthetic depth-blur. Yet as vivid examples show, even such generally subtle effects can drastically alter perceptions of reality, raising questions of representation, historicity, and how reality is perceived.
Aside from the use of a well-known image synthesis model, the EFM features a custom neural network trained to predict the Structural Similarity Index Measure (SSIM) of input and output image. Expressed as a number between 0 and 1, this value indicates how visually similar two images are at the level of pixel appearance, with 1 denoting identical images. Applied in entoptic media such as the EFM (or the other prototypes found
here), this measure gives a first indication of what (post-)imaging media could show.
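The EFM's predictor network itself is not reproduced here, but the measure it learns to approximate can be computed directly. The following is a minimal, single-window sketch of SSIM in NumPy; the function name `ssim` and the assumption of grayscale images scaled to [0, 1] are illustrative choices, not part of the EFM:

```python
import numpy as np

def ssim(x: np.ndarray, y: np.ndarray, data_range: float = 1.0) -> float:
    """Global (single-window) SSIM between two grayscale images.

    Returns a value close to 1 for visually identical images and a
    lower value as structural similarity decreases. Assumes both
    images share the same shape and intensity range [0, data_range].
    """
    # Stabilising constants, using the conventional K1=0.01, K2=0.03
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2

    mu_x, mu_y = x.mean(), y.mean()          # luminance terms
    var_x, var_y = x.var(), y.var()          # contrast terms
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()  # structure term

    return ((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) / (
        (mu_x**2 + mu_y**2 + c1) * (var_x + var_y + c2)
    )
```

Production implementations (e.g. in scikit-image) compute SSIM over a sliding local window and average the results; the global variant above captures the same formula in its simplest form.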
At the same time, this speculative use in a surveillance application has further implications; for instance: Do these images still cross some boundary, even if they may not show subjects as those subjects perceive themselves? Will the images be decodable, and if so, by whom, through what chain of events or further components? What material, political, and cultural systems does this imply?