AI & Cognitive Science & Technology & Philosophy & Art

The “Environment” That Raised AI Is Language. And The Mental Models Inside AI Are Statistical Echoes of Billions of Human Minds.


Overview

Echoes of the Minds is an interactive installation that explores how both humans and AI construct reality from partial information.

Through a blend of optical illusions, generative art, and AI-driven emotional interpretation, the piece reveals that human perception and machine perception share the same core traits: reliance on prior experience, pattern recognition, and filtered understanding.

Participants spin laser-cut acrylic discs patterned with generative geometries, casting moving light illusions on the wall — illusions that the human brain “completes” using predictive coding. At the same time, an AI system “watches” the participant, infers their emotional state, and generates its own text-based interpretation — tinted by the iridescent acrylic and projected alongside the visual patterns.

The result is a side-by-side portrait of how two very different systems — the human brain and a neural network — process the same moment, each through the lens of its own biases, memories, and training.

Human Echoes — How the Brain Constructs Reality

In neuroscience, “echoes” can refer to neuronal reverberation — the brain’s ongoing activity after a stimulus is gone. These reverberations shape how we perceive the world, allowing us to fill in gaps, predict what comes next, and make sense of incomplete information.

In Echoes of the Minds, these principles are brought to life through optical illusions generated by motion and light:

• Generative geometric patterns — designed in JavaScript and Python — are laser-cut into acrylic discs.
• When spun, the overlapping shapes and light create depth and layers that do not physically exist, but are perceived as real by the brain.
• This effect is powered by predictive coding — the brain’s tendency to combine incoming sensory data with prior experience to create a “best guess” of reality.

The illusion is not just a trick of the eye — it’s a direct expression of how the brain constructs meaning. Each visual layer, each shifting shadow, is a trace of your brain’s internal “echo” of what it thinks it’s seeing.

Technical Description: Geometry patterns are created through generative coding in JavaScript and Python, leveraging mathematical algorithms and iterative processes to produce designs that become optical illusions when spun.

The piece features laser-cut acrylic boards with intricate generative geometric patterns, designed to create optical illusions in the human visual system when spun via a gear system connected to a hand crank.
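The installation's actual pattern-generation code is not published, but the approach can be sketched in Python: sample a radial curve (here a rose curve, one classic family of overlapping geometry) into a closed SVG path that laser-cutting software can import. The function name, curve choice, and parameters below are illustrative assumptions, not the piece's real algorithm.

```python
import math

def rose_path(petals: int = 7, radius: float = 100.0, steps: int = 720) -> str:
    """Build a closed SVG path for a rose curve r = R * cos(k * theta),
    a family of radial geometry that produces moire-like depth when two
    copies are overlaid and rotated against each other."""
    points = []
    for i in range(steps + 1):
        theta = 2 * math.pi * i / steps
        r = radius * math.cos(petals * theta)
        points.append((r * math.cos(theta), r * math.sin(theta)))
    # Emit an SVG path string that laser-cutter software can import.
    return "M " + " L ".join(f"{x:.2f},{y:.2f}" for x, y in points) + " Z"

if __name__ == "__main__":
    print(rose_path()[:60], "...")
```

Spinning two such discs against each other is what creates the interference patterns the brain then "completes" into depth.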

AI Echoes — How Machines Construct Reality

While the human brain interprets the spinning light patterns through predictive coding, the AI system in Echoes of the Minds is busy forming its own perception of the moment.

A multimodal AI pipeline captures and interprets participants’ emotional states in real time:

1. Sensing — A wireless camera records the participant’s facial expressions.
2. Emotion Analysis — The feed is processed through DeepFace, a facial attribute analysis framework, to classify emotional states.
3. Interpretation — These emotional readings are passed to a fine-tuned GPT-3.5-turbo model, which generates text-based reflections — the AI’s “thoughts” about the participant’s state.
4. Projection — The generated text is projection-mapped through spinning acrylic boards, tinted by their radial iridescent colors.
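The hand-off between steps 2 and 3 can be sketched in a few lines of Python. The emotion dictionary mirrors the shape of the per-class scores that DeepFace's `analyze()` returns for its `emotion` action; the prompt wording and function names here are illustrative assumptions, as the installation's actual fine-tuned prompt is not published.

```python
# Sketch of the sensing-to-interpretation hand-off (step 2 -> step 3).
# The scores dict imitates DeepFace-style emotion output; the prompt
# text is an illustrative stand-in, not the installation's real prompt.

def dominant_emotion(scores: dict[str, float]) -> str:
    """Pick the highest-scoring emotion class from a DeepFace-style dict."""
    return max(scores, key=scores.get)

def build_prompt(scores: dict[str, float]) -> str:
    """Turn the classifier's reading into a prompt for the language model."""
    emotion = dominant_emotion(scores)
    return (
        f"A participant appears predominantly {emotion} "
        f"(confidence {scores[emotion]:.0f}%). "
        "Write a short first-person reflection, as an AI observer, "
        "on what this moment might feel like."
    )

if __name__ == "__main__":
    reading = {"happy": 62.0, "neutral": 30.0, "surprise": 8.0}
    print(build_prompt(reading))
    # The resulting prompt would be sent to the fine-tuned
    # GPT-3.5-turbo model, and its reply forwarded to the
    # projection layer (step 4).
```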

The AI’s perception, like the human’s, is shaped by its own “past experience” — the massive datasets it was trained on. Just as the brain’s illusions are filtered by memory and prior exposure, the AI’s interpretations are filtered by the cultural and statistical biases embedded in its training data.

Projected side-by-side with the optical illusions, these AI “echoes” reveal a striking truth: both human and machine see the world not as it is, but as their histories have taught them to expect it.

Experience Design — Walking Through the Echoes

The installation is designed as a layered encounter, where participants move from curiosity to self-reflection through both human and AI “perceptions” of the same moment.

1. Invitation — Curiosity Sparked
Participants enter a dimly lit space where large acrylic discs, patterned with intricate generative geometries, are mounted on a hand-crank gear system. Soft, shifting light hints at movement, inviting touch and exploration.

2. Human Perception — The Illusion Unfolds
As they spin the discs, overlapping patterns dance on the wall. Depth and motion appear where none exist — a direct experience of the brain’s predictive coding at work.

3. AI Perception — Being “Seen” by a Machine
A wireless camera quietly observes, sending facial expression data to the AI system. Within seconds, the AI produces a text-based emotional interpretation — its own filtered “echo” of the participant’s state — which is projection-mapped onto the same wall. The text glows through the spinning patterns, tinted by the acrylic’s iridescence.

4. Convergence — Two Minds in Parallel
Standing before the wall, participants see the human illusion and the AI’s interpretation side by side — one constructed from biology, the other from algorithms. The juxtaposition invites a moment of reflection: What is real? And how much of what I see — or believe — is just an echo?

Technical Architecture — Bridging Human and AI Perception

The installation integrates physical computing, generative design, and AI-driven interpretation into one seamless interactive loop.

Human Perception Layer

• Generative Patterns — Created in JavaScript and Python using mathematical algorithms to design intricate geometric forms.
• Fabrication — Laser-cut acrylic discs mounted on a custom gear system with a hand crank for tactile engagement.
• Optical Illusions — Depth and motion effects emerge as the discs spin, powered by human predictive coding.

AI Perception Layer

• Sensing — Wireless camera captures facial expressions in real time.
• Emotion Analysis — Processed via DeepFace for emotional classification.
• Interpretation — Passed to a fine-tuned GPT-3.5-turbo model to generate reflective, text-based “thoughts.”
• Projection Mapping — Text sent through OSC and Syphon servers to Processing and MadMapper, then mapped onto the wall through spinning acrylic, tinted by its radial iridescence.
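On the wire, an OSC message is just a padded binary datagram. As a rough sketch of what travels from the Python backend toward Processing, here is a minimal standard-library encoder for an OSC message carrying one string argument; the address `/echo/text` is a made-up example, and in practice a library such as python-osc would handle this encoding and the UDP transport.

```python
def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    data += b"\x00"
    while len(data) % 4:
        data += b"\x00"
    return data

def osc_string_message(address: str, text: str) -> bytes:
    """Encode an OSC message with a single string argument."""
    return (
        osc_pad(address.encode()) +
        osc_pad(b",s") +            # type tag: one string argument
        osc_pad(text.encode())
    )

if __name__ == "__main__":
    # Hypothetical address; the installation's real OSC namespace
    # is not documented.
    packet = osc_string_message("/echo/text", "a quiet kind of wonder")
    print(len(packet), "bytes")
```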

Integration

• The Python backend orchestrates data flow between the sensing system, AI model, and projection mapping tools.
• The spinning acrylic acts as both a physical interface and a dynamic lens, merging the AI’s output with the human-generated visual illusions in real time.

This architecture allows two distinct perceptual systems — human and machine — to interpret the same moment and display their “echoes” in parallel, making the invisible processes of cognition visible and shareable.

Reflection & Critical Questions

Echoes of the Minds reveals that perception — whether biological or artificial — is never a direct copy of reality.

Both human and AI “minds” construct their version of the world through prior experience, pattern recognition, and selective filtering.

In the human brain, predictive coding fills in missing details, creating illusions that feel real.

In AI, neural networks generate interpretations based on patterns learned from vast datasets — equally filtered by the biases and limitations of their training.

When their outputs are placed side by side, we’re confronted with a shared truth: neither sees the world as it is — only as their histories have taught them to expect it.

This raises urgent questions:

• As society races to design superintelligent systems, are we building them in a way that aligns with the values we truly want to preserve?
• What happens when AI’s “echoes” begin to shape — and be shaped by — our own cognitive biases at scale?
• How can we design systems that make these filtering processes visible, so we remain aware of where perception ends and projection begins?


By making the parallels between human and AI perception tangible, Echoes of the Minds invites us to slow down, look closer, and decide — with eyes open — what kind of cognitive futures we want to create.

— The END —
