Kinetic Light Installation

An interactive sculpture that reacts to sound and movement, unveiled at NYC Fashion Week 2024

What it does

This kinetic sculpture combines motion sensors, responsive lighting, and ambient sound feedback to create an immersive art experience. The installation adapts in real time to human proximity, with layers of light patterns that dance across its surface.

Motion Interaction

Uses LiDAR-based gesture tracking with Kalman-filter motion prediction to respond to audience movement with dynamic light patterns.
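
As a rough sketch of the idea, a tracked sensor reading could be mapped to a pattern centre and intensity; the reading fields and the 3 m falloff here are assumptions for illustration, not the installation's actual tracking model.

// Hypothetical sketch: map a tracked position from a motion sensor to a
// light-pattern centre and intensity. The `reading` shape is an assumption.
const toPattern = (reading, ledCount) => {
  // Normalise the tracked x position (assumed 0..1) to an LED index.
  const centre = Math.round(reading.x * (ledCount - 1));
  // Closer visitors (distance assumed in metres) produce brighter light.
  const intensity = Math.max(0, Math.min(1, 1 - reading.distance / 3));
  return { centre, intensity };
};

// Example: a visitor tracked at x = 0.4, 1.2 m away, on a 1200-LED strip.
console.log(toPattern({ x: 0.4, distance: 1.2 }, 1200)); // { centre: 480, intensity: 0.6 }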

Ambient Lighting

Multi-layered LED system that shifts color palettes based on time of day.
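
A minimal sketch of the time-of-day idea in JavaScript; the hour boundaries and hex colours below are placeholders, not the installation's actual palettes.

// Illustrative only: pick a colour palette from the local hour.
const paletteForHour = (hour) => {
  if (hour >= 6 && hour < 12) return ['#ffd9a0', '#ffe9c7', '#fff6e5']; // morning warm tones
  if (hour >= 12 && hour < 18) return ['#a0d8ff', '#c7e9ff', '#e5f6ff']; // daylight blues
  if (hour >= 18 && hour < 23) return ['#ff9ab0', '#b08cff', '#5a4fcf']; // evening magentas
  return ['#0b1d3a', '#16305c', '#22447e'];                              // night deep blues
};

console.log(paletteForHour(new Date().getHours()));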

Sound Integration

Audio-reactive system that maps sound frequency to light intensity variations.
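
A browser-side sketch of how frequency-domain analysis could feed light intensity, using the Web Audio API's AnalyserNode; the microphone source and the plain-average weighting are assumptions, not the installation's audio pipeline.

// Browser-only sketch: read microphone frequency magnitudes with an
// AnalyserNode and fold them into a single 0..1 intensity per frame.
async function startIntensityLoop(onIntensity) {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const audioCtx = new AudioContext();
  const analyser = audioCtx.createAnalyser();
  analyser.fftSize = 1024;                          // 512 frequency bins
  audioCtx.createMediaStreamSource(stream).connect(analyser);

  const bins = new Uint8Array(analyser.frequencyBinCount);
  const tick = () => {
    analyser.getByteFrequencyData(bins);            // magnitude per bin, 0..255
    const avg = bins.reduce((sum, v) => sum + v, 0) / bins.length;
    onIntensity(avg / 255);                         // normalised light intensity
    requestAnimationFrame(tick);
  };
  tick();
}

// Example: log the intensity the lights would receive.
// startIntensityLoop((i) => console.log(i.toFixed(2)));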

Technical Implementation

Custom hardware and software integrated for low-latency live interaction

Core Components

  • 48 custom motion sensors (LiDAR)
  • 1200 WS2812B RGB LEDs
  • Raspberry Pi 4 master controller
  • Node-RED for logic orchestration
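
For illustration, a Node-RED function-node body (JavaScript) of the kind that could sit between the sensors and the LED driver; the message shape, the 25-LEDs-per-sensor zoning, and the 3 m range are assumptions for this sketch, not the deployed flow.

// Illustrative Node-RED function node, assuming sensor messages arrive as
// msg.payload = { sensorId, distance } and downstream nodes expect an LED frame.
const frame = context.get('frame') || new Array(1200).fill(0);

const { sensorId, distance } = msg.payload;
const start = sensorId * 25;                       // 48 sensors -> 25 LEDs per zone
const level = Math.max(0, Math.round(255 * (1 - distance / 3)));

for (let i = start; i < start + 25 && i < frame.length; i++) frame[i] = level;

context.set('frame', frame);
msg.payload = frame;                               // pass the frame to the LED driver node
return msg;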

Algorithm Highlights

  • Kalman filter for motion prediction (see the sketch below)
  • Perlin noise generation for light patterns
  • Frequency-domain audio analysis
  • Real-time light mapping engine
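
As a rough illustration of the first item, a simplified one-dimensional Kalman-style smoother in JavaScript; the noise constants and the ad-hoc velocity term are assumptions for the sketch, not the installation's tuned filter.

// Minimal single-state Kalman-style filter for smoothing a tracked position;
// q and r are placeholder process and measurement noise values.
const makeKalman = (q = 0.01, r = 0.25) => {
  let x = 0, v = 0, p = 1;                 // position, velocity, estimate variance
  return (z, dt) => {
    x += v * dt;                           // predict position forward
    p += q;                                // grow uncertainty
    const k = p / (p + r);                 // Kalman gain
    const innovation = z - x;
    x += k * innovation;                   // correct with the measurement
    v += (k * innovation) / dt;            // crude velocity update
    p *= 1 - k;
    return x;
  };
};

const smooth = makeKalman();
[0.0, 0.11, 0.19, 0.32].forEach((z) => console.log(smooth(z, 1 / 60).toFixed(3)));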

Sample Algorithm

// Generates one frame of the pattern: 600 [a, b] value pairs (each 0..100)
// built from overlapping sine waves and advanced by the current time.
const pattern = (time) => {
  const result = [];
  for (let i = 0; i < 600; i++) {
    const x = i / 20;           // spatial coordinate along the strip
    const y = time + i / 100;   // per-element phase offset in time
    result.push([
      Math.floor(50 + 50 * Math.sin(x + y * 2)),
      Math.floor(50 + 50 * Math.sin(y + Math.sin(x)))
    ]);
  }
  return result;
};
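
The pairs the sample returns are not documented further here, so one possible reading is to treat them as hue-like and brightness-like percentages; the mapping below is an assumption for illustration, not the installation's real mapping engine.

// Hypothetical mapping of the pattern output onto LED colours.
const toLedFrame = (pairs) =>
  pairs.map(([a, b]) => ({
    hue: Math.round((a / 100) * 360),        // 0..360 degrees
    brightness: b / 100,                     // 0..1
  }));

const frame = toLedFrame(pattern(Date.now() / 1000));
console.log(frame.length, frame[0]);         // 600 entries, one per value pair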

Performance Stats

Frame Rate: 60 FPS

Latency: < 150 ms

Live Interaction Demo

Simulate how the installation responds to movement

[Interactive canvas]

*Simulation requires motion input or touch gestures.
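
For readers without the embedded demo, a minimal browser sketch of the same idea, assuming a canvas element with id "demo": pointer movement leaves a fading glow, standing in for the proximity-driven light response. This is an illustration, not the production demo code.

// Minimal pointer-driven simulation sketch, assuming <canvas id="demo">.
const canvas = document.getElementById('demo');
const ctx = canvas.getContext('2d');

canvas.addEventListener('pointermove', (e) => {
  const rect = canvas.getBoundingClientRect();
  const x = e.clientX - rect.left;
  const y = e.clientY - rect.top;

  // Fade the previous frame slightly, then draw a glow at the pointer.
  ctx.fillStyle = 'rgba(0, 0, 0, 0.08)';
  ctx.fillRect(0, 0, canvas.width, canvas.height);

  const glow = ctx.createRadialGradient(x, y, 0, x, y, 60);
  glow.addColorStop(0, 'rgba(120, 200, 255, 0.9)');
  glow.addColorStop(1, 'rgba(120, 200, 255, 0)');
  ctx.fillStyle = glow;
  ctx.beginPath();
  ctx.arc(x, y, 60, 0, Math.PI * 2);
  ctx.fill();
});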

Exhibition Venues: 12+

Estimated Audience: 830K

Interactive Sessions: 42

"The kinetic lighting transformed our event space into a living canvas. Eegrithas's ability to blend engineering with artistry is unparalleled."

John Davis

Event Curator, New City Arts