Tai chi with Microsoft Kinect

Your Wellness Guide

This project merges cutting-edge technology with interactive design to create immersive, enriching experiences. By pairing Microsoft Kinect with TouchDesigner, it transforms simple gestures into visually striking, engaging particle simulations that enhance both physical and digital spaces.

I embarked on this journey during my fourth semester at NID, driven by the urge to explore new technologies. The idea was to take live video and transform it into dynamic particle simulations. Starting with a basic camera, I quickly realized I needed better depth perception and IR scanning, which led me to the Microsoft Kinect. With the Kinect's depth-sensing capabilities, I integrated a machine learning plugin within TouchDesigner to recognize human figures and track palm movements.
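Palm tracking of this kind produces jittery coordinates from frame to frame, so the raw positions usually need smoothing before they drive a particle emitter. The sketch below is purely illustrative Python, not the actual TouchDesigner network; the sample coordinates and smoothing factor are made-up assumptions:

```python
# Illustrative only: exponentially smooth noisy tracked palm positions
# before using them to drive a particle emitter. The sample data and
# alpha value are hypothetical, not taken from the project.

def smooth(samples, alpha=0.3):
    """Exponential moving average over a stream of (x, y) palm positions."""
    out = []
    sx, sy = samples[0]  # seed the filter with the first sample
    for x, y in samples:
        sx = alpha * x + (1 - alpha) * sx
        sy = alpha * y + (1 - alpha) * sy
        out.append((sx, sy))
    return out

# Jittery tracker output is pulled toward the true hand position (1.0, 1.0).
raw = [(1.0, 1.0), (1.2, 0.8), (0.9, 1.1), (1.05, 0.97)]
print(smooth(raw)[-1])
```

Inside TouchDesigner the same taming of tracking data could be done with a Filter CHOP on the Kinect channels; the point here is only the idea, not the exact operators used.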

After numerous tests and observations, I discovered that this setup had practical applications in slow-moving martial arts like tai chi, helping practitioners better visualize their movements. Showcasing it at the SAP Design Exhibition and receiving positive feedback from actual tai chi practitioners validated the project's potential. Further exploration led to an audio-reactive visual display for our college fest, where visuals that tracked the music's transitions heightened audience engagement.

This project not only expanded my technical skills but also deepened my appreciation for the potential of interactive design in creating impactful user experiences.

The Process

Timeline: Feb '24 - Mar '24

Disciplines: Experience Design

Responsibilities: Prototyping

Tools: TouchDesigner, Microsoft Kinect

Research: Desk Research

Ideation: Developing a Solution

Final Designs: Tai Chi

Reflection: Post-Design Outcomes

Challenge

Integrating advanced technology like Microsoft Kinect and machine learning into TouchDesigner to create real-time interactive particle simulations.

Opportunity

Leveraging this technology to enhance user experiences in slow-moving martial arts like tai chi and creating engaging audio-reactive visuals for live performances.

PROCESS HIGHLIGHTS

Design challenge and responsibilities overview

BACKGROUND

The Vision

RESEARCH

Desk Research

How the project was set up on TouchDesigner


IDEATION

Developing a Solution

During the ideation phase, I took a creative yet methodical approach to developing a solution. I started by brainstorming ways to leverage the combined capabilities of TouchDesigner and Microsoft Kinect, with the goal of creating an interactive, visually engaging experience.

I sketched out initial concepts, focusing on how to translate live video into particle simulations. From there, I identified the core technical requirements, such as depth perception and human figure recognition, which led me to incorporate the Kinect for its advanced features.

By exploring different plugins and experimenting within TouchDesigner, I refined the concept. I tested various configurations to ensure the particles responded accurately to user movements, particularly the accumulation of particles on the palms. Throughout this process, user feedback and iterative testing were crucial in shaping the final design, ensuring both functionality and engagement.

IDEATION

Major Improvements + Design Decisions

In the ideation phase, I focused on the major improvements and key design decisions that would determine the project's success. I started by identifying the core features and functionality that would enhance the user experience. Recognizing the importance of depth perception and accurate tracking, I decided to integrate Microsoft Kinect, leveraging its IR depth-scanning capabilities.


One of the significant improvements was incorporating machine learning within TouchDesigner to recognize human figures and detect palm movements. This decision was pivotal in creating a more interactive and engaging experience. I also experimented with different particle behaviors, ultimately choosing to have particles accumulate on the user's palms to create a visually captivating effect.
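The accumulation behavior can be thought of as an attraction force with heavy damping: each frame, particles are pulled toward the tracked palm point and lose velocity, so they settle there instead of orbiting. A minimal illustrative sketch in plain Python, with made-up constants rather than the actual TouchDesigner setup:

```python
# Illustrative sketch (not the project's real particle system): particles
# accelerate toward a tracked palm position and are damped each frame,
# so over time they "accumulate" on the palm. All constants are assumed.

def step(particles, palm, attract=0.15, damping=0.85, dt=1.0):
    """Advance a list of ((x, y), (vx, vy)) particles one frame."""
    out = []
    for (px, py), (vx, vy) in particles:
        ax = (palm[0] - px) * attract   # spring-like pull toward the palm
        ay = (palm[1] - py) * attract
        vx = (vx + ax * dt) * damping   # damping bleeds off velocity
        vy = (vy + ay * dt) * damping
        out.append(((px + vx * dt, py + vy * dt), (vx, vy)))
    return out

palm = (0.5, 0.5)
cloud = [((0.0, 0.0), (0.0, 0.0)), ((1.0, 1.0), (0.0, 0.0))]
for _ in range(60):  # roughly one second at 60 fps
    cloud = step(cloud, palm)
# After enough frames, every particle sits near the palm point.
```

Raising `damping` toward 1.0 would make the particles swirl longer before settling; lowering it makes the accumulation feel stickier. That trade-off is the kind of behavior tuning described above.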


I conducted numerous tests and gathered user feedback to refine these design elements. Observing how users interacted with the system helped me make informed adjustments, ensuring the final design was both intuitive and immersive. These major improvements and design decisions were guided by the goal of enhancing the overall user experience and exploring the full potential of the technology.

FINAL DESIGNS

The Exhibition

In the final phase of the project, I conducted a live exhibition to showcase the interactive experience I had created. I set up the system in a dedicated space, with the Kinect and TouchDesigner projecting the particle simulations onto a large canvas. My colleagues eagerly gathered around as I explained the technology and the concept behind the project.


As each person stepped in front of the Kinect, the spherical particles began to flow around them, accumulating on their palms when they moved. The real-time interaction captivated everyone, prompting them to make various hand movements and experiment with the effects. The experience was enchanting and engaging, with participants expressing amazement and delight.


Throughout the exhibition, I observed how users interacted with the system and gathered valuable feedback. The success of the demonstration was evident as my colleagues started posting about it online, sharing their experiences and videos of the particle simulations. This organic promotion helped spread the word about the project and highlighted the potential of merging technology with interactive design.



REFLECTIONS

Post-Design Outcomes

REFLECTIONS

Pictures from the performance

Leveraging the knowledge gained from working with TouchDesigner, I embarked on a new challenge: creating an audio-reactive display for the college dance performance. Building on my understanding of dynamic particle systems, I focused on synchronizing visual elements with audio inputs to create an immersive experience.


I started by experimenting with various audio-reactive techniques in TouchDesigner, learning how to analyze audio signals and map them to visual effects. Through iterative testing, I refined the system to ensure smooth transitions and captivating visuals that responded to the music’s rhythm and beats.
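As a rough illustration of that mapping, one common approach (assumed here, not necessarily the exact technique used) is to take the RMS loudness of each audio frame and remap it into a clamped 0-1 parameter that can drive particle size or brightness:

```python
# Hypothetical sketch of an audio-reactive mapping: per-frame RMS loudness
# is remapped to a clamped 0-1 "intensity" for the visuals. The frame size
# and mapping range are assumptions, not values from the project.
import math

def rms(frame):
    """Root-mean-square level of one frame of audio samples."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def to_intensity(level, floor=0.0, ceil=0.5):
    """Linearly remap an RMS level into a clamped 0-1 visual parameter."""
    t = (level - floor) / (ceil - floor)
    return max(0.0, min(1.0, t))

# A loud frame drives the visuals harder than a quiet one.
quiet = [0.01 * math.sin(2 * math.pi * i / 64) for i in range(64)]
loud = [0.4 * math.sin(2 * math.pi * i / 64) for i in range(64)]
print(to_intensity(rms(quiet)), to_intensity(rms(loud)))
```

In TouchDesigner this kind of analysis would typically come from an audio analysis CHOP rather than hand-written code; the sketch only shows the signal-to-parameter idea behind the smooth transitions.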


The final setup involved projecting these audio-reactive visuals onto display panels during the performance. As the music played, the visuals transitioned seamlessly, enhancing the atmosphere and engaging the audience. The result was a harmonious blend of sound and visuals that complemented the dance performance without overpowering it.


Reflecting on this project, I realized how valuable my prior experience with TouchDesigner had been. It taught me the importance of iteration, user feedback, and the power of integrating technology with creative expression. This project not only enriched the dance performance but also deepened my appreciation for the potential of audio-visual synchronization in interactive design.

Thank you!
