Algorhythm is an interactive experience that gathers viewers' music tastes and visually synthesizes audio data into a collaborative gallery exhibit. The project generates AI imagery seeded with music trend data drawn from participants.
The audio features behind each piece are calculated with signal processing and machine learning techniques, and they are designed to capture different aspects of the music: its rhythm, timbre, harmony, and emotional content.
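For readers curious how features like these might be computed, here is a minimal sketch using librosa (an assumption; the exhibit does not name its analysis stack) that summarizes a track's rhythm, timbre, harmony, and overall brightness:

```python
# A minimal sketch of the kind of feature extraction described above.
# librosa is an assumption; the project's actual analysis stack is not stated.
import numpy as np
import librosa

def extract_features(path: str) -> dict:
    """Summarize a track's rhythm, timbre, and harmony as compact features."""
    y, sr = librosa.load(path, mono=True)

    # Rhythm: estimated tempo in beats per minute.
    tempo, _ = librosa.beat.beat_track(y=y, sr=sr)

    # Timbre: mel-frequency cepstral coefficients, averaged over time.
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)

    # Harmony: chroma energy per pitch class, averaged over time.
    chroma = librosa.feature.chroma_stft(y=y, sr=sr)

    # Spectral centroid: a rough proxy for brightness and perceived energy.
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)

    return {
        "tempo": float(np.atleast_1d(tempo)[0]),
        "timbre": mfcc.mean(axis=1).tolist(),
        "harmony": chroma.mean(axis=1).tolist(),
        "brightness": float(centroid.mean()),
    }
```

Estimating emotional content is usually done with a learned model trained on mood-labeled music, so it is omitted from this sketch.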
Simply share a favorite song, and the app will generate a visual soundscape of the aura that song may represent, a process I like to call algorithmic synesthesia.
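The exhibit does not spell out how a song's features become an image, but one hypothetical mapping, sketched below with illustrative names, is to hash the feature summary into a deterministic seed and a short descriptive prompt for an image generator:

```python
# Hypothetical sketch: turn extracted features into a deterministic seed
# and a text prompt for an image generator. The real mapping used by
# Algorhythm is not specified; thresholds and wording here are illustrative.
import hashlib
import json

def features_to_artwork_inputs(features: dict) -> tuple[int, str]:
    # Hash the feature summary so the same song always yields the same seed.
    digest = hashlib.sha256(
        json.dumps(features, sort_keys=True).encode("utf-8")
    ).hexdigest()
    seed = int(digest[:16], 16)

    # Translate a few features into descriptive prompt fragments.
    mood = "vivid and energetic" if features["tempo"] > 120 else "soft and contemplative"
    texture = "bright, airy textures" if features["brightness"] > 2000 else "deep, warm textures"
    prompt = f"abstract soundscape, {mood}, {texture}"
    return seed, prompt
```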
Participants may also choose to share their location (no personal information is attached) to further influence their artwork. Even if two people in different places choose the same song, the resulting artworks will differ, so no two pieces are the same.
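One plausible mechanism for this, assumed here since the exhibit does not describe its own, is to fold coarse, rounded coordinates into the seed so identical songs diverge by place while no precise location is retained:

```python
# Assumed mechanism: salt the song's seed with city-level coordinates so the
# same song submitted from different places produces different artwork.
import hashlib

def location_salted_seed(song_seed: int, lat: float, lon: float) -> int:
    # Round to roughly city-level precision; no precise location is kept.
    coarse = f"{round(lat, 1)},{round(lon, 1)}"
    digest = hashlib.sha256(f"{song_seed}:{coarse}".encode("utf-8")).hexdigest()
    return int(digest[:16], 16)
```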
Please enjoy this exhibition and explore the visualized beauty of music. I encourage you to join the gallery and add a song of your own.