Dancing through the vibe code!!
Built a CRT wall that sees, hears, and feels.
Speech → emotion → AI visuals in real time. 8 emotions. 9 languages. 4 caption engines. 300+ audio-reactive shaders. All feeding into
@daydreamlive_ Scope App and back into
@wallspace.studio app for live 3D/VR/XR display anywhere in the world.
For this Scope AI cohort I partnered up with
@matthewisraelsohn @chillquipo, who is profoundly deaf. This reel is from our Virtual VJ session, where he was mixing visuals with me live from London! The bottom timelapse videos are all the vibe coding sessions we did over the last two weeks. Together we built a system where sound becomes visible: captions, colors, and energy, all driving the AI generation. Accessibility isn't an afterthought, it's the architecture.
93 commits. 18 releases. One cohort. Love all over the world.
Check my story for a link to vote! The deadline is EOD today PST at the time of this post (3/23/23). Would love your support, or vote for any other project you find interesting!! Tough competition for sure, such an amazing group!!
More to come on the app and our collab, this is just a taste. It's currently in private beta, but if you're a creator and interested in joining, DM me for details!
#Scope #AIVideo #VJ #vibecode #claude