My ARKit app named “Liquid Hands” 🤷♀️ is in beta testing now. Sign up for my Patreon Free tier to gain access — I’ve only got 90 spots. iOS only. Works on iPhone 13 and above; I don’t know how well it would run on older iPhones. I added one more feature this morning: these laced flower shapes that change their petals based on how wide your fingers are spread.
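The spread-to-petals mapping could look something like this minimal sketch — the joint positions, distance range, and petal range are all my assumptions, not the app's actual values:

```swift
import simd

// Hypothetical mapping: thumb-tip-to-pinky-tip distance (metres) -> petal count.
// The min/max spread and petal ranges here are illustrative guesses.
func petalCount(thumbTip: SIMD3<Float>, pinkyTip: SIMD3<Float>,
                minSpread: Float = 0.02, maxSpread: Float = 0.18,
                minPetals: Int = 3, maxPetals: Int = 12) -> Int {
    let spread = simd_distance(thumbTip, pinkyTip)
    // Normalise the spread to 0...1, clamped to the expected range.
    let t = max(0, min(1, (spread - minSpread) / (maxSpread - minSpread)))
    return minPetals + Int((t * Float(maxPetals - minPetals)).rounded())
}
```

A closed hand would land at the minimum petal count and a fully splayed one at the maximum, with a linear ramp in between.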
Trying to solve an instancing problem with mitosis orbs in ARKit: each touch splits an orb into smaller orbs and plays one note of the Hirajoshi scale. I’m trying to see how efficient I can get it for something else later.
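The note-per-split idea can be sketched as a scale-degree lookup. This assumes one common form of the Hirajoshi scale (several rotations exist) and a hypothetical "generation counter" per split — the actual app's mapping may differ:

```swift
// One common form of the Hirajoshi scale, as semitone offsets from the root.
let hirajoshi: [Int] = [0, 2, 3, 7, 8]

// Hypothetical mapping: each mitosis generation advances one scale degree,
// wrapping up an octave each time the scale repeats. Root 57 = A3 (a guess).
func midiNote(forGeneration gen: Int, root: Int = 57) -> Int {
    let degree = gen % hirajoshi.count
    let octave = gen / hirajoshi.count
    return root + 12 * octave + hirajoshi[degree]
}
```

Generation 0 plays the root; generation 5 wraps to the root an octave up, so repeated splits climb the scale indefinitely.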
The biggest mystery to me is still why Apple chose to make their Vision hand-tracking model 2D on iOS. Reprojecting the positions into 3D is one of those problems I’ve probably sunk 200 hours into, and it’s still very fragile.
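The core of that reprojection is a pinhole back-projection: a 2D joint position plus a depth estimate plus the camera intrinsics gives a camera-space 3D point. This is a minimal sketch of just that step — the hard, fragile part the post describes is getting a reliable depth value per joint, which this deliberately leaves as an input:

```swift
import simd

// Back-project a 2D pixel into camera space, given an assumed depth (metres)
// and a 3x3 intrinsics matrix (e.g. from ARFrame.camera.intrinsics).
// simd matrices are column-major, so K[column][row].
func backProject(pixel: SIMD2<Float>, depth: Float,
                 intrinsics K: simd_float3x3) -> SIMD3<Float> {
    let fx = K[0][0], fy = K[1][1]   // focal lengths in pixels
    let cx = K[2][0], cy = K[2][1]   // principal point
    // Pinhole model: x = (u - cx) * z / fx, y = (v - cy) * z / fy
    return SIMD3<Float>((pixel.x - cx) * depth / fx,
                        (pixel.y - cy) * depth / fy,
                        depth)
}
```

A point at the principal centre lands on the optical axis; everything else scales with depth, which is why a noisy depth estimate makes the whole reconstruction wobble.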
🎧🔊I coded this in @p5xjs this evening: a sound-bath sketch where interactive watercolor dynamics are each mapped to a note of the Hirajoshi scale and the fluid mechanics are expressed as MIDI controller (CC) values, then sent from the browser to @ableton, where I gave them a sampled hocketing voice.
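Turning a continuous fluid quantity into a MIDI controller value is just a clamp-and-scale into the 7-bit CC range. The original sketch is p5.js, but here's the same mapping in Swift for consistency with the ARKit posts — the value range is a placeholder:

```swift
// Map a continuous simulation value (e.g. fluid velocity) into 0...127,
// the range of a MIDI control-change message. lo/hi are assumed bounds.
func midiCC(from value: Double, lo: Double, hi: Double) -> UInt8 {
    let t = max(0, min(1, (value - lo) / (hi - lo)))   // normalise to 0...1
    return UInt8((t * 127).rounded())                  // 7-bit CC range
}
```

Values outside the expected range clamp rather than wrap, which keeps the Ableton side from jumping when the simulation spikes.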
🔊🎧I added a new feature to the fabric-physics section of the ARKit app I’m building for iOS: collapse into a flower shape. I also added a stable record/capture flow, so I can finally release a beta of these.
🔊You came here for broken ARKit demos and boy can I deliver broken ARKit demos. I took this one out the other day and everyone was upset with me immediately.
🔊I added volumetric particles that react to the live microphone input, in real time, on my old iPhone; this is an ARKit app that I am building “by hand” (lol), not using LLMs.
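The amplitude side of mic-reactive particles usually reduces to an RMS level per audio buffer, mapped onto some particle parameter. A minimal sketch of that analysis step — in the real app the samples would come from something like an AVAudioEngine input-node tap, and the gain/scale values here are placeholders:

```swift
// Root-mean-square level of one buffer of audio samples (0 for silence).
func rms(_ samples: [Float]) -> Float {
    guard !samples.isEmpty else { return 0 }
    let sumSquares = samples.reduce(0) { $0 + $1 * $1 }
    return (sumSquares / Float(samples.count)).squareRoot()
}

// Hypothetical mapping from level to a particle scale factor.
func particleScale(rms level: Float, base: Float = 1.0, gain: Float = 4.0) -> Float {
    base + gain * level
}
```

Silence leaves the particles at their base scale; louder buffers push them up linearly, and you'd typically smooth the RMS over a few buffers to stop the visuals flickering.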
🔊🎧You cannot imagine how difficult it is to get ARKit / Swift audio-analysis engine protocols to run while simultaneously monitoring and recording live audio, lol, but I finally did it! This is live, realtime AR on iOS: not compositing.
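One way that coexistence problem is often tackled is by configuring the shared AVAudioSession for simultaneous play-and-record and letting ARKit deliver the microphone buffers itself. A sketch under those assumptions — the category, mode, and options that actually work depend on the rest of the audio graph, and this is not necessarily what the app does:

```swift
import ARKit
import AVFoundation

// Sketch: a play-and-record session plus ARKit-delivered audio buffers.
func configureAudioForAR(session: ARSession) throws {
    let audio = AVAudioSession.sharedInstance()
    try audio.setCategory(.playAndRecord,
                          mode: .measurement,          // raw, unprocessed input
                          options: [.defaultToSpeaker])
    try audio.setActive(true)

    let config = ARWorldTrackingConfiguration()
    config.providesAudioData = true   // mic CMSampleBuffers arrive via
                                      // session(_:didOutputAudioSampleBuffer:)
    session.run(config)
}
```

With `providesAudioData` on, ARKit owns the capture and hands you sample buffers on its delegate, which sidesteps some (not all) of the fights between the AR session and a separately configured recording engine.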
🔊I added a bunch of features this morning to this part of the reality-distortion ARKit Air Harp app that I’m building. This is not video editing: it is realtime AR running on an old iPhone, in an ARKit app I am making. This video is not AI, and I did not use LLMs to make it.