We recently had the pleasure of hosting @a.m.architect_ ’s release + AV show for their latest record Avenir. Here are some images of the evening from @vvilmstock . The record can be heard at https://bit.ly/avenirLP + we highly recommend doing so. Cheers to all who attended.
This Friday we are releasing a new @a.m.architect_ album called Avenir, and to celebrate we are also doing a show at @dadalab.io in Austin. We’ve been working on this release for a long time and I’m super excited to share it with everyone. We’re creating interactive audio-visual installations to go with each song, and Friday’s show will feature one of those installations followed by a live set as well as a DJ set from @king.khary . Come check it out if you are around! Just really happy to have this new album out in the world. You can check out album preorders on amarchitect.bandcamp.com 🍻
#DrinkThisAndVote
Artist: Daniel Stanush • @mr_daniel_
🗳️ To cast your vote for 𝗧𝗛𝗜𝗦 2024 Drink This & Vote submission, 𝗦𝗜𝗠𝗣𝗟𝗬 𝗟𝗜𝗞𝗘 𝗧𝗛𝗜𝗦 𝗣𝗢𝗦𝗧. Help boost this design by sharing it with your friends and getting more votes in! 📈
🎖️ 𝗙𝗢𝗨𝗥 submissions with the most likes (votes) will be featured on the Drink This & Vote Hazy IPA, brewed by @opbrewco 🍺!
🎉 Winners Announced: Oct. 11, 2024
Birds of a feather - some doodles in procreate. Always wanted to illustrate children’s books. Maybe 2025 will be the year 🤓
If you are interested in authoring a children’s book and need an illustrator, remember your friend Daniel!!
To create the music video for “Hydra”, we developed a unique workflow: we generated custom style codes in Midjourney for continuity, then generated over 1,500 images to build out the characters and actions. Next we moved into Runway to create video versions of selected images. From there we built a custom TouchDesigner network that switched between videos using beat detection, as well as selecting shots via MIDI control. We then ran all of this through an analog VJ board to soften the crispness of the generated video and add glitch feedback effects. For more info on “Hydra” check out our website in our profile LinkTree. URL: amarchitect.ai/films/hydra
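For the curious: the beat-triggered clip switching can be pictured as a small loop like the one below. This is only an illustrative sketch, not our actual TouchDesigner network — the function names and the 0.8 energy threshold are hypothetical.

```python
# Hypothetical sketch of beat-detection clip switching.
# An audio "energy envelope" (one value per frame) drives cuts:
# each upward threshold crossing counts as a beat, and every beat
# advances playback to the next clip in a cycle.

def detect_beats(energy, threshold=0.8):
    """Return frame indices where the energy envelope crosses
    the threshold upward (a crude onset/beat detector)."""
    beats = []
    for i in range(1, len(energy)):
        if energy[i] >= threshold and energy[i - 1] < threshold:
            beats.append(i)
    return beats

def switch_clips(energy, n_clips, threshold=0.8):
    """Advance to the next clip on every detected beat; returns
    the clip index active at each frame."""
    beats = set(detect_beats(energy, threshold))
    clip, timeline = 0, []
    for i in range(len(energy)):
        if i in beats:
            clip = (clip + 1) % n_clips
        timeline.append(clip)
    return timeline
```

In the real piece this logic lives inside TouchDesigner, with MIDI control layered on top for manual shot selection.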
Vocals: @jedcraddock_music
Recorded at @earbender_studios
// Hydra offers a glimpse into a place where technology has evolved to offer near-limitless creation, and a group of elusive tech-savants who turn their abilities inwards to create a new vision of themselves - will technology lead them to nirvana, or is their self-experimentation blurring their essence and identity?
// Hydra was created with generative, machine-learning technology. The workflow moved from storyboard to Midjourney image generation, producing over 1,500 still images, then to video using @runwayapp and Stable Diffusion. That footage was run through a @touchdesigner network that switched between videos in synchronization with the music, before finally passing through a circuit-bent @bpmcanalog video mixer to add noise, grain and glitch effects.
Special thanks to @jedcraddock_music for lending his amazing vocal talents to this song.
To see the full video and more photos and behind-the-scenes video, check out the link in our bio or visit: amarchitect.ai/films/hydra
A few weeks back we joined up with @exploremidtown and @faithjmckinnie to bring an interactive art experience to an event. We used cameras to capture video of participants and then generated, in real time, unique flowers based on their poses. We then compiled those flowers to create a garden of dancing flowers. The piece ran on @touchdesigner with MediaPipe and StreamDiffusion, a custom LoRA for the flower aesthetic, and a network of cached video to create a garden that was projected at the event.
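If you're wondering how a pose becomes a flower: MediaPipe-style pose tracking gives normalized body landmarks, and those can drive generation parameters. The mapping below is purely a hypothetical sketch to show the idea — the parameter names and ranges are made up, not what our installation actually used.

```python
# Hypothetical pose-to-flower mapping. MediaPipe-style landmarks are
# (x, y) pairs normalized to 0..1; here two wrist positions drive
# illustrative flower parameters for an image generator.

def flower_params(left_wrist, right_wrist):
    """Map two (x, y) landmarks to flower parameters:
    arm spread -> petal count, average arm height -> stem length."""
    spread = abs(right_wrist[0] - left_wrist[0])            # 0..1
    height = 1.0 - (left_wrist[1] + right_wrist[1]) / 2.0   # 0..1, up is high
    return {
        "petals": 5 + round(spread * 7),  # 5..12 petals
        "stem": round(height, 2),         # normalized stem length
    }
```

In the installation, parameters like these would condition the StreamDiffusion/LoRA pipeline each frame, so every dancer grows a different flower.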
Last week we debuted a new installation, “Svara-Svapna” at @texaspublicradio in San Antonio. We designed this installation to be part of @makemusicday
Svara-Svapna allows participants to create musical compositions by interacting with a projected grid of shapes. Through simple gestures and body movements, a participant can compose unique pieces voiced by an array of different instruments. We built the visuals and backend in @touchdesigner and created the audio components in @ableton , with some amazing tones via @feltinstruments 🤓
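One way to picture the grid-to-sound idea: each touched cell maps to a pitch and an instrument. The sketch below is a hypothetical illustration (a pentatonic mapping we're using as an example), not the actual TouchDesigner/Ableton patch.

```python
# Hypothetical grid-to-note mapping for a Svara-Svapna-style grid.
# Columns walk up a pentatonic scale (so any combination sounds
# consonant); rows select an instrument slot.

PENTATONIC = [0, 2, 4, 7, 9]  # semitone offsets, major pentatonic

def cell_to_note(row, col, root=60, n_rows=4):
    """Map a touched grid cell to (MIDI note number, instrument index).
    root=60 is middle C."""
    octave = col // len(PENTATONIC)
    degree = col % len(PENTATONIC)
    note = root + 12 * octave + PENTATONIC[degree]
    instrument = row % n_rows
    return note, instrument
```

Pentatonic mappings are a common choice for public installations because there are no "wrong" notes — whatever participants trigger stays in key.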
We released “Color Field” in 2019 on @79ancestors - this release deepened our love of interweaving music and visual art. We created an award-winning short film as part of the release, and in collaboration with @deru and @effixx we created the Spectrasphere - a wholly unique artifact that paired abstract visuals with each song and allowed the viewer to interact with the art through hardware created by the amazing @bpmc_glitch . This album really helped us find our way as multimedia composers.
Track list // 1. Autonoe 2. Sofia 3. Color Field 4. Innervisible 5. Azure 6. Topaz 7. Theia 8. Fawn 9. Rivers 10. Vermillion
A few weeks back we joined up with @exploremidtown and @faithjmckinnie to bring an interactive art experience to an event. We used pose tracking to identify people in a video feed and then generate unique flowers based on their poses. We used @touchdesigner running MediaPipe and StreamDiffusion, a custom LoRA for the flower aesthetic, and a network of cached video to create a garden that was projected at the event.
Two shows in Austin this weekend at @dadalab.io . Friday night we will be debuting some new @amarchitect_ music and visuals, then Saturday we will be running live visuals for the entire evening - getting the most out of dadaLab’s amazing AV setup 🤓