
Benjamin Lappalainen

@blapcode

šŸ‘¾ Creative Technologist • Artist • Educator 🐐 XR Development Lead @ukaiprojects šŸŽ Program Lead @softlaunch_net šŸ—ŗļø Toronto, Canada
Followers
775
Following
513
šŸ”“šŸŸ¢šŸ”µ RGBoids is a procedurally generated artwork that simulates multiple flocks of ā€œboidsā€ (bird-oid objects) flying around a 64x64 LED matrix, diffused with frosted acrylic.

🪶 Each colour of boid (red, green, blue) has different preferences that control how it behaves, including the number of neighbours it will pay attention to, how much it will wander, and the range at which it will attempt to flock.

šŸ•Šļø The motion of the boids is calculated in real-time from a simple set of rules and is non-deterministic - the piece will run infinitely without repeating.

šŸ“ Come see it in person at the @interaccessto flashDRIVE Digital Arts Fundraiser exhibition (April 22-May 9), curated by @skyfinefoods, along with the work of many talented artists from in and around Toronto! Poster by @pegahpeivandi ✨

šŸŽ‰ Opening party on April 22, 7-10pm at 32 Lisgar St!

šŸ‘¾ The logic for the piece is written in CircuitPython, running on an ESP32. LED matrix and driver board by @adafruit, custom-cut frosted acrylic by @dangdesigns.ca ✨
50 2
29 days ago
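The ā€œsimple set of rulesā€ behind flocking boids is classically three steering behaviours (separation, alignment, cohesion) plus, here, a wander term. Below is a minimal Python sketch under that assumption - the per-colour preset values, the same-colour flocking, and all names are illustrative, not the actual RGBoids CircuitPython source:

```python
import math
import random

# Per-colour behaviour presets -- illustrative values only, not the
# parameters used in the actual piece.
PRESETS = {
    "red":   {"neighbours": 5, "wander": 0.3, "range": 10.0},
    "green": {"neighbours": 3, "wander": 0.6, "range": 6.0},
    "blue":  {"neighbours": 7, "wander": 0.1, "range": 14.0},
}

SIZE = 64  # 64x64 LED matrix

class Boid:
    def __init__(self, colour):
        self.colour = colour
        self.x, self.y = random.uniform(0, SIZE), random.uniform(0, SIZE)
        self.vx, self.vy = random.uniform(-1, 1), random.uniform(-1, 1)

def step(boids):
    for b in boids:
        p = PRESETS[b.colour]
        # Nearest same-colour neighbours within this colour's flocking
        # range (assumes each colour flocks only with itself).
        near = sorted(
            (o for o in boids
             if o is not b and o.colour == b.colour
             and math.hypot(o.x - b.x, o.y - b.y) < p["range"]),
            key=lambda o: math.hypot(o.x - b.x, o.y - b.y),
        )[: p["neighbours"]]
        if near:
            # Cohesion: steer toward the neighbours' centre of mass.
            cx = sum(o.x for o in near) / len(near)
            cy = sum(o.y for o in near) / len(near)
            b.vx += 0.01 * (cx - b.x)
            b.vy += 0.01 * (cy - b.y)
            # Alignment: nudge velocity toward the neighbours' average.
            b.vx += 0.05 * (sum(o.vx for o in near) / len(near) - b.vx)
            b.vy += 0.05 * (sum(o.vy for o in near) / len(near) - b.vy)
            # Separation: push away from very close neighbours.
            for o in near:
                d = math.hypot(o.x - b.x, o.y - b.y)
                if 0 < d < 2.0:
                    b.vx -= (o.x - b.x) / d
                    b.vy -= (o.y - b.y) / d
        # Wander: random jitter, the source of the non-determinism.
        b.vx += random.uniform(-p["wander"], p["wander"])
        b.vy += random.uniform(-p["wander"], p["wander"])
        # Clamp speed, then move and wrap around the matrix edges.
        speed = math.hypot(b.vx, b.vy) or 1.0
        if speed > 2.0:
            b.vx, b.vy = 2.0 * b.vx / speed, 2.0 * b.vy / speed
        b.x = (b.x + b.vx) % SIZE
        b.y = (b.y + b.vy) % SIZE

# Run a small flock for a few frames.
flock = [Boid(c) for c in PRESETS for _ in range(20)]
for _ in range(100):
    step(flock)
```

On an actual ESP32 each frame's positions would be quantized to integer pixels and drawn to the matrix; the O(n²) neighbour search is fine at this flock size.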
Benjamin Lappalainen (@blapcode) is a new media artist, creative technologist, and educator based in Toronto. His interactive installations expose technology’s mechanisms rather than hiding them, asking participants to negotiate with systems instead of passively consuming them. He teaches workshops, builds technical infrastructure for cultural projects, and makes art that’s playfully critical about our relationship with machines.

šŸ¤– Benjamin is very excited to have the opportunity to nerd out about code and TouchDesigner, and to share his creative technology skills and experience in the soft_launch weekend intensives! šŸ‘¾
73 0
3 months ago
šŸŒ€ time to be perceived šŸŒ€

Come join me in the studio May 14th, 4-5pm, for an informal showing of amoeba at Studio 693 Dance Annex. The piece will be shown twice (4-4:30 and 4:30-5), with a mini dialogue and feedback session after each.

Pay-what-you-can tickets (recommended $5 min - I’m self-funding this creation, so every bit helps) šŸ’™

I’m so excited to share this awesome world I’ve been cooking up with some amazing artists šŸ«‚ Send me a dm or email at [email protected] to let me know ur coming šŸ¦‹
0 8
18 days ago
@touchdesigner šŸ‘¾ Hope your brain is feeling as wrinkly or smooth as it takes to get you through the day 🧠

Playing with POPs and dynamic lighting + some different filtering techniques with a single-camera setup šŸŽ„

Finding some unusual ways in which my photography and new media practices have begun to blend together - the halation effect (red glow on highlights) is something I emulated because I liked the look from when I shot @cinestillfilm some time ago šŸ“ø

I’ve been having a blast playing with this 3D model of a brain for iterating on some visual aesthetics - the interactive side of things will return soon in full force šŸ‘‹šŸ’”

#livevisuals #newmedia #touchdesigner #creativecoding
29 0
4 days ago
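The halation look mentioned above (a red glow bleeding off bright highlights, as on CineStill film) can be approximated by thresholding the image's luminance, blurring the resulting mask, and adding it back into the red channel only. A rough NumPy sketch - the threshold, strength, and blur radius are made-up values, and in TouchDesigner this would be built from TOPs rather than Python:

```python
import numpy as np

def box_blur(channel, radius=2):
    """Crude box blur via shifted copies (wraps at edges -- fine for a sketch)."""
    out = np.zeros_like(channel)
    n = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            out += np.roll(np.roll(channel, dy, axis=0), dx, axis=1)
            n += 1
    return out / n

def halation(img, threshold=0.8, strength=0.5):
    """img: float RGB image in [0, 1], shape (H, W, 3)."""
    luma = img.mean(axis=-1)
    # Keep only how far each pixel's brightness exceeds the threshold.
    mask = np.clip(luma - threshold, 0.0, None)
    glow = box_blur(mask)
    out = img.copy()
    # Halation bleeds into the red channel only: highlights glow red.
    out[..., 0] = np.clip(out[..., 0] + strength * glow, 0.0, 1.0)
    return out
```

A real implementation would use a Gaussian blur and likely work in linear light, but the red-channel-only bloom is the essence of the effect.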
ā€œThe marble index of a mind forever Voyaging through strange seas of Thought, alone.ā€ (Wordsworth, 1850)

This is fsaverage5 - a ā€œstandard brainā€ template that captures the general shape of the human cortex. I converted it from its original neuroimaging surface format to a standard OBJ file and imported it into @touchdesigner to create some visuals with it in real-time using POPs šŸ‘¾

My end goal is to use the (somewhat unsettling) TRIBE v2 brain encoding model, which predicts how an average brain responds to video/audio/text content, and pipe that into TouchDesigner to visualize the activity on this model. I’ve yet to successfully run the full pipeline without crashing my computer, so some Ghost in the Shell-inspired visuals will have to do for now šŸ¤–

šŸ“© DM or email for commissions (data visualization, motion design, interactive installations).

Thank you to @visualcodepoetry for so many fantastic resources that have supported my own TouchDesigner journey such that I can now teach others šŸ’”

#touchdesigner #livevisuals #interactiveart #newmedia
41 13
12 days ago
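The surface-to-OBJ conversion above boils down to serializing a triangle mesh: OBJ is plain text, one `v x y z` line per vertex and one `f i j k` line per face, with 1-based indices. A minimal sketch - in practice the vertices and faces would come from the FreeSurfer surface file (e.g. via nibabel's `freesurfer.read_geometry`, an assumption; the post doesn't say which tool was used):

```python
def mesh_to_obj(vertices, faces):
    """Serialize a triangle mesh to Wavefront OBJ text.

    vertices: iterable of (x, y, z) floats
    faces:    iterable of (i, j, k) 0-based vertex indices
    """
    lines = ["v {:.6f} {:.6f} {:.6f}".format(*v) for v in vertices]
    # OBJ face indices are 1-based, so shift every index up by one.
    lines += ["f {} {} {}".format(i + 1, j + 1, k + 1) for i, j, k in faces]
    return "\n".join(lines) + "\n"

# Toy mesh: a single triangle.
obj_text = mesh_to_obj(
    [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)],
    [(0, 1, 2)],
)
```

Writing `obj_text` to an `.obj` file gives something TouchDesigner (or any 3D tool) can import directly.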
Views from the Machine
Developed and Facilitated by @blapcode & @luisalyji

Views from the Machine is a playful interpretation of how we can reclaim our attention, foster agency, and recognize that what the machine ā€œseesā€ often extends beyond the picture presented to ā€œit.ā€

At UKAI Projects, we turn abstract ideas into engaging experiences. When it comes to challenging topics like AI, we work from both technical and philosophical angles to untangle the messy reality of AI adoption and implementation in cultural work.

Our program participants walk away with:
* A personal framework for engaging critically and creatively with digital technologies
* Clarity in articulating their own values and perspectives on algorithmic systems

You can collaborate with us to develop experiences tailored to the challenges you face - such as workplace culture, innovation, public life, tech education, and more. You can also select from our previous projects and learning programs to apply in new contexts.
16 0
1 month ago
Digest: Views from the Machine #5
Art and the Art of (not) Being Seen
Developed and Facilitated by @blapcode & @luisalyji

UKAI Projects is run by artists. We each have our own practices and methods of making sense of the world. Where we converge is that art is a place where we learn about rules, bend them, and make our own.

When AI is sold to us through its speed and efficiency in ā€œgenerating art,ā€ art as a spectacle to be consumed or as pure technical excellence becomes less relevant. Art, as a conduit for agency, relationship-building, and attention, creates new possibilities for how we want to live in an AI-saturated world.

We asked participants to think of a game we could play while being ā€œwatchedā€ by one of the models we tested out, and to specify the conditions for winning. With simple rules, we jumped into action to play the game of ā€œhow not to be a person.ā€

---------------

Message UKAI Projects for more information about collaborating with us to develop experiences tailored to the challenges you face - such as workplace culture, innovation, public life, tech education, and more. You can also select from our previous projects and learning programs to apply in new contexts.
29 0
1 month ago
For all grievers: what do you need? And what do you need to bury?
67 3
1 month ago
Digest: Views from the Machine #4
Indexing AI
Developed and Facilitated by @blapcode & @luisalyji

AI is everywhere. It is presented to us as a monolith, with its own mythological qualities. Along with the opportunity to take a peek at a handful of machine vision tools, we encouraged indexing these tools, which would otherwise be communicated as simply ā€œAI,ā€ even though their use cases have specific real-life implications.

What is its purpose? Who and what can it help? In what ways could it be misused? What are some of its alternatives?
14 1
1 month ago
Digest: Views from the Machine #3
Attention (machine)
Developed and Facilitated by @blapcode & @luisalyji

How machines ā€œseeā€ the world is often heavily anthropomorphized. The assumption that what a vision model processes is similar to human ocular processing creates a gap in comprehending how machine vision technologies are integrated into other systems. These systems can be technological and automated, or cultural, political, and social.

Views from the Machine highlights a brief selection of tools, each with specific use cases, to help participants better understand the implications of each ā€œAI.ā€ We explored examples including a self-supervised vision transformer, purpose-specific detection models, and a visual language model to bring everything together.
24 1
1 month ago
Digest: Views from the Machine #2
Attention (human)
Developed and Facilitated by @blapcode & @luisalyji

When you meet someone for the first time, what catches your attention? There is no beginning or end to this process of ā€œpaying attention,ā€ as we continually try to make sense of our context. We take notes, make quick sketches, use verbal cues, and mark other psychological landmarks, sometimes with our phones’ cameras, to capture what we focus on daily.

Attention is a skill that can be learned and developed, and it is just as easily hacked, overloaded, or misdirected.
38 0
1 month ago
Digest: Views from the Machine #1
Attention
Developed and Facilitated by @blapcode & @luisalyji

When we speak of AI, we tend to relate the term to a broad philosophical concept. In many cases, we refer to AI systems by their brand names: ChatGPT, Gemini, Copilot, Claude, etc. When we describe AI, we often use language that humans can relate to. Attention is one such term.

In 2017, researchers at Google co-authored ā€œAttention Is All You Need,ā€ the paper that introduced the transformer, a machine learning architecture. This rapidly accelerated the mainstream applications of machine learning. The ā€œGPTā€ in ChatGPT stands for ā€œGenerative Pre-trained Transformer,ā€ an example of how we encounter this architecture in our daily lives. GPTs are primarily known for their applications in language models, but the architecture also applies to images and audio.

Today, the familiar ā€œbrand-name AIā€ products built upon this architecture are capable of complex, multi-step reasoning and of processing and generating text, images, audio, and video in mere seconds.

Views from the Machine is a playful interpretation of how we can reclaim our attention, foster agency, and recognize that what the machine ā€œseesā€ often extends beyond the picture presented to ā€œit.ā€
19 6
1 month ago
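The ā€œattentionā€ the 2017 paper is named after has a compact mathematical core: each query scores every key, the scores are softmaxed into weights, and the output is the weighted mix of the values. A toy NumPy sketch of scaled dot-product attention - illustrative only, not code from any actual GPT:

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)  # for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: each row sums to 1
    return weights @ V, weights                   # weighted mix of values

# Toy example: 3 tokens, 4-dimensional embeddings (self-attention, Q = K = V).
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4))
output, weights = attention(X, X, X)
```

Each row of `weights` says how much each token ā€œpays attentionā€ to every other token - the mechanism the workshop borrows as a metaphor for human attention.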