
James
@jperl
Code. Math. Art. Music.
Followers: 45.3k · Following: 1,406
@drinkpoppi this summer YOU are my #1 😍 💃 @summergerardi 🎨 @elenakulikovastudio 🎬 @jperl
588 likes · 83 comments · 1 year ago
Check this out 👀 This is a NeRF, or Neural Radiance Field. It is an algorithm which attempts to learn the most likely 3D representation that matches a series of photos captured from different angles. Basically @jperl swam in circles around @sharker.parker for 5 min taking as many photos as possible. These photos are stitched together to create a 3D model! Epic! #underwaterphotography #underwatervideo #3d #3dart #3drender #3dvisualization
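The caption above summarizes what a NeRF does; the core of the rendering step is alpha-compositing density and color samples along each camera ray. This is a minimal NumPy sketch of that quadrature (not the author's code; the sample values are made up for illustration):

```python
import numpy as np

def composite_ray(sigmas, colors, deltas):
    """Alpha-composite color samples along one ray (NeRF's volume rendering quadrature).

    sigmas: (N,) volume densities at N samples along the ray
    colors: (N, 3) RGB values at those samples
    deltas: (N,) distances between adjacent samples
    """
    alphas = 1.0 - np.exp(-sigmas * deltas)       # opacity of each segment
    trans = np.cumprod(1.0 - alphas + 1e-10)      # transmittance *after* each sample
    trans = np.concatenate([[1.0], trans[:-1]])   # light surviving *up to* each sample
    weights = trans * alphas                      # contribution of each sample
    return (weights[:, None] * colors).sum(axis=0)

# Toy example: a ray passing through a dense red region, then a faint blue one.
sigmas = np.array([0.0, 5.0, 5.0, 0.1])
colors = np.array([[0, 0, 0], [1, 0, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
deltas = np.full(4, 0.5)
rgb = composite_ray(sigmas, colors, deltas)  # dominated by red
```

The dense red samples absorb almost all the light, so the faint blue sample behind them contributes nearly nothing; that occlusion behavior is what lets a trained NeRF reproduce solid surfaces.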
154 likes · 8 comments · 2 years ago
A #HydnellumPeckii captured in 3D. Mendocino, CA, USA. . Also called the Bleeding Tooth Fungus or Strawberries and Cream, Hydnellum peckii produces drops of red-colored liquid called guttation on its fruiting bodies. These are natural metabolites; you can think of it as the mushroom sweating as it bulks up! Honestly this thing could star in a Gatorade ad. . The 3D reconstruction was done with @nvidiaai #instantNeRF, and involved crawling around on the forest floor capturing all angles of this patch of fungus with a macro camera. . This has to be one of our favorite finds ever. . #mycology #neuralrendering #mycotechnology
1,930 likes · 25 comments · 3 years ago
do I look pretty? #facezoom
1,618 likes · 46 comments · 3 years ago
@fascinatedbyfungi and I went out hunting to document some mushrooms in 3D! . Reconstruction uses #neuralrendering to achieve photorealism via @nvidiaai #instantNeRF . From Dr. Gordon Walker (@fascinatedbyfungi): . These bizarrely shaped fungal vases are the #WoolyChantrelle or #TurbinellusFloccosus. They are mycorrhizal with conifers and fruit from fall through winter along the Pacific coast. They are often impressively large mushrooms that fruit in prodigious numbers (even forming fairy circles). These are young specimens just starting to come into their own and will continue to grow upwards and outwards as long as conditions stay cool and humid. . Turbinellus are impressively dense, meaty-looking mushrooms, but are generally disregarded as edible because they contain a fatty acid (α-tetradecylcitric) that can cause GI upset in some people. Despite this, it is a widely consumed mushroom around the world, prized for its texture and sour flavor. I have tried the California version and found it to be fairly sour and not entirely pleasing, although the texture was good. I would suggest peeling the gills and red part off, then washing and cooking thoroughly (maybe even grill or whole roast). The organic acid is generally chronically toxic vs acutely toxic. Still, be careful if you try eating this mushroom. . These cartoonish fungal cones also produce a potent antifungal compound (oxylipin) that inhibits several pathogenic fungi (helping the fruiting body to persist for longer). T. floccosus also contains pistillarin, a derivative of spermidine (this compound occurs in wine; guess what it smells like), a bioactive compound that inhibits DNA damage by hydroxyl radicals. The Wooly Chanty used to be classified close to the Pigs Ear/Gomphus, but was reclassified into its own genus after DNA sequence comparisons. I’m always happy to find these as they usually herald the arrival of other, more delicious edible mushrooms.
1,617 likes · 18 comments · 3 years ago
@jperl and I came together to make this shameless mushroom thirst trap (song is Unholy by Sam Smith and Kim Petras). These mushrooms are rendered with novel 3D image capture techniques being pioneered by folks like @jperl. Represented here in this video we have a bloody #DevilsTooth #HydnellumPeckii (mycorrhizal, inedible), a solid meaty #WhiteChanterelle #Cantharellussubalbidus (mycorrhizal, delicious edible), a slimy #PineSpike #Gomphidius (parasitic on Suillus, mediocre edible), a highlighter-yellow #PowderySulfurBolete #Pulveroboletusravenelii (mycorrhizal, mediocre edible), and nascent tubes of the #ScalyChanterelle #Turbinellusfloccosus (mycorrhizal, mildly toxic, mediocre edible). . It’s so cool to be able to zoom around these mushrooms in 3D; can’t wait to capture more mushrooms in their full form. Absolutely amazed at the technology and people like @jperl who make it all come together. So cool to see this, definitely check out his TikTok and Twitter accounts for more cool content like this. . #mushroom #mushrooms #3dmushroom #mushroomart #mushroomthirsttrap #mushroomsinthewoods #3dart #3dartist #mycology #fungi #fungiphotography #neuralrendering #mushroomvideo #mushroomnft
4,078 likes · 50 comments · 3 years ago
👰‍♀️ - @elenakulikovastudio 🤵‍♂️ - @stephenleonhard TL;DR - a public message of personal content, technical description at the very bottom. Elena and Stephen, I love you guys so much. Congratulations on your marriage. My gift to you is this 3D capture of a moment frozen in time, right before your beautiful ceremony. I actually cried at this wedding. I have known Elena since I moved to LA about 3 years ago, and in that time I have seen her go through the trials and tribulations of finding true love. There is one thing she seemed to always want, and now she finally has it. I guess that is worth crying tears of joy over. Elena you are one of my favorite people on the entire planet, you are beyond creative and you deserve every bit of joy that comes from this. Stephen you are incredibly awesome, strong, passionate, and kind, and I am so happy to know you. You two are so perfect for each other. It’s inspiring to see. I wish you many, many happy years to come. ❤️ Technical details 🛠️👨‍💻 We showcase a cutting-edge technology called NeRF, or neural radiance fields - a novel approach to rendering arbitrary views from a sparse set of input photos. First, I recorded a video using my iPhone of the couple holding a pose. They had to remain completely still for about 60 seconds as I moved around them and shot them from as many angles as I could. Next, I extracted frames from the video and hand-picked the best ones for clarity and sharpness. Then, I ran the photos through COLMAP to get camera poses, and wrote a custom Blender add-on to import the photos and corresponding virtual cameras. I further refined the dataset by setting a near plane for each camera as close to the nearest surface as possible. After that, I exported the dataset and loaded it into a modified version of mip-NeRF 360 to train the scene. This part took about 30 hours, and was fully automatic. Finally, I created a simple orbit and rendered out the video.
I must say that this is my cleanest NeRF yet. Credit for mip-NeRF 360 goes to Jon Barron and his team at @google . Absolutely incredible project. I bow deeply.
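The pipeline above includes hand-picking the sharpest extracted frames. A common way to automate that step (my sketch, not the author's tooling) is to score each grayscale frame by the variance of its Laplacian, a standard blur metric, and keep the top scorers:

```python
import numpy as np

def sharpness(gray):
    """Variance of the Laplacian: higher means sharper (a common blur metric)."""
    # 5-point Laplacian stencil computed with pure-NumPy slicing.
    lap = (gray[:-2, 1:-1] + gray[2:, 1:-1] +
           gray[1:-1, :-2] + gray[1:-1, 2:] -
           4.0 * gray[1:-1, 1:-1])
    return lap.var()

def pick_sharpest(frames, k):
    """Return the indices of the k sharpest frames, in order."""
    scores = [sharpness(f) for f in frames]
    return sorted(np.argsort(scores)[-k:].tolist())

# Toy frames: a crisp checkerboard vs a featureless (blurry-looking) frame.
crisp = np.indices((32, 32)).sum(axis=0) % 2 * 255.0
flat = np.full((32, 32), 128.0)
best = pick_sharpest([flat, crisp, flat], k=1)
```

A script like this only pre-filters; a final visual pass is still useful since high-contrast noise can also score well.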
852 likes · 40 comments · 3 years ago
@hok x @jperl 🧠 The fact that this was created not using a drone, but just from a simple handheld iPhone shot is seriously INSANE to me 🤯🤯 One of the things that has been fascinating me like crazy recently is the development in the A.I. space. It may not make much sense since the result looks so convincing… but don’t let it fool you: all the angles from up top & down below WERE NEVER SHOT. Let me repeat that… THOSE ANGLES NEVER EXISTED 😱😱😱 The video captured is converted into a 3D model surrounded by a 3D environment, which can then be viewed from any angle… EVEN THE ONES THAT WERE NEVER SHOT IN THE PHYSICAL SPACE. Ready to create the impossible? . #Nerf #photogrammetry #neuralradiancefield #ai #drone #neuralrendering #metaverse #aiedit #aiediting #artificialintelligence #artificial_intelligence #siggraph #siggraph2022 #thomasmüller #creators #igcreators #jperl #hok #nvidia
6,060 likes · 159 comments · 3 years ago
An alt version of the @karenxcheng piano mirror loop! How we did it ⬇️ 1. Scan Karen X in front of a mirror. This just means take a video of as many angles as possible. 2. Prepare the footage for NeRF. For this we use a tool called COLMAP, which figures out the camera locations. If we feed it the images from the video we took, it will figure out where all the cameras are outside the mirror. However, if we duplicate our footage and flip it horizontally, we have created a mirror image of our original scan. If we feed both the normal scan and the flipped scan into COLMAP, we can trick it into thinking there is a lot more data inside the mirror world! 3. Once COLMAP has processed a NeRF dataset, we can create camera paths for it. For this, I’m using a custom Blender add-on that I have created. If you’re interested in this, check out: /JamesPerlman/blender_nerf_tools 4. Finally, we just render it out! #instantnerf #neuralrendering #machinelearning #artificialintelligence
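Step 2 of the caption above, duplicating the footage and flipping it horizontally before running COLMAP, can be sketched as follows. This is my illustration, not the author's script; the file naming and `.npy` format are placeholders (a real pipeline would write image files for COLMAP):

```python
import numpy as np
from pathlib import Path

def build_mirrored_dataset(frames, out_dir):
    """Write each frame twice: as-is, and flipped left-right.

    Feeding both sets to COLMAP makes the mirror reflection look like
    extra real cameras 'inside' the mirror, as described in the caption.
    """
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    paths = []
    for i, frame in enumerate(frames):
        original = out / f"frame_{i:04d}.npy"
        mirrored = out / f"frame_{i:04d}_mirror.npy"
        np.save(original, frame)
        np.save(mirrored, np.flip(frame, axis=1))  # horizontal flip = mirror image
        paths += [original, mirrored]
    return paths
```

The trick works because a horizontally flipped photo of a scene in front of a mirror is geometrically consistent with a camera placed behind the mirror plane, so COLMAP can register both sets into one reconstruction.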
2,688 likes · 103 comments · 3 years ago
Troll Hunting ~~~ Collab with @jperl using phone footage I captured in Iceland that he put through an AI program. The program @jperl used created volumetric 3D models based on the footage. Next, he wrote custom code to bring the models into @blender.official and move around the scene.
983 likes · 23 comments · 3 years ago
#birbs am I right? Credits: 📷 - IG: @anselmodaffonseca - White Bellbird video 📷 - IG: @adrianruppbirding - Bare Throated Bellbird video 📷 - Unknown - Three Wattled Bellbird video (please reach out if you know who shot this) Source video via TT: @featheryfriends No copyright infringement intended. I do not own these videos or audio samples. This work is not monetized, just art for the sake of art.
2,756 likes · 106 comments · 4 years ago
How we made this! We had fun reading all your guesses (there were so many creative ones). Shout outs to the first few who guessed correctly: @ocetheguy1 @jen_surine @fabi28.03 I'm really excited to incorporate AI into my editing workflow more this year 👩🏻‍💻 For this video we're using DAIN to interpolate the frames. DAIN stands for Depth-Aware video frame Interpolation - it uses AI to estimate the depth of a video, and with this information uses more AI to interpolate between frames with insane accuracy. The result is WAY smoother than traditional ways of interpolating frames, which are less sophisticated techniques that only use 2D data (e.g. optical flow). DAIN isn’t perfect (you can see some artifacts around my hands and the sand) but I find the result incredibly impressive. I've mostly seen DAIN used to increase frame rates on animations or create artificial slow motion. But I haven’t seen it used to create "impossible" movements, so I was excited to try and test that. Over the last month, I shot over 100 test clips to figure out which movements worked best. I worked closely with @jperl, who wrote a custom program to batch run DAIN. Batch running allowed us to iterate and test much faster. I would send @jperl a bunch of clips, he'd batch run them overnight, and then I'd review them in the morning before trying another round of tests. All this for a 7 second video 😜 But it was well worth it cause we had a lot of fun figuring it out If you want to try it, you can download DAIN-app by Gabriel Poetsch (available on Windows + NVIDIA only) Director @karenxcheng Editor @jperl DAIN-App by Gabriel Poetsch Credits to the incredible AI researchers who created DAIN: Wenbo Bao, Wei-Sheng Lai, Chao Ma, Xiaoyun Zhang, Zhiyong Gao, Ming-Hsuan Yang See their research here: /abs/1904.00830 Shoutout to @creyety for inspiring me to try experimenting with interpolating stop motion videos #artificialintelligence #ai #AIedit #stopmotion #interpolation
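The overnight batch-running workflow described above can be sketched as a small driver script. This is a hypothetical reconstruction, not @jperl's actual program, and the tool name and flags in the template are placeholders (the post does not show the real DAIN-App invocation):

```python
import subprocess
from pathlib import Path

def build_commands(clips, out_dir, template):
    """Expand a command template once per clip.

    template is a list like ["dain_tool", "--in", "{src}", "--out", "{dst}"];
    the tool name and flags are placeholders supplied by the caller.
    """
    out = Path(out_dir)
    cmds = []
    for clip in map(Path, clips):
        dst = out / clip.name  # keep the clip's filename in the output folder
        cmds.append([part.format(src=clip, dst=dst) for part in template])
    return cmds

def batch_run(cmds):
    """Run each command in sequence (e.g. overnight), collecting exit codes."""
    return [subprocess.run(cmd).returncode for cmd in cmds]

# Example: queue two clips through a placeholder interpolation tool.
cmds = build_commands(
    ["clips/a.mp4", "clips/b.mp4"],
    "interp_out",
    ["dain_tool", "--in", "{src}", "--out", "{dst}"],
)
```

Separating command construction from execution makes it easy to print and sanity-check the whole overnight queue before launching it.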
47.8k likes · 429 comments · 4 years ago