How we made this! We had fun reading all your guesses (there were so many creative ones). Shout outs to the first few who guessed correctly:
@ocetheguy1 @jen_surine @fabi28.03
I'm really excited to incorporate AI into my editing workflow more this year 👩🏻‍💻
For this video we're using DAIN to interpolate the frames. DAIN stands for Depth-Aware video frame Interpolation: it uses AI to estimate depth in each frame of a video, then uses that depth information to interpolate new in-between frames with insane accuracy.
The result is WAY smoother than traditional frame interpolation, which relies on less sophisticated techniques that only use 2D data (e.g. optical flow). DAIN isn't perfect (you can see some artifacts around my hands and the sand) but I find the result incredibly impressive.
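To see why 2D-only approaches struggle, here's a minimal sketch (plain Python, with hypothetical tiny grayscale "frames") of the crudest interpolation method, linear blending, which just averages pixels and has no notion of motion or depth:

```python
# Naive frame interpolation by linear blending: average corresponding
# pixels of two frames. With no motion or depth awareness, a moving
# object "ghosts" in two places instead of appearing in between.
# Frames here are tiny hypothetical grayscale images (lists of rows).

def blend_frames(frame_a, frame_b, t=0.5):
    """Return an in-between frame at time t (0 = frame_a, 1 = frame_b)."""
    return [
        [round((1 - t) * a + t * b) for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(frame_a, frame_b)
    ]

# A bright "object" moves from the left pixel to the right pixel.
frame_a = [[255, 0, 0]]
frame_b = [[0, 0, 255]]

mid = blend_frames(frame_a, frame_b)
print(mid)  # [[128, 0, 128]] -> two faint ghosts, not one object mid-move
```

Optical-flow methods improve on this by tracking 2D motion, but when objects overlap they can't tell which one is in front. DAIN's depth estimate is what lets it resolve those occlusions cleanly.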
I've mostly seen DAIN used to increase frame rates on animations or to create artificial slow motion. But I haven't seen it used to create "impossible" movements, so I was excited to test that.
Over the last month, I shot over 100 test clips to figure out which movements worked best. I worked closely with @jperl, who wrote a custom program to batch run DAIN. Batch running allowed us to iterate and test much faster: I'd send @jperl a bunch of clips, he'd batch run them overnight, and I'd review the results in the morning before trying another round of tests.
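I don't have @jperl's actual program, but a batch runner like that can be sketched in a few lines of Python. Everything below is hypothetical: the `dain` command name, its flags, and the folder layout are stand-ins for whatever the real setup used.

```python
# Hypothetical batch runner sketch: queue every clip in a folder and
# run an interpolation command over each one (e.g. overnight).
# The "dain" CLI and its --input/--output flags are made up here.
import subprocess
from pathlib import Path

def build_jobs(clip_dir, out_dir, exe="dain"):
    """Build one command per .mp4 clip; outputs land in out_dir."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    jobs = []
    for clip in sorted(Path(clip_dir).glob("*.mp4")):
        jobs.append([exe, "--input", str(clip),
                     "--output", str(out / clip.name)])
    return jobs

def run_all(jobs):
    # Run sequentially; one failed clip shouldn't stop the whole batch.
    for cmd in jobs:
        try:
            subprocess.run(cmd, check=True)
        except (subprocess.CalledProcessError, FileNotFoundError) as err:
            print(f"failed: {cmd[2]} ({err})")
```

Usage would be something like `run_all(build_jobs("clips", "interpolated"))`, kicked off before bed so the results are waiting in the morning.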
All this for a 7-second video
But it was well worth it cause we had a lot of fun figuring it out
If you want to try it, you can download DAIN-App by Gabriel Poetsch (Windows with an NVIDIA GPU only)
Director
@karenxcheng
Editor
@jperl
DAIN-App by Gabriel Poetsch
Credits to the incredible AI researchers who created DAIN: Wenbo Bao, Wei-Sheng Lai, Chao Ma, Xiaoyun Zhang, Zhiyong Gao, Ming-Hsuan Yang
See their research here: /abs/1904.00830
Shoutout to @creyety for inspiring me to experiment with interpolating stop motion videos
#artificialintelligence #ai #AIedit #stopmotion #interpolation