Some quick shots I did on the latest video by @sabrinacarpenter
It's fun to only have to rough in a scene with basic geometry, knowing it'll be heavily blurred or distorted in the final comp
Some shots I did on @sawyer_skipper 's latest video for @littleimage
It was great to have a long development window to refine the look without a hard deadline. Ultimately, keeping it subtle was the way to go: minimal refractions, with highlight pings when it hits the sun
BETA RELEASE
I put together this little collection of handy color tools for working with footage in the Blender shader editor. The slides show a few examples of what's included.
It started as just a gamut converter for adapting RGB values in shaders and lights (since they're currently ignored when changing working space in Blender). Shoutout to @ianhubertz for pointing that out in the first place!
Now it's expanded to include HSV tools, keying tools, gamut compression and more.
Most of them are things that you may find yourself wishing for when working with footage planes in 3D space, i.e. a "viewport compositing" VFX workflow.
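For anyone curious what a gamut converter is actually doing under the hood, here's a minimal sketch: converting linear RGB between color spaces is just a 3x3 matrix multiply per pixel. The matrix below is the commonly published Bradford-adapted Rec.709-to-ACEScg transform, rounded for readability, so treat it as illustrative rather than production-exact (the actual addon's internals may differ).

```python
# Sketch of a linear-RGB gamut conversion: Rec.709/sRGB primaries -> ACEScg (AP1).
# Matrix values are the commonly published Bradford-adapted transform, rounded.

REC709_TO_ACESCG = [
    [0.6131, 0.3395, 0.0474],
    [0.0702, 0.9164, 0.0134],
    [0.0206, 0.1096, 0.8698],
]

def convert_gamut(rgb, matrix=REC709_TO_ACESCG):
    """Multiply a linear RGB triple by a 3x3 gamut-conversion matrix."""
    r, g, b = rgb
    return tuple(row[0] * r + row[1] * g + row[2] * b for row in matrix)

# A pure Rec.709 red lands at a less saturated position in the wider AP1 gamut,
# while white stays white (each matrix row sums to ~1.0).
print(convert_gamut((1.0, 0.0, 0.0)))
print(convert_gamut((1.0, 1.0, 1.0)))
```

This is exactly why unconverted shader swatches look wrong after a working-space switch: the numbers stay the same while the primaries they're interpreted against change.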
It's very much still in beta as there are a number of changes needed, but it's available to download over at Eat The Future for anyone who might find it useful!
(The tools are packaged into an addon, so just install it like normal and access it from the shader editor sidebar or search menu.)
Download it for free here:
/EatTheFuture/colorkit
#blender3d
Couple quick shots I did for @sawyer_skipper 's new video for @thebandcamino
Basic EmberGen sim on some proxy geo. Tracked with SynthEyes, rendered with Cycles, comped in Fusion
Quick breakdown / before-and-after of the new music video from @charlotteslawrence , "Us Three"
Nothing too complicated in terms of technique. The real challenge was getting it all finished in 10 days from ingest to final render.
Unbiased rendering was off the table due to the long render times, so everything was finished in Eevee. Compositing in Fusion.
The real trick was having a high-quality scan of the set, which could be used for tracking and blending the footage into the CG. Reality Capture was used for processing the scan and SynthEyes for tracking/alignment.
director / dp - @aidenmagarian
production designer - @rennapilar
stylist - @annesophiebine
asst by @christopher.m.lacey
featuring @lizaapullen @lucastarrago_
Here's a trick I use all the time for magic masking in Fusion.
This could conceivably work in the color page too but it's not recommended.
Keep in mind Fusion only uses the Object model, while the color page has both the Object and People models to choose from.
Here are a couple shots I worked on for @littleimage 's new video. Kudos to @sawyer_skipper for the great direction and @ruslan_delion for the character model/anim.
Swipe for breakdowns --->
Playing around with the new Lens Sim plugin by Håvard Dalen.
The results are pretty remarkable and the render hit is reasonably small, nothing like the old iris stencil technique or recreating the elements with geometry and refraction shaders.
In addition to the fancy bokeh characteristics that everyone wants, the real win here is the way this technique accurately models field curvature, soft focus, sagittal and tangential astigmatism, and all the other more subtle effects that make an image feel truly photographic (see closeups on slide 3).
Definitely recommend grabbing it for yourself
/products/lens-sim
Quick breakdown of a shot I did for @annesophiebine 's new spot! Thanks @betthole for the connect and for being an awesome dp!
See the full piece here @secretpizzala
Random experiment: using motion vectors captured from live footage to drive animation on still images.
Shoutout to @emilio_sapia for the initial technique of replicating Nuke's Smart Vectors in Fusion!