Blender: Animation and Product Rendering
Understand lighting and create amazing renders with Blender.
Where do you put the lights in a 3D scene?
This course covers lighting in detail, because good lighting is the foundation of a good 3D render.
1. An introduction to what light actually is.
2. Everything about the light types available in Blender.
3. How to use and manipulate light sources to create the desired effects.
4. An overview of the rendering pipeline.
5. Multiple lighting exercises, ready for you!
Which Blender Render Engine Is Suitable for Your Project?
As you make your 3D models in Blender, your goal will probably be to generate (render)
an image or a movie as a final result. The software that determines how your scene will
look is the render engine. The render engine needs to know how to handle the materials
on your objects, and how the lighting in your scene should behave: reflections, refraction,
bounced ambient lighting, shadows, and so on.
There are three render engines built into Blender: Eevee, Workbench, and Cycles.
Eevee is a physically based real-time renderer.
Workbench is designed for layout, modeling and previews.
Cycles is a physically based path tracer.
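In Blender's Python API (a sketch assuming the Blender 2.8x API; it must be run inside Blender's scripting workspace), the active engine is a per-scene setting:

```python
import bpy  # Blender's bundled Python API; only available inside Blender

scene = bpy.context.scene

# The three built-in engine identifiers (pick one):
scene.render.engine = 'BLENDER_WORKBENCH'  # fast, simple viewport-style display
scene.render.engine = 'BLENDER_EEVEE'      # real-time physically based rasterizer
scene.render.engine = 'CYCLES'             # physically based path tracer
```

The same dropdown lives in the Render Properties panel; this script form is handy when batch-rendering many files.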
Workbench is not meant to be a final rendering engine at all; it is optimized for the fastest, most graphically simple display during the modeling and test-animation process.
We don’t recommend using it for final renders, but it is a great way to display your work while modeling in the 3D viewport. When you need to show a client a basic progress report on how far along the scene, animation, or rig setup is, Workbench is a great option.
Similar to 3D sculpting programs such as ZBrush, Workbench offers options to randomly or individually assign colors and basic material captures (MatCaps) to specified shapes and objects, making your model stand out in test animations or scene compositions. Vertex and texture painting are also available for further distinction. Additionally, it offers viewport shading options such as see-through (X-ray) shading, as well as cavity and simple shadow shading.
Here is where things begin to get a bit interesting.
Cycles is a physically-based, unbiased path tracing rendering solution for Blender. It can be utilized by either your workstation’s CPU or GPU to produce beautiful final scenes and animations.
Rendering in Cycles involves using samples to build up each pixel: a sample is a single light path traced from the camera through a pixel into the scene. To account for every possible light path (reflective or refractive) that potentially exists within a scene, many samples are needed to produce a clean result. The more samples you use, the higher the quality of your image; at the same time, the longer the render takes and the more taxing it is on the processor.
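The samples-versus-noise trade-off can be illustrated with a toy Monte Carlo sketch (purely illustrative, not Cycles' actual code): each sample is a noisy observation of a pixel's true brightness, and averaging more of them reduces the noise.

```python
import random
import statistics

def render_pixel(num_samples, true_brightness=0.5, seed=0):
    """Average noisy 'light path' contributions, as a path tracer averages samples."""
    rng = random.Random(seed)
    samples = [true_brightness + rng.uniform(-0.5, 0.5) for _ in range(num_samples)]
    return sum(samples) / num_samples

def noise(num_samples, trials=200):
    """Spread of the result across many independent renders = visible grain."""
    return statistics.stdev(render_pixel(num_samples, seed=t) for t in range(trials))

print(noise(16), noise(256))  # noise shrinks roughly as 1/sqrt(samples)
```

Quadrupling the sample count only halves the noise, which is why render times climb so steeply as you chase a clean image.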
Cycles’ main characteristic is its path tracing. Like ray tracing, path tracing is an algorithmic process that renders imagery by simulating how light bounces around a scene. The algorithm traces its light paths from the camera rather than from the light source, making it a sort of “backwards” path tracer.
Since many materials make light bounce and scatter, Cycles accounts for this with its Progressive and Branched Path integrators. The Progressive integrator traces one complete light path per sample, while the Branched Path integrator splits the ray at its first bounce with a material, following multiple directions for each material component and light.
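A minimal toy sketch of the “backwards” idea (assumed, simplified behavior; not Cycles' real integrator): a path starts at the camera, and at each bounce the carried throughput is multiplied by the surface albedo (the fraction of light the material reflects) until the path reaches a light or is absorbed.

```python
import random

def trace_path(albedo=0.7, light=1.0, max_bounces=8, rng=None):
    """Trace one camera path; return the radiance it carries back to the pixel."""
    rng = rng or random.Random(0)
    throughput = 1.0
    for _ in range(max_bounces):
        if rng.random() < 0.5:   # path escapes the scene and hits the light
            return throughput * light
        throughput *= albedo     # bounce: the material absorbs some energy
    return 0.0                   # path terminated before reaching any light

def pixel_value(samples=1000):
    """Average many camera paths for one pixel, Progressive-integrator style."""
    rng = random.Random(42)
    return sum(trace_path(rng=rng) for _ in range(samples)) / samples
```

Each extra bounce darkens the contribution by the albedo, which is why deeply bounced light is dim and why limiting bounce counts is a common render-time optimization.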
Cycles can handle effects including depth of field (DOF), motion blur, and ambient occlusion as part of the physically based render itself. Pixar’s native rendering engine RenderMan is actually quite similar in behavior to Blender’s Cycles engine, but where RenderMan falters is its lack of the user-friendly interface Cycles is known for.
On top of all this, Cycles is also an external plug-in that can be utilized by other software like Cinema 4D and Maya, making it one of the most versatile engines used in modern 3D software. It’s free and already integrated into Blender, making it a great deal for any 3D model designer on a budget.
There are times when speed is more important than accuracy, and Blender’s Eevee engine caters to that need. Eevee (short for Extra Easy Virtual Environment Engine) is Blender’s built-in real-time rendering engine, using rasterization techniques similar to those found in game engines such as Epic Games’ Unreal Engine. And while it can’t compete with Cycles’ visual capacity, its unmatched speed is where it shines.
This physically-based engine can be used not only as a renderer, but also in real time for creating physically based rendering (PBR) and procedurally textured assets, yielding impressive, immediate results in Blender’s viewport. Where Unreal is mainly used for gaming, Eevee shines brightest when used for animation and VFX.
It shares the same node-based material system (most notably, the Principled BSDF) available in the Cycles engine, and it can easily be swapped into pre-existing Blender scenes. However, more advanced materials such as subsurface scattering and clearcoat tend to be less than stellar, though still quite impressive, when running on Eevee.
Differences between Cycles vs Eevee in Blender
1. Ambient Occlusion Cycles vs Eevee
Cycles ambient occlusion is based on the distance between surfaces in 3D space. Eevee AO is based on the distance between surfaces according to the 2D image the screen sees. It manages to still look quite good because it uses a depth pass to determine where to fade out the AO so that it doesn’t cast on objects far behind it (you’ll notice this in 2.79’s viewport AO). However, you’ll still often see occlusion ‘attach’ itself to objects before the depth cutoff takes effect.
Eevee can only occlude a set distance away from an object. In the first Cycles render above, notice how the sides of the box have large gradients and the surfaces where the monkey heads sit have small gradients. The combination of both sharp and broad transitions between light and dark looks very natural. In Eevee, both areas of gradients are the same size, causing the sides of the box to look flat and the occlusion beneath the monkeys to look overly dark.
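That fixed occlusion distance is an adjustable scene setting. A sketch assuming the Blender 2.8x Python API (to be run inside Blender; property names may differ in later versions):

```python
import bpy  # Blender's bundled Python API; only available inside Blender

eevee = bpy.context.scene.eevee
eevee.use_gtao = True      # enable Eevee's screen-space ambient occlusion
eevee.gtao_distance = 0.2  # the fixed occlusion distance discussed above
eevee.gtao_factor = 1.0    # overall AO strength
```

Raising the distance broadens the gradients but also exaggerates the screen-space artifacts described above.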
2. Global Illumination Cycles vs Eevee
Cycles has global illumination (a.k.a. indirect lighting, radiosity, etc.) baked deep into how it works. Light paths pick up data (such as what color has already been absorbed) each time they bounce, and use that information to determine what happens when they hit the next surface.
Eevee, by contrast, uses light probes to approximate it. Light probes essentially take snapshots of the scene from their point in space and project the appropriate colors onto objects in their vicinity. This is called baking; the data is stored and not updated every frame, so objects with baked indirect lighting should not be animated.
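In the 2.8x Python API, that bake is an explicit step (a sketch assuming Blender 2.8x, run inside Blender; the light-cache operator was reworked in later versions):

```python
import bpy  # Blender's bundled Python API; only available inside Blender

# Eevee stores indirect lighting from irradiance probes in a baked light
# cache; it is not updated per frame, so probe-lit objects should stay static.
bpy.context.scene.eevee.gi_diffuse_bounces = 3  # bounces stored in the cache
bpy.ops.scene.light_cache_bake()                # bake the indirect lighting
```

The same bake is available via Render Properties > Indirect Lighting > Bake Indirect Lighting.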
A massive number of light probes would make the scene look about the same as in Cycles, but of course that would take an eternity to bake. With a limited number of probes, the space between them must be approximated, which can result in artifacts. In the render above, notice the faint dark splotch on the blue cube, and the uneven occlusion where it meets the ground.
3. Reflections Cycles vs Eevee
At this point you can safely assume that any feature of Cycles works by bouncing rays of light, so great reflections are no surprise. Eevee uses a couple of tricks to mimic this, but it still ends up some way from accuracy. Screen space reflections work by reusing the image the camera already sees, mirroring on-screen pixels into reflective surfaces. This is fast and looks great in most circumstances, but it cannot reflect anything the camera does not see, such as the backs or bottoms of objects.
Eevee can also use reflection probes: planes, cubes, or spheres that act like virtual cameras. They render what is around them and map that image onto reflective objects. Planes are simple enough to update in real time, but cube and sphere maps need to be baked, so the objects they supply data for will not look right when animated. It’s also rare that an object is a perfect cube or sphere, so the mapping will introduce some imperfections.
Conclusion – Blender Eevee vs Cycles
In this Blender Eevee vs Cycles comparison, at first glance it may seem that Eevee is just a preview engine with little other use. It turns out that is not true at all: depending on the scene and the optimisations made, Eevee can render on par with Cycles.
So whether to choose Cycles or Eevee depends heavily on what your render is. Is it something abstract, cartoonish, and stylised? You can go with Eevee and get a good result much faster than with Cycles. On the other hand, if you want something realistic with good reflections, emissive surfaces, and physically based lighting, you will most probably need Cycles, as Eevee can’t handle that yet.
What you will learn
What light is.
The importance of light.
The positioning of light.
How light affects different materials.
The lights available in Blender.
How to create amazing renders.
Requirements:
Have Blender installed.
Who this course is for:
Anyone who wants to make their renders more pleasing.
Students that are willing to learn.
Students that have a basic understanding of Blender.