In this tutorial, I want to take a closer look at the famous “Principled BSDF” node and its inputs. In the following minutes, we’ll take the pieces that compose this “master node” apart and learn how we can have total control over our materials.
Before we begin, I want to mention a few things. There are many tutorials out there that explain how this node works, but in my opinion they overload beginners with so much technical information that it sucks the fun out of the process of learning Blender. Because of that, in this tutorial I’ll keep everything as simple as I can and make the information you gather here as practical as possible. If you guys are ready, let’s begin!
Note: If you are tired of watching video tutorials and would like to see more text-based tutorials like this one from me, you can support me on Patreon so I can dedicate more time to creating them.
“Principled BSDF” node in the material editor
First, we’ll start with the most commonly used inputs.
How to connect “Base Color Texture” to “Principled BSDF” node
A Very Important Note: When we download an albedo texture, it sometimes comes with an additional opacity texture, which I’ll explain how to use below. But in some cases your albedo texture itself may have a transparent background. When an albedo texture contains transparent elements, its “Alpha” output should be connected to the Principled BSDF node’s “Alpha” input.
A Leaf Image With Transparent Background
How to tell Principled BSDF node about albedo texture’s transparent parts
“Alpha” input in action
If we do not connect the alpha output of the albedo texture to the alpha input of the Principled BSDF node, we’ll get a black background, because the master node does not know which parts of the texture are transparent. The way to tell Blender is to connect the alpha pins to each other.
For this blending method to work, there is one more step that we need to take. We need to go to our material’s “Material Properties” tab and set the blending mode to “Alpha Clip” so it can clip out the transparent parts of the texture.
How to set our material’s “Blend Mode” to “Alpha Clip”.
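To make the idea concrete, here is a tiny sketch of what “Alpha Clip” does per pixel. This is plain Python, not Blender’s API; the 0.5 threshold mirrors the default “Clip Threshold” value in the material settings:

```python
def alpha_clip(pixel_alpha, threshold=0.5):
    """Alpha Clip keeps a pixel fully opaque or discards it entirely.
    There is no partial transparency: the alpha value is compared
    against the clip threshold and snapped to 0.0 or 1.0."""
    return 1.0 if pixel_alpha >= threshold else 0.0

print(alpha_clip(0.2))  # 0.0 -> mostly transparent pixel is clipped out
print(alpha_clip(0.9))  # 1.0 -> pixel stays fully visible
```

This is also why Alpha Clip gives hard edges: every pixel ends up either fully visible or fully gone, which is cheap to render and exactly what we want for leaf cutouts.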
How to blend AO texture with Albedo texture in Blender
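The usual way to blend an AO texture with an albedo texture is a simple per-channel multiply (a MixRGB node set to “Multiply” in Blender). A minimal sketch of that math, assuming color values in the 0–1 range; the `factor` parameter plays the role of the MixRGB node’s Fac slider:

```python
def blend_ao(albedo, ao, factor=1.0):
    """Multiply-blend an ambient occlusion value into an albedo color.
    factor=0.0 leaves the albedo untouched, factor=1.0 applies the AO
    fully; AO of 1.0 (white) changes nothing, AO near 0.0 darkens."""
    return tuple(c * (1.0 - factor + factor * ao) for c in albedo)

brick_red = (0.8, 0.3, 0.2)
print(blend_ao(brick_red, ao=0.5))  # crevice texels drop to half brightness
```

Because white (1.0) times anything leaves it unchanged, only the dark crevice areas of the AO map end up darkening the albedo.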
Roughness Map In Action
Inverting Gloss Texture Via “Invert Node”
Inverting Gloss Texture Via “Math Node” (Subtracting 1 from the original value)
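Both tricks above compute the same thing: if a PBR set ships a gloss (smoothness) map instead of a roughness map, roughness is simply one minus gloss. A quick sketch, assuming values in the 0–1 range:

```python
def gloss_to_roughness(gloss):
    """A gloss/smoothness map is an inverted roughness map, so the
    Invert node and the Math node set to Subtract (1 - value)
    produce identical results."""
    return 1.0 - gloss

print(gloss_to_roughness(0.25))  # 0.75 -> a dull texel becomes high roughness
print(gloss_to_roughness(1.0))   # 0.0  -> perfectly glossy = zero roughness
```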
A shoe render from one of my commercial projects
You see those little “dots”? Those details are faked with a normal map.
Normal map usage to fake polyester details
Description continues: This is one of the cases where you might use a normal map/texture on a high-poly model. As we can see in the picture, this model is already high poly, so we could keep adding those needed details by hand, but it would cost us extra time. So to save time, we can use normal maps even on high-poly models. The second use case of normal maps is in video games. Game engines use real-time renderers, which cannot handle too many high-polygon meshes at the same time. But the problem is, if you want something to look very detailed, you need to model it in high poly. So what’s the solution then? The solution is baking. And what is baking? We model something in high poly, and model the same thing with a low poly count. Then we transfer the details of the high-poly mesh to a texture (a normal texture) and slap that texture on the low-poly model. The details would not physically be present, but they would appear as if they are, as I touched on before.
A low poly mesh without a baked normal map from one of the projects I worked on
Same low poly mesh with the baked normal map
Description continues: If you want to learn how baking is achieved, I talked about it in this post. So in short, the second use case of normal maps would be taking a normal map and applying it to your low-poly model, which in turn gives your low-poly model the appearance of the high-poly model, but at a far, far lower performance cost.
How to connect normal texture to Principled BSDF node
More Examples (Optional Read): The picture below shows how normal maps can be used to “bake” details. Can you tell which model has the actual geometry and which one fakes it?
Normal Map Usage Cases: Faking Geometry Example 01
I believe the following picture makes it clearer which mesh has the real geometry:
Normal Map Usage Cases: Faking Geometry Example 02
Additional Info (Optional Read): It can be confusing to hear people use different terms for the same thing. When someone says “bump map” today, they are usually talking about normal maps. But you can also run into people saying “bump” and referring to height/displacement maps. The one key difference is that a bump map can only store height information, whereas a normal map can store direction (angle) information as well.
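As a side note on how a normal map stores “angle information”: each pixel’s RGB value in the 0–1 range is remapped to a direction vector in the −1 to 1 range. A rough sketch of that decoding (plain Python, not Blender’s API):

```python
def decode_normal(r, g, b):
    """Remap an RGB texel (0..1) to a tangent-space normal (-1..1).
    The typical bluish-purple color (0.5, 0.5, 1.0) decodes to
    (0, 0, 1), i.e. 'straight up' -- a flat, undisturbed surface."""
    return (r * 2.0 - 1.0, g * 2.0 - 1.0, b * 2.0 - 1.0)

print(decode_normal(0.5, 0.5, 1.0))  # (0.0, 0.0, 1.0)
```

This is also why normal maps look mostly blue: flat areas map to that (0.5, 0.5, 1.0) color.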
How to connect Metalness texture to Principled BSDF
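For intuition, here is a simplified sketch of what the metallic value does in the common PBR metallic workflow (this is the general convention, not Blender’s exact internal shader code, and the ~4% dielectric reflectance is the usual textbook assumption): it blends between a dielectric response (colored diffuse, faint white reflection) and a metallic one (no diffuse, reflections tinted by the base color).

```python
def metallic_mix(base_color, metallic, dielectric_f0=0.04):
    """Blend dielectric and metallic responses.
    metallic=0: diffuse uses base_color, reflectance is ~4% and white.
    metallic=1: no diffuse at all, reflections are tinted by base_color."""
    diffuse = tuple(c * (1.0 - metallic) for c in base_color)
    f0 = tuple(dielectric_f0 * (1.0 - metallic) + c * metallic
               for c in base_color)
    return diffuse, f0

gold = (1.0, 0.77, 0.34)
print(metallic_mix(gold, metallic=1.0))  # no diffuse, gold-tinted reflection
```

This is why a pure metal’s base color shows up in its reflections rather than as a diffuse color, and why values between 0 and 1 are rarely physically meaningful.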
How to connect translucency texture to Principled BSDF
How to connect opacity map to Principled BSDF node
Opacity Texture In Action
If we can fake the geometry with normal maps, why do we need displacement maps in the first place?
Imagine that you need to create a very realistic-looking render, or go even further and recreate a real place in 3D. In these cases, using normal maps would not get the job done.
Since displacement maps add real geometry to the mesh, as you might expect, they consume a lot of resources. Let’s take an example where using a displacement map could come in handy: modeling an environment. Consider the process of sculpting each individual bump on your terrain. Depending on the complexity of the work, that can take days or even weeks to complete. With displacement maps, we only need to import the required texture and boom! Almost all of the details we need are already present on our screen.
Additional Info (Optional Read): Just as with normal maps, when it comes to displacement maps we also see a lot of terms being used interchangeably. To cut it short and make it practical, here is what I see differentiating the following terms: height maps are generally used to describe large-scale details like valleys and mountains; in other words, height maps are best suited for creating terrain. Displacement maps, most of the time, refer to additional details that are small in scale compared to those of height maps.
In short:
And finally we come to the part that I assume excites you the most: how to use displacement maps in Blender?
There are multiple ways of using a displacement map with the Principled BSDF shader. The first one I’ll be touching on is faking the displacement information by using a Bump node:
Fake Displacement Through Bump Node
The Result (Fake Displacement Through Bump Node)
One can argue that we actually just added tiny details rather than creating actual geometry, and I’d agree with them, but this method might come in handy if you need to work on something quickly without generating real geometry, so I wanted to share it with you guys too.
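Under the hood, a Bump node does roughly this: it samples the height texture at neighboring points and turns the difference (the slope) into a tilted normal, without moving any vertices. A simplified sketch of that idea (plain Python; the sampling step and normalization details here are my own assumptions for illustration, real shaders work with texture-space derivatives):

```python
import math

def bump_normal(height, x, y, strength=1.0):
    """Approximate a surface normal from a height function using
    finite differences. Steeper slopes tilt the normal further away
    from straight up (0, 0, 1), which is what fakes the shading."""
    eps = 1e-3
    dhdx = (height(x + eps, y) - height(x - eps, y)) / (2 * eps)
    dhdy = (height(x, y + eps) - height(x, y - eps)) / (2 * eps)
    n = (0.0 - dhdx * strength, 0.0 - dhdy * strength, 1.0)
    length = math.sqrt(sum(c * c for c in n))
    return tuple(c / length for c in n)

flat = lambda x, y: 0.0
print(bump_normal(flat, 0.0, 0.0))  # (0.0, 0.0, 1.0) -- a flat surface stays flat
```

Since only the normal changes, the silhouette of the mesh stays untouched, which is exactly the limitation we just discussed.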
Another way of faking the displacement without creating actual geometry is connecting the displacement map directly to the Material Output node with the help of a Displacement node:
Displacement Texture > Displacement > Material Output Node
The Result (Displacement Texture > Displacement > Material Output Node)
Note: If this method is not working for you, you might need to go to the material settings (make sure the Cycles renderer is enabled) and switch the displacement method from “Bump Only” to “Displacement and Bump”. If you’d like to generate real displacement geometry with this method, all you need to do is add a few subdivision modifiers (non-destructive workflow) or subdivide the mesh itself (destructive workflow), and Blender will take care of the rest for you. As one may guess, the strength of the map can be adjusted within the “Displacement” node, whose screenshot can be seen above. One disadvantage of this method over the one I am about to talk about is that the displacement applied across the surface of the mesh can only be seen in rendered mode. So you need to switch to the render view (Shift + Z) to see how the displacement map is affecting your mesh.
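In contrast with the bump trick, true displacement actually moves each point of the surface along its normal by the texture value scaled by the strength. A minimal sketch of that operation; the 0.5 midlevel mirrors the Displacement node’s default, where mid-gray texels leave the surface where it is:

```python
def displace(vertex, normal, height, strength=1.0, midlevel=0.5):
    """Move a vertex along its normal. Texture values above the
    midlevel push the vertex outward, values below pull it inward."""
    offset = (height - midlevel) * strength
    return tuple(v + n * offset for v, n in zip(vertex, normal))

# A vertex on a flat plane, normal pointing up (+Z), hit by a white texel:
print(displace((1.0, 2.0, 0.0), (0.0, 0.0, 1.0), height=1.0))
# -> (1.0, 2.0, 0.5): the vertex is raised, changing the real silhouette
```

Because the vertices really move, the silhouette and shadows change too, which is what the bump-based methods cannot give you.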
Displacement Settings For Cycles (Material Editor Displacement Activation)
Now, let’s talk about the most commonly used method: Creating actual geometry.
As an example, I will be using the default plane. I’ll go to edit mode and subdivide the mesh a few times to have some geometry that we can manipulate.
Subdivided Plane
After subdividing, I’ll bring in a Subdivision Surface modifier to make the geometry even denser, and a Displace modifier to bring in my displacement texture. Within the Displace modifier, we’ll hit the “New” button to create a new texture. You can name this texture however you wish. Then we’ll head to the texture section and import our displacement texture.
Modifier stack matters.
As you can see in the following screenshot, I named my texture “Displacement_Tex”, clicked the “Open” button, searched for my downloaded displacement texture, and imported it into Blender.
How to import displacement texture into Blender
And the final result is:
Displacement Map Final Result
By going back to the Displace modifier and tweaking the strength value, you can adjust the intensity of the map.
Subsurface & Subsurface IOR & Subsurface Anisotropy Inputs
When a material has some degree of translucency, or in simple terms “transparency”, these inputs are tweaked to achieve the desired results. For example, human skin and leaves are not one hundred percent see-through, but they are not fully opaque either.
Leaf Example Picture (Leaf Veins)
Skin Example Picture
In basic terms, what the subsurface inputs do is tell Blender how opaque or how translucent the material should be.
In more technical terms, subsurface scattering is a technique for describing how light should enter and exit a “translucent” surface, so Blender can make the necessary calculations to tell its render engine where light should be reflected and where it should be absorbed.
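As a very rough intuition (a deliberately simplified model, not Blender’s actual subsurface algorithm), the amount of light that survives a trip through a translucent material falls off exponentially with the distance traveled, and the subsurface radius controls how quickly:

```python
import math

def transmittance(distance, radius):
    """Fraction of light surviving after traveling `distance` inside
    the material, with `radius` playing the role of the scattering
    distance. A larger radius lets light penetrate deeper, giving
    the soft, waxy look of skin or wax."""
    return math.exp(-distance / radius)

print(transmittance(0.1, radius=0.05))   # a thick patch: little light gets through
print(transmittance(0.01, radius=0.05))  # a thin ear lobe: most light passes through
```

This is why thin parts like ears and fingertips glow red when backlit: more light makes it all the way through.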
Subsurface scattering requires a lot of render time, so you need to strike a careful balance between accuracy and usability. By all means, boost the quality as necessary if you’re using subsurface scattering for a character’s skin in a still render. Finding the appropriate balance will be especially important if you’re working under pressure to meet a deadline, and note that if you’re working on an animation, the extra cost multiplies across every frame. If you guys are interested in this topic, I could dive into more detail, but for now I’ll leave the info on this subject as it is.
Specular and Specular Tint
Before we jump into what these settings do, we need to get familiar with one term:
We can think of diffusivity as a “number” or a “rate” that tells Blender how spread out the diffusion should be.
I know that doesn’t really make sense yet. Let’s put it this way:
When we tweak the roughness input in the Principled BSDF, for example, we tell Blender how much the light rays are diffused. But when we tweak the specular input, we tell Blender the amount of reflected light rays, in other words the intensity of the reflection.
For beginners, I don’t think it is “essential” to fully understand what this input does but I think everybody should at least know what it does which is to imitate the reflection of light sources by producing the vivid highlights that one would see on a glossy surface.
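If it helps to attach numbers to this: according to the Blender manual, the Specular slider maps to a physical index of refraction (IOR), with the default value of 0.5 corresponding to an IOR of about 1.5, which covers glass and most common dielectric materials. A sketch of that conversion:

```python
def ior_to_specular(ior):
    """Convert an index of refraction to the Principled BSDF's
    Specular slider value, using the conversion given in the
    Blender manual: specular = ((ior - 1) / (ior + 1))**2 / 0.08."""
    return ((ior - 1.0) / (ior + 1.0)) ** 2 / 0.08

print(round(ior_to_specular(1.5), 3))   # 0.5   -> the slider's default
print(round(ior_to_specular(1.33), 3))  # 0.251 -> water
```

So in practice you rarely need to touch Specular at all; the default already matches most everyday materials.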
The Anisotropic value defines the amount of anisotropy in the specular reflection. At 0.0 the highlight is round; higher values stretch the highlight out along the tangent direction, and negative values rotate it perpendicular to the tangent.
Rotates the anisotropy direction, with 1.0 making a complete turn.
Clearcoat Input
Clearcoat is mostly used in materials like car paints. This input injects an extra white specular layer on top of other layers.
Self explanatory.
Sheen Attribute
This one is very difficult to explain without a specific use case, but as general information: it is a 0-to-1 slider on the Principled shader that controls how much soft specular reflection is added to the surface’s Fresnel reflections. It affects the surface’s ability to reflect light when viewed at a grazing angle, which is why it is typically used for cloth and fabric-like materials.
You might come across some other textures labeled as fuzz, etc. but for the sake of the tutorial not being super long, I might touch on that topic another day if there’s a demand for it. This tutorial should be enough to understand the essentials of how PBR textures work generally. If you think I missed something or some part of this tutorial is hard to understand, please feel free to contact me and I’ll update this tutorial.