How to use PBR textures in Blender (A Deep Dive)

In this tutorial, I want to take a closer look at the famous “Principled BSDF” node and its inputs. In the following minutes, we’ll take the pieces that compose this “master node” apart and learn how we can have total control over our materials.

Before we begin, I want to mention a few things. There are many tutorials out there that explain how this node works, but in my opinion, they overload beginners with so much technical information that it sucks the fun out of the process of learning Blender. Because of that, in this tutorial, I’ll keep everything as simple as I can and make the information you gather here as practical as possible. If you guys are ready, let’s begin!

Note: If you are tired of watching video tutorials and would like to see more text-based tutorials like this one from me, you can support me on Patreon so I can dedicate more time to creating them.

“Principled BSDF” node in the material editor

First, we’ll start with the most commonly used inputs.

1. Base Color Texture

  • Also Referred As: Albedo, Color, Diffuse
  • Color Space: sRGB or Filmic Log Encoding
  • Description: This type of texture tells “Principled BSDF” what color our material should be. In other words, if an albedo texture is red, the color of our material will be red. If it’s a leaf image made up of green and its variations, then “Principled BSDF” will display those colors across the surface of our mesh.
  • Additional Info (Optional Read): There are only two texture types that use sRGB/Filmic Log Encoding as their color space. These are albedo and AO maps. The rest of the textures should always be set to “Non-Color” data, without exception. On some occasions diffuse and albedo might mean different things, but in most cases these terms are used interchangeably.
How to connect “Base Color Texture” to “Principled BSDF” node
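If you prefer setting this up with Python instead of dragging node connections by hand, here is a minimal sketch of the same connection. The material name and file path are placeholders (swap in your own texture), and the snippet assumes a freshly created material with nodes enabled.

```python
# Minimal sketch: hook an albedo texture into the Principled BSDF via Python.
# The image path below is hypothetical; point it at your own texture.
import bpy

mat = bpy.data.materials.new("PBR_Material")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links

principled = nodes["Principled BSDF"]           # created automatically with use_nodes

albedo = nodes.new("ShaderNodeTexImage")
albedo.image = bpy.data.images.load("//textures/bricks_albedo.png")  # placeholder path
albedo.image.colorspace_settings.name = "sRGB"  # albedo stays in sRGB

links.new(albedo.outputs["Color"], principled.inputs["Base Color"])
```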

A Very Important Note: When we download an albedo texture, sometimes it comes with an additional opacity texture, which I’ll explain how to use below. But in some cases you might see that your albedo texture has a transparent background. When an albedo texture contains transparent elements, its “Alpha” output should be connected to the “Principled BSDF” node’s “Alpha” input.

A Leaf Image With Transparent Background

How to tell Principled BSDF node about albedo texture’s transparent parts

“Alpha” input in action

If we do not connect the alpha output of the albedo texture to the alpha input of the Principled BSDF node, we’ll get a black background because the master node does not know which parts of the texture are transparent. The way to tell Blender that is to connect the alpha pins to each other.

For this blending method to work, there is one more step that we need to take. We need to go to our material’s “Material Properties” tab and set the blending mode to “Alpha Clip” so it can clip out the transparent parts of the texture.

How to set our material’s “Blend Mode” to “Alpha Clip”.
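For the script-minded, here is a minimal sketch of both steps, continuing from the previous snippet (it reuses mat, links, albedo and principled). Keep in mind that the Blend Mode setting belongs to Eevee, and its exact location can differ between Blender versions.

```python
# Minimal sketch: use the albedo's alpha channel and clip the transparent pixels.
# Assumes 'mat', 'links', 'albedo' and 'principled' from the previous sketch.
links.new(albedo.outputs["Alpha"], principled.inputs["Alpha"])

# "Blend Mode" lives under Material Properties > Settings; 'CLIP' is "Alpha Clip".
mat.blend_method = "CLIP"
```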

2. Ambient Occlusion Texture

  • Also Referred As: AO
  • Color Space: sRGB or Filmic Log Encoding
  • Description: AO textures are used to project fake shadows across the surface of our mesh. Their usage is not recommended within Blender’s Cycles engine, since it is capable of tracing and calculating realistic shadows on its own. But if you are using Eevee, they play an important role in capturing a more realistic-looking scene. Why do we need AO anyway? Cycles is not a real-time rendering engine, so it can take its sweet time making the necessary calculations to present the most accurate result for us. But Eevee, or for example Unreal Engine’s renderer, is real-time. Real-time means you get your frames rendered in a fraction of a second, but that speed comes at the cost of some accuracy. By plugging AO into our shader, we give our render engine extra info on where those “soft” contact shadows should appear, shadows that a non-real-time engine such as Cycles would compute on its own. In other words, AO tells the Principled node which parts of the mesh should be brighter or darker, so it knows which parts of the texture should be highlighted.
  • Additional Info: AO textures cannot be used independently. They must be blended with the albedo texture (typically by multiplying the two together).
How to blend AO texture with Albedo texture in Blender
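The screenshot above shows the node setup; as a rough Python equivalent, one common way to blend the two maps is a MixRGB node set to Multiply. The AO file path is a placeholder, and the snippet reuses nodes, links, albedo and principled from the earlier sketches.

```python
# Minimal sketch: multiply the AO map over the albedo before it reaches Base Color.
# The AO file path is hypothetical.
ao = nodes.new("ShaderNodeTexImage")
ao.image = bpy.data.images.load("//textures/bricks_ao.png")

mix = nodes.new("ShaderNodeMixRGB")
mix.blend_type = "MULTIPLY"
mix.inputs["Fac"].default_value = 1.0   # lower this to weaken the AO contribution

links.new(albedo.outputs["Color"], mix.inputs["Color1"])
links.new(ao.outputs["Color"], mix.inputs["Color2"])
links.new(mix.outputs["Color"], principled.inputs["Base Color"])
```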

3. Roughness Texture

  • Also Referred As: Gloss, Glossiness
  • Color Space: Non-Color
  • Description: This type of texture gives info on how rough or smooth the surface of our mesh is. A rough surface scatters incoming light, producing blurry reflections, while a glossy surface reflects light sharply, like a mirror.
  • Additional Info: A gloss texture is the inverse of a roughness map, so it needs to be inverted before use. To do this we can use the “Invert” node. Be aware, though, that some textures found online are actually roughness maps even when labeled as gloss, and then there is no need for conversion. To understand whether something is a roughness or a gloss map, we can open the texture either in an external program such as “Photos by Windows” or within Blender via the “Image Editor” and inspect it. You’ll see that roughness/glossiness maps are made of black and white areas. In a roughness map, black parts represent shiny areas and white parts represent rough areas. Depending on what type of material you want to create (let’s say something metallic), you should be able to tell if the image is inverted or not. If unsure, we can always connect the texture to the roughness input, switch to material or rendered view, and check our mesh. If something looks off, you most probably need to invert the map.
Roughness Map In Action

Inverting Gloss Texture Via “Invert Node”

Inverting Gloss Texture Via “Math Node” (Subtracting the original value from 1)
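As a scripted version of the setups above, here is a minimal sketch that runs a (possibly gloss) map through an Invert node before the Roughness input. The path is a placeholder, and nodes, links and principled come from the earlier sketches.

```python
# Minimal sketch: plug a roughness map into the Principled BSDF, inverting it
# first in case the file is actually a gloss map. The path is hypothetical.
rough = nodes.new("ShaderNodeTexImage")
rough.image = bpy.data.images.load("//textures/bricks_gloss.png")
rough.image.colorspace_settings.name = "Non-Color"   # never sRGB for roughness

invert = nodes.new("ShaderNodeInvert")                # equivalent to 1 - value
links.new(rough.outputs["Color"], invert.inputs["Color"])
links.new(invert.outputs["Color"], principled.inputs["Roughness"])

# If the map is already a true roughness map, skip the Invert node and connect
# rough.outputs["Color"] straight to principled.inputs["Roughness"].
```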

4. Normal Texture

  • Also Referred As: Bump
  • Color Space: Non-Color
  • Description: Normal textures give our mesh depth without changing its geometry. In other words, they create the illusion of something being there when it is not. To explain it further, let’s say you are modeling a high poly shoe, and most modern shoes use polyester as one of their materials:
A shoe render from one of my commercial projects

You see those little “dots”? Those details are faked with the usage of a normal map.

Normal map usage to fake polyester details

Description continues: This is one of the cases where you might use a normal map/texture on a high poly model. As we can see in the picture, this model is already high poly, so we could keep adding those details by hand, but it would cost us extra time. So to save time, we can use normal maps even on high poly models. The second use case for normal maps is video games. Video game engines use real-time renderers. They cannot handle many high-polygon meshes at once very well, but if you want something to look very detailed, you need to model it in high poly. So what’s the solution? The solution is baking. And what is baking? We model something in high poly and model the same thing with a low poly count. Then we transfer the details of the high poly mesh to a texture (a normal texture) and slap that texture onto the low poly model. The details would not be physically present, but they would appear as if they are, as I touched on before.

A low poly mesh without a baked normal map from one of the projects I worked on

Same low poly mesh with the baked normal map

Description continues: If you want to learn how baking is achieved, I talked about it in this post. So in short, the second use case for normal maps would be taking a normal map baked from a high poly model and applying it to your low poly model, which in turn gives your low poly model the appearance of the high poly one, but at a far, far lower performance cost.

How to connect normal texture to Principled BSDF node
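Here is a minimal scripted sketch of this connection; one common approach is to route the texture through a Normal Map node rather than straight into the shader. The file path is a placeholder, and the variables come from the earlier sketches.

```python
# Minimal sketch: route a normal texture through a Normal Map node into the
# Principled BSDF's Normal input. The path is hypothetical.
normal_tex = nodes.new("ShaderNodeTexImage")
normal_tex.image = bpy.data.images.load("//textures/bricks_normal.png")
normal_tex.image.colorspace_settings.name = "Non-Color"

normal_map = nodes.new("ShaderNodeNormalMap")
normal_map.inputs["Strength"].default_value = 1.0     # dial the effect up or down

links.new(normal_tex.outputs["Color"], normal_map.inputs["Color"])
links.new(normal_map.outputs["Normal"], principled.inputs["Normal"])
```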

More Examples (Optional Read): The picture below shows how normal maps can be used to “bake” details. Can you tell which model has the actual geometry and which one fakes it?

Normal Map Usage Cases: Faking Geometry Example 01

I believe the following picture makes it clearer which mesh has the real geometry:

Normal Map Usage Cases: Faking Geometry Example 02

Additional Info (Optional Read): It might be confusing to hear people use different terms when talking about the same thing. When someone says “bump map” today, they usually mean a normal map. But you can also run into people saying bump and referring to height/displacement maps. The one key difference between bump and normal maps is that a bump map can only store height information, whereas a normal map stores direction (angle) information as well.

5. Metallic Texture

  • Also Referred As: Metalness
  • Color Space: Non-Color
  • Description: Defines whether the surface of the mesh is metallic or not.
How to connect Metalness texture to Principled BSDF
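A minimal sketch of the connection shown above, with a placeholder path and the variables from the earlier sketches:

```python
# Minimal sketch: metalness map into the Metallic input. The path is hypothetical.
metal = nodes.new("ShaderNodeTexImage")
metal.image = bpy.data.images.load("//textures/bricks_metallic.png")
metal.image.colorspace_settings.name = "Non-Color"

links.new(metal.outputs["Color"], principled.inputs["Metallic"])
```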

6. Translucent Texture

  • Also Referred As: Translucency
  • Color Space: Non-Color
  • Description: A translucent material is one that you cannot clearly see through, because only some light can pass through it (for example, leaves, skin, etc.).
How to connect translucency texture to Principled BSDF

7. Opacity Texture

  • Also Referred As: -
  • Color Space: Non-Color
  • Description: Defines the opacity of the surface: white areas are rendered opaque and black areas transparent.
How to connect opacity map to Principled BSDF node
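A minimal sketch of a standalone opacity map driving the Alpha input (placeholder path, variables from the earlier sketches); as with the albedo alpha earlier, Eevee also needs its blend mode set.

```python
# Minimal sketch: a standalone opacity map drives the Alpha input directly.
# The path is hypothetical.
opacity = nodes.new("ShaderNodeTexImage")
opacity.image = bpy.data.images.load("//textures/leaf_opacity.png")
opacity.image.colorspace_settings.name = "Non-Color"

links.new(opacity.outputs["Color"], principled.inputs["Alpha"])
mat.blend_method = "CLIP"   # as before, so Eevee clips the transparent parts
```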

Opacity Texture In Action

8. Displacement Map

  • Also Referred As: Height
  • Color Space: Non-Color
  • Description: Used to give our model “real” geometry/actual depth.

If we can fake the geometry with normal maps, why do we need displacement maps in the first place?

Imagine that you need to create a very realistic-looking render, or going even further, that you need to recreate a real place in 3D. In these cases, using normal maps alone would not get the job done.

Since displacement maps add real geometry to the mesh, they consume a lot of resources, as you might expect. Let’s take the example of modeling an environment, where displacement maps can come in handy. Consider the process of sculpting each individual bump on your terrain: depending on the complexity of the work, it can take days or even weeks to complete. With displacement maps, we only need to import the required texture and boom! Almost all of the details we need are already present on our screen.

Additional Info (Optional Read): Just as with normal maps, when it comes to displacement maps we also see a lot of terms being used interchangeably. To cut it short and keep it practical, here is how I like to differentiate the following terms: height maps are generally used to describe details that create valleys, mountains, etc. In other words, height maps are best suited to creating terrain. Displacement maps, most of the time, refer to additional details that are small compared to what height maps describe.

In short:

  • Displacement Maps: Small real details
  • Height Maps: Large real details

And finally we come to the part that I assume excites you the most: how to use displacement maps in Blender?

There are multiple ways of using a displacement map with the Principled BSDF shader. The first one I’ll touch on is faking the displacement info by using a Bump node:

Fake Displacement Through Bump Node
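A minimal scripted sketch of this Bump-node approach; the path is a placeholder and the strength value is just a starting point.

```python
# Minimal sketch: fake displacement by feeding the height map through a Bump node
# into the Normal input. The path is hypothetical; 'nodes', 'links' and
# 'principled' come from the earlier sketches.
height = nodes.new("ShaderNodeTexImage")
height.image = bpy.data.images.load("//textures/bricks_height.png")
height.image.colorspace_settings.name = "Non-Color"

bump = nodes.new("ShaderNodeBump")
bump.inputs["Strength"].default_value = 0.5   # tweak to taste

links.new(height.outputs["Color"], bump.inputs["Height"])
links.new(bump.outputs["Normal"], principled.inputs["Normal"])
```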

The Result (Fake Displacement Through Bump Node)

One can argue that we just added tiny details rather than creating actual geometry, and I’d agree. But this method can come in handy if you need to work on something quickly without generating actual geometry, so I wanted to share it with you guys too.

Another way of faking the displacement without creating actual geometry is connecting the displacement map to the material output node directly, with the help of a Displacement node:

Displacement Texture > Displacement > Material Output Node
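And a minimal sketch of this second approach, reusing the height texture node from the previous sketch and wiring a Displacement node into the Material Output:

```python
# Minimal sketch: drive the Material Output's Displacement socket via a
# Displacement node. Assumes 'height', 'nodes' and 'links' from the previous sketch.
disp = nodes.new("ShaderNodeDisplacement")
disp.inputs["Scale"].default_value = 0.1      # tweak to taste

output = nodes["Material Output"]
links.new(height.outputs["Color"], disp.inputs["Height"])
links.new(disp.outputs["Displacement"], output.inputs["Displacement"])
```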

The Result (Displacement Texture > Displacement > Material Output Node)

Note: If this method is not working for you, you might need to go to the material settings (make sure the Cycles renderer is enabled) and switch the displacement method from “Bump Only” to “Displacement and Bump”. If you’d like to generate real displacement geometry with this method, all you need to do is add a Subdivision Surface modifier (non-destructive workflow) or subdivide the actual mesh (destructive workflow), and Blender will take care of the rest for you. As one may guess, the strength of the map can be adjusted within the “Displacement” node, whose screenshot can be seen above. One disadvantage of this method over the one I am about to talk about is that the displacement applied across the surface of the mesh can only be seen in rendered mode. So you need to switch to rendered view (Shift + Z) to see how the displacement map is affecting your mesh.

Displacement Settings For Cycles (Material Editor Displacement Activation)
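If you script your materials, the same switch can be flipped in Python; this assumes the Cycles renderer is active and reuses mat from the earlier sketches.

```python
# Minimal sketch: the "Displacement and Bump" setting from the note above,
# set via Python (Cycles only).
mat.cycles.displacement_method = "BOTH"   # options: 'BUMP', 'DISPLACEMENT', 'BOTH'
```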

Now, let’s talk about the most commonly used method: Creating actual geometry.

As an example, I will be using the default plane. I’ll go to edit mode and subdivide the mesh a few times to have some geometry that we can manipulate.

Subdivided Plane

After subdividing, I’ll bring in a Subdivision Surface modifier to make the geometry even denser, and a Displace modifier (which I’ll name “Displacement”) to bring in my displacement texture. Within the Displace modifier, we’ll hit the “New” button to create a new texture. You can name this texture however you wish. Then we’ll head to the texture section and import our displacement texture.

Modifier stack matters.

As you can see in the following screenshot, I named my texture “Displacement_Tex”, clicked the “Open” button, browsed to my downloaded displacement texture and imported it into Blender.

How to import displacement texture into Blender
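For completeness, here is a minimal scripted sketch of the same modifier-based workflow applied to the active object; the subdivision levels, strength and file path are placeholder values to adjust.

```python
# Minimal sketch of the modifier-based workflow described above: densify the mesh
# with a Subdivision Surface modifier, then displace it with an image texture.
# The file path is hypothetical.
import bpy

obj = bpy.context.active_object                      # e.g. the subdivided plane

subsurf = obj.modifiers.new(name="Subdivision", type="SUBSURF")
subsurf.subdivision_type = "SIMPLE"                  # keep the flat shape, just add density
subsurf.levels = 4
subsurf.render_levels = 4

disp_tex = bpy.data.textures.new("Displacement_Tex", type="IMAGE")
disp_tex.image = bpy.data.images.load("//textures/terrain_height.png")

displace = obj.modifiers.new(name="Displacement", type="DISPLACE")
displace.texture = disp_tex
displace.strength = 0.3                              # tweak the intensity here
```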

And the final result is:

Displacement Map Final Result

By going back to the “Displacement” modifier and tweaking the “Strength” value, you can adjust the intensity of the map.

Less Used Inputs

Subsurface & Subsurface IOR & Subsurface Anisotropy Inputs

When a material has some degree of translucency, or in simple terms “transparency”, these inputs are tweaked to achieve the desired result. For example, human skin and leaves are neither one hundred percent see-through nor fully opaque.

Leaf Example Picture (Leaf Veins)

Skin Example Picture

In basic terms, what the subsurface inputs do is tell Blender how much light should pass into the material rather than bounce straight off its surface.

In more complicated terms, subsurface scattering is a technique for describing how light enters and exits a “translucent” surface, so Blender can make the necessary calculations and tell its rendering engine at which locations light should be reflected or absorbed.

Subsurface scattering requires a lot of render time, so you need to strike a careful balance between accuracy and usability. By all means, boost the quality as necessary if you’re using subsurface scattering for a character’s skin in a still render. Finding the appropriate balance is important, especially if you’re working under pressure to meet a deadline. But note that if you’re working on an animation, the extra cost adds up across every frame. If you guys are interested in this topic, I could dive into more detail, but for now I’ll leave it at that.

Specular and Specular Tint

Specular and Specular Tint

Before we jump into what these settings do, we need to get familiar with one term:

“Diffusity”

We can think of diffusity as a “number” or a “rate” that tells Blender how light diffusion should be spread.

I know that doesn’t really make sense yet. Let’s put it this way:

When we tweak the Roughness input in the Principled BSDF, for example, we tell Blender how much the light rays are diffused. But when we tweak the Specular input, we tell Blender about the amount of light that gets reflected, in other words its intensity.

For beginners, I don’t think it is “essential” to fully understand how this input works, but I think everybody should at least know what it does: it imitates the reflection of light sources by producing the vivid highlights that one would see on a glossy surface.

Anisotropic (Cycles Only)

Anisotropic values define the amount of anisotropy for specular reflection. Negative values produce highlights shaped perpendicular to the tangent direction, whereas higher positive values produce highlights elongated along the tangent direction.

Anisotropic Rotation (Cycles Only)

Rotates the anisotropy direction, with 1.0 making a complete turn.

Clearcoat

Clearcoat Input

Clearcoat is mostly used in materials like car paints. This input injects an extra white specular layer on top of other layers.

Clearcoat Roughness

Self-explanatory: it controls the roughness of the clearcoat layer itself.

Sheen Attribute

Sheen Attribute

This one is very difficult to explain without a specific use case, but to give some general information about it: it is a 0 to 1 slider found on the Principled shader that controls how much extra reflection is added to the surface’s Fresnel reflections. It affects the surface’s ability to reflect light when viewed at a glancing angle.

Final Thoughts

You might come across some other textures labeled as fuzz, etc., but to keep this tutorial from getting too long, I might touch on those another day if there’s demand for it. This tutorial should be enough to understand the essentials of how PBR textures generally work. If you think I missed something, or some part of this tutorial is hard to understand, please feel free to contact me and I’ll update it.