LÖVR Framework – Shaders and Lighting Adventures

The LÖVR framework allows for pretty quick prototyping of 3D and VR interactive scenes through simple Lua coding, but compelling scenes require good lighting. It’s time to tackle shaders.

In this post, I’m continuing work on a little scene I started in my introduction to LÖVR.

Introduction

When we last left off, we had started a simple outdoor scene, lit by the default shader simply because we hadn't written any shader code at all.

This is a nice option when starting a project: we immediately get confirmation that our models are loading and a rough sense of how everything looks. But with maximum lighting coming from every direction, we don't get any real sense of detail, since the scene shows no shadows or shading whatsoever.

As I mentioned last time, fixing this requires jumping straight into shader code. That proved tricky, though not quite in the way I expected.

Creating our basic shader

Shader concepts are scary and weird for the uninitiated, so lovr.org provides a simple lighting tutorial to get new developers up to speed. It's not quite perfect at the moment, which we'll get into later.

lovr.org's tutorial walks you through implementing and taking control of the Phong lighting model in its ambience, diffusion, and specularity phases, adapted from the Learn OpenGL Basic Lighting tutorial, which was originally written in C++. It's important to keep in mind that "fragments" are pixels as processed by shader language code…for whatever reason.
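If it helps to see where we're headed, the whole model boils down to computing three lighting terms per fragment and multiplying their sum into the surface color. Here's a rough skeleton of a LÖVR fragment shader in that shape (not from the tutorial, just to show the structure); the ambient, diffuse, and specular values are placeholders that get real math over the next three phases:

  vec4 color(vec4 graphicsColor, sampler2D image, vec2 uv)
  {
      // Placeholders only: each term is built up in its own phase below.
      vec4 ambient  = vec4(0.2, 0.2, 0.2, 1.0); // constant base light (phase 1)
      vec4 diffuse  = vec4(0.0);                // based on the surface's angle to the light (phase 2)
      vec4 specular = vec4(0.0);                // shiny highlight based on the view direction (phase 3)

      vec4 baseColor = graphicsColor * texture(image, uv);
      return baseColor * (ambient + diffuse + specular);
  }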

To start, the LÖVR tutorial walks you through adding the same code used in the default vertex and fragment shaders into your project.

For simplicity, I’ll omit my project-specific code just to show what I’ve added or changed in each major step.

function lovr.load()

  customVertex = [[
    vec4 position(mat4 projection, mat4 transform, vec4 vertex)
    {
        return projection * transform * vertex;
    }
  ]]
  customFragment = [[
    vec4 color(vec4 graphicsColor, sampler2D image, vec2 uv)
    {
        return graphicsColor * lovrDiffuseColor * vertexColor * texture(image, uv);
    }
  ]]
  shader = lovr.graphics.newShader(customVertex, customFragment, {})
end

function lovr.draw()
  lovr.graphics.setShader(shader)
  --models drawn here
  lovr.graphics.setShader()
end

If implemented correctly, running the scene shows no changes at all. We’re just explicitly running the default shader code from our project. Now we get to take control of the three phases of the Phong model.

Phase 1 – Ambience

Here’s where my trouble started.

By default, ambient lighting in LÖVR stays at 100%, so every surface shows its color at full, unshaded brightness: no shading and no shadows. Usually, your first move toward custom lighting is to reduce this to less than full strength, which requires new fragment shader code. The tutorial's code reduces the ambient lighting to 20% (values of 0.2) and looks like this:

customFragment = [[
    uniform vec4 ambience;
    vec4 color(vec4 graphicsColor, sampler2D image, vec2 uv) 
    {
        //object color
        vec4 baseColor = graphicsColor * texture(image, uv);
        return baseColor * ambience;
    }
]]
shader:send('ambience', { 0.2, 0.2, 0.2, 1.0 })

There was just one problem: this turned all my models dark gray.

It's never good to panic in the middle of a tutorial, but I'm here to tell you that pushing through the entire remainder of it in hopes of a later fix resulted in very nicely lit, completely gray models.

I spent hours troubleshooting this.

I looked at the source project provided at the end of the tutorial. I don't feel the tutorial's walkthrough gets visitors quite close enough to this code, but following the source should result in a lit, colorful scene with minimal adjustments, and the project itself runs straight out of the archive with no problems.

Eventually, I opened the source project's GLB model side-by-side with my Kenney survival kit tree to compare. Sure enough, I noticed a big difference: the tutorial model uses a UV image texture, while Kenney's tree has nice low-profile materials assigned.

This pulled me into a rabbit hole where I tried to figure out how to quickly close the gap between the material-based assets I wanted to use and the image-textured models this lighting method seemed to require.

I did eventually find a method of baking multiple simple materials down to a texture image in Blender, which let me export new GLBs that worked with this approach, but it's time-consuming, unlikely to work with more complicated assets, and produces files nearly ten times larger for the exact same asset.

This turned out to be a massive waste of time.

After this fragment shader change, the tutorial author explains that they decided to drop several values from the previous version (specifically, lovrDiffuseColor and vertexColor) as a way of optimizing the code by "omitting a few unneeded variables."

Except, I personally did need them, and you might one day, too.

I’ve been using Blender for over 20 years now–good Christ–and I’ll be the first to tell you I don’t understand materials and textures nearly as well as I should. Here in the age of nodes, I’m as far from a deep understanding as I’ve ever been, but I learned a lot today.

Simple materials in Blender change vertex colors. This makes more sense when you think of assigning multiple materials to a model by selecting vertices, creating groups, and assigning or unassigning materials to those groups. In fact, vertex color multiplied by diffuse color is often exactly the value the renderer needs to display the model as intended, and that product is exactly what the author removed from this version of the shader code.

Using a UV image texture for the entire model, to the author’s credit, does eliminate the importance of any underlying vertex color data. But I was using elegant little assets that don’t rely on big bulky image textures. When I reintroduced the lovrDiffuseColor and vertexColor values into the new shader code, I had my sweet, sweet color back.
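For reference, here's roughly what my ambient-phase fragment shader looked like with those two values restored; it's the tutorial's version with the original multiplication put back in:

customFragment = [[
    uniform vec4 ambience;
    vec4 color(vec4 graphicsColor, sampler2D image, vec2 uv)
    {
        //object color, keeping material/vertex color for untextured assets
        vec4 baseColor = graphicsColor * lovrDiffuseColor * vertexColor * texture(image, uv);
        return baseColor * ambience;
    }
]]
shader:send('ambience', { 0.2, 0.2, 0.2, 1.0 })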

This tutorial should probably either add a note about keeping the lovrDiffuseColor and vertexColor values for untextured assets or, ideally, drop the optimization entirely for a developer's first day working with the framework's shader code. I submitted this suggestion to the official Slack group.

Update: Folks seemed receptive to this suggestion, and it sounds like the tutorial may be updated to skip the troublesome optimization.

The tutorial confirms that the scene should look considerably darker at this point. We're finally back on track.

Phase 2 – Diffusion

The theory behind the diffusion phase, in short, is that this change biases the lighting of each polygon based on how it's oriented relative to the light source. This means, among other things, that we'll be specifying the position and color of the light source for the first time.
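Concretely, that bias is the Lambert term from the Learn OpenGL material: the dot product of the surface normal and the direction toward the light, clamped at zero so faces pointing away from the light aren't lit at all. In the fragment shader below it amounts to two lines:

    float diff = max(dot(norm, lightDir), 0.0);
    vec4 diffuse = diff * liteColor;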

The implementation part of the phase, for me and any other vertex color users, consists of adding the next set of tutorial changes before bringing lovrDiffuseColor and vertexColor back into the fragment shader’s baseColor equation.

For simplicity, here’s most of the program, minus lines specific to my project and models:

function lovr.load()
  --shader code
  customVertex = [[
  out vec3 FragmentPos;
  out vec3 Normal;
  vec4 position(mat4 projection, mat4 transform, vec4 vertex)
  {
      Normal = lovrNormal * lovrNormalMatrix;
      FragmentPos = vec3(lovrModel * vertex);

      return projection * transform * vertex;
  }
  ]]

  customFragment = [[
  uniform vec4 ambience;
  uniform vec4 liteColor;
  uniform vec3 lightPos;
  in vec3 Normal;
  in vec3 FragmentPos;
  vec4 color(vec4 graphicsColor, sampler2D image, vec2 uv)
  {
      //diffuse
      vec3 norm = normalize(Normal);
      vec3 lightDir = normalize(lightPos - FragmentPos);
      float diff = max(dot(norm, lightDir), 0.0);
      vec4 diffuse = diff * liteColor;

      vec4 baseColor = graphicsColor * lovrDiffuseColor * vertexColor * texture(image, uv);
      return baseColor * (ambience + diffuse);
  }
  ]]
  shader = lovr.graphics.newShader(customVertex, customFragment, {})
end

function lovr.draw()
  lovr.graphics.setShader(shader)
  shader:send('ambience', { 0.2, 0.2, 0.2, 1.0 })
  shader:send('liteColor', {1.0, 1.0, 1.0, 1.0})
  shader:send('lightPos', {2.0, 5.0, 0.0})
  --I draw my models here
  lovr.graphics.setShader()
end

The result is the best-looking scene yet:

Phase 3 – Specularity

The specularity phase is all about detail lighting, adding shiny highlights where they're appropriate. In this last step, we're adding properties for the strength of that specular shine, a metallic quality, and the view position used to make the calculations accurate. The view position needs to be updated as the player or camera moves around, so it's the one thing we'll add to the update loop.
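The specular math mirrors the diffuse phase: reflect the light direction about the surface normal, compare that reflection to the view direction, then raise the result to the metallic exponent so that higher values produce a tighter, shinier highlight. In the fragment shader below, that's these three lines:

    vec3 reflectDir = reflect(-lightDir, norm);
    float spec = pow(max(dot(viewDir, reflectDir), 0.0), metallic);
    vec4 specular = specularStrength * spec * liteColor;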

I went ahead and added some new models and tweaked values closer to my liking for a final render for the day.

playerViewPos = {x=0,y=0.5,z=0}

function lovr.load()
  --I load my models here
  --I set my viewPose here
  lovr.graphics.setBackgroundColor(0.56, 0.87, 1)

  --shader code
  customVertex = [[
  out vec3 FragmentPos;
  out vec3 Normal;
  vec4 position(mat4 projection, mat4 transform, vec4 vertex)
  {
      Normal = lovrNormal * lovrNormalMatrix;
      FragmentPos = vec3(lovrModel * vertex);
      return projection * transform * vertex;
  }
  ]]
  customFragment = [[
  uniform vec4 ambience;
  uniform vec4 liteColor;
  uniform vec3 lightPos;
  in vec3 Normal;
  in vec3 FragmentPos;
  uniform vec3 viewPos;
  uniform float specularStrength;
  uniform int metallic;
  vec4 color(vec4 graphicsColor, sampler2D image, vec2 uv)
  {
      //diffuse
      vec3 norm = normalize(Normal);
      vec3 lightDir = normalize(lightPos - FragmentPos);
      float diff = max(dot(norm, lightDir), 0.0);
      vec4 diffuse = diff * liteColor;

      //specular
      vec3 viewDir = normalize(viewPos - FragmentPos);
      vec3 reflectDir = reflect(-lightDir, norm);
      float spec = pow(max(dot(viewDir, reflectDir), 0.0), metallic);
      vec4 specular = specularStrength * spec * liteColor;

      vec4 baseColor = graphicsColor * lovrDiffuseColor * vertexColor * texture(image, uv);
      return baseColor * (ambience + diffuse + specular);
  }
  ]]
  shader = lovr.graphics.newShader(customVertex, customFragment, {})
end

function lovr.draw()
  lovr.graphics.setShader(shader)
  shader:send('liteColor', {1.0, 1.0, 0.73, 1.0})
  shader:send('lightPos', {4, 4, 2})
  shader:send('ambience', {0.6, 0.6, 0.6, 1})
  shader:send('specularStrength', 0.1)
  shader:send('metallic', 8)
  modelTree:draw(0,0,-2)
  modelChest:draw(0.4,0,-1.6,1,-0.8,0,1,0)
  modelRockB:draw(-0.7,0,-1.6,0.75)
  modelGrassLarge:draw(0,0,-2)
  modelRockFlatGrass:draw(0,-1,-2,4,1.6,0,1,0)
  lovr.graphics.setShader()
end

function lovr.update()
  shader:send('viewPos', {playerViewPos.x,playerViewPos.y,playerViewPos.z})
end
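One note on that update loop: I'm feeding viewPos from a plain playerViewPos table for now. If you're rendering to a headset, a minimal sketch (assuming the headset module is enabled) would be to pull the head position each frame instead:

function lovr.update()
  -- keep the specular view position in sync with the headset's head pose
  local x, y, z = lovr.headset.getPosition()
  shader:send('viewPos', { x, y, z })
end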

Our test scene looks a lot more interesting with a solid lighting foundation implemented.

Still ahead

It's rare to find the most direct route through a difficulty like my vertex color problem, so I'm still proud of eventually getting to the other side of the issue and not letting it stop the project. The lighting setup is pretty basic, and it will probably take some learning to get the best out of my assets by lighting them just right (they deserve better). I'd love to get some proper shadows in, but a basic Google search suggests it's complicated, and I don't want to sink another day into that just yet. I'd much rather make the scene interactive!

In the next session, I’ll work on basic keyboard controls and movement, collision detection, and maybe some basic UI features. With that in place, it’s practically a game!

Update: The next post on implementing keyboard and mouse controls is available!
