Vulkan Volumetric Cloud Rendering

This project formed the basis of my final year dissertation "Implementing and Comparing Real-Time Volumetric and Billboarded Clouds". I implemented volumetric cloud rendering and instanced billboarded cloud rendering as part of the project. This page will focus on the volumetric implementation.

The repository for the project can be found at: FYP Cloud Rendering

There are a variety of variables the user can edit to change the appearance of the volumetric clouds, as can be seen in the demo.

Voxel Generation

The first step in rendering the volumetric clouds is the generation of a voxel grid. This solution represents the voxel grid as a 512x512x512 3D image in the VK_FORMAT_R8_UNORM format, storing a single 8-bit density channel per voxel to keep memory overhead as low as possible.

The voxel grid generation takes place in a compute shader and makes use of different types of noise to determine the density value of the voxel at any given point. One type of noise determines the baseline shape of the clouds, while another is used to 'carve away' at that shape in order to create detail. These two noises are combined to create the final result.
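
In the compute shader, the density image is bound with GLSL's r8 image format qualifier, matching VK_FORMAT_R8_UNORM. The declarations below are a minimal sketch of how the resources might look; the binding indices, workgroup size and exact layout of the uniform block are assumptions, while the resource names match the snippet that follows.

    #version 450

    // Workgroup size is an assumption (8x8x8 invocations per group).
    layout (local_size_x = 8, local_size_y = 8, local_size_z = 8) in;

    // Density voxel grid: a single 8-bit unorm channel (VK_FORMAT_R8_UNORM).
    layout (binding = 0, r8) uniform writeonly image3D densityTex;

    // Pre-generated tileable 3D noise textures.
    layout (binding = 1) uniform sampler3D shapeNoiseTex;
    layout (binding = 2) uniform sampler3D detailNoiseTex;

    // User-editable generation parameters (members inferred from the snippet below).
    layout (binding = 3) uniform VoxelGenInfo {
        float time;
        float cloudSpeed;
        float detailSpeed;
        float detailNoiseScale;
        float detailNoiseMultiplier;
        float densityMultiplier;
        vec4 shapeNoiseWeights;
        vec4 detailNoiseWeights;
    } voxelGenInfo;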

The exact approach used in my implementation can be seen below:

  
    // One compute invocation per voxel of the 3D density image.
    void main()
    {
        ivec3 pos = ivec3(gl_GlobalInvocationID.xyz);
        vec3 nPos = vec3(pos) / vec3(imageSize(densityTex)); // normalised [0,1] grid position

        float density = 0.0;
        float height = nPos.y;

        // Distance to the nearest face of the grid, used to fade the clouds out at the edges.
        vec3 edgeProximity = min(nPos, vec3(1.0) - nPos);
        float edgeDistance = min(min(edgeProximity.x, edgeProximity.z), edgeProximity.y);

        // Animate the shape and detail noise independently over time.
        vec3 shapeOffset = vec3(voxelGenInfo.time * voxelGenInfo.cloudSpeed);
        vec3 shapePos = nPos + shapeOffset;

        vec3 noiseOffset = vec3(voxelGenInfo.time * voxelGenInfo.detailSpeed);
        vec3 noisePos = nPos * voxelGenInfo.detailNoiseScale + noiseOffset;

        vec4 shapeNoise = texture(shapeNoiseTex, shapePos);
        vec4 detailNoise = texture(detailNoiseTex, noisePos);

        // Weighted combination of the noise channels, shaped by a height gradient.
        float fbm = dot(shapeNoise, normalize(voxelGenInfo.shapeNoiseWeights)) * heightMap(height);
        float detailFbm = dot(detailNoise, normalize(voxelGenInfo.detailNoiseWeights)) * (1.0 - heightMap(height));

        float cloudDensity = fbm;
        if (cloudDensity <= 0.0)
        {
            imageStore(densityTex, pos, vec4(density, 0.0, 0.0, 0.0));
        }
        else
        {
            // Carve the detail noise out of the base shape, biased towards thinner regions.
            density = cloudDensity - detailFbm * pow(1.0 - fbm, 3.0) * voxelGenInfo.detailNoiseMultiplier;
            density = pow(density, 2.0);
            density *= edgeDistance;

            density *= voxelGenInfo.densityMultiplier;

            imageStore(densityTex, pos, vec4(density, 0.0, 0.0, 0.0));
        }
    }
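
The heightMap() function used above is not shown in the snippet. Its role is to shape density along the vertical axis so clouds fade out at their base and top. The version below is a minimal sketch of a typical height-gradient curve and is an assumption; the exact falloff values in my implementation may differ.

    // Hypothetical height gradient: 0 at the bottom and top of the grid, 1 in between.
    float heightMap(float height)
    {
        float base = smoothstep(0.0, 0.2, height);        // fade in from the cloud base
        float top  = 1.0 - smoothstep(0.7, 1.0, height);  // fade out towards the cloud top
        return base * top;
    }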
  

Ray-Marching

Ray-marching is the method used to generate an image from the voxel grid. A ray is fired from the viewer's point of view into the scene. At each step along the ray, the density is sampled from the voxel grid, and the sample is used to update the accumulated transmittance and illumination values for that pixel. Once the raymarch has finished, the pixel's colour can be derived from those values.
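
Before the march begins, the ray's entry and exit distances against the voxel grid's bounding box are needed to bound the loop (tMin and stepMax in the snippet further below). One common way to obtain these is a slab test; the function below is a sketch under that assumption rather than the exact code used in my implementation.

    // Hypothetical ray/AABB slab test returning the entry and exit distances along the ray.
    vec2 intersectAABB(vec3 rayOrigin, vec3 rayDir, vec3 boxMin, vec3 boxMax)
    {
        vec3 invDir = 1.0 / rayDir;
        vec3 t0 = (boxMin - rayOrigin) * invDir;
        vec3 t1 = (boxMax - rayOrigin) * invDir;
        vec3 tNear = min(t0, t1);
        vec3 tFar  = max(t0, t1);
        float tEntry = max(max(tNear.x, tNear.y), tNear.z);
        float tExit  = min(min(tFar.x, tFar.y), tFar.z);
        return vec2(tEntry, tExit); // no hit if tEntry > tExit or tExit < 0
    }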

The method used can be seen below (the setup that computes the ray's starting distance, march bounds and step sizes is omitted):

  
    float I = 0.0;           // accumulated illumination
    float sunI = 0.0;        // illumination accumulated along the sun ray
    float transmit = 1.0;    // transmittance towards the viewer
    float sunTransmit = 1.0; // transmittance towards the sun

    while (tMin <= stepMax && I < 0.7 && transmit > 0.0)
    {
        tMin += stepSize;

        // Jitter the sample position to break banding up into noise.
        float jitter = (random((gl_FragCoord.xy) * voxelInfo.time - 0.5)) * stepSize;
        tMin += jitter;
        vec3 samplePos = rayOrigin + (rayDir * tMin);

        // Skip samples that fall outside the voxel grid.
        if (!(samplePos.x >= voxelGridMin.x && samplePos.x <= voxelGridMax.x &&
              samplePos.y >= voxelGridMin.y && samplePos.y <= voxelGridMax.y &&
              samplePos.z >= voxelGridMin.z && samplePos.z <= voxelGridMax.z)) continue;

        vec3 uvw = (samplePos - voxelGridMin) / (voxelGridMax - voxelGridMin);
        uvw = clamp(uvw, vec3(0.0), vec3(1.0));
        float density = texture(voxelBuffer, uvw).r * stepSize;

        if (density > 0.0)
        {
            // Secondary march towards the sun to estimate how much light reaches this sample.
            // sunStepMax bounds the sun march (the sun ray's exit distance from the grid).
            while (sunTMin <= sunStepMax && sunTransmit > 0.0)
            {
                sunTMin += sunStepSize;
                jitter = (random((gl_FragCoord.xy) * voxelInfo.time - 0.5)) * stepSize;
                sunTMin += jitter;
                vec3 sunSamplePos = samplePos + (toSun * sunTMin);

                if (!(sunSamplePos.x >= voxelGridMin.x && sunSamplePos.x <= voxelGridMax.x &&
                      sunSamplePos.y >= voxelGridMin.y && sunSamplePos.y <= voxelGridMax.y &&
                      sunSamplePos.z >= voxelGridMin.z && sunSamplePos.z <= voxelGridMax.z)) continue;

                uvw = (sunSamplePos - voxelGridMin) / (voxelGridMax - voxelGridMin);
                float sunDensity = texture(voxelBuffer, uvw).r * sunStepSize;

                sunTransmit *= beer(sunDensity);
                sunI += sunTransmit * powder(sunDensity) * phase;
            }

            // Accumulate in-scattered light and attenuate the transmittance.
            I += transmit * phase * powder(density);
            I += max((sunI * 0.05), 0.001);
            transmit *= (max((beer(density) + powder(density)), beer(density * 0.25) * 0.7) * (1.0 - voxelInfo.outScatterMultiplier));
            transmit *= sunTransmit;
        }
    }

    vec3 finalColor = (sunlightColor * I) + (backgroundColor * transmit);
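
The beer() and powder() helpers used above model how light is attenuated by the cloud: beer() is the Beer-Lambert transmittance term, and powder() approximates the darkening of in-scattering near the front of a cloud. The versions below are minimal sketches of the common forms; the exact constants in my implementation may differ. The phase value is likewise assumed to come from a phase function (such as Henyey-Greenstein) evaluated for the angle between the view ray and the sun direction.

    // Beer-Lambert law: transmittance falls off exponentially with optical depth.
    float beer(float density)
    {
        return exp(-density);
    }

    // 'Powder' term: darkens regions where little light has scattered in yet.
    float powder(float density)
    {
        return 1.0 - exp(-density * 2.0);
    }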

  

Measuring Results

A large part of this project involved benchmarking results. As such, I had to familiarise myself with a number of tools that I planned to use. These included NVIDIA Nsight, Optick Profiler and AMD's Radeon Developer Tool Suite.

Optick Profiler was used to analyse the CPU performance of the program for the purposes of my paper.

NVIDIA Nsight was used to analyse the GPU performance, both so that it could be analysed in its own right and so that it could be compared against the CPU performance.

AMD's suite of GPU tools was used to find out exact low-level details of the system the tests were run on. This information informed some of the analysis.

Who Did What?

Vulkan Base Project via Vulkan Guide

Tileable Volume Noise by Sebastien Hillaire

Cloud Textures by WickedInsignia

When Was it Made?

This project was worked on from October 2024 to February 2025.

What Went Well?

The project was well-scoped, as strict deadlines had to be met. Within that time, I was able to create a visually pleasing research artefact that allowed my research goals to be completed.

What Could Be Better?

Reprojection could have been implemented in the manner it is in Horizon: Zero Dawn. To keep the solution performant, the fidelity of the end result is not as good as it could be. Implementing reprojection would allow for greater fidelity, as more time could be spent on a more rigorous raymarch.

 

 
