D3D11 Project

This project involved creating a basic rendering pipeline using D3D11 and was completed as part of Staffordshire University's 'Real Time Rendering' module. The pipeline features bloom, water shading using Gerstner waves, transparency, terrain generation, fog, model and texture loading, and more.

Bloom

I implemented bloom in my rendering pipeline using multiple post-processing render targets alongside three shaders. The first shader samples the texture of the initially rendered scene, checks the luminance of each pixel, and sets any pixel that is not bright enough to black. The second shader performs a Gaussian blur on the texture generated by the first; the blur can be performed either horizontally or vertically in any one pass. I set up the blurring so that the texture is passed back and forth between the horizontal and vertical blurs until it has been blurred a total of 20 times, as I found this produced an adequate result without slowing rendering down too much. The third shader is responsible for a variety of post-process effects but, importantly, it blends the blurred texture with the original render to create the bloom effect.
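
As an illustration of the first pass, below is a minimal luminance-threshold pixel shader sketched in HLSL. The register slots, the Rec. 709 luma weights, and the 0.8 threshold are assumptions for the example rather than the project's actual shader.

    Texture2D SceneTexture : register(t0);
    SamplerState LinearSampler : register(s0);

    //bright pass: keep only pixels above a luminance threshold
    float4 PS_BrightPass(float4 pos : SV_POSITION, float2 uv : TEXCOORD0) : SV_TARGET
    {
        float4 colour = SceneTexture.Sample(LinearSampler, uv);
        //Rec. 709 luma weights approximate perceived brightness
        float luminance = dot(colour.rgb, float3(0.2126f, 0.7152f, 0.0722f));
        //anything below the (assumed) threshold is cut to black
        return luminance > 0.8f ? colour : float4(0.0f, 0.0f, 0.0f, 1.0f);
    }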

Terrain Generation

My project features terrain generation via a heightmap. A flat triangle grid is generated, with the height, width, depth, and the number of rows and columns all configurable. When the terrain is drawn it uses a separate shader, which uses the world height of the vertex it is drawing to decide which textures to sample from, and it blends smoothly across each texture.
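
As a sketch of that height-based blending, the HLSL below picks between three layers using the vertex's world height. The specific textures, register slots, and height bands are placeholder assumptions rather than the project's actual values.

    Texture2D GrassTexture : register(t0);
    Texture2D RockTexture : register(t1);
    Texture2D SnowTexture : register(t2);
    SamplerState LinearSampler : register(s0);

    //blend terrain layers based on the interpolated world height
    float4 PS_Terrain(float4 pos : SV_POSITION, float2 uv : TEXCOORD0, float worldHeight : TEXCOORD1) : SV_TARGET
    {
        float4 grass = GrassTexture.Sample(LinearSampler, uv);
        float4 rock = RockTexture.Sample(LinearSampler, uv);
        float4 snow = SnowTexture.Sample(LinearSampler, uv);
        //smoothstep gives a soft transition band between each pair of layers
        float grassToRock = smoothstep(5.0f, 10.0f, worldHeight);
        float rockToSnow = smoothstep(15.0f, 20.0f, worldHeight);
        return lerp(lerp(grass, rock, grassToRock), snow, rockToSnow);
    }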

Water Shading

After completing my terrain generation, I decided that I wanted to re-use some of the same principles to generate a flat plane mesh of triangles which I would stylise using a water shader. The plane mesh was easy enough to generate, involving restructuring the terrain-generation code to cut out the actual deformation. This plane is then rendered with a unique water shader which uses Gerstner waves in the vertex shader to simulate realistic waves. I used two of these wave functions for the water in my game, but more complex water would use more. In the pixel shader, I sample a Voronoi noise texture and make use of the Schlick Fresnel approximation to create the look of basic stylised water.
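
To give a flavour of the vertex work, below is a sketch of a single Gerstner wave function in HLSL, following the standard formulation, plus a Schlick Fresnel helper for the pixel shader; the packing of the wave parameters and the function names are illustrative rather than my exact code. The vertex shader sums the offset returned for each wave, so two calls with different directions and wavelengths reproduce the setup described above.

    //one Gerstner wave, packed as (direction.x, direction.y, steepness, wavelength)
    float3 GerstnerWave(float4 wave, float3 position, float time)
    {
        float k = 2.0f * 3.14159265f / wave.w;  //wavenumber from wavelength
        float c = sqrt(9.8f / k);               //phase speed from gravity
        float2 d = normalize(wave.xy);          //travel direction
        float f = k * (dot(d, position.xz) - c * time);
        float a = wave.z / k;                   //amplitude derived from steepness
        //vertices move in circles: horizontal sharpening plus vertical lift
        return float3(d.x * a * cos(f), a * sin(f), d.y * a * cos(f));
    }

    //Schlick's approximation of the Fresnel term
    float SchlickFresnel(float3 normal, float3 viewDir, float f0)
    {
        return f0 + (1.0f - f0) * pow(1.0f - saturate(dot(normal, viewDir)), 5.0f);
    }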

Transparency

To support transparency, I needed to order transparent objects by their distance from the camera. My solution uses a JSON file to store the scene data, which is then loaded into a scene graph. Each time the draw function is called, the scene graph is traversed: opaque objects are drawn immediately, while transparent objects are collected so they can be rendered after every other object. The collected objects are then sorted back to front by distance from the camera every draw call and rendered in that order, avoiding the common blending issues with transparency. The code below shows the traversal, followed by the second pass (shown here as DrawTransparentObjects) which sorts and draws the transparent objects.

  
    void DX11Framework::DrawSceneGraph(SceneNode& node, ID3D11DeviceContext* immediateContext, ConstantBuffer cbData, ID3D11Buffer* constantBuffer, ID3D11VertexShader* vertexShader, ID3D11PixelShader* pixelShader)
    {
        //draw objects normally if not transparent
        if (node.objectData.GetName() != "Root")
        {
            //objects with a zero blend channel are treated as opaque and drawn immediately
            if (node.objectData.GetTransparency().x == 0.0f || node.objectData.GetTransparency().y == 0.0f || node.objectData.GetTransparency().z == 0.0f)
            {
                immediateContext->OMSetBlendState(nullptr, nullptr, 0xffffffff);
                node.objectData.Draw(immediateContext, cbData, constantBuffer, vertexShader, pixelShader);
            }
            else
            {
                m_TransparentObjects.push_back(node.objectData);
            }
        }
        //call recursively for children of current node
        for (int i = 0; i < node.children.size(); i++)
        {
            DrawSceneGraph(node.children[i], immediateContext, cbData, constantBuffer, vertexShader, pixelShader);
        }
    }

    //second pass: sort the collected transparent objects, then draw them back to front
    void DX11Framework::DrawTransparentObjects(ID3D11DeviceContext* immediateContext, ConstantBuffer cbData, ID3D11Buffer* constantBuffer, ID3D11VertexShader* vertexShader, ID3D11PixelShader* pixelShader)
    {
        //recover the camera position from the inverse of the view matrix
        XMMATRIX cameraWorld = XMMatrixInverse(nullptr, XMLoadFloat4x4(m_Camera->GetView()));
        XMVECTOR cameraPosition = cameraWorld.r[3];
        std::map<int, float> goDistanceMap;
        //calculate distance of object from camera 
        for (int i = 0; i < m_TransparentObjects.size(); i++)
        {
            XMMATRIX object = XMLoadFloat4x4(m_TransparentObjects[i].GetWorld());
            XMVECTOR objectPos = object.r[3];
            XMVECTOR distance = objectPos - cameraPosition;
            float distanceScalar = XMVectorGetX(XMVector3Length(distance));
            goDistanceMap[i] = distanceScalar;
        }
        //bubble sort objects by distance so the farthest are drawn first (back to front)
        for (int i = 0; i < m_TransparentObjects.size(); i++)
        {
            for (int j = 0; j < m_TransparentObjects.size() - i - 1; j++)
            {
                float distanceJ = goDistanceMap[j];
                float distanceJPlus1 = goDistanceMap[j + 1];
                if (distanceJ < distanceJPlus1)
                {
                    std::swap(m_TransparentObjects[j], m_TransparentObjects[j + 1]);
                    std::swap(goDistanceMap[j], goDistanceMap[j + 1]); 
                }
            }
        }
        //draw transparent objects
        for (int i = 0; i < m_TransparentObjects.size(); i++)
        {
            FLOAT blendFactor[4] = { m_TransparentObjects[i].GetTransparency().x, m_TransparentObjects[i].GetTransparency().y,
                m_TransparentObjects[i].GetTransparency().z, m_TransparentObjects[i].GetTransparency().w };
            immediateContext->OMSetBlendState(_transparency, blendFactor, 0xffffffff);
            m_TransparentObjects[i].Draw(immediateContext, cbData, constantBuffer, vertexShader, pixelShader);
        }
        m_TransparentObjects.clear();
    }

  

Debugging

This project found me using a GPU profiler for the first time which I found to be extremely useful and intriguing. I used RenderDoc in order to check that correct data was being sent in constant buffers on certain draw calls, what textures were being created and sampled and other things. I found it especially useful when trying to get the bloom post-process effect working due to the back and forth nature of passing textures between render targets.

Who Did What?

Original framework by Staffordshire University

ACES Filmic Tone Mapping Curve by Krzysztof Narkowicz

When Was it Made?

This project was worked on from October to December 2023.

What Went Well?

I believe that I was able to implement many features to a good standard given the time frame. I am particularly proud of implementing bloom, as it familiarised me with passing textures between render targets and with handling a more complex draw cycle, where render targets render to textures instead of to the framebuffer.

What Could Be Better?

The way light rendering is set up could be better. In a commercial engine, each light would typically result in another draw call, because the effect of each light has to be calculated separately. In my setup there is a single lighting pass which pushes all the lights into the constant buffer and calculates all the lighting at once. This means there is a hardcoded limit on the number of lights that can be passed through, since a fixed-size array is needed. I have set this limit to 100, which also means I am reserving space for 100 lights even when far fewer are passed through.
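
That layout looks roughly like the HLSL sketch below; the member names and register slot are assumptions, but the fixed-size array is the key point: the constant buffer reserves space for all 100 lights whether or not they are used.

    #define MAX_LIGHTS 100

    struct Light
    {
        float4 Position;
        float4 Colour;
    };

    cbuffer LightBuffer : register(b1)
    {
        //space for 100 lights is reserved even if only a few are valid
        Light Lights[MAX_LIGHTS];
        int LightCount;   //number of entries actually in use
        float3 Padding;   //pad the final vector to a 16-byte boundary
    };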

 
