Archive for the ‘Real-Time Rendering’ Category

Terrain

Posted: October 29, 2011 in Real-Time Rendering

Terrain rendering done through multiple draw calls by dynamically stitching together patches. The pictures shown use a 4096×4096 heightmap, giving around 6 square miles of exploration at a scale of 1 meter per game unit. The shader accepts up to 16 layers, allowing different textures to be painted and blended. The patches are processed in the vertex shader (displacement, normals, tangents), which makes the terrain completely dynamic; it could even be made destructible by using a dynamic heightmap. The terrain is quite cheap in terms of memory as only a handful of patches at different LODs (2^4, 2^5, 2^6, 2^7, 2^8) are stored and reused. The real cost comes from storing the heightmap, which can be fairly large:

4096 × 4096 ≈ 16 MB (6 square miles)
8192 × 8192 ≈ 67 MB (25 square miles)
16384 × 16384 ≈ 268 MB (100 square miles)

However, these need not be loaded all at once; tiles can be streamed in when required and released once they fall out of view.
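As a rough sketch of the vertex-shader displacement described above (the resource and constant names here are illustrative assumptions, not the engine's actual ones):

    // Sketch of heightmap displacement in the vertex shader; a flat grid
    // patch is displaced vertically and a normal is derived on the fly,
    // so the terrain updates automatically if the heightmap changes.
    Texture2D    HeightMap   : register(t0);
    SamplerState LinearClamp : register(s0);

    cbuffer cbPatch : register(b0)
    {
        float4x4 gWorldViewProj;
        float2   gPatchOffset;  // patch position in heightmap UV space
        float    gPatchScale;   // patch size in UV space
        float    gHeightScale;  // world-space height range
    };

    struct VSIn  { float2 gridPos : POSITION; };  // flat 2D grid vertex
    struct VSOut { float4 posH : SV_POSITION; float3 normal : NORMAL; };

    VSOut VS(VSIn vin)
    {
        VSOut vout;
        float2 uv = gPatchOffset + vin.gridPos * gPatchScale;

        // Displace the flat grid vertically by the heightmap value.
        float h = HeightMap.SampleLevel(LinearClamp, uv, 0).r * gHeightScale;
        float3 posW = float3(uv.x * 4096.0, h, uv.y * 4096.0);  // 1 unit = 1 m

        // Approximate the normal with finite differences (scale omitted).
        float texel = 1.0 / 4096.0;
        float hx = HeightMap.SampleLevel(LinearClamp, uv + float2(texel, 0), 0).r * gHeightScale;
        float hz = HeightMap.SampleLevel(LinearClamp, uv + float2(0, texel), 0).r * gHeightScale;
        vout.normal = normalize(float3(h - hx, 1.0, h - hz));

        vout.posH = mul(float4(posW, 1.0), gWorldViewProj);
        return vout;
    }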

Particle Editor

Posted: August 29, 2011 in General, Real-Time Rendering

This is my Particle Editor, which was inspired by Unreal’s Cascade, so I call it ‘AssCade’. It’s got a lot of the same functionality, including being able to graph all sorts of effects. As you can see from the images, each particle system is made from a series of emitters, which are in turn affected by modifiers. All the computation is done on the CPU, as that made it easier to do effects on sibling emitters. Once all the emitters have been processed, they are sent to the GPU to render, where each particle consists of a single vertex that is expanded into a quad in the geometry shader. Rather than have each emitter manage a vertex buffer, I chose the more optimised route of having each particle system manage the vertex buffer used by its child emitters. This requires only a single lock per particle system, as opposed to one for each emitter, which can really add up when effects usually have at least 3–4 emitters.
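A minimal sketch of that point-to-quad expansion (the input layout and constant names are assumptions):

    // Expands each particle point into a camera-facing quad, emitted as a
    // four-vertex triangle strip.
    cbuffer cbCamera : register(b0)
    {
        float4x4 gViewProj;
        float3   gCamRight;
        float3   gCamUp;
    };

    struct GSIn  { float3 posW : POSITION; float size : SIZE; float4 colour : COLOR; };
    struct GSOut { float4 posH : SV_POSITION; float2 uv : TEXCOORD0; float4 colour : COLOR; };

    [maxvertexcount(4)]
    void GS(point GSIn gin[1], inout TriangleStream<GSOut> stream)
    {
        float halfSize = gin[0].size * 0.5;
        // Corner offsets along the camera's right/up axes so the quad
        // always faces the viewer.
        float2 corners[4] = { float2(-1,-1), float2(-1,1), float2(1,-1), float2(1,1) };
        float2 uvs[4]     = { float2(0,1),   float2(0,0),  float2(1,1),  float2(1,0) };

        [unroll]
        for (int i = 0; i < 4; ++i)
        {
            GSOut v;
            float3 posW = gin[0].posW
                        + gCamRight * corners[i].x * halfSize
                        + gCamUp    * corners[i].y * halfSize;
            v.posH   = mul(float4(posW, 1.0), gViewProj);
            v.uv     = uvs[i];
            v.colour = gin[0].colour;
            stream.Append(v);
        }
    }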

I am toying with the idea of having all the computation performed on the GPU to increase performance, by compiling/creating an HLSL effect file from the particle system nodes, similar to what Epic does with their material editor, but I haven’t got around to that yet.

http://www.youtube.com/watch_popup?v=OT76CpuqoT0&vq=hd1080

Shader Editor

Posted: July 29, 2011 in General, Real-Time Rendering

This is a shader editor application I wrote that allows you to write and then test your shaders in the same editor. You write the shader in HLSL on the right and see it rendered to a mesh on the left. The editor has a bunch of default meshes such as cubes, spheres and planes, but also allows you to import your own using my 3dsmax exporter.

I’ve written my Effect library to compile shaders for multiple vertex formats, so if you have a shader you want used on different formats, you can use preprocessor techniques to break the HLSL code into a series of #if/#else blocks for each format your shader supports. The shader is then compiled once for each format. When a mesh is loaded and an Effect is to be applied, its vertex format is compared against the compiled versions of the effect, and if a match is found (i.e. if that shader was compiled for that particular vertex format), then that specialised shader is applied.

The editor is pretty handy as it also reports back any compilation errors from fxc.exe in a text window at the bottom of the editor, and includes an assembler view, which lets you see the code in assembly. Textures are supported by simply specifying the location of the file within the texture definition using annotations; this is made easier by the snippet tool for common code.

Once the effect compiles and produces the correct results, the editor allows you to make a material from that effect. A material within my engine defines an effect to be applied and the effect parameters (such as textures, samplers and material values), which are either created in code or loaded/saved as an XML file. The editor creates a material file by parsing the effect file for values the user can set: textures/samplers and any data within “cbMaterial”, the material constant buffer. Again, using annotations within the shader, the author can specify how the user sees this data within the editor. So a “float4 DiffuseColour” can be given the annotation “GUI_COLOUR”, which presents the user with a colour selection tool, and a “float SpecularPower” can be given the annotation “GUI_SLIDER MIN[0] MAX[100]”, which presents a slider for setting floating point values. This list of values is then presented to the user through GUI widgets, which they can use to set values in a convenient way. As settings are changed, the results are again seen in the view window. Once the user is happy, they can export the material to an XML file. The Effect is exported separately and is only referenced by the material.
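To make the conventions concrete, here is a hedged sketch of what such an effect might look like; the GUI_COLOUR / GUI_SLIDER strings and cbMaterial come from the post itself, while the annotation keys, macro names and the rest of the layout are guesses:

    // Vertex-format specialisation: one shader source compiled once per
    // supported format, with the differences handled by the preprocessor.
    struct VSInput
    {
        float3 Position : POSITION;
    #if HAS_NORMAL
        float3 Normal   : NORMAL;
    #endif
    #if HAS_TEXCOORD
        float2 UV       : TEXCOORD0;
    #endif
    };

    // The material constant buffer; the editor parses these values and
    // the annotations decide which GUI widget each one gets.
    cbuffer cbMaterial
    {
        float4 DiffuseColour < string Widget = "GUI_COLOUR"; >;                 // colour picker
        float  SpecularPower < string Widget = "GUI_SLIDER MIN[0] MAX[100]"; >; // slider
    };

    // Texture location given through an annotation so the editor can load it.
    Texture2D DiffuseMap < string File = "textures/example_diffuse.dds"; >;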

Ocean Shader

Posted: December 1, 2010 in Real-Time Rendering

This is a real-time ocean shader I’ve been working on. The bumpiness comes from both vertex displacement and normal mapping, where the vertices are displaced vertically on a grid that is projected in view-frustum space. This way, no buffers need to be updated; the vertices are simply projected in the vertex shader so that every vertex lands inside the view and none are wasted.
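A hedged sketch of that projection idea (constant names and the displacement source are assumptions; the real shader’s wave function may differ):

    // Projected-grid ocean: the grid lives in clip space, each vertex is
    // unprojected onto the water plane, then displaced vertically.
    cbuffer cbOcean : register(b0)
    {
        float4x4 gViewProj;
        float4x4 gInvViewProj;  // inverse camera view-projection
        float    gTime;
    };

    Texture2D    WaveHeight : register(t0);
    SamplerState LinearWrap : register(s0);

    float4 VS(float2 gridPos : POSITION) : SV_POSITION
    {
        // Unproject the grid vertex onto the near and far planes...
        float4 nearP = mul(float4(gridPos, 0.0, 1.0), gInvViewProj);
        float4 farP  = mul(float4(gridPos, 1.0, 1.0), gInvViewProj);
        nearP /= nearP.w;
        farP  /= farP.w;

        // ...and intersect that ray with the y = 0 water plane (the case
        // where the ray misses the plane is ignored in this sketch).
        float t = nearP.y / (nearP.y - farP.y);
        float3 posW = lerp(nearP.xyz, farP.xyz, t);

        // Vertical displacement; a scrolling height texture stands in for
        // whatever wave function is actually used.
        float2 uv = posW.xz * 0.01 + gTime * 0.02;
        posW.y += WaveHeight.SampleLevel(LinearWrap, uv, 0).r;

        return mul(float4(posW, 1.0), gViewProj);
    }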

Model Viewer

Posted: October 29, 2010 in General, Real-Time Rendering

This is a model Viewer I made to visualize Meshes and their materials that are exported using my 3dsMax plugin.

Radiosity

Posted: October 29, 2010 in Real-Time Rendering

This is an approximate radiosity effect I was trying out. Approximate in that it renders from each vertex’s point of view rather than from each patch’s, and it doesn’t render the half-sized hemicube side faces either, which seems to cause artifacts in the corners; not a bad result though. The app runs in real time (125fps in debug mode) and works by first rendering the view from each vertex to a render target, then using that target in a second stage to light the vertex itself, and the process repeats. The viewport rendered from each vertex’s point of view is actually very small (only 32×32 pixels in the images below, which you’ll need to zoom in quite far to see), and I use a 1024×1024 render target to store them all in a grid. I also included some debug functionality that allows you to view the scene from each vertex (shown in the last few images), which was handy. The scene used here is very low poly; you can see the wireframe in the last few images.
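To make the gather step concrete, here is a hedged sketch: each vertex’s 32×32 tile inside the 1024×1024 atlas is averaged to give that vertex’s bounced light for the next iteration. The names, and the flat average with no cosine weighting, are simplifying assumptions:

    Texture2D VertexViews : register(t0);  // 1024x1024 atlas of vertex views

    #define TILE_SIZE     32
    #define ATLAS_SIZE    1024
    #define TILES_PER_ROW (ATLAS_SIZE / TILE_SIZE)

    float3 GatherVertexLight(uint vertexIndex)
    {
        // Locate this vertex's tile within the atlas grid.
        uint2 tile = uint2(vertexIndex % TILES_PER_ROW,
                           vertexIndex / TILES_PER_ROW);
        float3 sum = 0;
        for (uint y = 0; y < TILE_SIZE; ++y)
            for (uint x = 0; x < TILE_SIZE; ++x)
            {
                uint2 texel = tile * TILE_SIZE + uint2(x, y);
                sum += VertexViews.Load(int3(texel, 0)).rgb;
            }
        // The average of everything the vertex "sees" becomes its light
        // for the next iteration.
        return sum / (TILE_SIZE * TILE_SIZE);
    }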

This first image shows the end effect.

This second image shows the views from the vertices from one iteration.

These final images show the view from several vertices, which made debugging easier.

Atmospheric scattering technique shown by Sean O’Neil in his GPU Gems 2 article.

FPS Images

Posted: August 29, 2010 in Real-Time Rendering

These are some pictures of an old FPS game I was working on. Unfortunately I lost all the code 😦

Rendering Techniques

Posted: July 29, 2010 in Real-Time Rendering

This demo shows three common rendering techniques and how they vary in performance: Forward Multi-Light, Forward Multi-Pass and Deferred Rendering. The scene comprises a collection of spheres and 16 point lights.

Multi-Pass Single Light = ~64FPS
Multi-Light Single Pass = ~200FPS
Deferred = ~300FPS

Multi-pass rendering is hugely inefficient as you have to redraw everything N times, where N is the number of lights. Multi-light rendering is more efficient as you only draw things once, but you’re lighting the geometry as it’s being drawn to the back buffer; this isn’t great as you’re probably lighting a lot of geometry that ends up behind other opaque geometry, wasting processing time. This could be solved by first rendering a Z-pass and only then lighting the geometry that will be in front, but then you’re back to drawing things more than once. Deferred rendering is probably the most efficient in terms of processing, as the lighting is performed in a second pass over only the affected area and the geometry is only drawn once, but it can be both memory and bandwidth intensive when you consider the “baggage” that needs to be stored and accessed.
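For reference, a sketch of the multi-light single-pass shading loop (the light data layout is an assumption):

    // All 16 point lights evaluated in a single pixel-shader loop, so the
    // geometry is only drawn once.
    #define MAX_LIGHTS 16

    struct PointLight { float3 posW; float radius; float3 colour; float pad; };

    cbuffer cbLights : register(b1)
    {
        PointLight gLights[MAX_LIGHTS];
        uint       gLightCount;
    };

    float3 ShadePixel(float3 posW, float3 normalW, float3 albedo)
    {
        float3 result = 0;
        for (uint i = 0; i < gLightCount; ++i)
        {
            float3 toLight = gLights[i].posW - posW;
            float  dist    = length(toLight);
            float  atten   = saturate(1.0 - dist / gLights[i].radius);
            float  ndotl   = saturate(dot(normalW, toLight / dist));
            result += albedo * gLights[i].colour * ndotl * atten;
        }
        return result;
    }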

Using Gildor’s UE Viewer, I was able to extract Unreal meshes and their animations and then export them from 3dsMax using my Exporter plugin.

Skeletal animation exported using my 3dsmax plugin.

Noise

Posted: April 29, 2010 in Real-Time Rendering

Noise generation on the GPU (Marble, Turbulence, Fractal, Perlin Noise, QuickFractal, QuickNoise).

You can get some really neat and natural-looking effects by applying lots of layers of noise with varying attributes. When applied procedurally, the possibilities are endless. Its main use is to break up repetitive sequences, such as tiled textures, which look horrible once the repetition becomes obvious. Unfortunately it’s really not used today as much as it should be, as getting nice-looking noise usually requires a lot of blended layers, and this is still expensive even on modern GPUs. Most games I see still use a few blended pre-made noise patterns. However, there have been a number of optimizations, such as this one, which lowers the instruction count and texture fetch count significantly. Maybe one day there will be noise functions implemented in hardware.
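As an illustration of the layering idea, a sketch of fractal noise in HLSL; the cheap hash-based value noise below is a stand-in for whichever base noise (Perlin etc.) is actually used:

    // Cheap hash-based value noise used as the base layer.
    float hash(float2 p)
    {
        return frac(sin(dot(p, float2(127.1, 311.7))) * 43758.5453);
    }

    float noise2D(float2 p)
    {
        float2 i = floor(p);
        float2 f = frac(p);
        float2 u = f * f * (3.0 - 2.0 * f);  // smooth fade curve
        return lerp(lerp(hash(i),                hash(i + float2(1, 0)), u.x),
                    lerp(hash(i + float2(0, 1)), hash(i + float2(1, 1)), u.x),
                    u.y);
    }

    // Fractal layering: each octave doubles the frequency and halves the
    // amplitude, which is what gives the natural, detailed look.
    float fbm(float2 p, int octaves)
    {
        float value = 0.0, amplitude = 0.5, frequency = 1.0;
        for (int i = 0; i < octaves; ++i)
        {
            value     += amplitude * noise2D(p * frequency);
            frequency *= 2.0;
            amplitude *= 0.5;
        }
        return value;
    }

    // A marble-like pattern: a sine wave warped by the fractal sum.
    float marble(float2 p)
    {
        return sin(p.x * 6.0 + 4.0 * fbm(p, 5));
    }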

Screen Effects

Posted: October 29, 2009 in Real-Time Rendering

Some screen effects I was playing with that fade in and out and are supposed to show the character’s state: freezing, freezing and injured, and infected.

Texture Viewer

Posted: October 29, 2009 in General, Real-Time Rendering

This was just for fun really. I wanted to make a really fast Texture Viewer like the one in the Unreal Editor that you can move and scale around. I think it came out quite well 🙂

Refraction

Posted: October 29, 2009 in Real-Time Rendering

Refraction.

Text Rendering

Posted: October 29, 2009 in Real-Time Rendering

This is my new Text rendering component, which has the following features:

  • Left, Right and Centre Justification
  • Manual positioning or Grid snapping
  • Per letter colouring and transparency
  • Background colouring and texturing
  • Default or custom font bitmaps
  • Rotation

The data that changes per frame is updated by filling a dynamic vertex buffer with one point per letter. These points are then expanded into quads in the geometry shader.
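For illustration, a hypothetical sketch of the per-letter point vertex (field names and layout are assumptions):

    // Everything the geometry shader needs to expand one glyph into a
    // textured, coloured quad; the expansion itself works much like the
    // particle quad expansion sketched earlier.
    struct LetterVertex
    {
        float2 posScreen : POSITION;   // top-left corner in screen space
        float2 size      : SIZE;       // glyph width/height in pixels
        float4 uvRect    : TEXCOORD0;  // (u0, v0, u1, v1) into the font bitmap
        float4 colour    : COLOR;      // per-letter colour and transparency
    };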

Normal Mapping

Posted: October 29, 2009 in Real-Time Rendering

Normal mapping on models exported from 3ds Max using my exporter.

Water Ripples

Posted: October 29, 2009 in Real-Time Rendering

This is an implementation of the technique I read about here.

Pipe Water Particle Effect.

http://www.youtube.com/watch_popup?v=ZQPlTU2XxcQ&vq=hd720

Ocean Rendering

Posted: October 29, 2009 in Real-Time Rendering

Been working on my ocean shader recently. It’s very basic at the moment, and simply uses normal maps for shading and height maps for depth. The mesh itself is a radial clump of vertices that follows the camera around, where the vertices move further apart with distance from the camera. I use procedural noise for the normal mapping and depth-map render targets for the depth of the water, so it has no static dependencies on the rest of the environment. I still want to add:

  • A better merge of the shallow water and the terrain, maybe something like froth as Crytek does
  • Some kind of wave system for the shallow water
  • Splash particles effect for interactions with the water
  • An underwater view
  • Caustics and GodRays