Archive for the ‘Real-Time Rendering’ Category

Terrain

Posted: October 29, 2011 in Real-Time Rendering

Terrain rendering done with multiple draw calls by dynamically stitching together patches. The pictures shown use a 4096×4096 heightmap, giving around 6 square miles of exploration at 1 meter per game unit. The shader accepts up to 16 layers, allowing different textures to be painted on and blended. The patches are processed in the vertex shader (displacement, normals, tangents), which makes the terrain completely dynamic; it could even be made destructible by using a dynamic heightmap. The terrain is quite cheap in terms of memory, as only the different LOD patches (2^4, 2^5, 2^6, 2^7, 2^8) are stored and reused. The real cost comes from storing the heightmap, which can be fairly large:

4096 × 4096 ≈ 16 MB (6 square miles)
8192 × 8192 ≈ 67 MB (25 square miles)
16384 × 16384 ≈ 268 MB (100 square miles)

However, these need not all be loaded at once; they can be streamed in when required, releasing the out-of-view ones.
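As a rough illustration of the patch processing described above, here is a minimal sketch of the displacement part of such a vertex shader. The constant-buffer layout and names (PatchOffset, PatchScale, HeightScale) are my assumptions, not the actual engine's:

```hlsl
Texture2D HeightMap : register(t0);
SamplerState LinearClamp : register(s0);

cbuffer cbPatch : register(b0)
{
    float4x4 WorldViewProj;
    float2   PatchOffset;   // patch position in world XZ (illustrative)
    float    PatchScale;    // patch size in world units
    float    HeightScale;   // vertical scale applied to height samples
};

struct VS_IN  { float2 GridPos : POSITION; };  // flat 2D grid vertex, shared/reused per LOD
struct VS_OUT { float4 Pos : SV_POSITION; float3 Normal : NORMAL; };

VS_OUT TerrainVS(VS_IN input)
{
    VS_OUT o;

    // Place the shared LOD grid patch into the world.
    float2 worldXZ = PatchOffset + input.GridPos * PatchScale;

    // Displace vertically from the heightmap (4096x4096, 1m per texel).
    float2 uv = worldXZ / 4096.0f;
    float height = HeightMap.SampleLevel(LinearClamp, uv, 0).r * HeightScale;

    // Derive a normal from neighbouring height samples (central differences),
    // so normals stay correct when the heightmap changes.
    float texel = 1.0f / 4096.0f;
    float hL = HeightMap.SampleLevel(LinearClamp, uv - float2(texel, 0), 0).r * HeightScale;
    float hR = HeightMap.SampleLevel(LinearClamp, uv + float2(texel, 0), 0).r * HeightScale;
    float hD = HeightMap.SampleLevel(LinearClamp, uv - float2(0, texel), 0).r * HeightScale;
    float hU = HeightMap.SampleLevel(LinearClamp, uv + float2(0, texel), 0).r * HeightScale;
    o.Normal = normalize(float3(hL - hR, 2.0f, hD - hU));

    o.Pos = mul(float4(worldXZ.x, height, worldXZ.y, 1.0f), WorldViewProj);
    return o;
}
```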


Particle Editor

Posted: August 29, 2011 in General, Real-Time Rendering

This is my Particle Editor, which was inspired by Unreal's Cascade; I call it ‘AssCade’. It's got a lot of the same functionality, including being able to graph all sorts of effects. As you can see from the images, each particle system is made from a series of emitters, which are in turn affected by modifiers. All the computation is done on the CPU, as that made it easier to do effects on sibling emitters. Once all the emitters have been processed, they are sent to the GPU to render, where each particle consists of a single vertex that is expanded into a quad in the geometry shader. Rather than have each emitter manage a vertex buffer, I chose the more optimised route of having each particle system manage the vertex buffer used by its child emitters. This requires only a single lock per particle system, as opposed to one per emitter, which can really add up when effects usually have at least 3-4 emitters.
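For illustration, here is a minimal sketch of the point-to-quad expansion in the geometry shader. The struct layouts and camera constants are assumptions, not the editor's actual code:

```hlsl
cbuffer cbCamera : register(b0)
{
    float4x4 ViewProj;
    float3   CamRight;  float Pad0;   // camera basis for billboarding
    float3   CamUp;     float Pad1;
};

struct GS_IN  { float3 Pos : POSITION; float Size : TEXCOORD0; float4 Colour : COLOR0; };
struct GS_OUT { float4 Pos : SV_POSITION; float2 UV : TEXCOORD0; float4 Colour : COLOR0; };

// Each particle arrives as a single vertex and leaves as a camera-facing quad.
[maxvertexcount(4)]
void ParticleGS(point GS_IN input[1], inout TriangleStream<GS_OUT> stream)
{
    // Strip order: top-left, top-right, bottom-left, bottom-right.
    float2 corners[4] = { float2(-1, 1), float2(1, 1), float2(-1, -1), float2(1, -1) };

    [unroll]
    for (int i = 0; i < 4; ++i)
    {
        GS_OUT o;
        float3 worldPos = input[0].Pos
                        + (CamRight * corners[i].x + CamUp * corners[i].y) * input[0].Size;
        o.Pos    = mul(float4(worldPos, 1.0f), ViewProj);
        o.UV     = corners[i] * 0.5f + 0.5f;
        o.Colour = input[0].Colour;
        stream.Append(o);
    }
}
```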

I am toying around with the idea of having all the computation performed on the GPU to increase performance, by compiling/creating an HLSL effect file from the particle system nodes, similar to what Epic does with their material editor, but I haven't got around to that yet.

http://www.youtube.com/watch_popup?v=OT76CpuqoT0&vq=hd1080

Shader Editor

Posted: July 29, 2011 in General, Real-Time Rendering

This is a shader editor application I wrote that lets you write and then test your shaders in the same editor. You write the shader in HLSL on the right and see it rendered onto a mesh on the left. The editor has a bunch of default meshes such as cubes, spheres and planes, but also allows you to import your own using my 3dsMax exporter.

I've written my effect library to compile shaders for multiple vertex formats, so that if you have a shader you want to use on different formats, you can use preprocessor techniques to break the HLSL code into a series of #if/#else blocks, one for each format your shader supports. The shader is then compiled once per format. When a mesh is loaded and an effect is to be applied, its vertex format is compared against the compiled versions of the effect, and if a match is found (i.e. the shader was compiled for that particular vertex format), that specialised shader is applied.

The editor is pretty handy, as it also reports back any compilation errors from fxc.exe in a text window at the bottom of the editor, and includes an assembler view that lets you see the code in assembly. Textures are supported by simply specifying the location of the file within the texture definition using annotations; this is also made easier by using the snippet tool for common code.

Once the effect compiles and produces the correct results, the editor allows you to make a material from it. A material within my engine defines an effect to be applied and the effect parameters (such as textures, samplers and material values), which are either created in code or loaded/saved as an XML file. The editor creates a material file by parsing the effect file for values that the user can set: textures/samplers and any data within “cbMaterial”, which is the material constant buffer. Again, using annotations within the shader, the author can specify how the user sees this data within the editor. So a “float4 DiffuseColour” can be given the annotation “GUI_COLOUR”, which presents the user with a colour selection tool, and a “float SpecularPower” can be given the annotation “GUI_SLIDER MIN[0] MAX[100]”, which presents the user with a slider for setting floating-point values. This list of values is then presented to the user as GUI widgets, which they can use to set the values in a convenient way. As settings are changed, the results are again seen in the view window. Once the user is happy, they can export the material to an XML file. The effect is exported separately and is only referenced by the material.
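To give a feel for both ideas, here is a hedged sketch of what such an effect source might look like. The define names, file path and exact annotation placement are illustrative approximations of the system described above, not its actual syntax:

```hlsl
// One HLSL source, compiled once per supported vertex format; the engine
// would define one of these (e.g. VF_POS_NORM_UV_TAN) for each compile.
struct VS_IN
{
    float3 Pos    : POSITION;
    float3 Normal : NORMAL;
    float2 UV     : TEXCOORD0;
#if defined(VF_POS_NORM_UV_TAN)
    float3 Tangent : TANGENT;   // only present in the tangent-bearing format
#endif
};

// Texture location supplied through an annotation, as described above
// (the path here is a placeholder).
Texture2D DiffuseMap
<
    string File = "textures/brick_diffuse.dds";
>;

// Values in cbMaterial become editable material parameters; the annotation
// tells the editor which GUI widget to present for each one.
cbuffer cbMaterial
{
    float4 DiffuseColour < string GUI = "GUI_COLOUR"; >;
    float  SpecularPower < string GUI = "GUI_SLIDER MIN[0] MAX[100]"; >;
};
```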

Ocean Shader

Posted: December 1, 2010 in Real-Time Rendering

This is a real-time ocean shader I've been working on. The bumpiness is achieved through both vertex displacement and normal mapping, where the vertices are displaced vertically on a grid that is projected in view-frustum space. This way, no buffers need to be updated; the grid is simply projected in the vertex shader so that all the vertices land where the camera is looking and none are wasted.
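Here is a minimal sketch of the projected-grid idea, assuming a flat y = 0 ocean plane and a stand-in wave function (the real shader's displacement and normal mapping are more involved):

```hlsl
cbuffer cbOcean : register(b0)
{
    float4x4 ViewProj;
    float4x4 InvViewProj;   // unprojects the screen-space grid into the world
    float    Time;
    float3   Pad;
};

struct VS_IN  { float2 GridPos : POSITION; };   // grid vertex in NDC space, [-1,1]
struct VS_OUT { float4 Pos : SV_POSITION; float3 WorldPos : TEXCOORD0; };

// Intersect the view ray through an NDC grid point with the y = 0 ocean plane.
float3 UnprojectToOceanPlane(float2 ndc)
{
    float4 nearP = mul(float4(ndc, 0.0f, 1.0f), InvViewProj);
    float4 farP  = mul(float4(ndc, 1.0f, 1.0f), InvViewProj);
    nearP /= nearP.w;
    farP  /= farP.w;
    float3 dir = farP.xyz - nearP.xyz;
    float  t   = -nearP.y / dir.y;   // assumes the ray actually hits the plane
    return nearP.xyz + dir * t;
}

VS_OUT OceanVS(VS_IN input)
{
    VS_OUT o;
    float3 worldPos = UnprojectToOceanPlane(input.GridPos);

    // Placeholder wave displacement; a real version sums several waves and
    // adds normal mapping in the pixel shader for the fine bumpiness.
    worldPos.y += sin(worldPos.x * 0.1f + Time) * cos(worldPos.z * 0.08f + Time) * 0.5f;

    o.WorldPos = worldPos;
    o.Pos = mul(float4(worldPos, 1.0f), ViewProj);
    return o;
}
```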

Model Viewer

Posted: October 29, 2010 in General, Real-Time Rendering

This is a model viewer I made to visualise meshes and their materials exported using my 3dsMax plugin.

Radiosity

Posted: October 29, 2010 in Real-Time Rendering

This is an approximate radiosity effect I was trying out. Approximate in that it renders from each vertex's point of view rather than each patch's. It also doesn't render the half-sized bits, which seems to cause artifacts in the corners; not a bad result though. The app runs in real time (125 fps in debug mode) and works by first rendering the view from each vertex to a render target, then using that target in a second stage to light the vertex itself, and this process continues for further bounces. The viewport rendered from each vertex's point of view is actually very small (only 32×32 pixels in the images below, which you'll need to zoom in quite far to see), and I use a single 1024×1024 render target to store them all in a grid. I also included some debug functionality that lets you view the scene from each vertex (shown in the last few images), which was handy. The scene used here is very low poly; you can see the wireframe in the last few images.

This first image shows the end effect.

This second image shows the views from the vertices from one iteration.

These final images show the view from several vertices, which made debugging easier.
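Below is a minimal sketch of the gather step described above: one output texel per vertex, averaging that vertex's 32×32 tile in the 1024×1024 atlas. The names are my own, and it ignores the cosine weighting a proper hemicube would apply:

```hlsl
Texture2D HemisphereAtlas : register(t0);   // 1024x1024 atlas of 32x32 per-vertex views
SamplerState PointClamp : register(s0);

cbuffer cbGather : register(b0)
{
    float2 AtlasSize;   // (1024, 1024)
    float2 TileSize;    // (32, 32)
};

// Rendered to a small target with one texel per vertex: average that
// vertex's tile to get the light arriving at the vertex for this bounce.
float4 GatherPS(float4 pos : SV_POSITION, float2 uv : TEXCOORD0) : SV_TARGET
{
    // Which tile this vertex owns (32x32 tiles arranged in a 32x32 grid).
    float2 tilesPerRow = AtlasSize / TileSize;
    float2 tileOrigin  = floor(uv * tilesPerRow) * TileSize;

    float4 sum = 0;
    [loop] for (int y = 0; y < 32; ++y)
    {
        [loop] for (int x = 0; x < 32; ++x)
        {
            float2 texel = (tileOrigin + float2(x, y) + 0.5f) / AtlasSize;
            sum += HemisphereAtlas.SampleLevel(PointClamp, texel, 0);
        }
    }
    return sum / (32.0f * 32.0f);   // unweighted average; a hemicube would weight samples
}
```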

Atmospheric scattering technique shown by Sean O’Neil in his GPU Gems 2 article.