Saturday, June 5, 2010
Priest character image behind the scenes...

Priest character modeled for the Demigod PC game. Character design by Steve Snoey. The render was created using the in-game model, rendered with displacement maps created in ZBrush.
I designed the character pipeline for this game, and while I am biased, I had fun and enjoyed the process.
The gist is to create the character models with good topology and silhouette/volume detail for ZBrush. This makes for models and textures that can easily be reduced as needed for in-game use, but that are also high fidelity enough for cutscenes and marketing; typically a separate set of models and textures would be outsourced for cinematics and marketing.
Here are some pics showing the different assets that went into making the finished character:
Creating a Demigod
Creating a Demigod: Vampire Lord Erebus
character pipeline document by Matt Dudley
(note: this document was translated and published in Russian in PC Games RU magazine)
The Demigod characters are at the core of the game. The characters all had initial designs created during preproduction.
Common themes established during preproduction include: future fantasy (mechanical technology infused with magic, etc.), an epic scale for environments and characters, and environments as ancient arenas passed down between civilizations that came and went.
Concept/Visual Design:
The Vampire Lord Erebus design is centered on the idea of a custom suit that recycles and circulates blood to his body.
Image#1 Early concept art by Nate Simpson:
Earlier designs of the Vampire Lord had the character’s proportions much more like a wrestler or football player. In order to add more variety to the look of each character it was decided to exaggerate the proportions toward a thinner silhouette.
The existing design was to be left intact but the proportions were stretched out and exaggerated. For example, his arms and fingers were elongated to give him a more creepy and sinister appearance, and his features were softened to give him a more androgynous/mysterious vampire look.
Image#2 Orthogonal drawings with revised proportions by Steve Snoey:
Action sketches were done to explore the character’s attitude.
Some show him cackling in the heat of battle as he uses his powers, etc.
Image#3 Action sketches by Steve Snoey:
As the design evolved it was decided that the Vampire Lord should still maintain regality and elegance as a gentleman of substance, even in battle.
The design goal for these changes was for players to be able to tell the difference between the Vampire Lord and Regulus (another hero of similar stature and height) at a glance just from their silhouettes.
Modeling:
The character models for Demigod were planned from the beginning to be highly detailed so as to look cool in game and be useful in marketing pieces and pre-rendered movies.
After concept art is approved, orthogonal drawings are made as reference to model from. (see Image#2 orthogonal drawings above.)
This is an important step as it helps to reinforce the design of the silhouette. This also provides another pass at refining and resolving details that may work in the original concept but are problematic in 3d.
Models were created in stages/passes. 1st pass is very rough with a focus on volume, silhouette and proportions.
Image#4 1st pass rough model, based on an early version of the concept:
Image#5 Rough geometry with revised proportions:
After the first pass model is complete it is sent to animation for rough rigging and animating.
This is more efficient as animation can begin and the animators can give informed feedback towards the final model.
Animations are focused around having strong easy to read silhouettes for each given movement. With this approach it is easy to block in animations around key poses or moments of a given action. By focusing the animations and the model around the concept of the silhouette it is easier to imagine how to improve both as the work progresses.
The rough character is exported to the game and any changes to proportion for animation or readability in game can be decided on before the final model is completed. (See Image#5 above, this model was sent to animation for export to the game.)
Image#6 Work in progress toward the final model:
For the final geometry there was an emphasis on building enough detail so as to make the model easier to work with in Zbrush.
For example, a focus on quad polygons and topology flow that is less efficient for display in the game, but much more efficient to work with in Zbrush.
Because the Demigod engine has great support for level-of-detail swapping of models, the high-detail/high-quality approach is possible. (See Image#6 above; the face has more detail than appropriate for in game.)
Once the high detail model is complete it can be used in many ways, while a more efficient level of detail model is generated quickly using poly reduction tools in Softimage XSI for better performance in game.
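The level-of-detail swap described above comes down to picking a mesh by camera distance. A minimal sketch of the idea (the switch distances and LOD count here are hypothetical, not Demigod's actual values):

```python
def select_lod(distance, thresholds):
    """Pick an LOD index from camera distance.

    thresholds is an ascending list of switch distances; index 0 is the
    highest-detail model, each later index a further-reduced version.
    """
    for i, limit in enumerate(thresholds):
        if distance < limit:
            return i
    return len(thresholds)

# Hypothetical setup: full detail under 10 units, a reduced mesh
# under 30, and the cheapest mesh beyond that.
print(select_lod(5, [10, 30]))   # 0
print(select_lod(15, [10, 30]))  # 1
print(select_lod(50, [10, 30]))  # 2
```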
Image#7 Final model:
After the final geometry is complete, UV's are created. A standalone program called UVlayout is used for this. Even though it is another piece of software, its efficiency is worth the effort up front.
Image#8 Final UV’s created with UVlayout:
UV’s are created to fit one texture per model. This is more efficient for rendering in the Demigod engine. UV pieces/shells are organized to be easier to work with when texturing and editing the normal maps generated by Zbrush.
(see Image#8 UV shells are oriented vertically. Limbs and other symmetrical elements are mirrored in cases where the UV seams will be less obvious. This is easier to work with when editing a normal map.)
Once UV’s are completed the model is exported to Zbrush for sculpting details and generating the normal map.
Image#9 Work in progress in Zbrush:
Texturing:
The normal map is the first texture created. It serves as the basis for the rest of the textures applied to the model.
Image#10 Normal map generated from Zbrush:
After the normal map is generated, the diffuse color map work begins.
The diffuse color map is created in photoshop.
First step: an ambient occlusion texture is generated. (There are many different tools for this, for Demigod a standalone program called FAOgen is used for speed and ease of use.)
Image#11 Ambient Occlusion texture generated from FAOgen:
The ambient occlusion texture is useful as a starting point but it only shows low frequency detail.
For finer high frequency detail a useful trick is to copy the blue channel from the normal map texture in Photoshop and apply it as a multiply layer for referencing finer details when painting the diffuse color map.
Image#12 blue channel of normal map:
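The multiply-layer trick above works because Photoshop's multiply blend scales each base value by the overlay value, so the normal map's blue channel darkens the diffuse colors wherever there is surface detail. A numeric sketch using flat lists of 8-bit grayscale values (not real image data):

```python
def multiply_blend(base, overlay):
    """Photoshop-style multiply blend for 8-bit grayscale values:
    result = base * overlay / 255, so white (255) in the overlay leaves
    the base untouched and darker overlay values darken it."""
    return [(b * o) // 255 for b, o in zip(base, overlay)]

# A white overlay pixel keeps the base color; a mid-gray pixel halves it.
print(multiply_blend([200, 200], [255, 128]))  # [200, 100]
```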
A typical diffuse color map has relatively simple colors that are each saved on separate layers in Photoshop. (This makes it easier to refine and change the colors as needed.)
In many areas the colors are applied very quickly and the ambient occlusion and blue channel layers are used as unifying elements.
As the normal map is the main source of texture detail, better results come from a simpler diffuse color map that reinforces details found in the normal map.
Image#13 Final Diffuse color map:
The final texture created for a Demigod will be referred to as the SpecTeam texture.
This texture is composed of four individual grayscale textures stored in the channels of a 32-bit color texture (Red, Green, Blue, Alpha). Each channel is used as a mask for a specific rendering attribute: the Red channel contains a mask that controls the amount of reflectivity (environment map reflection); the Green channel is a mask that controls the amount of specularity (shiny highlights); the Blue channel masks the amount/intensity of glow applied at render time; and the Alpha channel masks the amount of player/team color displayed on the character.
Image#14 Mask textures stored in each channel of the SpecTeam texture:
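The channel-packing idea behind the SpecTeam texture can be sketched in a few lines. This is an illustrative stand-in using plain Python lists for the four grayscale masks, not the game's actual texture code:

```python
def pack_specteam(reflect, spec, glow, team):
    """Combine four grayscale masks (lists of 0-255 values) into one
    list of RGBA texels: R = reflectivity, G = specularity,
    B = glow, A = team color."""
    return list(zip(reflect, spec, glow, team))

# A 2-texel "texture": a fully reflective/shiny first texel,
# then a glowing, team-colored second texel.
texels = pack_specteam([255, 0], [255, 0], [0, 128], [0, 255])
print(texels)  # [(255, 255, 0, 0), (0, 0, 128, 255)]
```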
Shaders and Rendering in game:
Rendering Demigods involves using shaders. Shaders define how the various textures are used in order to render the character and other elements in game.
Image#17 WIP sequence of images showing how various textures are used when rendering a Demigod:
Image#18 Vampire Lord Erebus rendered in Demigod engine:
softparticles10_directx10_sample
Post Mortem
Direct3D 10 Sample: Soft Particles 10
Date: 6_13_06
(above: softparticle10 sample image - 3d/volumetric particle smoke).
Objectives:
Create an art presentation to help illustrate the Direct3D10 soft particle technique.
The SoftParticle10 sample was created to showcase depth tested/sorted particles in order to remove noticeable particle intersections (between particles and geometry).
Team members:
Art Director: Cyrus Kanga
Developer: Shanon Drone
Lead Artist: Matt Dudley
Duration of Project: 4 weeks
Technology used:
D3D10 reference rasterizer (software emulation)
Concept Process:
The team identified key areas that the sample would need to address:
The main requirement for the art was to create a compelling context for a subtle technical improvement to the way particles are drawn and interact in D3D10.
A basic amount of complexity was needed in terms of surfaces for the particles to intersect (or not) with. There needed to be a source for the particles to emanate from as well as surface(s) for the particles to pass through.
After some brainstorming a direction for the sample was set. A destroyed tank would be the setting for the sample. A blasted hole in the tank would be the source of the particle smoke, which in turn would pass through other parts of the destroyed tank as well as the ground plane.
The concept process started with the requirement of having a recognizable battle tank. For fun it was decided to design a tank that would be recognizable, but not taken from history or completely conventional.
Pre Production:
Pre production time was spent up front researching various tank designs and histories for interesting forms and details that would help make for an interesting scene. Thick smoke or steam was a focus for the sample, as open flame effects were to be avoided in order to keep things simple.
The focus on thick smoke led to research on early steam engines and boiler powered tractors etc. This led to a tractor derived tank design.
The details of the concept evolved and were refined during production.
(above: early color & value concept)
Production Process:
Requirements and Limitations:
The shader/material attributes and textures were limited to the following: diffuse color, specular mask and normal map.
Textures were created at 2k and 4k resolutions per map, with a plan to reduce resolution as needed.
Poly limit was capped at ~36k (tri) for the tank and scene geometry.
Tools and Techniques:
How the technique works:
The SoftParticles10 technique uses the Pixel Shader (PS) to do a depth test on each particle per frame, testing the distance between the particles and nearby surfaces (geometry). At a determined threshold/distance the particle is faded out to avoid drawing the intersection between the particle and the surface it is passing through.
(above: diagram illustrating the “soft” intersection result of the softparticle10 technique.)
(above: example images showing the difference between hard and soft intersections with 2d and 3d particles).
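The fade logic the diagram illustrates can be sketched outside the shader: the particle's opacity is scaled by how far it sits in front of the scene geometry behind it. Here fade_distance is an arbitrary tuning threshold, not a value taken from the sample:

```python
def soft_particle_fade(particle_depth, scene_depth, fade_distance):
    """Opacity multiplier in [0, 1]: as a particle approaches the scene
    geometry behind it, fade it out instead of hard-clipping."""
    diff = scene_depth - particle_depth  # how far the particle is in front of the surface
    return max(0.0, min(1.0, diff / fade_distance))

print(soft_particle_fade(5.0, 10.0, 2.0))   # 1.0  (well in front: fully opaque)
print(soft_particle_fade(9.0, 10.0, 2.0))   # 0.5  (halfway into the fade band)
print(soft_particle_fade(10.5, 10.0, 2.0))  # 0.0  (at/behind the surface: invisible)
```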
Smoke Particles:
Dev chose to show examples of both 2d and 3d particles for the sample. The 3d (volume) particles were created by Dev using Perlin noise as a basis for the volume effect. Smoke/particle behavior was planned in advance and created by Dev.
In order to reduce distraction between the 2d and 3d techniques, the 2d particle textures were created by rendering out frames of the 3d particles in motion. These frames were then saved to a volume texture (one larger texture containing sequential images) and applied to the 2d particles.
The end result was a clear distinction between the volume effect and the more traditional flat particles.
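The baked frame sequence works like a flip-book atlas: at playback time the 2d particle looks up the UV corner of the current frame within the larger texture. A minimal sketch (the grid layout is an illustrative assumption, not the sample's actual texture dimensions):

```python
def flipbook_uv(frame, frames_per_row, frame_size):
    """UV-space offset of frame N in a grid atlas of equal-sized frames,
    where frame_size is the fraction of the texture one frame covers."""
    col = frame % frames_per_row
    row = frame // frames_per_row
    return (col * frame_size, row * frame_size)

# In a 4x4 atlas each frame covers 0.25 of the texture;
# frame 5 sits one column in and one row down.
print(flipbook_uv(5, 4, 0.25))  # (0.25, 0.25)
```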
Tank Geometry:
Software used: Maya for geometry, Photoshop for textures.
The build process for the tank geometry was pretty straightforward. Most shapes started out as cylinder primitives, with beveled edges and other details added after the overall silhouette was complete.
UV’s:
In an effort to get the most texture detail, several revisions to the UV coordinates were needed.
UV's were created in Maya using planar projections based on camera view. Projections were stitched together, leaving seams where they were least likely to be seen.
Scene Geometry:
The landscape that the tank sits on started out as a poly plane primitive in Maya. The plane was subdivided in the area closest to the tank in order to add details like pushed sand and tracks in the ground.
The barricades were added last to add more interest to the scene and composition.
Textures:
With geometry complete, the next step was to create final textures.
(above: example shot of scene in maya; untextured and flat shaded vs. fully textured, lit and smooth shaded).
Early in production a plan formed around using ambient occlusion (AO) to help tie together the lighting and texture detail within the scene.
The main advantage AO brought to this project was the consistency across all the different elements being textured. The most common disadvantage to using AO is that in order to generate usable results it requires non overlapping UV’s. Since non overlapping UV’s were already planned to make use of a normal map for the tank there was really no drawback to using AO as well.
(ambient occlusion pass)
(diffuse color, specular mask, normal map for tank)
(diffuse color map for ground plane).
(skybox texture).
The specular mask consists of the unedited AO texture stored in the alpha channel of diffuse maps.
The normal maps were generated using NVIDIA's normal map generator plugin for Photoshop. The rivet details were taken from photo reference, while the welded edges, dents and scratches were painted using a custom brush in Photoshop.
The diffuse ground plane texture was created by layering images of desert sand on top of rock. The tank and ground plane were rendered together to provide an AO texture that was the basis for the ground texture detail.
After texturing all of the elements it was clear that the ground plane detail was detracting from the tank and the focus of the scene. The simplest solution turned out to be to blur the edges of the ground texture in Photoshop; this helped refocus the detail within the scene.
Summary of key technological requirements and concepts:
This soft particle technique is especially useful for effects that will be intersecting geometry in an obvious way.
The biggest improvements can be found when using the technique with 2d particles.
For this sample all particle effects were created procedurally by the Dev. The 3d particles are based on Perlin noise, and in turn the 2d particles use textures that were rendered from the same 3d noise.
instancing10_directx10_sample
Post Mortem
Direct3D 10 Sample: Instancing 10
Date: 3_30_06
Objectives:
Enhance art and presentation of existing Direct3D10 sample.
The Instancing10 sample was created to show off Geometry Shader (GS) based instancing and how the GS can be used to create a complex scene with very few draw calls.
Team members:
Art Director: Cyrus Kanga
Developer: Shanon Drone
Lead Artist: Matt Dudley
Duration of Project: 3 weeks
Technology used:
D3D10 reference rasterizer (software emulation)
Concept Process:
The existing sample was a scene of grass and trees, growing out of islands suspended high in the air, conceived by the Dev.
The team reviewed the existing sample and decided early on that it was effective but would benefit from a better art presentation.
The team discussed and categorized what worked and what didn’t during the review. With notes in hand, art set out to concept and refine a direction.
The focus of the sample was the instancing of blades of grass and individual leaves on the tree. The concept process focused on the tree and island because they served as the foundation for the leaves and grass.
Pre Production:
Pre production time was spent up front researching compelling tree forms as well as rock formations for the island.
Classic bonsai trees were an early inspiration during pre production. Research into the twisted forms of exotic bonsai led to research on ancient trees and natural twisted trunk growth.
Desert rock formations were the inspiration for the island. Researching rock and erosion anomalies provided a wealth of compelling reference to use.
Thumbnail composition sketches were created and refined, leading to concept sketches of specific tree and island forms.
The details of the concept evolved and were refined during production.
(above: early color & value concept)
Production Process:
Requirements and Limitations:
It was a requirement of the sample to scale UV's in order to tile/tighten texture detail.
In order to avoid over complicating the sample, all geometry needed to use the same shader/attributes. Each object drawn was limited to one texture/pass.
Poly limit was capped at ~150k for the source tree and island geometry together.
Tools and Techniques:
The existing sample used a procedurally generated tree created by Dev as the source for the instanced trees. After some testing it was decided that designing a new tree from scratch would provide the best result for the sample.
Tree Geometry:
In order to get the most out of the tree geometry the trunk, branch placement, size etc. were adjusted to create a unique silhouette from as many angles as possible.
Software used: Maya and Zbrush for geometry, Photoshop for textures.
To create the first pass of the tree geometry, Zspheres were used to create a simple volume of trunk and branches. Zspheres were chosen because of their flexible nature, allowing the placement of branches and general form to change while still providing relatively clean and evenly spaced geometry. The lowest subdivision level was exported from Zbrush as .obj for refinement and UV projection in Maya.
The exported mesh from Zbrush was brought into Maya to add detail and improve the even spacing of vertices.
UV’s:
UV’s were created in maya using planar projections based on camera view. Projections were stitched together, leaving seams where they were least likely to be seen.
Based on the tree reference already gathered, the trunk of the tree received the least UV surface area, while the smallest branches received the most. This allowed the texture detail to shrink in a natural looking way out towards the tips of the branches.
A first pass texture was used to help determine the flow of the UV’s from the trunk to the branches.
After UV’s were projected the model was exported back to Zbrush for added detail.
The smaller branches did not need to be subdivided or detailed, as they were going to be covered up with leaves anyway. The smallest branches were separated from the main tree to avoid subdividing them; after subdividing the tree, the branches were combined back into the same mesh shape/group.
Zbrush was then used to add surface details like knots and folds/wrinkles and bulbous growth etc.
Once the general surface was complete another pass was taken to improve the proportions of the tree using a large brush with the move command. After the proportions were finalized one last pass was painted over the areas most affected by the move. The tree geometry was exported back to Maya where UV’s were adjusted and finalized.
There was a plan to create several variations of trees, all modified versions of the Zsphere original, but in the end it was dropped for a couple of reasons:
A low level priority to begin with, as additional unique geometry dilutes the point of the sample (instancing).
It turned out to be more work than anticipated to transfer projected UV's between Zsphere-created meshes, as Zspheres procedurally generate/change vertex count and order.
Island Geometry:
The island geometry was created in Maya. The form was arrived at by creating a sphere primitive and reshaping it using Maya’s sculpt polygon tool (Artisan tool). UV projection consisted of cylindrical for the sides and bottom and a planar projection for the top where the grass and tree would sit. Afterwards the island was exported to Zbrush for added detail and exported back to Maya for UV adjustment.
Textures:
With geometry of the tree and island complete, the next step was to create final textures.
The tree texture was created from two different tree bark images that were color adjusted and edited to tile seamlessly. The two images were used to create a look of having areas of bark stripped away overtime. The goal being that when tiled, the irregular pattern would help to make the tree look interesting from multiple angles.
The island textures were based on one image of a desert rock formation and made to tile seamlessly. The island top grass was created using Maya paint effects in “canvas” mode, then composited in Photoshop.
Textures for the individual grass blades were created by combining results from Maya paint effects with digital photos of grass blades. The major benefit of using paint effects for this was that paint effects create a natural looking randomized result, with a perfectly matching alpha.
For simplicity's sake all 3 grass types as well as the flower type use identical quad primitives. So in order to get more natural looking height variation, the grass images were reduced by different amounts within the texture area.
Dev used hardware multisampling functions originally meant for anti-aliasing in order to draw the leaves and grass without having to do depth sorting on every instance (a significant performance savings).
In order to use the new technique there were two technical requirements for the alpha channels.
1. Alphas needed a highly blurred, non-aliased edge.
2. Alphas needed to be opaque in the interior of the alpha image. (The multisampling tech only deals with edge transitions, so alpha detail in the interior will cause artifacts.)
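Requirement 2 can be enforced in a simple cleanup pass: push any near-opaque interior values to full opacity so that only the blurred edge band carries partial alpha. The threshold here is an arbitrary example value, not one from the sample:

```python
def solidify_interior(alpha, threshold=200):
    """Force near-opaque 8-bit alpha values to fully opaque, leaving the
    soft gradient only at edges (values below the threshold)."""
    return [255 if a >= threshold else a for a in alpha]

# The edge ramp stays soft; interior values snap to opaque.
print(solidify_interior([0, 64, 210, 255]))  # [0, 64, 255, 255]
```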
The flower texture was painted from scratch in Photoshop, based on photo reference. The matching alpha was created by painting the flowers into an empty layer and then selecting and pasting selection into the alpha channel.
The background consisted of a panoramic sky image taken from a stock library and edited to add more color and contrast to key areas. Final colors were arrived at by choosing the favorite among a host of variations of color themes:
The sky was projected onto a sphere primitive that was then scaled non-uniformly along its Y axis. This compensated for the majority of the stretching that occurred due to the differences between the panorama and sphere dimensions.
Instanced leaf placement:
The placement of leaves was based on specific vertices of specific branches. The plan was to use Maya’s quick select sets to save a selection of points and then extract the positional info from the saved Maya ascii (.ma) file. This was tested and it was discovered that quick selection sets and their vertex positional data is not displayed in an obvious manner in the .ma file.
The solution turned out to be just exporting the selected faces of smaller branches as .obj; because only the selected faces were stored in the file it was much easier to find and use the positional data that was needed.
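Pulling positions out of the exported .obj is easy because Wavefront .obj is plain text, with each vertex on its own "v x y z" line. A minimal parser (the sample data below is hypothetical, not actual branch geometry):

```python
def parse_obj_vertices(obj_text):
    """Extract (x, y, z) positions from the 'v' lines of Wavefront .obj text."""
    verts = []
    for line in obj_text.splitlines():
        parts = line.split()
        if parts and parts[0] == "v":
            verts.append(tuple(float(p) for p in parts[1:4]))
    return verts

# Two vertices from a hypothetical selected-face branch export.
sample = "v 0.1 2.5 -0.3\nv 0.2 2.6 -0.4\nf 1 2 1"
print(parse_obj_vertices(sample))  # [(0.1, 2.5, -0.3), (0.2, 2.6, -0.4)]
```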
Instanced grass placement:
The placement of grass in the original sample was uniform and determined by the number of faces on top of the island. The team discussed different ways of controlling grass placement in a more artistic way. The first approach was to use vertex color data as a reference of where to distribute different grass types and densities.
Painting the vertex colors was easy enough, but .obj was not storing the data. Dev felt that it was easier to use an RGBA texture as a guide, using the four channels of the texture to describe where each grass type would be placed. The guide texture itself would not be rendered.
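Sampling the guide texture amounts to choosing, per texel, the grass type whose channel mask is strongest. A sketch of the lookup (the grass type names are hypothetical placeholders for the sample's actual four types):

```python
def grass_type_at(texel):
    """Pick the grass type whose channel mask is strongest at this
    RGBA texel; return None where all channels are empty (no grass)."""
    types = ["short", "tall", "dry", "flower"]  # hypothetical names per R, G, B, A
    best = max(range(4), key=lambda i: texel[i])
    return types[best] if texel[best] > 0 else None

print(grass_type_at((0, 200, 10, 0)))  # tall
print(grass_type_at((0, 0, 0, 0)))     # None
```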
The density was still determined per face from the island geometry. Because of the relatively high amount of subdivision on the island the low end of grass distribution was still very high. The solution was to poly reduce the top of the island in an irregular way so that grass density would look more natural and less uniform.
Instanced island placement:
Island placement was randomly generated by Dev.
Summary of key technological requirements and concepts:
GS quad primitives were used for all grass, flower and leaf objects.
Hardware multisampling technique was used to avoid depth sorting of instanced primitives. (requiring specifically blurred alpha edges.)