Zygote* is a first-person action/platforming video game. Bio-engineering has fundamentally advanced, and scientists now routinely integrate nano-technologies into human gametes. Previously arrested for treason, you're held captive inside a simulated labyrinth and must utilise an ESS implant to escape the virtual prison. But the preceding years of class warfare have borne an idealistic chancellor, intent on eradicating the unmodified populace and any sympathizers.
* Zygote is the project's working title and is expected to change at release. You'll see alternate trial names as we test and release prototypes and demos.
The game's main mechanics focus on 3d platforming, vital-sign management, and the use of tactical sensors and player boosts via the player's Electronic Surveillance System (ESS).
This post takes a look at the process I used to create a 3d Soldier. The game is in development and I'm very excited to see these sorts of details coming together.
Check out the IndieDB page for more information, and download an alpha version from a few months ago in all its 'placeholder' glory.
The Starting Point
I use Blender for 3d modelling, GIMP for image editing, Unity3d for the game engine and MakeHuman to build humanoid templates. I've been using various versions of each of these tools for many years, so I'm pretty familiar with them.
If you're a hardcore modeller, you might start your 3d character model from scratch, subdividing, loop-cutting, extruding and sculpting basic shapes into something special. But I tend to start my humanoids from a baseline template model, mainly due to my lack of time and resources. In this case, I used the MakeHuman app to generate a basic skinned character that can be imported into Blender.
After initially importing the model into Blender, I cleaned up the basic model and created a ‘template’ for all the humanoids in the game. I removed shape keys and tidied up the armature until I was satisfied with the basic setup, and then saved the template 3d model in Blender.
Working between Blender and Unity3d couldn’t be any easier, since Unity has an excellent Blender importer that automatically reimports the model as soon as it detects an updated file.
Creating the Low Polygon Mesh
I cut the model in half, threw the left half away and used a Mirror Modifier to recreate that side of the character. This meant I only had to work on half the model while keeping both the mesh and armature symmetrical. I also added an Edge Split modifier so that hard edges render sharply, and set the object to Smooth Shading for visualisation.
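The core idea behind the Mirror Modifier can be sketched in plain Python (this is just the concept, not Blender's actual implementation): every vertex on the kept half gets a duplicate with its X coordinate negated, while vertices sitting exactly on the mirror plane are merged rather than duplicated.

```python
def mirror_x(vertices):
    """Duplicate each vertex across the X=0 plane, as a Mirror
    Modifier does along its default axis."""
    mirrored = [(-x, y, z) for (x, y, z) in vertices]
    # Vertices lying exactly on the mirror plane (x == 0) are
    # merged with their mirror image, so don't duplicate them.
    return vertices + [v for v in mirrored if v[0] != 0]

# Half a character: two vertices on the right side, one on the centre seam.
half = [(1.0, 2.0, 0.5), (0.5, 1.0, 0.0), (0.0, 1.5, 0.2)]
full = mirror_x(half)  # 5 vertices: the seam vertex is not doubled
```

This is also why editing only one half stays consistent: the mirrored side is derived, never edited directly.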
Then I edit the model, by extruding, masking and moving vertices, and adding/removing edge loops as required. MakeHuman does a pretty great job of setting up the mesh topology, but sometimes I’ll have to do a bit of retopology to get things looking okay after my editing. In the end I test the model in Blender by entering Pose Mode and moving the armature (which I normally keep on a separate layer to avoid cluttering the view). I can visualise the joints and see how the model looks in different positions.
Sculpting a High Polygon Mesh
Next I add the Multiresolution Modifier to the Mesh, subdivide the mesh a few times from the modifier menu (in this case 4 added enough detail for my needs) and start Sculpting.
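It's worth knowing why a handful of subdivision levels is usually plenty: each Catmull-Clark subdivision step splits every quad into four, so face counts grow geometrically. A quick sketch of the arithmetic, assuming an all-quad base mesh (the counts are illustrative, not from the actual Soldier model):

```python
def faces_after_subdivision(base_quads, levels):
    """Each subdivision level splits every quad into four,
    so counts grow by a factor of 4 per level."""
    return base_quads * 4 ** levels

# A hypothetical 2,000-quad base mesh at multires level 4:
print(faces_after_subdivision(2000, 4))  # 512000 quads
```

Level 4 already gives half a million faces to sculpt on, which explains why going much higher quickly becomes unworkable.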
The high detail mesh will eventually be used to bake the normal map, and as a reference when painting the base texture.
Painting the Basic Diffuse Texture
Blender allows you to paint directly onto the 3d model in its Texture Paint mode. Be sure to have the Textured Solid option enabled so that textures show on the model. I'll paint the textures using both the low and high detail meshes. Texture painting is pretty cool: you can paint solid colors, gradients, procedural textures created in Blender, or imported textures directly onto the model.
Before working with textures in Blender, the model needs to be UV unwrapped. Sometimes you can get away with using Blender’s Smart UV Unwrap, but I manually added seams to the Low Poly model to control the UV unwrap and resulting islands created in the output textures.
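To see what unwrapping does at its most basic, here's a toy planar projection: flatten one axis and normalise the remaining two into the [0, 1] UV square. Real unwrapping (and Blender's seam-based unwrap in particular) is far more sophisticated, cutting the mesh along seams to minimise stretch; this sketch only illustrates the mapping from 3d positions to UV coordinates.

```python
def planar_unwrap(vertices):
    """Project vertices onto the XZ plane and normalise to [0,1] UVs.
    A toy sketch; real unwraps cut along seams to minimise stretch."""
    xs = [v[0] for v in vertices]
    zs = [v[2] for v in vertices]
    min_x, max_x = min(xs), max(xs)
    min_z, max_z = min(zs), max(zs)
    return [((x - min_x) / (max_x - min_x),
             (z - min_z) / (max_z - min_z)) for (x, _, z) in vertices]

uvs = planar_unwrap([(0, 5, 0), (2, 5, 1), (4, 5, 2)])
# extreme corners map to (0.0, 0.0) and (1.0, 1.0)
```

Manually placed seams control where the mesh is cut, which in turn controls how the resulting islands are laid out in the texture.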
Baking a Normal Map
Blender has all the facilities to bake normal maps directly in the editor. First, I created a blank texture in Blender and assigned it to the mesh UV vertices in Edit mode. Second, I changed to Object mode and set the Preview option of the Multires Modifier to 0 to show the base mesh. Third, I went to the Render panel, scrolled to the Bake section, selected the Normal Map output and enabled the ‘Bake from Multires’ option. Finally, I hit the Bake button and watched as the normal map was generated.
This results in the high poly normals being baked onto the low poly mesh. I save the texture as a new Unity image asset and later assign it as a normal map in the model's Unity material (I'm using Unity's new Standard shader in the game engine).
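A baked normal map is just per-texel surface directions stored as colours. The standard tangent-space encoding maps each component of the unit normal from [-1, 1] into an 8-bit channel, which is why mostly-flat areas come out that characteristic lavender-blue. A minimal sketch of the encoding:

```python
def encode_normal(n):
    """Map a unit normal (x, y, z), each component in [-1, 1],
    to an 8-bit RGB texel: c = round(255 * (c * 0.5 + 0.5))."""
    return tuple(int(round(255 * (c * 0.5 + 0.5))) for c in n)

# A normal pointing straight out of the surface:
print(encode_normal((0.0, 0.0, 1.0)))  # (128, 128, 255)
```

At render time the shader decodes these texels back into normals and uses them for lighting, so the low poly mesh picks up the sculpted detail without the extra geometry.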
Baking the Ambient Occlusion
While I was in the baking mood, I created and assigned a new texture to the model. Then I changed the Bake output option to Ambient Occlusion and hit the Bake button again. The output is a greyscale image that I can use in GIMP to modify my basic texture and give it some depth/details.
Combining the AO and Base Texture
I opened GIMP and imported the base texture and the AO texture as separate layers. By overlaying the AO on the base texture and setting its blend mode to Multiply, the result is a final texture with new contrast and details compared to the original basic texture. NB: Unity's shader supports an AO channel, so I might use that with future modelling.
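Multiply mode combines the layers per channel as base × ao / 255, so white AO texels leave the base untouched while darker texels (crevices, occluded areas) darken it. A pure-Python sketch of the blend for a single pixel:

```python
def multiply_blend(base_px, ao_px):
    """Multiply blend per channel: out = base * ao // 255.
    The AO map is greyscale, so one value applies to R, G and B."""
    return tuple(b * ao_px // 255 for b in base_px)

print(multiply_blend((200, 150, 100), 255))  # white AO: (200, 150, 100)
print(multiply_blend((200, 150, 100), 128))  # darkened:  (100, 75, 50)
```

This is exactly why the combined texture reads as having more depth: the AO bake darkens the texels that receive less ambient light.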
Creating an Emissive Texture
Unity's Standard Shader has channels for diffuse textures, normal maps and emissive maps (among other things). To create the emissive texture, I opened GIMP and loaded the original base texture. Then I overlaid an empty layer, filled it completely black, and left it at about 50% opacity so I could see through to the underlying texture.
Painting the emissive layer in GIMP is simply a matter of adding colour where I want emissive light in the final Unity material.
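The reason black is the right background: a shader adds the emissive map on top of the lit diffuse result, so black texels contribute nothing while coloured texels glow even with no lighting. A heavily simplified sketch of that behaviour (real PBR shading, including Unity's Standard shader, is far more involved):

```python
def shade(diffuse, light, emissive):
    """Simplified shading: lit colour = diffuse * light + emissive,
    clamped to the 8-bit range. Illustrative only."""
    return tuple(min(255, int(d * light) + e)
                 for d, e in zip(diffuse, emissive))

# With no light at all, only the emissive texels remain visible:
print(shade((180, 60, 60), 0.0, (0, 200, 255)))  # (0, 200, 255)
```

So wherever I paint colour on the emissive layer, the final material will appear to emit that light.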
I saved the emissive layer to its own image and placed it with the other Unity assets for the 3d model.
Testing in Unity 3d
Finally, I had all the textures and mesh/armature assets ready for Unity. After import, Mecanim provides the ability to test and fine-tune the skinned mesh. Normally, if things have been set up correctly in Blender, Unity will automap the bones and the mesh is ready for animating in the engine.
To assign the textures to the model, Unity uses Materials. I created a couple of materials for the model (one for the body/skin and another for the armour/clothes). I then assigned the AO-modified base texture, the normal map and the emissive map to the Unity material and tinkered with the parameters to achieve the right look.
Here’s the 3d model in the Unity3d engine under a variety of lighting conditions.
Thanks for reading,