"Zygote: The Rebel Uprising" development build (alpha v0.1.3) has been released, and we're requesting feedback from gamedevs and the community.
Zygote is a first-person action/adventure video game. Set in the near future, when bio-engineering has fundamentally advanced, scientists now routinely integrate nano-technologies into human gametes. But the preceding years of class warfare have borne an idealistic chancellor, intent on eradicating the unmodified populace. The ruling senate has stayed his genocidal plans, but will soon capitulate. Only one person now stands in their way.
The protagonist, previously deported for treason against the senate, arrives in an unknown province to discover an emerging insurrection. A small group of unmodified, calling themselves the Kinship Resistance, have prepared plans to escape the province, return home and overthrow the chancellor. The protagonist joins the rebel alliance to master vital skills that will shape their combined destiny.
Zygote’s unique adventure is interleaved between its hand-crafted hub-world and its many procedurally-assembled platforming levels. The primary story develops as the protagonist interacts with characters, completes objectives and overcomes obstacles to recover the Angkor Orbs, unlock the Angkor Gate and ultimately return to their homeland.
Zygote’s primary game mechanics focus on 3d platforming, vital-sign management, and the use of tactical sensors and player buffs via the player’s Electronic Surveillance System (ESS).
The ESS monitors the player’s vital signs to provide an overall health status.
Every player action, activated buff, Patogen environment and Kinship challenge affects the player’s vital signs in real time. Blackouts or death are a threatening consequence of deteriorating vital signs.
The ESS also comes with a collection of active and passive sensors that monitor the player’s environment.
Using the ESS, the player can derive a tactical advantage or discover nearby points of interest.
This post takes a look at the process used to create the Kinship Soldiers for Zygote’s beta. The game is still in development and I’m very excited to see these sorts of details coming together.
Check out the IndieDB page for more information, and download an alpha version from a few months ago in all its 'placeholder' glory.
So a quick explanation of Zygote: The Rebel Uprising…
It’s a first-person action adventure game. The player arrives in the middle of an emerging war between the Kinship and the attacking Patogens. Joining the Kinship, the player helps recover the Angkor Orbs to unlock the Angkor Gate and ultimately return to the homeland. Along the journey the player discovers whom to trust, who’s being betrayed and why everyone’s isolated in a mysterious province.
These characters add story details and exposition, so it’s important that they deliver sufficient quality to present an engaging narrative and world.
The Starting Point
I use Blender3d for 3d modelling, GIMP for image editing, Unity3d for the game engine and MakeHuman to build humanoid templates. I’ve been using various versions of each of these tools for many years, so I’m pretty familiar with them - I like to call this my comfort zone.
If you’re a hardcore modeller, you might start your 3d character model from scratch, subdividing, loop-cutting, extruding and sculpting basic shapes into something special. But I tend to start my humanoids from a baseline template model, mainly due to my lack of time and resources. In this case, I used the MakeHuman app to generate a basic skinned character that could be imported into Blender3d.
After initially importing the model into Blender, I cleaned up the basic model and created a 'template' for all the humanoids in the game. I removed the shape keys and tidied up the armature until I was satisfied with the basic setup, then saved the template 3d model in Blender.
Working between Blender3d and Unity3d couldn’t be easier, since Unity has an excellent Blender importer that automatically reimports the model as soon as it detects an updated file.
Creating the Low Polygon Mesh
I cut the model in half, threw the left half away and used a Mirror Modifier to replace that side of the character. This meant I only had to work on half the model, and it kept both the mesh and the armature consistent. I also added an Edge Split modifier to see the edges, and set the object to Smooth Shading for visualisation.
Then I edited the model by extruding, masking and moving vertices, and adding or removing edge loops as required. MakeHuman does a pretty great job of setting up the mesh topology, but sometimes I had to do a bit of retopology to get things looking okay after my editing. In the end I tested the model in Blender by entering Pose Mode and moving the armature (which I normally keep on a separate layer to avoid cluttering the view). This let me visualise the joints and see how the model looks in different positions.
Sculpting a High Polygon Mesh
Next I added the Multiresolution Modifier to the mesh, subdivided it a few times from the modifier menu (in this case four levels added enough detail for my needs) and started sculpting.
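To get a feel for the polygon budget here: each Multiresolution subdivision step splits every quad into four, so the sculpting mesh grows by a factor of 4 per level. A minimal sketch of that growth (the 5,000-quad base count is a made-up figure for illustration, not the actual Kinship model):

```python
def multires_faces(base_faces: int, level: int) -> int:
    """Face count after `level` Multiresolution subdivisions:
    every quad is split into four at each step, so growth is 4**level."""
    return base_faces * 4 ** level

# Hypothetical 5,000-quad base mesh at subdivision levels 0-4
for level in range(5):
    print(level, multires_faces(5000, level))  # level 4 → 1,280,000 faces
```

Four levels is plenty for pore- and seam-scale detail, and going higher gets expensive fast.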
The high-detail mesh will eventually be used to bake the normal map, and as a reference when painting the base texture.
Painting the Basic Diffuse Texture
Blender allows you to paint directly onto the 3d model in its Texture Paint mode. Be sure to have the Textured Solid option enabled so that textures show on the model. I painted the textures using both the low and high detail meshes. Texture painting is pretty cool: you can paint solid colours, gradients, procedural textures created in Blender, or imported textures directly onto the model.
Before working with textures in Blender, the model needs to be UV unwrapped. Sometimes you can get away with Blender’s Smart UV Unwrap, but I manually added seams to the low-poly model to control the unwrap and the resulting islands in the output textures.
Baking a Normal Map
Blender has all the facilities to bake normal maps directly in the editor. First, I created a blank texture in Blender and assigned it to the mesh’s UV vertices in Edit Mode. Second, I changed to Object Mode and set the Preview option of the Multires Modifier to 0 to show the base mesh. Third, I went to the Render panel, scrolled to the Bake section, selected the Normal Map output and enabled the 'Bake from Multires' option. Finally, I hit the Bake button and watched as the normal map was generated.
This results in the high-poly normals being baked onto the low-poly mesh. I saved the texture as a new Unity image asset and later assigned it as a normal map in the model’s Unity material (I’m using Unity’s new Standard shader in the game engine).
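As a rough sketch of what the bake actually writes out: each texel of a tangent-space normal map stores a unit normal, remapped per component from the [-1, 1] range into 8-bit colour. This is plain illustrative Python, not Blender’s internals:

```python
import math

def pack_normal(nx: float, ny: float, nz: float) -> tuple:
    """Encode a tangent-space normal as 8-bit RGB: normalise the vector,
    then remap each component from [-1, 1] to [0, 255]."""
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    nx, ny, nz = nx / length, ny / length, nz / length
    return tuple(round((c * 0.5 + 0.5) * 255) for c in (nx, ny, nz))

# An undisturbed surface (normal pointing straight out along +Z)
# encodes to the familiar normal-map blue
print(pack_normal(0.0, 0.0, 1.0))  # → (128, 128, 255)
```

That remap is why flat areas of a baked normal map come out that characteristic lilac-blue.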
Baking the Ambient Occlusion
While I was in the baking mood, I created and assigned a new texture to the model, changed the Bake output option to Ambient Occlusion, and hit the Bake button again. The output is a greyscale image that I can use in GIMP to modify my basic texture and give it some depth and detail.
Combining the AO and Base Texture
I opened GIMP and imported the base texture and the AO texture as separate layers. By overlaying the AO on the base texture and setting its layer mode to Multiply, the result is a final texture with new contrast and detail compared to the original. NB: Unity’s shader supports an AO channel, so I might use that with future modelling.
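Multiply works per channel: each base value gets scaled by the AO value, so white AO leaves the texture untouched while dark AO deepens the crevices. A minimal sketch of the blend maths on 8-bit channels (my own illustration, not GIMP’s code):

```python
def multiply_blend(base: int, ao: int) -> int:
    """Multiply layer mode on 8-bit channels: per-channel product,
    normalised back into the 0-255 range."""
    return base * ao // 255

print(multiply_blend(200, 255))  # → 200: white AO leaves the base unchanged
print(multiply_blend(200, 128))  # → 100: a half-occluded crevice darkens
```

This is also why Multiply can only ever darken - exactly the behaviour you want from an occlusion pass.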
Creating an Emissive Texture
Unity’s Standard Shader has channels for diffuse textures, normal maps and emissive maps (among other things). To create the emissive texture I opened GIMP and loaded the original base texture. Then I overlaid a new layer, filled it with black, and set it to about 50% opacity so I could see through to the underlying texture.
Painting the emissive layer in GIMP is simply a matter of adding colour where I want emissive light in the final Unity material.
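The 50%-opacity black layer is just a standard 'Normal' mode blend at reduced opacity: it dims the base enough to keep it visible as a painting guide. A quick sketch of the opacity maths (illustrative only):

```python
def over(fg: int, bg: int, opacity: float) -> int:
    """'Normal' layer mode: foreground at the given opacity over background."""
    return round(fg * opacity + bg * (1 - opacity))

# A black layer at 50% opacity dims a mid-grey base texel by half,
# so the underlying texture stays visible while painting emissive areas
print(over(0, 180, 0.5))  # → 90
```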
I saved the emissive layer to its own image and placed it with the other Unity assets for the 3d model.
Testing in Unity 3d
Finally I had all the texture and mesh/armature assets ready for Unity. After import, Mecanim provides the ability to test and fine-tune the skinned mesh. Normally, if things have been set up correctly in Blender, Unity will automap the bones and the mesh is ready for animating in the engine.
To assign the textures to the model, Unity uses Materials. I created a couple of materials for the model (one for the body/skin and another for the armour/clothes), then assigned the AO-modified base texture, the normal map and the emissive map to each material and tinkered with the parameters to achieve the right look.
Here’s the male Kinship Model in the Unity3d engine under a variety of lighting conditions.
Thanks for reading,