The "Zygote: The Rebel Uprising" development build (alpha v0.1.3) has been released, and gamedev and community feedback is requested.
Zygote is a first-person video game set amid the emerging battle between the Patogens and the rebel Kinship. The player joins the Kinship resistance to master vital skills that will shape their combined destiny.
Zygote’s primary game mechanics centre on managing the player’s vital signs and using tactical sensors and player buffs via the player’s Electronic Surveillance System (ESS).
The ESS monitors the player’s vital signs to provide an overall health status.
Every player action, activated buff, Patogen environment and Kinship challenge affects the player’s vital signs in real time; blackouts or death are the consequences of letting them deteriorate.
The ESS also comes with a collection of active and passive sensors to monitor the player’s environment.
Using the ESS, the player gains a tactical advantage over the Patogens, strategising plans of attack, evasion and avoidance in every circumstance.
The player’s skills are regularly honed in the Kinship Simulator, boosting their vital attributes to mitigate any degradation of their vital signs.
To rescue the Kinship, the player enters Patogen portals on recovery missions for the Books of Zy. These sacred books contain the knowledge necessary to activate the gates between the Kinship worlds. Without the books, the Kinship remain stranded and their homeland vulnerable to future Patogen attacks.
Buffs bestow super-human abilities on the player, boosting attributes like running, jumping, strength and telekinesis at the cost of the vital signs tracked by the ESS.
This balance between managing vital signs, improving vital attributes and boosting player abilities is critical to gaining any tactical advantage over the enemy. Get the balance right, and the player transforms into a super-human juggernaut, the nemesis of all Patogens.
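To illustrate that trade-off in code (a minimal sketch only; the class, attribute names and numbers here are hypothetical, not Zygote's actual systems or values), a buff might boost an attribute each tick while draining a vital sign, with a blackout when the vital sign bottoms out:

```python
# Hypothetical sketch of the buff/vital-sign trade-off described above.
# All names and numbers are illustrative, not the game's actual values.

class Player:
    def __init__(self):
        self.heart_rate = 100.0   # a vital sign tracked by the ESS
        self.speed = 1.0          # a vital attribute boosted by buffs

    def apply_speed_buff(self, ticks):
        """Each tick of the buff boosts speed but stresses a vital sign."""
        for _ in range(ticks):
            self.speed += 0.5
            self.heart_rate -= 8.0   # the cost paid in vital signs
            if self.heart_rate <= 0:
                return "blackout"    # deteriorated vital signs
        return "ok"

p = Player()
status = p.apply_speed_buff(ticks=5)
print(status, p.speed, p.heart_rate)  # ok 3.5 60.0
```

A short buff leaves the player faster but stressed; lean on it too long and the vital sign collapses before the buff pays off.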
This post takes a look at the process used to create the Kinship Soldiers for Zygote’s beta. The game is still in development and I’m very excited to see these sorts of details coming together.
Check out the IndieDB page for more information and download an alpha version from a few months ago in all its ‘placeholder’ glory.
So a quick explanation of Zygote: The Rebel Uprising…
It’s a first-person action adventure game. The player arrives in the middle of an emerging war between the Kinship and the attacking Patogens. Joining the Kinship, the player helps recover the Angkor Orbs to unlock the Angkor Gate and ultimately return to the homeland. Along the journey the player discovers whom to trust, who’s being betrayed and why everyone’s isolated in a mysterious province.
These characters add story details and exposition, so it’s important that they deliver sufficient quality to present an engaging narrative and world.
The Starting Point
I use Blender3d for 3d modelling, GIMP for image editing, Unity3d for the game engine and MakeHuman to build humanoid templates. I’ve been using various versions of each of these tools for many years, so I’m pretty familiar with them - I like to call ‘this’ my comfort zone:
If you’re a hardcore modeller, you might start your 3d character model from scratch, subdividing, loop-cutting, extruding and sculpting basic shapes into something special. But I tend to start my humanoids from a baseline template model, mainly due to my lack of time and resources. In this case, I used the MakeHuman app to generate a basic skinned character that can be imported into Blender3d.
After initially importing the model into Blender, I cleaned up the basic model and created a ‘template’ for all the humanoids in the game. I removed shape keys and tidied up the armature until I was satisfied with the basic setup, then saved the template 3d model in Blender.
Working between Blender3d and Unity3d couldn’t be any easier, since Unity has an excellent Blender importer that automatically reimports the model as soon as it detects an updated file.
Creating the Low Polygon Mesh
I cut the model in half, threw the left half away and used a Mirror Modifier to replace that side of the character. This meant I only had to work on half the model and kept both the mesh and armature consistent. I also added an Edge Split modifier to keep the hard edges visible and set the object to Smooth Shading for visualisation.
Then I edit the model, by extruding, masking and moving vertices, and adding/removing edge loops as required. MakeHuman does a pretty great job of setting up the mesh topology, but sometimes I’ll have to do a bit of retopology to get things looking okay after my editing. In the end I test the model in Blender by entering Pose Mode and moving the armature (which I normally keep on a separate layer to avoid cluttering the view). I can visualise the joints and see how the model looks in different positions.
Sculpting a High Polygon Mesh
Next I add the Multiresolution Modifier to the Mesh, subdivide the mesh a few times from the modifier menu (in this case 4 added enough detail for my needs) and start Sculpting.
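To give a feel for how much detail those subdivisions add: each multires level splits every quad into four, so face count grows by 4× per level. A quick sanity check (the 5,000-quad base count is a made-up example, not this model's actual count):

```python
# Each multires subdivision splits every quad into four, so the face
# count grows geometrically: base_faces * 4**levels.
def multires_faces(base_faces, levels):
    return base_faces * 4 ** levels

base = 5000  # hypothetical low-poly quad count, for illustration only
for level in range(5):
    print(f"level {level}: {multires_faces(base, level):,} faces")
# Four levels turn 5,000 quads into 1,280,000 - plenty for sculpted detail.
```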
The high-detail mesh will eventually be used to bake the normal map, and serves as a reference when painting the base texture.
Painting the Basic Diffuse Texture
Blender allows you to paint directly onto the 3d model in its Texture Paint mode. Be sure to have the Textured Solid option enabled so that textures show on the model. I’ll paint the textures using both the low and high detail meshes. Texture painting is pretty cool. You can paint solid colours, gradients, procedural textures created in Blender or imported textures directly onto the model.
Before working with textures in Blender, the model needs to be UV unwrapped. Sometimes you can get away with using Blender’s Smart UV Unwrap, but I manually added seams to the Low Poly model to control the UV unwrap and resulting islands created in the output textures.
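For anyone new to unwrapping: UV unwrapping assigns each 3d vertex a 2d texture coordinate. The simplest possible case is a planar projection - a toy stand-in for what Blender actually computes (its unwrap minimises stretch across the seam-bounded islands), but the output contract is the same, one (u, v) per vertex:

```python
# Toy planar UV projection: drop one axis and normalise the rest into [0, 1].
# Blender's unwrap is far smarter than this, but it produces the same kind
# of output: a 2d (u, v) coordinate for every 3d vertex.
def planar_uv(vertices):
    xs = [v[0] for v in vertices]
    ys = [v[1] for v in vertices]
    min_x, max_x = min(xs), max(xs)
    min_y, max_y = min(ys), max(ys)
    return [
        ((x - min_x) / (max_x - min_x), (y - min_y) / (max_y - min_y))
        for x, y, _z in vertices
    ]

# A flat quad maps neatly onto the corners of UV space.
quad = [(0.0, 0.0, 1.0), (2.0, 0.0, 1.0), (2.0, 4.0, 1.0), (0.0, 4.0, 1.0)]
print(planar_uv(quad))  # [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
```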
Baking a Normal Map
Blender has all the facilities to bake normal maps directly in the editor. First, I created a blank texture in Blender and assigned it to the mesh’s UV vertices in Edit Mode. Second, I changed to Object Mode and set the Preview option of the Multires Modifier to 0 to show the base mesh. Third, I went to the Render panel, scrolled to the Bake section, selected the Normal Map output and enabled the ‘Bake from Multires’ option. Finally, I hit the Bake button and watched as the normal map was generated.
This results in the high-poly normals being baked onto the low-poly mesh. I saved the texture as a new Unity image asset and later assigned it as a normal map in the model’s Unity material (I’m using Unity’s new Standard shader in the game engine).
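Under the hood, a tangent-space normal map stores each unit normal as an RGB pixel by remapping components from [-1, 1] to [0, 255]. A sketch of just the encoding step (the bake itself also ray-casts from the low-poly surface to the high-poly one, which is omitted here):

```python
# Encode a unit normal (components in [-1, 1]) as an 8-bit RGB pixel,
# the storage format of a tangent-space normal map.
def encode_normal(nx, ny, nz):
    return tuple(round((c * 0.5 + 0.5) * 255) for c in (nx, ny, nz))

# A flat surface points straight along +Z, which encodes to the familiar
# lavender-blue colour of an "empty" normal map.
print(encode_normal(0.0, 0.0, 1.0))  # (128, 128, 255)
```

This is why an untouched normal map looks uniformly blue: every pixel is the straight-up normal.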
Baking the Ambient Occlusion
While I’m in the baking mood, I created and assigned a new texture to the model. Then I changed the Bake output option to Ambient Occlusion and hit the Bake button again. The output is a greyscale image that I can use in GIMP to modify my basic texture and give it some depth/details.
Combining the AO and Base Texture
I opened GIMP and imported the base texture and the AO texture as separate layers. By overlaying the AO on the base texture and setting its blend mode to Multiply, the result is a final texture with added contrast and detail compared to the original basic texture. NB: Unity’s shader supports an AO channel, so I might use that with future modelling.
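Multiply mode does exactly what it says, per channel: each pixel is treated as a 0-1 value and multiplied, so the white (unoccluded) areas of the AO leave the base untouched while the dark crevices pull it down. A minimal per-pixel sketch (the example colour values are arbitrary):

```python
# Multiply blend, per channel: result = base * ao, with 8-bit values
# normalised to [0, 1]. White AO (255) leaves the base unchanged;
# dark AO (occluded crevices) darkens it.
def multiply_blend(base_px, ao_px):
    return tuple(round(b * a / 255) for b, a in zip(base_px, ao_px))

skin = (200, 150, 120)  # arbitrary example pixel
print(multiply_blend(skin, (255, 255, 255)))  # fully lit: (200, 150, 120)
print(multiply_blend(skin, (128, 128, 128)))  # half-occluded: (100, 75, 60)
```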
Creating an Emissive Texture
Unity’s Standard Shader has channels for diffuse textures, normal maps and emissive maps (among other things). To create the emissive texture I opened GIMP and loaded the original base texture. Then I overlaid an empty layer, filled it completely black, and left it at about 50% opacity so I could see through to the underlying texture.
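That semi-transparent black layer is ordinary alpha compositing: what you see is top * alpha + base * (1 - alpha), per channel, so pure black at 50% opacity simply halves every channel of the texture underneath. A sketch (example pixel values are arbitrary):

```python
# Standard "over" compositing of a semi-transparent layer on a base pixel:
# result = top * alpha + base * (1 - alpha), per channel.
def composite_over(top_px, base_px, alpha):
    return tuple(
        round(t * alpha + b * (1 - alpha)) for t, b in zip(top_px, base_px)
    )

base = (200, 150, 120)   # arbitrary example pixel from the base texture
black = (0, 0, 0)
print(composite_over(black, base, alpha=0.5))  # (100, 75, 60): half brightness
```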
Painting the emissive layer in GIMP is simply a matter of adding colour where I want emissive light in the final Unity material.
I saved the emissive layer to its own image and placed it with the other Unity assets for the 3d model.
Testing in Unity 3d
Finally I had all the texture and mesh/armature assets ready for Unity. After import, Mecanim provides the ability to test and fine-tune the skinned mesh. Normally, if things have been set up correctly in Blender, Unity will automap the bones and the mesh is ready for animating in the engine.
To assign the textures to the model, Unity uses Materials. I created a couple of materials for the model (one for the body/skin and another for the armour/clothes). I then assigned the AO-modified base texture, the normal map and the emissive map to the Unity material and tinkered with the parameters to achieve the right look.
Here’s the male Kinship Model in the Unity3d engine under a variety of lighting conditions.
Thanks for reading,