Designing a cross-platform VR game in Mr.Hack Jack

Sharing some game design takeaways from cross-platform VR game Mr.Hack Jack: Robot Detective.


I’m Andrea, designer and producer at Donuts Co. Ltd. in Tokyo. My team and I are just out of a 4 month production cycle that led to the creation of Mr.Hack Jack: Robot Detective, which is my first fully fledged cross-platform consumer VR game for Oculus/Vive and Oculus Go. YAY!

I'm writing this because I'd like to share a few game design decisions and takeaways from this short but interesting project.


Our goal with Mr.Hack Jack:

I joined the team at the beginning of December 2018 to help out on the creative direction and project management side for a new VR detective game project. At that time, the Donuts VR team had just completed their previous game, The Last Letter, in a six month time frame. I thought that was pretty impressive, but the production process had been somewhat chaotic because of a frequently changing creative vision.

The goal we were given was to get the game done with better quality and reviews than the previous one, on Oculus/Vive and most crucially Oculus Go, in roughly 3 months with a team of four people:
2 x 3D artists, 1 x programmer and myself. No pressure.

Also, since our art team is small but wants to push quality, we wanted to make a genuinely nice looking game with great performance. Of course.

What is Mr.Hack Jack: Robot Detective?

The player plays the role of Hack Jack, a detective called to help out on several crime scenes (a shooting in a jazz bar, a mysterious murder...). Mechanically, it plays out like this: every scene has several clues and puzzles that need to be solved to push the case resolution progress bar to 100%. Clues about what needs to be found can be gathered by talking to a squad of super incompetent police bots that are working the scene.

Whenever a clue is found, a hologram representing what happened at that spot in the scene appears. By the end of the level, the scene will be filled with holograms giving a better understanding of what went on.
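To make the structure concrete, here is a minimal sketch (not our actual project code; the CrimeScene/Clue names are made up for illustration) of how a scene could track found clues, reveal their holograms and drive the resolution progress bar:

```python
# Illustrative sketch only: a crime scene tracks its clues and exposes a
# 0-100% resolution value that drives the progress bar.
from dataclasses import dataclass, field

@dataclass
class Clue:
    name: str
    hologram_id: str      # hologram to enable when the clue is found
    found: bool = False

@dataclass
class CrimeScene:
    clues: list = field(default_factory=list)

    def find_clue(self, name: str) -> None:
        for clue in self.clues:
            if clue.name == name and not clue.found:
                clue.found = True
                print(f"Hologram '{clue.hologram_id}' revealed")

    @property
    def progress(self) -> float:
        """Case resolution as a percentage."""
        if not self.clues:
            return 0.0
        return 100.0 * sum(c.found for c in self.clues) / len(self.clues)

scene = CrimeScene([Clue("bullet_casing", "holo_shooter"),
                    Clue("broken_glass", "holo_struggle")])
scene.find_clue("bullet_casing")
print(f"{scene.progress:.0f}%")   # -> 50%
```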

Game design iteration

Because of the very short time frame, we decided to keep the mechanics extremely simple to cut down on gameplay iteration time.

We decided to focus on a few mechanics:

Picking stuff up
This one is fairly straightforward, and we went for the ability to “force pull” items from the start. This is especially important if you’re going for Oculus Go, as its controller has no positional (depth) movement: you can only twist it around the three axes, you can’t move it in space to get closer to things. At first we also toyed with the idea of being able to extend your arms and keep them extended at any length… ultimately we killed the extendable-arms feature as we didn’t find any immediate use for it.

To facilitate the grabbing, we display a fancier rendition of the hand raycast.
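For readers curious what the force pull boils down to, here is a rough, engine-free sketch in plain Python. The ray test and the pull speed are simplified assumptions, not our actual implementation:

```python
# Sketch of the "force pull" idea: the hand casts a ray, and a grabbable item
# hit by that ray is pulled toward the hand a little each frame.
def lerp(a, b, t):
    return tuple(a[i] + (b[i] - a[i]) * t for i in range(3))

def ray_hits(origin, direction, item_pos, radius=0.25):
    # Very simplified ray/sphere test: project the item onto the ray and
    # check how far it sits from the ray.
    to_item = tuple(item_pos[i] - origin[i] for i in range(3))
    along = sum(to_item[i] * direction[i] for i in range(3))
    if along < 0:
        return False
    closest = tuple(origin[i] + direction[i] * along for i in range(3))
    dist_sq = sum((item_pos[i] - closest[i]) ** 2 for i in range(3))
    return dist_sq <= radius * radius

def force_pull(hand_pos, item_pos, pull_speed=0.2):
    # Move the item a fraction of the way toward the hand each frame.
    return lerp(item_pos, hand_pos, pull_speed)

hand, item = (0.0, 1.2, 0.0), (0.0, 1.0, 2.0)
if ray_hits(hand, (0.0, 0.0, 1.0), item):
    for _ in range(5):
        item = force_pull(hand, item)
print(item)
```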

Using a magnifier/scanner to reveal invisible clues
As this is a detective game, this was a natural feature to have. We did struggle with it more than we anticipated, the main issue being its sensitivity. You would expect people to keep it close to their eyes to maximize the “scanned area”, but it turns out testers liked to keep it at a certain distance, sometimes scanning areas a few centimeters away from a clue, not seeing anything and never going back to that spot. In hindsight, maybe a clue proximity indicator would have helped, at the risk of making the whole thing too easy.
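As an illustration of that hypothetical proximity indicator, here is a tiny sketch. The reveal/warm thresholds and the hot/warm/cold feedback are made-up values, just to show the idea:

```python
# Illustrative sketch: given the scanner's focus point and the hidden clues,
# return a "reveal / warm / cold" hint instead of only revealing on an exact hit.
import math

def scanner_feedback(scan_point, clue_points, reveal_radius=0.05, warm_radius=0.3):
    nearest = min(math.dist(scan_point, c) for c in clue_points)
    if nearest <= reveal_radius:
        return "reveal"       # show the hologram
    if nearest <= warm_radius:
        return "warm"         # pulse the scanner: the player is close
    return "cold"

clues = [(1.0, 0.8, 2.0)]
print(scanner_feedback((1.02, 0.8, 2.0), clues))  # -> "reveal"
print(scanner_feedback((1.2, 0.8, 2.0), clues))   # -> "warm"
```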

The Detective Pad

“Hacking” things
We wanted an easy way to interact with the world's machines and characters and decided to go with a ubiquitous “hack plug” mechanic. Basically, you throw your hack plug into appropriate sockets found in the world, on characters and on objects. This turned out to be pretty funny... most robots actually have a socket on their mouth, making them look like babies with a pacifier when they are “hacked”.
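The interaction itself is simple to reason about; a minimal sketch (hypothetical names, no engine code) of the plug snapping into the nearest socket in range might look like this:

```python
# Sketch of the "hack plug" interaction: when the thrown plug lands close
# enough to a socket, it snaps into it and triggers that socket's hack.
import math

def try_plug(plug_pos, sockets, snap_radius=0.15):
    """Return the socket the plug snaps into, or None."""
    for socket_id, socket_pos in sockets.items():
        if math.dist(plug_pos, socket_pos) <= snap_radius:
            return socket_id
    return None

sockets = {"robot_mouth": (0.0, 1.5, 1.0), "door_panel": (2.0, 1.2, 0.5)}
hit = try_plug((0.05, 1.45, 1.02), sockets)
if hit:
    print(f"Hacking {hit}...")   # -> Hacking robot_mouth...
```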

Switching tools
This actually took some time! At first we decided that each tool would be a hand replacement and went with a UI docked on your hand, with a selection wheel to choose between grabbing, the magnifier and the hacking plug. This felt cumbersome, so we switched to a floating 3D menu… but that didn’t solve the issue that having to switch back to the grabbing hand to be able to grab things was a pain.

Ultimately we went with more streamlined solutions. The grabbing hand is always there, the magnifier is a grabbable object attached to your left arm (drop it and it floats back to its position on the left arm), the hacking plug “hand” automatically appears whenever you aim at a socket.

Simple and very effective!
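In code terms, the final rule is almost trivial. Here is a sketch of it, with illustrative names only (the magnifier doesn't appear because it's a held object rather than a hand state):

```python
# Sketch of the streamlined tool logic: the grab hand is the default, and the
# hack-plug "hand" takes over only while aiming at a socket.
def active_tool(aim_target):
    """aim_target is whatever the hand raycast currently hits (or None)."""
    if aim_target == "socket":
        return "hack_plug"
    return "grab_hand"

for target in [None, "crate", "socket"]:
    print(target, "->", active_tool(target))
```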

The Detective Pad
What we call the dPad is where all the communication messages and all the clues to a case are gathered. It is effectively the player’s left arm. Again, this design stems from the need to adapt to the Oculus Go’s specs, which feature only one controller (!). We came up with a screen that is docked to the left side of your body on Oculus Go and can be freely moved around on the other platforms.

Another note about the dPad: the grabbing hand (the right hand) shoots a raycast into the world to highlight which item you can pick up. When that raycast intersects the dPad screen, it turns into a mouse cursor that lets you seamlessly interact with the UI on the screen.
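A simplified sketch of that pointer switch, using a plain ray/plane test (the real screen is of course a bounded rectangle; names and math here are illustrative only):

```python
# Sketch: the same hand ray either highlights a world object or, when it
# crosses the dPad's screen plane, becomes a 2D cursor on that screen.
def ray_plane_hit(origin, direction, plane_point, plane_normal):
    denom = sum(direction[i] * plane_normal[i] for i in range(3))
    if abs(denom) < 1e-6:
        return None
    t = sum((plane_point[i] - origin[i]) * plane_normal[i] for i in range(3)) / denom
    if t < 0:
        return None
    return tuple(origin[i] + direction[i] * t for i in range(3))

def pointer_mode(origin, direction, dpad_plane):
    hit = ray_plane_hit(origin, direction, *dpad_plane)
    if hit is not None:
        return ("ui_cursor", hit)      # draw a cursor on the screen
    return ("world_pointer", None)     # keep highlighting world objects

dpad = ((0.0, 1.0, 0.5), (0.0, 0.0, -1.0))   # a plane facing the player
print(pointer_mode((0.0, 1.0, 0.0), (0.0, 0.0, 1.0), dpad))
```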

The dPad went through some iterations as well, at first being limited to receiving messages and eventually becoming the central repository for all the crime scene information.

The old dPad

Evolution of the Detective Pad


Puzzles/Minigames
We designed two main types of puzzles/minigames, such as one where you need to reproduce a sequence of inputs. I won’t go into much detail about them here, but we went for simple solutions for which we could easily create content by adding gameplay modifiers or varying the number of elements in the puzzle (like the number of possible buttons in a sequence that needs to be reproduced). We designed all puzzle systems to be flexible, and their visual elements and layout to be customizable. For instance, the sequence minigame can float in mid-air or be contextualized as the buttons on a machine.
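As an example of what varying the number of elements means in practice, here is a small illustrative sketch of the sequence minigame scaling with two parameters:

```python
# Sketch: the sequence mini-game scales in difficulty just by changing its
# parameters (number of buttons, sequence length). Illustrative only.
import random

def make_sequence(num_buttons=4, length=5, seed=None):
    rng = random.Random(seed)
    return [rng.randrange(num_buttons) for _ in range(length)]

def check_inputs(target, player_inputs):
    """True when the player reproduced the whole sequence correctly."""
    return list(player_inputs) == list(target)

easy = make_sequence(num_buttons=3, length=4, seed=42)
hard = make_sequence(num_buttons=6, length=8, seed=42)
print(easy, hard)
print(check_inputs(easy, easy))   # -> True
```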

Space and movement
Ah, VR movement! Two issues here: how to move around in a scene, and whether/how to use room scale.

Let’s get room scale out of the way first. If you’re doing Oculus Go you can’t really rely on it, since there is no positional movement at all. Room scale is there on devices that support it, but we decided not to rely on it for gameplay. At one point we did have clues that were hidden behind a corner or beneath a table, but we decided to scrap that.

As for movement, I think we just iterated on existing solutions. We went with an aim-teleport system: tilt the stick for about 0.5 seconds and the aiming curve appears; releasing it teleports you. We were careful not to rely on stick clicking on Oculus, as we found it really uncomfortable. On top of that we added quick turning by rapidly tilting the stick. The turn angle is 45°, as 90° proved too disorienting. This is similar to what you have in Oculus Home. We also added a 180° quick turn that proved really convenient.
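Here is a rough sketch of those input rules in plain Python. The 0.5 s arming delay and the 45°/180° turn angles come from the text above; everything else (thresholds, names) is an illustrative assumption:

```python
# Sketch of the locomotion rules: holding the stick forward ~0.5 s arms the
# teleport arc (release = teleport); strong sideways/backward tilts snap-turn.
# Flick debouncing is omitted for brevity.
SNAP_TURN = 45
ARM_DELAY = 0.5   # seconds of forward tilt before the arc appears

class Locomotion:
    def __init__(self):
        self.tilt_time = 0.0
        self.yaw = 0

    def update(self, dt, stick_x, stick_y):
        events = []
        # Teleport arming: forward tilt held long enough shows the arc.
        if stick_y > 0.7:
            self.tilt_time += dt
            if self.tilt_time >= ARM_DELAY:
                events.append("show_teleport_arc")
        elif self.tilt_time >= ARM_DELAY:
            events.append("teleport")      # stick released while armed
            self.tilt_time = 0.0
        else:
            self.tilt_time = 0.0
        # Quick turns on a strong sideways or backward tilt.
        if stick_x > 0.9:
            self.yaw = (self.yaw + SNAP_TURN) % 360
            events.append(f"turn_right_{SNAP_TURN}")
        elif stick_x < -0.9:
            self.yaw = (self.yaw - SNAP_TURN) % 360
            events.append(f"turn_left_{SNAP_TURN}")
        elif stick_y < -0.9:
            self.yaw = (self.yaw + 180) % 360
            events.append("turn_180")
        return events

loco = Locomotion()
for _ in range(10):                      # hold forward for 10 frames...
    loco.update(1 / 15, 0.0, 1.0)
print(loco.update(1 / 15, 0.0, 0.0))     # ...release -> ['teleport']
```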

Making the fade in/out during the teleport non-instant and adding a subtle sound also proved beneficial in simulating the feeling of moving somewhere. Tweaking that teleport duration actually has a huge impact on the game's rhythm, in my opinion.
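And a tiny sketch of the non-instant fade itself, where the total duration is the knob we kept tweaking (the values here are arbitrary examples):

```python
# Sketch of the teleport fade: fade out, move, fade back in over `duration`.
def teleport_fade_steps(duration=0.3, steps=6):
    """Yield (time, screen_opacity) pairs for a fade-out/fade-in."""
    half = duration / 2
    for i in range(steps + 1):
        t = duration * i / steps
        opacity = t / half if t <= half else (duration - t) / half
        yield round(t, 3), round(min(opacity, 1.0), 2)

for t, o in teleport_fade_steps():
    print(t, o)
```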

Art direction

We went with a Noir-movie-inspired setting, as we liked it a lot and felt there was enough preexisting zeitgeist to give us a head start. Obviously, films like Blade Runner covered the blend of SF and Noir, but we chose a light-hearted approach and didn’t end up taking inspiration from any well-known, darker material.

We’ll share more about our tech art process soon but we’re really happy with the results and performance, which is especially important for the Oculus Go build.

Here are a few pics of the visual evolution of the game, from month 1 to month 4.

What do you think?

Visual evolution of some of the levels


Characters
Since in a first-person VR game the opportunities to see your own character are pretty slim, we chose to create a set of characters that would keep the player (and us) company during the game. I like characters in general (duh) and I think having some makes your development process more fun, as it creates fun and interesting things to play with and talk about. And as a general rule I like to think that things that are fun to develop will most likely translate into fun things in the final product.

Plus we could use those characters to make fun screenshots and promotional material. I worked on a game before that didn’t look as good in screenshots as it did in motion and I promised myself not to fall into that trap again.

Also, name your stuff, always. During development, you won’t have the same feelings for Bot_001 and Bot_002 as you’ll have for Pablo and Samantha. Just my two cents. ^_^

The bot squad

Testing

One last thing I want to touch upon is testing. As always, player feedback is super important... BUT VR is such a specific platform, and so surprising to first-timers, that your test results may be inconclusive. I think it’s safe to assume that if you develop for Vive or Oculus, the audience is already well trained in VR... so unless you’re doing something exotic, I don’t think your game should have the ambition of being usable by anyone (the way, say, a mobile game should be).

By all means, also get total VR newbies in to make sure all the information is easily understandable, but a good, if slightly awkward, solution is to make those testers go through the Oculus tutorial before playing your game.

What’s next?

Our Oculus Go port is not finished yet! At this point we’re confident in the game's performance on the platform, but I’m pretty sure we’ll have more to say about porting the gameplay. We’re planning a couple more articles about some of the things we did art-wise to optimize the game, but if you have any specific questions, feel free to ask!

And of course, check out Mr.Hack Jack! I'd love to get your feedback on it!
