My previous mod project was Dissolution, which was in development for the best part of two years. It's always easy to identify your mistakes long after a project is complete; Dissolution's biggest mistake was not testing early enough.
I went through two years of development and only tested a month before release, when things were largely set in stone. The result was that players had gripes, and those gripes were with flaws that playtesters would have easily picked up; flaws that I, the developer, was completely blind to.
Lunar Descent has hopefully learned from the mistakes of Dissolution, and early testing has been a feature of the development process. I've pushed out two gameplay alphas to a group of testers and asked them for their thoughts. Some of them even recorded their playthroughs, commentating on their experience as they went. I really can't thank my testers enough, because they've already identified issues that I wouldn't have considered issues.
However, I decided to take things a step further. A lot of game developers have been collecting playthrough statistics for years, most notably (in my view) Valve. I decided to experiment with a similar system for the Lunar Descent alpha testing programme.
Testers were made aware that statistics regarding their playthroughs would be collected. It's only been in the last week or so, however, that I've found the time to take the thick chunks of raw data and process them into something readable. You can find the gameplay statistics below:
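For the curious, turning raw playthrough data into a readable summary can be as simple as counting events per player. Here's a minimal sketch of that idea; the log format, event names, and figures are all illustrative assumptions, not the mod's actual telemetry.

```python
# Hypothetical sketch: aggregating raw per-event log lines into a
# per-player summary. The CSV format here is an assumption for
# illustration, not the format Lunar Descent actually logs.
import csv
import io
from collections import defaultdict

raw_log = """player,event
1,death
1,medkit_pickup
1,medkit_use
2,death
2,death
2,medkit_pickup
"""

# counts[player][event] -> number of times that event occurred
counts = defaultdict(lambda: defaultdict(int))
for row in csv.DictReader(io.StringIO(raw_log)):
    counts[row["player"]][row["event"]] += 1

for player, events in sorted(counts.items()):
    print(f"Player {player}: {dict(events)}")
```

From there it's a short step to per-alpha totals, completion rates, and the other figures discussed below.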
One thing is immediately obvious: I managed to get more playtesters for the first alpha than I did for the second. This is unfortunate, as obviously the fewer players playing, the less representative the statistics are going to be.
Let's take a look at Alpha 2 first. Interestingly, only 70% of testers actually finished the second alpha. Now, 70% is a huge figure for a full product; most games see completion rates under 20%. But for an alpha whose player base is made up entirely of people who volunteered to play, it seemed quite low.
The statistics turn up an interesting fact, though: 100% of players made it through the first combat arena. The first arena is definitely not easy by any stretch of the imagination; the player is thrown straight into tough, medium-distance combat.
However, while 19 players entered the stealth arena, only 15 of them actually completed it successfully. As odd as it sounds, I was somewhat pleased to see that this was the area that knocked players out. Tester feedback was very divided with regard to the stealth section, so it was kind of vindicating to see the statistics back up the traditional feedback.
The playtimes for Alpha 2 varied widely: some players completed it in under twelve minutes, while others took as long as 45-50 minutes. The trimmed mean (21 minutes) is about what I'd want it to be. The target game length for the whole mod is 40 minutes, so it's reassuring to see that for most players, we're on target.
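A trimmed mean is handy here precisely because of those outliers: it drops the fastest and slowest runs before averaging, so one 50-minute explorer doesn't skew the figure. A minimal sketch, with made-up playtimes rather than the real alpha data:

```python
# Hypothetical sketch of a trimmed mean over playtimes (in minutes).
# The playtime values below are illustrative, not the actual alpha data.

def trimmed_mean(values, trim_fraction=0.1):
    """Mean after dropping the lowest and highest trim_fraction of values."""
    ordered = sorted(values)
    k = int(len(ordered) * trim_fraction)  # how many to drop from each end
    kept = ordered[k:len(ordered) - k] if k else ordered
    return sum(kept) / len(kept)

playtimes = [11, 14, 17, 19, 20, 21, 22, 23, 25, 48]  # illustrative
print(trimmed_mean(playtimes))  # drops the 11 and the 48 -> 20.125
```

With 10% trimmed from each end, the twelve-minute speedruns and the 50-minute explorations both fall out of the average.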
While there were fewer testers for Alpha 3, those playtesters were much more engaged with the experience. All of them explored every area, all of them completed the game and, again, the mean playtime was about what I would hope for.
The average deaths for both alphas were roughly the same. Interestingly, in Alpha 3, players collected a lot of medkits (222) but used only 93! This is in contrast with Alpha 2, where medkit collection and consumption were fairly close. My theory is that players forgot about the medkits, forgot which key used them, or simply didn't need them as much. Judging from the statistics, though, Player 7 sure could have done with taking more medkits. :)
I'm looking forward to the Beta statistics. In the Beta, the entire campaign is playable (though not 100% finished by any means), and so metrics for things like game length and completion rate are going to be more indicative of the final product.