Thetaball is an authentic digital sport with rich, physics-based gameplay and lifelike AI.
In the single player game, your task is to field a team of robotic athletes in order to advance through a series of progressively more challenging leagues, earning as many coins as you can along the way. You can remotely control any of the bots on your team, or let them play on their own. A number of practice drills are available to help you improve your skills.
In the multiplayer game, 2-8 local players can compete against each other or cooperate against an entirely robotic opposing team.
Thetaball's special sauce is a genetic algorithm. Each of the game's 256 robotic players is a distinct individual, with behaviors created by a process of simulated evolution. Because the bots have coevolved within a complex, physics-based environment -- both cooperating with and competing against one another -- their behaviors are diverse, robust and lifelike.
A custom, highly-optimized engine was developed for the game so that the process of evolution could unfold at a reasonable pace. The bots that ship with the game are drawn from a population that has been evolving for roughly 60 years of simulated game-time, corresponding to a few months of compute time. Obviously, without such a speed-up, an evolutionary approach would be impractical.
A side-benefit of the fast engine is that matches not involving the user's team can be played out in a matter of seconds, without relying on statistical techniques to determine the outcome (i.e., without "rolling dice").
The genetic algorithm itself is implemented as a sort of continuous round-robin: the successful players reproduce at periodic intervals, while the others do not. The genome consists of real-valued genes subject to crossover and mutation.
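The scheme above can be sketched in a few dozen lines. This is a generic illustration of the technique, not Thetaball's actual code: the toy fitness function, the truncation-style selection, and all parameters are invented for the example.

```python
import random

# A minimal real-valued genetic algorithm: truncation selection,
# uniform crossover, and Gaussian mutation. Everything here (fitness
# function, constants) is a stand-in invented for illustration.

GENOME_LEN = 8
POP_SIZE = 40
MUT_RATE = 0.2    # probability that each gene mutates
MUT_SIGMA = 0.1   # std-dev of the Gaussian mutation noise

# Hypothetical "ideal behavior" the toy fitness rewards proximity to.
TARGET = [0.3, -0.7, 1.2, 0.0, 0.5, -0.2, 0.9, -1.1]

def fitness(genome):
    # Toy objective: the closer to TARGET, the fitter (higher is better).
    return -sum((g - t) ** 2 for g, t in zip(genome, TARGET))

def crossover(a, b):
    # Uniform crossover: each gene is drawn from one parent or the other.
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(genome):
    return [g + random.gauss(0, MUT_SIGMA) if random.random() < MUT_RATE else g
            for g in genome]

def evolve(generations=60):
    pop = [[random.uniform(-2, 2) for _ in range(GENOME_LEN)]
           for _ in range(POP_SIZE)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        # The "successful players" reproduce; the rest do not.
        survivors = pop[:POP_SIZE // 2]
        offspring = [mutate(crossover(random.choice(survivors),
                                      random.choice(survivors)))
                     for _ in range(POP_SIZE - len(survivors))]
        pop = survivors + offspring
    return max(pop, key=fitness)
```

Because the survivors carry over unchanged each generation, the best genome found so far is never lost, so fitness improves monotonically.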
The most direct benefit of an evolutionary approach is the richness of play that arises when every player is a unique individual. If properly implemented, a genetic algorithm can explore a vast behavior-space, focusing its attention on just the feasible regions. As evolution proceeds, the programmer can select representative individuals from the population to serve as the final players to ship with the game.
For established games and sports, the best overall approach to the game is gradually refined through a process of trial and error. But for a programmer designing the AI for an entirely new game, the "best overall approach" is not obvious (unless the game is very simple) and can only be guessed at. A genetic algorithm by its nature can home in on the ideal overall style of play while still maintaining a diversity of individual styles. The coevolving players essentially act as automated play-testers, both balancing the game and, as a side benefit, uncovering any simplistic exploits that might exist.
The obvious drawback of an evolutionary approach is that evolution is slow. The good news is that a genetic algorithm can be parallelized in a straightforward way, by assigning a separate population to each processing core. The cores can thus be conceptualized as islands or continents, with periodic migrations between them. It's easy to imagine a distributed, massively parallel implementation where every installation of the game constitutes an island, and migrations travel through the Cloud.
The last benefit is subtle, closely related to the first two, and perhaps the most important. It is the tendency of a coevolutionary system to maximize the variety of events that unfold within the environment. To see how, consider the relationship between goalies and shot-takers. When a goalie plays close to the goal, it's to the shot-taker's advantage to take the shot from a distance; but if the majority of shots are taken from a distance, then there is selective pressure on the goalies to play further out.
Intuitively, it's easy to see how the population will tend towards an equilibrium, with the goalies playing at some intermediate distance. And that's the key: because the goalies have "split the difference", opportunities remain on either side of the split. On some occasions, goals will be scored from a distance, and on other occasions, from close in. This principle applies across the board to all types of interactions: players will sometimes cut back, sometimes not; sometimes play the ball, sometimes the ball-carrier; sometimes push up the middle, sometimes go around the ends, and so on.
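The goalie/shooter interaction can be made concrete with a toy 2x2 game; all the numbers here are invented for illustration. Rows are the shooter's options (NEAR, FAR), columns are the goalie's (CLOSE, OUT), and each entry is the shooter's scoring probability. Fictitious play — each side repeatedly best-responding to the other's observed mix — serves as a rough stand-in for coevolution and drifts toward the mixed equilibrium, which can also be computed directly from the indifference conditions.

```python
# Toy goalie/shooter game with invented payoffs. Entries are the
# shooter's scoring probability; the goalie wants to minimize it.
SCORE_PROB = [[0.2, 0.8],   # NEAR shot: stopped by a CLOSE goalie
              [0.6, 0.2]]   # FAR shot:  stopped by a goalie playing OUT

def analytic_equilibrium():
    # Mixed equilibrium of a 2x2 zero-sum game via indifference conditions.
    (a, b), (c, d) = SCORE_PROB
    denom = a - b - c + d
    p_near = (d - c) / denom    # shooter's equilibrium rate of NEAR shots
    q_close = (d - b) / denom   # goalie's equilibrium rate of playing CLOSE
    return p_near, q_close

def fictitious_play(rounds=200_000):
    # Each side best-responds to the opponent's empirical mix so far --
    # a crude proxy for selective pressure in a coevolving population.
    shooter_counts = [1, 1]  # observed NEAR / FAR shots
    goalie_counts = [1, 1]   # observed CLOSE / OUT positionings
    for _ in range(rounds):
        q = goalie_counts[0] / sum(goalie_counts)
        shot = max((0, 1), key=lambda s:
                   SCORE_PROB[s][0] * q + SCORE_PROB[s][1] * (1 - q))
        p = shooter_counts[0] / sum(shooter_counts)
        pos = min((0, 1), key=lambda g:
                  SCORE_PROB[0][g] * p + SCORE_PROB[1][g] * (1 - p))
        shooter_counts[shot] += 1
        goalie_counts[pos] += 1
    return (shooter_counts[0] / sum(shooter_counts),
            goalie_counts[0] / sum(goalie_counts))
```

With these numbers the equilibrium has the goalie playing CLOSE 60% of the time and the shooter shooting NEAR 40% of the time, and at that split both kinds of shot score at the same rate: goals keep arriving from both ranges, which is exactly the variety-preserving effect described above.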
Here's some sample gameplay from Thetaball, the game I've been working on. All players are AI-controlled.