Mission Rovee is a charming game (releasing Apr 29th on Steam) developed by a team of students at Ball State University, IN. The game seats you in front of two old-style CRT monitors at a desk littered with 5 ¼” floppies, with a post-it note stuck to one monitor showing a login and password. You type them in and find yourself looking through the camera attached to a rover on the surface of another world. The command line is your only interface with the rover: by figuring out what commands the terminal accepts, you learn how to move the rover around, manipulate its arm, and collect the minerals needed to complete the game.
But the game's unusual setting is not the only thing that sets it apart. What's remarkable is the level of automated testing the students have built into their development process, a level that puts most professional teams to shame.
By structuring their code for testability, adding tests at all levels of the test pyramid, and running those tests automatically on every commit, they have reaped the rewards of a stress-free development cycle and the ability to release at will, while other student teams are crunching to complete their game projects.
I was lucky enough to sit down with team members Pandora Roberts and Jeffrey Harmon, along with their professor Dr Paul Gestwicki, who has long prioritized technical practices that lead to high-quality code, such as Test-Driven Development and refactoring.
The Tests
Let's dive into what tests the team has. At the top of the pyramid they have a full E2E test of their game, which they keep fast and reliable by replaying a list of terminal commands at a very high engine time scale. This runs the entire game start to finish, entering the text commands into the UI element and observing the result. Of course, this would be more challenging if the game randomized the location of the minerals that must be collected, but it's natural that as the game becomes more sophisticated, so must the tests at this level. The whole test takes about 3 seconds to run.
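To make the idea concrete, here is a minimal sketch of what a replay-style test could look like, assuming Godot 4 with GUT 9. The scene path, the Terminal node, its submit_command() method, the game_complete flag, and the command list are all hypothetical stand-ins, not the team's actual code.

```gdscript
extends GutTest

# Hypothetical command script; the real game's commands and win condition differ.
const REPLAY_COMMANDS = [
    "move forward 10",
    "arm pivot 45",
    "collect",
]

func test_full_playthrough_by_replaying_commands() -> void:
    var game = load("res://scenes/Mission.tscn").instantiate()
    add_child_autofree(game)

    Engine.time_scale = 50.0  # run the simulation far faster than real time
    for command in REPLAY_COMMANDS:
        game.get_node("Terminal").submit_command(command)
        await wait_frames(5)  # let the rover react before the next command
    Engine.time_scale = 1.0

    assert_true(game.game_complete, "replaying the command list should finish the game")
```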
At the layer below this are tests for individual components such as the rover arm, which must pivot to collect minerals. These tests work with smaller scenes containing only the component(s) to be tested and run very fast.
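A component-level test in that style might look something like the sketch below, again assuming GUT 9; RoverArm.tscn, pivot_to(), and angle_degrees are hypothetical names standing in for the real arm API.

```gdscript
extends GutTest

func test_arm_pivots_to_requested_angle() -> void:
    # Only the arm scene is instantiated, so the test stays small and fast.
    var arm = load("res://scenes/RoverArm.tscn").instantiate()
    add_child_autofree(arm)

    arm.pivot_to(45.0)
    await wait_seconds(0.5)  # give the pivot motion time to settle

    assert_almost_eq(arm.angle_degrees, 45.0, 0.5,
        "the arm should end up at the commanded angle")
```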
At the bottom of the pyramid are unit tests, typically created using test-driven development, such as those for the text parser.
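A parser test of the kind TDD tends to produce could look something like this sketch; CommandParser, its script path, and the shape of its parse() result are assumptions rather than the team's actual API.

```gdscript
extends GutTest

# Hypothetical stand-in for the team's parser script.
const CommandParser = preload("res://scripts/command_parser.gd")

func test_move_command_is_split_into_verb_and_arguments() -> void:
    var result = CommandParser.new().parse("move forward 10")
    assert_eq(result.verb, "move")
    assert_eq(result.args, ["forward", "10"])

func test_unknown_command_is_rejected() -> void:
    var result = CommandParser.new().parse("dance")
    assert_false(result.is_valid, "nonsense input should be flagged as invalid")
```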
The tests can be run from the editor, and they also run headless as a pre-commit hook, taking about 5 seconds, as well as in the team's Continuous Delivery pipeline.
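For teams wanting to try something similar, a pre-commit hook can be very small. The sketch below is not the team's actual setup; it assumes GUT's command-line runner, a Godot 4 binary on the PATH, and a res://test directory, and relies on the runner's non-zero exit code to abort the commit when a test fails.

```sh
#!/bin/sh
# .git/hooks/pre-commit (hypothetical): run the whole GUT suite headless.
set -e
godot --headless -s addons/gut/gut_cmdln.gd -gdir=res://test -ginclude_subdirs -gexit
```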
The Developer Experience and Benefits
The main benefit I heard the students talk about was the ability to spend more of their time on actual development work because they knew instantly when they introduced bugs. It was noticeable to me that they did NOT talk about all the time they wasted because the build was broken, or all the time they spent hunting for and fixing bugs, which in my experience are common features of student (and indeed professional) projects. Instead, they told me that without the safety net of their tests they would not have been able to add so much juice/polish to the game (extra assets vs placeholders, atmospheric fog, camera shake, etc). Dr Gestwicki noted that this was the difference between making something just to pass his class and “making something good”.
The students did not have a dedicated team of manual testers to search for defects they may have introduced – it was all on them. Since they were also acting as artists and designers, any time spent manually hunting for bugs was time taken away from feature development. They estimated that manually testing every commit would have taken 10 to 15 minutes each time, which probably means they wouldn't have tested every commit at all. By contrast, with the entire test suite executing in five seconds on a pre-commit hook, they got the value of instant feedback with no effort on their part beyond writing the tests in the first place as they developed their features.
The test framework they used, GUT (Godot Unit Test), was easy to master. The main challenge was making their code modular enough to be testable. However, they told me that they appreciated this pressure towards modularity, which they felt improved the quality of their code base. I predict it would also make it easier to scale their testing effort if the project evolves further.
I asked the team if writing tests had ever slowed them down or prevented them from achieving their goals. They told me they couldn't think of any cases where having written a test was a negative for them and they said that every single one of their tests has failed at some point.
I find it interesting that in a student project, the epitome of a small-scale effort, the team found incredible value in the tests they wrote. Too often we hear that small projects don't need automated testing; perhaps they can get by without it, but as this team so competently demonstrates, they certainly benefit from it. It is too common to consider only the cost of writing automated tests while neglecting the many benefits they bring.
The students were also very disciplined about fixing bugs as soon as they were found, carrying at most two bugs at any one time. They had to be, or their tests would fail. This is another clear benefit of working with an automated safety net: there is pressure to avoid carrying a large, risky backlog of defects that must be fixed prior to release. The ability to release at will offers more predictability and more options to the game production effort.
Pragmatism
While they have an excellent quality mindset, the team does not slavishly add tests for everything, but considers the expected return on investment. While they say they wish they had written more tests, I think it's important to note that they have clearly written enough to exceed the critical mass necessary for the approach to deliver huge benefits. Testing does not have to be perfect to be worthwhile.
If anything, they told me, it was their linter that they found annoying, not their tests. To me that indicated that they were seeing value from their tests but not from some of the linter rules they had set up, suggesting that it might be sensible to loosen some of the stylistic requirements. It is always important to find the set and cadence of validations that most empowers the development team.
I asked the team if they had ever been tempted to disable any of their tests. They recalled a time when they were adding a camera shake feature and the accelerated time step of their end-to-end test was causing a lot of failures. They were tempted to disable the test or reduce the time-step acceleration, but they realized that the test was exposing genuine but hard-to-reproduce issues, so they decided to fix them instead.
Conclusion
To me, the success of these students is more evidence that these techniques work, whatever the size of the game project. For teams who value the sustainable and humane delivery of fun, high-quality games, all it takes is a little discipline and some basic practices to radically change the development experience for the better. I'm so glad to see these practices taking hold in higher education, and I'm sure the practitioners they produce will shape the future of the industry.
Key points
No game is too small to benefit from test automation.
Test automation doesn’t have to be hard.
The main function of automation is to make development easier and faster, not to make it harder. As a side effect, it promotes code modularity and bug hygiene.
Discipline and attention to ROI are the keys to success.
Thanks for reading, and if you’d like to see the full interview, please check it out on YouTube.
Info Box
Name: Mission Rovee
Timeline: The project was under active development for 9 months, with each team member committing 9 hours per week.
Team: Sphere Province Games (Ball State University)
Members:
Jared Bowman - Developer
Andrew Everage-Scheible - Developer
Jeffrey Harmon - Developer
Victoria Moon - Developer
Tommy Nguyen - Developer
Pandora Roberts - Developer
Harmonic Legion (Robin Walma) - Music Production