Knoxville, TN

Game Interface Design Gotchas

December 12, 2016

Over the course of the last two Ludum Dare jams (and subsequently developing Shifty Shapes and Retrofuture into full-fledged games), I’ve started thinking about user interface as a significant part of the design process. I won’t say it’s a grand unified theory of inputs, more like a checklist of where I’ve been bitten in the past.

Consider your platform

Assuming you’re designing a fairly standard game that you can build and test on PC (i.e., not using features like multitouch, GPS, accelerometer, etc.), your input scheme is probably going to fall into one of three categories: (1) pressing buttons or keys, (2) clicking/tapping specific points on the screen, or (3) doing both.

It sounds simple, but this is a choice that you make as a designer, and one I often gloss over. My early Ludum Dare entries started as a set of game concepts, and I built whatever control scheme made sense on PC, because that’s where I spent all of my time in those first 48 hours.

The problem is, not every platform supports every control scheme equally. Viable cross-platform games don’t just happen because you used a tool like Unity or GameMaker that builds to every platform.

True, you could retrofit an on-screen gamepad, click-to-move, or a thumbstick cursor, but that might mean rewriting huge portions of your input code. It’s also a different experience from your original control scheme, and it may not balance well with your current gameplay. It’s helpful to understand this upfront and build it into your design, rather than considering it last-minute. For example, I was happy with Upper Crust, an overhead action-puzzle game: you used the keyboard to move your character, and clicked on terrain to cast spells.

I really wanted to do more with it, but I got bogged down rewriting the spell-casting code to work with a gamepad (which seemed the simplest way to go, since I was targeting Android TV for release). Spending that time on other game projects felt more productive, and that killed my interest in taking the existing code apart piece by piece and rethinking the interactions.

Right now, I tend to target mobile platforms when I design. That’s a conscious decision; I want to be able to point at a ubiquitous platform, but I don’t feel like my games have enough content to be Steam-worthy. That means I design from the perspective of being on a touchscreen: no keyboard, and no right-click. (It doesn’t mean these can’t be available as shortcuts on PC, I just don’t want my game to depend on them.)

Toasts and ads

Shifty Shapes was originally designed as a single-screen game.

When that screen was a PC that wasn’t connected to any game services, it certainly felt like it would work on mobile; actually making it work there was more complicated.

The real challenge with Shifty Shapes was that it required a certain board size to play: too few tiles, and it just wouldn’t have been fun. And those tiles had to be a certain size, because they couldn’t be any smaller than your finger when displayed on a phone touchscreen.
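To put rough numbers on that constraint: Android’s Material guidelines recommend touch targets of at least 48dp (about 9mm), which puts a hard cap on how many columns a board can have. A back-of-the-envelope sketch, with made-up screen numbers rather than Shifty Shapes’ actual ones:

```java
// Back-of-the-envelope check: how many finger-sized tiles fit across a
// phone screen? Numbers here are illustrative, not from the game.
class TileFit {
    /** Returns how many minimum-size touch targets fit across the screen. */
    static int maxColumns(int screenWidthPx, float pxPerDp, int minTileDp) {
        int minTilePx = Math.round(minTileDp * pxPerDp); // e.g. 48dp * 3.0 = 144px
        return screenWidthPx / minTilePx;
    }
}
```

On a typical 1080-pixel-wide phone at 3x density, that works out to only seven columns of finger-friendly tiles; anything wider has to shrink below the touch-target guideline.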

By the time I started integrating Google Play Game Services, I realized I hadn’t accounted for the achievement “toasts” that pop up on screen–and I hadn’t realized how big they were. I got a complaint during beta testing that toasts prevented you from clicking certain areas of the game board.

Luckily, you can reposition GPGS toasts (although there’s no direct call in the GPGS Unity package for this; you have to use the Android Java classes to do it). Putting them in the bottom-right is better, but they still cover up one or two tiles in the corner.
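For reference, the Java side of that looks something like the following. This is a hedged sketch assuming the 2016-era, GoogleApiClient-based Play Games API; from Unity you’d reach it through a plugin or `AndroidJavaClass` rather than calling it directly:

```java
// Sketch: reposition Play Games popups, assuming the GoogleApiClient-era API.
// Call after the API client has connected, from your Android activity.
import android.view.Gravity;
import android.view.View;
import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.games.Games;

class ToastPosition {
    static void moveToastsToBottomRight(GoogleApiClient apiClient, View contentView) {
        // Anchor achievement "toasts" to the bottom-right instead of the default top-center.
        Games.setGravityForPopups(apiClient, Gravity.BOTTOM | Gravity.END);
        Games.setViewForPopups(apiClient, contentView);
    }
}
```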

After release, I realized it would have been nice to leave room for ads, because this would be a great free-with-ads game. At this point, making that fit would require reworking my entire interface.

In contrast, one of the first things I did with Retrofuture was implement drag-to-scroll. It was complex to get it to play nicely with the rest of the game interface, but it was worth it. (And it was much less of a headache to integrate since I did it before writing any of the other interface code.) Screen real estate will be a little more flexible when I start wanting to integrate services. And, more importantly, the screen size doesn’t impose a hard limit on the size of my sprites or puzzles.
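As a rough illustration of the kind of logic involved, here is a minimal drag-to-scroll sketch with the camera clamped to the board edges. All names and numbers are mine, not Retrofuture’s actual code:

```java
// Hypothetical drag-to-scroll sketch: the camera pans opposite the pointer
// delta, clamped so it never scrolls past the edge of the world.
class DragScroller {
    private double cameraX, cameraY;  // current camera offset in world units
    private double lastX, lastY;      // last pointer position
    private boolean dragging = false;
    private final double worldW, worldH, viewW, viewH;

    DragScroller(double worldW, double worldH, double viewW, double viewH) {
        this.worldW = worldW; this.worldH = worldH;
        this.viewW = viewW; this.viewH = viewH;
    }

    void pointerDown(double x, double y) { dragging = true; lastX = x; lastY = y; }
    void pointerUp() { dragging = false; }

    void pointerMove(double x, double y) {
        if (!dragging) return;
        // Camera moves opposite the finger, so the content follows the drag.
        cameraX = clamp(cameraX - (x - lastX), 0, worldW - viewW);
        cameraY = clamp(cameraY - (y - lastY), 0, worldH - viewH);
        lastX = x; lastY = y;
    }

    private static double clamp(double v, double lo, double hi) {
        return Math.max(lo, Math.min(v, hi));
    }

    double getCameraX() { return cameraX; }
    double getCameraY() { return cameraY; }
}
```

In practice the per-device tuning (drag thresholds, scroll speed, inertia) is where the real work is, which is exactly the part that only shows up when you test on an actual touchscreen.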

Test it on the device

As much consideration as you put into your target platform(s), there’s only so much you can do sitting in front of a PC without a touchscreen.

My initial design for Retrofuture targeted a purely touch interface–no right-clicks or keyboard shortcuts involved. Click a tool in the toolbox to select it, click a tile to drop it. Click “X” in the toolbox and then click a placed tool to delete it. Seemed simple, especially when using it with a mouse.

I exported it to Android soon after LD36 was over, and found it wasn’t so simple after all. If you misplaced something, moving back to the toolbox for the “X” was awkward. And as much testing as I’d done on drag-to-scroll speed with a mouse, it too felt “off” on a phone touchscreen.

To make matters worse, I’d made these design decisions to the exclusion of PC; several comments on Ludum Dare mentioned that keyboard shortcuts or right-click deletion would make things much easier.

Ironically, the best result came from considering both platforms at the same time while respecting the differences between them. I changed it so clicking an empty tile would place the selected tool, while clicking a tile where the selected tool had been placed would delete it. This made the awkward round-trip to the “X” icon simpler for both touchscreen and mouse users.
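That place-or-delete toggle boils down to just a few lines. Here is a hypothetical sketch of the idea, with invented names rather than Retrofuture’s actual code:

```java
// Sketch of the place-or-delete toggle: clicking an empty tile places the
// selected tool; clicking a tile holding that same tool deletes it.
import java.util.HashMap;
import java.util.Map;

class Board {
    private final Map<String, String> placed = new HashMap<>(); // "x,y" -> tool name

    void click(int x, int y, String selectedTool) {
        String key = x + "," + y;
        String existing = placed.get(key);
        if (existing == null) {
            placed.put(key, selectedTool);   // empty tile: place
        } else if (existing.equals(selectedTool)) {
            placed.remove(key);              // same tool: toggle off (delete)
        }
        // A tile holding a *different* tool is left alone here; whether to
        // ignore, swap, or delete in that case is a separate design choice.
    }

    String toolAt(int x, int y) { return placed.get(x + "," + y); }
}
```

The nice property is that the same gesture works identically for a fingertip and a mouse pointer, with no round-trip to a toolbar.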

I also added right-click deletion of any tool. While I’d avoided platform-specific controls during the initial design, the real danger was making a platform-specific control the *primary* way to do something. Since deletion now worked well on both mouse and touchscreen, adding a secondary deletion method as a “bonus” for mouse users wasn’t so bad.

All of this is to say: I’d never have made these changes without first testing the game on a touchscreen. I trusted that the mouse was a suitable stand-in, and assumed the only real difference between mouse and touchscreen was a few missing keys.