Using Source Control with RPG Maker VX Ace

I mentioned in my intro to RPG Maker scripting post (and in the panel that’s based on it) that you can use source control systems like Mercurial or Git with RPG Maker VX Ace, allowing you to take periodic snapshots of your work.

(First, you’ll want to read a tutorial about how your chosen source control system works, such as hginit.com for Mercurial. This post will make a lot more sense once you do.)

What I neglected to mention is that there are two big, easy-to-forget gotchas with this setup.

Binary vs. Text Files

Most source or version control systems are built with two assumptions:

  • you’re storing text files, such as source code, which can be compared in a meaningful, human-readable way (as opposed to binary files like images, which you can’t easily compare)
  • you’re working side-by-side with other users, and you’ll periodically need to merge your work (which, again, you can only safely do with text data)

VX Ace stores its data in binary rather than text files, so both of these assumptions are right out the window. (MV uses text files, so it’s more source control-friendly.)

You can use source control for collaboration, but you’ll need to coordinate your saves and updates carefully, ensuring you’re never working on the same files at the same time. There’s no way to merge changes within the same file, so one of you will lose your changes.

Steam Cloud and Source Control Conflicts

The point of using source control is that, if you screw something up, you can easily rewind your project to a working backup.

However, there’s one easy-to-miss configuration setting that will prevent you from actually being able to do this.

So what’s the problem?

Let’s walk through setting up a new RPG Maker VX Ace project in Mercurial using TortoiseHg. (Using Git is similar, although the terminology is different.)

This is sort of the long way around–I could simply point you to the button that will fix this issue–but I think it’s helpful to understand what’s going on behind the scenes here.

Find your project directory in Windows Explorer, right-click, and go to “Create Repository Here”:

Create the repository, and then in the TortoiseHg Workbench, mark all files in the project (except Game.exe, which is the built game) as added to the repository.

Mercurial (and most other source control systems) assume you don’t want any files tracked unless you explicitly say to “add” them. (Likewise, they won’t remove them from the repository unless you explicitly say to “remove” them.)
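If you’d rather Mercurial ignore Game.exe entirely (so it never even shows up as untracked), one option is an ignore file. Here’s a minimal sketch, assuming Game.exe sits in the project root, saved as .hgignore next to it:

syntax: glob
Game.exe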

Once that’s done, commit your project to the repository:

If you’re backing your work up to an online service like Bitbucket or Github, you’d push your commit at this point.
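(For reference, if you ever want to skip the GUI, the equivalent Mercurial command-line steps look roughly like this–the commit message is just an example:)

hg init                        # same as "Create Repository Here"
hg add                         # mark all untracked, non-ignored files as added
hg commit -m "Initial commit"  # take the snapshot
hg push                        # send it to Bitbucket/Github, if you've set one up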

Then, you’d go back to working. So, for example, I might add an island to the default ocean map:

Let’s say that I’ve hit a stopping point, and want to make another snapshot of my work. I’d save my project in RPG Maker VX Ace, and then go back to TortoiseHg Workbench and refresh my files.

You’ll see that three files have changed (“M” for modified): Map001.rvdata2, System.rvdata2, and Game.ini. Game.ini is a text file, so I’ll see my changes, but the .rvdata2 files are binary–whatever I commit is all-or-nothing.

Once I commit, I’ll see two commits in the panel above. My newest commit, with the message “Changed the map,” is the tip (that is, the latest commit) in the default branch (because I could technically commit multiple versions side-by-side). When I click on each commit, I can see which files changed.

I can also right-click on a commit and Update to it, reverting my project folder to the state it was in when I committed that snapshot. (I might want to do this if, for example, I broke something in the Script Editor and couldn’t fix it.)
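(On the command line, that rewind is a single command–assuming the snapshot I want is revision 0:)

hg update -r 0    # revert the working directory to the first commit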

After I Update, I need to close and re-open RPG Maker to reload my project–it doesn’t know that TortoiseHg updated my files on disk.

TortoiseHg’s Graph will now show that my Working Directory (that is, the current version of the project on disk) is built off of the files in my first commit. It will also warn me that this isn’t the most current version I have in source control. (If I were trying to recover from an error I made in that commit, this might be what I want to do.)

And here’s the gotcha

So, let’s recap: using TortoiseHg, I’ve essentially rewound history, reverting the project files on my hard drive to the state of a fresh, new RPG Maker VX Ace project.

The island that I created should be gone, replaced with the default ocean map that RPG Maker creates in that new project.

And so, I open RPG Maker again, and…

Wat.

At this point, if you were trying to recover from an error, you’d be screwed, because you couldn’t get to that backup. And you’d probably think I sold you a bill of goods on this whole complicated source control ordeal.

So what happened?

Steam Cloud happened. RPG Maker VX Ace saves its files to Steam Cloud so that you always have a backup–which would otherwise be useful.

When I opened RPG Maker, it checked Steam Cloud and made sure I had the latest copy I’d saved. Of course, this isn’t what I want; I deliberately want to go back to an old backup of the project.

From my experimentation, it looks like this might even overwrite the .hg folder in my project, which is where Mercurial’s magic lives–all of my snapshots and related information are stored there. If the .hg folder goes away, I’ve lost all of my backups.

Turning off Steam Cloud on a project

By default, RPG Maker enables Steam Cloud on every project. You can find these options under File > Manage Projects:

This will show me a list of all RPG Maker VX Ace projects I’ve ever worked on:

It’s hard to see, but the projects on Steam Cloud have a little cloud icon and show the size of the project under the Cloud column. Projects not tracked by Steam Cloud have a little hard drive icon and show the size of the project under the Local column.

If you select your project and click “Delete From Cloud,” RPG Maker will delete whatever is saved on your Steam Cloud and default to using your local copy. Remember, you’ll need to do this every time you start a new project–and you should probably do it before you create your repository.

If you decide you want to stop using Mercurial or Git, you can simply delete the .hg or .git folder from your project and click “Save on Cloud” to go back to syncing over the internet.

Conclusion

If you decide to try working with source control, hopefully this saves you some pain.

Source control was something that took me a long time to understand and even longer to really appreciate, but once I did, it became indispensable. (It’s way more convenient than emailing files back and forth, or making multiple copies of your project folder.) If you want to get deeper into game development or programming of any sort, it’s worth trying out sooner rather than later.

Upcoming Panels (April 2017)

MTAC, April 14-16, Nashville, TN

  • Otaku Board & Card Gaming (Friday 4pm, Panel B): A discussion of tabletop games that were either originally designed in Japan or have otaku-related themes. We’ll also talk a bit about the types of mechanics/genres these games fit into.
  • RPG Maker Scripting Crash Course (Sunday 10am, Panel D; 2:45pm, Panel A): Learn the basics of customizing your RPG Maker VX Ace game using Ruby scripting.

MomoCon, May 25-28, Atlanta, GA

  • Otaku Tabletop Gaming (Saturday 10pm, Panel 406): A discussion of tabletop games that were either originally designed in Japan or have otaku-related themes. We’ll also talk a bit about the types of mechanics/genres these games fit into.
  • Game Dev for Fun (and not profit) (Friday 10am, Panel 406): How to get into game development as a hobbyist. We’ll focus on some examples of 2D games in Unity (a cross-platform game dev tool), other tools that are available (some free), and how to find the support and motivation you need to get involved.

RPG Maker VX Ace scripting: Thinking through a UI change

My last post was basically an info dump on what I’d learned about RPG Maker Ruby scripting during Ludum Dare 37. From the comments I got on it, I realized “info dump” is not an exaggeration–it’s literally a bunch of abstract, raw information without any examples, and so it’s confusing if you haven’t messed with it before.

So, let’s walk through how we’d think through a very simple change.

Continue reading

Instantiating a Procedurally Generated Platformer in Unity

This is a follow-up to my last post, Building a Procedurally Generated Platformer, which (not being a disciplined blogger) I’d sort of forgotten about until a comment about an off-hand remark I’d made about another blog post popped up in my RSS feed. Then (again, not being a disciplined blogger), I forgot about that half-finished post until I started thinking about writing a new post and checked my WordPress admin.

In the last post, we discussed one way you could randomly generate a two-dimensional array of values representing ground, platform, and trap tiles. For example, it might look like this:
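The encoding here is hypothetical (the exact values depend on your generator), but in C# a chunk could be a simple two-dimensional array:

// 0 = empty, 1 = ground, 2 = platform, 3 = trap (an arbitrary encoding)
int[,] chunk =
{
    { 0, 0, 0, 2, 2, 0, 0, 0 },
    { 0, 3, 0, 0, 0, 0, 3, 0 },
    { 1, 1, 1, 1, 0, 1, 1, 1 },
};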

The trick, of course, is to convert each of these chunks into actual Unity objects, and to do so in a way that performs well.

Because we’ll be adding new chunks as needed, we want to make sure that instantiating a large, costly chunk doesn’t freeze play. And because we want to restart the level quickly (without reloading the scene), we want to minimize the cost of creating new game objects.
Continue reading

Game Interface Design Gotchas

Over the course of the last two Ludum Dare jams (and subsequently developing Shifty Shapes and Retrofuture into full-fledged games), I’ve started thinking about user interface as a significant part of the design process. I won’t say it’s a grand unified theory of inputs; it’s more like a checklist of where I’ve been bitten in the past.
Continue reading

Building a Procedurally Generated Platformer

Recently, I’ve been playing around with a “procedurally generated endless runner” game concept in Unity. It’s really meant to be a set piece for Knox Game Design–a multiplayer game that we can show off that’s both quick to play and has a lot of replay value. Here’s an explanation of how I accomplished that procedural generation. Continue reading

Ludum Dare 35 “Shifty Shapes” Post-Mortem

Here’s a quick rundown of the ups and downs of my compo entry, Shifty Shapes, which was written in Unity. (You can play it on itch.io.)

What went well

Concept

Shifty Shapes

Usually I make some notes about the top themes in each round of voting, but this time I went in cold. (Honestly: I have a few different game ideas floating around in my head right now, so I was open to doing something off-the-cuff.)

Once I decided to go with the “shape” wordplay, I had all of the rules for the game written down within minutes.

The sliding concept was inspired by a board game called Ricochet Robots that I played in analog gaming at Geek Media Expo, plus standard match-3 mechanics which I’d already figured out an algorithm for in Ludum Dare 30.

Animation

I knew I wanted this game to have some nice visual effects, since I envisioned it as one of those shiny mobile puzzle games.

Fortunately, there weren’t a lot of moving pieces in the concept, and I started building the effects early (even before I’d replaced the placeholder art).

Animation is something I don’t tend to think about (or I think about so late in the jam that it’s complicated to implement), but a little bit seems to go a long way.

Bouncing UI elements and blocks/items flying to their respective counters were easy to implement. I feel like my biggest win was making the item and score counts only increment when the block/item actually reached them.

Music

The main riff was based on something I’ve played around with before on guitar–variants of C, with Am and Em thrown in, followed by a quick “bridge.” It worked pretty well, which meant I only spent about 10 minutes working out the tune, leaving most of Sunday for recording and mixing.

Unity UI

Because I wanted new blocks to fill in like a circular progress bar, I ended up having to mess with Unity UI early (as it supports “filled” images). I’ll confess, it’s something I’ve avoided for the longest time, because it’s intimidating.

Rather than mess around with the large block of numbers that don’t seem to change anything (Unity just recalculates the X and Y positions when you change them), I’ve preferred to stick to world space, Camera.orthographicSize and Camera.aspect, 2D Toolkit, etc.

However, I feel like I’ve made serious progress learning Unity UI thanks to this game. And because I didn’t need to dip into 2D Toolkit for text, I think this is the first Ludum Dare where I’ve had no Asset Store dependencies.

Coding

There are few things more satisfying as a coder than being able to call a “Reset” method after a Game Over screen (as opposed to “screw it, I’ll just reload the scene”).

What didn’t go well

Block pop-in effect

One of my big plans for visual effects was to have blocks “fill in” like a pie chart. That’s actually pretty complicated if you’re using Unity’s Sprites, and I spent more time than I’d like trying to make it work. Once I realized sprite masking was going to require shader code, I gave up on this approach.

Even though I was reluctant to mess with Unity UI, its Image component supported this exact feature, and I got the effect I wanted. I still wish there were a way to do this via Sprites. (Although, I did do this first thing on Friday night so I could budget my time.)
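The code side of the effect can be tiny. Here’s a hypothetical sketch (the names are mine, not from the game), assuming the Image component’s type is set to Filled with a Radial 360 fill method in the Inspector:

using UnityEngine;
using UnityEngine.UI;

// Animate a UI Image's radial fill from empty to full, like a pie chart.
public class BlockFillIn : MonoBehaviour
{
    public Image blockImage;          // the Image set to Filled/Radial 360
    public float fillDuration = 0.5f; // seconds to fill completely

    float elapsed;

    void Update()
    {
        elapsed += Time.deltaTime;
        // fillAmount runs from 0 (empty) to 1 (fully drawn)
        blockImage.fillAmount = Mathf.Clamp01(elapsed / fillDuration);
    }
}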

Graphics

I’ve been playing around with Krita recently, and because of some of its pen and paint effects, I initially picked it over GIMP. However, the art I created didn’t feel right–it had a dark, hard-contrast feel.

This wasn’t Krita’s fault; I just wasn’t familiar with it. It was definitely a case of thinking a tool based more on physical media would magically produce something “painterly” without me having to understand how it worked. I ended up redoing everything (except the cloud particles) in GIMP, and was pretty satisfied with the result, despite spending a lot of extra time on unused art assets.

Music

Because I guess I have something to prove, I decided to try mixing cajon, banjo, and mandolin in with guitar. I can’t tell whether it works or whether it’s all out of tune, because my first impression changes each time I listen to it.

I suspect part of the problem is I changed the speed on the guitar part in Audacity, and then tried to increase the pitch by the same percentage to compensate. Also, I’m not enough of a musician to pull this sort of thing off with consistent quality (which is why you’ll note all of the non-guitar tracks are mixed very low and silenced in places).

Build and release

To be fair, it went OK, but this is the first time I’ve been frustrated with myself for not planning in advance.

I always go in building for the default Unity 960×600, only to remember on Sunday afternoon that it’s slightly too large for the LD site’s embed (and even a little awkward in the browser alone). I really need to standardize on a resolution and aspect ratio before I go into a jam. I tend to just jump in the preview window and forget about this part.

Also, given that WebGL builds seem to take somewhat longer than Unity plugin builds did, I need to start planning for this in advance. Luckily, I did a release to itch.io on Saturday, so I was able to iron out some WebGL issues with particle effect sprites early, but it would’ve been stressful if I had waited until Sunday.

2D Platforming Mechanics in Unity

Building a platformer has been something of a holy grail for me since I first started tinkering with Unity. I grew up playing NES games like Super Mario Bros., so in some ways platformers are what I think of when I think of video games.

It’s not hard to find third-party utilities to help with this, depending on what level of functionality you want and what price you’re willing to pay.

Eventually, however, you’re going to have to bite the bullet and learn what’s going on inside of the black box. Either your third-party script’s settings are just plain confusing, or you want it to interact with the world in a way it wasn’t designed for. (For me, it was trying to build Castlevania II-style stairs.) Even if you don’t roll your own, digging into the guts of a platforming script isn’t exactly straightforward.

The Basics

A basic platformer is going to have three mutually exclusive states at a minimum: Grounded, Jumping, and Falling.

Grounded is what the user perceives as the “default” state. In reality, it’s the state that the character is in only when its lower edge is colliding with a surface. Typically, a player can only jump when they’re in a grounded state (or, in the case of double jumping, the jump limit would be reset as soon as the character is grounded again).

Jumping is the state where a character is moving upwards into the air. It’s an active state–that is, it’s usually triggered by player or AI intervention, not simply interaction with the environment like Grounded or Falling. It’s also a temporary, limited state, which ends when a maximum height is reached or a timer runs out.

Jumping can also end abruptly due to collision. If the character hits their head–that is, their upper edge collides with a surface, the opposite of the check done for grounded–it’s naturally expected that the player immediately start falling.

In games where the player has more control over their character, jumping can end early if the player releases the jump button. This allows for both quick, short jumps and long, floatier jumps.

Falling is actually the character’s “default” state–when they’re neither in a jump nor colliding with the ground. When falling, the character is affected by gravity. Falling ends when the character becomes grounded again, based on collisions with their lower edge.

Thinking in frames

While it may not seem obvious, there are only a few possible state transitions in a typical platformer. You can’t go directly from Jumping to Grounded without encountering a case that would trigger Falling. You can’t go from Falling to Jumping without first encountering a case that would trigger Grounded. Because you can rule out certain state transitions, you can simplify your logic.
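As a sketch of that idea–the state names match the ones above, but the flags and structure are illustrative, not from any particular controller:

// Three mutually exclusive states and the legal transitions between them.
public enum PlayerState { Grounded, Jumping, Falling }

public class PlayerStateMachine
{
    public PlayerState State { get; private set; } = PlayerState.Falling;

    // Run once per frame, after the environment checks described below.
    public void Tick(bool isGrounded, bool hitHead, bool jumpPressed, bool jumpExpired)
    {
        switch (State)
        {
            case PlayerState.Grounded:
                if (jumpPressed) State = PlayerState.Jumping;      // active: player/AI input
                else if (!isGrounded) State = PlayerState.Falling; // walked off a ledge
                break;
            case PlayerState.Jumping:
                // Jumping never goes straight to Grounded; it always ends in Falling.
                if (hitHead || jumpExpired) State = PlayerState.Falling;
                break;
            case PlayerState.Falling:
                if (isGrounded) State = PlayerState.Grounded;
                break;
        }
    }
}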

I say this isn’t obvious because I’ve never been a hardcore twitch-gamer who can pick out specific frames of animation. To me, playing a Mario Bros. game is a smooth experience–I take in the “big picture,” so to speak. In reality, it’s a series of very tiny moments where the computer is performing very tiny updates and then making decisions based upon them.

This is tricky because it’s easy to work out a “good enough” platformer controller that fails in odd situations if you don’t consider this frame-to-frame precision. For example, 99% of the time the character might behave correctly–but if you end up off by just a fraction, you’ll get weird behaviors. Precision is possible–you just have to figure out how to make it work. One tip is to be aware of Unity’s lifecycle.

Testing for state changes

So how do we detect when the player changes states?

As a Unity newbie, my first attempted solution was to detect collisions with the ground or other platforms using colliders. This isn’t a bad instinct. Colliders take the hard work out of hit detection by turning collisions into events. You don’t have to constantly check for a state–you simply respond to enter, stay, and exit events.

For detecting Grounded/Falling states, this can actually be a pain. Each collision is handled as a discrete event, so you don’t know whether an OnCollisionExit event means you’ve actually left the ground, because that collider doesn’t know whether other colliders are still firing OnCollisionStay and OnCollisionEnter events. Technically, it should be possible to track this, but that’s a lot of bookkeeping, and it’s highly dependent upon the order the events are called in (relative to the Unity lifecycle).

Not to mention that it’s hard to get an accurate picture of which direction a collision came from without multiple colliders. That means it’s not easy to differentiate running into a wall (or an enemy) from hitting the ground. Believe me, I’ve tried testing the collision’s relative velocity, and I’ve never felt like I could rule out all the false positives.

Raycasting

Unity has various physics casting functions (some of which, I suppose, aren’t technically raycasting) that all do variants of the same thing: they test a specified area for colliders.

Currently, I tend to lean on Physics2D.OverlapCircleAll, which returns all colliders within a given radius of a point. It returns a set of Collider2D objects that can be used to determine whether any ground layers are hit.

I attach a couple of empty GameObjects as test points, which I then pass into my controller script. For example, in the screenshot here I use two ground points and one head point (highlighted with red and green icons, respectively). Add as many as you need based on the size of your cast area–you want to avoid any case where a character could technically be over a walkable object without registering a hit.

In my last Ludum Dare game, I used something like this:

var colliders = Physics2D.OverlapCircleAll(transform.position, 0.1f,
     LayerMask.GetMask("Terrain"));
// Grounded if the cast found at least one collider on the Terrain layer
bool isGrounded = colliders.Length > 0;

You call this each Update or FixedUpdate, depending on how you’re adjusting your physics. Test for grounded (or whether a character hits their head in a jump) before you do anything else, because it’s going to determine what state transitions you allow, what movement you apply to the character, and what you allow the player or AI to do. (That is, the character must react to the environment before they can react to the player or AI’s input.)
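Putting that together, the per-physics-step ordering might look something like this (a sketch with hypothetical names, not my exact controller):

using UnityEngine;

public class PlatformerChecks : MonoBehaviour
{
    public Transform groundPoint; // empty GameObject at the character's feet
    public Transform headPoint;   // empty GameObject at the character's head
    public float checkRadius = 0.1f;

    void FixedUpdate()
    {
        // 1. React to the environment first.
        bool isGrounded = Touching(groundPoint);
        bool hitHead = Touching(headPoint);

        // 2. Only then work out state transitions, movement, and input
        //    (see the state machine sketch above).
    }

    bool Touching(Transform point)
    {
        return Physics2D.OverlapCircleAll(point.position, checkRadius,
            LayerMask.GetMask("Terrain")).Length > 0;
    }
}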

Let me give a disclaimer: I really need to work out a better approach. I suspect there are some issues with my circular cast not lining up perfectly with my character’s box collider, and there’s probably a more performant approach than returning all collisions. Still, this should be a good starting point, although I encourage you to try out some of the other methods in Physics2D (or Physics, if you’re using 3D colliders) that may work better for your particular setup.

Admittedly, tinkering is difficult. This stuff is happening at such small intervals, sometimes at varied points in the Unity lifecycle (Update vs. FixedUpdate vs. LateUpdate vs. coroutines). And you can’t easily visualize it without writing your own editor gizmo code. As a newbie, I’ve found there’s a certain level of blind trust (sometimes misplaced) involved.

This is also a case where setting up your physics layers matters: if you do it right, you should minimize the amount of “testing” you have to do with the results.

Note that while I’m using raycasting to determine the player’s overall state, I’m also relying heavily on Unity’s physics and hit detection to handle movement. There’s still a collider on my player character and it still blocks movement when it comes into contact with something.

I use OverlapCircle because I don’t really care how far the character is from the ground, only that it’s near enough to a surface to register as touching. If I didn’t have a collider attached to the character, I’d want to know exactly how far the closest ground object was so that I could adjust falling speed so as not to overshoot.

Speaking of which, that brings us to…

Movement

There are a couple of ways you can handle movement in Unity. For the most part, you should technically be able to get similar results with any of these, assuming you can figure out how.

The approach you take determines how much of Unity’s built-in features you can use (versus how much you have to write yourself) and what mode you’ll have to think in.

You can set an object’s transform position directly. This is sort of a brute-force approach. Triggers and colliders will still fire, but you don’t have the benefit of continuous collision mode, unless you code it yourself. In mathematical terms, this means you have to think about position at a given time, or f(x). (Realistically, you’ll probably be working out speed/velocity, or f'(x), and applying it.)

You can set a Rigidbody’s velocity directly. This is my preferred approach, because it means I don’t have to think too much about physics, but I still get the benefit of the physics system adjusting for collisions. In mathematical terms, this means you’re only working with the player’s velocity at a given time, or f'(x).

Typically, I set Rigidbody velocity in FixedUpdate as long as there’s a change in velocity required from the last FixedUpdate call. I set drag and gravity scale to 0, because I don’t want anything else tinkering with velocity if I can help it. Because Unity velocity represents units per second, it’s possible to use Time.fixedDeltaTime to work out just what velocity needs to be.
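Here’s a minimal sketch of that approach–the field names are mine, and it only handles horizontal movement, assuming something else (like the state machine above) decides vertical velocity:

using UnityEngine;

public class VelocityMovement : MonoBehaviour
{
    public float moveSpeed = 5f; // units per second

    Rigidbody2D body;
    float inputX;

    void Awake()
    {
        body = GetComponent<Rigidbody2D>();
        body.drag = 0f;          // nothing else should tinker with velocity
        body.gravityScale = 0f;  // apply gravity manually, if at all
    }

    void Update()
    {
        inputX = Input.GetAxisRaw("Horizontal"); // read input every frame
    }

    void FixedUpdate()
    {
        // Velocity is in units per second; the physics step applies it
        // and adjusts for collisions.
        body.velocity = new Vector2(inputX * moveSpeed, body.velocity.y);
    }
}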

Note that one reason I like using Unity’s built-in physics and hit detection is I have to think less about exact collision positions. I’ve seen platformer controllers that don’t require colliders, but that means you have to think about not just the character’s current speed, but the distance to the nearest object. This isn’t impossible, but it’s an extra layer of complexity I’d prefer to get into only if necessary.

You can apply force to a Rigidbody. This takes full advantage of Unity’s physics, but it requires a little more work. You’re not only going to have to determine what force you will apply, but you’ll have to tweak the object’s gravity scale, mass, and drag to get it feeling right. In mathematical terms, you’re indirectly working with acceleration, or f''(x).

I’d avoid this on the player character, because you want those controls to feel tight, which is harder to figure out with several degrees of separation between your code and the character’s actual movement speed.

In a later post, I’ll get into more of the math behind movement, since that’s the difference between a clunky game and a smooth one. (I will say that I find Microsoft Mathematics incredibly useful for this sort of thing, since you can easily plot changes in speed or position over time.)