I’ve been working on porting Pulse to mobile devices (Android and iOS). The game was written in Flash, which means it could be deployed to both platforms as an Adobe AIR application. The controls and UI would need some tweaking, but otherwise the game would work as-is. Or so I thought… I deployed Pulse to a Samsung Captivate and found that the frame rate was very poor. I isolated the issue to the use of vector graphics, and the typical optimizations didn’t have any significant impact. Essentially, all the graphics in the game need to be redone with sprites.
I found a couple of cool sprite frameworks for Flash: ND2D and Starling. Starling is especially promising and I spent some time testing it. Then I realized — after several hours trying to figure out why my app wouldn’t run on my phone — that Stage3D, the new API that both Starling and ND2D are built on, is not available on mobile devices yet. On top of that, the effort required to update Pulse to use sprites was starting to look substantial (especially considering how simple the game is).
So, of course, I decided to port the game to Unity instead. My plan was to look at Unity after I finished Pulse anyway, so I figured the experience wouldn’t be a waste, and Unity will let me target more platforms than Flash.
Unity is designed for building 3D games, so the first thing I have to learn is how to use it for a 2D game. I’m finding this tutorial to be very helpful.
I also need to figure out how to handle audio synchronization and latency (by far the greatest challenges of making the original Pulse in Flash). In Flash I had used the excellent StandingWave 3 library. There doesn’t seem to be anything like it for Unity. Luckily, Unity seems to be much better at both synchronization and latency out of the box. When you play multiple sounds simultaneously, Unity will actually play them in sync (unlike Flash), and you can choose to prioritize latency over performance, which is nice.
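To give a sense of what this looks like in practice, here is a minimal sketch (not the actual Pulse code) of Unity’s scheduled-playback approach: sounds are scheduled against the audio system’s own DSP clock, so they start sample-accurately regardless of frame rate. The component name, clip names, and BPM value below are all made up for illustration.

```csharp
using UnityEngine;

// Sketch only: plays two sounds one beat apart on the audio DSP timeline.
public class BeatScheduler : MonoBehaviour
{
    public AudioSource kick;   // hypothetical AudioSources assigned in the Inspector
    public AudioSource snare;
    public double bpm = 120.0;

    void Start()
    {
        // AudioSettings.dspTime is the audio hardware clock, independent of
        // the (variable) frame rate, so scheduling against it keeps sounds in sync.
        double startTime = AudioSettings.dspTime + 0.5; // small lead-in for safety
        double beat = 60.0 / bpm;

        kick.PlayScheduled(startTime);
        snare.PlayScheduled(startTime + beat);
    }
}
```

The latency-versus-performance trade-off mentioned above is a project setting: the DSP Buffer Size option in Unity’s audio settings lets you pick a smaller buffer for lower latency at the cost of more CPU overhead.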
I’m currently working on porting the game logic over from Flash.