Reducing the size of images in iOS apps

For a surprisingly simple app, Keep Calm on iOS has required a reasonable amount of maintenance. Like many iOS apps it uses a lot of images to improve the user experience, and the first version contained over 150 pictures because I allow users to change the crown. From the most recent version onwards there are over 1000 pictures, as users can optionally purchase extras. This has pushed the app download size up from 2MB to around 20MB, with around 19.5MB of that being images alone.

I decided that I wanted to reduce the space the images take up in the app bundle, but without reducing their quality or number.

The first option was to compress all of the PNGs using a tool like pngcrush; however, I’m only storing PNGs with one channel (alpha), so this had virtually no effect. I also considered shipping the original SVG files I had generated the PNGs from, but the addition of an SVG rendering library, saving code and Core Image filters meant that I wouldn’t have seen any major reduction in the bundle size. Finally, I had a look at zipping the files (and tarring them), but this reduced the size by less than 1% because of the pre-existing compression in the PNG files.

The next option I investigated was putting all of the images into one single file, like a sprite sheet. This would mean paying the PNG per-file overhead once rather than a thousand times, whilst maintaining the original quality.

I then wrote some ridiculously simple code for a Mac app that read in all of the files and drew them onto a Quartz 2D canvas (they’re all 300px by 300px at most, so I just drew them in a square grid). This produced a PNG file that was around 18MB, so I didn’t really gain anything.
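
The core of that tool looked roughly like the sketch below; imageURLs, cellSize and outputURL are placeholder names rather than the actual code.

//Minimal sketch of the sheet generator; requires ImageIO (part of ApplicationServices on the Mac)
size_t cols = (size_t)ceil(sqrt((double)imageURLs.count));
size_t side = cols * cellSize; //square grid; cellSize is 300 in my case
CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(NULL, side, side, 8, 0, space, kCGImageAlphaPremultipliedLast);
[imageURLs enumerateObjectsUsingBlock:^(NSURL *url, NSUInteger i, BOOL *stop) {
    CGImageSourceRef src = CGImageSourceCreateWithURL((__bridge CFURLRef)url, NULL);
    CGImageRef img = CGImageSourceCreateImageAtIndex(src, 0, NULL);
    //each image goes into the next cell of the grid
    CGContextDrawImage(ctx, CGRectMake((i % cols) * cellSize, (i / cols) * cellSize, cellSize, cellSize), img);
    CGImageRelease(img);
    CFRelease(src);
}];
CGImageRef sheet = CGBitmapContextCreateImage(ctx);
//write the finished sheet out as a PNG
CGImageDestinationRef dest = CGImageDestinationCreateWithURL((__bridge CFURLRef)outputURL, kUTTypePNG, 1, NULL);
CGImageDestinationAddImage(dest, sheet, NULL);
CGImageDestinationFinalize(dest);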

To rule out poor PNG compression in the NSImage export code, I exported the image from GIMP instead, but there was virtually no change. My images were basically one giant white PNG with an alpha channel, so in theory the file should have been a bit smaller (I was being optimistic; it is around 36 megapixels). I then added a black background to the image and exported it from GIMP without the alpha channel, and magically the file size dropped from 18MB to just under 4MB. This would keep my app size reasonably low and speed up installation, because iOS devices would have to unpack fewer than 200 files rather than over 1000.
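
Incidentally, if you wanted to automate the GIMP step, the same flattening can be done in Quartz by drawing the sheet into a bitmap context that stores no alpha. This is just a sketch, reusing sheet, side and space from the code above:

//flatten onto black by drawing into a context with no alpha channel
CGContextRef flat = CGBitmapContextCreate(NULL, side, side, 8, 0, space, kCGImageAlphaNoneSkipLast);
CGContextSetGrayFillColor(flat, 0.0, 1.0); //black background
CGContextFillRect(flat, CGRectMake(0, 0, side, side));
CGContextDrawImage(flat, CGRectMake(0, 0, side, side), sheet);
CGImageRef flattened = CGBitmapContextCreateImage(flat);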

The next problem was that I would now have to ‘unpack’ all of the extra icons when the user purchased them. It turns out that with a UIImage category you can crop the images out of the original and save them to disk pretty quickly (according to an Xcode log I managed 40 images/second on an old generation iPod touch). I had been concerned that the device would not be able to load such a large image into memory; incredibly, though, I didn’t get any memory warnings when doing so.
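
The category boils down to CGImageCreateWithImageInRect plus a PNG write. A rough sketch, assuming the 300px grid from earlier (the method name is mine, and sheet, c, r and path come from the surrounding loop):

@implementation UIImage (Unpacking)
//crop the 300px tile at (col, row) out of the sprite sheet
- (UIImage *)tileAtColumn:(NSUInteger)col row:(NSUInteger)row {
    CGRect cell = CGRectMake(col * 300, row * 300, 300, 300);
    CGImageRef cropped = CGImageCreateWithImageInRect(self.CGImage, cell);
    UIImage *tile = [UIImage imageWithCGImage:cropped];
    CGImageRelease(cropped);
    return tile;
}
@end

//unpacking then becomes a loop that writes each tile out as a PNG
NSData *png = UIImagePNGRepresentation([sheet tileAtColumn:c row:r]);
[png writeToFile:path atomically:YES];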

The next problem I faced was that I didn’t need the black background in each image. The easy fix is a Core Image filter called CIColorMatrix, which multiplies each channel by a vector and adds a bias onto it. I could multiply all of the RGB values by 0 and add on 1, setting every pixel to white; the new alpha could then just be taken from one of the original RGB values (they’re all equal, since the flattened image is white-on-black).
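
The filter setup looks roughly like this (a sketch, assuming sheet is the UIImage loaded from the bundle; the vector values follow the reasoning above):

CIImage *input = [CIImage imageWithCGImage:sheet.CGImage];
CIFilter *matrix = [CIFilter filterWithName:@"CIColorMatrix"];
[matrix setValue:input forKey:kCIInputImageKey];
//R', G' and B' = 0 * (R,G,B,A) + 1, i.e. every pixel becomes white...
[matrix setValue:[CIVector vectorWithX:0 Y:0 Z:0 W:0] forKey:@"inputRVector"];
[matrix setValue:[CIVector vectorWithX:0 Y:0 Z:0 W:0] forKey:@"inputGVector"];
[matrix setValue:[CIVector vectorWithX:0 Y:0 Z:0 W:0] forKey:@"inputBVector"];
[matrix setValue:[CIVector vectorWithX:1 Y:1 Z:1 W:0] forKey:@"inputBiasVector"];
//...and the new alpha is taken straight from the old red channel
[matrix setValue:[CIVector vectorWithX:1 Y:0 Z:0 W:0] forKey:@"inputAVector"];
CIImage *output = matrix.outputImage;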

I then wrote some new code that loaded the image using UIImage and applied the filter using Core Image before running the same splitting routine. This worked perfectly (and at about the same speed) in the simulator, but I couldn’t get it working on the device: the images would crop, but they were completely blank. I was also getting memory warnings.

From what I can gather, UIImage keeps the original compressed version of the image in memory, which is why it could load a 4MB, 36 megapixel image on the device. Core Image, on the other hand, needs the raw image data and so decompresses it, which meant holding 36 megapixels * 4 bytes per pixel = 144MB of data in RAM. Instead of feeding me back a usable image it just gave up and handed me a blank 36MP one. My final solution is therefore either to split the large image into several smaller ones or to apply the Core Image filter only when the images get displayed.

In conclusion, if you’ve got a large number of small images in your app (probably fewer than 200 per sheet, to keep memory manageable) you could probably reduce the app size significantly by packing them all into one image and unpacking the individual images from that. On the other hand, if you have a small number of large images it is probably best to keep them as individual files so that your app doesn’t crash.


Text in OpenGL on iOS

Over the last couple of days I’ve been playing around with OpenGL for the first time in a while, and I’ve now written a group of classes that make my life a lot easier, so that I don’t have to worry as much about displaying graphics in 2D games. Although the classes aren’t exactly Cocos2D standard, they do show how to set up OpenGL with GLKit and produce complex 2D games without too much overhead.

My current challenge has been displaying text in OpenGL. Text in a game is incredibly useful: you can give the user feedback on performance, score, achievements and simple notifications. I’ve been working on a simple game that needs to display the score at the top of the screen, and in doing so I found that there are at least three good approaches.

The first is to use bitmap fonts, creating vertices for each character and displaying the appropriate part of the font texture at each position (a sketch of the per-character texture lookup follows the list below). An advantage of this method is that it is very quick if you have relatively simple text to display. There are three main problems with it:

  • It wastes time and space: You have to create new font assets from a system font or build a custom font. This is going to take a lot of time without some automation, and it will add to the size of your app bundle.
  • You will likely limit yourself to regular English text: Unless you create fonts for a huge variety of character sets you are limited to ASCII characters, which makes English the only viable language; that’s a problem if, for example, you are displaying foreign names from Game Center.
  • Bad scaling: You have to include your bitmap font at the highest resolution it will ever need to be, and if you are supporting iPhone, iPad, Retina and non-Retina graphics in your game that is going to be pretty large.
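
For illustration, if the glyphs sit in a 16 by 16 ASCII grid in a single texture, the per-character lookup is just arithmetic. A hypothetical helper:

//hypothetical helper: texture coordinates for character c in a 16x16 atlas
static void uvForCharacter(unichar c, GLfloat uv[4]) {
    GLfloat cell = 1.0f / 16.0f; //each glyph occupies 1/16 of the atlas in each direction
    uv[0] = (c % 16) * cell;     //u0
    uv[1] = (c / 16) * cell;     //v0
    uv[2] = uv[0] + cell;        //u1
    uv[3] = uv[1] + cell;        //v1
}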

The second method is to use UIKit elements such as UILabel to present text. This works all right in a 2D scenario where you don’t need to update the text regularly (if you do, enjoy watching your frame rate drop to about 10fps), but Quartz transformations are never going to let you position your text perfectly in a 3D environment. I don’t particularly like this method, but it will work for some people. It is worth noting, however, that it doesn’t suffer from any of the problems the first method does.

Finally, the most effective method in my opinion is to use GLKit to create textures from CGImages that have been drawn with Quartz 2D. This is incredibly advantageous because it means you can use any of the system fonts, with a huge variety of character sets. You can also scale effectively for different screen resolutions, which you simply wouldn’t be able to do with bitmapped fonts. Here is roughly how I’m doing it at the moment:


//Match the screen scale so the texture is crisp on Retina displays
CGSize size = CGSizeMake(100, 100);
float scale = [[UIScreen mainScreen] scale];
UIGraphicsBeginImageContextWithOptions(size, NO, scale);
CGContextRef context = UIGraphicsGetCurrentContext();
//Drawing code, e.g.:
[[UIColor whiteColor] setFill];
[@"Score: 42" drawAtPoint:CGPointZero withFont:[UIFont boldSystemFontOfSize:24]];
//Copy the context into a new CGImage before ending the context
CGImageRef image = CGBitmapContextCreateImage(context);
UIGraphicsEndImageContext();
NSError *error = nil;
GLKTextureInfo *texture = [GLKTextureLoader textureWithCGImage:image options:nil error:&error];
CGImageRelease(image);
self.scoreTexture = texture;

Initially I found this code worked very well; however, as soon as I needed to update the image regularly, my app would go from using 10MB of memory in total (including other sprites) to around 1GB of virtual + real memory before crashing. This seemed very odd, as I was replacing scoreTexture each time and releasing the unneeded Quartz resources. It turns out that OpenGL takes ownership of the texture memory, so you actually have to release the texture with OpenGL functions first:


//Delete the old texture from GL memory...
GLuint name = self.scoreTexture.name;
glDeleteTextures(1, &name);
//...then replace the GLKTextureInfo in memory

Thanks to alokoko on Stack Overflow for pointing this out.

Once you’ve done this you can update the text regularly and draw it on screen easily, although there are still disadvantages. I first tried generating the new texture on a separate thread using NSOperationQueue, but I found it a little unreliable: there was no guarantee of how long generating a new texture would take, so the queue rapidly filled up, which produced further memory errors. Ultimately, the safest approach seemed to be to render a new image up to 10 times a second on a separate thread and then queue an operation on the main thread to swap the texture in memory.
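
As a sketch of that pattern (assuming a serial renderQueue, the scoreTexture property from earlier, and a hypothetical newScoreImage method that does the Quartz drawing shown above):

- (void)updateScoreTexture {
    dispatch_async(self.renderQueue, ^{ //serial background queue, called at most ~10 times/second
        CGImageRef image = [self newScoreImage]; //Quartz drawing as before; returns a retained CGImage
        dispatch_async(dispatch_get_main_queue(), ^{
            //delete the old GL texture before replacing it (see above)
            GLuint name = self.scoreTexture.name;
            glDeleteTextures(1, &name);
            self.scoreTexture = [GLKTextureLoader textureWithCGImage:image options:nil error:NULL];
            CGImageRelease(image);
        });
    });
}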

Of course, there are other approaches and I would be interested to hear if you have any better ones.