In the previous article we used 2D graphics via the Android Canvas to draw a custom UI widget. The Canvas approach is very useful when you need to draw relatively static graphics or when there is not too much complex animation. That is the typical case for UI widgets.
In this article we will develop a live wallpaper that displays a fire simulation that you might remember from the good old days of demo scene and DOS. Since we need very fast graphics, we will use OpenGL as the rendering engine. Although OpenGL is typically used for 3D graphics, it can also be used for hardware-accelerated 2D graphics, such as in this case.
Here’s our plan:
- Understand and implement the fire effect
- Enable OpenGL to render the fire on the screen
- Convert our simulation into a live wallpaper
The source code is available at the bottom of the article (but read on to learn some of its limitations).
Note: The part related to live wallpapers will only work on Android 2.1 or higher.
Step 1: The Fire Effect
I am lucky to remember the old days when CPUs were weak, there were no GPUs or OpenGL, and the only way to achieve a reasonable frame rate in a graphical application was memcpy()’ing into video memory. A typical desktop PC you would write code for was much, much weaker than a Motorola Droid.
Those days are gone, but they left us some interesting tricks and ideas, and some graphic effects that still look cool (at least to a nostalgic geek like me). Among them is the fire effect. It is built on a relatively simple algorithm that nevertheless creates impressive results when done right and tweaked for some time.
So how do you do it? First, imagine that you have an M x N matrix that contains a value in each cell. The values can come from any fixed range, but let’s say they are integers between 0 and 255. This way we can represent our matrix as an array of bytes.
Let’s call this matrix the intensity map. Our fire is 2D, so a cell at (x, y) tells us how intense the fire is at that part of the surface. Obviously, the greater M and N are, the more fine-grained the fire is. But let’s see how we fill and transform the intensity map.
Now, in order to start the fire we do the following: let’s fill the bottom row of the matrix with random dashes of maximum intensity (in our case 255) so that it looks like this:
As you can see, the bottom row is filled with either 0 or 255, and 255‘s occur in series rather than one at a time. How often they appear and how long each dash is affects the resulting fire so you will play with this later. You can also seed several bottom rows rather than one.
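In Java, the seeding step might look like the sketch below. The class name and the dash-length range are my own arbitrary choices, not something prescribed by the algorithm:

```java
import java.util.Random;

class FireSeeder {
    private static final Random RANDOM = new Random();

    /**
     * Fills the bottom row of the intensity map with random dashes of
     * maximum intensity (255) separated by gaps of 0. The run length
     * (2 to 6 cells here) is one of the knobs you can play with.
     */
    public static void seedBottomRow(byte[][] intensity) {
        int bottom = intensity.length - 1;
        int width = intensity[0].length;
        int x = 0;
        while (x < width) {
            // Pick a run length between 2 and 6 cells (arbitrary choice).
            int run = 2 + RANDOM.nextInt(5);
            // Randomly choose a dash of 255s or a gap of 0s for this run.
            byte value = RANDOM.nextBoolean() ? (byte) 255 : 0;
            for (int i = 0; i < run && x < width; i++, x++) {
                intensity[bottom][x] = value;
            }
        }
    }
}
```

Note that only the bottom row is touched; the rest of the matrix is left alone.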
Next, for every row, starting from the row just above the bottom (bottom − 1) and moving up to the top, you do the following:
- Take each cell in the row, left to right or right to left
- Take the current value of the cell, plus the value of the cell on the left, plus the value of the cell on the right, plus the value of the cell directly below this one, and average them (divide by 4)
- Use the resulting value as the new value of the cell
You can repeat this multiple times. What happens is the intensity you seeded at the bottom row becomes smoothed or “smeared” towards the top. You can do the same in Photoshop using the smudge tool.
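A single averaging pass could be sketched like this. Wrapping around at the horizontal edges is my assumption; clamping to the edge cells would work just as well:

```java
class FireSmoother {
    /**
     * One smoothing pass: every cell (except the bottom row) becomes the
     * average of itself, its left and right neighbours, and the cell
     * directly below it. Row 0 is the top; the last row is the bottom.
     */
    public static void smearUp(byte[][] intensity) {
        int rows = intensity.length;
        int width = intensity[0].length;
        // Start just above the bottom row and move toward the top, so the
        // freshly seeded intensity propagates upward.
        for (int y = rows - 2; y >= 0; y--) {
            for (int x = 0; x < width; x++) {
                int self  = intensity[y][x] & 0xFF;
                int left  = intensity[y][(x + width - 1) % width] & 0xFF;
                int right = intensity[y][(x + 1) % width] & 0xFF;
                int below = intensity[y + 1][x] & 0xFF;
                intensity[y][x] = (byte) ((self + left + right + below) / 4);
            }
        }
    }
}
```

Because the pass updates cells in place, the left neighbour has already been averaged by the time a cell is processed; that slight asymmetry is harmless and is why the article says "left to right or right to left".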
At this point you have an intensity map that looks something like this:
..:.........
.:::.....:..
.:#:....:#..
####....###.
Sorry for the wretched illustration, but the idea is that the bottom row contains either 255‘s or 0‘s, while the rows above contain the averaged, smoothed values that naturally decay the further up we go from the bottom row.
Now, if we regenerate (re-seed) the bottom row every X milliseconds, the result will look like randomly appearing flames. But so far we have only been talking about the intensity map, not colors or pixels. Fortunately, it is easy to create a bitmap (pixel map) from the intensity map. Let’s map the values from the 0-255 range onto a color range built so that 0 corresponds to black and 255 corresponds to the lightest color of our fire. The colors in between transform smoothly from black to the lightest, and may also shift in hue (the mix of red, green and blue components) if we want.
(Just a note – you don’t need to clear the entire matrix to re-seed it. You just refill the bottom row with 255‘s and 0‘s like in the beginning, but do not touch the rest of the matrix.)
In this case, our intensity map can be directly mapped into a bitmap of the same size by converting each cell into a bitmap pixel according to the color range we just defined.
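One possible way to build such a color range in Java is shown below. The black → red → yellow → white ramp is just one plausible choice of fire palette; tweak the channel thresholds to change the mood:

```java
class FirePalette {
    /**
     * Builds a 256-entry ARGB palette: index 0 maps to black, 255 to the
     * lightest fire colour. Red saturates first, then green, then blue,
     * which gives the classic black -> red -> yellow -> white fire ramp.
     */
    public static int[] buildPalette() {
        int[] palette = new int[256];
        for (int i = 0; i < 256; i++) {
            int r = Math.min(255, i * 3);
            int g = Math.min(255, Math.max(0, (i - 85) * 3));
            int b = Math.min(255, Math.max(0, (i - 170) * 3));
            palette[i] = 0xFF000000 | (r << 16) | (g << 8) | b;
        }
        return palette;
    }

    /** Converts the intensity map into ARGB pixels using the palette. */
    public static int[] toPixels(byte[][] intensity, int[] palette) {
        int rows = intensity.length, width = intensity[0].length;
        int[] pixels = new int[rows * width];
        for (int y = 0; y < rows; y++) {
            for (int x = 0; x < width; x++) {
                pixels[y * width + x] = palette[intensity[y][x] & 0xFF];
            }
        }
        return pixels;
    }
}
```

On Android, the resulting pixel array can be pushed into a Bitmap with setPixels(), or copied straight to a texture as we do in the next step.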
So here’s how the fire loop works:
- If it’s time to seed (>= X ms elapsed since the previous seed, or it is the first iteration of the loop), then seed the intensity matrix
- Iterate (smear up) the intensity matrix several times
- Fill the bitmap according to the intensity map via the color range
- Draw the bitmap on screen
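The steps above can be pulled together into a minimal, self-contained sketch. All the names here are mine, the seeding and smearing are compressed versions of what was described earlier, and the actual drawing (steps 3 and 4) is left to the rendering side:

```java
import java.util.Random;

/** Minimal end-to-end sketch of the fire loop. */
class FireSim {
    static final long SEED_INTERVAL_MS = 80; // the "X ms"; tune to taste
    private final byte[][] map;              // row 0 = top, last row = bottom
    private final Random random = new Random();
    private long lastSeed = -1;

    FireSim(int rows, int cols) { map = new byte[rows][cols]; }

    /** One frame: re-seed if due (or on the first run), then smear twice. */
    void step(long nowMs) {
        if (lastSeed < 0 || nowMs - lastSeed >= SEED_INTERVAL_MS) {
            seed();
            lastSeed = nowMs;
        }
        for (int pass = 0; pass < 2; pass++) smear();
        // Steps 3 and 4 (fill the bitmap, draw it) happen in the renderer.
    }

    /** Fills the bottom row with random dashes of 255s and gaps of 0s. */
    private void seed() {
        byte[] bottom = map[map.length - 1];
        for (int x = 0; x < bottom.length; ) {
            int run = 2 + random.nextInt(5);
            byte v = random.nextBoolean() ? (byte) 255 : 0;
            for (int i = 0; i < run && x < bottom.length; i++, x++) bottom[x] = v;
        }
    }

    /** Averages each cell with its left, right and below neighbours. */
    private void smear() {
        int w = map[0].length;
        for (int y = map.length - 2; y >= 0; y--) {
            for (int x = 0; x < w; x++) {
                int sum = (map[y][x] & 0xFF)
                        + (map[y][(x + w - 1) % w] & 0xFF)
                        + (map[y][(x + 1) % w] & 0xFF)
                        + (map[y + 1][x] & 0xFF);
                map[y][x] = (byte) (sum / 4);
            }
        }
    }

    byte intensityAt(int y, int x) { return map[y][x]; }
}
```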
Believe it or not, this creates a very realistic fire simulation after you play around with the parameters for some time.
However, if we go back to the real world, there are several apparent issues:
- How large should the intensity matrix and the bitmap be to look smooth on a high-resolution screen?
- How do we redraw the bitmap fast enough to have a smooth animation without eating 100% CPU?
You can go ahead and try implementing this using a Canvas if you want to understand how critical those two issues are. But I did that for you, and I can tell you that Canvas is not an option in this case.
I suggest OpenGL as the final answer to both of those questions.
Step 2: OpenGL as the Blitting Engine
Blitting is an old 2D graphics term for drawing a bitmap onto another bitmap or a surface, in our case the screen. Finding the fastest way to blit bitmaps has been the holy grail of 2D graphics for ages. And even in our simple fire simulator, performance becomes a problem we have to solve.
Those of you who know OpenGL might be confused by this non-typical usage of its capabilities, and those who don’t know it should not be afraid. What we will use from OpenGL is only its texture loading features and the draw_texture extension. You don’t even have to know much about OpenGL to understand our approach:
- We will create an empty (black) texture within the OpenGL context, of the same size as our fire bitmap
- We will copy our fire bitmap (which is filled according to the intensity map) into the OpenGL texture after every update
- We will use the draw_texture extension (the glDrawTex* family of functions) to draw the 2D texture on the screen
- We will use GL_LINEAR interpolation, which allows us to use a low-resolution fire bitmap and have OpenGL smooth its pixels into a beautiful non-pixelated result
Now the points above might sound like gibberish if you’re not familiar with OpenGL, but don’t worry. The bottom line is that OpenGL will provide us with fast blitting and efficient bitmap smoothing for free, so we can have a small intensity map (which is faster to update) and still get a non-pixelated result on the screen.
However, there is a catch: draw_texture is not available on every phone. I happen to have a Motorola Droid, and I assume all newer phones have the extension. So I’m sorry if it’s not available on your phone. There are ways around that, so just ask me in a comment if you’re interested.
In order to write OpenGL based code, you should have a look at the GLSurfaceView class and the OpenGL samples in the Android SDK sample collection. Of course, you are welcome to use the code at the bottom of this article as your guide too.
The outcome on my phone is:
- Up to 35 FPS (I added some throttling to keep it at around 10 FPS so it doesn’t eat too much CPU)
- A 128×16 bitmap looks completely smooth on a 854×480 Droid screen thanks to GL_LINEAR smoothing (which, curiously, does not seem to impact FPS significantly)
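The throttling I mention can be done with a simple frame-time threshold. A hypothetical sketch (the class name and the exact bookkeeping are mine):

```java
/** Caps the frame rate by computing how long to sleep before each frame. */
class FrameThrottle {
    private final long minFrameMs;  // minimum time between frames
    private long lastFrameMs = -1;  // timestamp of the last rendered frame

    FrameThrottle(int targetFps) {
        this.minFrameMs = 1000L / targetFps;
    }

    /**
     * Returns how many milliseconds to sleep before rendering the next
     * frame so we do not exceed the target FPS (0 = render immediately).
     */
    long sleepTimeFor(long nowMs) {
        if (lastFrameMs < 0) {
            lastFrameMs = nowMs;
            return 0;
        }
        long elapsed = nowMs - lastFrameMs;
        long wait = Math.max(0, minFrameMs - elapsed);
        lastFrameMs = nowMs + wait; // the frame renders after the wait
        return wait;
    }
}
```

In the render loop you would call sleepTimeFor(System.currentTimeMillis()) and Thread.sleep() for the returned duration before drawing.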
I am quite happy with those results so let’s move on and turn our isolated GLSurfaceView fire simulator into a Live Wallpaper.
Step 3: Live Wallpaper
Live Wallpapers were introduced in Android 2.1. They look amazingly cool, and the standard distribution I have on my Verizon Droid includes wonderful, creative and fascinating live wallpapers, such as Water, Grass, and Magic Smoke. Our wallpaper won’t be exactly as cool, but you can develop it further and then sell it on the market alongside the other live wallpapers already sold there.
Technically, a live wallpaper is a service. I already covered services, but in a somewhat different context. Unlike most typical services, live wallpaper services draw something on the screen. You should have a look at the WallpaperService API class if you want to understand the details, but the bottom line is that you are responsible for providing your own custom drawing code, as well as stopping and resuming your animations when requested to do so.
However, an unexpected obstacle I ran into was that WallpaperService does not support drawing with OpenGL when used as prescribed. This was so weird that I was almost shocked, since I had seen multiple live wallpapers that obviously used OpenGL.
Then I found this life-saving article by Robert Green that shows how to subclass the WallpaperService class to allow OpenGL rendering. The solution he has is rather complicated and you have to understand how OpenGL works and how its native capabilities are represented in the Java API, but I still encourage you to read the article and to subscribe to Robert’s blog because he is really cool.
Now, I used Robert’s GLWallpaperService class which hides all the complexity of his solution, and the rest was easy. I converted my existing OpenGL code that implemented GLSurfaceView.Renderer into a GLWallpaperService.Renderer, added the magic descriptors to the AndroidManifest.xml file and that was it. You can see the result in the video at the beginning of this article.
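For reference, the "magic descriptors" are the standard live wallpaper entries in AndroidManifest.xml: a service declaration with the BIND_WALLPAPER permission, the wallpaper intent filter, and a meta-data pointer to a small XML resource. The service class name here is hypothetical:

```xml
<service
    android:name=".FireWallpaperService"
    android:permission="android.permission.BIND_WALLPAPER">
    <intent-filter>
        <action android:name="android.service.wallpaper.WallpaperService" />
    </intent-filter>
    <meta-data
        android:name="android.service.wallpaper"
        android:resource="@xml/wallpaper" />
</service>
```

The referenced res/xml/wallpaper.xml can be as simple as a bare `<wallpaper xmlns:android="http://schemas.android.com/apk/res/android" />` element; it is where you would later declare a settings activity, preview description and so on.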
Update: I just found a problem: the fire bitmap does not resize correctly when the screen orientation changes. This must be something in the GLWallpaperService code, since it works fine in a GLSurfaceView. I will try to fix it later, but for now this can be an exercise for you guys.
Attachment: the complete source code