The Problem
As I was porting functionality from the old MHFramework to the new one, I realized that I needed a platform-independent solution for generating images from buffers. Since the old engine was based entirely on Java's Abstract Window Toolkit (AWT), which isn't available on Android, I needed to find a way to accomplish the same thing on both platforms in a consistent manner.
Proposed Solutions
The first solution that came to mind was to encapsulate the graphics context (MHGraphicsCanvas) inside the MHBitmapImage class, just like AWT does. However, as with all such decisions, it comes with some immediate advantages and disadvantages.
Pros:
- Our engine is never going to use a graphics context for anything other than drawing to a buffered bitmap. Combining these classes would hide the coupling. This cleans up the class structure of the platform layer and also greatly simplifies the implementation of more advanced visual effects.
- AWT and Android reverse the association between bitmaps and canvases, and this would encapsulate those differences internally so we'd have a uniform way to work with image data. (AWT's Image has a Graphics, and Android's Canvas has a Bitmap, so even though they're semantically equivalent, their compositions are inverted; see the code sketch after these lists.)
Cons:
- MHBitmapImage no longer just stores image data. It now also provides an interface for manipulating that data, so we may be in violation of the Single Responsibility Principle.
- Not all image data needs a graphics context until it's actually drawn to, so giving every image one up front could incur some memory overhead.
- HOWEVER, we can solve both problems through composition and lazy instantiation. Besides, this relationship already exists at the platform level anyway.
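To make the inversion concrete, here are the two stock platform calls side by side. These are just the standard AWT and Android APIs (not MHFramework code), and naturally each snippet only compiles in its own platform build:

    // AWT platform layer: the image owns its drawing context.
    java.awt.image.BufferedImage image =
            new java.awt.image.BufferedImage(640, 480, java.awt.image.BufferedImage.TYPE_INT_ARGB);
    java.awt.Graphics2D g = image.createGraphics();   // the Graphics comes FROM the Image
    g.fillRect(0, 0, 640, 480);
    g.dispose();

    // Android platform layer: the drawing context wraps a bitmap handed to it.
    android.graphics.Bitmap bitmap =
            android.graphics.Bitmap.createBitmap(640, 480, android.graphics.Bitmap.Config.ARGB_8888);
    android.graphics.Canvas canvas = new android.graphics.Canvas(bitmap);   // the Bitmap goes INTO the Canvas
    canvas.drawColor(0xFF000000);                      // draws into the wrapped bitmap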
The Chosen Solution
I decided to keep MHBitmapImage and MHGraphicsCanvas as separate classes, but I removed MHPlatform's factory method for creating a graphics canvas. Now the only way to obtain a canvas for drawing is to extract it from the image object, which means you always have immediate access to the results of every rendering operation.
For example, double-buffered rendering now works by using an MHBitmapImage as the back buffer and passing its associated MHGraphicsCanvas to the screen manager. When that call returns, the bitmap image is presented to the physical screen device.
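Here's a rough sketch of what that shape looks like in code. The class bodies and the createPlatformCanvas(), screenManager, and display names are my own placeholders for illustration, not the actual MHFramework source:

    // Minimal stand-in for the drawing interface.
    interface MHGraphicsCanvas
    {
        void fill(int argbColor);
        void drawLine(int x1, int y1, int x2, int y2, int argbColor);
    }

    // The image owns its canvas, and the canvas is reachable only through the image.
    final class MHBitmapImage
    {
        private final int width, height;
        private MHGraphicsCanvas canvas;          // created lazily, on first request

        MHBitmapImage(final int width, final int height)
        {
            this.width = width;
            this.height = height;
        }

        MHGraphicsCanvas getCanvas()
        {
            if (canvas == null)
                canvas = createPlatformCanvas();  // hypothetical hook into the AWT or Android layer
            return canvas;
        }

        private MHGraphicsCanvas createPlatformCanvas()
        {
            // On AWT this would wrap BufferedImage.createGraphics();
            // on Android it would wrap new Canvas(bitmap).
            throw new UnsupportedOperationException("supplied by the platform layer");
        }

        int getWidth()  { return width;  }
        int getHeight() { return height; }
    }

    // Double-buffered rendering as described above (screenManager and display are placeholders):
    //   MHBitmapImage backBuffer = new MHBitmapImage(800, 600);
    //   screenManager.render(backBuffer.getCanvas());  // every draw call lands in the back buffer
    //   display.present(backBuffer);                   // the finished bitmap goes to the screen device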
I am happy with this solution. Although I was unable to satisfactorily eliminate a class, I feel very comfortable with the design principles involved and the improvement in general usability of these critical elements.