Fusion (a.k.a. pan-sharpening) is a way of combining the spectral information of a coarse-resolution image with the spatial detail of a higher-resolution image. The result is a product that combines the best characteristics of each of its components. We frequently use fusion to combine the simultaneously collected multispectral and panchromatic components of a high-resolution satellite scene. For example, we would combine the 2m multispectral portion with the 0.5m panchromatic portion of a WorldView-2 scene to produce a 0.5m fused image. Although not as common, fusion can also be used to combine images from different sources. At i-cubed, we have been at the forefront of research into fusion with the development of the Modified IHS method, which has been incorporated into ERDAS IMAGINE software. Currently we prefer the UNB algorithm, which is well known in image-processing circles for its fidelity in preserving the color of the source imagery.
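To make the idea concrete, here is a minimal sketch of a generic IHS-style pan-sharpening step in NumPy. This is not i-cubed's Modified IHS or the UNB algorithm; it only illustrates the common pattern those methods refine: resample the multispectral bands to the panchromatic grid, compute an intensity component, match the panchromatic band's statistics to that intensity, and inject the resulting spatial detail back into every band. The function name and the simple mean-based intensity are assumptions for illustration.

```python
import numpy as np

def ihs_pansharpen(ms, pan):
    """Generic IHS-style pan-sharpening sketch (illustrative only).

    ms  : (bands, h, w) multispectral array already resampled to the
          panchromatic grid (e.g. 2m bands upsampled to 0.5m).
    pan : (h, w) panchromatic array on the same grid.
    """
    ms = ms.astype(np.float64)
    pan = pan.astype(np.float64)

    # A simple intensity component: the per-pixel mean across bands.
    intensity = ms.mean(axis=0)

    # Match the pan band's mean/std to the intensity so the fused
    # colors are not washed out or shifted.
    pan_matched = (pan - pan.mean()) / (pan.std() + 1e-12)
    pan_matched = pan_matched * intensity.std() + intensity.mean()

    # Inject the high-resolution spatial detail into every band.
    detail = pan_matched - intensity
    return ms + detail[None, :, :]
```

Because the same detail term is added to each band, the fused image's band ratios (and hence its colors) stay close to the source multispectral image, which is the property color-preserving methods such as UNB are designed to maximize.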