In developing a multimedia system, the participants need to consider the type of hardware as well as the software used. The hardware must be capable of supplying the needs of the multimedia system. These needs are:
  • Primary and secondary storage capabilities – which enable bit depth and colour to be represented and audio data to be sampled.

  • Processing speed – which enables video frame rates to be maintained and allows the processing of images, such as morphing and distorting, as well as animation.

  • Display devices – which are capable of displaying images at a suitable resolution and quality.
In order for hardware to cope with these demands, several components need to be installed: additional memory, both VRAM and RAM, as well as additional hard disk space. The hardware itself must have a high processing speed to handle the number crunching involved in processing the data, which can range from video and audio through to word processing. The monitor or VDU should be capable of displaying a high resolution so that it can display the digital data accurately.



An image is composed of individual pixels. On a typical VDU screen for a PC, there are about 500 000 to well over 1 000 000 pixels. Each pixel in most graphics systems is controlled by data stored in the system's video RAM or frame buffer.

Image Characteristics

High resolution (HIRES) and low resolution (LORES) are often used to describe graphic images and graphic display systems. However, due to developments in display technology, these terms no longer have a clear definition. In the early 1990s, a screen with a resolution of 640 x 480 was considered HIRES but is now considered LORES. The number of pixels affects the resolution of an image, which in turn affects its quality. More pixels in the display image means that each pixel is smaller and that much more information and detail can be shown. Increasing the number of pixels also means that more storage (a bigger frame buffer) is required.


Bit depth plays a role once colour is included, and it is important to keep in mind that colour includes shades of grey. Once the bit depth is greater than a single bit per pixel, it becomes an important part of the storage formula.
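The relationship between resolution, bit depth and frame buffer size can be sketched as a short Python calculation (the figures for a hypothetical 640 x 480 display are for illustration):

```python
def frame_buffer_bytes(width, height, bit_depth):
    """Storage needed for one full-screen image: bit_depth bits per pixel."""
    return width * height * bit_depth // 8

# A 640 x 480 display at 8-bit colour (256 colours):
print(frame_buffer_bytes(640, 480, 8))   # 307200 bytes (300 KB)

# The same display in 24-bit true colour needs three times as much:
print(frame_buffer_bytes(640, 480, 24))  # 921600 bytes (900 KB)
```

Doubling the bit depth or the number of pixels doubles the frame buffer required, which is why extra VRAM is listed among the hardware needs above.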

Frame buffer

Image Display

A palette is the number of available colours for a graphic display or image. The size of the palette depends on the image bit depth. The maximum number of colours available in a display system, or a system palette, is fixed by the graphics display hardware. Individual images can have much smaller image palettes by selecting and utilising only some of the colours available in the system palette.
GIF uses an image palette of up to 256 different colours. Data describing those colours is stored as a colour table inside the image file. Each colour in the image palette has its own code number in the colour table. For example, a 256-colour GIF image would have a colour table with 256 values (from 0 to 255).
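A minimal sketch of how a colour table works follows; the four-colour palette is hypothetical (a real GIF stores up to 256 RGB entries), but the lookup principle is the same:

```python
# Hypothetical 4-colour palette; each entry is an (R, G, B) triple.
colour_table = [
    (255, 255, 255),  # code 0: white
    (255, 0, 0),      # code 1: red
    (0, 128, 0),      # code 2: green
    (0, 0, 0),        # code 3: black
]

# Pixels are stored as small code numbers, not full RGB triples.
pixel_codes = [0, 1, 1, 3, 2, 0]

# Decoding the image looks each code up in the colour table.
decoded = [colour_table[code] for code in pixel_codes]
print(decoded[1])  # (255, 0, 0)
```

Storing one small code per pixel instead of three full colour values is what makes a palette-based format compact.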

Colour picker

Colour System
  • RGB (red, green, blue) – matches the video signals used in VDUs
  • HSL (hue, saturation, lightness) – an alternative to RGB
  • CMYK (cyan, magenta, yellow, black) – matches the colour inks used in traditional printing

In some graphics software packages, HSL is called HSV (Hue Saturation Value) or HLS (Hue Lightness Saturation). In the HSL system, there are 360 colour values, called hues. Each has a value that indicates its angle around the colour wheel. Saturation, expressed as a percentage, is the level of purity or brightness of a hue; if an image is in black and white, its saturation is 0 per cent. Lightness (Luminosity or Value) is the amount of white added to a hue and ranges from 100 per cent (pure white) to 0 per cent (pure black). Grey scales are the shades of grey between black and white in a graphic display system. They are stored as RGB values: greys are created when the red, green and blue values are equal. Since there are 256 values for each of these three colours, there are 256 shades of grey (including black and white). The light shades of grey have high RGB values (closer to white) while dark greys have low RGB values (closer to black).
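The greys-from-equal-RGB rule, and the fact that any grey has zero saturation, can be checked with Python's standard `colorsys` module (which works on values in the range 0 to 1):

```python
import colorsys

# The 256 grey shades: equal red, green and blue values.
greys = [(v, v, v) for v in range(256)]
print(len(greys))            # 256
print(greys[0], greys[255])  # (0, 0, 0) (255, 255, 255)

# Any grey has zero saturation in the HLS model.
h, l, s = colorsys.rgb_to_hls(128 / 255, 128 / 255, 128 / 255)
print(s)  # 0.0
```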


Audio Characteristics
Audio file formats

Different formats are used for storing audio in multimedia: WAV (waveform), MIDI, MP3 and RealAudio. New formats are constantly being introduced.
A waveform is frequently displayed on a VDU in the shape of a wave. This allows the wave-like characteristics of the sound to be seen and altered by the user.

Wave Characteristics:
  • Amplitude - wave height which gives the sound its volume
  • Wavelength - the spacing between waves which gives the sound its pitch or note. The pitch of a sound is normally expressed as a frequency, which is the number of wavelengths that pass in one second
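These characteristics can be illustrated by digitising a pure tone in Python; the sample rate and frequency below are illustrative values, not from the text:

```python
import math

def sample_wave(amplitude, frequency_hz, sample_rate, n_samples):
    """Digitise a pure tone: amplitude sets volume, frequency sets pitch."""
    return [amplitude * math.sin(2 * math.pi * frequency_hz * n / sample_rate)
            for n in range(n_samples)]

# One second of a 440 Hz tone (the note A) sampled at 8000 Hz.
samples = sample_wave(1.0, 440, 8000, 8000)
print(len(samples))          # 8000 samples for one second of sound
print(max(samples) <= 1.0)   # True: no sample exceeds the amplitude
```

Raising the amplitude makes the stored values larger (louder); raising the frequency packs more complete waves into the same second (higher pitch).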

A MIDI file is not a digitised sound file and cannot be played directly through speakers or displayed and altered in the same way as a waveform file. It contains instructions for musical instruments.


Waveform file

MIDI file
  - small file sizes with less CPU processing required
  - all notes and instruments can be edited
  - playback speed can be altered without affecting sound quality
  - quality depends on the attached musical instruments
  - cannot reproduce speech
  - editing requires some musical knowledge

Waveform file
  - more reliable playback
  - does not require expensive equipment
  - files are very large
  - files require more CPU processing
  - it is not possible to edit all the characteristics of the digitised sound

Song editor software

Video and Animations

GIF animation software

Video and Animation Characteristics

Creating a multimedia video is like creating audio: it is the process of recording and storing many samples of a signal. The signal represents visual data - light intensity levels and colours. In multimedia, creating a high quality video is not as easy as it is for audio. There are two main problems in using video in multimedia systems:
  1. A single full screen (800 x 600) video frame in 24-bit true colour needs about 1.4 megabytes of storage space. A single second of video played at full speed (30 frames per second) would need over 40 megabytes of storage. Video requires a fairly large amount of storage space.
  2. The video data has to be transferred from secondary storage (e.g. hard disk, DVD) into RAM. It must then be processed and moved into the frame buffer without affecting the quality of the display. Video also makes big demands on the system's processing abilities.
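The storage figures in point 1 come from simple arithmetic, which can be checked directly:

```python
# Storage needed per frame and per second for full-screen true-colour video.
width, height = 800, 600
bytes_per_pixel = 3          # 24-bit colour = 3 bytes per pixel
fps = 30                     # frames per second at full speed

frame_bytes = width * height * bytes_per_pixel
second_bytes = frame_bytes * fps
print(frame_bytes / 1_000_000)   # 1.44 MB per frame
print(second_bytes / 1_000_000)  # 43.2 MB per second
```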

Making compromises to overcome these problems allows a smooth full-motion effect while reducing the storage space and processing power needed to play the video. The two most common compromises are:
  1. Reduced screen size - instead of using the entire screen area to replay the digitised video, a smaller playback area is used.
  2. Reduced speed - instead of using the VDU screen refresh rate (often 30 frames per second), a much slower playback speed of 10 or even fewer frames per second is used.
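Combining both compromises cuts the data rate dramatically; the quarter-size window below is an illustrative choice, not a figure from the text:

```python
# Data rate (bytes per second) at full quality versus a compromised playback.
def data_rate(width, height, bytes_per_pixel, fps):
    return width * height * bytes_per_pixel * fps

full = data_rate(800, 600, 3, 30)     # full screen at full speed
reduced = data_rate(320, 240, 3, 10)  # smaller window at 10 frames per second
print(full // reduced)  # 18: the compromise cuts the data rate about 18-fold
```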

The most popular multimedia file storage formats for video images are AVI, QuickTime and MPEG. For animations, files are stored as GIFs. Multimedia animations are rarely full screen, are usually played at a reduced speed and almost always use a reduced colour palette, so they cause only minimal problems. The best known examples of multimedia animation are the animated GIFs used on the Web, many of which are small icon-sized animations. Methods of animating:

Cel-based Animation - This is the most common method of animating images. Each picture is a frame called a cel. The graphic artist must create each cel separately, although most animation software packages have tools that will speed up this process. For example, the graphic artist may create the background (all the stationary parts of the image) first and then copy it into all the cels. The moving objects can then be created and placed separately in each individual cel. This type of animation gives the animator complete control over every frame of animation.
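The copy-the-background-then-place-the-object workflow can be sketched in miniature; here a one-dimensional row of characters stands in for a full image:

```python
# Tiny cel: a 1-D row of characters stands in for a full image frame.
background = list("..........")

# Copy the background into every cel, then place the moving object ('*')
# at a different position in each one, as the graphic artist would.
cels = []
for frame in range(4):
    cel = background.copy()  # stationary parts copied into the cel
    cel[frame * 2] = "*"     # moving object placed individually per cel
    cels.append("".join(cel))

print(cels[0])  # *.........
print(cels[3])  # ......*...
```

Because the artist (here, the loop body) decides exactly what appears in each cel, every frame can be adjusted independently.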

cel-based animation

Path-Based Animation - In this animation the graphic artist describes the path of movement of every object. The system then creates the animation frames with each object drawn in its correct position. Each frame is created by the system. Path-based animation is faster to create than cel-based animation; however, the animator does not have the same level of control over the animation.
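A minimal sketch of the system's side of path-based animation, assuming a straight-line path for simplicity: the artist supplies only the start point, end point and frame count, and the system interpolates every in-between position.

```python
# The system computes each object's position for every frame along the path.
def path_positions(start, end, frames):
    (x0, y0), (x1, y1) = start, end
    return [(x0 + (x1 - x0) * t / (frames - 1),
             y0 + (y1 - y0) * t / (frames - 1))
            for t in range(frames)]

positions = path_positions((0, 0), (90, 30), frames=4)
print(positions)  # [(0.0, 0.0), (30.0, 10.0), (60.0, 20.0), (90.0, 30.0)]
```

This is why it is fast to create but offers less control: the in-between frames are generated, not hand-drawn.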

Path-based animation

Morphing - Morphing is an animation technique where one image is changed, pixel by pixel, into a completely different image. Starting and finishing images are needed for a morph. Generally, to create a smooth animation effect between the original images, it would need a large number of image frames. The system creates a morph by altering the colour and/or intensity of the individual pixels by small amounts for each frame it creates. A graphic artist may often need to spend some time retouching individual frames to improve the animation effects.
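The per-pixel colour shift a morph performs can be sketched for a single pixel; the red-to-blue example and frame count are illustrative:

```python
# Cross-dissolve one pixel's colour toward another in small steps per frame.
def morph_pixel(start, end, frames):
    """Return the pixel's (R, G, B) value in every frame of the morph."""
    return [tuple(round(s + (e - s) * t / (frames - 1))
                  for s, e in zip(start, end))
            for t in range(frames)]

# One red pixel morphing into blue over 5 frames:
steps = morph_pixel((255, 0, 0), (0, 0, 255), frames=5)
print(steps[0], steps[2], steps[4])
# (255, 0, 0) (128, 0, 128) (0, 0, 255)
```

Applying this to every pixel of the image, frame by frame, produces the smooth transformation between the starting and finishing images.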

MorphBuster Software

Warping - This animation technique gradually distorts an image by changing its pixels. The system creates the warp by altering the colour and/or intensity of the image pixels by small amounts for each frame it creates. The end result is a smooth animation where the original image appears to gradually distort itself into a different form.
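A very reduced sketch of the gradual-distortion idea, using a one-dimensional row of pixel values and a simple cyclic displacement (real warps move pixels in two dimensions toward artist-chosen target positions):

```python
# Displace a 1-D row of pixels a little further in every frame of the warp.
def warp_row(row, displacement, frames):
    out = []
    for t in range(1, frames + 1):
        shift = round(displacement * t / frames)  # partial shift this frame
        out.append(row[-shift:] + row[:-shift] if shift else row[:])
    return out

warped = warp_row([1, 2, 3, 4, 5, 6], displacement=3, frames=3)
print(warped[0])   # [6, 1, 2, 3, 4, 5] - one step of the distortion
print(warped[-1])  # [4, 5, 6, 1, 2, 3] - the fully distorted form
```

Each frame moves the pixels only a small amount, which is what makes the distortion appear smooth when the frames are played in sequence.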

Morpheus Photo Warper software

File size is also affected by compression and the inclusion of metadata.