Photobiology Unit, Department of Dermatology, Ninewells Hospital & Medical School, Dundee, UK
Light is a basic requirement for life on earth. The sun sustains life and remains the major source of lighting for humankind. It is an atomic furnace that turns mass into energy: each second, some four million tons of mass are discharged into space as energy, of which the earth receives only about two billionths (1). For many centuries, humankind relied on flames to produce artificial light until, in 1809, Sir Humphry Davy demonstrated the electric carbon arc at the Royal Institution in London. Progress in electric lighting continued throughout the 19th century, with Thomas Edison and Joseph Wilson Swan credited with the invention of the incandescent lamp in 1878, although at least 22 other inventors would challenge this claim (2). In 1906, the tungsten filament was introduced as a more efficient incandescent source, and it formed the basis of the common light bulb that has been in widespread use for over a century.
Now, however, its days are numbered; the compact fluorescent lamp (CFL) has all but replaced it. In an article on emerging lighting technology, The Economist asks, ‘How many inventions does it take to change a light bulb?’ (3). Lighting technology is changing rapidly: from traditional incandescent bulbs to CFLs, on to light-emitting diodes (LEDs) and, just over the horizon, quantum dots. However, health concerns have been raised over the possible risks that light from CFLs poses to individuals with photosensitive skin disorders, and over the shortage of studies carried out before the rapid introduction of this new technology.