Learning Colours from Textures by Effective Representation of Images
- Publisher:
- International Frequency Sensor Association (IFSA) Publishing
- Publication Type:
- Chapter
- Citation:
- Advances in Signal Processing: Reviews, 2018, Vol. 1, pp. 277-304
- Issue Date:
- 2018-01-01
Closed Access
Filename | Description | Size
---|---|---
Pages from Advances_in_Signal_Processing_Vol_1.pdf | Published Version | 1.12 MB
This item is closed access and not available.
Arguably, the majority of existing image and video analytics are based on texture. However, the other important aspect, colour, must also be considered for comprehensive analytics. Colours not only make images feel more vivid to viewers, they also contain important visual cues about the image [20, 54, 24]. Although a modern point-and-shoot digital camera can easily capture colour images, there are circumstances where we need to recover the chromatic information in an image. For example, photography in the old days was monochrome and provided only gray-scale images. Adding colours can rejuvenate these old pictures and make them more endearing as personal memoirs, or more accessible as archival documents for public or educational purposes. For a colour image, re-colourisation may be necessary if the white balance was poorly set when shooting the picture. In this case, a particular colour channel can be severely over- or under-exposed, making it infeasible to adjust the white balance based on the recorded colours. A possible rescue of the picture is to keep only the luminance and re-colourise the image. Another example of the application of colourisation arises from the area of specialised imaging, where the sensors capture signals outside the visible spectrum of light, e.g. X-ray, MRI, and near-infrared images. Pseudo colours make these images more readily interpretable by human experts, and can also indicate potentially interesting regions.