TECHNICAL ADVANCE    Open Access

Automatic colorimetric calibration of human wounds

Sven Van Poucke1*†, Yves Vander Haeghen2†, Kris Vissers3, Theo Meert4, Philippe Jorens5

* Correspondence: svenvanpoucke@woundontology.com
† Contributed equally

Abstract

Background: Digital photography is increasingly accepted as a tool in many clinical domains, e.g. wound care. Although ever higher resolutions are available, reproducibility remains poor and visual comparison of images remains difficult. This is even more the case for measurements performed on such images (colour, area, etc.). The problem is often neglected, and images are freely compared and exchanged without further thought.

Methods: The first experiment examined whether camera settings or lighting conditions could negatively affect the quality of colorimetric calibration. Digital images including a calibration chart were exposed to a variety of conditions. Precision and accuracy of colours after calibration were quantitatively assessed with a probability distribution for perceptual colour differences (dE*ab). The second experiment was designed to assess the impact of the automatic calibration procedure (i.e. chart detection) on real-world measurements. Forty different images of real wounds were acquired and a region of interest was selected in each image. Three rotated versions of each image were automatically calibrated and colour differences were calculated.

Results: In the first experiment, colour differences between the measurements and reference spectrophotometric measurements revealed median dE*ab values of 6.40 for the proper patches of calibrated normal images and 17.75 for uncalibrated images, demonstrating an important improvement in accuracy after calibration. The reproducibility, visualized by the probability distribution of the dE*ab errors between two measurements of the patches of the images, had a median of 3.43 dE*ab for all calibrated images and 23.26 dE*ab for all uncalibrated images. Restricted to the proper patches of normal calibrated images, the median was only 2.58 dE*ab. Wilcoxon rank-sum testing (p < 0.05) between uncalibrated normal images and calibrated normal images with proper patches yielded p-values effectively equal to zero, demonstrating a highly significant improvement in reproducibility. In the second experiment, the reproducibility of the chart detection during automatic calibration is presented using a probability distribution of dE*ab errors between two measurements of the same region of interest.

Conclusion: The investigators proposed an automatic colour calibration algorithm that ensures reproducible colour content of digital images. Evidence was provided that images taken with commercially available digital cameras can be calibrated independently of any camera settings and illumination features.

Background

Chronic wounds are a major health problem, not only because of their incidence, but also because of their time- and resource-consuming management. This study was undertaken to investigate the possible use of colorimetric imaging during the assessment of human wound repair. The design of the current study is based on the system requirements for colorimetric diagnostic tools published previously [1,2]. Digital photography is considered an acceptable and affordable tool in many clinical disciplines such as wound care and dermatology [3-12], forensics [13,14], pathology [15], traumatology, and orthodontics [16,17]. Although the technical features of most digital cameras are impressive, they are unable to produce reproducible and accurate images with regard to spectrophotometry [18-22].
Taking two pictures of a wound with the same camera and settings, immediately after one another, normally results in two slightly different images. These differences are exacerbated when the lighting, the camera or its settings change. Reproducibility is therefore poor. This may be less important when photographs are taken for documentation purposes, but when digital photography becomes part of a medical evaluation or is used to perform measurements, it becomes critically important [5,6,18,23-29]. In our view, the quality of medical photography is principally defined by its reproducibility and accuracy [21]. Without reproducibility and accuracy, any attempt to measure colour or geometric properties of images is of little use [27]. A simple, practical and validated algorithm to solve this problem is necessary (Figure 1).

Almost all colours can be reconstructed using a combination of three base colours: red, green and blue (RGB) [30]. Together, these three base colours define a 3-dimensional colour space that can be used to describe colours.

The accurate handling of the colour characteristics of digital images is a non-trivial task because the RGB signals generated by digital cameras are 'device-dependent', i.e. different cameras produce different RGB signals for the same scene. In addition, these signals change over time because they depend on the camera settings, some of which may be scene-dependent, such as the shutter speed and aperture diameter. In other words, each camera defines a custom device-dependent RGB colour space for each picture taken. As a consequence, the term RGB (as in RGB image) is ill-defined and meaningless for anything other than trivial purposes. As the measurements of colours and colour differences in this paper are based on the standard colorimetric observer defined by the CIE (Commission Internationale de l'Eclairage), the international standardizing body in the field of colour science, such measurements cannot be made on RGB images unless the relationship between the varying camera RGB colour spaces and the colorimetric colour spaces (colour spaces based on that human observer) is determined. However, there is a standard RGB colour space (sRGB) that is fixed (device-independent) and has a known relationship with the CIE colorimetric colour spaces. Furthermore, sRGB displays more or less realistically on most modern display devices without extra manipulation or calibration (look for an 'sRGB' or '6500K' setting) [31]. One disadvantage of sRGB is that it cannot represent all the colours detected by the human eye. We believe that finding the relationship between the varying and unknown camera RGB colour space and the sRGB colour space eliminates most of the variability introduced by the camera and lighting conditions.
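The practical advantage of sRGB is that its relationship to the CIE colorimetric spaces is fixed and published (IEC 61966-2-1). Purely as a minimal illustration of that fixed relationship, and not as part of the authors' calibration pipeline, the sketch below applies the standard sRGB decoding curve and the published sRGB-to-XYZ (D65) matrix; the function name is our own.

```python
import numpy as np

# Published sRGB-to-XYZ (D65) matrix from IEC 61966-2-1.
M_SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                          [0.2126, 0.7152, 0.0722],
                          [0.0193, 0.1192, 0.9505]])

def srgb_to_xyz(rgb8):
    """Convert 8-bit sRGB values (shape (..., 3)) to CIE XYZ under D65."""
    c = np.asarray(rgb8, dtype=float) / 255.0
    # Undo the sRGB transfer function (a gamma curve with a small linear toe).
    linear = np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)
    return linear @ M_SRGB_TO_XYZ.T

# The sRGB white (255, 255, 255) maps to the D65 white point, ~(0.9505, 1.0000, 1.0890).
print(srgb_to_xyz([255, 255, 255]))
```

Camera RGB, by contrast, has no such published transform, which is precisely why a chart-based calibration towards sRGB is needed.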
The transformation between the input RGB colour space and the sRGB colour space was achieved via a colour target-based calibration using a 'reference chart', namely the MacBeth Colour Checker Chart Mini (MBCCC) (GretagMacBeth AG, Regensdorf, Switzerland). This chart provides a checkerboard array of 24 scientifically prepared coloured squares or patches in a wide range of colours with known colorimetric properties under a CIE D65, noon daylight illuminant (6504 K). Many of these squares represent natural objects of special interest, such as human skin, foliage and blue sky. These squares are not only the same colour as their counterparts, but also reflect light the same way in all parts of the visible spectrum. Different calibration algorithms defining the relationship between the input RGB colour space of the camera and the sRGB colour space have been published, using methods such as 3D look-up tables and neural networks. The algorithm in this study is based on three 1D look-up tables and polynomial modelling, as previously published by Vander Haeghen et al. [32] (Figure 2). This differs slightly from, for example, the general methods used in the well-known ICC profiles (http://www.color.org/index.xalter). In ICC profiles the relationship of an unknown colour space to the so-called 'profile connection space' (PCS, usually CIE XYZ) is computed and stored. Output is then generated by going from this PCS to the desired output colour space, which in our case would be sRGB. This means two colour space transformations are required (RGB to PCS to sRGB), while our algorithm only needs one. Although it is an inherently more flexible system, ICC profiling seems overkill for our intended application (a straight camera RGB to sRGB transformation, without the need to determine and store or embed a device profile). However, it must be said that the advent of, for example, LittleCMS (http://www.littlecms.com/), a free colour management system that focuses on the determination and immediate application of profiles on images, may change this view in the future, and that such a system could be a viable alternative for the current colour space transformation algorithms in our system.

Figure 1 Chronic Wound and Reference Chart. Chronic wound and reference chart after (left) and before (right) calibration.

Methods

The research has been carried out in accordance with the Helsinki Declaration; the methods used were subject to ethical committee approval (B32220083450, Commissie voor Medische Ethiek, Faculteit Geneeskunde, Leuven, Belgium). Patients received a detailed written and verbal explanation, and patient authorization was required before inclusion and analysis of the images.

Figure 2 RGB to sRGB Transformation Scheme.

Experiment 1

The purpose of the first experiment was to investigate whether camera settings or lighting conditions negatively affect the quality of the colorimetric calibration [33]. Chronic wounds are assessed in different locations and environments. Therefore, we assessed the calibration algorithm under extreme lighting conditions and with inappropriate camera settings.
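The details of the calibration algorithm used here (per-channel 1D look-up tables followed by polynomial modelling) are given in Vander Haeghen et al. [32]. Purely as a hedged illustration of the general idea of target-based calibration, and not of the authors' exact implementation, the sketch below fits a second-order polynomial mapping from the measured camera RGB values of the usable chart patches to their known sRGB values by least squares; the function names and feature set are our own.

```python
import numpy as np

def _poly_features(rgb):
    """Second-order polynomial expansion of RGB triples: shape (N, 3) -> (N, 10)."""
    rgb = np.atleast_2d(np.asarray(rgb, dtype=float))
    r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
    return np.column_stack([np.ones_like(r), r, g, b,
                            r * g, r * b, g * b, r ** 2, g ** 2, b ** 2])

def fit_camera_rgb_to_srgb(measured_rgb, reference_srgb):
    """Least-squares polynomial mapping from camera RGB to sRGB.

    measured_rgb   -- (N, 3) average camera RGB of the N usable (non-saturated) chart patches
    reference_srgb -- (N, 3) known sRGB values of the same patches
    Returns a callable that maps camera RGB values (M, 3) to sRGB estimates.
    """
    coeffs, *_ = np.linalg.lstsq(_poly_features(measured_rgb),
                                 np.asarray(reference_srgb, dtype=float),
                                 rcond=None)  # coefficient matrix, shape (10, 3)
    return lambda rgb: np.clip(_poly_features(rgb) @ coeffs, 0.0, 255.0)
```

In the published method [32], three 1D look-up tables and a polynomial model together define the transformation; the sketch above collapses this into a single polynomial purely for brevity.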
Image Acquisition

Digital images of the MBCCC on a grey-coloured background, in a Colour Assessment Cabinet CAC 120-5 (VeriVide, Leicester, UK), were taken using two digital cameras: the Nikon D200 SLR (10.2 effective megapixels) with a 60 mm AF Micro Nikkor lens, and the Canon EOS 10D (6.3 million effective pixels) with a 50 mm Canon EF lens. All images were saved in high-quality JPEG mode, i.e. after the camera had applied its internal processing (demosaicing, colour correction curves, matrixing, etc.) to the images (Table 1).

Calibration Procedure

The calibration procedure assumes uniform illumination and the presence of a reference chart in the image of interest. The calibration provides a means of transforming the acquired images, defined in an unknown colour space (normally RGB), to a standard, well-defined colour space, i.e. sRGB [34]. sRGB has a known relationship to the CIE L*a*b* colorimetric space, allowing computation of perceptual colour differences. The CIE L*a*b* colorimetric space, or CIELAB space with coordinates L*, a* and b*, refers to the colour-opponent space; L* refers to lightness, while a* and b* refer to the colour-opponent dimensions [34-36].

The detection of the MBCCC in the digital image can be done manually or automatically. The algorithm behind MBCCC detection is based on the initial detection of all the bright areas in an image (areas with pixel values close to 255), followed by a shape analysis. Shapes that are not rectangular, or that are either too small or too large compared with the image dimensions, are discarded (sizes are expressed in pixels, since the real-world dimensions are not yet known). The remaining areas are candidates for the MBCCC white patch. For each of the white patch candidates, the corresponding MBCCC black patch is searched for, taking into account the typical layout of the colour chart and the dimensions of the white patch candidate. If this succeeds, the patches are checked for saturation (average pixel value > 255 - δ or < δ, with δ a small number, e.g. 3) in each of the colour channels individually. If the number of saturated patches is acceptable (typically fewer than 6 out of 24 patches), calibration proceeds and its quality is assessed. Quality assessment consists of examining various conditions relating to the colour differences between the known spectrophotometric values and the computed sRGB values, in accepted and rejected patches. If any of these tests fail, the algorithm rejects the calibration and continues the search.

Analysis

In this experiment precision is defined as a measure of the proximity of consecutive colour measurements on an image of the same subject. This is also known as reproducibility. The precision of the MBCCC chart detection, together with the calibration process, was evaluated by computing the perceptual colour differences between all possible pairs of measurements of each colour square of the MBCCC chart. These perceptual colour differences are expressed in CIE units (dE*ab) and are computed using the Euclidean metric in the CIE L*a*b* colour space. Theoretically, one unit is the 'just noticeable colour difference' and anything above five units is 'clearly noticeable'.

The accuracy of a procedure is a measure of how close its results are to the 'real' values, i.e. those obtained using the 'standard' procedure or measurement device; for colour measurements this would be a spectrophotometer.
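Both the precision and the accuracy figures below are expressed as this Euclidean CIELAB distance. As a minimal, self-contained sketch (not the authors' code), the conversion from CIE XYZ to CIE L*a*b* and the dE*ab metric can be written as follows; XYZ values can be obtained from sRGB via the standard conversion sketched earlier, and the D65 white point and function names are our assumptions.

```python
import numpy as np

# CIE D65 reference white (as produced by the sRGB-to-XYZ conversion above).
D65_WHITE = np.array([0.9505, 1.0000, 1.0890])

def xyz_to_lab(xyz, white=D65_WHITE):
    """Convert CIE XYZ values (shape (..., 3)) to CIE L*a*b*."""
    t = np.asarray(xyz, dtype=float) / white
    # Cube root with the standard linear segment near black.
    f = np.where(t > (6 / 29) ** 3, np.cbrt(t), t / (3 * (6 / 29) ** 2) + 4 / 29)
    L = 116 * f[..., 1] - 16
    a = 500 * (f[..., 0] - f[..., 1])
    b = 200 * (f[..., 1] - f[..., 2])
    return np.stack([L, a, b], axis=-1)

def delta_e_ab(lab1, lab2):
    """dE*ab: Euclidean distance in CIELAB. Roughly, 1 unit is just noticeable
    and anything above 5 units is clearly noticeable."""
    diff = np.asarray(lab1, dtype=float) - np.asarray(lab2, dtype=float)
    return np.linalg.norm(diff, axis=-1)
```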
Consequently, the accuracy of the chart detection and colour calibration can be assessed by computing the perceptual colour differences between the measurements of the colour squares of the MBCCC chart and the spectrophotometric values of these squares. For this assessment the calibration was performed using half the colour patches of the MBCCC chart, while the other half were used to evaluate accuracy. Accuracy is likely to be higher when the whole chart is used for calibration. Precision and accuracy each result in a probability distribution for the dE*ab errors. Tukey's five-number summary of the dE*ab colour differences of each patch (the minimum, the lower quartile, the median, the upper quartile and the maximum) was also calculated and visualized using a box plot. Wilcoxon rank-sum statistics were used to test the calibration; this test compares the locations of two populations to determine whether one population is shifted with respect to the other. It works by ranking the combined data sets, summing the ranks of the dE*ab values in each group, and comparing the sums of ranks with significance values based on the decision alpha (p < 0.05).

Table 1 Parameter Settings
Camera type: Canon EOS 10D, Nikon D200
Scene lighting (cabinet): D65, TL84, A
Camera sensitivity: 100 ISO, 400 ISO
Camera exposure: -1 EV, 0 EV, +1 EV
Camera white balance: Auto, Manual, A, D65
Table 1 records all the lighting and camera parameters that were varied during image acquisition. Note that these include some combinations that are inappropriate, such as setting the camera white balance to D65 with an A scene illuminant to produce off-colour images; this was done in order to challenge the calibration algorithm. Illuminant D65: 'Artificial Daylight' fluorescent lamps conforming to Standard Illuminant D6500 (6500 K). Illuminant TL84: Philips Triphosphor fluorescent lamps, often chosen as a 'Point of Sale' illuminant (5200 K). Illuminant A: 'Filament (domestic) lighting' (3000 K). In digital photography, ISO measures the sensitivity of the image sensor. Exposure is measured in lux seconds and can be computed from the exposure value (EV) and the scene luminance over a specified area. White balancing allows the camera to compensate for different lighting conditions by adjusting the colour balance based on the difference between a white object in the image and a reference white.

Experiment 2

The second experiment was designed to quantify the impact of the automatic calibration procedure, i.e. the chart detection, on real-world measurements. This may be of importance in a clinical setting, where automatic calibration of large batches of images in a single run is required. To examine this, 40 different images of real wounds were acquired, and a region of interest (ROI) was selected within each image. Three rotated versions (at 90°, 180° and 270°) of each image were created and automatically calibrated (Figure 3). Comparisons between the colour measurements of the ROIs of the rotated versions of each image highlighted the errors introduced by the automatic chart detection component of the calibration procedure.
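As a sketch of the summary statistics described above for Experiment 1 (Tukey's five-number summary of the dE*ab distributions and the Wilcoxon rank-sum comparison between calibrated and uncalibrated errors), assuming numpy and scipy are available; the dE*ab arrays below are made-up illustrative values, not study data.

```python
import numpy as np
from scipy.stats import ranksums

def five_number_summary(values):
    """Tukey's five-number summary: minimum, lower quartile, median, upper quartile, maximum."""
    v = np.asarray(values, dtype=float)
    return {"min": v.min(), "q1": np.percentile(v, 25), "median": np.percentile(v, 50),
            "q3": np.percentile(v, 75), "max": v.max()}

# Hypothetical dE*ab errors (illustrative values only, not study data).
de_uncalibrated = np.array([22.1, 25.3, 18.7, 30.2, 14.9, 23.6])
de_calibrated = np.array([2.4, 3.1, 2.9, 1.8, 3.6, 2.2])

print(five_number_summary(de_calibrated))
# Wilcoxon rank-sum test: is one error distribution shifted relative to the other?
statistic, p_value = ranksums(de_uncalibrated, de_calibrated)
print(f"rank-sum statistic = {statistic:.2f}, p = {p_value:.4g}")  # p < 0.05 -> significant shift
```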
Image Acquisition

Digital images (n = 40) of the chronic wounds were taken using a Sony Cybershot DSC-F828 digital camera (8.0 million effective pixels) with a Carl Zeiss 28-200 mm equivalent lens, with fully automatic settings at different indoor locations, as is usually the case in daily clinical practice.

Calibration Procedure and Analysis

The calibration procedure was carried out in accordance with that described for Experiment 1. The dE*ab colour differences between the average colours of the ROI in the four rotated versions of each image were computed and visualized using a probability distribution graph.

Figure 3 Chronic Wound Images with Reference Chart and a Region of Interest.

Results

Experiment 1

Figures 4, 5, 6, 7, 8 and 9 show examples of realistic sample images taken with different cameras under different illuminants, together with the corresponding calibrated images. The images contained many saturated patches (see the 'x' marks on the patches) that were not used for the calibration, resulting in a lower quality calibration.

Figure 4 Example with Nikon D200: MBCCC under illuminant A (3000 K). Camera: exposure bias -1, automatic white balance.
Figure 5 Example with Nikon D200: MBCCC under illuminant A (3000 K). Camera: exposure bias -1, calibrated image.
Figure 6 Example with Canon 10D: MBCCC under illuminant A (3000 K). Camera: exposure bias -1, manual white balance.
Figure 7 Example with Canon 10D: MBCCC under illuminant A (3000 K). Camera: exposure bias -1, calibrated image.
Figure 8 Example with Nikon D200: MBCCC under illuminant D65 (6500 K). Camera: exposure bias -1, manual white balance.
Figure 9 Example with Nikon D200: MBCCC under illuminant D65 (6500 K). Camera: exposure bias -1, calibrated image.

The accuracy and reproducibility of the colour calibration using different cameras, camera settings and illumination conditions are presented using a probability distribution of dE*ab errors of all the MBCCC patches (Figure 10). A distinction is made between the full set of images and the 'normal' images, which were acquired with proper camera settings: correct manual or automatic white balance and no exposure bias. The full set contains several images that were strongly over- or underexposed, or had a mismatched white balance. These images demonstrate the effectiveness of the calibration method, but are not representative of day-to-day photography. The term 'proper patch' is used to indicate patches that were not saturated during acquisition, i.e. patches whose pixel values were not too close to 255 or 0. The true pixel values of saturated patches could not be recovered, and their calibration was unfeasible. The accuracy and reproducibility results for the set of proper patches of normal images are representative of colours in properly photographed images, as opposed to the colours of the patches that were disregarded during calibration due to saturation (marked by an 'x' on the calibrated image) (Figure 11). Saturation in normal images or skin imaging is rare, but when it does occur it normally manifests itself as an overexposure of the white, deep red, yellow and orange MBCCC patches.
If this problem is frequent with a particular camera, it can be remedied by slightly underexposing images by, for example, half an f-stop (exposure bias).

Tukey's five-number summary of the dE*ab colour differences of each proper patch of the normal images (the minimum, the lower quartile, the median, the upper quartile and the maximum) was calculated and visualized using a box plot (Figure 12). Outliers were marked with a red 'x'. To evaluate accuracy, the chart patches were split into two groups of 12 patches and only the second group was used for calibration, resulting in a lower quality calibration than if all 24 patches had been used. The first group of 12 patches was used to check the accuracy.

Colour differences between the measurements and the reference spectrophotometric measurements revealed median dE*ab values of 6.40 for the proper patches of calibrated normal images and 17.75 for uncalibrated images, respectively, demonstrating an important improvement in accuracy after calibration (Figure 10). The result for the patches used in the calibration was also included; these had a median of 1.59 dE*ab. Figure 12 presents the accuracy box plot for the proper patches of the normal images. As mentioned above, only patches that had not been used in computing the calibration could be used to check accuracy, therefore only 12 patches are shown in this figure.

Figure 10 Accuracy of Color Calibration. Probability distribution of dE*ab errors between the patches of the images and spectrophotometric measurements, based on 39 images (Nikon D200) and 15 images (Canon 10D) under different illuminants and settings. Median dE*ab is 6.40 for the proper patches of calibrated normal images, 17.75 for uncalibrated images.
Figure 11 Reproducibility of Color Calibration. Probability distribution of dE*ab errors, based on 39 images taken with a Nikon D200 and 15 images with a Canon 10D under different illuminants and settings. Median of 3.43 dE*ab for all calibrated images, 23.26 dE*ab for all uncalibrated images, a median of 2.83 dE*ab for all 'normal' calibrated images and 14.25 dE*ab for all 'normal' uncalibrated images. Restricted to the proper patches of normal calibrated images, the median is only 2.58 dE*ab.
Figure 12 Accuracy: box plot for the proper patches of the normal images.

As Figure 11 demonstrates, the reproducibility, visualized by the probability distribution of the dE*ab errors between two measurements of the patches of the images, had a median of 3.43 dE*ab for all calibrated images, 23.26 dE*ab for all uncalibrated images, 2.83 dE*ab for all 'normal' calibrated images, and 14.25 dE*ab for all 'normal' uncalibrated images. Restricting the calculation to the proper patches of normal calibrated images, the median was 2.58 dE*ab. Wilcoxon rank-sum testing (p < 0.05) between uncalibrated normal images and calibrated normal images with proper patches yielded p-values effectively equal to zero, demonstrating a highly significant improvement in reproducibility.

Examining the dE*ab errors for each MBCCC patch individually revealed that the greatest errors were found in the red, orange yellow, orange and yellow patches. Cyan patches were excluded from this examination as they cannot be represented accurately in the sRGB colour space (Figure 13).
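The 'proper patch' criterion used throughout these results corresponds to the per-channel saturation test described in the Methods: a patch is rejected when its average value in any channel lies within δ of 0 or 255 (with δ a small number such as 3), and calibration proceeds only when fewer than 6 of the 24 patches are saturated. A minimal sketch of that rule is given below; the function names and array layout are our own, not the authors' implementation.

```python
import numpy as np

def saturated_channels(patch_pixels, delta=3):
    """Per-channel saturation flags for one chart patch.

    patch_pixels -- (H, W, 3) pixel values of the patch.
    A channel counts as saturated when its average lies within `delta` of 0 or 255.
    """
    mean = np.asarray(patch_pixels, dtype=float).reshape(-1, 3).mean(axis=0)
    return (mean > 255 - delta) | (mean < delta)

def proper_patches(patches, delta=3, max_saturated=6):
    """Flag non-saturated ('proper') patches and decide whether calibration may proceed.

    patches -- list of (H, W, 3) arrays, one per MBCCC patch (24 in total).
    Returns (per-patch 'proper' flags, True if fewer than `max_saturated` patches are saturated).
    """
    proper = [not saturated_channels(p, delta).any() for p in patches]
    n_saturated = len(proper) - sum(proper)
    return proper, n_saturated < max_saturated
```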
Experiment 2

The reproducibility of the chart detection during automatic calibration is presented using a probability distribution of dE*ab errors between two measurements of the same ROI. Ideally this should be as close to zero as possible and comparable to the measurements of the same ROI depicted in the presented figures. The rotated versions of an image should all give equal measurements; any deviation indicates variability in the chart detection, leading to a slightly different calibration and thus to different measurements (Figure 14).

Figure 13 Reproducibility: box plot for the proper patches of the normal images.
Figure 14 Probability distribution of dE*ab errors with region of interest calibration.

Discussion

The research presented here provides evidence that images taken with commercially available digital cameras can be calibrated independently of camera settings and illumination features, provided that the illumination in the field of view is uniform and a calibration chart is used. This may be particularly useful during chronic wound assessment, as this is often performed in different locations and under variable lighting conditions. The proposed calibration transforms the acquired images from an unknown colour space (usually RGB) to a standard, well-defined colour space (sRGB) that allows images to be displayed properly and has a known relationship to the CIE colorimetric colour spaces. First, we challenged the calibration procedure with a large collection of images containing both 'normal' images with proper camera settings and images that were purposely over- or underexposed and/or had white balance mismatches. The reproducibility and accuracy of the calibration procedure are presented and demonstrate marked improvements. The calibration procedure works very well on the images with improper camera settings, as evidenced by the minimal differences between the error distributions of the complete set of images and the set with only the 'normal' images. An innovative feature demonstrated during our research is the automatic detection and calibration of the MacBeth Colour Checker Chart Mini (MBCCC) in the digital image. Secondly, we tested the effect of this MBCCC chart detection on subsequent real-world colour measurements. Figure 14 shows the probability distribution of errors between two colour measurements of the same region of interest that can be attributed to variations in the chart detection process. The majority of these errors were below 1 dE*ab, demonstrating that the chart detection is robust.

This experiment is part of the research presented by the Woundontology Consortium, a semi-open, international, virtual community of practice devoted to advancing the field of non-invasive wound assessment by image analysis, ontology, semantic interpretation and knowledge extraction (http://www.woundontology.com). The interests of this consortium relate to the establishment of a community-driven, semantic content analysis platform for digital wound imaging, with special focus on wound bed surface area and colour measurements in clinical settings. Current research by the Woundontology Consortium addresses our concerns about the interpretation of clinical wound images acquired without any calibration or reference procedure.
Therefore we are investigating techniques to promote standardization. The platform used by this Consortium is based on wiki technology, a collaborative environment used to develop a 'woundontology' with the Collaborative Ontology Development Service (CODS) and an image server. Research on wound bed texture analysis is performed with the computer program 'MaZda'. This application has been under development since 1998 to satisfy the needs of the participants of the COST B11 European project 'Quantitative Analysis of Magnetic Resonance Image Texture' (1998-2002). Additionally, wound bed texture parameter data mining is performed using 'RapidMiner', one of the leading open-source data mining solutions. Recently, results on 'The Red-Yellow-Black (R-Y-B) system: a colorimetric analysis of convex hulls in the CIELAB color space' were presented at the EWMA 2009 conference in Helsinki, Finland.

Conclusions

To our knowledge, the proposed technology is the first demonstration of a fundamental and, in our opinion, essential tool for enabling intra-individual (in different phases of wound healing) and inter-individual (for features and properties) comparisons of digital images in human wound healing. By implementing this step in the assessment, we believe that scientific standards for research in this domain will be improved [37].

Acknowledgements

Part of this work was performed at the Liebaert Company (a manufacturer of technical textiles in Deinze, Belgium). We thank the members of the Colour Assessment Cabinet, especially Albert Van Poucke, for the fruitful discussions. We would also like to thank the reviewers who helped to improve this paper with their suggestions.

Author details

1Department of Anaesthesia, Critical Care, Emergency Care, Genk, Belgium. 2Department of Dermatology, University Ghent, Ghent, Belgium. 3Department of Anesthesiology, Pain and Palliative Medicine, The Radboud University Nijmegen Medical Centre, Nijmegen, Netherlands. 4CNS, Pain and Neurology, Janssen Research Foundation, Beerse, Belgium. 5Critical Care Department, Antwerp University Hospital, University of Antwerp, Antwerp, Belgium.

Authors' contributions

SVP designed the method and drafted the manuscript. SVP and YV generated the data and gave recommendations for their evaluation. PJ, TM and KV provided feedback and directions on the results. All authors read and approved the final manuscript.

Competing interests

The authors declare that they have no competing interests.

Received: 12 March 2009 Accepted: 18 March 2010 Published: 18 March 2010

References

1. Haeghen Vander Y, Naeyaert JM: Consistent cutaneous imaging with commercial digital cameras. Arch Dermatol 2006, 142(1):42-46.
2. Van Geel N, Haeghen Vander Y, Ongenae K, Naeyaert JM: A new digital image analysis system useful for surface assessment of vitiligo lesions in transplantation studies. Eur J Dermatol 2004, 14(3):150-155.
3. Kanthraj GR: Classification and design of teledermatology practice: What dermatoses? Which technology to apply? J Eur Acad Dermatol Venereol 2009, 23(8):865-875.
4. Aspres N, Egerton IB, Lim AC, Shumack SP: Imaging the skin. Australas J Dermatol 2003, 44(1):19-27.
5. Bhatia AC: The clinical image: archiving clinical processes and an entire specialty. Arch Dermatol 2006, 142(1):96-98.
6. Hess CT: The art of skin and wound care documentation. Adv Skin Wound Care 2005, 18:43-53.
7. Bon FX, Briand E, Guichard S, Couturaud B, Revol M, Servant JM, Dubertret L: Quantitative and kinetic evolution of wound healing through image analysis. IEEE Trans Med Imaging 2000, 19(7):767-772.
8. Jury CS, Lucke TW: The clinical photography of Herbert Brown: a perspective on early 20th century dermatology. Clin Exp Dermatol 2001, 26(5):449-454.
9. Phillips K: Incorporating digital photography into your wound-care practice. Wound Care Canada 2006, 16-18.
10. Oduncu H, Hoppe A, Clark M, Williams RJ, Harding KG: Analysis of skin wound images using digital colour image processing: a preliminary communication. Int J Low Extrem Wounds 2004, 3(3):151-156.
11. Levy JL, Trelles MA, Levy A, Besson R: Photography in dermatology: comparison between slides and digital imaging. J Cosmet Dermatol 2003, 2:131-134.
12. Tucker WFG, Lewis FM: Digital imaging: a diagnostic screening tool? Int J Dermatol 2005, 44(6):479-481.
13. Wagner JH, Miskelly GM: Background correction in forensic photography. II. Photography of blood under conditions of non-uniform illumination or variable substrate color - practical aspects and limitations. J Forensic Sci 2003, 48(3):604-613.
14. Wagner JH, Miskelly GM: Background correction in forensic photography. I. Photography of blood under conditions of non-uniform illumination or variable substrate color - theoretical aspects and proof of concept. J Forensic Sci 2003, 48(3):593-603.
15. Riley RS, Ben-Ezra JM, Massey D, Slyter RL, Romagnoli G: Digital photography: a primer for pathologists. J Clin Lab Anal 2004, 18:91-128.
16. Palioto DB, Sato S, Ritman G, Mota LF, Caffesse RG: Computer assisted image analysis methods for evaluation of periodontal wound healing. Braz Dent J 2001, 12:167-172.
17. Heydecke G, Schnitzer S, Türp JC: The colour of human gingiva and mucosa: visual measurement and description of distribution. Clin Oral Invest 2005, 9:257-265.
18. Scheinfeld N: Photographic images, digital imaging, dermatology, and the law. Arch Dermatol 2004, 140(4):473-476.
19. Gopalakrishnan D: Colour analysis of the human airway wall. Master's thesis, University of Iowa 2003.
20. Lotto RB, Purves D: The empirical basis of colour perception. Conscious Cogn 2002, 11:609-629.
21. Prasad S, Roy B: Digital photography in medicine. J Postgrad Med 2003, 49(4):332-336.
22. Maglogiannis I, Kosmopoulos DI: A system for the acquisition of reproducible digital skin lesions images. Technol Health Care 2003, 11(6):425-441.
23. Haeghen Vander Y: Development of a dermatological workstation with calibrated acquisition and management of colour images for the follow-up of patients with an increased risk of skin cancer. PhD thesis, University Ghent 2001.
24. Gilmore S: Modelling skin disease: lessons from the worlds of mathematics, physics and computer science. Australas J Dermatol 2005, 46(2):61-69.
25. Goldberg DJ: Digital photography, confidentiality, and teledermatology. Arch Dermatol 2004, 140(4):477-478.
26. Macaire L, Postaire JG: Colour image segmentation by analysis of subset connectedness and colour homogeneity properties. Comput Vis Image Underst 2006, 102:105-116.
27. Streiner DL: Precision and accuracy: two terms that are neither. J Clin Epidemiol 2006, 59:327-330.
28. Feit J, Ulman V, Kempf W, Jedlickov H: Acquiring images with very high resolution using a composing method. Cesk Patol 2004, 40(2):78-82.
29. Byrne A, Hilbert DR: Colour realism and colour science. Behav Brain Sci 2006, 26:3-64.
30. Harkness N: The colour wheels of art, perception, science and physiology. Opt Laser Technol 2006, 38:219-229.
31. Multimedia systems and equipment - Colour measurement and management - Part 2-1: Colour management - Default RGB colour space - sRGB. 1999, IEC 61966-2-1 Ed. 1.0 Bilingual.
32. Haeghen Vander Y, Naeyaert JM, Lemahieu I, Philips W: An imaging system with calibrated colour image acquisition for use in dermatology. IEEE Trans Med Imaging 2000, 19(7):722-730.
33. Ikeda I, Urushihara K, Ono T: A pitfall in clinical photography: the appearance of skin lesions depends upon the illumination device. Arch Dermatol Res 2003, 294:438-443.
34. Leon K, Mery D, Pedrischi F, Leon J: Colour measurement in L*a*b* units from RGB digital images. Food Res Int 2006, 39:1084-1091.
35. Danilova MV, Mollon JD: The comparison of spatially separated colours. Vision Res 2006, 46(6-7):823-836.
36. Johnson GM: A top down description of S-CIELAB and CIEDE2000. Col Res Appl 2003, 28:425-435.
37. Bellomo R, Bagshaw SM: Evidence-based medicine: classifying the evidence from clinical trials - the need to consider other dimensions. Crit Care 2006, 10(5):232.

Pre-publication history

The pre-publication history for this paper can be accessed here: http://www.biomedcentral.com/1471-2342/10/7/prepub

doi:10.1186/1471-2342-10-7
Cite this article as: Van Poucke et al.: Automatic colorimetric calibration of human wounds. BMC Medical Imaging 2010, 10:7.