Work Package 2
Smartphone application
Design of a smartphone application to assess variations of an ulcer on the plantar foot surface over time (A2).
Background and previous work of our group: Ulcer and wound therapeutic follow-up involves periodic visual examination to monitor the evolution of the wound's shape (area and depth) and appearance (color of the tissues) in order to adapt the treatment. Wound care is entrusted to nurses, as assessment is performed when removing the dressing and cleaning the wound. For many years it relied on manual practice. For shape measurement we can mention the simple ruler for length or radius (1D), wound outline tracing on a transparent sheet to compute area (2D), and volume estimation from an alginate cast or serum injection (3D). Concerning the nature of the tissues, their proportions were reported on a small color-coded scale (granulation, fibrin, necrosis and epithelium, coded red, yellow, black and pink respectively) [GOT-03].
Ulcer imaging became common practice with the spread of low-cost digital systems. Logically, many pioneering works focused on 2D analysis, as wound area can easily be obtained after contour following, and a complete segmentation process can provide regions labelled with the different tissues after a classification step [ODU-04] [KOL-04] [ZHE-04]. Unfortunately, several drawbacks limit the interest of a purely 2D approach: wounds are not planar and perspective effects degrade area estimation; color constancy is not ensured due to changes in lighting and sensor spectral response. In other works, the problem of 3D wound reconstruction to obtain wound depth and volume was tackled by two techniques, namely passive versus active vision: in the first case, several images are combined in stereo photogrammetry prototypes to obtain 3D points after matching homologous points across the images [PLA-98] [MAL-02]; in the second case, laser or white-light patterns such as dots or lines are projected onto the wound and 3D data is obtained over these projections by triangulation [KRO-02] [CAL-03] [MAL-04]. Limitations remained: these devices need tedious calibration and are complex, cumbersome and, of course, expensive. As wounds have a high prevalence in most hospital departments, many compact and low-cost systems would be needed.
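To make the 2D measurement principle concrete, the following is a minimal sketch, not taken from the cited works, of how wound area can be estimated from a binary segmentation mask once a scale factor (e.g., millimeters per pixel recovered from a ruler or reference pattern in the image) is known; the function name, mask and scale value are illustrative assumptions. It also makes plain the limitation noted above: a single planar mask ignores perspective and the non-planarity of the wound.

```python
import numpy as np

def wound_area_cm2(mask: np.ndarray, mm_per_pixel: float) -> float:
    """Estimate wound area from a binary segmentation mask.

    mask         : 2D array, non-zero where the wound was segmented
    mm_per_pixel : scale factor, e.g. recovered from a ruler or reference
                   pattern placed next to the wound
    """
    wound_pixels = np.count_nonzero(mask)        # number of wound pixels
    area_mm2 = wound_pixels * mm_per_pixel ** 2  # planar approximation only
    return area_mm2 / 100.0                      # 1 cm^2 = 100 mm^2

# Illustrative usage: a 500x500 mask with a 200x200 wound region,
# imaged at an assumed 0.2 mm per pixel.
mask = np.zeros((500, 500), dtype=np.uint8)
mask[150:350, 150:350] = 1
print(wound_area_cm2(mask, mm_per_pixel=0.2))  # -> 16.0 cm^2
```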
As can be observed, no work investigated a unified approach dealing with both wound shape and appearance measurements, the two inseparable components of a complete wound assessment. We therefore proposed, at the University of Orleans, to use a simple digital camera to provide at low cost a 3D model of the wound labelled with the different tissues (Figure 6), much like a geographic relief map [TRE-09] [WAN-11]. The drawbacks mentioned above were removed: auto-calibration avoided a tedious calibration step [ALB-05], and a reference pattern placed near the wound provided both color correction and the scale factor [WAN-10] [WAN-12] (a minimal sketch of such a color correction step is given below, after Figure 6).

Figure 6: Color and 3D analysis of a wound.
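The sketch below illustrates the color-correction idea under the assumption of a linear camera response: a 3×3 matrix is fitted by least squares so that the measured RGB values of the reference-pattern patches map to their known values, and the matrix is then applied to the whole image. The patch values, function names and the purely linear model are illustrative assumptions, not the exact procedure of [WAN-10] [WAN-12].

```python
import numpy as np

def fit_color_correction(measured: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Fit a 3x3 matrix M (least squares) such that measured @ M ~= reference.

    measured  : (N, 3) RGB values of the pattern patches as seen by the camera
    reference : (N, 3) known RGB values of the same patches
    """
    M, *_ = np.linalg.lstsq(measured, reference, rcond=None)
    return M

def correct_image(image: np.ndarray, M: np.ndarray) -> np.ndarray:
    """Apply the fitted correction to an (H, W, 3) color image."""
    h, w, _ = image.shape
    flat = image.reshape(-1, 3).astype(np.float64) @ M
    return np.clip(flat, 0, 255).reshape(h, w, 3).astype(np.uint8)

# Illustrative usage with a hypothetical 4-patch reference pattern.
measured = np.array([[200,  60,  55],
                     [ 70, 180,  80],
                     [ 60,  70, 190],
                     [220, 220, 210]], dtype=float)
reference = np.array([[255,   0,   0],
                      [  0, 255,   0],
                      [  0,   0, 255],
                      [255, 255, 255]], dtype=float)
M = fit_color_correction(measured, reference)
```

The known physical size of the same pattern can likewise provide the millimeters-per-pixel scale factor used for the area measurements above.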
With recent technological advances, digital cameras tend to be replaced by smartphones for image capture. As these devices now offer powerful embedded processing and built-in data transmission capabilities, they have become ideal tools for wound imaging [SPR-11] [FOL-13b] [HET-13]. Most remarkably, creating hardware add-ons for smartphones is currently one of the most fertile areas of technology, so new imaging modalities will rapidly expand the power of existing mobile phones:
- Thermal imaging: we have already experimented with this modality [VIL-14].
- 3D scanning: several low-cost 3D scanners have become popular, especially with the growth of the 3D printing market. Currently they are stand-alone devices, but some are interfaced with portable tools such as Android or iOS tablets (iSense from 3D Systems, Structure from Occipital), and add-ons for smartphones are coming soon, as development kits are already available (Google Project Tango, LazeeEye from Heuristics Labs). We also observe the emergence of affordable time-of-flight cameras, which have compared favorably with manual techniques for wound volume measurement [PUT-14].
- Multispectral or even hyperspectral imaging systems, first developed for remote sensing applications, are now spreading into industry to inspect manufactured parts or food. In the medical field, their introduction is recent: we have already experimented with them on wounds (Figure 6) and other tissues (Figure 7) and concluded that diagnosis can be strongly improved by spectral discrimination [NOU-13] [NOU-14] (a simplified sketch of this principle follows the list).
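As an illustration of what spectral discrimination can look like at the pixel level, here is a minimal nearest-centroid sketch over band vectors; it is a deliberately simplified stand-in, not the classification method of [NOU-13] [NOU-14], and all names, band counts and spectra are hypothetical.

```python
import numpy as np

def classify_spectra(cube: np.ndarray, centroids: dict) -> np.ndarray:
    """Label each pixel of a spectral cube with its nearest tissue centroid.

    cube      : (H, W, B) reflectance cube with B spectral bands
    centroids : {class_name: (B,) mean reflectance spectrum}, e.g. learned
                from annotated training pixels
    Returns an (H, W) array of class indices, in the order of the dict keys.
    """
    h, w, b = cube.shape
    pixels = cube.reshape(-1, b)                            # (H*W, B)
    C = np.stack([centroids[name] for name in centroids])   # (K, B)
    # Euclidean distance of every pixel spectrum to every class centroid
    dist = np.linalg.norm(pixels[:, None, :] - C[None, :, :], axis=2)
    return dist.argmin(axis=1).reshape(h, w)

# Illustrative usage: a tiny 3-band cube and two hypothetical tissue spectra.
cube = np.random.rand(4, 4, 3)
centroids = {"granulation": np.array([0.8, 0.2, 0.2]),
             "fibrin":      np.array([0.8, 0.8, 0.3])}
labels = classify_spectra(cube, centroids)   # 0 = granulation, 1 = fibrin
```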
STANDUP project activities: Our opinion is that it is once more necessary to pursue a unified approach, involving a multimodal system, as the benefits of the different modalities will not merely add up but multiply, yielding accurate and robust DF ulcer assessment. Going further than previous multimodal attempts [CHE-05] [MIR-11] [BAR-13], we intend to combine and integrate our previously tested modalities (color imaging and 3D reconstruction) with emergent modalities (thermal imaging) now available on smartphones. With the increasing need for low-cost and widespread wound assessment devices, it is clear that most of this technology should be embedded in smartphones for ulcer monitoring, either in medical centers or directly at home.
Another requirement will be ulcer monitoring over time. In all the works presented here, ulcer assessment has always been limited to a static evaluation: knowledge of ulcer history, including its geometric evolution and the temporal changes of the tissues, has never been taken into account to fine-tune the diagnosis.
The main goal of WP2 is the development of a second smartphone application, A2, during the STANDUP project. Its implementation will bring strong technical novelties compared to ongoing research in the domain:
- Color and thermal scanning using a smartphone.
- Ulcer multispectral segmentation and classification.
- 3D diabetic foot scanning and analysis.
- Development of monitoring tools to describe, synthesize and diagnose ulcer evolution from one visit to another (see the sketch after this list).
- Adaptation of these algorithms to smartphones running Android and iOS.
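One possible shape for such monitoring tools is sketched below, under the assumption that each visit yields a per-pixel tissue-labelled map of the ulcer at a known scale; the label codes, function names and reported metrics are illustrative assumptions, not the project's final design.

```python
import numpy as np

# Illustrative tissue label codes for a per-pixel tissue map (0 = background)
TISSUES = {1: "granulation", 2: "fibrin", 3: "necrosis"}

def tissue_proportions(label_map: np.ndarray) -> dict:
    """Fraction of the wound surface covered by each tissue type."""
    total = max(np.count_nonzero(label_map > 0), 1)
    return {name: np.count_nonzero(label_map == code) / total
            for code, name in TISSUES.items()}

def visit_to_visit_report(prev: np.ndarray, curr: np.ndarray,
                          mm_per_pixel: float) -> dict:
    """Summarize ulcer evolution between two visits from tissue-labelled maps."""
    def area_cm2(m):
        return np.count_nonzero(m > 0) * mm_per_pixel ** 2 / 100.0
    prev_area, curr_area = area_cm2(prev), area_cm2(curr)
    prev_p, curr_p = tissue_proportions(prev), tissue_proportions(curr)
    return {
        "area_cm2": curr_area,
        "area_change_pct": 100.0 * (curr_area - prev_area) / max(prev_area, 1e-6),
        "tissue_change": {t: curr_p[t] - prev_p[t] for t in TISSUES.values()},
    }

# Illustrative usage with two synthetic visits (shrinking ulcer).
prev = np.zeros((100, 100), dtype=np.uint8); prev[20:80, 20:80] = 1
curr = np.zeros((100, 100), dtype=np.uint8); curr[30:70, 30:70] = 2
print(visit_to_visit_report(prev, curr, mm_per_pixel=0.5))
```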
The first part of the work (color and thermal analysis) will be done by a PhD student at the University of Agadir who will start in October 2017. His thesis will be co-directed by the University of Agadir and the University of Orleans. He will spend 70% of his time in Agadir and the rest in Orleans, supervised by experienced researchers from Agadir and Orleans. He will be funded by the Moroccan government and will receive support from the STANDUP RISE project during his mobility periods in Orléans. The second part of the work (3D analysis and smartphone applications) will be done by a second PhD student under the same conditions. We have identified a good PhD candidate for the second task and are interviewing candidates for the first part of the work.