The Allen Coral Atlas is built by a dedicated team of scientists, technologists, and conservationists, using one-of-a-kind methodologies. Led and funded by Vulcan Inc., the partnership also includes Arizona State University’s Center for Global Discovery and Conservation Science, the National Geographic Society, Planet, and the University of Queensland, who together identified the following methods for coral reef habitat map creation and dissemination.
It all starts with 3.7m satellite imagery. With this data in hand, distorting “noise” in the imagery is taken out. Algorithms then correct for the effects of the atmosphere and sun glint on the surface of the water, and then depth is calculated. Next, field data is used to calibrate rule sets before generating a habitat map with machine learning. The resulting map is then displayed on the Allen Coral Atlas website, and the Atlas Engagement team works with scientists, academics, policymakers, and protected area managers to facilitate use.
The following are illustrations and documentation about these respective methods.
A high quality global coral reef mosaic from Planet’s PlanetScope satellite imagery is the starting point for creating the Atlas maps. PlanetScope (Dove) imagery exhibits the following technical specifications:
Imagery captured by the PlanetScope constellation undergoes a number of processing steps depending on the product delivered. The following steps are taken to transform PlanetScope imagery for use in the Atlas:
The area to be used as the basis for a new global coral map, based on Planet satellite image data, includes coral reefs shallower than 20m deep, between 30° north and 30° south latitude, in clear water (that is, water without high turbidity) and listed as a coral reef in the United Nations Environment Programme (UNEP) 2010 Coral Layer.
To create a global image collection mask for coral reefs for the purpose of tasking acquisition of Planet Dove image data, the following buffer approach is applied:
Additionally, PlanetScope imagery goes through a mosaicking process. Planet uses “best scene on top” (BOT) techniques for mosaicking PlanetScope imagery. This approach differs from the best-pixel method traditionally used in scientific research projects by stamping the entire scene into the mosaic instead of select pixels.
Satellite data are made available to the science team in calibrated at-sensor radiance units (W sr-1 m-2 nm-1) as spatially contiguous orthorectified mosaics. These data require extensive processing using GDCS algorithms to generate at-surface, sub-surface, and benthic reflectance data from the Planet radiance imagery. Reaching these three levels of processed data requires modeling of the radiometry of each Planet satellite (Dove, SkySat) used in generating coral reef coverage worldwide. Additionally, the following corrections need to be applied to Planet data to support the UQ mapping component (geomorphic zonation and benthic composition) as well as the GDCS alert-monitoring component:
The corrections for atmospheric effects and water column attenuation yield the benthic reflectance (or bottom reflectance). The derived benthic reflectance is used in coral reef classification and bleaching detection with improved accuracy. The method was developed for the four-band (blue [B], green [G], red [R], and near infrared [NIR]) Planet Dove satellite images, deriving the benthic reflectance with the assistance of depth data.
The ocean region is delineated from the corrected satellite images using the normalized difference water index (NDWI).
The following processing steps are then applied to the water-only region.
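As a sketch of this masking step (the common (G − NIR)/(G + NIR) form of NDWI and the zero threshold are assumptions, not confirmed Atlas parameters):

```python
# Sketch of NDWI-based water masking; the (G - NIR)/(G + NIR) form and the
# zero threshold are assumptions, not confirmed Atlas parameters.
import numpy as np

def ndwi_water_mask(green, nir, threshold=0.0):
    green = np.asarray(green, dtype=float)
    nir = np.asarray(nir, dtype=float)
    ndwi = (green - nir) / (green + nir + 1e-12)  # guard against 0/0
    return ndwi, ndwi > threshold  # water pixels score positive

# Water pixel (green-bright, NIR-dark) vs land pixel (NIR-bright)
ndwi, water = ndwi_water_mask([0.08, 0.02], [0.01, 0.30])
```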
Sun glint (the water surface effect) was removed from the study regions using equation 1.
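Based on the NIR-subtraction approach described later in this section and the variable definitions that follow, equation 1 presumably takes the form:

```latex
R_{rs,0+}(\lambda) = R_{rs}(\lambda) - R_{rs}(\mathrm{NIR}), \qquad \lambda \in \{B, G, R\} \qquad (1)
```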
where Rrs,0+ is the remote sensing reflectance just above the water surface in the blue, green and red bands, Rrs is the water-leaving reflectance (R, G and B), and Rrs(NIR) is the reflectance in the NIR band. After the sun glint correction, the below-surface reflectance is derived.
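A standard form for this step, from Lee et al. (2002) in the references below, and presumably what is applied here, is:

```latex
r_{rs}(\lambda) = \frac{R_{rs,0+}(\lambda)}{0.52 + 1.7\, R_{rs,0+}(\lambda)}
```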
A band-ratio algorithm is applied to derive depth from the B, G, and R bands of the Dove images.
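A common band-ratio form (after Stumpf et al. 2003), using the tunable constants m0 and m1 discussed below, is:

```latex
H = m_0 \, \frac{\ln\!\left(n\, r_{rs}(B)\right)}{\ln\!\left(n\, r_{rs}(G)\right)} - m_1
```

where n is a fixed constant chosen to keep both logarithms positive; the exact band pairing used for Dove imagery is an assumption here.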
The tunable constants (m0 and m1) are calibrated for the study sites according to the water column attenuation conditions. This research was supported by The Nature Conservancy.
For validation of the water depth product, field-measured water depths are compared with coincident locations on the map product to calculate regression values. Field-measured depth is sourced from previously collected data from existing programs.
In optically shallow waters, the water-leaving reflectance is made up of contributions from both the water body and the bottom sediments, so the below-surface remote sensing reflectance rrs is modeled as the sum of a water column term and a bottom term.
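Following the shallow-water model of Lee et al. (1999) and the symbols defined below, the two contributions presumably take the form:

```latex
r_{rs} = r_{rs}^{C} + r_{rs}^{B}, \qquad
r_{rs}^{C} = r_{rs}^{dp}\left(1 - e^{-D_c (a_t + b_b) H}\right), \qquad
r_{rs}^{B} = \frac{B}{\pi}\, e^{-D_b (a_t + b_b) H}
```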
where rcrs represents the water column contribution, rbrs represents the bottom contribution at the below-water surface, H is the estimated depth, and B is the bottom reflectance to be derived. D(at+bb) represents the light attenuation caused by water column absorption and backscattering, for the water column light components (Dc) or the light components from the bottom (Db).
Finally, Dc and Db are empirical factors associated with underwater photon path elongation due to scattering and are calculated as below:
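In Lee et al. (1999) these factors are given as:

```latex
D_c = 1.03\,(1 + 2.4u)^{0.5}, \qquad D_b = 1.04\,(1 + 5.4u)^{0.5}, \qquad u = \frac{b_b}{a_t + b_b}
```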
rrsdp represents the below-surface remote sensing reflectance when the water is infinitely deep and is modeled as:
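Following Lee et al. (2002):

```latex
r_{rs}^{dp} = (0.084 + 0.17u)\,u
```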
The water inherent optical properties (IOPs) are then modeled as contributions from the different components of the water:
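Consistent with the component list below, the total absorption and backscattering are presumably decomposed as:

```latex
a_t(\lambda) = a_w(\lambda) + a_{cdom}(\lambda) + a_p(\lambda), \qquad
b_b(\lambda) = b_{bw}(\lambda) + b_{bp}(\lambda)
```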
The water IOPs contain the contribution of pure water ( aw(λ ), bbw(λ) ), CDOM ( acdom(λ ) ) and particles ( ap (λ) , bbp(λ) ). Then the bottom reflectance can be derived.
Diagrammatic workflow from Dove reflectance to derive the depth and bottom reflectance. The different components are illustrated below, following the methodology sections above. The normalized difference water index (NDWI) is applied to mark water and land regions.
For the water regions, the sun glint (or water surface effect) is removed by subtracting the NIR band. The water-leaving signals ( Rrs ) are then derived. Next, the subsurface remote sensing reflectance is calculated to remove the sea–air interface effect. Finally, the water column attenuation correction is applied.
Gao, Bo-Cai. 1996. “NDWI—A Normalized Difference Water Index for Remote Sensing of Vegetation Liquid Water from Space.” Remote Sensing of Environment 58 (3): 257–66.
Lee, ZhongPing, Kendall L. Carder, and Robert A. Arnone. 2002. “Deriving Inherent Optical Properties from Water Color: A Multiband Quasi-Analytical Algorithm for Optically Deep Waters.” Applied Optics 41 (27): 5755–72.
Lee, Zhongping, Kendall L. Carder, Curtis D. Mobley, Robert G. Steward, and Jennifer S. Patch. 1999. “Hyperspectral Remote Sensing for Shallow Waters. 2. Deriving Bottom Depths and Water Properties by Optimization.” Applied Optics 38 (18): 3831–43.
Lee, Zhongping, Alan Weidemann, and Robert Arnone. 2013. “Combined Effect of Reduced Band Number and Increased Bandwidth on Shallow Water Remote Sensing: The Case of Worldview 2.” IEEE Transactions on Geoscience and Remote Sensing 51 (5): 2577–86.
Li, Jiwei, Qian Yu, Yong Q. Tian, and Brian L. Becker. 2017. “Remote Sensing Estimation of Colored Dissolved Organic Matter (CDOM) in Optically Shallow Waters.” ISPRS Journal of Photogrammetry and Remote Sensing 128: 98–110.
Wabnitz, Colette C., Serge Andréfouët, Damaris Torres-Pulliza, Frank E. Müller-Karger, and Philip A. Kramer. 2008. “Regional-Scale Seagrass Habitat Mapping in the Wider Caribbean Region Using Landsat Sensors: Applications to Conservation and Ecology.” Remote Sensing of Environment 112 (8): 3455–67.
To map the geomorphic and benthic zones of global coral reefs, we use Planet Dove image data; water depth derived from Dove, Sentinel-2 and Landsat satellite imagery; modeled waves; and surrogates for texture and slope, through machine learning random forest classification followed by an object-based cleaning approach using eco-geomorphological rules.
Three levels of classification, corresponding to three of the four scales of coral reef environments (Figure 1) and each mapped over a set range of depths (Figure 2), are used, with the global mapping focusing on geomorphic zonation and benthic composition:
Reef versus non-reef – outline of global shallow reef extent
Reef type – classification of individual reefs into types
Reef geomorphic zonation (0-20 m depth) – classification within individual reefs into zones
Reef benthic composition (0-10 m) – classification within reef zones into habitats
Figure 1: Minimum mapping units, spatial scale and levels of mapping detail to be used for coral reefs
Figure 2: Allen Coral Atlas mapped classes. Coloured boxes represent map classes used in the hierarchical classification scheme applied in the project.
The Allen Coral Atlas classification scheme outlined in Figure 1 and Figure 2 forms the backbone of the Allen Coral Atlas project (Kennedy, E. et al. In prep). Classes were designed with users in mind to reflect ecologically, geologically and socially meaningful features of reefs, while being constrained by the mapping capability of the available input data and the approach followed; for more detail see (Kennedy, E. et al. In prep). Developing the mapping classes required a sensitive trade-off between the level of detail users need, the input resources available and the quality of globally repeatable mapping methods. The classes also drew on recent regional- to global-scale coral reef mapping projects (e.g. NOAA maps), and past global projects as well as current NOAA and UNEP global coral data sets were reviewed.
See the following links for a short description of mapping classes and a detailed background of mapping classes.
The scale of the project, the computational power needed and regional inconsistencies in coral reef structure and size meant that a region-by-region approach to mapping the world's reefs was adopted (Figure 3). Mapping regions include reef areas that fall between 30° north and 30° south latitude, in clear shallow waters (less than approximately 20 m deep) amenable to satellite-based information extraction. Regions were derived from a combination of existing global bioregionalisations that account for coral biogeography (Veron, Stafford-Smith et al. 2016) and oceanography (Spalding, Fox et al. 2007), modified based on visual assessment of the reef areas observed on the Dove satellite image mosaic, the reef types, water depth and water quality, resulting in 30 regions.
Figure 3: The Allen Coral Atlas mapping methodology relies on a region-by-region approach to mapping the world's reefs. The target mapping regions (pictured) reflect established patterns of coral reef biogeography and do not consider geo-political boundaries.
The mapping region outline was then combined with a global reef mask to create a mapping-region-specific mask. The global reef mask is created to reduce the area for which Dove imagery has to be acquired and to manage the computational effort. The mask includes areas: within 30° north and south latitude; shallower than 20 m, by creating a buffer around the approximately 20 m depth contour sourced from an existing coarse-level global depth layer; excluding land, using a global land outline; and having a known presence of reefs, using the UNEP-WCMC Coral Reef Layer 2018 (UNEP-WCMC, WorldFish Centre et al. 2018). The global reef mask was finalized through visual contextual editing using a global Landsat mosaic.
The current geomorphic and benthic mapping approach combines machine learning with Object-Based Analysis (OBA) (Lyons, Roelfsema et al. 2020) (Figure 5). It includes four modules: 1) Data Preparation, 2) Machine Learning Classification, 3) OBA Clean-up and 4) Accuracy Assessment.
Figure 3: Flow chart detailing the four key modules of the coral reef mapping framework, including the data input types, processing steps and output products (Lyons, Roelfsema et al. 2020).
In Module 1 the input data sets and reference samples are prepared for Module 2, the machine learning component.
The input data sets include: the Planet satellite image mosaic, and physical attributes including water depth from Sentinel-2, Landsat 8 and Planet Dove data (Li, Knapp et al. 2019), exposure to waves (Harris et al. in prep) and turbidity (Li et al. in prep). See the end of this page for more detail on waves.
The satellite image and physical attribute data are segmented into ‘objects’ following an OBA paradigm (Lyons, Roelfsema et al. 2020), and the segmented information is added to the ‘data stack’ so that both local and neighborhood information can inform the mapping process. The OBA paradigm requires that the image and other spatial data first be segmented into groups of pixels with similar characteristics (e.g. colour or texture, or a physical property such as water depth). This is akin to how we use our eyes to segment images into objects for interpretation. Each image ‘object’ is given a set of metrics based on its constituent pixels – in this case we use the mean values for each of the metrics in the pixel-based data stack.
Reference Samples: Training and validation data (point-based) are used to train the machine learning classifier and validate the resulting maps. These points are drawn from a large pool of reference samples and then randomly split into training and validation sets. Reference samples (objects) are created by assigning mapping classes to a subset of objects for small representative groups of reefs within a mapping region. Assigning labels to reference samples is currently based on manual labeling of segments by a trained expert, who visually reviews a combination of: the classification descriptions, including class typology; satellite image colour and texture; depth and waves; existing coral reef habitat maps; existing or new field data; and expert knowledge. The existing and newly collected field data and maps are provided through the verification component of the Allen Coral Atlas.
Using a training data set, a machine learning Random Forest classifier is used to make a preliminary classification of geomorphic and benthic classes. Based on an established framework for OBA on the Great Barrier Reef (Roelfsema, Kovacs et al. 2018, Roelfsema, Kovacs et al. 2020), the preliminary classification is then refined using the contextual principles of OBA.
The Random Forest classifier is a well-established machine learning algorithm used to create maps from the data. It is an ‘ensemble learning’ method, which means it assesses the input data using a number of constructed ‘decision trees’ that each have a component of random variation in their parameterisation. The final decision takes into account all of the possibilities across the different decision trees. Random forests are particularly useful because they balance predictive performance against overfitting, and are also robust to redundant predictor variables (James, Witten et al. 2013). The classifier is trained using the curated training data set, with input variables (e.g. mean band values, depth, waves) that include both the pixel and segmented-object values from the image and physical environment data.
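A minimal sketch of this step using scikit-learn follows; the feature set, label scheme and parameters here are hypothetical, not the Atlas configuration:

```python
# Illustrative sketch only (scikit-learn); not the Atlas implementation.
# Feature names, label scheme and parameters here are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Per-object features: [mean_blue, mean_green, depth_m, wave_exposure]
X_train = rng.random((200, 4))
# Geomorphic labels: 0 = Reef Crest, 1 = Outer Reef Flat, 2 = Lagoon
y_train = rng.integers(0, 3, 200)

# An ensemble of randomized decision trees; the forest vote balances
# predictive performance against overfitting.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Preliminary per-object classification for unseen objects
preds = clf.predict(rng.random((10, 4)))
```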
The output classification from the machine learning classifier is then processed using a number of automated OBA membership rules. Membership rules form the typology of a mapping class, defined by different attributes or spatial relationships. These attributes are typically based on contextual information: where a class is, its attributes and what its neighboring classes are. For example, a generic rule could be: if a Class A object (e.g. Reef Crest) is small and surrounded by Class B (e.g. Outer Reef Flat), then the Class A object should be reclassified to Class B.
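The generic rule above can be sketched as follows; the object representation and the area threshold are hypothetical:

```python
# Minimal sketch of one contextual OBA membership rule; the object
# representation and the area threshold are hypothetical.
def apply_surrounded_rule(objects, small_area=100.0,
                          class_a="Reef Crest", class_b="Outer Reef Flat"):
    """objects: dicts with 'class', 'area' (m^2) and 'neighbours' (class names)."""
    for obj in objects:
        # Rule: a small Class A object surrounded entirely by Class B
        # is reclassified to Class B.
        if (obj["class"] == class_a
                and obj["area"] < small_area
                and obj["neighbours"]
                and all(n == class_b for n in obj["neighbours"])):
            obj["class"] = class_b
    return objects

segments = [
    {"class": "Reef Crest", "area": 50.0, "neighbours": ["Outer Reef Flat"] * 3},
    {"class": "Reef Crest", "area": 500.0, "neighbours": ["Outer Reef Flat"]},
]
segments = apply_surrounded_rule(segments)
```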
Validation points were derived from the reference samples, and then compared with the mapped data through an error matrix. From the error matrix standard accuracy measures were derived that include overall map accuracy and the individual map category user and producer accuracy (Congalton and Green 2008).
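These standard measures (Congalton and Green 2008) can be sketched with a hypothetical two-class error matrix:

```python
# Sketch of deriving overall, user's and producer's accuracy from an
# error (confusion) matrix; the example matrix is hypothetical.
import numpy as np

def accuracy_measures(matrix):
    """matrix[i][j]: validation points of true class j mapped as class i."""
    m = np.asarray(matrix, dtype=float)
    overall = np.trace(m) / m.sum()
    users = np.diag(m) / m.sum(axis=1)      # per mapped class (rows)
    producers = np.diag(m) / m.sum(axis=0)  # per true class (columns)
    return overall, users, producers

# Hypothetical 2-class example (e.g. Coral/Algae vs Sand)
overall, users, producers = accuracy_measures([[40, 10],
                                               [5, 45]])
```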
The confidence level is a product of a combination of: the quality of the input image and depth data, represented through an artefact layer; confidence in the reference sampling process per mapping category; misclassification at reference sample locations; and the probability product from the machine learning classifier.
Wave exposure is the dominant force influencing the ecological makeup and physical structure of coral reefs. Changes in the benthic ecological community, as well as some crucial metabolic and biological functions of coral reefs, have been linked to variations in wave energy. Long-term geomorphic development of coral reefs is also driven by the relative exposure of coral reefs to wave processes. A thorough understanding of wave exposure is now an important component of benthic ecological surveying in coral reefs. Wave exposure on coral reefs has typically been determined using a suite of computationally onerous models, which limits wave modelling to a local or regional basis. To calculate the wave exposure for every reef in the world, a wave model is needed that is flexible, computationally fast and links with global wave models. This model uses principles of wave refraction and diffraction to determine the dissipation of wave energy from deep-water sources to shallow reef environments and through often complex coral reef regions. This provides the local wave height for every reef just prior to the wave break point, and hence the wave exposure index for each coral reef.
Datasets: National Oceanic and Atmospheric Administration (NOAA) WAVEWATCH III global wave model hindcast reanalysis (1979–present); Planet-derived bathymetry.
Reference samples are created in the Allen Coral Atlas workflow with the help of the commercial software Trimble eCognition 9.3. For the mapping regions, the machine learning and OBA refinement stages are implemented in Google Earth Engine, a free cloud-based processing environment that provides the capability to access and process the Planet Dove imagery, along with a range of other satellite image archives (e.g. Sentinel-2, Landsat). The image segmentation and OBA refinement workflow was not previously available in Google Earth Engine, so that software capability is a major output of the Allen Coral Atlas project (Lyons, Roelfsema et al. 2020).
For more information on mapping and monitoring through remote sensing from the University of Queensland, check out this Remote Sensing Toolkit
Congalton, R. G. and K. Green (2008). Assessing the accuracy of remotely sensed data: Principles and practices. Mapping Science. Boca Rotan FL, CRC Press.
James, G., D. Witten, T. Hastie and R. Tibshirani (2013). An introduction to statistical learning. New York, Springer.
Kennedy, E., R. Borrego, M. Roe, D. Yawonno, J. Wolf, N. Murray, M. Lyons, R. S. Phinn, P. Tudman and C. M. Roelfsema (In prep). "Coral Reef classificatory system: a key for interpreting global reef habitat maps derived from remote sensing." Scientific Data: Nature.
Kennedy, E., C. Roelfsema, M. Lyons, E. Kovacs, R. Borrego-Acevedo, M. Roe, S. Phinn, K. Larsen, N. Murray, D. Yawonno, J. Wolf and P. Tudman (2020). Reef Cover: a coral reef classification to guide global habitat mapping from remote sensing.
Li, J., D. E. Knapp, S. R. Schill, C. Roelfsema, S. Phinn, M. Silman, J. Mascaro and G. P. Asner (2019). "Adaptive bathymetry estimation for shallow coastal waters using Planet Dove satellites." Remote Sensing of Environment 232: 111302.
Lyons, M., C. Roelfsema, V. E. Kennedy, E. Kovacs, R. Borrego-Acevedo, K. Markey, M. Roe, D. Yuwono, D. Harris, S. Phinn, G. P. Asner, J. Li, D. Knapp, N. Fabina, K. Larsen, D. Traganos and N. Murray (2020). "Mapping the world's coral reefs using a global multiscale earth observation framework." Remote Sensing in Ecology and Conservation n/a(n/a).
Roelfsema, C., E. Kovacs, J. C. Ortiz, N. H. Wolff, D. Callaghan, M. Wettle, M. Ronan, S. M. Hamylton, P. J. Mumby and S. Phinn (2018). "Coral reef habitat mapping: A combination of object-based image analysis and ecological modelling." Remote Sensing of Environment 208: 27-41.
Roelfsema, C. M., E. M. Kovacs, J. C. Ortiz, D. P. Callaghan, K. Hock, M. Mongin, K. Johansen, P. J. Mumby, M. Wettle, M. Ronan, P. Lundgren, E. V. Kennedy and S. R. Phinn (2020). "Habitat maps to enhance monitoring and management of the Great Barrier Reef." Coral Reefs.
Spalding, M. D., H. E. Fox, G. R. Allen, N. Davidson, Z. A. Ferdaña, M. Finlayson, B. S. Halpern, M. A. Jorge, A. Lombana, S. A. Lourie, K. D. Martin, E. McManus, J. Molnar, C. A. Recchia and J. Robertson (2007). "Marine Ecoregions of the World: A Bioregionalization of Coastal and Shelf Areas." BioScience 57(7): 573-583.
UNEP-WCMC, WorldFish Centre, WRI and TNC (2018). Global distribution of warm-water coral reefs, compiled from multiple sources including the Millennium Coral Reef Mapping Project. Version 4.0. Cambridge (UK): UN Environment World Conservation Monitoring Centre.
Veron, J. E. N., M. G. Stafford-Smith, E. Turak and L. DeVantier (2016). "Corals of the World." from http://coralsoftheworld.org.
Field data are required for training and validation of the mapping approach and map products such as water depth, geomorphic zones, and benthic community composition. The field data are used, alongside expert image interpretation, to create geomorphic and benthic reference samples, which are then used for training the map classifier and assessing the accuracy of the resulting maps. The habitat mapping team at the University of Queensland (UQ) is responsible for the collation of this verification data, in collaboration with the National Geographic team. However, for a project at a global scale, the collection of new and existing field data is a global collaboration, and reef experts and institutions around the world are contributing existing and new data to the global mapping effort. The verification data come in two major formats: 1. Newly collected benthic field data, gathered using the UQ-developed georeferenced photoquadrat method. 2. Existing field data and maps. More detail on both is given below.
1: Newly collected field data
New field data are collected at various reef regions around the globe by both the UQ habitat mapping team and various interested and collaborating institutions. The new field data are extremely valuable to the project, and include georeferenced benthic photoquadrats taken across the geomorphic zones of the reefs. Sites for new field data collection are selected to capture the greatest range of reef environments, benthic habitats and geomorphic zones across the mapping regions.
Our general georeferenced photoquadrat protocol (Figure 1) involves taking thousands of benthic photos, each representing a 1 m × 1 m photoquadrat, along transects that capture the various zones of the reef, such as the exposed and sheltered reef slope, reef flat and lagoon. Georeferenced photoquadrats can be collected from a boat, on snorkel or on scuba (Roelfsema and Phinn 2010). During the transect, the photographer tows a surface buoy with a GPS recording a position every 2 seconds.
At the end of the day, each of the photos can be directly linked to a GPS location using the time differential between the GPS and the camera, with software developed specifically for our purpose (https://github.com/joshpassenger/gpsphoto). This software also produces geo-referenced thumbnails of each photo, which can be overlayed directly on the satellite image, and reviewed to inform the reference sample creation. The benthic photoquadrats collected are then annotated to the benthic categories required for the map using automated machine annotation, which is expertly trained for each region to provide a consistent output for the mapping categories.
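The time-matching idea can be sketched as follows; this is not the gpsphoto tool itself, and the fix interval, clock-offset handling and coordinates are hypothetical:

```python
# Sketch of linking a photo to a GPS position by timestamp. Not the
# gpsphoto tool itself; values and the clock-offset handling are hypothetical.
from bisect import bisect_left

def photo_position(track, photo_time, clock_offset=0.0):
    """track: time-sorted (unix_time, lat, lon) fixes; linear interpolation."""
    t = photo_time + clock_offset  # align camera clock with GPS clock
    times = [p[0] for p in track]
    i = bisect_left(times, t)
    if i == 0:
        return track[0][1], track[0][2]
    if i == len(track):
        return track[-1][1], track[-1][2]
    (t0, lat0, lon0), (t1, lat1, lon1) = track[i - 1], track[i]
    f = (t - t0) / (t1 - t0)
    return lat0 + f * (lat1 - lat0), lon0 + f * (lon1 - lon0)

track = [(0, -23.4400, 151.9100), (2, -23.4402, 151.9104)]  # 2 s fix interval
lat, lon = photo_position(track, photo_time=1.0)
```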
Figure 1: Georeferenced photoquadrat method principle. (A) Snorkeler, Diver or UW drone capture photoquadrats of the seabed. (B) Photoquadrats are analysed automatically in CoralNet for benthic composition, and (C) The analysed benthic composition for each photo is linked to its relevant GPS position and can be viewed as a pie graph overlayed on the satellite image.
Other methods for collecting new field data can include spot surveys, Remote Operated Vehicle (ROV) or Autonomous Underwater Vehicle (AUV) surveys (Roelfsema, Lyons et al. 2015), and the utilization of expert local knowledge (Figure 2).
Figure 2: Variation of newly collected field data for the Allen Coral Atlas: (top left) detailed surveys using our georeferenced benthic photoquadrat protocol; (top right) basic surveys, characterizing the benthos at one position; (lower left) expert local knowledge; (lower right) remote surveys, where technology is deployed to gather autonomous information about the seafloor.
The georeferenced photoquadrat protocols are available online. For a short overview of the photoquadrat method please see the protocol overview. The full protocol, required software and assistance with training and site selection can also be provided if you are interested in collecting new data for the Atlas.
Do you want to collect new data for the Atlas?
Here is our photoquadrat protocol overview.
2: Existing benthic and geomorphic data and maps
Existing benthic and geomorphic data are also valuable to the verification process for both training and validation of the maps. The UQ habitat mapping team, in collaboration with National Geographic, is conducting extensive data searches for each of the mapping regions. Existing data are sourced using connections with local and international organisations and researchers active in each region, as well as internet searches for scientific and grey literature and available data and maps. The benthic data collected are in various formats, including existing maps, benthic transect data and photoquadrats. The collected data are cleaned, attributed to the required benthic classes, and summarized to the nearest georeferenced point. The data collection method and georeferencing accuracy are also recorded.
Existing Field data. Benthic data useful for the maps should be georeferenced, with a Latitude and Longitude provided for each sample point. The data should describe benthic composition that could be collapsed to our general benthic categories (hard coral, soft coral, macroalgae, seagrass, sand, rubble, rock), with a documented survey method and date collected.
Existing maps. Existing maps should be georeferenced, or able to be georeferenced, and describe the geomorphic zonation and/or benthic composition.
Contributed data will not be published or used for any purpose other than map training and validation, and contributors will be acknowledged for the specific mapping region.
If you have benthic data for the coral reefs or seagrass habitats of your region, please consider contributing to the map training or validation process. The more data we can gather for each region, the better the training and validation of those maps will be, resulting in more useful maps of the habitats in your region.
Do you have existing data to contribute?
Get in touch at firstname.lastname@example.org.
For more information on mapping and monitoring through remote sensing from the University of Queensland, check out this Remote Sensing Toolkit
Roelfsema, C., M. Lyons, M. Dunbabin, E. M. Kovacs and S. Phinn (2015). "Integrating field survey data with satellite image data to improve shallow water seagrass maps: the role of AUV and snorkeller surveys?" Remote Sensing Letters 6(2): 135-144.
Roelfsema, C. and S. Phinn (2010). "Integrating field data with high spatial resolution multi spectral satellite imagery for calibration and validation of coral reef benthic community maps." Journal of Applied Remote Sensing 4(1): 043527-043527-043528.
This coral reef monitoring product was developed by the ASU team of the Allen Coral Atlas. When an area shows potential bleaching from the National Oceanic and Atmospheric Administration Coral Reef Watch (NOAA-CRW), the weekly Planet data are processed to identify locations of possible brightening of the coral. There are several steps in the data processing that are outlined below.
The weekly basemaps from Planet are created at a resolution of 4.77m using the available 4-band least-cloudy images. The images are masked with an automated cloud masking algorithm that is applied to the surface reflectance imagery and normalized against MODIS composited data to reduce image discontinuities in the resulting mosaic. The resulting images are provided as approximately 20x20km quads in units of surface reflectance.
The surface reflectance mosaics go through an additional cloud masking to remove any remaining edge of clouds and haze. These images are then processed from surface reflectance at the air-water interface to bottom reflectance using the algorithm described in (Li et al., 2020). The bottom reflectance forms the basis for detecting change from the time series. In order to represent the state of the coral in a stress-free environment, the bottom reflectance data are processed from a time period in which the coral is under cool water conditions as indicated by the NOAA-CRW sea surface temperature and bleaching alert data. For the images within this “baseline period”, we find the maximum bottom reflectance in that timeseries of images and record it in our baseline image on a pixel-by-pixel basis. Thus, the baseline images represent the highest brightness of the coral during the lowest heat stress.
Figure 1. NOAA-CRW of Main Hawaiian Islands showing bleaching alert and sea surface temperature (SST). The baseline period of low heat stress is identified.
In order to detect locations of possible coral brightening, the timeseries of weekly mosaics are compared to the baseline images. During the weeks when we are monitoring for bleaching, each pixel is compared to the baseline image and tested to see if it is above the baseline reflectance maximum. The number of weeks above the baseline is recorded in image form.
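The baseline construction and weekly comparison can be sketched with NumPy; the array sizes and reflectance values here are hypothetical:

```python
# Sketch of the per-pixel baseline comparison; array sizes and reflectance
# values are hypothetical.
import numpy as np

def weeks_above_baseline(baseline_weeks, monitoring_weeks):
    """baseline_weeks, monitoring_weeks: arrays of shape (weeks, rows, cols)."""
    baseline = baseline_weeks.max(axis=0)  # per-pixel maximum over low-stress period
    above = monitoring_weeks > baseline    # weekly exceedance test
    return above.sum(axis=0)               # weeks above baseline, per pixel

# Hypothetical 2x2-pixel example: 2 baseline weeks, 3 monitoring weeks
baseline_weeks = np.array([[[0.10, 0.20], [0.30, 0.40]],
                           [[0.12, 0.18], [0.25, 0.45]]])
monitoring = np.array([[[0.15, 0.10], [0.10, 0.50]],
                       [[0.20, 0.25], [0.10, 0.50]],
                       [[0.05, 0.10], [0.10, 0.10]]])
week_count = weeks_above_baseline(baseline_weeks, monitoring)
```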
Figure 2. Flowchart of processing and compilation of weekly data for identifying areas of possible brightening.
Because the number of pixels in the week-count images is larger than the number of points that the heatmap generation algorithm can accommodate, we randomly sample the pixels in the week-count images with values greater than or equal to 3. The pixels are also filtered to include only those that fall within the Coral/Algae or Rubble classes. The pixels within each 20x20 km quad are further filtered so that the density is no more than 30 points per hectare. This filtering reduces the overall number of points for heatmap generation.
The resulting filtered points are then combined for a given monitoring region and used to generate a dynamic heatmap of areas where coral brightening might be occurring. The areas in light blue are those with the lightest amount of coral brightening and those with the most brightening are in white.
Figure 3. Example of Heatmap generated from sampled points.
Although every effort has been made to mask out clouds, haze, and other contamination in the image data, there will be cases where they are included and may suggest an area is experiencing coral brightening when it is not.
The data for the brightening products are downloadable as the points used to generate the heatmap. It is important to note that these points are not precise in their location; they represent places where coral brightening might be occurring. The density of the points is meant to indicate general areas where brightening may be occurring; any individual point does not indicate that brightening is necessarily occurring there.
This project was supported by Vulcan Inc. Supercomputing was supported by Arizona State University’s Knowledge Enterprise Development program. Project partners providing financial, service and personnel support include: Arizona State University, the National Geographic Society, Planet Inc., and the University of Queensland. Significant support was also provided by Google Inc. and the Great Barrier Reef Foundation.
Jiwei Li, Nicholas S. Fabina, David E. Knapp, and Gregory P. Asner. The Sensitivity of Multi-spectral Satellite Sensors to Benthic Habitat Change. Remote Sensing 12. 2020.
How do I start using the Atlas in my coral reef work?
Would you like specific help with integrating the Atlas into your work?
Do you have existing data to contribute?
Do you want to collect new data for the Atlas?
Photo by Michael Markovina
The goal of the Allen Coral Atlas Field Engagement team, based at the National Geographic Society, is to help realize the vision of the late Dr. Ruth Gates of enhancing our collective ability to globally monitor, manage, and protect our unique and vibrant coral reef and coastal ecosystems. We strive to get the Atlas into the hands of coral reef conservation practitioners, coastal communities, and coral reef scientists worldwide.
The Field Engagement Team’s purpose is threefold:
To raise awareness about the Atlas maps and products, and to develop capacity within the conservation sector to utilize the tools and resources.
To assess the uptake and impact of the Atlas as a tool to improve conservation efforts and inform management strategies and policies regarding coral reefs.
To facilitate the collection of new and existing data through an interconnected network of managers, researchers, and other organizations.
Ultimately, the Atlas can help report on progress toward achieving international targets such as the Sustainable Development Goals and the Convention on Biological Diversity Aichi targets. We are working with existing efforts (e.g., the International Coral Reef Initiative, the Global Coral Reef Monitoring Network, and the Reef Resilience Network) to help planners, managers, and policymakers use the findings and data from the Allen Coral Atlas to achieve conservation impact.
How do I start using the Atlas in my coral reef work?
Whether you are a reef manager looking for habitat maps of your region to site areas for restoration, a researcher planning an expedition, a policymaker prioritizing coastal areas for protection, or you’re doing other work in need of spatial mapping of coral reefs, the Atlas can help. To get started, check out the Conservation Biology Institute’s webinar, The Allen Coral Atlas: A new map for coral conservation, for an introduction to the Atlas and the basic functionality of the site. You can also explore our YouTube channel. We are also developing online courses on using the Atlas in collaboration with the Reef Resilience Network (coming summer 2020).
Photo by Michael Markovina
Atlas Accelerator Program
To develop capacity to use the Atlas among the coral reef conservation community, the Field Engagement Team encourages users to apply to our Atlas Accelerator Program. Individuals and teams accepted into the program will receive one-on-one support from the Field Engagement Team and develop connections to the Atlas’s network of mapping scientists to help formulate the best way to apply spatial mapping to their conservation and management problems.
Help us improve the Atlas
The Allen Coral Atlas is made possible by a huge network of organizations, agencies, and individuals. We rely on valuable input from scientists and managers to make the Atlas the best tool possible. Because we are mapping globally, the mapping is automated to achieve scale; we believe this is where we can have the broadest impact, but it does mean that local errors are likely. That said, the mapping team will attempt to address recurring errors.
Photo by Team Lamu, CORDIO
Therefore, if you have identified a specific error on the Atlas, please write us at email@example.com and include:
the reef name(s) or location, given as either:
coordinates and/or a georeferenced polygon (in KML, shapefile, or JSON format), or
screenshots from the Atlas of the relevant area (annotated with details), along with the copied URL of the zoomed-in location
an explanation of the error and your suggestion for correcting it
any relevant field data
your contact details.
Share your data with the Atlas
If you or your team have existing maps of an area or georeferenced datasets of benthic habitat you are able to share with the mapping team, please reach out to us at firstname.lastname@example.org.
Or if you would like to be involved in collecting new data for the Atlas, here is our global habitat mapping overview.
We are planning training workshops in conjunction with symposia and conferences, although these have been postponed in light of COVID-19. We will be offering online options for summer 2020.
Online Course - Reef Resilience Network
Coming soon, Summer 2020
Manado, Indonesia - postponed
The Allen Coral Atlas is hosting a workshop training participants in the use of the Atlas, including a field excursion.
Noumea, New Caledonia - postponed
The Allen Coral Atlas Team is facilitating a conservation hackathon for coral reef practitioners to apply the Atlas to a wide variety of scientific and management questions.
Rescheduled for July 2021
Photo by Michael Markovina
User Guide Part 1: Learn about the advantages and limitations
User Guide Part 2: Benthic and geomorphic class descriptions
The Coral Reef Watch near real-time 5km global products on the Allen Coral Atlas site are the most recent day's published sea surface temperature (SST), SST Anomaly, Coral Bleaching HotSpot, Degree Heating Week (DHW), 7-day maximum Bleaching Alert Area, and 7-day SST Trend data from NOAA's Coral Reef Watch program. Please see their website for more information about the program. For more technical details about the 5-km products, see Liu et al. 2017 and 2014, and Heron et al. 2016 and 2015. If these products are used in any way, please follow the citation guidance.
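To make the relationship between these products concrete, the sketch below implements the published definition of the Degree Heating Week metric: DHW accumulates daily Bleaching HotSpot values (SST minus the maximum monthly mean climatology, MMM) that reach at least 1°C over a rolling 84-day window, divided by 7 to express the result in °C-weeks. This is a simplified illustration of NOAA's definition, and the SST values used here are invented.

```python
def degree_heating_weeks(daily_sst, mmm):
    """Compute DHW (degC-weeks) from a series of daily SST values.

    daily_sst -- daily sea surface temperatures (degC); the last 84 days are used
    mmm       -- maximum monthly mean climatology for the site (degC)
    """
    window = daily_sst[-84:]
    hotspots = [sst - mmm for sst in window]
    # Only HotSpots of at least 1 degC contribute to thermal stress accumulation
    return sum(h for h in hotspots if h >= 1.0) / 7.0

# 84 days at 1.5 degC above the climatological maximum -> 18 DHW,
# well past the ~8 DHW level associated with severe bleaching risk
sst_series = [29.5] * 84
print(degree_heating_weeks(sst_series, mmm=28.0))  # 18.0
```

The 1°C threshold means mild warm anomalies never accumulate: 84 days at 0.5°C above the MMM still yields a DHW of zero.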
This coral reef extent map for the Global Coral Reef Monitoring Network (GCRMN) was produced by the Center for Global Discovery and Conservation Science (GDCS) at Arizona State University.
The data were processed to rasters with a pixel size of 9.55m. The data are divided into quads of approximately 20x20 km in the Web Mercator projection (EPSG code 3857). Each quad is stored as a GeoTIFF file. Each 8-bit pixel has a value of 0 (non-reef) or 1 (reef) to show the extent of the reef area. An index KML is also included with the data to show the location of each quad.
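Given that layout, the reef area within a single quad can be estimated by counting reef-valued pixels and multiplying by the pixel area. The sketch below assumes a quad already loaded as rows of 0/1 values (reading the GeoTIFF itself would typically use a library such as rasterio, not shown here), and it treats the 9.55m pixel size as uniform, ignoring the latitude-dependent distortion of the Web Mercator projection.

```python
# Estimate reef area from one quad, assuming the quad has already been read
# into a list of rows of 0/1 pixel values (0 = non-reef, 1 = reef).
PIXEL_SIZE_M = 9.55  # nominal pixel edge length in the Web Mercator grid (EPSG:3857)

def reef_area_km2(quad_pixels):
    """Sum reef pixels and convert to square kilometres."""
    reef_pixel_count = sum(sum(row) for row in quad_pixels)
    return reef_pixel_count * PIXEL_SIZE_M ** 2 / 1e6

# A toy 3x3 "quad" with four reef pixels
quad = [[0, 1, 0],
        [1, 1, 0],
        [0, 0, 1]]
print(round(reef_area_km2(quad), 6))  # 4 pixels * 9.55^2 m^2 = 0.000365 km^2
```

For area statistics over real quads, the same pixel count would be taken from the GeoTIFF band, and a latitude-dependent correction would be needed for accurate areas away from the equator.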
Coral reef research and management efforts can be improved when supported by reef maps providing local-scale details across global extents. Reef extent maps are among the most commonly used and most valuable data products from the perspective of reef scientists and managers. We used convolutional neural networks and a global mosaic of high-spatial-resolution Planet Dove satellite imagery to generate a globally consistent coral reef extent map to facilitate scientific, conservation, and management efforts. Our methodological advantages include high-resolution satellite imagery, modern deep learning approaches, and consistent global methods. The resulting coral reef extent map provides new high-resolution spatial information on coral reefs at a global scale.
The methods used to produce this data set are described in detail in the forthcoming publication:
Jiwei Li, David E. Knapp, Nicholas S. Fabina, Emma V. Kennedy, Kirk Larsen, Mitchell B. Lyons, Nicholas J. Murray, Stuart R. Phinn, Chris M. Roelfsema, and Gregory P. Asner. A global coral reef probability map generated using convolutional neural networks. Coral Reefs. 2020. Advance online publication. https://doi.org/10.1007/s00338-020-02005-6
This project was supported by Vulcan Inc. Supercomputing was supported by Arizona State University’s Knowledge Enterprise Development program. Project partners providing financial, service, and personnel support include Arizona State University, the National Geographic Society, Planet Inc., and the University of Queensland. Significant support was also provided by Google Inc. and the Great Barrier Reef Foundation.
Quad Index (KML)
Andaman Sea (5 MB)
Central Indian Ocean (9.2 MB)
Central South Pacific (10.5 MB)
Coral Sea (2.9 MB)
East Africa (9.4 MB)
East Micronesia (7.3 MB)
Gulf of Aden (1.3 MB)
Hawaii (2.1 MB)
Indonesia (East) (28.4 MB)
Indonesia (West) (10.9 MB)
Mesoamerica (7.7 MB)
Northeast Asia (4.2 MB)
Northern Caribbean (16.5 MB)
Northwest Arabian Sea (5.2 MB)
Northwest China Sea (8.8 MB)
Philippines (23.3 MB)
Red Sea (16.3 MB)
South Asia (4.1 MB)
South China Sea (3.5 MB)
Southeast Caribbean (9.4 MB)
Southwest Pacific (East) (5.5 MB)
Southwest Pacific (West) (16.2 MB)
Subtropical East Australia (1 MB)
Timor Sea (9.6 MB)
Tropical East Pacific (4 MB)
West Indian Ocean (3.7 MB)
West Micronesia (5.6 MB)
Western Australia (3.6 MB)