gdalcubes_set_gdal_config("AWS_ACCESS_KEY_ID", "xxxxxxxxx")
gdalcubes_set_gdal_config("AWS_SECRET_ACCESS_KEY", "xxxxxxxxxx")
gdalcubes_set_gdal_config("AWS_REQUEST_PAYER", "requester")
Example 2: Time series analysis
Please notice that code chunks in this document are meant to be executed on an Amazon Web Services (AWS) machine in region us-west-2
(Oregon). Examples have been selected to yield computation times acceptable for a live demonstration. Please feel free to apply on larger areas and/or using a higher spatial resolution.
Introduction
In this example, we use MODIS vegetation index data (MOD13A1) on AWS to apply more complex time series analysis (trend estimation and change detection). The steps performed are very similar to the first example. However, the data is provided in a requester pays bucket, i.e., we need to pay for data requests and transfer. To do so, we need to set up an AWS IAM-user with S3reader rights and some configuration options before we can read images:
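Depending on the machine, it can also help to let gdalcubes process data cube chunks in parallel. The line below is only a sketch; the exact option depends on the installed gdalcubes version (older releases use gdalcubes_set_threads() instead).

# Use several worker processes for data cube operations (adjust to the machine)
gdalcubes_options(parallel = 8)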
1. Define area of interest
Our study area is the whole country of Germany, provided as a GeoPackage file (data/de.gpkg), which we can read using the sf package (Pebesma 2018).
library(sf)
de_shape = read_sf("data/de.gpkg")
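Because the data cube geometry defined later uses a projected coordinate reference system, a quick check of the layer's CRS can be useful; a minimal sketch (we assume de.gpkg stores the boundary in EPSG:25832, ETRS89 / UTM zone 32N, which the data cube extent below relies on):

# Inspect the CRS of the study area polygon
st_crs(de_shape)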
2. Query available images from STAC
We calculate the bounding box of the original as well as of the transformed polygon and request images from the corresponding STAC service, using the “mod13a1” collection.
bbox = st_bbox(de_shape)
de_shape |>
  st_transform("EPSG:4326") |>
  st_bbox() -> bbox_wgs84
bbox_wgs84

##      xmin      ymin      xmax      ymax
##  5.865999 47.270362 15.041531 55.057060

library(rstac)
s = stac("https://eod-catalog-svc-prod.astraea.earth")
items = s |>
  stac_search(collections = "mod13a1",
              bbox = c(bbox_wgs84["xmin"], bbox_wgs84["ymin"],
                       bbox_wgs84["xmax"], bbox_wgs84["ymax"]),
              datetime = "2010-01-01T00:00:00Z/2019-12-31T00:00:00Z") |>
  post_request() |> items_fetch(progress = FALSE)
items
## ###STACItemCollection
## - matched feature(s): 707
## - features (707 item(s) / 0 not fetched):
## - MOD13A1.A2015353.h18v04.006.2016007181814
## - MOD13A1.A2015353.h19v04.006.2016007181809
## - MOD13A1.A2018353.h19v04.006.2019009043915
## - MOD13A1.A2018353.h18v03.006.2019009043612
## - MOD13A1.A2010353.h18v03.006.2018226093446
## - MOD13A1.A2013353.h18v04.006.2018226105916
## - MOD13A1.A2014353.h18v03.006.2018226110056
## - MOD13A1.A2011353.h19v04.006.2015230023909
## - MOD13A1.A2011353.h18v04.006.2018226092723
## - MOD13A1.A2012353.h18v03.006.2018226105744
## - ... with 697 more feature(s).
## - assets:
## BR, CDOY, EVI, MIRR, NDVI, NIRR, PR, RAA, RR, SZA, VIQ, VZA, eod_thumbnail
## - other field(s):
## context, features, links, search:metadata, stac_extensions, stac_version, type
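Before building an image collection, it may help to double-check which assets the returned items contain; a small sketch (assuming the items_assets() helper of the installed rstac version is available):

# List the asset names available in the search result (NDVI, VIQ, ...)
items_assets(items)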
3. Create a gdalcubes image collection from STAC result
We simply convert the STAC response to an image collection using the stac_image_collection()
function.
library(gdalcubes)
col = stac_image_collection(items$features, asset_names = c("NDVI", "VIQ"))
col
## Image collection object, referencing 707 images with 2 bands
## Images:
## name left top bottom right
## 1 MOD13A1.A2015353.h18v04.006.2016007181814 0.00000 50 40.00444 15.55033
## 2 MOD13A1.A2015353.h19v04.006.2016007181809 13.05492 50 40.00444 31.10757
## 3 MOD13A1.A2018353.h19v04.006.2019009043915 13.05492 50 40.00444 31.10757
## 4 MOD13A1.A2018353.h18v03.006.2019009043612 0.00000 60 50.00444 19.99112
## 5 MOD13A1.A2010353.h18v03.006.2018226093446 0.00000 60 50.00444 19.99112
## 6 MOD13A1.A2013353.h18v04.006.2018226105916 0.00000 50 40.00444 15.55033
## datetime srs
## 1 2015-12-19T00:00:00
## 2 2015-12-19T00:00:00
## 3 2018-12-19T00:00:00
## 4 2018-12-19T00:00:00
## 5 2010-12-19T00:00:00
## 6 2013-12-19T00:00:00
## [ omitted 701 images ]
##
## Bands:
## name offset scale unit nodata image_count
## 1 NDVI 0 1 707
## 2 VIQ 0 1 707
4. Create a data cube and perform a simple NDVI trend estimation
Next, we define the data cube geometry (1 km spatial resolution), create the data cube, and apply a trend estimation (quantile regression from the quantreg package (Koenker 2022) as a user-defined reducer function) on yearly aggregated data. Notice that instead of plotting directly, we simply export the results as a netCDF file. This file can afterwards be used with most GDAL-based tools, including other R packages as well as QGIS. Alternatively, calling write_tif() would create GeoTIFF file(s).
v = cube_view(extent = list(left = bbox["xmin"], right = bbox["xmax"], bottom = bbox["ymin"],
                            top = bbox["ymax"], t0 = "2010-01-01", t1 = "2019-12-31"),
              dx = 1000, dy = 1000, dt = "P1Y", srs = "EPSG:25832", aggregation = "median")
v
## A data cube view object
##
## Dimensions:
## low high count pixel_size
## t 2010-01-01 2019-12-31 10 P1Y
## y 5235589.56627249 6101589.56627249 866 1000
## x 280307.319211578 921307.319211578 641 1000
##
## SRS: "EPSG:25832"
## Temporal aggregation method: "median"
## Spatial resampling method: "near"
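The VIQ band selected above contains the MODIS VI quality flags. We do not use it in the trend estimation below, but a pixel-level quality mask could be added to the pipeline with filter_pixel(). The following is only a sketch: it keeps pixels whose VIQ value is 0, ignoring that the flags are actually bit-encoded.

# Hypothetical quality filter: retain only pixels where VIQ < 1 (i.e., VIQ == 0);
# a proper mask would decode the individual quality bits first
raster_cube(col, v) |>
  filter_pixel("VIQ < 1") |>
  select_bands("NDVI")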
raster_cube(col, v) |>
  select_bands("NDVI") |>
  reduce_time(names = "slope", FUN = function(x) {
    ndvi = x["NDVI",] / 1000
    t = 1:length(ndvi)
    if (sum(!is.na(ndvi)) <= 2) {
      return(NA)
    }
    library(quantreg)
    trend = rq(ndvi ~ t)
    return(trend$coefficients["t"])
  }) |>
  filter_geom(de_shape$geom) |>
  write_ncdf("ndvi_trend.nc")
5. Visualize result in an interactive map
Since our result is a simple single-band, single-time netCDF file, we can load it with the terra
package (Hijmans 2020) and use tmap
(Tennekes 2018) to create a trend map.
library(tmap)
tmap_mode("view")
tm_shape(terra::rast("ndvi_trend.nc")) +
  tm_raster(palette = "BrBG", title = "NDVI Trend", style = "cont", midpoint = 0,
            breaks = seq(-0.5, 0.5, length.out = 5))
The most significant negative trends relate to surface mining activities (and deforestation).
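If no interactive map is needed, the result can also be plotted without tmap, e.g. directly from the netCDF file with gdalcubes; a minimal sketch with an arbitrarily chosen value range:

# Static plot of the trend layer (zlim chosen only for illustration)
ncdf_cube("ndvi_trend.nc") |>
  plot(zlim = c(-0.1, 0.1), key.pos = 1)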
6. Run change detection
Similarly, we can apply change detection methods on pixel time series. Below, we use the bfast
package (Verbesselt et al. 2010) and apply bfastmonitor()
to extract change dates and magnitudes in 2018.
v = cube_view(view = v, extent = list(t0 = "2017-01-01", t1 = "2018-12-31"), dt = "P1M")
v
## A data cube view object
##
## Dimensions:
## low high count pixel_size
## t 2017-01-01 2018-12-31 24 P1M
## y 5235589.56627249 6101589.56627249 866 1000
## x 280307.319211578 921307.319211578 641 1000
##
## SRS: "EPSG:25832"
## Temporal aggregation method: "median"
## Spatial resampling method: "near"
raster_cube(col, v) |>
  select_bands("NDVI") |>
  reduce_time(names = c("change_date", "change_magnitude"), FUN = function(x) {
    ndvi = x["NDVI",] / 1000
    if (all(is.na(ndvi))) {
      return(c(NA, NA))
    }
    ndvi_ts = ts(ndvi, start = c(2017, 1), frequency = 12)
    library(bfast)
    tryCatch({
      result = bfastmonitor(ndvi_ts, start = c(2018, 1), history = "all", level = 0.02)
      return(c(result$breakpoint, result$magnitude))
    }, error = function(x) {
      return(c(NA, NA))
    })
  }) |>
  filter_geom(de_shape$geom) |>
  write_ncdf("ndvi_changes.nc")
7. Convert to stars and create an interactive map
We can now convert the result to a stars object (Pebesma 2019) using st_as_stars()
and use the tmap
package for creating interactive maps.
library(stars)
ncdf_cube("ndvi_changes.nc") |>
  st_as_stars() -> changes_stars
chng_date = changes_stars[1,,,1]
chng_magn = changes_stars[2,,,1]

tm_shape(chng_date) +
  tm_raster(palette = "GnBu", title = "Change date", style = "cont")
tm_shape(chng_magn) +
  tm_raster(palette = "GnBu", title = "Change magnitude", style = "cont")