Learning to use Google Earth Engine

I’m poking my way around Google Earth Engine and seeing what’s here, as well as learning some basics of the Python API. They have image and feature collections, accessible through API functions…

  • ImageCollection()
  • FeatureCollection()

You can pass the name of a collection to FeatureCollection() to query it. Sometimes these are actually Fusion Tables. Getting to the Fusion Tables themselves is kind of awkward, but if you have a script or something else with a reference to a table’s ID, you can open it at…

https://fusiontables.google.com/data?docid=<whatever that long ID is>

Variables are modified through .function() calls, as is often the convention in Python, and lines end with semicolons.

GIS for Fun: My Digitized BBS Route

The North American Breeding Bird Survey (BBS) was meant to be a stratified random sample of the continent’s terrestrial bird populations, so we could keep a finger on the pulse of populations and their trends. When I went to the joint American Ornithologists’ Union / Cooper Ornithological Society conference (back when it was still called that) in 2015, I presented in a symposium dedicated to the BBS. The opening talk was about the history of the BBS, which gave me some historical context for the data set and added a dimension to my understanding of how it fits into different eras of ecological research. The data set was developed before landscape ecology, and with it an explicit awareness of spatial analysis, really took hold in the U.S.

The spatial data we have for the BBS is what I believe to be route paths digitized from old highlighted paper route maps. As such, it’s rife with errors: routes in the layer vary widely in length, and some are egregiously too long. Because it’s a volunteer effort, it’s at times hard to get clarification on what’s going on here. I’ve done, and continue to do, analyses based on these route paths, so their accuracy matters to me. Side note: if the United States Geological Survey – Patuxent would have me, I’d be happy to travel the country in something like the “Google Street View car” and help digitize these routes during BBS season! Just putting that out there.

I decided to pull up my route in GIS to see what calculated stops would look like, first by placing stops every 0.5 mi. (probably the more correct method) and then by placing 50 equally spaced stops along the route. Since the route is a little long, the 50 stops probably end up more spread out than they should be. Also, as route runners know, it’s not an exact science: you might need to move a stop up to 0.1 mi. for safety reasons.
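
The second approach is easy to sketch in R with sp’s spsample(), which can drop regularly spaced points along a line. This is just a sketch with placeholder file/layer names, and the RTENAME attribute is hypothetical:

# hedged sketch: 50 regularly spaced stops along a digitized route line
library(rgdal)
library(sp)
routes <- readOGR("path to the BBS routes layer", "the routes layer name")  # placeholder names
my_route <- routes[routes$RTENAME == "my route", ]  # hypothetical attribute name
stops <- spsample(my_route, n = 50, type = "regular")  # 50 evenly spaced points along the line
plot(my_route); points(stops, pch = 16, cex = 0.6)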

Anyway, when I synced this up to road data, I found that the routes were significantly off from the latest road path data, probably because the latest version is more than four years old now. The road data I’m using is quite resource-intensive, and I’m trying to figure out what to do next. How will I line up the paths, and will this little experiment even matter in the long run?

Misadventures in Open Source GIS

When most of us ecologists conceptualize “GIS,” we think of a desktop GIS program, most likely ArcGIS. When we want to avoid ArcGIS (or can’t use it), we most likely ask, “is there an R package for that?” In many cases there is, and the repository keeps growing! The ever-growing number of R spatial packages reflects the general trend in GIS right now: it’s booming in lateral growth. Everywhere you turn, there’s something new, and actually my motivation for writing this post is just to try to keep up.

Going back “old school,” perhaps the most famous and longest-running (i.e., older than I am) open source program is GRASS GIS. Here’s where the open source GIS movement fails, though: the point-and-click friendliness of getting into the program stops before you even reach the gate. The Ubuntu software center points me to an outdated version of the program that won’t even launch, so I had to remove it before installing the newer version (where the version number is in the command name). Luckily, though, it also points me to a broader repository through which I can access the “Ubuntu GIS” suite.

# Add Ubuntu Unstable PPA when running LTS Ubuntu release
sudo add-apt-repository ppa:ubuntugis/ubuntugis-unstable

The real problem I’ve had with GRASS from the beginning, though, is the beginning: you can’t just open the program and load some spatial data like you can in ArcGIS. You have to start by selecting a home folder, defining a “location” (which must have a uniform spatial reference), and then creating a mapset with spatial reference info. That’s where I admittedly gave up on GRASS long ago, in favor of the ease of just launching ArcGIS and adding a shape file.

QGIS solves more of this problem, but there are again some hangups with the install: it’s not in the Ubuntu software center, and installing it “the old-fashioned way” seems to get you an older version of the program (the warnings will tell you so if you launch it from the terminal).

I’m thinking about finally trying to learn PostgreSQL in conjunction with PostGIS. Retracing my thoughts on that back to last year, I think it was because I found out that SQL was a solid way to build my own GIS tools. I remember settling on PostGIS as probably my best bet for learning open source GIS, and ironically I just saw a post-doc advertisement that specifically requested PostGIS knowledge.

Want to keep up? I compiled a certainly-not-exhaustive-but-pretty-comprehensive list of people I could find tweeting about open source GIS.

Let me know if you should be on that list!

Extracting NetCDF Values to a Shape File

Here’s a script that loops over the climate NetCDF bricks in a folder and extracts the values for each layer of each file’s brick, in this case averaged over the polygons in a shape file.

rm(list = ls())
library(raster)
library(rgdal)
library(ncdf4)
library(reshape2)
library(stringr)
library(data.table)

setwd("where your climate files are")
# the polygon layer to average over
blocks <- readOGR("path to your shapefile", "the layer name for your shapefile")

climate <- rbindlist(lapply(list.files(pattern = "\\.nc$"), function(climate_file)
{
  # in my case, the weather variable is in the filename
  climate_var <- ifelse(grepl("pr", climate_file), "pr",
                        ifelse(grepl("tmax", climate_file), "tmax", "tmin"))
  print(climate_var)
  climate_variable <- brick(climate_file, varname = climate_var, stopIfNotEqualSpaced = FALSE)
  # I needed to shift the x-axis to be on a -180 to 180 scale
  extent(climate_variable) <- c(xmin(climate_variable) - 360, xmax(climate_variable) - 360,
                                ymin(climate_variable), ymax(climate_variable))
  # project the polygons to match the raster
  blocks <- spTransform(blocks, CRS(projection(climate_variable)))
  # mean of the raster cells within each polygon, for every layer in the brick
  r.vals <- extract(climate_variable, blocks, fun = mean, na.rm = TRUE, df = TRUE,
                    layer = 1, nl = 1680)
  r.vals <- melt(r.vals, id.vars = "ID", variable.name = "date")
  r.vals$climate <- climate_var
  # depending on the filename convention of your files, get a variable for your
  # climate model and name it "modelname"
  r.vals$model <- modelname
  r.vals  # return the long-form table for this file; rbindlist() stacks them all
}))

At this point, you have a data frame called “climate” that has all your data long-form. You can cast this as you see appropriate depending on your needs!
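
For example, here’s a minimal sketch of one possible cast, assuming you want one row per polygon ID, model, and date, with a column for each climate variable:

# hedged sketch: spread the climate variables (pr, tmax, tmin) into columns
climate_wide <- dcast(climate, ID + model + date ~ climate, value.var = "value")
head(climate_wide)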

How This Problematic NetCDF File Was Fixed

I can’t be thankful enough for my godsend Twitter friend/NetCDF guru Michael Sumner for coming in clutch to rectify this problematic file! As I take my baby steps in learning how to deal with NetCDF files (~3 weeks after being thrown into the “deep end” and trying to tread water), I realized that one of our files had some sort of problem. Relevant to this issue: NetCDF files are often (not always) multidimensional arrays, where two of the dimensions are coordinates, and the variables can be defined in relation to the dimensions of the file. In my case, temperature is described over space and time (a three-dimensional array: longitude, latitude, time), so each temperature value “goes along with” a set of coordinates and a time.

When I imported the NetCDF file as a raster brick (meaning I put the array data into a stack of grids), the metadata seemed to show a larger longitudinal “step” than latitudinal, the latter of which was at the correct resolution. Note that this means, as is common, I took what is really a point (a temperature value corresponding to a latitude, longitude, and time) and turned it into a grid cell, with the original point at the center of the cell that gets the value. I tried plotting it to see what it looked like.

This first plot hints at the real problem: it appears the image has been grabbed at either end and stretched too wide.

I tried to crop the image to only the extent I needed, but it was still messed up.

I think those are the Great Lakes erroneously stretched into the study area

I took this at face value, that perhaps the grid was somehow “wrong,” but my new friend showed me how to look more deeply into the problem! He opened the file in R and extracted the coordinate values, plotting for a visual diagnosis of the problem.

library(ncdf4)
con <- ncdf4::nc_open("file.nc")
lon <- ncdf4::ncvar_get(con, "lon")
lat <- ncdf4::ncvar_get(con, "lat")
ncdf4::nc_close(con)

So, my friend suggested plotting the longitude values stored in this dimension of the array.

Herein we start to see a problem! Something is off about the longitudes stored in the array.

head(lon)
# [1] 244.0625 244.1875 244.3125 244.4375 244.5625 244.6875
tail(lon)
# [1] 281.8125 281.9375 282.0625 282.1875 3.5200 3.5100
r <- raster::brick("file.nc", stopIfNotEqualSpaced = FALSE)
extent(r)
# class : Extent
# xmin : 3.056128
# xmax : 282.6414
# ymin : 40
# ymax : 52.875

The first longitude values are the correct lowest values for the raster. For whatever reason, the last two longitude values got reassigned to the (very wrong) smallest values in the grid, when they should be the largest. The extent describes the grid that I assigned the points to when I imported the data into R; for comparison and clarification, I also posted the result of looking at the extent of the raster I created, to show that they’re “two different things.” Assuming that the NetCDF stores the center point of each cell, the latitude extents (which are correct) provide a half-cell-width buffer around the actual data points, correctly placing them at the centers of their cells.

So to correct it, he reset the extent of the raster, with the following note…

## let's treat it as if the start is correct,
## the final two columns are incorrect
## we hard-code the half-cell left and right cell edge
## (but note the "centre" might not
## be the right interpretation from this model, if we're being exacting)
valid_lon2 <- c(lon[1] - 0.125/2) + c(0, length(lon)) * 0.125
valid_lat <- range(lat) + c(-1, 1) * 0.125/2
r2 <- setExtent(r, extent(valid_lon2, valid_lat))
r_final <- raster::shift(r2, x = -360)

Now that the extent is rectified, he shifted it to our desired reference, namely from 0–360 longitude to -180/180.

Now the Great Lakes are…great!

Voila! Thank you very much, Michael!

Next NetCDF Issue I Discovered Today

Another problem I ran into: apparently the Nov.–Dec. 1999 monthly averages for HadCM3 B1 precipitation, in the file where I flipped the axes in NCO, are blank.

problem <- brick("hadcm3.b1.pr.NAm.grid_monthly.nc_out.nc")
plot(problem$X1999.11.15)
plot(problem$X1999.12.16)

The above returns the correct axes, but blank plots. The problem is present even before I flipped the axes, so I have to go back to the original files, before I did the averaging, to find out what’s wrong. I wonder if there were blank layers anywhere in the Nov.–Dec. daily data?
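
A quick way to check would be to count the non-NA cells in each layer of the original daily brick and flag any that are completely empty. This is just a sketch, and the daily filename here is a placeholder:

# hedged sketch: flag any all-NA layers in the original daily file
daily <- brick("hadcm3.b1.pr.NAm.grid_daily.nc", varname = "pr")  # placeholder filename
n_valid <- cellStats(!is.na(daily), sum)  # non-NA cells per layer
names(daily)[n_valid == 0]  # names of any completely blank layers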

Working with Irregularly Spaced NetCDF

I’m in the process of trying to average raster cells within a shape file of polygons in R. Today I got this error message while trying to import a NetCDF as a raster brick…

“cells are not equally spaced; you should extract values as points.” – brick() function warning message from raster package

When I suppressed this warning with the stopIfNotEqualSpaced=FALSE option, I was able to look at the metadata and see that the grid is indeed irregular. The lat/long resolutions differ, which perhaps makes rectangular cells instead of square ones. Predictably, the number of columns, and thus the number of cells, differs from the other files. Interestingly, the extent is way different too. So I plotted it to take a look.

My NetCDF file, where the longitudinal resolution is coarser than the latitudinal.

I first followed the suggestion of my colleague Andy Allstadt to project my shape file to the raster’s projection, instead of projecting them both. Using the rasterToPoints() function to convert the raster to a point layer gives me a wide-format attribute table, in which the first columns are coordinates and the remaining columns are the layers in the brick (in my case, each of the monthly averages for all the years in my raster stack). The problem is, the grid cells in this irregular file are much larger than they should be longitudinally, so my blocks do not necessarily overlap the points at the center of each cell, which is how I think the raster was converted to points. So, now what? It looks like maybe I’ll have to re-grid the raster layer?
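
For reference, here’s roughly what that looks like, as a sketch with placeholder file and layer names (not my exact script):

# hedged sketch: inspect the irregular grid and convert it to points
library(raster)
library(rgdal)
irregular <- brick("irregular_file.nc", varname = "tmax", stopIfNotEqualSpaced = FALSE)  # placeholders
res(irregular)  # the x and y resolutions differ
blocks <- readOGR("path to your shapefile", "the layer name")  # placeholders
blocks <- spTransform(blocks, CRS(projection(irregular)))  # project the polygons to the raster
pts <- rasterToPoints(irregular)  # matrix: x, y, then one column per layer
head(pts[, 1:5])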

How I’ve Been Processing Climate Data

From a THREDDS server, I’m using the NetCDF Subset Service portal to obtain a spatial subset of my climate data set of interest. Since the files I need are too big to download over their HTTP server, the portal instead gives me an OPeNDAP URI with the spatial subset info, which I then pass to nccopy from the NetCDF library.

Then, I installed CDO and wrote a script to do the monthly means for all the files: it averages a daily time step file into a monthly averaged (or whatever metric you choose) file.

cdo monmean foo_hourly_or_daily.nc  foo_monthly_mean.nc

Then, I installed NCO to flip the axes.

ncpdq -a lat,lon in.nc out.nc

Then, I imported the correctly-oriented files into R.

rm(list = ls())
library(raster)
library(rgdal)
library(ncdf4)
setwd("wherever your files are")
climate <- brick("filename.nc",varname="whatever your climate variable is")

In my case, as I think is common for climate files, longitude was on a 0-360 scale, instead of -180/180.
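
To get onto the -180/180 convention, raster::rotate() handles the conversion for global grids; for a regional grid that sits entirely above 180° longitude (as mine does), simply shifting the x extent by -360 works. A sketch, using the climate brick loaded above:

# hedged sketch: move a regional 0-360 grid onto the -180/180 convention
climate_shifted <- shift(climate, x = -360)
extent(climate_shifted)  # x values should now be negative (western hemisphere)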

And Now for a Windows Hack in a Pinch…

For reasons I won’t describe here, I had to quickly install NCO on Windows and figure out how to loop over the files in a given directory from the command line. So, I navigated on the command line to the folder where I keep NCO, and ran this to loop over the files in my climate folder.

for /r "C:\Users\me\myfolder" %i in (*.nc) do ncpdq -a lat,lon "%i" "%i_out.nc"