Spatial – https://archive.gaiaresources.com.au – Environmental Technology Consultants

Wildlife corridors: a spatial analysis approach to restore and protect habitats
https://archive.gaiaresources.com.au/wildlife-corridors-spatial-analysis-approach-restore-protect-habitats/ – Wed, 29 Jun 2022

The post Wildlife corridors: a spatial analysis approach to restore and protect habitats appeared first on Gaia Resources.

One of the side effects of urban and agricultural development is vegetation fragmentation, which affects biodiversity conservation and environmental quality. In order to restore and protect existing habitat, it is important to provide linkages between fragmented areas, which will encourage the movement of wildlife, preserving and improving biodiversity. According to the Department of Agriculture, Water and the Environment, wildlife corridors are ‘connections across the landscape that link up areas of habitat. They support natural processes that occur in a healthy environment, including the movement of species to find resources, such as food and water’.

Last year we were contacted by the Shire of Mundaring to design wildlife corridors in their local government area. The purpose was to maintain a healthy landscape, restoring connectivity and promoting local biodiversity through the expansion of available habitat. As a solution, we conducted a spatial analysis using QGIS to derive the optimum paths for establishing connectivity between vegetation patches, drawing on several datasets of sufficiently high resolution.

It is interesting to see how many toolboxes have been developed to design wildlife corridors using licensed GIS software, yet few are available as add-ons for open-source software. There are a few models that can be downloaded as standalone software, which aim to find optimum restoration paths or connect reserves. The problem with these is that they can be out of date (you can encounter many bugs while trying to get your results), or the parameters you can control are very limited, so there is little room for tailoring the corridors to your preference.

Open-source software tools and plugins are often developed for a specific purpose and then put out to the public domain ‘as-is’ for others to adapt. For these niche applications, that might be as far as they get, and when you want to use them you are either limited by those application constraints or faced with a development cost to adjust the tool to your needs. There seems to be an opportunity here, actually, to adapt open-source tools in QGIS and other software and offer a solution tailored to defining wildlife corridor options. So I am looking into that as part of my professional development at Gaia Resources, and should have some news around this in the not-too-distant future!

The Least-Cost Path is one of the many plugins that can be added to QGIS. This kind of multi-criteria analysis is very popular for designing corridors for a range of applications like wildlife conservation and infrastructure planning.

The concept behind a Least Cost Path analysis is the following: given an origin and a destination point, the algorithm searches a cost raster for the route with the minimum accumulated cost and creates a corridor between the two points.

If you are not familiar with GIS jargon, you might be wondering what a cost raster is. Long story short, the cost raster represents the potential resistance faced while transiting that path, and it combines a number of parameters that we’ll look into in a bit more detail below. Basically, each cell in your cost raster is the aggregation of scores that you apply to the input parameters. 

Let’s think of all the parameters that need to be considered while designing corridors, with the idea that each of these can be sourced as data inputs into the cost raster. First and foremost: vegetation, an essential component that provides habitat and shelter for biodiversity. Next, we want to provide some source of water for the wildlife, so this element should also be included. 

Knowing the land use types is also important when planning this kind of development, since some areas will be better suited to restoring vegetation than others (I do not want to think of bandicoots establishing their home near a highway!). Speaking of roads, they are another key dataset, since it would be ideal to avoid them (although crossing them is also possible, as in this example). Other datasets can be considered in this kind of study as well; it all depends on the requirements you want your corridors to cover.

These parameters are given scores according to different criteria, considering how suitable they are for conservation purposes – and the scores are applied to the input datasets. For example, a criterion could be ‘main roads should be a severe impediment to wildlife corridor crossings, and wildlife corridors should not be present within 10m of a main road’. Once the criteria are defined, they are assigned different scores according to a ranking, e.g. a highway will have a higher cost than a minor road. Once the datasets are ready, they can be merged and their scores summed; the final merged layer is the cost raster that serves as input to the Least Cost Path plugin.
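As a rough sketch of that scoring-and-merging step (the layers, grid size and score values below are invented for illustration), the cost raster is simply the cell-wise sum of the scored input layers:

```python
# Hypothetical scored input layers, aligned to the same 4x4 grid.
# Lower scores mean lower resistance (e.g. remnant vegetation);
# higher scores mean greater resistance (e.g. near a main road).
vegetation = [[1, 1, 3, 5],
              [1, 2, 3, 5],
              [2, 2, 4, 5],
              [3, 4, 4, 5]]
roads = [[0, 0, 0, 9],
         [0, 0, 0, 9],
         [0, 0, 0, 9],
         [0, 0, 0, 9]]
land_use = [[2, 2, 2, 2],
            [2, 3, 3, 3],
            [2, 3, 5, 5],
            [2, 3, 5, 5]]

# Merging the layers: each cost raster cell is the sum of the scores
# that the criteria assigned to that cell in every input layer.
cost_raster = [
    [v + r + l for v, r, l in zip(vrow, rrow, lrow)]
    for vrow, rrow, lrow in zip(vegetation, roads, land_use)
]
for row in cost_raster:
    print(row)
```

In a real workflow each layer would be a georeferenced raster reclassified in QGIS, but the aggregation logic is exactly this cell-wise sum.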

The model will create a corridor connecting one source point to a destination, or many corridors from a source to various points. A distinct advantage of the Least Cost Path algorithm is that it is data-driven: it selects paths by analysing the inputs against defined rules, where a purely visual analysis of the area would be subjective and hard to reproduce. The idea is that the paths chosen by the tool will have a lower overall cost based on the data inputs.
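Under the hood this is a shortest-path search. Here is a minimal stdlib sketch of the idea – Dijkstra's algorithm over a toy cost grid, with made-up values and four-directional movement (the actual plugin offers more options, such as diagonal movement):

```python
import heapq

def least_cost_path(cost, start, end):
    """Dijkstra over a grid: find the route of minimum accumulated
    cost between two cells, moving in the four cardinal directions."""
    rows, cols = len(cost), len(cost[0])
    best = {start: cost[start[0]][start[1]]}
    queue = [(best[start], start, [start])]
    while queue:
        total, (r, c), path = heapq.heappop(queue)
        if (r, c) == end:
            return total, path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                t = total + cost[nr][nc]
                if t < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = t
                    heapq.heappush(queue, (t, (nr, nc), path + [(nr, nc)]))
    return None

# A toy cost raster: the middle column is expensive (a "road"),
# except for a cheap gap at the bottom (a potential crossing point).
grid = [[1, 9, 1],
        [1, 9, 1],
        [1, 1, 1]]
total, path = least_cost_path(grid, (0, 0), (0, 2))
print(total, path)
```

The path detours through the cheap gap rather than crossing the expensive column directly, which is exactly the behaviour that makes the corridors data-driven.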

Data-driven models will often lead to some unexpected results, and a review of the results can highlight additional perspectives, missed criteria, opportunities for improvement or data inputs that can be better handled. This makes wildlife corridor mapping an iterative process, but the model itself is robust and, once established, can be repeated with ease. For example, you can incorporate particular strategic planning areas not represented in the land use dataset you used in the last run of the model, or introduce firebreaks and ideal spots for wildlife cross-overs to overcome barriers on main roads.

The corridors obtained from the model can be given different priorities, as each has an accumulated cost resulting from the areas it crosses. Since the corridors have different lengths, a good practice is to compare them using the cost/length ratio. This helps organisations decide – alongside environmental and planning objectives – where the best value lies for their revegetation investment.
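That prioritisation step can be sketched in a few lines (the corridor names, costs and lengths are invented for illustration):

```python
# Hypothetical corridor outputs from the model: accumulated cost and length.
corridors = [
    {"name": "A", "cost": 1200, "length_km": 4.0},
    {"name": "B", "cost": 900, "length_km": 2.0},
    {"name": "C", "cost": 2000, "length_km": 10.0},
]

# Rank by cost per kilometre: lower ratios suggest better revegetation value.
for c in corridors:
    c["ratio"] = c["cost"] / c["length_km"]
ranked = sorted(corridors, key=lambda c: c["ratio"])
print([c["name"] for c in ranked])
```

Note how the longest corridor (C) can still rank first once cost is normalised by length.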

We can all help protect biodiversity by managing the environment in local areas. Organisations can use wildlife corridors to engage meaningfully with landowners and feed into the planning process – giving everyone a chance to contribute to the local biodiversity in their area. If this project sounds interesting to you and you would like to do something similar, reach out and start a conversation with us via email, or through our social media platforms – Twitter, LinkedIn or Facebook.

Rocio

Biodiversity spatial data challenges and solutions
https://archive.gaiaresources.com.au/biodiversity-spatial-data-challenges-solutions/ – Wed, 25 May 2022

The post Biodiversity spatial data challenges and solutions appeared first on Gaia Resources.

In this blog, we’ll explore some of the challenges and solutions around managing the spatial aspects of biodiversity data. 

Claire recently wrote about how she loves the way nature always has such elegant answers to complex problems (see her blog here). Researchers and environmental scientists often observe these elegant answers when they are out in the field collecting data – whether it is the way plant seeds have evolved to propagate through wind or fire, or a symbiotic relationship that benefits both plant and animal species.

Channelling the Samuel review of the EPBC Act: if we are going to get serious about arresting the decline of species and ecosystems in Australia, we need to do much more to connect the dots of biodiversity data. The review found that “in the main, decisions that determine environmental outcomes are made on a project-by-project basis, and only when impacts exceed a certain size. This means that cumulative impacts on the environment are not systematically considered, and the overall result is net environmental decline, rather than protection and conservation.” (source: Independent review of the EPBC Act – Interim Report Executive Summary)

Gaia Resources is currently developing two separate biodiversity data management projects in Australia that are helping State and Federal government agencies to streamline biodiversity data submission, increase accessibility to biodiversity data and hopefully, in turn, support decision making and improve environmental outcomes.  We are rapidly approaching the launch of both these projects – so stay tuned for more details to come!

We are helping to join the dots by enabling biodiversity data to be aggregated and made more readily available as a public resource. This type of data – including species occurrence data, systematic surveys and vegetation associations – comes in many forms and from multiple sources. Researchers and environmental scientists have employed different methodologies across our vast continent and territories to collect data for their particular project or area of study. Depending on the nature of the survey, field biodiversity data can be collected as point occurrences or observations, transect lines, plots, traps, habitat survey areas and quadrats (as shown below). 

A schematic representation of different types of biodiversity survey types including points, tracking data, transects, traps, plots and habitat surveys.

The observed absence of a species within a defined survey area/site, and time of the survey, are also important data elements for ecological research.  Adding to that data complexity is the fact that over the past few decades, technological advancements in GPS (Global Positioning Systems) and apps on handheld devices have changed the way we record things like coordinate locations and levels of accuracy. Technological advancement has also impacted the volume of information we can gather with the time and resources we have available.  

To have a chance of aggregating all this data from different sources in a meaningful way, there is a need to apply a consistent approach, or standard, to the biodiversity information.  Apart from the considerable challenges standardisation presents from a taxonomic perspective in classifying species, there are also several spatial data challenges, which I’ll focus on here – more on the use of standards and varying approaches to using them will be coming in a later blog. 

One key challenge is knowing and specifying the spatial coordinate system of incoming data, so that any repository can transform many project submissions into a spatially consistent system. Once you know the reference system, it is then possible to assess whether the data is positioned in a logical place – on the Australian continent or its Island Territories, for instance. 
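A rough sketch of that plausibility check, assuming longitude/latitude coordinates and an indicative – not authoritative – bounding box for Australia and its island territories:

```python
# Indicative geographic bounds (decimal degrees); the exact envelope
# here is an assumption for illustration, not an official extent.
LON_MIN, LON_MAX = 96.0, 168.0   # roughly Cocos (Keeling) Islands to Norfolk Island
LAT_MIN, LAT_MAX = -55.0, -9.0   # roughly Macquarie Island to the Torres Strait

def plausible_australian_location(lon, lat):
    """Cheap first-pass check, once the coordinate reference system is
    known and the record has been transformed to lon/lat: does the
    coordinate fall in a plausible envelope at all?"""
    return LON_MIN <= lon <= LON_MAX and LAT_MIN <= lat <= LAT_MAX

print(plausible_australian_location(115.86, -31.95))  # Perth
print(plausible_australian_location(-31.95, 115.86))  # lat/lon accidentally swapped
```

A swapped coordinate pair is one of the most common submission errors, and this kind of test catches it immediately.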

Another big challenge has been how to handle different geometries of data (e.g. point, line, polygon) describing the same type of thing in the field. Take the example of a 30-year-old report that lists a single point coordinate referencing a 50x50m plot area, but gives no other information, such as the orientation of that plot. Do we materially change a plot reference into a polygon shape, based on a snippet of information in the accompanying report? What happens when some of the information we need is missing, or the method described in the report is ambiguous? As system developers, we avoid anything that amounts to a material change to the source data; instead, systems should be designed to hand the basic data quality responsibility for solving these mysteries back to the authors of the data.

Finally, we have the issue of spatial topology in biodiversity data. Once you get into the realm of transects and areas, it becomes tricky to represent a spatial location using text-based standards. Technology provides an elegant – although arguably not that user-friendly – solution through something like a Well-known text (WKT) expression. This standard form can simplify a line or polygon into a series of coordinates that become one column in a dataset, like that shown below.

Points, lines and polygons can be represented by a text string where the discrete numbers are coordinate pairs (Source: Wikipedia)
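To show how simple the WKT form is to produce, here is a minimal sketch serialising a transect and a survey plot (the coordinates are made up):

```python
def to_wkt_linestring(coords):
    """Serialise a transect (sequence of (x, y) pairs) as WKT."""
    return "LINESTRING (" + ", ".join(f"{x} {y}" for x, y in coords) + ")"

def to_wkt_polygon(ring):
    """Serialise a plot boundary as WKT; a WKT ring must close on itself."""
    if ring[0] != ring[-1]:
        ring = ring + [ring[0]]
    return "POLYGON ((" + ", ".join(f"{x} {y}" for x, y in ring) + "))"

transect = to_wkt_linestring([(115.0, -32.0), (115.1, -32.0)])
plot = to_wkt_polygon([(115.0, -32.0), (115.0005, -32.0),
                       (115.0005, -32.0005), (115.0, -32.0005)])
print(transect)
print(plot)
```

Either string can then sit in a single text column of a dataset, alongside the species and survey attributes for that geometry.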

Instead, we are looking to leverage the open GeoPackage format. Generally speaking, this format gives us an open and interoperable approach that can be used across a range of GIS software applications. The GeoPackage format has been around for years, and provides a more accessible alternative to proprietary geodatabase formats that you can only really use in particular GIS software. It also allows configuration and customisation through the SQLite database on which it is based.
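Because a GeoPackage is an SQLite database underneath, its layer catalogue can be inspected with nothing more than Python's built-in sqlite3 module. The sketch below creates a simplified, in-memory imitation of the mandatory gpkg_contents table just to illustrate the structure (a valid GeoPackage requires more tables and columns than this); a real file written by QGIS or GDAL can be queried the same way:

```python
import sqlite3

# Every valid GeoPackage carries a gpkg_contents table cataloguing its
# layers. We mimic a cut-down version in memory to show the idea.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE gpkg_contents (
        table_name TEXT PRIMARY KEY,
        data_type  TEXT NOT NULL,
        identifier TEXT,
        srs_id     INTEGER
    )""")
conn.execute(
    "INSERT INTO gpkg_contents VALUES (?, ?, ?, ?)",
    ("survey_sites", "features", "Biodiversity survey sites", 4326),
)
layers = conn.execute(
    "SELECT table_name, data_type FROM gpkg_contents"
).fetchall()
print(layers)
```

Pointing the same SELECT at a .gpkg file on disk lists its layers without any GIS software installed, which is part of what makes the format so accessible.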

Finally, we have a responsibility to ensure that the biodiversity data is FAIR (Findable, Accessible, Interoperable, and Reusable). In my view, this is a challenge as much about data coming into a system as it is about the user experience of people trying to interact and get data out of a system.  Spending some quality time on both ends of the data chains is very important – and that’s why we’ve been working heavily on design for these systems, too.

By its nature, aggregating data from multiple sources across space and time comes with a suite of challenges, some of which I’ve touched on here. Our expertise in both biodiversity data and spatial data has been very useful in working through them on these projects.

If you want to know more about biodiversity information or spatial data, we would love to hear from you. Feel free to drop us an email or start a conversation on our social media platforms – Twitter, LinkedIn or Facebook.

Chris

GIS in your organisation: can you identify any pain points?
https://archive.gaiaresources.com.au/gis-organisation-can-identify-pain-points/ – Wed, 18 May 2022

The post GIS in your organisation: can you identify any pain points? appeared first on Gaia Resources.

I was recently involved in a GIS Health Check for one of our clients, Carbon Neutral, who wanted an outside perspective on ways they could improve their data management and spatial software. It was a really rewarding experience on a personal level to explore how another company uses spatial information, and I think we were able to give them some recommendations that will deliver real benefits to their business.

A GIS Health Check involves a process of investigation and discovery of an organisation’s Geographic Information System.  It includes a set of stakeholder interviews coupled with a hands-on exploration of the GIS data and software. It also considers other aspects of the broader system environment; but, at the end of the day, what are the benefits of a GIS Health Check? 

Well, let me explain by giving you an example. 

Think of the uses your company makes of spatial information. Can you identify any pain points, or areas where you think things could be done more effectively? Throughout the process, these are the key business problems we want to come back to, to ensure that whatever is recommended is aimed at delivering real benefit and value. These may be things that jump out straight away when interviewing stakeholders, or they may be more stealthily embedded issues that need more analysis and thinking. Before we think about the solution, it is critical that the GIS Health Check identifies the problems that need to be solved.

During a GIS Health Check, we consider five elements: People, Processes, Data, Software and Hardware. When these elements are appropriately resourced and working together effectively, we can say we have a healthy GIS. However, if one (or more) element is not operating efficiently, or is lacking focus in the organisation, problems appear in the organisation’s workflow.

Each organisation has a different aim when requesting a GIS Health Check. For some, a major issue is about historical data management and a resulting lack of structure in the database system (e.g. layer naming, authoritative sources, accessibility). Others might be more interested in improving their processes, writing documentation on them and making sure all the GIS stakeholders are aware and brought into a consistent framework. There are companies that think the software they are using might not be the best fit for their work, or that alternative products may be available that will deliver better value and efficiency. Most of the time, it is a combination of issues from the different elements. 

So when it comes down to an individual GIS Health Check – what do we at Gaia Resources actually do? Even though the objectives differ for each organisation, the approach is much the same. We start by assessing the current state of the system: we interview all the GIS users in the company to identify what is working well and where the ongoing issues are. We also get inside your system to ‘lift the hood’ and evaluate the organisation’s spatial data and how it is organised, checking for duplicate files, naming conventions and their degree of adoption, folder structures, software used, key business processes supported by spatial information, and many other aspects.

We also tap into the knowledge of the organisation’s stakeholders, to understand the business context, related strategy objectives and to gain an understanding of how staff think their Geographic Information System should evolve. Basically, what does the future state look like?

Based on our review of the current and desired future states, we put our thinking caps on to brainstorm and provide recommendations. These are meant to improve the future state of the company’s GIS environment, and provide tangible strategies and actions for getting there. The recommendations are classified in terms of priority, estimated effort and category (e.g. the five GIS environment elements covered above).

Gaia Resources has conducted a significant number of these GIS Health Checks over the years, as mentioned in this blog. Some of our clients are IGO, Redbank Copper, MBS Environmental, OEPA and, recently, Carbon Neutral. We keep in touch with these organisations, and we continue to support some of them with QGIS training and software development.

Having been through the process, and really benefited from the deliverables of previous GIS Health Checks, I took a page out of our own book, so to speak, and developed a guideline on “How to deliver a GIS Health Check”. We reviewed previous projects and identified all the common points for a successful Health Check. That guideline is now a resource for our Data Science team and project managers, who will no doubt be helping more organisations in the future.

To summarise (and answer my first question), having your GIS checked can bring many long-lasting benefits, from improving workflow efficiency and consistency, to enhancing decision making and building capabilities in your team. We help organisations achieve this by working with them to focus on the key business challenges where spatial information can play a role. 

If you think the GIS environment in your organisation could do with a review, reach out and start a conversation with us via email, or through our social media platforms –  Twitter, LinkedIn or Facebook. We are here to help!

Rocio

Archiving Spatial Data
https://archive.gaiaresources.com.au/archiving-spatial-data/ – Wed, 06 Apr 2022

The post Archiving Spatial Data appeared first on Gaia Resources.

Last week, late local time on a Wednesday night, I was excitedly listening and watching presentations, writing notes and tweeting during the Digital Preservation Coalition’s (DPC) online seminar “Where are we now? Mapping progress with geospatial data preservation”.

When we saw that the DPC were putting on an event that included both archives and spatial data, I was super keen to be involved. My first degree, in Geography, was all about spatial data – a love which persisted through the ecology and environmental stage of my career to the creation of Gaia Resources. Along the Gaia Resources journey, as I talked about briefly at the recent Australian Society of Archivists WA branch meeting, we have ended up working with archives all around Australia… and then suddenly there was the DPC event, combining those two topics and giving me an excuse to stay up late and enjoy myself.

The event was run online, and included a series of talks followed by a short question and answer session at the end, with some good breaks in between to prevent attendees from going mad looking at a screen for four straight hours. It was expertly facilitated and run by the DPC staff, which is still no mean feat these days.

The talks were varied, and included groups from a variety of organisations and industries, including:

  • Organisations that were decommissioning first generation nuclear power plants in the UK, and trying to deal with the challenges of legacy spatial data,
  • The British Geological Survey, who provided an overview of how they’re going to deliver their spatial data into the future (more on that later)
  • Data standards bodies like Geonovum in the Netherlands, who pointed out how the W3C and OGC standards have been separated for way too long, but have been brought together with their joint working groups, 
  • The UK Geospatial Commission, who were discussing how to provide data of high quality that was also Findable, Accessible, Interoperable and Reusable (FAIR), 
  • The US Library of Congress, talking about what sorts of formats they accept into their archive through their Recommended Format Statement, 
  • A case study in the archaeology profession in the UK (and the lingering love of the shapefile), and
  • A final case study on the National Library of Scotland, and how they are preserving historical maps and making those available (and I then spent ages browsing their website – linked below).

The National Library of Scotland “Map Images” website – find it at https://maps.nls.uk/geo/explore/

There was a lot of content in here to digest, and I pulled out a few points that I’ll summarise as the highlights for me, which were:

  • The rise of the GeoPackage – as highlighted early in the British Geological Survey talk, they are moving towards delivering data in API-style feeds (using the OGC API standards) and delivering files in GeoTIFF, GeoJSON and GeoPackage only.  This was echoed by the Library of Congress preferring the GeoPackage format as well – and for many good reasons.  If you’re working with digital spatial data, then you need to start investigating GeoPackages and how they’re emerging as a great open standard,
  • International differences – listening to talks from the nuclear decommissioning crews and the archaeologists reminded me of where spatial data in the mining industry was at way back in the early days of Gaia Resources, when we were working with Western Australian mining companies to digitise all their legacy spatial data, and how far we’ve come since then – which will be the subject of a future blog from me as well,
  • Active community – there are a whole raft of people out there in the world who are looking at the same challenges we are in terms of digital preservation of spatial data, and the DPC seems to be a great place to connect to them – which we’ll be doing, and we’ve reached out to the DPC to see how we can help and be involved, and
  • Practical, pragmatic implementation – while I love detailed standards like any other spatial/archiving nerdy type, the practical applications of these were what really stood out in the talks.  Standards can often be developed in ivory towers, away from the practical implementation of them, but I saw some very close touchpoints here across the talks and as a result, there is plenty of good guidance for operators to follow.

I was also pleased – and somewhat relieved – to see a whole bunch of things in the talks that reminded me of work that we’ve done.  Apart from the example of our digitisation work in the mining sector here in Western Australia, the National Library of Scotland “Map Images” site is our Retromaps project on steroids, and there was a fair bit of thought around the use of GeoPackages, which we’re building into several of our larger data collation and aggregation projects at the moment.

 

This was my first introduction to the DPC – I’d heard about them before, but never been able to actively participate in an event. It certainly was worth staying up late to hear these different perspectives, and how the different players deal with the challenges of archiving spatial data.

If you’d like to talk more about the event, or find out more about how we’re working on capturing, managing and archiving spatial data in a range of industries here at Gaia Resources, feel free to drop us an email or start a conversation on our social media platforms – Twitter, LinkedIn or Facebook.

I must also say thanks once again to the team behind DPC, and all of the speakers, for an interesting Wednesday night!

Piers

Satellite platforms: free and open data for environmental monitoring
https://archive.gaiaresources.com.au/satellite-platforms-free-open-data-environmental-monitoring/ – Wed, 02 Mar 2022

The post Satellite platforms: free and open data for environmental monitoring appeared first on Gaia Resources.

My colleague Rocio recently wrote about her team winning a NASA global hackathon, with their prototype solution to monitor movement in trees from satellite imagery as an indicator of landslide risk (read about that here). It inspired me to do a bit of research and a refresh on my knowledge of common and not so well known satellite platforms out there for environmental monitoring.  

[Caveat – I cannot claim to be an expert in either environmental science or remote sensing disciplines, but I know there are many of us in the same boat. It’s tricky to keep track of it all, so I thought if I shared some information and tricks on how to use this data then hopefully I can give a few people a leg up.]

Satellites and remote sensing have played an important role for decades in monitoring land cover change, marine and climate conditions; but developments in this field have increased dramatically in recent years. New satellite platforms, cloud computing, computational capabilities, and free and open access data have allowed scientists and researchers to get their hands on more and more data ready to use for particular environmental applications. 

There are some heavy-hitting satellites out there that scientists and researchers know and love – or hate, depending on their context! The MODIS, Landsat and Sentinel platforms (outlined in the table below) provide imagery at different resolutions, multispectral band combinations and revisit frequencies. For example, a scientist concerned with bushfire risk may leverage all three in different contexts to provide temporal and spatial coverage across such a complex issue, spanning vegetation condition, climate/weather and fuel loads. For other applications, one can get a lot out of a single satellite platform.

Table 1: Overview specifications of some of the most popular satellite platforms used for environmental monitoring applications.

Satellite | Description | Sensor type/product | Resolution | Frequency
MODIS (Terra and Aqua) | Atmospheric, land and ocean multispectral imagery, including 36 bands | Moderate Resolution Imaging Spectroradiometer | 250m, 500m, 1000m | Twice daily
Landsat 7 | Multispectral imagery, including 8 bands | Enhanced Thematic Mapper+ (ETM+) | 30m, 15m | 16 days
Landsat 8 | Multispectral imagery, including 9 bands | Operational Land Imager (OLI) | 30m, 15m | 16 days
Landsat 8 | Thermal imagery, including 2 bands | Thermal Infrared Sensor (TIRS) | 100m | 16 days
Landsat 9 | Multispectral imagery, including 9 bands | Operational Land Imager-2 (OLI-2) | 30m, 15m | 16 days
Landsat 9 | Thermal imagery, including 2 bands | Thermal Infrared Sensor-2 (TIRS-2) | 100m | 16 days
Sentinel-1 | Synthetic Aperture Radar (SAR) imagery | C-band SAR | 5x5m, 5x20m, 20x40m | 6 days
Sentinel-2 | Multispectral imagery, including 13 bands | Multispectral Instrument (MSI) | 10m, 20m, 60m | 5 days

Spectral band comparison between Landsat 5 (TM), Landsat 7 (ETM+), Landsat 8 and 9 (OLI, OLI-2).

The Landsat mission spans six decades, with a free archive of historical imagery readily available going back as far as 1972. With each launch – most recently Landsat 9 in September 2021 – NASA has made progressive improvements in technology and spectral parameters while maintaining data consistency and a long-term monitoring record. Landsat 9, for instance, has the same spatial resolution as its predecessor but higher radiometric resolution (14-bit quantisation compared to 12-bit for Landsat 8). This allows its sensors to detect more subtle differences, especially over darker areas such as water or dense forests: Landsat 9 can differentiate 16,384 shades of a given wavelength, compared to 4,096 shades in Landsat 8 and 256 shades in Landsat 7 (source: USGS).
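The arithmetic behind those shade counts is simply powers of two: an n-bit sensor distinguishes 2^n levels per band.

```python
# Radiometric resolution: an n-bit sensor distinguishes 2**n levels.
for satellite, bits in [("Landsat 7", 8), ("Landsat 8", 12), ("Landsat 9", 14)]:
    print(f"{satellite}: {bits}-bit -> {2 ** bits} shades")
```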

What I find amazing is how close these satellites’ orbits really are to us – at between 700-800 km altitude, these things are imaging the Earth from a distance less than that between Sydney and Melbourne, and whizzing past at 26,972 km/hr!
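That speed checks out with a bit of orbital mechanics: for a circular orbit, v = √(GM/r). A back-of-the-envelope check in Python, assuming a circular orbit at roughly 750 km altitude:

```python
import math

# Sanity check of the quoted orbital speed, assuming a circular orbit
# at ~750 km altitude (roughly where Landsat and Sentinel-2 fly).
GM = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000   # mean Earth radius, m

def orbital_speed_kmh(altitude_m: float) -> float:
    v = math.sqrt(GM / (R_EARTH + altitude_m))  # circular-orbit speed, m/s
    return v * 3.6                              # convert to km/h

print(round(orbital_speed_kmh(750_000)))  # roughly 27,000 km/h
```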

GIS packages like QGIS and other analytics platforms can ingest and visualise satellite data in a number of formats. You can either download the imagery directly from their online portals – such as the USGS Earth Explorer and the Copernicus Open Access Hub – or connect to web map services in the form of WMS and WMTS layer types.
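Under the hood, a WMS layer is fetched via parameterised GetMap URLs. The sketch below builds one with the Python standard library; the endpoint and layer name are placeholders, not a real service:

```python
from urllib.parse import urlencode

# Build a WMS 1.3.0 GetMap request URL by hand -- this is what a GIS
# client sends when you add a WMS layer. Endpoint and layer name below
# are hypothetical placeholders.
def wms_getmap_url(base_url, layer, bbox, width=512, height=512):
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        # WMS 1.3.0 with EPSG:4326 uses lat,lon axis order
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return f"{base_url}?{urlencode(params)}"

url = wms_getmap_url("https://example.org/wms", "true_colour",
                     (-32.5, 115.5, -31.5, 116.5))
print(url)
```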

QGIS showing Landsat 9 imagery of Perth (left) alongside higher resolution Sentinel-2 imagery (right).

The QGIS plugin repository contains a number of freely available plugins offering access to satellite base map services, and others with easy-to-use facilities to search and download the raw imagery for analysis. Still others offer spatial layers derived from these satellite sources – the NAFI plugin we developed is one of many.

Google Earth Engine (GEE) is a platform we’ve started to use for analysis and visualisation of geospatial datasets, and it is accessible to academic, non-profit, business and government users. We were able to process large volumes of imagery to detect changes in forest cover and vigour against a long-term baseline (read more about that project here). GEE hosts publicly available satellite imagery with historical earth images going back more than forty years. The images are available globally and ingested daily, which makes the platform powerful for monitoring and prediction applications. It also provides Application Programming Interfaces (APIs) and other resources, like Jupyter Notebook scripts, to enable the analysis of large volumes of data.

Earth on AWS is another source of open data that helps you discover and share datasets for geospatial workloads. AWS Marketplace has a large number of geospatial, GIS and location-based applications that can benefit planning, predictive modelling and mapping applications. 

This movement towards free and open-source satellite data – and the growth of enabling platforms – offers incredible opportunities for environmental scientists, encouraging new questions to be explored at regional and continental scales.

At a talk organised by the Research Institute for the Environment and Livelihoods (RIEL) back in 2019, I was introduced to a few lesser known satellite platforms that have plenty to offer for environmental monitoring. The table below provides just a snapshot, but I am certain there are many more out there and I am only scratching the surface:

Table 2: Overview of other satellites used for environmental monitoring. Links are provided to specifications and available products.

| Satellite | Mission/Purpose | Sensor type/product | Resolution | Frequency |
|---|---|---|---|---|
| Himawari 8 | Near real-time weather satellite used for weather imagery. | Advanced Himawari Imager (16 bands) | 500 m, 1000 m, 2000 m | 10 min |
| Global Ecosystem Dynamics Investigation (GEDI) | To understand how deforestation has contributed to atmospheric CO2 concentrations, how much carbon forests will absorb in the future, and how habitat degradation will affect global biodiversity. | LiDAR (Light Detection and Ranging); products include canopy height and profile, ground elevation, leaf area index, above-ground biomass | 25 m, 1000 m | Variable |
| EnMAP hyperspectral satellite (planned launch in 2022) | To monitor ecosystems by extracting geochemical, biochemical and biophysical parameters on a global scale. | Hyperspectral band imagery (131 bands) | 30 m | 4 days |
| Sentinel-3 | To measure sea surface topography, sea and land surface temperature, and ocean and land surface colour to support ocean forecasting systems, environmental and climate monitoring. | Four main sensors: OLCI, SLSTR, SRAL, MWR | 300 m, 500 m, 1000 m | <2 days |
| Sentinel-4 | To monitor key air quality, trace gases and aerosols over Europe at high spatial resolution and with a fast revisit time. | Multispectral imagery (3 bands) | 8000 m | 1 hour |
| Sentinel-5 / Sentinel-5P | To provide atmospheric measurements and climate monitoring, relating to air quality, ozone and UV radiation. | Multispectral imagery (7 bands); TROPOspheric Monitoring Instrument (4 bands) | 7500 m, 50,000 m | Daily |
| Sentinel-6 | To provide enhanced continuity to the mean sea level time-series measurements and ocean sea state that started in 1992 with previous missions. | Three sensors: Synthetic Aperture Radar (SAR), Advanced Microwave Radiometer, High Resolution Microwave Radiometer | 300 m | 10 days |

The Himawari satellite viewer (link) provides a continental scale animation of weather systems. Cyclone Anika is shown crossing the Kimberley region of Western Australia.

Remote sensing and Earth Observation is a whole world (sorry, pun intended) of specialised science and data unto itself. There is so much research out there, but also some practical analysis and visualisation tools to help people in the environment space apply these resources to real-world applications. I must admit the more I dig into different satellite platform websites and their data products, the more I discover that could be valuable. I hope I’ve been able to give people a sense of the potential out there, and we’ll also think about building some of this content into a QGIS training module in the near future. 

Contact us via email or start a conversation with us on one of our social media platforms –  Twitter, LinkedIn or Facebook.

Chris

The post Satellite platforms: free and open data for environmental monitoring appeared first on Gaia Resources.

]]>
The annual FOSS4G Conference: Celebrating Open Source Software in the Spatial Community https://archive.gaiaresources.com.au/annual-foss4g-conference-celebrating-open-source-software-spatial-community/ Wed, 01 Dec 2021 03:16:12 +0000 https://archive.gaiaresources.com.au/?p=9729 You may have heard about free and open source software – we’ve talked about it a lot at Gaia, and have practically built the business off of it. There’s a whole suite of open source software which serves the geospatial community, bringing powerful mapping and database tools to the world at the most affordable price... Continue reading →

The post The annual FOSS4G Conference: Celebrating Open Source Software in the Spatial Community appeared first on Gaia Resources.

]]>
You may have heard about free and open source software – we’ve talked about it a lot at Gaia, and have practically built the business off of it. There’s a whole suite of open source software which serves the geospatial community, bringing powerful mapping and database tools to the world at the most affordable price point possible – free – which empowers people far and wide regardless of financial or social status.

To celebrate this software and bring the spatial community together, an annual conference is held known as FOSS4G, or Free and Open Source Software for Geospatial. This year Gaia Resources were very proud to both sponsor and help facilitate the conference on 12-13 November. The organising committee consisted of a crack team of volunteers from a range of businesses and educational facilities, who pulled off an incredible two-day event jam-packed with information and hands-on learning.

Things got off to a hairy start when one of our presenters came down with COVID-like symptoms and had to quarantine, but alas, these are the times we live in. The presentations that weren’t foiled by COVID were filmed and are available on the FOSS4G SotM Oceania YouTube channel.

Russell Keith-Magee discusses his experiences in contributing to the open source community.

This year’s keynote presenters gave us a lot of food for thought: Russell Keith-Magee treated us to an energetic and enlightening introduction to the world of contributing to open source software. The audience were captivated and hopefully a few were inspired by his note that you don’t need to be able to code in order to contribute. Then Femina Metcalfe and Helen Ensikat unveiled the long journey to bringing open source software to the local government sector in Western Australia, revealing incredible foresight, persistence and tenacity. 

A series of presentations and 5 minute lightning talks, interspersed with top-notch catering from Joey Zaza’s, made for an enjoyable and educational event. We learnt about how open source spatial software is being used in the private, government and education sectors; we were shown how to collect spatial data in the field using the free QField mobile app; and we were treated to a number of fascinating scientific studies which were undertaken utilising free and open source software. 

A personal highlight for me was our own committee member John Bryant experiencing some technical difficulties at the start of his 5 minute lightning talk about new features in QGIS, and having to speed through the rest of it. He made it with seconds to spare, and got a cheer from the audience. 

What I love most about this particular conference is the ability to network and connect – I really feel it’s the ethos of open source that facilitates the desire to share your ideas, learnings and data with the community. This was such a welcome change from conferences which are geared around sales pitches and profit. 

The organising committee would like to extend a massive thank you to the sponsors of the event, without which we couldn’t hold it. These amazing companies are fostering the availability of powerful software tools to the world and the removal of socio-economic boundaries. 

Special thanks to our venue sponsor FLUX, who allowed us to fill their terrific Basement venue with raucous nerdery for the day. 

And of course an enormous kudos to the organising committee, who put in months of effort to make the event happen (big shout out to John Bryant and Maia Williams).

If you’d like to know more about FOSS4G, check out their website. If you’re interested in getting involved in the event for next year, free to get in touch via email, or start a conversation with us on Facebook, Twitter or LinkedIn.

Cheers!
Tracey

  



Organisers
John Bryant
Maia Williams
Tracey Cousens
John Duncan
Bryan Boruff
Sam Wilson
Ivana Ivanova
Nick Middleton
Nimalika Fernando
Daniel Moore
Piers Higgs

Volunteers
Cholena Smart
Keith Moss
Grant Boxer
Petra Helmholz
Rocio Peyronnet
Rachel Pennington
Angus Mackay
Gail Wittich

 

The post The annual FOSS4G Conference: Celebrating Open Source Software in the Spatial Community appeared first on Gaia Resources.

]]>
Open source software and open data https://archive.gaiaresources.com.au/open-source-software-open-data/ Wed, 20 Oct 2021 01:59:37 +0000 https://archive.gaiaresources.com.au/?p=9653 Perth is about to host the FOSS4G Oceania Conference (Perth Hub) on 12-13 November 2021, and up here in Darwin I’m just a tiny bit disappointed I can’t go along to take part. My office buddy Tom Lynch will be heading there to give a presentation, which I’ll talk a bit more about later, as will... Continue reading →

The post Open source software and open data appeared first on Gaia Resources.

]]>
Perth is about to host the FOSS4G Oceania Conference (Perth Hub) on 12-13 November 2021, and up here in Darwin I’m just a tiny bit disappointed I can’t go along to take part. My office buddy Tom Lynch will be heading there to give a presentation, which I’ll talk a bit more about later, as will a number of friends and former work colleagues. 

FOSS4G is short for ‘Free and Open Source Software for Geospatial’ – it’s a great convergence of people who are passionate about open source software and open geospatial data, and want to share their experiences. It’s safe to say we all see the business value and the opportunities for innovation and for creating good in this world through sharing and collaborating.

Maybe you haven’t heard the terms open source or open data before, or perhaps you’ve heard them in comparison to Commercial Off The Shelf (COTS) – or proprietary – products? In either case, let’s have a look at what a few of these terms mean:

  • Open source software is where the copyright holder grants users the rights to use, study, change, and distribute the software and its source code to anyone and for any purpose. Often source code will be collaborated on and shared through public channels like GitHub.
  • Open Data is the concept or commitment to make data products freely available to everyone to use and republish as they wish, without restrictions from copyright, patents or other mechanisms of control.
  • Open API is an open Application Programming Interface specification for describing, producing and consuming web services. It allows organisations to open up controlled gateways into their systems, and encourage third parties to build integrations into their own applications.  

There are some truly massive open source software projects out there that are breaking new ground and really challenging the COTS providers for functionality and benefits. In the spatial realm QGIS desktop software and PostGIS relational databases provide free and open source equivalents to COTS products.  In statistics, we make use of products like the R Project, and in software engineering you see Python, PHP, and other open source programming languages everywhere. Even on relatively closed software products, there is a trend to create open APIs so that systems can more easily integrate and exchange data.  

A nice example of QGIS and Python development is what Tom will be talking about at FOSS4G in relation to our involvement with the Northern Australian Fire Information program. The NAFI website has for several years built up an impressive array of fire related data products and services that support land managers (see our previous blogs). For the NAFI QGIS plugin, we leveraged the QGIS open source plugin framework to create a quick access tool for the folks who rely on that desktop package for fire management activities.

The NAFI QGIS plugin places a quick layers panel to the left for easy access to data layers.

We are also close to releasing another plugin that streamlines fire scar mapping tasks for Bushfires NT staff in the Northern Territory using Sentinel imagery from the European Space Agency (another free data product).

It’s not just feature parity and lower price that makes these open source products appealing—it’s also the flexibility and community-driven development opportunities they offer that allow organisations to build their own software interfaces, plug-ins, models and extensions to tailor functionality to meet real business needs.

Increasingly, government agencies publish “open data portals” like data.gov.au as an entry point for easy access to FAIR data extracts and web services – by FAIR we mean data which meet principles of Findability, Accessibility, Interoperability, and Reusability. The Open Geospatial Consortium (OGC) standardised web service formats (e.g. WMS, WMTS, WFS) that these agencies publish to are a linchpin of many spatial systems out there. These agencies recognise that FAIR data and open source software availability can kick start and accelerate a range of innovative products and applications they could only guess at.

If you are in a business evaluating software solutions – and I have been on both sides of that supplier-buyer fence – your decision process likely involves evaluating against a number of business risks. I would say that a well-supported open source product could have a lot to offer in terms of reducing those risks:

| Risk area | Reframed |
|---|---|
| Functionality: will this open source product meet all of our business requirements and needs, or cost extra in customisations? | Does the open source solution meet the majority of our requirements, and allow us to redirect otherwise sunk licensing costs into features tailored to our needs? |
| Financial: what will be the Total Cost of Ownership (TCO) for this open source system over X years, including support, training, maintenance and infrastructure? | Understand how the open source solution stacks up in terms of TCO, also taking into account licensing, annual maintenance and other costs that don’t apply. |
| Operational: will the open source solution help us meet our objectives for streamlining and delivering new capabilities? | Fair question – does the open source solution offer a framework for building tools, apps and web-based solutions? |
| Support: who can we depend on for support when there is no vendor? | Rather than vendor support, consider that you have access to a community of users and consultants who can provide support – not to mention the skills within your team to support the solution internally. |

Other questions worth considering are: how many users are there actively using the product? How often is it updated? Do others find it easy to learn and use? What skills do you need to build on it? All the same questions you might ask of a COTS product, to be honest.  

When you make the choice to use a product like QGIS or to build your own open source solution, know that there is a whole community out there (including us!) willing to lend a helping hand. For whatever challenge you have, chances are that there is someone that has tackled something similar, and has shared a solution or developed a script or plug-in, where you can save time and potentially add value back. 

I really hope everyone heading along to the FOSS4G conference has a great time, and comes away with a basket full of ideas and new connections in their open geospatial journey. If you’d like to strike up a conversation, please feel free to contact me or hit us up on Twitter, LinkedIn or Facebook.

Chris

The post Open source software and open data appeared first on Gaia Resources.

]]>
Two day QGIS training course https://archive.gaiaresources.com.au/two-day-qgis-training-course/ Wed, 06 Oct 2021 02:19:41 +0000 https://archive.gaiaresources.com.au/?p=9631 Wanting to make sure you know your vectors from your rasters?  Need to make professional quality maps of your spatial data? Gaia Resources have scheduled another of our highly regarded 2-day QGIS for Beginners training course. This course is perfect for those looking to upskill in spatial software and would suit anyone from land managers... Continue reading →

The post Two day QGIS training course appeared first on Gaia Resources.

]]>
Wanting to make sure you know your vectors from your rasters?  Need to make professional quality maps of your spatial data? Gaia Resources have scheduled another of our highly regarded 2-day QGIS for Beginners training course. This course is perfect for those looking to upskill in spatial software and would suit anyone from land managers to mining crew.

The course will be held over two days – 18th and 19th November – at our office on St Georges Terrace in the Perth CBD. 

You will learn the fundamentals of GIS and the QGIS software, including:

  • Coordinate Reference Systems
  • Vector and Raster data
  • Creating & editing shapefile data
  • Symbology & styling data
  • Georeferencing images
  • And the most fun part: Making maps

We keep the class size small (10 people or fewer) so that our trainer can spend plenty of one-on-one time with you and make sure everyone gets maximum value and learnings from the material.

If the Beginner’s course doesn’t quite meet your requirements or you’d like something more advanced, we can also customise the course to include advanced features important to your enterprise. For companies looking to train multiple staff we can also deliver this course at your own facilities, or even offer a condensed one-day version.

We have limited spaces available for our course, and we’d love to have you there! If you’d like to register, or if you’d like to discuss custom training requirements, please contact us via training@gaiaresources.com.au or call us on 08 9227 7309.

You can also read more about our training here https://archive.gaiaresources.com.au/services/training/ or start a conversation with us on Facebook, Twitter or LinkedIn.

Gus

The post Two day QGIS training course appeared first on Gaia Resources.

]]>
A platform for monitoring forest canopy cover https://archive.gaiaresources.com.au/platform-monitoring-forest-canopy-cover/ Wed, 18 Aug 2021 05:10:30 +0000 https://archive.gaiaresources.com.au/?p=9511 We recently completed a technical investigation into leveraging Sentinel satellite imagery for monitoring forest canopy cover in forested areas in Victoria. The investigation was a really interesting deep dive for our data scientists into some free and open-source analytical tools and techniques our client could use to assess one aspect of bushfire risk. The client... Continue reading →

The post A platform for monitoring forest canopy cover appeared first on Gaia Resources.

]]>
We recently completed a technical investigation into leveraging Sentinel satellite imagery for monitoring forest canopy cover in forested areas in Victoria. The investigation was a really interesting deep dive for our data scientists into some free and open-source analytical tools and techniques our client could use to assess one aspect of bushfire risk. The client wanted a repeatable operational tool they could use to hone in on areas of higher risk, and make some informed decisions about where to focus limited field resources.

Forest ecosystems play an essential role in the environment. Monitoring and detecting change in forests is important for the development of conservation policies that would lead to sustainable forest management. For this purpose, Earth Observation (EO) data can be analysed in order to assess disturbances in forest vegetation, as it can reach a worldwide coverage with a high temporal frequency at a low cost. Currently, remote sensing techniques are being used to process EO data from passive and active sensors, providing fast and accurate results across a wide range of applications.

The idea for our particular study was to look at Sentinel-2 (optical) and Sentinel-1 (Synthetic Aperture Radar, or SAR) inputs into a processing model that outputs a regional NDVI raster coverage. The Sentinel platforms capture an image every 5 days for a given patch of the world, and so the potential was there to look at a long term monitoring and change detection tool. We needed to assess the imagery products to see if they could give us useful and consistent output, and groundtruth that against known areas of change on the ground. We also needed to know what technology was out there to help us in this challenge, and what others (researchers and private organisations) had done to solve similar challenges.

The Sentinel-2 platform (left) from the European Space Agency provides coverage every 5 days that can be used for forest canopy cover applications.

We started with a literature review to uncover research that had been conducted on the use of Sentinel and Landsat imagery for forest cover change, in south-eastern Australia, nationally and overseas. This step also included looking at current ground-based forest canopy assessment techniques. With the short term nature of our investigation we had to tightly target and timebox our search, but we were able to find some really good material from research at universities across Australia and major research organisations. The Veg Machine project, and the previous work done by its originators Jeremy Wallace and Bob Karfs at CSIRO on long-term monitoring using Landsat coverage, was an inspiration for our modelling approach – as were the personal experiences of our team members from previous projects and roles.

The literature review had another, more software-focused aspect, as we were looking at a number of analytics platforms that could be the processing backbone and visualisation tool for our modelling. From this we decided to pick up some Jupyter Notebook scripts in Geoscience Australia’s Digital Earth Australia (DEA) platform, and to leverage the Google Earth Engine (GEE) platform. The DEA product enabled generating outputs for a regional scale view, and the GEE platform enabled users to produce NDVI plots on demand for a given local area. The two platforms complemented each other by providing that regional overview and target-area time series plots.

The Digital Earth Australia (DEA) platform and Jupyter Notebook scripts configured for the regional comparison of NDVI images against a long-term baseline.  

The Google Earth Engine platform enables users to look at time series plots of NDVI for an area of interest.

We devised a modelling approach that would ingest new Sentinel imagery and compare it against a three-year rolling NDVI baseline. If the new image contained pixels above or below our thresholds, the change would simply show up as a different colour on the map: green for significant positive change, red for significant negative change. In this proof-of-concept investigation the client was happy to simply detect a change of significance; the reason for that change was something they could target and follow up on. That reason could be anything from heat stress, land clearing, planned or unplanned fire activity, or disease. We also considered seasonal differences and the frequency of images for processing within the modelling approach.
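The per-pixel rule can be sketched in a few lines of Python. The thresholds and NDVI values below are illustrative only, not the ones used in the project:

```python
# Sketch of the change-detection rule: compare each pixel's new NDVI
# against a rolling baseline mean and flag significant change.
# Thresholds and data are illustrative assumptions.
NEG_THRESHOLD = -0.15   # NDVI drop flagged as significant loss (red)
POS_THRESHOLD = 0.15    # NDVI gain flagged as significant growth (green)

def classify_change(baseline_ndvi, new_ndvi):
    """Return a per-pixel label: 'loss', 'gain' or 'stable'."""
    labels = []
    for base, new in zip(baseline_ndvi, new_ndvi):
        diff = new - base
        if diff <= NEG_THRESHOLD:
            labels.append("loss")     # e.g. clearing, fire scar, disease
        elif diff >= POS_THRESHOLD:
            labels.append("gain")     # e.g. regrowth
        else:
            labels.append("stable")
    return labels

baseline = [0.70, 0.65, 0.40, 0.55]  # 3-year mean NDVI per pixel (made up)
latest   = [0.30, 0.66, 0.60, 0.50]  # newest image's NDVI per pixel (made up)
print(classify_change(baseline, latest))
# ['loss', 'stable', 'gain', 'stable']
```

In production this comparison runs over whole rasters rather than Python lists, but the decision logic per pixel is the same.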

Finally, operational staff at the client’s offices use ArcGIS and QGIS software for a range of mapping and analysis tasks. The final raster outputs from the DEA and GEE platforms are capable of being visualised and analysed further in GIS packages along with other key operational and administrative layers.

GIS software can visualise the raster outputs from the modelling for further analysis and decision support.

So as a first step proof-of-concept investigation, we were able to document a technical and operational approach for our client to detect forest cover canopy change and support their bushfire risk decisions. The next stage coming up will be all about implementation and scaling a solution on AWS cloud infrastructure.

We’d love to hear from you if you have been involved in using any of the tools or applications mentioned here, or you’d just like to have a chat out of interest. Feel free to contact me or hit us up on Twitter, LinkedIn or Facebook.

Chris

The post A platform for monitoring forest canopy cover appeared first on Gaia Resources.

]]>
EOFY – QGIS Training Course https://archive.gaiaresources.com.au/eofy-qgis-training-course/ Wed, 09 Jun 2021 01:38:05 +0000 https://archive.gaiaresources.com.au/?p=9307 With the end of the financial year fast approaching, Gaia Resources has decided to hold our highly regarded 2-day QGIS for Beginners training course. This course is perfect for those looking to upskill in spatial software and would suit anyone from land managers to mining crew.  The course will be held over two days on... Continue reading →

The post EOFY – QGIS Training Course appeared first on Gaia Resources.

]]>
With the end of the financial year fast approaching, Gaia Resources has decided to hold our highly regarded 2-day QGIS for Beginners training course. This course is perfect for those looking to upskill in spatial software and would suit anyone from land managers to mining crew. 

The course will be held over two days on Monday 21st and Tuesday 22nd June at our office on St Georges Terrace in the Perth CBD. 

You will learn the fundamentals of GIS and the QGIS software, including:

  • Coordinate Reference Systems
  • Vector and Raster data
  • Creating & editing shapefile data
  • Symbology & styling data
  • Georeferencing images
  • And the most fun part: Making maps

We keep the class size small (10 people or fewer) so that our trainer can spend plenty of one-on-one time with you and make sure everyone gets maximum value and learnings from the material.

If the Beginner’s course doesn’t quite meet your requirements or you’d like something more advanced, we can also customise the course to include advanced features important to your enterprise. For companies looking to train multiple staff we can also deliver this course at your own facilities, or even offer a condensed one-day version.

We have limited spaces available for our June course, and we’d love to have you there! If you’d like to register, or if you’d like to discuss more custom training requirements, please contact us via: 

Email: training@gaiaresources.com.au
Phone: 08 9227 7309

You can also read more about our training here: https://archive.gaiaresources.com.au/services/training/ or start a conversation with us on Facebook, Twitter or LinkedIn.

 

 

The post EOFY – QGIS Training Course appeared first on Gaia Resources.

]]>
The NAFI app is changing the way work is planned in the field https://archive.gaiaresources.com.au/nafi-app-changing-way-work-planned-field/ Wed, 28 Apr 2021 01:40:21 +0000 https://archive.gaiaresources.com.au/?p=9220 Controlled burning is underway across the western and central parts of tropical north Australia. As we move into the dry season and the floodways on our Top End roads become accessible, indigenous groups, parks managers and farmers are keen to get those early season burns in full swing. This type of fuel mitigation burning happens... Continue reading →

The post The NAFI app is changing the way work is planned in the field appeared first on Gaia Resources.

]]>
Controlled burning is underway across the western and central parts of tropical north Australia. As we move into the dry season and the floodways on our Top End roads become accessible, indigenous groups, parks managers and farmers are keen to get those early season burns in full swing. This type of fuel mitigation burning happens at a time of year when there is moisture in the soil and vegetation, in order to limit more catastrophic bushfires later in the season when everything has dried up. It reminds me of the explanation Dom Nicholls from the Mimal Rangers gave me over a coffee chat last year, when he said in East Arnhem Land they begin their programs as early as they can get the flames to take hold in the grassy vegetation – in March if they can get road access – and then race to fill the gaps later using fire scar mapping and careful planning.

Farmers like Mark Desaliy can use the app to monitor fires near their stations.

Our initial release of the North Australia and Rangelands Fire Information (NAFI) app for iOS and Android back in February brings the most used fire information resource for land managers in Australia to your phone, allowing you to keep a constant eye on bushfire threats. You can view maps of satellite generated fire activity (hotspots) and burnt areas (fire scars) provided by the NAFI service. There’s a good summary back in March from Rohan Fisher on ABC Radio – Kimberley.

At a regional scale like this area in northern NT and WA, the NAFI app represents real-time hotspots through a heat map clustering algorithm.
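One simple way to cluster hotspot points for regional-scale display is to bin them into coarse grid cells and count detections per cell. The sketch below is an illustrative approximation, not the NAFI app's actual algorithm, and the coordinates are made up:

```python
from collections import Counter

# Illustrative grid-binning of hotspot points: snap each (lat, lon) to a
# coarse cell and count detections per cell. A renderer can then size or
# colour each cell by its count, heat-map style.
def bin_hotspots(points, cell_deg=0.5):
    """points: iterable of (lat, lon) tuples; returns Counter of cell -> count."""
    cells = Counter()
    for lat, lon in points:
        cell = (round(lat / cell_deg) * cell_deg,
                round(lon / cell_deg) * cell_deg)
        cells[cell] += 1
    return cells

# Hypothetical detections: three close together, one isolated
hotspots = [(-15.61, 128.72), (-15.58, 128.74), (-15.62, 128.71),
            (-17.30, 123.63)]
print(bin_hotspots(hotspots))
```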

Just to recap on how the app works behind the scenes to provide you with real-time fire information:

  • The hotspot locations are updated several times a day and the fire scars are updated up to once or twice a week depending on fire conditions.
  • The fire scars are produced by the NAFI Service and the hotspots are sourced from Landgate WA and Geoscience Australia.
  • Base maps for imagery and topography can be downloaded for offline use in your region of interest, and then used for when you go outside of mobile data range.
  • Burnt area mapping covers the Australian Savannas and rangelands that comprise around 70% of Australia, but does not cover NSW, VIC or the heavily populated regions of QLD, WA and SA.

So how popular is the NAFI app? Well, we can monitor a number of analytics using iOS AppStoreConnect and the Google Play Console, or the Firebase dashboard. These are configurable dashboards that can tell us things like how many installations occurred by day or week and how many installs are actively used, filtered by operating system or device type. As of today, the iOS app has been downloaded 288 times since its initial release, and the Android version 142 times.

The App Store Connect dashboard for the iOS NAFI app provides statistics of installations by week since the mid-February release.
Google Play Console shows the increase in installations of the Android NAFI app over time since the mid-February release.

 

We expect installations to continue upwards through May and beyond, as more people on the ground become aware of the benefits and utility of the app. There are two phases of bushfire-related activity where the app can be useful: the early dry season burn programs and carbon (emission reduction) projects, and the late dry season bushfire response.

The statistics are anonymised, so we are not tracking personal information, but the out-of-the-box analytics do help us understand trends, and – along with ratings and word of mouth – give us a bit more insight into how people are reacting to the app. This then feeds into our strategy with clients, helping them target marketing campaigns and prioritise enhancements. We also use Firebase Crashlytics to log the details of any crashes and error messages received, and this really helps us get quickly to the root cause of a technical issue a particular user is experiencing.

Please be aware if you are using the app:

  • Hotspot location on any map may only be accurate to within 1.5 km
  • The hotspot symbol on the maps does not indicate the size of the fire
  • Some fires may be small, brief, or obscured by smoke or cloud and go undetected
  • Satellites also detect other heat sources, such as smokestacks

For more information visit: https://savannafiremapping.com/nafi-mobile-app/

If you would like to know more about our projects with the NAFI team, or just want to strike up a conversation, send me an email or get in touch on Twitter, LinkedIn or Facebook.

Chris

 

The post The NAFI app is changing the way work is planned in the field appeared first on Gaia Resources.

]]>
Remote learning: tips for trainers to maximise success https://archive.gaiaresources.com.au/remote-learning/ Thu, 25 Mar 2021 01:21:36 +0000 https://archive.gaiaresources.com.au/?p=9076 These days you can learn about almost any topic by watching videos online. But the benefits of having a trainer present to guide and correct you, troubleshoot issues, and maximise your learning makes face-to-face training invaluable. So how does this translate when the people you’re teaching are thousands of kilometres away, watching you on a... Continue reading →

The post Remote learning: tips for trainers to maximise success appeared first on Gaia Resources.

]]>

Credit: Chris Montgomery (Unsplash)

These days you can learn about almost any topic by watching videos online. But the benefits of having a trainer present to guide and correct you, troubleshoot issues, and maximise your learning make face-to-face training invaluable. So how does this translate when the people you're teaching are thousands of kilometres away, watching you on a video call?

Whilst face-to-face trainers are irreplaceable in terms of effectiveness, particularly for novice classes, remote training has many benefits, such as:

  • Greater flexibility in timing and duration;
  • Reduced costs (especially if inter-regional travel is involved); and
  • A much greater geographic reach.

Recently a client rang asking if I could teach their team a couple of new skills in QGIS in a hurry so they could get a report over the line. We had to put together some training material in a short time frame and deliver it as an effective learning session. And by all accounts, the training was a success!

Here’s how we made it work:

  1. Small class size
  2. Use appropriate teaching mediums
  3. Teach the concepts, not just the content
  4. Give attendees prior knowledge of the topic
  5. Limit your audience appropriately
  6. Preparation!


1. Small class size

This one is a no-brainer. In a small group, the trainer can provide more one-on-one time, people are less likely to fall behind if they get lost at any stage, and you won’t need to stop as frequently to help people out as you would in a large class. Manageable class sizes are especially important when running remote training, since watching demos on a computer monitor can be trickier for students than being present in a room.


2. Use appropriate teaching mediums

The majority of people learn best via visual formats and hands-on exercises. As a trainer you’re already challenged with keeping attendees engaged and focused (doubly so for remote training), so look for opportunities to use visual learning tools.

It may be a cliché, but a picture really does paint a thousand words! Most people zone out when they see a wall of text (like this blog post).

Something as simple as a stick figure diagram in a slide show with some animated components can get through to your audience and give them that “Aha!” moment that is so gratifying as a trainer.

Here’s an image we’ve used in our QGIS course – a humorous but helpful take on the difference between raster and vector images (humour is another fantastic tool for learning!):


3. Teach the concepts, not just the content

You’ve got your training program established. You have a workbook full of exercises and instructions. It’s so easy to fall into the trap of just having attendees learn the HOW by working through those exercises like robots, without understanding the WHY.

Start with the concepts. Break them down into digestible explanations. Use analogies, diagrams, and practical real-world examples. Then open the floor up for discussion – get attendees to think about how this concept or tool might apply to their own work/life, or where they can see its application. Not only will this help them get their heads around the concepts, but it will also help you grow as a trainer with a better understanding of your target audience.


4. Give attendees prior knowledge of the topic

OK, understandably this is not always feasible – people are coming to you to learn a skill, after all. But where possible you can give students a leg-up with simple, engaging prerequisite material to help them grasp the fundamentals before the day of the actual training. This could take the form of educational videos, instructions on how to set up the software, or even a beginner's exercise for the course. By familiarising themselves with the software and material beforehand, attendees will come into your training with a rudimentary understanding, rather than starting blind.


5. Limit your audience appropriately

Something else to consider is limiting who you run remote training for, based on the difficulty of the training. In our case, the attendees all had some prior experience using other GIS software, which allowed them to navigate QGIS with relative ease. Where possible, try to gain an understanding of the proposed attendees and their relevant skills, and make a judgement call on whether your training is accessible enough to them in the remote format.


6. Preparation!

Another no-brainer here, but too often overlooked. Small things go wrong all the time, and they can diminish your appearance of professionalism and competence, as well as disrupt the class. Well in advance:

  • Triple-check all content and send out any necessary material to attendees.
  • Provide clear instructions to attendees with times, meeting links, and any prerequisites.
  • Do an internal “dress rehearsal” to check your camera, mic, slideshows etc.
  • Be sure to leave some wriggle room for technical difficulties (at both your end and the attendees').

With more people working from home, or staff scattered geographically, it's the perfect time to look at converting your training to an online offering, and hopefully these tips will get you off on the right foot. Take a look at our existing QGIS course information for in-person and online training.

If you have any further ideas, please leave a comment below. Or if you would like to talk to us about our QGIS training offerings, please get in touch with us via training@gaiaresources.com.au or our social media streams – Facebook, Twitter or LinkedIn.

Tracey

The post Remote learning: tips for trainers to maximise success appeared first on Gaia Resources.

]]>