Harnessing the Power of Artificial Intelligence to Capture Biodiversity Data
https://archive.gaiaresources.com.au/harnessing-power-artificial-intelligence-capture-biodiversity-data/ – Thu, 08 Sep 2022

The Northern Territory Government is one of the many organisations we help with environmental technology challenges. We've been having interesting conversations with the Flora and Fauna team at the Department of Environment, Parks and Water Security (DEPWS) for a while now, and recently we undertook a proof-of-concept project with them to streamline their field survey data collection using Artificial Intelligence (AI).

This proof-of-concept project came about from a chance discussion in a meeting about biological data and data standards, where we were talking about hard copy data capture forms and how time-consuming it was to transcribe them into a database. We have been working for some time on an AI transcription service offering, and so we proposed a trial of our Clio system to see whether it could solve the transcription problem for the Flora and Fauna team.

Imagine a bird survey covering several small plots at a site near Alice Springs: over the course of 4-5 days, the field scientist records the birds they observe or hear, along with other details. Information about each bird – such as its sex and age – is collected where possible, to help in modelling population trends and understanding ecological processes. Once the field work is done, the notes are copied onto a standard survey form.

Input bird survey form

The hard copy forms contain site information (e.g. site ID, surveyor's name, coordinates, plot size, land descriptions) and a number of visits at different times, where observations of species are recorded as a list of occurrences. Due to the complexity and rapid data collection techniques in the field, these surveys continue to be done with pen and paper. The forms build up across a team of field staff over the course of a year, creating a significant backlog of time-consuming manual transcription work.

That’s where our team comes in. Gail Wittich is a Data Scientist at Gaia Resources and Hayden Richards is one of our Software Engineers who jumped at the chance to work on this proof-of-concept project. The outcome we were chasing was to see if we could significantly reduce that manual data entry time. I spoke to Gail about how she and Hayden tackled the challenge.

Were you just able to feed the scanned field survey forms in, and get the data out?

Not quite, but that was the general idea. We were able to save a significant amount of human processing time by first modifying the design of the survey form, and then introducing the AI algorithms to process the scans. Specifically, this uses Handwritten Character Recognition to read the forms and output a machine-readable file. The raw outputs from Clio required some post-processing to fix common spelling and format errors; from there, the data needed only minor edits and curation by a human before it was ready to import into the database as tabular data.

What does the processing approach look like?

The nt-birds-survey-tool is a proof-of-concept Python script for performing Clio text recognition on scanned PDFs, post-processing raw results and outputting tabular data as CSV files. We utilise cloud storage, tools and services from Amazon Web Services (AWS) to help bring this service together.

High level Technical Overview
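
For readers who want a feel for what a pipeline like this can look like in code, here's a minimal sketch using AWS Textract's document-analysis API, which returns form, table and per-word confidence information similar to what's described below. It is illustrative only – the bucket, key and function names are placeholders, not the actual Clio implementation.

```python
# Illustrative only: a Textract-style form pipeline, not the actual Clio code.
import csv

import boto3

textract = boto3.client("textract", region_name="ap-southeast-2")

def analyse_form(bucket: str, key: str) -> list[dict]:
    """Run form/table analysis on a scanned page stored in S3."""
    response = textract.analyze_document(
        Document={"S3Object": {"Bucket": bucket, "Name": key}},
        FeatureTypes=["FORMS", "TABLES"],
    )
    # Each block carries a Confidence score we can use to flag items for review.
    return [
        {"text": block.get("Text", ""), "confidence": block.get("Confidence", 0.0)}
        for block in response["Blocks"]
        if block["BlockType"] in ("WORD", "LINE")
    ]

def write_csv(rows: list[dict], path: str) -> None:
    """Dump recognised text and confidence scores to a CSV for curation."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["text", "confidence"])
        writer.writeheader()
        writer.writerows(rows)
```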

The wonderful thing is that Clio gives a clear indication of how well it is able to read each component of text, in the form of a Confidence Level (percentage) which can be viewed chromatically to draw your attention to problem areas for correction. It also recognises form fields and tabular information, and can export these as sensible rows and columns in a spreadsheet.

Clio prototype interface

What are some of the technical challenges you’ve come across?

AI has come a very long way in the last 20 years, and it is a rapidly evolving space; but for our line of work we are not talking about robots and algorithms that can pass the Turing test. For us, it comes down to helping scientists and researchers achieve time savings that they can put back into doing more good environmental work. As you can imagine with bird surveys, a person's handwriting is not at its best when they are writing on a clipboard or notepad and moving swiftly around in variable weather conditions. Complementing the raw Clio result with standard post-processing techniques improves results significantly. It is still tough for a machine to correctly read a 2 written like a Z, for instance, but when you know the field holds a count of birds, you can be confident it is a 2. Or that an S is a 5, an I is a 1, and so on.
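
As a toy illustration of that kind of context-aware cleanup (the substitutions below are examples only, not our full rule set), a field known to hold a numeric count can be corrected like this:

```python
# Toy example: context-aware cleanup for fields known to be numeric counts.
NUMERIC_FIXES = str.maketrans({"Z": "2", "S": "5", "I": "1", "O": "0", "l": "1"})

def clean_count(raw: str) -> int | None:
    """Coerce an OCR'd bird count like ' Z1 ' into an integer, else None."""
    candidate = raw.strip().translate(NUMERIC_FIXES)
    return int(candidate) if candidate.isdigit() else None

assert clean_count(" Z1 ") == 21
assert clean_count("S") == 5
```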

Initially, Clio recognition was as low as 73% (that is, 73% of the information on a form was correctly transcribed). In many cases the errors were just one or two letter differences, or a few numbers interpreted as letters. With the redesigned form and a range of post-processing corrections like the ones above, we were able to increase accuracy to between 93% and 99%. We know that a successful implementation also requires some training and reinforcement for field staff in how to use the form, but we were really pleased with those results.

Why not just collect the data with a mobile app?

While we are big supporters of mobile apps and field technology solutions in the right situations, Clio is designed to support scanned content – current or historical. Just because we have the ability to build apps, doesn’t mean that every challenge will be solved by having one. Clio is ideal for the situations when people get back to basics – and use pencil on paper.

Why do you think this work is important?

Biodiversity data helps to answer important research questions and inform decisions on a variety of subjects, including urban and rural development impacts, and climate change and its effects on habitats, species populations and migration patterns. I work mainly with the data, but I can definitely see why we need reliable and efficient methods to generate and aggregate consistent, standardised data of this type to support research in those areas.

Where to from here?

We have proven that with some minimal redesign of form inputs, we can use this solution to get highly accurate transcriptions from these handwritten survey forms. We know it is going to save time, but this isn't just about birds. There are many different types of flora and fauna surveys out there, including vegetation, mammals, reptiles, invertebrates and more – and several survey techniques and guidelines that define best practice in this space. We are concentrating on the survey forms, and I think the intent is that we can realise those time savings across many of those different survey types by following a similar process. We do need to run more tests and measure manual entry times for comparison. However, even if this saves 30 minutes in data entry per survey (not really a stretch when you think about it), for every thousand of these forms the payoff is 500 hours in time savings.

It’s really exciting to be undertaking proof-of-concepts like this one that allow us to leverage AI to help clients turn their scanned content into rich, standardised and reusable biodiversity data. We’ll have another blog a bit later to tell you about another Clio proof of concept project we’ve undertaken in the archives world – so stay tuned! 

If you’ve got data in a hard copy format that you need transcribed, then reach out to us and let’s see how we can help you solve your problems.  In the meantime, if you’d like to know more, start a conversation on our social media platforms – Twitter, LinkedIn or Facebook or send me an email.

Chris

Biodiversity spatial data challenges and solutions
https://archive.gaiaresources.com.au/biodiversity-spatial-data-challenges-solutions/ – Wed, 25 May 2022

In this blog, we’ll explore some of the challenges and solutions around managing the spatial aspects of biodiversity data. 

Claire recently wrote about how she loved the way nature always has such elegant answers to complex problems (see her blog here). Researchers and environmental scientists often observe these elegant answers when they are out in the field collecting data – whether it is the way plant seeds have evolved to propagate through wind or fire, or a symbiotic relationship that benefits both plant and animal species.

Channelling the Samuel review of the EPBC Act, if we are going to get serious about arresting the decline of species and ecosystems in Australia, we need to do much more to connect the dots of biodiversity data. The review found that "in the main, decisions that determine environmental outcomes are made on a project-by-project basis, and only when impacts exceed a certain size. This means that cumulative impacts on the environment are not systematically considered, and the overall result is net environmental decline, rather than protection and conservation." (source: Independent review of the EPBC Act – Interim Report Executive Summary)

Gaia Resources is currently developing two separate biodiversity data management projects in Australia that are helping State and Federal government agencies to streamline biodiversity data submission, increase accessibility to biodiversity data and hopefully, in turn, support decision making and improve environmental outcomes.  We are rapidly approaching the launch of both these projects – so stay tuned for more details to come!

We are helping to join the dots by enabling biodiversity data to be aggregated and made more readily available as a public resource. This type of data – including species occurrence data, systematic surveys and vegetation associations – comes in many forms and from multiple sources. Researchers and environmental scientists have employed different methodologies across our vast continent and territories to collect data for their particular project or area of study. Depending on the nature of the survey, field biodiversity data can be collected as point occurrences or observations, transect lines, plots, traps, habitat survey areas and quadrats (as shown below). 

A schematic representation of different types of biodiversity survey types including points, tracking data, transects, traps, plots and habitat surveys.

The observed absence of a species within a defined survey area or site, and the time of the survey, are also important data elements for ecological research. Adding to that data complexity is the fact that over the past few decades, technological advancements in GPS (Global Positioning Systems) and apps on handheld devices have changed the way we record things like coordinate locations and levels of accuracy. Technological advancement has also increased the volume of information we can gather with the time and resources we have available.

To have a chance of aggregating all this data from different sources in a meaningful way, there is a need to apply a consistent approach, or standard, to the biodiversity information.  Apart from the considerable challenges standardisation presents from a taxonomic perspective in classifying species, there are also several spatial data challenges, which I’ll focus on here – more on the use of standards and varying approaches to using them will be coming in a later blog. 

One key challenge is knowing and specifying the spatial coordinate reference system of incoming data, so that a repository can transform many project submissions into a spatially consistent system. Once you know the reference system, it is then possible to assess whether the data is positioned in a logical place – on the Australian continent or its Island Territories, for instance.
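
As a minimal sketch of that first step (the EPSG codes here are just common Australian choices, not a prescription), declaring and transforming the coordinate reference system with pyproj looks like this:

```python
# Minimal sketch: declare the incoming CRS explicitly, then transform.
from pyproj import Transformer

# GDA94 geographic (EPSG:4283) -> GDA2020 geographic (EPSG:7844)
transformer = Transformer.from_crs("EPSG:4283", "EPSG:7844", always_xy=True)

lon, lat = 133.8807, -23.6980  # a point near Alice Springs
x, y = transformer.transform(lon, lat)

# A basic sanity check: does the point fall inside a rough Australian bounding box?
assert 110 < x < 155 and -45 < y < -9, "Coordinates fall outside Australia"
```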

Another big one has been how to handle different geometries of data (e.g. point, line, polygon) describing the same type of thing in the field. Take the example of a 30-year-old report that lists a single point coordinate referencing a 50x50m plot area, but with no other information like the orientation of that plot. Do we materially change a plot reference to make it a polygon shape, based on a snippet of information in the accompanying report? What happens when some of the information we need is missing, or the method described in the report is ambiguous? As system developers, we avoid anything that amounts to a material change to the source data; instead, systems should be designed to put the basic data quality responsibility for solving these mysteries back on the authors of the data.

Finally, we have the issue of spatial topology in biodiversity data. Once you get into the realm of transects and areas, it becomes tricky to represent a spatial location using text-based standards. Technology provides an elegant – although arguably not that user-friendly – solution through something like a Well-known Text (WKT) expression. This standard form can simplify a line or polygon into a series of coordinates that become one column in a dataset, like that shown below.

Points, lines and polygons can be represented by a text string where the discrete numbers are coordinate pairs (Source: Wikipedia)
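
A quick sketch of how WKT strings like these can be parsed programmatically with Shapely (the records are made up for illustration):

```python
# Sketch: parse WKT strings from a tabular dataset into geometry objects.
from shapely import wkt

records = [
    {"site": "PLOT01", "geom_wkt": "POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10))"},
    {"site": "TRAN01", "geom_wkt": "LINESTRING (30 10, 10 30, 40 40)"},
]

for record in records:
    geom = wkt.loads(record["geom_wkt"])
    print(record["site"], geom.geom_type, round(geom.length, 2))
```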

Instead, we are looking to leverage the open GeoPackage format. Generally speaking, this format gives us an open and interoperable approach that can be used across a range of GIS software applications. The GeoPackage format has been around for years, and provides a more accessible alternative to proprietary geodatabase formats that you can only really use in one particular GIS software package. It also allows configuration and customisation through the SQLite database on which it is based.
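
As a sketch of why that matters in practice (file and layer names are illustrative), GeoPandas can bundle several survey layers of different geometry types into a single GeoPackage:

```python
# Sketch: bundle multiple survey geometry types into one GeoPackage file.
import geopandas as gpd
from shapely.geometry import LineString, Point

occurrences = gpd.GeoDataFrame(
    {"species": ["Zebra Finch"]},
    geometry=[Point(133.8807, -23.6980)],
    crs="EPSG:7844",
)
transects = gpd.GeoDataFrame(
    {"survey_id": ["TRAN01"]},
    geometry=[LineString([(133.88, -23.69), (133.89, -23.70)])],
    crs="EPSG:7844",
)

# Unlike a shapefile, one .gpkg can hold many layers of mixed geometry types.
occurrences.to_file("surveys.gpkg", layer="occurrences", driver="GPKG")
transects.to_file("surveys.gpkg", layer="transects", driver="GPKG")
```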

Finally, we have a responsibility to ensure that the biodiversity data is FAIR (Findable, Accessible, Interoperable, and Reusable). In my view, this is a challenge as much about data coming into a system as it is about the user experience of people trying to interact and get data out of a system.  Spending some quality time on both ends of the data chains is very important – and that’s why we’ve been working heavily on design for these systems, too.

By its nature, aggregating data from multiple sources across space and time comes with a suite of challenges, some of which I've touched on here. Our combined expertise in biodiversity data and spatial data has been very useful in working through them on these projects.

If you want to know more about biodiversity information or spatial data, we would love to hear from you. Feel free to drop us an email or start a conversation on our social media platforms – Twitter, LinkedIn or Facebook.

Chris

Satellite platforms: free and open data for environmental monitoring
https://archive.gaiaresources.com.au/satellite-platforms-free-open-data-environmental-monitoring/ – Wed, 02 Mar 2022

My colleague Rocio recently wrote about her team winning a NASA global hackathon with their prototype solution to monitor movement in trees from satellite imagery as an indicator of landslide risk (read about that here). It inspired me to do a bit of research and refresh my knowledge of common and not-so-well-known satellite platforms out there for environmental monitoring.

[Caveat – I cannot claim to be an expert in either environmental science or remote sensing disciplines, but I know there are many of us in the same boat. It’s tricky to keep track of it all, so I thought if I shared some information and tricks on how to use this data then hopefully I can give a few people a leg up.]

Satellites and remote sensing have played an important role for decades in monitoring land cover change, marine and climate conditions; but developments in this field have increased dramatically in recent years. New satellite platforms, cloud computing, computational capabilities, and free and open access data have allowed scientists and researchers to get their hands on more and more data ready to use for particular environmental applications. 

There are some heavy hitting satellites out there that scientists and researchers would know and love – or hate depending on their context! MODIS, Landsat and Sentinel platforms (outlined in the table below) provide imagery at different resolutions, multispectral band combinations and revisit frequencies. For example, a scientist concerned with bushfire risk may leverage all three in different contexts to provide temporal and spatial coverage across such a complex issue spanning vegetation condition, climate/weather and fuel loads. For other applications, one can get a lot out of one satellite platform. 

Table 1: Overview specifications of some of the most popular satellite platforms used for environmental monitoring applications.

| Satellite | Description | Sensor type/product | Resolution | Frequency |
|---|---|---|---|---|
| MODIS (Terra and Aqua) | Atmospheric, land and ocean multispectral imagery, including 36 bands | Moderate Resolution Imaging Spectroradiometer | 250m, 500m, 1000m | Twice daily |
| Landsat 7 | Multispectral imagery, including 8 bands | Enhanced Thematic Mapper+ (ETM+) | 30m, 15m | 16 days |
| Landsat 8 | Multispectral imagery, including 9 bands | Operational Land Imager (OLI) | 30m, 15m | 16 days |
| Landsat 8 | Thermal imagery, including 2 bands | Thermal Infrared Sensor (TIRS) | 100m | 16 days |
| Landsat 9 | Multispectral imagery, including 9 bands | Operational Land Imager-2 (OLI-2) | 30m, 15m | 16 days |
| Landsat 9 | Thermal imagery, including 2 bands | Thermal Infrared Sensor-2 (TIRS-2) | 100m | 16 days |
| Sentinel-1 | Synthetic Aperture Radar (SAR) imagery | C-band SAR | 5 x 5m, 5 x 20m, 20 x 40m | 6 days |
| Sentinel-2 | Multispectral imagery, including 13 bands | Multispectral Instrument (MSI) | 10m, 20m, 60m | 5 days |

Spectral band comparison between Landsat 5 (TM), Landsat 7 (ETM+), Landsat 8 and 9 (OLI, OLI-2).

The Landsat mission spans six decades, with a free archive of historical imagery readily available going back as far as 1972. With each launch – most recently Landsat 9 in September 2021 – NASA has made progressive improvements in technology and spectral parameters while maintaining data consistency and a long-term monitoring record. Landsat 9, for instance, retains the same spatial resolution but offers higher radiometric resolution (14-bit quantisation compared to 12-bit for Landsat 8). This allows its sensors to detect more subtle differences, especially over darker areas such as water or dense forests. For instance, Landsat 9 can differentiate 16,384 shades of a given wavelength, compared to 4,096 shades in Landsat 8, and 256 shades in Landsat 7 (source: USGS).

What I find amazing is how close these satellites' orbits really are to us – at between 700-800km altitude, they are imaging the Earth from a distance roughly equivalent to that between Sydney and Melbourne, while whizzing past at 26,972 km/hr!

GIS packages like QGIS and other analytics platforms can ingest and visualise satellite data in a number of formats. You can either download the imagery directly from their online portals – such as the USGS Earth Explorer and the Copernicus Open Access Hub – or connect to web map services in the form of WMS and WMTS layer types.

QGIS shows a Landsat 9 imagery for Perth (left) with the higher resolution Sentinel-2 imagery (right).
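
These services can also be scripted outside a desktop GIS; here's a minimal sketch using OWSLib to request a map image from a WMS endpoint (the URL and extent are placeholders):

```python
# Sketch: fetch a map image from a WMS endpoint and save it as a PNG.
from owslib.wms import WebMapService

wms = WebMapService("https://example.com/geoserver/wms", version="1.3.0")  # placeholder URL
layer_name = list(wms.contents)[0]  # first advertised layer

img = wms.getmap(
    layers=[layer_name],
    srs="EPSG:4326",
    bbox=(115.7, -32.1, 116.1, -31.8),  # roughly the Perth area
    size=(800, 600),
    format="image/png",
)
with open("perth_wms.png", "wb") as f:
    f.write(img.read())
```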

The QGIS plugin repository contains a number of freely available plugins offering access to satellite base map services, and others with easy-to-use facilities to search and download the raw imagery for analysis. Still others offer spatial layers derived from these satellite sources – the NAFI plugin we developed is one of the many.

Google Earth Engine (GEE) is a platform we've started to use for analysis and visualisation of geospatial datasets, and it is accessible for academic, non-profit, business and government users. We were able to process large volumes of imagery to detect changes in forest cover and vigour against a long-term baseline (read more about that project here). GEE hosts publicly available satellite imagery with historical earth images going back more than forty years. The images are available globally and ingested on a daily basis, which really makes it powerful for monitoring and prediction applications. It also provides Application Programming Interfaces (APIs) and other resources like Jupyter Notebook scripts to enable the analysis of large volumes of data.
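
To give a flavour of the GEE Python API, the sketch below counts cloud-free Sentinel-2 scenes over a point for one year (it assumes you have registered and authenticated with Earth Engine):

```python
# Sketch: count low-cloud Sentinel-2 scenes over a point for one year.
import ee

ee.Initialize()

point = ee.Geometry.Point([130.8456, -12.4634])  # Darwin
collection = (
    ee.ImageCollection("COPERNICUS/S2_SR")
    .filterBounds(point)
    .filterDate("2021-01-01", "2021-12-31")
    .filter(ee.Filter.lt("CLOUDY_PIXEL_PERCENTAGE", 10))
)
print("Scenes found:", collection.size().getInfo())
```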

Earth on AWS is another source of open data that helps you discover and share datasets for geospatial workloads. AWS Marketplace has a large number of geospatial, GIS and location-based applications that can benefit planning, predictive modelling and mapping applications. 

This movement towards free and open-source satellite data – and the growth of enabling platforms – offers incredible opportunities for environmental scientists, encouraging new questions to be explored at regional and continental scales.

At a talk organised by the Research Institute for the Environment and Livelihoods (RIEL) back in 2019, I was introduced to a few lesser known satellite platforms that have plenty to offer for environmental monitoring. The table below provides just a snapshot, and I am certain there are many more out there – I am only scratching the surface:

Table 2: Overview of other satellites used for environmental monitoring. Links are provided to specifications and available products.

| Satellite | Mission/Purpose | Sensor type/product | Resolution | Frequency |
|---|---|---|---|---|
| Himawari 8 | Near real-time weather satellite used for weather imagery. | Advanced Himawari Imager (16 bands) | 500m, 1000m, 2000m | 10 min |
| Global Ecosystem Dynamics Investigation (GEDI) | To understand how deforestation has contributed to atmospheric CO2 concentrations, how much carbon forests will absorb in the future, and how habitat degradation will affect global biodiversity. | LiDAR (Light Detection and Ranging); products include canopy height and profile, ground elevation, leaf area index and above-ground biomass | 25m, 1000m | Variable |
| EnMAP hyperspectral satellite (planned launch in 2022) | To monitor ecosystems by extracting geochemical, biochemical and biophysical parameters on a global scale. | Hyperspectral imagery (131 bands) | 30m | 4 days |
| Sentinel-3 | To measure sea surface topography, sea and land surface temperature, and ocean and land surface colour to support ocean forecasting systems, environmental and climate monitoring. | Four main sensors: OLCI, SLSTR, SRAL, MWR | 300m, 500m, 1000m | <2 days |
| Sentinel-4 | To monitor key air quality, trace gases and aerosols over Europe at high spatial resolution and with a fast revisit time. | Multispectral imagery (3 bands) | 8000m | 1 hour |
| Sentinel-5 / Sentinel-5P | To provide atmospheric measurements and climate monitoring, relating to air quality, ozone and UV radiation. | Multispectral imagery (7 bands); TROPOspheric Monitoring Instrument (4 bands) | 7500m, 50,000m | Daily |
| Sentinel-6 | To provide enhanced continuity to the mean sea level time-series measurements and ocean sea state records that started in 1992 with previous missions. | Synthetic Aperture Radar (SAR); Advanced Microwave Radiometer; High Resolution Microwave Radiometer | 300m | 10 days |

The Himawari satellite viewer (link) provides a continental scale animation of weather systems. Cyclone Anika is shown crossing the Western Australia Kimberley region.

Remote sensing and Earth Observation is a whole world (sorry, pun intended) of specialised science and data unto itself. There is so much research out there, but also some practical analysis and visualisation tools to help people in the environment space apply these resources to real-world applications. I must admit the more I dig into different satellite platform websites and their data products, the more I discover that could be valuable. I hope I’ve been able to give people a sense of the potential out there, and we’ll also think about building some of this content into a QGIS training module in the near future. 

Contact us via email or start a conversation with us on one of our social media platforms –  Twitter, LinkedIn or Facebook.

Chris

Open source software and open data
https://archive.gaiaresources.com.au/open-source-software-open-data/ – Wed, 20 Oct 2021

Perth is about to host the FOSS4G Oceania Conference (Perth Hub) on 12-13 November 2021, and up here in Darwin I'm just a tiny bit disappointed I can't go along to take part. My office buddy Tom Lynch will be heading there to give a presentation, which I'll talk a bit more about later, as will a number of friends and former work colleagues.

FOSS4G is short for ‘Free and Open Source Software for Geospatial’ – it’s a great convergence of people who are passionate about open source software and open geospatial data, and want to share their experiences. It’s safe to say we all see the business value and the opportunities for innovation and for creating good in this world through sharing and collaborating.

Maybe you haven’t heard the terms open source or open data before, or perhaps you’ve heard them in comparison to Commercial Off The Shelf (COTS) – or proprietary – products? In either case, let’s have a look at what a few of these terms mean:

  • Open source software is where the copyright holder grants users the rights to use, study, change, and distribute the software and its source code to anyone and for any purpose. Often source code will be collaborated on and shared through public channels like GitHub.
  • Open Data is the concept or commitment to make data products freely available to everyone to use and republish as they wish, without restrictions from copyright, patents or other mechanisms of control.
  • Open API is an open Application Programming Interface specification for describing, producing and consuming web services. It allows organisations to open up controlled gateways into their systems, and encourage third parties to build integrations into their own applications.  

There are some truly massive open source software projects out there that are breaking new ground and really challenging the COTS providers for functionality and benefits. In the spatial realm QGIS desktop software and PostGIS relational databases provide free and open source equivalents to COTS products.  In statistics, we make use of products like the R Project, and in software engineering you see Python, PHP, and other open source programming languages everywhere. Even on relatively closed software products, there is a trend to create open APIs so that systems can more easily integrate and exchange data.  

A nice example of QGIS and Python development is what Tom will be talking about at FOSS4G in relation to our involvement with the Northern Australian Fire Information program. The NAFI website has for several years built up an impressive array of fire related data products and services that support land managers (see our previous blogs). For the NAFI QGIS plugin, we leveraged the QGIS open source plugin framework to create a quick access tool for the folks who rely on that desktop package for fire management activities.

The NAFI QGIS plugin places a quick layers panel to the left for easy access to data layers.

We are also close to releasing another plugin that streamlines fire scar mapping tasks for Bushfires NT staff in the Northern Territory using Sentinel imagery from the European Space Agency (another free data product).
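
For anyone curious about the plumbing behind plugins like these, here is a stripped-down skeleton of the QGIS plugin entry points – illustrative only, not the NAFI plugin source; the action name, layer and endpoint are placeholders:

```python
# Illustrative QGIS plugin skeleton: one toolbar action that loads a WMS layer.
from qgis.PyQt.QtWidgets import QAction
from qgis.core import QgsProject, QgsRasterLayer


class QuickLayersPlugin:
    def __init__(self, iface):
        self.iface = iface  # handle to the QGIS application interface
        self.action = None

    def initGui(self):
        # Called by QGIS when the plugin loads: wire up a toolbar button.
        self.action = QAction("Add fire layer", self.iface.mainWindow())
        self.action.triggered.connect(self.add_layer)
        self.iface.addToolBarIcon(self.action)

    def unload(self):
        # Called by QGIS when the plugin is removed.
        self.iface.removeToolBarIcon(self.action)

    def add_layer(self):
        # Placeholder endpoint; a real plugin would point at its service catalogue.
        uri = "crs=EPSG:4326&format=image/png&layers=firescars&styles=&url=https://example.com/wms"
        layer = QgsRasterLayer(uri, "Fire scars", "wms")
        if layer.isValid():
            QgsProject.instance().addMapLayer(layer)


def classFactory(iface):
    # QGIS calls this factory to instantiate the plugin.
    return QuickLayersPlugin(iface)
```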

It’s not just feature parity and lower price that makes these open source products appealing—it’s also the flexibility and community-driven development opportunities they offer that allow organisations to build their own software interfaces, plug-ins, models and extensions to tailor functionality to meet real business needs.

Increasingly, government agencies publish "open data portals" like data.gov.au as an entry point to gaining easy access to FAIR data extracts and web services – by FAIR we mean data which meet principles of Findability, Accessibility, Interoperability, and Reusability. The Open Geospatial Consortium's standardised web service formats (e.g. WMS, WMTS, WFS) that these agencies publish to are a linchpin in so many spatial systems out there. They recognise that FAIR data and open source software availability can kick-start and accelerate a range of innovative products and applications they could only guess at.

If you are in a business evaluating software solutions – and I have been on both sides of that supplier-buyer fence – your decision process likely involves evaluating against a number of business risks. I would say that a well-supported open source product could have a lot to offer in terms of reducing those risks:

| Risk area | Reframed |
|---|---|
| Functionality: will this open source product meet all of our business requirements and needs, or cost extra in customisations? | Does the open source solution meet the majority of our requirements, and allow us to focus otherwise sunk licensing costs on features tailored to our needs? |
| Financial: what will be the Total Cost of Ownership (TCO) for this open source system over X years, including support, training, maintenance and infrastructure? | Understand how the open source solution stacks up in terms of TCO, also taking into account licensing, annual maintenance and other costs that don't apply. |
| Operational: will the open source solution help us meet our objectives for streamlining and delivering new capabilities? | Fair question – does the open source solution offer a framework for building tools, apps and web-based solutions? |
| Support: who can we depend on for support when there is no vendor? | Rather than vendor support, consider that you have access to a community of users and consultants who can provide support – not to mention the skills within your team to support the solution internally. |

Other questions worth considering are: how many users are there actively using the product? How often is it updated? Do others find it easy to learn and use? What skills do you need to build on it? All the same questions you might ask of a COTS product, to be honest.  

When you make the choice to use a product like QGIS or to build your own open source solution, know that there is a whole community out there (including us!) willing to lend a helping hand. For whatever challenge you have, chances are that someone has tackled something similar and shared a solution, script or plug-in, so you can save time and potentially add value back.

I really hope everyone heading along to the FOSS4G conference has a great time, and comes away with a basket full of ideas and new connections in their open geospatial journey. If you’d like to strike up a conversation, please feel free to contact me or hit us up on Twitter, LinkedIn or Facebook.

Chris

Greener business: Cumulative positive impacts for the environment
https://archive.gaiaresources.com.au/cumulative-postive-impacts/ – Wed, 01 Sep 2021

Any business can make a significant and meaningful contribution to minimising our collective environmental footprint. We've been looking cumulatively at our company's impacts and at how we can continue to work on sustainable practices and reduce our carbon footprint into the future. For an office-based company like ours, that contribution can come in a direct or indirect form.

Businesses can look at direct reductions in emissions through transport and electricity usage, smarter choices about consumables, and changes to procurement decisions and asset management (e.g. vehicles, laptops, phones, equipment). It can also be about a more engaged approach to recycling and re-use; diverting the small bits of waste – in our offices and personal lives – that really add up over time and would otherwise be destined for landfill.

We’ve written about a few initiatives and changes we’ve put in place over the years, including our company keep-cup analysis (read our blog here) and partnership with Climate Clever, who have an app that helps businesses and individuals to understand their own specific emission reduction opportunities and savings. Even our decision to move to the Flux offices in Perth back in 2017 was partially informed by the potential to further reduce our carbon footprint.

In an indirect sense, we’ve been investing in a greener world by putting some of our profits back into tree plantings and habitat restoration, which in turn is offsetting our carbon emissions. Last year we worked with the Carbon Neutral team to analyse our emissions and to become a carbon neutral company through the use of Gold Standard certified offsets. To offset some 38 tonnes of carbon we generated last year, mainly through commuting and air travel, we contributed to biodiverse planting and reforestation of the Yarra Yarra Biodiversity Corridor. For many years, we have also combined e-Christmas cards with the planting of individual trees as part of that certified reforestation program.

So how does that all add up? Well, cumulatively speaking, from 2009-2021 we have offset 95 tonnes of CO2 emissions and planted 978 trees. Those trees are not only a 'carbon sink' for decades to come; they also provide important shelter and food for native wildlife on land that had previously been cleared. We see our bit as a great achievement that contributes to something meaningful, but we also understand there is plenty more we could be doing.

Native tree planting (left) is a program we couple with our company Christmas e-cards, while CO2 offsets (right) are an annual contribution based on analysis of our emissions.

We know this is a long-term game. It is a reality that running a business comes with a carbon footprint; but with a little reflection and intent, there are relatively small decisions and actions we can take to reduce our cumulative impacts and contribute to a better, greener future. We feel it’s important that we recognise and celebrate success, and understand the nature of the positive changes we make in our lives.

We’re happy to talk to you more about how these initiatives work, or to hear about your own experiences. You can either email me directly or reach out via our social media feeds on TwitterLinkedIn or Facebook.

Chris

A platform for monitoring forest canopy cover
https://archive.gaiaresources.com.au/platform-monitoring-forest-canopy-cover/ – Wed, 18 Aug 2021

We recently completed a technical investigation into leveraging Sentinel satellite imagery for monitoring forest canopy cover in forested areas in Victoria. The investigation was a really interesting deep dive for our data scientists into some free and open-source analytical tools and techniques our client could use to assess one aspect of bushfire risk. The client wanted a repeatable operational tool they could use to hone in on areas of higher risk, and make some informed decisions about where to focus limited field resources.

Forest ecosystems play an essential role in the environment, and monitoring and detecting change in forests is important for the development of conservation policies that lead to sustainable forest management. For this purpose, Earth Observation (EO) data can be analysed to assess disturbances in forest vegetation, as it provides worldwide coverage at high temporal frequency and low cost. Remote sensing techniques are now used to process EO data from passive and active sensors, providing fast and accurate results across a wide range of applications.

The idea for our particular study was to feed Sentinel-2 (optical) and Sentinel-1 (Synthetic Aperture Radar, or SAR) inputs into a processing model that outputs a regional NDVI (Normalised Difference Vegetation Index) raster coverage. Sentinel captures an image of a given patch of the world every 5 days, so the potential was there to build a long-term monitoring and change detection tool. We needed to assess the imagery products to see if they could give us useful and consistent output, and to ground-truth that against known areas of change. We also needed to know what technology was out there to help us in this challenge, and what others (researchers and private organisations) had done to solve similar challenges.

The Sentinel-2 platform (left) from the European Space Agency provides coverage every 5 days that can be used for forest canopy cover applications.
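
NDVI itself is a simple band ratio, NDVI = (NIR - Red) / (NIR + Red); for Sentinel-2 that typically means bands B08 and B04. A minimal sketch with rasterio (the file paths are placeholders):

```python
# Sketch: compute NDVI = (NIR - Red) / (NIR + Red) from Sentinel-2 bands.
import numpy as np
import rasterio

with rasterio.open("T52LFM_B04_10m.jp2") as red_src, \
     rasterio.open("T52LFM_B08_10m.jp2") as nir_src:
    red = red_src.read(1).astype("float32")
    nir = nir_src.read(1).astype("float32")
    profile = red_src.profile

# Avoid divide-by-zero over nodata areas.
ndvi = np.where((nir + red) > 0, (nir - red) / (nir + red), np.nan)

profile.update(dtype="float32", count=1, driver="GTiff")
with rasterio.open("ndvi.tif", "w", **profile) as dst:
    dst.write(ndvi.astype("float32"), 1)
```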

We started with a literature review to uncover research that had been conducted on the use of Sentinel and Landsat imagery for forest cover change, in south-eastern Australia, nationally and overseas. This step also included looking at current ground-based forest canopy assessment techniques. With the short term nature of our investigation, we had to really target and timebox our search, and we were able to find some really good material from research at universities across Australia and major research organisations. The Veg Machine project, and the previous work done by its originators Jeremy Wallace and Bob Karfs at CSIRO on long-term monitoring using Landsat coverage, was an inspiration for our modelling approach, as were the personal experiences of our team members from previous projects and roles.

The literature review had another, more software-focused aspect to it, as we were looking at a number of analytics platforms that could be the processing backbone and visualisation tool for our modelling. From this we decided to pick up some Jupyter Notebook scripts in Geoscience Australia's Digital Earth Australia (DEA) platform, and to leverage the Google Earth Engine (GEE) platform. The DEA product enabled us to generate outputs at a regional scale, while GEE enabled users to produce NDVI plots on demand for a given local area. The two platforms complemented each other by providing that regional overview and target-area time series plots.

The Digital Earth Australia (DEA) platform and Jupyter Notebook scripts configured for the regional comparison of NDVI images against a long-term baseline.  

The Google Earth Engine platform enables users to look at time series plots of NDVI for an area of interest.

We devised a modelling approach that would ingest new Sentinel imagery inputs and compare them against a 3 year rolling NDVI baseline. If the new image contained pixels above or below our thresholds, it would simply show up as a different colour on the mapping: green for significant positive change, red for significant negative change. In this proof-of-concept investigation, the client was happy to simply detect a change of significance; the reason for that change was something they could target and follow up on. That reason could be anything from heat stress, land clearing, planned or unplanned fire activity, to disease. We also considered seasonal differences and frequency of images for processing within that modelling approach.
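
Conceptually, the comparison step reduces to a per-pixel anomaly test against the rolling baseline; the simplified sketch below illustrates the idea (the threshold value is illustrative, not our calibrated setting):

```python
# Simplified sketch: flag pixels deviating strongly from a 3-year NDVI baseline.
import numpy as np

def classify_change(ndvi_new, ndvi_baseline_stack, threshold=0.15):
    """Return -1 (significant loss), 1 (significant gain) or 0 per pixel."""
    baseline_median = np.nanmedian(ndvi_baseline_stack, axis=0)
    anomaly = ndvi_new - baseline_median
    change = np.zeros_like(ndvi_new, dtype=np.int8)
    change[anomaly < -threshold] = -1  # e.g. clearing, fire, disease, heat stress
    change[anomaly > threshold] = 1    # e.g. regrowth or seasonal vigour
    return change
```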

Finally, operational staff at the client’s offices use ArcGIS and QGIS software for a range of mapping and analysis tasks. The final raster outputs from the DEA and GEE platforms are capable of being visualised and analysed further in GIS packages along with other key operational and administrative layers.

GIS software can visualise the raster outputs from the modelling for further analysis and decision support.

So as a first step proof-of-concept investigation, we were able to document a technical and operational approach for our client to detect forest cover canopy change and support their bushfire risk decisions. The next stage coming up will be all about implementation and scaling a solution on AWS cloud infrastructure.

We’d love to hear from you if you have been involved in using any of the tools or applications mentioned here, or you’d just like to have a chat out of interest. Feel free to contact me or hit us up on Twitter, LinkedIn or Facebook.

Chris

Spicing up work life with a bit of field work
https://archive.gaiaresources.com.au/variety-spice-life/ – Wed, 14 Jul 2021

Sometimes shaking things up a bit in your job is exactly the ticket you need.

I had that opportunity recently when our partners at Outline Global (who capture high resolution aerial imagery for the Northern Territory Government) called me up and asked if I could wander around the Darwin region looking for Ground Control Points (GCPs) that had been surveyed in 2019, and give them a bit of a zhuzh… a bit of a refresh…  a lick of new paint… a new lease on life as it were.

I jumped at the chance to get outside and do some ‘field work.’ Ok, it’s a far cry from my days as a geologist stepping off helicopters onto remote mountain ridges… but hey when you spend the majority of your time tapping a keyboard and clicking a mouse, this is an opportunity with a lot of advantages. Firstly, there’s a lot of the Darwin region I had not seen. Then there’s the sunshine and beautiful conditions of the Dry season, the chance to use some free tracking apps… it was a bit like an easy but widely distributed Geocaching excursion.

What are GCPs I hear you ask? These are temporary survey markers that are obvious control points visible from a plane capturing aerial imagery. The plane criss-crosses on a structured flight plan, so that the resulting imagery strips have significant overlap and can be used for post-processing. Analysts use software to find the GCPs in overlapping images to ‘register’ the imagery and create an orthorectified mosaic. To go through this process with a high level of accuracy you either need to re-use old GCPs with known surveyed coordinates, or conduct a new survey.

You see, these dilapidated white markers were in need of some TLC. Some were little more than a bunch of painted white rocks assembled in a cross and referenced in the previous survey report, and it is not surprising that in the parks and public places where they were placed two years back, someone thought: 'Well, that is a collection of rocks that is just begging to be kicked.'

An example of a GCP located out at Lee Point. On arrival the marker and cross (left) were barely recognisable; the refreshed GCP (right) will now be visible from the aerial imagery.

To be fair, it is not surprising that in that time these mysterious assemblages would have experienced both human and natural wear and tear: blustering winds, monsoonal rains, and people with anarchistic tendencies.

So here I come, smiling away, with a set of 23 waypoints loaded into a free and open-source app called OpenGPXTracker, and a can of white spray paint. I also brought my laptop along for the ride, with a QGIS project containing the waypoints over OpenStreetMap. This was my regional view to help me plan my route, but I also had the original survey report on the laptop as a reference. And of course I made sure to bring along plenty of water and a first aid kit.

Across two days, I navigated to the coordinate positions, and followed a bit of a process at each destination:

  • wander around with my phone until I stood on the waypoint location
  • locate the white GCP marker (sometimes very obvious, other times pretty damn difficult)
  • take a “before” photograph
  • brush off the soil and vegetation
  • spray paint the original area
  • take an “after” photograph and notes
  • hop back in the vehicle

As the tracklog map below hints at, there was a fair bit of driving, and a number of little dead-ends where the map was a bit ambitious about what constituted a road. One of those turned out to be a 10km bush track along a fenceline connecting two sealed country roads. Initially happy to find the short-cut, I was soon glad to have brought the 4WD so I could avoid getting bogged in the sandy ruts on that track.

It’s a dragon! Ground Control Points and routes travelled across the Darwin and Humpty Doo region. Day 1 (blue) and day 2 (orange) are shown.

I got to see parts of Darwin you don't often drive to unless you have a work responsibility to be there, or are a keen fisherman. Apart from the mundane manhole cover on the side of the road, there were termite mounds and historical sites. For instance, Channel Island was a bit of a drive but was an interesting spot, with its power station, jetty and historical uses as a quarantine hospital and leprosarium dating back to the early 1900s. The GCP there, by the way, was a damaged sign that I think someone must have backed into with their boat!

Channel Island Bridge looking back at the jetty and transmission towers (left); Ground Control Point at Channel Island (right).

So now you are up to speed with my field work out of the Darwin office of Gaia Resources! I hope you found that somewhat amusing, but if you'd like to learn more about the imagery being captured, or other projects we get involved in up in the Top End, please feel free to contact me or start up a conversation on Twitter, LinkedIn or Facebook.

Chris

Artificial Intelligence for fish species identification
https://archive.gaiaresources.com.au/artificial-intelligence-fish-species-identification/ – Wed, 16 Jun 2021

As we wrote in our previous blog, the “Counting Fish” challenge was put forward by the Australian Institute of Marine Science (AIMS) as part of a call-out to look at innovative and streamlining technologies for a widely used method of marine research data collection. The Commonwealth Government’s Business Research and Innovation Initiative (BRII) has provided the grant funding and program to bring the best minds and solutions to tackle the challenge. Together with our partners at the Global Wetlands team from Griffith University, we’ve recently finished up the first stage which was an intensive 4 month Feasibility Study. 

The study focused on BRUVS (Baited Remote Underwater Video System) footage, and leveraging artificial intelligence (AI) and machine learning (ML) technologies to collect data and accelerate our understanding of fish in our oceans. AIMS and other researchers spend a lot of time manually capturing data from the videos, so finding efficiency measures and improvements to data consistency and quality would be of tremendous value. Out of the study we built a prototype application for processing and visualising BRUVS data, including automatically identifying and counting tropical fish species.  

Taking the OzFish open dataset and many hours of AIMS BRUVS footage, the team focused on training the AI model to accurately identify a range of fish species: rare and common fish, fast-moving and very small species, schools of overlapping fish, and morphologically similar species that need to be told apart. We demonstrated the effectiveness of our method on these specific challenges, producing quantified, highly accurate results, and we can now look confidently ahead towards tackling the hundreds of species that live in Australia's tropical waters.

The Fishscale online prototype – video metadata and playback showing annotations and count statistics.

When we look back at it, we’ve achieved an incredible amount in a short space of time. Our nationally distributed team (Perth, Brisbane, Darwin) worked really hard to make sure we were on the same page and productive with online meetings, collaborations and workshops. This was no small feat when you think we had two COVID-19 lockdowns affecting our Queensland team members.

With a new Fishscale prototype web interface, a new BRUVS video can be uploaded and processed within minutes. While the researcher grabs a coffee, it generates the statistics they need to help model and understand population ecology and fish behaviour. There’s an important human quality control element as well, meaning that fish experts have the ability to make corrections, improve the model and increase the value of their data. 
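
To give a feel for the counting side, a widely used BRUVS statistic is MaxN – the maximum number of individuals of a species visible in any single frame. The sketch below shows how such a count can be derived once a detector exists; `detect_fish` is a hypothetical stand-in for the trained model, which isn't shown here:

```python
# Simplified sketch: derive per-species MaxN counts from video frames.
from collections import Counter, defaultdict

import cv2

def max_n(video_path: str, detect_fish) -> dict[str, int]:
    """detect_fish(frame) -> list of species labels; a stand-in for the model."""
    max_counts = defaultdict(int)
    capture = cv2.VideoCapture(video_path)
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        # Keep the highest simultaneous count seen for each species.
        for species, count in Counter(detect_fish(frame)).items():
            max_counts[species] = max(max_counts[species], count)
    capture.release()
    return dict(max_counts)
```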

We really enjoyed the regular interaction with the AIMS team as well, which helped us to design our Fishscale prototype with exciting features that will eventually deliver lots of value and efficiency gains for research workflows and other industry applications. 

So what happens next? Well, there is still plenty to do if we progress to the next phase. We know there are still challenges around much larger numbers of species, variations in water quality and environmental factors. In phase 2, our plan includes customising the user interface to adapt to different user types depending on their requirements for data capture and output. Different products based on the AI framework will have different audiences in mind depending on whether they come from research, monitoring, education, or not for profit groups.

We are confident this is just the beginning of an exciting journey to develop a highly valuable product for streamlining research workflows and generation of important statistics. In fact, the Proof of Concept phase starts up around September, and we are hopeful we can progress and continue working with AIMS on this key initiative. 

If you are interested in this space or are someone who works with underwater videos and fish identification, we would love to get your perspective for future development. Feel free to give me a call or send an email if this type of work interests you, or strike up a conversation on Twitter, LinkedIn or Facebook.

Chris

Checking in with our (spatial) Data Scientists
https://archive.gaiaresources.com.au/checking-spatial-data-scientists/ – Wed, 12 May 2021

Our Data Science team meet regularly to talk about our workloads, upcoming projects and emerging technologies. With software engineers, business analysts, spatial analysts and technical project managers on board, we have quite a variety of skills and experience; but a shared passion for leveraging all sorts of data sources, tools, software and infrastructure. I think we all love coffee too!

The part of our team that normally works with GIS (Geographic Information Systems) software is noticing a shift from static to dynamic deliverables, and they are adapting their skills and knowledge to meet changing demand in consultancy. The solutions they contribute to now are data-driven decision support tools, monitoring dashboards, change detection toolsets, apps and websites. Desktop analysis is still a key part of the workflow, but these days they work more closely with our software engineers and research partners to go deeper into the tech, designing models and technical processes that can run on cloud infrastructure such as our solutions using Amazon Web Services. More and more we are also leveraging real-time data sources to assist our clients, like daily satellite feeds, web APIs, sensor arrays and business systems data.

I thought it would be timely to check in with our (Spatial) Data Scientists and ask them to look back over the last 11 months or so to recap some of the interesting projects we've been involved in. This is what they had to say:

Jake Geddes – Senior Spatial Analyst

Jake Geddes

1. What are you working on at the moment, and what makes your spatial work special?

I’m currently working alongside a resources company’s Heritage Team to consolidate their heritage data into a master database as well as offering GIS support. I think the majority of spatial work we do at Gaia Resources is quite special, because you often get the feeling that you really are making a positive difference to the environment and people’s lives.

In another project this year, I advised our mobile app developers who were working on algorithms to visualise bushfires in real time. I also work with clients to establish spatial data standards around biodiversity and other environmental data to support their regulatory reporting requirements.

Real-time bushfire hotspots and fire scars are generated from satellite imagery and rendered on a mobile app from web services such as WMTS and GeoJSON feeds.

2. We are getting close to the end of the financial year – what was the highlight for you in the last 12 months?

There have been a few highlights for me despite a somewhat difficult and chaotic year for “Earth”:

  • Gaining a deeper understanding of how spatial and software engineering interact, such as in the Retromaps project I was involved in.
  • Supporting multiple environmental organisations in collaboration and conservation (e.g. WABSI, Greening Australia, multiple NRM groups)
  • Leading a GIS health check investigation (find out more about these here)

3. What do you see as the link between Data Science and GIS?

Consolidating Big Data sources into something usable and effective can be quite challenging, but there are now numerous tools and processes in GIS which move beyond the consumption of raw data products and create links with data science workflows. Being able to incorporate a geographic and visual aspect is a great advantage of GIS in Data Science applications.

4. Any new widgets or tools you’ve discovered, that you’d like to tell the world about?

I have been having a play with a few statistical packages and looking into how they can integrate with our spatial analyses. These include:

  • R software for statistical modelling
  • GeoPandas (and Fiona): an open source project that makes working with geospatial data in Python easier (see the short sketch below).
  • kepler.gl: an open source geospatial analysis tool for large-scale data sets.
  • enso.org: a graphical user interface for automating data-driven processes

For years now I have been getting stuck into multiple plugins in the free and open source QGIS software, such as the Semi-Automatic Classification Plugin (SCP) and DataPlotly.
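
To give a flavour of why GeoPandas earns its spot on that list, here is a minimal sketch of a typical read-reproject-join workflow. The file names, layer name, column names and EPSG code are all hypothetical stand-ins, not data from any particular project.

```python
# A minimal GeoPandas sketch: read survey sites, reproject, and join them
# to vegetation polygons. File, layer and column names are hypothetical.
import geopandas as gpd

# Load point observations and a polygon layer (any OGR-readable format works)
sites = gpd.read_file("survey_sites.shp")
veg = gpd.read_file("vegetation_mapping.gpkg", layer="veg_associations")

# Reproject both layers to GDA2020 / MGA zone 50 (EPSG:7850) before analysis
sites = sites.to_crs(epsg=7850)
veg = veg.to_crs(epsg=7850)

# Attach the vegetation attributes to each site with a spatial join
sites_with_veg = gpd.sjoin(sites, veg, how="left", predicate="intersects")

# Summarise observations per (hypothetical) vegetation association column
print(sites_with_veg.groupby("veg_assoc")["site_id"].count())
```

A handful of lines like these replace what used to be several manual steps in a desktop GIS, and they slot neatly into scripted, repeatable workflows.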

Barbara Zakrzewska – Spatial Analyst

Barbara Zakrzewska

1. What are you working on at the moment, and what makes your spatial work special?

At the moment I am working for a mining company and helping them with environmental approvals, disturbance and rehabilitation. It's a huge company that used contractors before the current GIS Coordinator joined and started setting up a new holistic system. One of my favourite aspects of this type of work is building spatial systems from the ground up and then creating models that automate repeatable workflows. I like building single-source-of-truth geodatabases, where incoming data is either handled automatically (e.g. via government downloads, seamless data feeds or custom scripts) or signed off by a knowledgeable data owner.
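
To illustrate the kind of automated handling I mean, here is a hedged sketch of one ingest step: pull a government feed, validate it, and stage it in the geodatabase for sign-off. The URL, column names and GeoPackage layers are all hypothetical.

```python
# Hypothetical sketch of an automated ingest into a single-source-of-truth
# GeoPackage: download a government dataset, validate it, then stage it
# in a separate layer for a data owner to review and sign off.
import geopandas as gpd

MASTER_GPKG = "master_environment.gpkg"                  # hypothetical master database
SOURCE_URL = "https://example.gov.au/tenements.geojson"  # hypothetical feed
REQUIRED_COLUMNS = {"tenement_id", "status", "geometry"}

incoming = gpd.read_file(SOURCE_URL)

# Basic validation before anything touches the master layer
missing = REQUIRED_COLUMNS - set(incoming.columns)
if missing:
    raise ValueError(f"Feed is missing expected columns: {missing}")
incoming = incoming[incoming.geometry.notna() & incoming.geometry.is_valid]

# Align to the master layer's CRS, then write to a staging layer
master = gpd.read_file(MASTER_GPKG, layer="tenements")
incoming = incoming.to_crs(master.crs)
incoming.to_file(MASTER_GPKG, layer="tenements_incoming", driver="GPKG")
```

Writing to a staging layer rather than straight into the master table keeps that human sign-off step in the loop.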

In other recent work, I supported our mobile app developers on a Transport Innovation project, to standardise and translate council parking data for inner Sydney suburbs.

2. We are getting close to the end of the financial year – what was the highlight for you in the last 12 months?

My highlight has been my current role, where I have access to proprietary software such as FME and exploration data in enterprise databases like acQuire. I have created several useful models that combine business and spatial data, and was asked for input on building and improving the enterprise GIS system.

3. What do you see as the link between Data Science and GIS?

I see GIS and other spatial modelling tools as vital components of many Data Science applications. Several GIS programs come with data science plugins, calculators and modelling tools that enable further analysis and data interoperability.

4. Any new widgets or tools you’ve discovered, that you’d like to tell the world about?

While I did not discover new amazing tools in the last year, I was able to use FME and QGIS – and my knowledge of the business requirements – to achieve some interesting and valuable outputs.

Rocio Peyronnet – Spatial Analyst

1. What are you working on at the moment, and what makes your spatial work special?

I am currently working on a forest canopy health change detection project for the Victorian government. We developed two models that account for variation in canopy condition using Sentinel-2 imagery, running on cloud-based platforms: Jupyter notebooks and Google Earth Engine. What makes this work special is that the models will help with operational assessment of vegetation condition quickly and easily, allowing the user to focus on the areas that require attention most urgently. (A simplified sketch of the NDVI-difference step appears after the figure below.)

Tree canopy health change – or difference in Normalised Difference Vegetation Index (NDVI) – is modelled with Jupyter Notebooks and imported into ArcGIS software.
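
Here is a simplified sketch of that NDVI-difference step. The file names and the -0.2 decline threshold are illustrative assumptions, not the calibrated values from the Victorian project.

```python
# A simplified NDVI-difference change detection sketch. The two GeoTIFFs
# (before/after) are hypothetical Sentinel-2 exports with band 1 = red (B04)
# and band 2 = near-infrared (B08).
import numpy as np
import rasterio

def ndvi(path):
    with rasterio.open(path) as src:
        red = src.read(1).astype("float32")
        nir = src.read(2).astype("float32")
    # Guard against divide-by-zero on masked/no-data pixels
    return np.where((nir + red) == 0, 0, (nir - red) / (nir + red))

change = ndvi("canopy_2021.tif") - ndvi("canopy_2020.tif")

# Flag pixels with a strong NDVI decline as candidate canopy decline;
# the -0.2 threshold here is illustrative only.
decline_mask = change < -0.2
print(f"{decline_mask.mean():.1%} of pixels flagged for closer inspection")
```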

2. We are getting close to the end of the financial year – what was the highlight for you in the last 12 months?

Knowing that the work we are doing will support fire risk assessment and prevention is definitely the highlight of my year. I have only been part of Gaia Resources a short time, but I am really happy to be working for a consultancy and team focused on the delivery of sustainable technology solutions.

3. What do you see as the link between Data Science and GIS?

We are living in times in which data is constantly generated. Data scientists manipulate and analyse all that data to obtain information from it, but when a location parameter is present in the data, GIS can be put into action. By using GIS we can visualise spatial patterns in our data and present them on a map, providing a better understanding of where our information is positioned.

4. Any new widgets or tools you’ve discovered, that you’d like to tell the world about?

I am personally fascinated by Digital Earth Australia, a platform for open source analysis developed by the Australian government. It uses spatial data and satellite imagery to detect changes across Australia, providing code and tutorials so users can perform their own analyses, all free of cost.
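
As a taste of what working with Digital Earth Australia looks like, here is a hedged sketch using the Open Data Cube API that underpins it. The product name, measurements and extents are assumptions drawn from DEA's public documentation, so check their docs for current identifiers.

```python
# A hedged sketch of loading analysis-ready Landsat data through the
# Open Data Cube API used by Digital Earth Australia. Product name,
# measurement names and extents are assumptions, not verified values.
import datacube

dc = datacube.Datacube(app="canopy_example")

ds = dc.load(
    product="ga_ls8c_ard_3",           # assumed DEA Landsat 8 ARD product
    x=(133.0, 133.2),                  # longitude range (decimal degrees)
    y=(-23.8, -23.6),                  # latitude range
    time=("2020-01-01", "2020-12-31"),
    measurements=["nbart_red", "nbart_nir"],
    output_crs="EPSG:3577",            # Australian Albers
    resolution=(-30, 30),
)

# A quick NDVI time series over the loaded cube
ndvi = (ds.nbart_nir - ds.nbart_red) / (ds.nbart_nir + ds.nbart_red)
print(ndvi.mean(dim=["x", "y"]).values)  # mean NDVI per timestep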

————————————

What we’ve noticed is that most of our client engagements are no longer just standalone analyses, maps or even webmaps. Clients want consultants to provide value to the business – through streamlined workflows, integrated systems and insights – to help them make more effective and timely decisions. This invariably requires our spatial analysts, business analysts, software and DevOps engineers to come together to deliver the solution. As Data Scientists, we all agree that there are loads of tools and data sources on offer – but the key to a successful project is to focus on the challenge or outcome, and to build out a plan that leverages the right tools, data and modelling approach for the task at hand.

If you have a perspective on the changing landscape of data science and the spatial industry, feel free to give me a call or an email. Or, strike up a conversation on our Twitter, LinkedIn or Facebook pages.

Chris

The post Checking in with our (spatial) Data Scientists appeared first on Gaia Resources.

]]>
The NAFI app is changing the way work is planned in the field https://archive.gaiaresources.com.au/nafi-app-changing-way-work-planned-field/ Wed, 28 Apr 2021 01:40:21 +0000 https://archive.gaiaresources.com.au/?p=9220 Controlled burning is underway across the western and central parts of tropical north Australia. As we move into the dry season and the floodways on our Top End roads become accessible, indigenous groups, parks managers and farmers are keen to get those early season burns in full swing. This type of fuel mitigation burning happens... Continue reading →

The post The NAFI app is changing the way work is planned in the field appeared first on Gaia Resources.

]]>
Controlled burning is underway across the western and central parts of tropical north Australia. As we move into the dry season and the floodways on our Top End roads become accessible, indigenous groups, parks managers and farmers are keen to get those early season burns in full swing. This type of fuel mitigation burning happens at a time of year when there is moisture in the soil and vegetation, in order to limit more catastrophic bushfires later in the season when everything has dried up. It reminds me of the explanation Dom Nicholls from the Mimal Rangers gave me over a coffee chat last year: in East Arnhem Land they begin their programs as early as they can get the flames to take hold in the grassy vegetation – in March if they can get road access – and then race to fill the gaps later using fire scar mapping and careful planning.

Farmers like Mark Desaliy can use the app to monitor fires near their stations.

Our initial release of the North Australia and Rangelands Fire Information (NAFI) app for iOS and Android back in February brings the most used fire information resource for land managers in Australia to your phone, allowing you to keep a constant eye on bushfire threats. You can view maps of satellite-generated fire activity (hotspots) and burnt areas (fire scars) provided by the NAFI service. There’s a good summary from March by Rohan Fisher on ABC Radio – Kimberley.

At a regional scale like this area in northern NT and WA, the NAFI app represents real-time hotspots through a heat map clustering algorithm.

Just to recap on how the app works behind the scenes to provide you with real-time fire information:

  • The hotspot locations are updated several times a day, and the fire scars are updated once or twice a week depending on fire conditions.
  • The fire scars are produced by the NAFI Service and the hotspots are sourced from Landgate WA and Geoscience Australia.
  • Base maps for imagery and topography can be downloaded for offline use in your region of interest, and then used when you go outside of mobile data range.
  • Burnt area mapping covers the Australian savannas and rangelands that comprise around 70% of the continent, but does not cover NSW, VIC or the heavily populated regions of QLD, WA and SA.

So how popular is the NAFI app? We can monitor a number of analytics using App Store Connect (iOS) and the Google Play Console, or the Firebase dashboard. These are configurable dashboards that can tell us things like how many installations occurred by day or week and how many apps are actively used, filtered by operating system or device type. As of today, the iOS app has been downloaded 288 times since its initial release, and the Android version 142 times.

The App Store Connect dashboard for the iOS NAFI app provides statistics of installations by week since the mid-February release.
The Google Play Console shows the increase in installations of the Android NAFI app over time since the mid-February release.


We expect installations to continue climbing through May and beyond, as more people on the ground become aware of the benefits and utility of the app. There are two phases of bushfire-related activity where the app can be useful: the early dry season burn programs and carbon (emission reduction) projects, and the late dry season bushfire response.

The statistics are anonymised so we are not tracking personal information, but the out-of-the-box analytics do help us understand trends, and – along with ratings and word of mouth – give us a bit more insight into how people are reacting to the app. This can then feed into our strategy with clients, helping them target marketing campaigns and prioritise enhancements. We also utilise Firebase Crashlytics to log the details of any crashes and error messages received, and this really helps us get quickly to the root cause of a technical issue a particular user is experiencing.

Please be aware if you are using the app:

  • Hotspot location on any map may only be accurate to within 1.5 km
  • The hotspot symbol on the maps does not indicate the size of the fire
  • Some fires may be small, brief, or obscured by smoke or cloud and go undetected
  • Satellites detect other heat sources such as smokestacks

For more information visit: https://savannafiremapping.com/nafi-mobile-app/

If you would like to know more about our projects with the NAFI team, strike up a conversation by sending me an email or getting in touch on Twitter, LinkedIn or Facebook. 

Chris


The post The NAFI app is changing the way work is planned in the field appeared first on Gaia Resources.

]]>
Fire information app launch https://archive.gaiaresources.com.au/fire-information-app-launch/ Wed, 10 Feb 2021 02:00:04 +0000 https://archive.gaiaresources.com.au/?p=9006 Today marks the launch of the NAFI Fire Information app by the team at Charles Darwin University (CDU) responsible for maintaining Northern Australia and Rangelands Fire Information (NAFI) system. Gaia Resources worked closely with the NAFI team to design and build the app, which you can now download onto your device from the Apple Store or Google... Continue reading →

The post Fire information app launch appeared first on Gaia Resources.

]]>
Today marks the launch of the NAFI Fire Information app by the team at Charles Darwin University (CDU) responsible for maintaining the Northern Australia and Rangelands Fire Information (NAFI) system.

Gaia Resources worked closely with the NAFI team to design and build the app, which you can now download onto your device from the Apple Store or Google Play Store.

The release coincides with the 2021 Savanna Fire Forum being run remotely from Darwin to over 150 participants.  Available for Android and iOS, the app enables land owners, indigenous rangers, conservation scientists, pastoralists and others to get near real-time fire information across 80% of the Australian continent. It is part of a bunch of support we have been providing to NAFI and other fire management groups in recent years (click here for a snapshot of previous blogs), and we are really excited about our contribution up in the Top End.

On the surface the app simply reflects the powerful data products available from the NAFI website that are already used extensively across projects and programs in the north of Australia to monitor savanna burning programs and bushfires.  Fire scars are displayed representing remotely sensed burnt areas coloured by the month of the fire, as are thermal hotspots detected from an array of satellites. These data layers are presented in an intuitive mapping interface with a small selection of base maps, location and compass direction functionality.

The NAFI app starts with a view of your region (left), presents a legend and layer selector (middle left), provides topo and imagery base maps (middle right) and near real-time hotspots (right).

In this initial release, the idea is to get the data out there onto mobile devices, and the NAFI team are keen for that to drive discussion about enhancements that will deliver high value to people working in the field and planning their fire management activities. This could be planners and rangers on carbon abatement programs focused on early dry season controlled burns, or community and government organisations battling raging bushfires, like the one that swept through 87,000 hectares of the World Heritage Listed Fraser Island last December (here is a link to the most recent article on that event).

An earlier test version of the app (left) during the December, 2020 Fraser Island fire. Image source: The Australian.

When you start using the app, you’ll notice a few little gems in there that are all focused on increasing the accessibility and usefulness of that NAFI data. So let’s start with the near real-time aspect:

  • the app checks for updates regularly, with hotspots updated every 20 minutes on average, and fire scars updated 2-3 times per week,
  • data is pulled down dynamically from the NAFI server and processed on AWS cloud-based infrastructure,
  • the data is then automatically delivered to the person’s device whenever they have the app running with a mobile data connection (a simplified sketch of this poll-and-cache pattern follows below).
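
For the technically minded, here is a simplified sketch of that poll-and-cache pattern. The endpoint URL and cache file name are hypothetical placeholders, not the actual NAFI service.

```python
# Illustrative sketch of the poll-and-cache pattern described above.
# The endpoint URL and file name are hypothetical, not the NAFI API.
import json
import time
import urllib.request

HOTSPOT_URL = "https://example.org/nafi/hotspots.geojson"  # hypothetical
CACHE_FILE = "hotspots_cache.geojson"
POLL_SECONDS = 20 * 60  # hotspots are refreshed roughly every 20 minutes

def refresh_cache():
    """Fetch the latest hotspots and keep the last good copy for offline use."""
    try:
        with urllib.request.urlopen(HOTSPOT_URL, timeout=30) as resp:
            data = json.load(resp)
        with open(CACHE_FILE, "w") as f:
            json.dump(data, f)
        print(f"Cached {len(data.get('features', []))} hotspots")
    except OSError as err:
        # Offline or server unreachable: keep showing the last cached data
        print(f"Refresh failed ({err}); using last cached data")

while True:
    refresh_cache()
    time.sleep(POLL_SECONDS)
```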

Next, let’s consider the offline capabilities:

  • the app allows you to download base maps (OpenMapTiles imagery or NAFI’s Topographic map) for your region(s) of interest,
  • you can continue to work outside of a mobile data connection, with the most recent fire scar and hotspot data from when you were last online and had the app running,
  • the location marker and compass direction give you geographical context online or offline.

Hundreds of thousands of hotspots are rendered seamlessly using a heat map algorithm. To overcome a performance constraint on mobile devices, we devised a rendering approach that can display tens of thousands of concurrent hotspot points across Australia as temporal heat map clusters. People using the app get a regional view of hotspots and can visualise three different fire age groupings in purple (0-6hrs), red (6-24hrs) and blue (24-48hrs). The app also features some high resolution fire scar mapping of the Darwin area sourced from Sentinel satellite imagery, as part of a trial implementation with BushfiresNT. The continental scale fire scar mapping is based on MODIS satellite imagery (250m resolution), so the new Sentinel-based mapping, with its much higher resolution imagery, is an exciting space to keep an eye on.
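
As a rough illustration of the temporal grouping (not the app’s actual on-device rendering code), here is a sketch of how hotspots might be bucketed into those three age classes; the function and field names are assumptions.

```python
# A simplified sketch of the temporal grouping behind the heat map:
# bucket hotspots by detection age into the purple/red/blue classes
# described above. Names are hypothetical, not the app's internals.
from datetime import datetime, timedelta, timezone

AGE_CLASSES = [
    (timedelta(hours=6), "purple"),   # 0-6 hrs: freshest detections
    (timedelta(hours=24), "red"),     # 6-24 hrs
    (timedelta(hours=48), "blue"),    # 24-48 hrs
]

def age_class(detected_at, now=None):
    """Return the rendering class for a hotspot, or None if it has expired."""
    now = now or datetime.now(timezone.utc)
    age = now - detected_at
    for max_age, colour in AGE_CLASSES:
        if age <= max_age:
            return colour
    return None  # older than 48 hrs: dropped from the heat map

# Example: a hotspot detected 10 hours ago renders in the red class
print(age_class(datetime.now(timezone.utc) - timedelta(hours=10)))  # "red"
```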

Being an initial release, the NAFI team are looking for feedback to shape future versions, or just to hear what you think – there’s a direct feedback link in the app itself too. We’d also love to hear your thoughts, so feel free to reach out and start a conversation by sending me an email or getting in touch on Twitter, LinkedIn or Facebook. 

Chris

The post Fire information app launch appeared first on Gaia Resources.

]]>
Counting fish – supporting research at the Australian Institute of Marine Science https://archive.gaiaresources.com.au/counting-fish/ Fri, 15 Jan 2021 04:32:26 +0000 https://archive.gaiaresources.com.au/?p=8816 You may have heard the news on Tuesday from the Commonwealth government media release where the Minister for Industry, Science and Technology Karen Andrews announced grants to develop products that improve our natural environment. We are very excited to announce that Gaia Resources (mentioned as Tekno Pty Ltd) is one of the grant recipients, and we... Continue reading →

The post Counting fish – supporting research at the Australian Institute of Marine Science appeared first on Gaia Resources.

]]>
You may have heard the news on Tuesday from the Commonwealth government media release, in which the Minister for Industry, Science and Technology Karen Andrews announced grants to develop products that improve our natural environment. We are very excited to announce that Gaia Resources (mentioned as Tekno Pty Ltd) is one of the grant recipients, and we have some really exciting work ahead of us in the coming months.

Together with some excellent partners and the Australian Institute of Marine Science (AIMS), we will be looking at leveraging Artificial Intelligence (AI) and Machine Learning (ML) tools to identify fish species, counts and other measures from underwater video footage. We are hoping this work, tailored to the research challenges that AIMS faces, will continue on into products and insights that streamline marine research programs and conservation efforts. Our focus will be to support scientific understanding of critical issues and to build online tools that expand program capacity and effectiveness.
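
While the real models are still to be built, here is a heavily hedged sketch of the general shape of such a pipeline – sampling frames from footage and tallying detections – with the classifier left as a placeholder. None of this represents AIMS’s actual methods or models.

```python
# A hedged sketch of the general pipeline only: sample frames from an
# underwater video and tally per-species detections. classify_frame()
# is a placeholder; the real trained models are not represented here.
from collections import Counter
import cv2

def classify_frame(frame):
    """Placeholder for a trained detector returning species labels."""
    return []  # e.g. ["Lutjanus sebae", "Lethrinus nebulosus"]

def count_species(video_path, sample_every_s=1.0):
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25  # fall back if FPS is unknown
    step = int(fps * sample_every_s)       # roughly one frame per second
    counts, frame_idx = Counter(), 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_idx % step == 0:
            counts.update(classify_frame(frame))
        frame_idx += 1
    cap.release()
    return counts

print(count_species("transect_clip.mp4"))  # hypothetical footage file
```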

Our team is really looking forward to getting started, and I am sure we will have an update for our interested readers in a few months’ time. Feel free to give me a call or send me an email if this type of work interests you, or strike up a conversation on Twitter, LinkedIn or Facebook. 

Chris

The post Counting fish – supporting research at the Australian Institute of Marine Science appeared first on Gaia Resources.

]]>
