
waterRIDE DATA Manager

Single Point of Truth

You can be confident that your flooding data is up to date, correct, and ready to use.

Metadata

A holistic metadata framework ensures the provenance of your flooding datasets.

Quality Control

Archiving, versioning, rollback, and quality levels ensure your datasets are tracked, rated and fit for purpose. Dedicated tools streamline data inflow and outflow.

Data History

As your flooding data continues to change over time, you can drill through these changes at any location, or even go back to a previous state.
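Point-in-time history like this can be illustrated with a minimal version store. The sketch below is an assumption for illustration only (the class name, the string timestamps, and the flood-level values are all hypothetical), not how waterRIDE actually stores its data:

```python
from bisect import bisect_right

# Hedged sketch: "drill through" the change history of one location and
# recover the value that was current at an earlier point in time.
class LocationHistory:
    def __init__(self):
        self._versions = []  # (timestamp, flood_level), appended in time order

    def record(self, timestamp, flood_level):
        self._versions.append((timestamp, flood_level))

    def as_of(self, timestamp):
        """Return the flood level that was current at the given time."""
        times = [t for t, _ in self._versions]
        i = bisect_right(times, timestamp)
        if i == 0:
            return None  # no data existed yet at that time
        return self._versions[i - 1][1]

history = LocationHistory()
history.record("2015-01", 2.1)   # original study
history.record("2018-06", 2.4)   # revised after new modelling
history.record("2021-03", 2.2)   # updated calibration

print(history.as_of("2019-12"))  # 2.4 — the adopted value at that time
```

Keeping every version, rather than overwriting in place, is what makes rollback and "point in time" queries possible.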


The waterRIDE DATA Manager module is designed to streamline and codify the management of your ever-changing flooding datasets.

Designed to work with your existing waterRIDE FLOOD licensing, or as part of a fully managed WaaS solution, DATA Manager helps you ensure that your flooding datasets can be relied upon.

Single Point of Truth

Centralise disparate flooding datasets into a single framework to ensure that all users are accessing the correct datasets.

Dedicated tools help you manage study overlap, mixed detail levels, and inconsistency in your flooding datasets to ensure that your end users do not need to. They can just rely upon the datasets they have access to.

Mastergrids enable you to provide seamless access to all of your flooding data, at once, and with unrivalled speed.

Metadata Management

An extensive metadata framework ensures that your data sources are never lost and that their provenance is maintained.

Commonly used fields include:

  • Study details (Name, Author, Date, Coverage extents)
  • Model Type
  • Model Detail
  • Model Quality – a measure of the quality of the modelling (resolution, model type, etc.)
  • Study Quality – a measure of the quality of the type of study (formal flood study, flood impact assessment, rapid hazard assessment, etc.)
  • Overall Quality – a combination of Model Quality and Study Quality
  • Flooding Type (Riverine, Overland, and Storm Tide)
  • Design Events modelled
  • Scenarios modelled (calibration, climate change, blockage)
  • Catchment ID
  • Locales covered by the study
  • Status (draft, adopted, retired, rollback, etc.)
  • Information type (peak or time series)
  • Reference GIS layers
  • Study reports and references
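A metadata record built from these fields can be sketched as a simple structured type. The field names, types, quality scale, and the averaging used for Overall Quality below are illustrative assumptions, not the actual waterRIDE schema:

```python
from dataclasses import dataclass, field

# Hypothetical flood-study metadata record, loosely modelled on the
# commonly used fields listed above.
@dataclass
class StudyMetadata:
    name: str
    author: str
    date: str                      # e.g. "2019-05"
    model_type: str                # e.g. "2D hydraulic"
    model_quality: int             # 1 (low) .. 5 (high), assumed scale
    study_quality: int             # 1 (low) .. 5 (high), assumed scale
    flooding_type: str             # "Riverine", "Overland" or "Storm Tide"
    design_events: list = field(default_factory=list)
    status: str = "draft"          # "draft", "adopted", "retired", ...

    @property
    def overall_quality(self) -> float:
        # Overall Quality combines Model Quality and Study Quality;
        # a simple average is assumed here purely for illustration.
        return (self.model_quality + self.study_quality) / 2

study = StudyMetadata(
    name="Lower Creek Flood Study",
    author="Example Consultants",
    date="2019-05",
    model_type="2D hydraulic",
    model_quality=4,
    study_quality=5,
    flooding_type="Riverine",
    design_events=["1% AEP", "0.2% AEP"],
    status="adopted",
)
print(study.overall_quality)  # 4.5
```

Capturing these fields in a fixed structure, rather than free text, is what makes searching and quality filtering across many studies practical.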

Data Access Control

Versioning and publishing allow you to manage what datasets various users in your organisation have access to and the way in which they interact with them.

Create varied projects to ensure that different end user types can access the datasets they need without being distracted by extraneous information.

Streamlined Data Exporting and Receipting

Specialised tools allow you to supply datasets to external users. Just draw a polygon and export.

All data export processes are tracked in the database.

Equally, when new or updated sub-datasets are returned (e.g. from a development assessment), you can seamlessly merge them into your existing datasets.
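The "draw a polygon and export" workflow, with each export tracked, can be sketched in a few lines. Everything here is an illustrative assumption (the ray-casting clip, the in-memory log standing in for the tracking database, and the sample coordinates), not the actual waterRIDE implementation:

```python
from datetime import datetime, timezone

def point_in_polygon(x, y, polygon):
    """Ray-casting point-in-polygon test for a list of (x, y) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Toggle on each polygon edge the horizontal ray crosses.
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

export_log = []  # stand-in for the export-tracking database

def export_by_polygon(points, polygon, recipient):
    """Return the flood-grid points inside the polygon and log the export."""
    subset = [p for p in points if point_in_polygon(p[0], p[1], polygon)]
    export_log.append({
        "recipient": recipient,
        "point_count": len(subset),
        "exported_at": datetime.now(timezone.utc).isoformat(),
    })
    return subset

polygon = [(0, 0), (10, 0), (10, 10), (0, 10)]
points = [(5, 5), (15, 5), (2, 8)]
subset = export_by_polygon(points, polygon, recipient="External Consultant")
print(len(subset))  # 2
```

Logging every export as a side effect of the export call itself, rather than as a separate manual step, is what keeps the tracking record complete.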

Key Features

  • Governance and quality
  • Dataset receipt and tracking
  • Rollback, decision and “point in time” data history (legal challenges)
  • Archiving
  • Data searches
  • Management of study overlap and consistency
  • Data updates vs dataset replacement 
  • Structured information distribution (internal and external)
  • Usability and drill-down detail for differing end user needs
  • Centralised data management
  • Management of data updates
  • Performance management ensures massive datasets are usable

Case Study: Sunshine Coast Regional Council

Download a technical paper presented by Cameron Druery at the Floodplain Management Australia Conference in Canberra, 2019:
The need to manage the looming "big flood data" problem.

© Copyright 2020 Worley Digital - All Rights Reserved - License and Maintenance Agreement