The Challenge
The environment sector generates vast amounts of data and wishes to share it to drive local action. At a Water Hub Hackathon, the Environment Agency (EA) presented their existing data platform, which includes critical measurements used to determine the overall quality of water bodies and, in turn, to identify potential projects and their impact.
Our winning submission included the ability to subscribe to data updates, improved visualisation, the ability to view multiple regions at once, a better user experience, and greater modularity. This win led to a project to explore real user needs and requirements and to develop a working prototype of potential solutions.
The Solution
This project required a significant focus on discovering and engaging with current and potential users. We achieved this by attending related conferences, identifying key users with support from The Water Hub, and holding one-to-one demonstrations of proposed ideas alongside discussions of existing work practices around Catchment Data.
To support these discussions we developed wireframes of varying complexity, from static images to interactive pages populated with sample data. Initial findings showed that current usage was lower than expected, and that EA requirements limited the inclusion of desired third-party data, meaning many user requests could not be directly supported.
To fully explore the feedback and user requests, we developed a prototype platform separate from the EA's data explorer, delivering the missing features users had asked for and allowing the EA to determine which features they could develop or implement internally.
The Results
Using R and Shiny as our main development tools, and RStudio Connect as the publication platform, we host a functional prototype of the “Water Body Explorer”. This platform aggregates raw data from many sources, both internal to the EA and from third parties, through bespoke APIs, commercial databases, and asset management systems. All data is referenced back to its original source, allowing users to analyse it, correct it, or check its provenance. Data processing was carried out using Python, R, and other open-source GIS tools. On completion we also provided a full report covering user findings, the development process, technical requirements, and proposed further work, with technical challenges highlighted.
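To illustrate the pattern of source-referenced aggregation described above, the sketch below shows how a small Shiny app might keep each aggregated record tied to the system it came from. The dataset, water body names, column names, and URLs are illustrative placeholders only, not the actual EA feeds or structure used in the prototype.

```r
# Minimal sketch of source-referenced aggregation in Shiny.
# All data values, names, and URLs below are hypothetical examples.
library(shiny)
library(dplyr)

# Each record carries a reference back to its originating system,
# so users can check provenance or flag corrections.
sample_measurements <- tibble::tibble(
  water_body = c("River A", "River A", "Lake B"),
  measure    = c("phosphate", "dissolved oxygen", "phosphate"),
  value      = c(0.08, 9.1, 0.12),
  source     = c("EA API", "Commercial database", "Asset register"),
  source_url = c("https://example.org/ea-api",
                 "https://example.org/commercial-db",
                 "https://example.org/asset-register")
)

ui <- fluidPage(
  titlePanel("Water Body Explorer (sketch)"),
  selectInput("body", "Water body", unique(sample_measurements$water_body)),
  tableOutput("measurements")
)

server <- function(input, output, session) {
  # Filter to the selected water body and display each measurement
  # alongside a human-readable provenance string.
  output$measurements <- renderTable({
    sample_measurements %>%
      filter(water_body == input$body) %>%
      mutate(provenance = sprintf("%s (%s)", source, source_url)) %>%
      select(measure, value, provenance)
  })
}

shinyApp(ui, server)
```

In the prototype itself the sample tibble would be replaced by live queries against the bespoke APIs, commercial databases, and asset management systems mentioned above, with the provenance column retained so every value remains traceable to its original source.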