
Q&A: How Geospatial Data Assists Disaster Relief

Enterprises are increasingly finding new ways to put geospatial data to work. Todd Mostak, OmniSci's founder and CEO, explains how it can help get the right aid to the right people at the right time and place.

Upside: What type of data is important for disaster relief purposes?

Todd Mostak: Interactive location intelligence is key to disaster response and relief. The ability to optimize global logistics operations, layer in geospatial context, and identify trends for general readiness or emergency management is essential for making rapid decisions.

What is involved in modeling disaster risks?

Modeling risk associated with a disaster involves rapidly analyzing and visualizing spatiotemporal data to better understand the impacts of resource and personnel placement, weather, road and power grid conditions, and more.
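As a minimal illustration of this kind of overlay analysis, the sketch below uses the open source GeoPandas library (not any specific vendor's stack; the facilities, coordinates, and flood zone are invented for the example) to flag critical assets that fall inside a forecast hazard polygon:

    # Hypothetical sketch: flag critical facilities inside a forecast hazard zone.
    # Facility names, coordinates, and the flood polygon are invented.
    import geopandas as gpd
    from shapely.geometry import Point, Polygon

    # Critical assets (e.g., hospitals, shelters) as lon/lat points
    facilities = gpd.GeoDataFrame(
        {"name": ["Hospital A", "Shelter B", "Substation C"]},
        geometry=[Point(-80.19, 25.76), Point(-80.30, 25.80), Point(-80.10, 25.70)],
        crs="EPSG:4326",
    )

    # Forecast flood zone (would normally come from a hazard model)
    flood_zone = gpd.GeoDataFrame(
        geometry=[Polygon([(-80.25, 25.72), (-80.25, 25.82),
                           (-80.12, 25.82), (-80.12, 25.72)])],
        crs="EPSG:4326",
    )

    # Spatial join: which facilities sit inside the hazard polygon?
    at_risk = gpd.sjoin(facilities, flood_zone, predicate="within")
    print(at_risk["name"].tolist())  # -> ['Hospital A']

In a real deployment the polygon would come from a weather or flood model and the join would run over millions of assets, but the shape of the question is the same.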

How does geospatial data inform decision making?

Harnessing the power of geospatial data allows disaster management agencies, emergency managers, and federal agencies to more accurately assess and predict the scope of a disaster threat and make data-informed decisions about the best ways to orchestrate a response.

In what types of scenarios is this data leveraged?

Disaster management agencies use new, data-driven capabilities to analyze and plan for a wide range of activities. With hurricanes and other tropical storms, data can be leveraged to help identify areas at risk of flooding and can help emergency responders map out the best strategies for evacuations. Utility companies can use data visualization tools to mitigate fire risk by monitoring power lines in real time and proactively identifying areas in need of maintenance. When it comes to disaster situations, the opportunities to leverage data are practically endless, but unfortunately underexploited.
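For the power-line case specifically, the underlying check can be sketched in a few lines. The snippet below is pure illustration (pandas only; the segment IDs, temperature readings, and the 25-percent-over-baseline rule are all invented), flagging line segments whose newest sensor reading runs well above their own recent history:

    # Hypothetical sketch: flag power line segments running hot vs. baseline.
    # Segment IDs, readings, and the 25%-over-baseline rule are invented.
    import pandas as pd

    readings = pd.DataFrame({
        "segment": ["S1", "S1", "S1", "S2", "S2", "S2", "S3", "S3", "S3"],
        "temp_c":  [40.1, 41.0, 40.5, 39.8, 40.2, 58.3, 41.2, 40.7, 41.0],
    })

    # Baseline: mean of all but the newest reading for each segment
    baseline = readings.groupby("segment")["temp_c"].apply(lambda s: s.iloc[:-1].mean())
    latest = readings.groupby("segment")["temp_c"].last()

    # Flag segments whose newest reading runs >25% above their own baseline
    flagged = latest[latest > baseline * 1.25].index.tolist()
    print(flagged)  # -> ['S2']: candidate for proactive inspection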

What is the speed of the analysis?

With the tools currently available to disaster managers, modeling complex disaster risks to communities can require more than 24 hours of computer processing time. In disasters, minutes count. To make decisions effectively, managers need real-time, accelerated analytics at their fingertips. Real-time analytics platforms speed up the modeling process, allowing anyone to quickly visualize the data and even model scenarios, which drives better mitigation, planning, and response decisions.

Can you give an example of how this intelligence is being used?

When Hurricane Irma struck in 2017, it knocked out power across the Caribbean, Florida, Georgia, and the Carolinas, leaving more than 17 million people without electricity. People were left without air conditioning in the extreme heat, without refrigerators to keep food and medicine safe, without lights at home or on the streets, and without effective means of communication to stay informed.

Critical infrastructure and management agencies that depend on power -- first responders, hospitals, drinking water suppliers -- were all heavily impacted. At the time, agencies were hindered by their inability to get a reliable read on the situation as it unfolded, which made it even harder to acquire emergency supplies and get them where they were needed most.

An accelerated analytics platform can dramatically speed up geospatial analytics, giving the emergency management community a tool capable of delivering millisecond-level performance when analyzing disaster data at scale. This was put into action after Hurricane Maria hit Puerto Rico in 2017. Skyhook, an OmniSci customer, used our accelerated analytics platform to quickly plot Wi-Fi access points on a map. They then monitored the data to track power outages across the island, identifying areas in need of help and enabling resources to be directed there. Power outages, supply inventory levels, and weather data can all be tracked in real time, giving disaster response teams unprecedented visibility.
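The underlying idea is simple to sketch. The snippet below is not Skyhook's or OmniSci's actual pipeline; it is a minimal pandas illustration with invented data: bin access point check-ins into grid cells and flag cells where the share of access points still reporting has collapsed:

    # Hypothetical sketch of the outage-detection idea (not Skyhook's pipeline):
    # bin Wi-Fi access points into grid cells and flag cells where the share
    # of APs still reporting after the storm has collapsed. Data is invented.
    import pandas as pd

    aps = pd.DataFrame({
        "lon":    [-66.10, -66.12, -66.11, -66.50, -66.52, -66.51],
        "lat":    [ 18.40,  18.41,  18.39,  18.20,  18.21,  18.19],
        "online": [  True,   True,   True,  False,  False,   True],
    })

    # Snap each access point to a ~0.1 degree grid cell
    aps["cell"] = list(zip((aps["lon"] * 10).round() / 10,
                           (aps["lat"] * 10).round() / 10))

    # Share of access points per cell still reporting
    uptime = aps.groupby("cell")["online"].mean()

    # Cells where fewer than half the APs respond are likely outage areas
    outage_cells = uptime[uptime < 0.5].index.tolist()
    print(outage_cells)  # -> areas to prioritize for crews and supplies

In production this kind of aggregation would run continuously over streaming check-ins rather than a static table, but the grouping logic is the same.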

Where is geospatial data heading?

As more countries around the globe grapple with the short- and long-term effects of natural disasters, we anticipate more will invest in geospatial analytics to ensure they have the tools necessary to proactively protect their communities.

Of course, the impact of geospatial data will go far beyond disaster response. With the proliferation of mobile phones, as well as GPS-enabled vehicles and IoT devices, an increasing proportion of data generated is location- and time-stamped. Although geospatial analytics was traditionally the domain of GIS specialists working with relatively small data sets, it is quickly becoming a central component of big data analytics and data science workflows, where geospatial data and non-geospatial data must be mined in tandem for insight.

Unfortunately, traditional analytics platforms typically cannot fully extract value from geospatial data, much less visualize billions of points, lines, or polygons at granular scale. Conversely, legacy GIS platforms, although capable of advanced analysis of geospatial data, were not built to handle the massive sensor-produced data sets mentioned above and are not well suited to general-purpose analytics and data science workflows.

Analysts and data scientists in spaces as diverse as telecom, energy, government, insurance, utilities, retail, consumer packaged goods, and adtech increasingly need to not only extract insight from massive, real-time data sets, but also interleave both general-purpose and spatiotemporal analytics workflows. Platforms such as OmniSci, built from the ground up to accelerate these new classes of converged analytics and location intelligence workflows, will play a central role in helping derive value from the huge swaths of data that not only tell the story of "what" and "when," but also "where."

About the Author

James E. Powell is the editorial director of TDWI publications, including research reports, the Business Intelligence Journal, and the Upside newsletter. You can contact him via email.

