The emergency response world has wrestled with the tension between experience and data for years. Many response managers who have come up through the ranks size up an incident quickly from experience and commit resources based on those collective memories. When presented with data models, however, many are sceptical. Experience, after all, has been tested and its results remembered, so it is no surprise that first-responder managers often rely on what seems like the safer choice for accountability.
However, as more organizations go digital, the potential decision-making benefits of big data become harder to ignore. So why should data be set aside in favour of experience? It shouldn't, but the discussion should not be framed as experience versus data at all. Instead, decision-makers should be asking how their experience and new data tools can be combined for smarter, better-informed decisions.
The resource is obvious and available. According to IBM, roughly 2.5 exabytes of data were generated daily in 2012, and one can only wonder what that figure is four years later. As David Greenberg of BankMobile notes, some organizations already hold up to 99 percent of the data they need, stored somewhere. Not using data a company has already collected is leaving money on the table in the form of waste and inefficiency. That data also needs to be correct and unbiased; otherwise it is a waste of time to filter through. This means the tools used to collect it must be reliable, accurate, and robust. A one-off spreadsheet is not likely to fly. A database's worth of data, crunched with objective metrics, updated regularly, and distilled into ongoing reports people can use on the fly, is far more dependable.
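To make the idea concrete, here is a minimal sketch of crunching collected incident records with objective metrics and emitting a repeatable report, rather than a one-off spreadsheet. The field names and figures are purely illustrative assumptions, not a real agency's schema.

```python
# Sketch: turn raw incident records into a repeatable summary report.
# The records and field names below are hypothetical, for illustration only.
from collections import defaultdict

incidents = [
    {"type": "fire", "response_min": 9.0},
    {"type": "fire", "response_min": 12.5},
    {"type": "medical", "response_min": 6.5},
    {"type": "medical", "response_min": 8.0},
    {"type": "hazmat", "response_min": 21.0},
]

def build_report(records):
    """Group records by incident type; compute count and mean response time."""
    grouped = defaultdict(list)
    for rec in records:
        grouped[rec["type"]].append(rec["response_min"])
    return {
        kind: {"count": len(times), "avg_response_min": sum(times) / len(times)}
        for kind, times in grouped.items()
    }

report = build_report(incidents)
for kind, stats in sorted(report.items()):
    print(f"{kind}: {stats['count']} incidents, avg {stats['avg_response_min']:.1f} min")
```

Because the report is computed from the stored records each time, it can be regenerated on the fly whenever the database is updated, which is exactly what a one-off spreadsheet cannot do.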
The specific tools applied depend on the data being measured and the output desired. At a minimum, however, data use should rest on accepted statistical principles, typically measures of central tendency and dispersion (averages, variances, standard deviations, and so on). More importantly, both decision-makers and those preparing the reports should have a solid grounding in how to read statistics-based reports correctly; raw numbers and numbers that actually matter can be worlds apart. Quantification therefore needs to be accurate, easy to apply, and consistent every time it is used. Too often decision-makers and their staff get caught up in fancy tools that are used incorrectly or, worse, report the wrong data. As former Secretary of Defense Robert Gates put it, "I have wasted more money on IT than anyone in history." Bad decisions follow, and the entire project is thrown out with the bathwater.
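The basic measures mentioned above can be computed with nothing fancier than Python's standard statistics module; the response-time figures here are hypothetical, chosen only to illustrate the calculations.

```python
# Sketch: the standard summary statistics named in the text, computed
# with Python's built-in statistics module. Data is hypothetical.
import statistics

# Hypothetical incident response times, in minutes.
response_times = [8.5, 12.0, 7.2, 15.8, 9.1, 11.4, 6.9, 13.3]

mean = statistics.mean(response_times)          # average (central tendency)
variance = statistics.variance(response_times)  # sample variance (dispersion)
std_dev = statistics.stdev(response_times)      # sample standard deviation

print(f"mean={mean:.2f} min, variance={variance:.2f}, std dev={std_dev:.2f}")
```

A high standard deviation relative to the mean would tell a decision-maker that the average alone hides wide swings in response time, which is precisely the kind of reading skill the reports demand.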
At Spire, we have pioneered contextual intelligence technology that can be applied to any part of the human capital supply chain. With 5,200 GB of data per person on earth expected by 2020, the ability to access information correctly will be in high demand, and data literacy will be essential. Our platform searches, understands, and interprets data contextually, deriving analytics that yield valuable insights. These tools help reduce costs and time while increasing revenue and operational value. Contextual search and intelligence technology also processes data faster, delivering 95 percent accuracy in contextual search and 80 percent accuracy in demand-supply mapping. More importantly, the data is interpreted and analyzed in the unique context of your business, not a generic industry. Make Spire your next step in evolved decision-making.