Data democratization is an aspirational goal for many organizations, but what does it take to achieve it, and how can Location Intelligence play a role? In Palo Alto this week, at the Telco Data Analytics conference, speakers raised some interesting questions about data democratization. While some organizations have been heading down this path for a few years, it is clear that the journey takes commitment and is rarely plain sailing. Issues ranged from data acquisition, data quality and data privacy to a skills gap, with too few data scientists trained in modern technologies.
One thought that really resonated was that not everyone needs to be a data scientist, but everyone in the organization has a role to play. Data democratization is not the end game; information sharing is. A smaller number of data scientists and business analysts need to make the data fit for use by a larger number of data consumers. This is where Location Intelligence can be part of the solution. Very large datasets can be overwhelming, and organizing the data by location helps bring context. A spatial key, such as one associated with an address or a geohash, provides that organization: data can be joined on the spatial key and made available to applications.
To take an example from property and casualty (P&C) insurance, an address-based key can be added to multiple property and risk datasets through geocoding or another spatial process. An underwriting application can then retrieve all available data attributes for a property using that key. The end consumers of the underwriting application do not need to understand GIS functions or SQL syntax, but they can use the retrieved data to make actionable decisions that bring a return on investment.
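The underwriting pattern above can be sketched in a few lines of Python. Everything here is hypothetical: the address, the datasets, the field names, and the simple `normalize_address()` rules all stand in for what a real geocoder and data lake would provide.

```python
def normalize_address(addr: str) -> str:
    """Collapse an address into a simple join key (illustrative only;
    a production system would assign keys via a geocoding service)."""
    return " ".join(addr.upper().replace(",", " ").split())

# Hypothetical risk datasets, each keyed independently by address
flood_risk = {"350 Jarvis St, Wamego, KS": {"flood_zone": "AE"}}
fire_risk = {"350 JARVIS ST WAMEGO KS": {"distance_to_station_km": 2.1}}

def property_profile(address: str) -> dict:
    """Merge every attribute available for one address-based key."""
    key = normalize_address(address)
    profile = {"address_key": key}
    for dataset in (flood_risk, fire_risk):
        for addr, attrs in dataset.items():
            if normalize_address(addr) == key:
                profile.update(attrs)
    return profile

profile = property_profile("350 Jarvis St, Wamego KS")
# The merged profile now carries attributes from both datasets,
# without the consumer touching GIS functions or SQL.
```

The point of the sketch is the shape of the solution: once every dataset carries the same address-based key, the underwriting application only ever performs a lookup, and the spatial work happens once, upstream.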
Similarly, in telecommunications, making sense of billions of call transactions can be challenging. Organizing the data by geohash allows aggregation by location. Patterns over space and time become much easier to identify, and anomalous call quality stands out from the background. The aggregated data can then be served to applications across the organization, whether for network optimization, marketing or customer service.
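As a minimal sketch of that idea, the snippet below encodes coordinates into standard geohash strings and averages a call-quality score per geohash cell. The call records and the 0-5 quality score are invented for illustration; the encoder itself follows the standard geohash bit-interleaving scheme.

```python
from collections import defaultdict

BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"  # standard geohash alphabet

def geohash(lat: float, lon: float, precision: int = 6) -> str:
    """Encode a latitude/longitude pair as a geohash string by
    alternately bisecting the longitude and latitude ranges."""
    lat_rng, lon_rng = [-90.0, 90.0], [-180.0, 180.0]
    bits, use_lon = [], True
    while len(bits) < precision * 5:
        rng, val = (lon_rng, lon) if use_lon else (lat_rng, lat)
        mid = (rng[0] + rng[1]) / 2
        if val > mid:
            bits.append(1)
            rng[0] = mid
        else:
            bits.append(0)
            rng[1] = mid
        use_lon = not use_lon
    return "".join(
        BASE32[int("".join(map(str, bits[i:i + 5])), 2)]
        for i in range(0, len(bits), 5)
    )

# Hypothetical call records: (latitude, longitude, quality score 0-5)
calls = [
    (40.7580, -73.9855, 4.5),
    (40.7581, -73.9850, 1.2),  # poor-quality call in the same cell
    (51.5074, -0.1278, 4.8),
]

# Aggregate average call quality per geohash cell (~5 km at precision 5)
totals = defaultdict(lambda: [0.0, 0])
for lat, lon, score in calls:
    cell = geohash(lat, lon, precision=5)
    totals[cell][0] += score
    totals[cell][1] += 1

avg_quality = {cell: s / n for cell, (s, n) in totals.items()}
```

Because nearby calls collapse into the same cell, a dip in a cell's average quality is visible without anyone inspecting individual transactions, and the per-cell table can be served to network optimization, marketing or customer service applications alike.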
Another idea where hopes have been dashed is that of the all-knowing dashboard. The hope is that a single dashboard will show every metric needed. The reality, unfortunately, is that after a relatively short time the dashboards go unused. Giving people access to the data, it would seem, is not enough to make it useful in an ongoing way.
We see parallels in the Location Intelligence world, where visual inspection, query results, charts and thematic mapping through a visual interface are invaluable for understanding the data. However, if an ROI is to be achieved, the patterns in the data need to be rapidly identified and moved into an operational process. In the first steps, every piece of data needs to be analyzed by a real person who understands what it means; it is unrealistic to expect that anyone else will have the time or skills to give it the same level of attention. Making data part of an operational process before projects end and people move on to the next initiative is essential.
Location Intelligence can help turn data science problems into operational solutions by organizing data around a spatial key. For more information on the Pitney Bowes Spatial Data Lake solution, download this white paper.
Visit us online: www.pitneybowes.com/us/location-intelligence