We caught up with one of our own resident data experts, Dan Adams, VP Data Product Management here at Pitney Bowes, to ask him a few questions prior to his upcoming participation in Forbes Insights’ webinar The data differentiator: How improving data quality can improve your business on September 7, 2017 at 2pm EST.
Read on to learn more about his perspective on why the importance of data quality is top of mind for companies of all shapes and sizes. Plus, find out how Pitney Bowes is transforming to meet the demand.
01. You’ve been a self-proclaimed "data-guy" for over 20 years. Can you tell us a bit about your background?
Ha. I think I probably refer to myself as a data-geek more often, but data-guy works too. I’ve spent 20+ years working in various executive roles across data sales, product management, business development, supply chain, data collection and map production management for companies including TomTom, GDT and Tele Atlas. For the last three years I served as the CEO of Maponics, a spatial data company based in Vermont that was acquired by Pitney Bowes in the middle of last year. I’ve been at Pitney Bowes for just over a year now and I’m as happy as a "data-guy" could be.
02. Many are surprised to learn that data is such a big focus area for Pitney Bowes. When did that start and why?
Well, what I think surprises people the most is learning that our focus on data isn’t really new.
My colleague Roger Pilc recently published a great piece in CIO magazine called Postcards from the digital transformation from a 100-year-old startup. In it, he discusses not only how mission-critical data precision and accuracy are to any organization’s physical and digital offerings, but how critical they are to us here at Pitney Bowes.
We’ve been undergoing our own digital transformation for over four years, and data has played, and will continue to play, a significant role in that: from how we ingest and use data to inform our business decisions at Pitney Bowes, to how we productize data as an offering to help clients address business and industry challenges in areas ranging from real estate and insurance to location-based marketing and advertising.
We’ve had an extremely robust location data portfolio for years, something I long admired before joining Pitney Bowes. Thanks to our legacy in shipping and mailing, we have a richly multi-sourced dataset that delivers industry-leading match rates and the highest-precision coordinates of any geocoding product available. We’re building new products and new data capabilities every day. Pitney Bowes has been around for a long time, and as a result, we have a lot of experience in the key categories driving commerce around the world. Data is one of the areas where we really stand out.
03. On September 7, you’ll join Forbes to discuss the importance of data quality. What are some of the business needs driving organizations to improve the overall quality of their data?
I think the business trend stems from broader changes in our society toward being more data-driven. You can see the important role data is playing across many different aspects of business; data itself is becoming part of the competitive landscape. Examples include major changes in how baseball is being played, with data replacing tradition; predictive traffic information saving people time sitting in traffic; and, my favorite, the number of views/tweets/followers on social media. We’re all consuming more data-derived products every day. If your business isn’t deriving new insights and analytics from your data and your competitor is, they will potentially learn something that you won’t. They can take that intelligence to market and win business from you because you weren’t paying attention to the information they were.
Companies are beginning to think of their data as an asset, and many are starting to manage it that way. Data quality, freshness and accuracy become important focus areas. Managing quality in process design, assuring that quality is retained through all internal processes, and identifying, confronting and correcting errors early in the process are critical to speed and success. Too often, data quality control is designed in reaction to problems that impacted a customer or caused a missed deadline. Quality control as the primary means of error detection is not an enabler of speed; it’s a control mechanism. In the best-functioning data systems, quality control confirms that quality assurance is catching and correcting problems upstream.
With data being used to make impactful business decisions every day, or with individual transactions, there is considerable demand to get it right, and competitive pressure to do it quickly. The old adage of pick two: "speed, cost, or quality" isn’t a luxury we have anymore.
04. In your opinion, what’s the top takeaway for those who tune in to the webinar on September 7?
Nobody licenses data or builds data just to have it. They want to do something with it. So, for those listening in, I’d encourage you to think about what exactly you’re trying to accomplish with the data you’re using and engage this group with those questions in mind. What can we expose you to that will help you answer and measure the hard questions about whether or not your data is meeting your needs? Lean in to learn tips for better understanding the sources your data comes from, and your supplier’s commitment to providing you with the data that matters most to you. That’s key.
Thank you for your time, Dan. We’re looking forward to a great session next week. Don’t forget to tune in to the webinar to see what other insights our data experts Dan Adams, Anthony Scriffignano and Linda Brendish have to share.