In this opinion piece, Nigel Lester (pictured below), managing director of software solutions at Pitney Bowes for Australia and New Zealand, discusses the importance of data quality in creating a successful analytics regime.
The big data concept has been around for a while now. As the volume and velocity of data grow, demand for data analytics grows with it, with firms looking for unconventional ways to target and engage customers. Market analysts are bullish on the growth potential, with Gartner estimating the Australian BI and analytics market will grow by 13.4 per cent in 2017 as analytics becomes commonplace. Organisations will need to decide what they want to use data for, which information is important, how to verify it and which tools should be used to analyse it, as well as learning how to apply the insights gained in planning and execution.
With an increasing number of devices becoming interconnected, there is a huge surge of data. That said, data by itself is not valuable until one can draw insights and leverage the intelligence gained for effective decision making. This is especially true in the digital age where data analytics can be used to drive demand, meet supply and improve competitiveness.
Data as a service is gaining traction, and smart companies are starting to leverage data intelligently to make faster decisions. Good data sets the foundation for a successful data analytics initiative, but most organisations struggle to maintain data quality, and obsolete or incorrect data can cost them heavily. A study from Gartner indicates poor-quality data costs organisations an average of $9.7 million per year. This is reinforced by research from Forrester finding that third-party data is highly valued, and that most Australian companies will make extensive use of third-party data from sources such as point-of-sale (POS) devices, digital channels, social media channels and other internet-enabled devices. A lack of accurate, high-quality data can lead to lost revenue and potentially cause reputational damage. It is therefore essential to clean data before it is fed into an analytics tool.
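To make the point concrete, here is a minimal, illustrative sketch of what a cleansing pass might look like before records reach an analytics tool. The field names and records are hypothetical, not drawn from any product mentioned in this piece; the idea is simply to normalise values, drop incomplete records and remove duplicates.

```python
# Illustrative data-cleansing sketch (hypothetical fields and records):
# normalise values, drop incomplete records, and de-duplicate customers
# before the data is fed into an analytics tool.

def clean_records(records):
    """Return cleaned, de-duplicated customer records."""
    seen = set()
    cleaned = []
    for rec in records:
        email = (rec.get("email") or "").strip().lower()
        name = (rec.get("name") or "").strip()
        if not email or not name:
            continue  # discard incomplete records rather than analyse bad data
        if email in seen:
            continue  # skip duplicate customers
        seen.add(email)
        cleaned.append({"name": name, "email": email})
    return cleaned

raw = [
    {"name": "Ada Lovelace", "email": "Ada@Example.com "},
    {"name": "ada lovelace", "email": "ada@example.com"},  # duplicate
    {"name": "", "email": "noname@example.com"},           # incomplete
]
print(clean_records(raw))  # → [{'name': 'Ada Lovelace', 'email': 'ada@example.com'}]
```

In practice this kind of logic usually lives in a dedicated data-quality tool or pipeline rather than ad-hoc scripts, but the principle is the same: bad records are repaired or rejected before analysis, not after.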
Organisations also need to clearly define business requirements and desired goals before undertaking a data analytics initiative. This will better enable the identification of the appropriate data sources. Once sources are identified, modelling the data according to the desired data format can be done, and pilots completed to verify and cross check the accuracy against known metrics.
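The pilot step above, verifying and cross-checking accuracy against known metrics, can be sketched very simply. The numbers and the five per cent tolerance below are hypothetical assumptions for illustration: a metric computed from a new data source is compared against a trusted benchmark figure before the source is relied on.

```python
# Illustrative pilot check (hypothetical figures): compare a metric computed
# from a new data source against a known, trusted benchmark.

def within_tolerance(measured, expected, tolerance=0.05):
    """True if measured is within `tolerance` (as a fraction) of expected."""
    return abs(measured - expected) <= tolerance * abs(expected)

# e.g. monthly sales totalled from a new POS feed vs. an audited figure
pos_feed_total = 98_200.0
audited_total = 100_000.0
print(within_tolerance(pos_feed_total, audited_total))  # → True (within 5%)
```

If the check fails, the data source or the model of it is revisited before the analytics initiative proceeds.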
Mining intelligence from unstructured sources such as video, medical images and social media is gaining prominence over traditional data sources, because these sources can yield more relevant and contextual information. That said, traditional sources such as location remain relevant, and location is now being exploited more than ever before, not just by traditional tech/IT teams, but also by business teams as they realise that location-based data (including data from social media, beacons, sensors and the like) can help them drive more targeted marketing activity and better business outcomes.
The convergence of the physical and digital worlds is a core focus for many businesses today, and many are seeking answers through data. However, it's not quite as simple as crunching numbers.
Social media offers a lot of insight about individuals, but the information we see about a person on Twitter is not necessarily the same as what we see on Instagram or LinkedIn. The persona someone adopts on LinkedIn is likely professional and job-related; their Instagram profile may be dedicated to their personal life, while their Twitter profile may be used to share and gather information. Throw a person's real-life, offline experiences into the mix and you may find they don't interact and engage online the way they do offline.
It’s impossible to have a truly complete view of someone through one singular channel or to understand their relationships by assessing just one aspect of their life. That’s why data quality has been a pain point for many businesses.
The success of location-based analytics relies on good-quality data, and organisations across the globe continue to rely on that traditional source of data: location. For years, enterprises have analysed location data to make business decisions, but the many ways in which location data is now being collected are mind-boggling. The location intelligence business has undergone a sea change, moving from desktop software to widely available spatial solutions. The use of geospatial technology to underpin tools involving big data, artificial intelligence and machine learning, and the Internet of Things is influencing the way companies manage sales and marketing, as well as risk management, government planning, smart city services and more.
Data analytics capabilities are what will enable organisations to stand out from the competition and create business successes. Understanding what attracts and influences customers will be heavily reliant on the data an organisation can gather and their ability to mine and apply it.