In a 2012 New York Times article, Gary King, director of Harvard’s Institute for Quantitative Social Science, made a bold prediction: “the march of quantification, made possible by enormous new sources of data, will sweep through academia, business and government. There is no area that is going to be untouched.” In the years since, the business world in particular has seen rapid growth in cognitive technologies, along with rising customer expectations for the value these products and services provide. In 2012, consumers were awed by Apple’s brand-new personal assistant, Siri. Now people are disappointed that Siri isn’t more intelligent. It is remarkable how quickly our understanding of technology has developed, and how high our expectations of disruptive technology have become.
To fully embrace the shift toward smarter technology, executives must look internally and decide whether their current IT systems are strong enough to survive the transition. Prashant Kelker, a contributor to CIO.com, believes that a “reliance on legacy systems is one of the biggest hurdles” for companies trying to capitalize on the big data revolution. Companies usually discover a gap between the availability of useful data and their ability to use it effectively.
New Sources Of Information
As organizations have gained access to these new sources of information, and as the volumes of data they are trying to leverage have rapidly expanded, the need for new methods to manage and understand data has become urgent. Governments, businesses, and academia are now learning to procure, manage, and implement data in ways that were previously impossible to imagine. Data points are no longer confined to Excel spreadsheets and organized databases neatly displayed in rows and columns. Smarter technology has enabled researchers and data scientists to extract insights from emails, PDFs, satellite imagery, security videos, and many other sources. Developments in artificial intelligence, specifically natural language processing and image recognition, have arrived just as the business world has learned that its existing people and resources cannot make sense of the data.
Consider the example of Descartes Labs, a start-up based in Santa Fe, New Mexico, that uses satellite imagery to rapidly identify wildfires. Combining artificial intelligence and machine learning with imagery from government satellites, the team at Descartes can recognize the presence of smoke or changes in thermal data and immediately alert the appropriate officials in the area, usually within ten minutes. Access to and understanding of these enormous data volumes have enabled Descartes to identify over 6,000 fires, which would not have been possible without harnessing AI’s image recognition capabilities. Countless similar projects use AI to protect against the effects of landslides, floods, and other natural disasters.
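The core idea behind this kind of thermal-anomaly detection can be illustrated with a simple sketch. This is not Descartes Labs’ actual pipeline; the function name, thresholds, and data below are invented for illustration. The sketch compares each grid cell of a satellite tile against its historical temperature baseline and flags cells that run unusually hot:

```python
# Illustrative sketch (not Descartes Labs' actual system): flag satellite
# grid cells whose thermal reading spikes far above the historical baseline.

def flag_hotspots(readings, baseline, threshold=15.0):
    """Return indices of cells whose temperature (Kelvin) exceeds the
    per-cell historical baseline by more than `threshold` degrees."""
    return [i for i, (temp, base) in enumerate(zip(readings, baseline))
            if temp - base > threshold]

# Hypothetical single pass over one image tile:
baseline = [290.0, 291.5, 289.8, 290.2]   # per-cell historical averages
readings = [291.0, 340.0, 290.1, 325.5]   # latest pass; two cells run hot

alerts = flag_hotspots(readings, baseline)
print(alerts)  # -> [1, 3]
```

A production system would of course replace this fixed threshold with a trained model that also weighs smoke plumes, cloud cover, and seasonal variation, but the alerting logic (compare against a baseline, notify on anomaly) follows the same shape.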
More broadly, though, projects such as these demonstrate the utility of having so much data at our disposal. An estimated 90% of the data generated today is unstructured, and the total continues to grow at 55-65% annually. In recent years, the increasing variety of data has been matched by advances in technology to the point where machine learning can understand it and use it to mitigate risk. With the introduction of AI, the challenge has shifted from finding ways to understand the data to learning how to optimize it to inform decision-making.
Providing Value to Customers
The development of cognitive technologies has also led organizations to acknowledge the ability of artificial intelligence and machine learning to expand operational performance and provide value to customers. Aside from optimizing internal operations and reducing the monotonous administrative work within an organization, Erik Brynjolfsson, a professor at MIT Sloan School of Management, believes these technologies “enable firms to gather extremely detailed information from, and propagate knowledge to, their consumers, suppliers, alliance partners, and competitors.”
Coca-Cola was one of the first major companies to fund research and development in AI and machine learning to get on top of massive data sets, and it has since found several useful ways to apply them across its operations. In one instance, the company analyzed over 120,000 pieces of content and 20 billion online impressions to “learn more about different types of consumers, their demographics, and behavior,” insights it used to realign its marketing campaigns.
According to a study from the OMD Group, 74% of customers rely on social channels to help with their purchasing decisions, which is why Coca-Cola implemented computer vision and natural language processing (NLP) to track photos of its products uploaded online and to perform sentiment analysis.
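To make “sentiment analysis” concrete, here is a minimal lexicon-based sketch. This is a stand-in for the production NLP a company like Coca-Cola would use; the word lists and scoring rule are invented for illustration, and real systems rely on trained language models rather than hand-written lexicons:

```python
# Minimal lexicon-based sentiment sketch. The word lists below are
# illustrative assumptions, not a real production lexicon.

POSITIVE = {"love", "great", "refreshing", "best"}
NEGATIVE = {"flat", "awful", "worst", "disappointing"}

def sentiment(post: str) -> str:
    """Classify a social post as positive, negative, or neutral by
    counting matches against the positive and negative word lists."""
    words = post.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("Love this refreshing drink"))     # -> positive
print(sentiment("Tasted flat and disappointing"))  # -> negative
```

Aggregated over thousands of posts per day, even simple scores like this let a brand track how sentiment shifts around a campaign, which is the kind of signal described above.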
In an age of information overload, where businesses need to manage social channels and monitor customer feedback, machine learning and NLP have proven to be instrumental in guiding strategy decisions that can unlock previously unseen opportunities for growth.
Providing Value to Businesses, No Matter the Size
Large multinationals such as Coca-Cola are expected to have fully developed AI capabilities, but one of the most impactful recent advancements in AI has been its wider availability to small and medium-sized businesses. In the Harvard Business Review, Andrew McAfee and Erik Brynjolfsson noted that the “steadily declining costs of all the elements of computing—storage, memory, processing, bandwidth, and so on—mean that previously expensive data-intensive approaches are quickly becoming economical.” As a result, productivity and data processing improvements are available to a wider range of companies.
Companies such as Albert AI, IBM, and Arimo, for example, give SMBs the opportunity to leverage analytics and artificial intelligence to build, manage, and, in some cases, fully automate advertising campaigns. Marketing platforms and other AI-driven services that were once reserved for big tech companies or those with the deepest pockets are now more accessible than ever.
Data-driven approaches to value creation are no longer a dream for young entrepreneurs; they are an expectation. The democratization of technology over the past decade has made it easier for companies of any size to rethink how they can deliver the most value to their customers.
Artificial intelligence, including machine learning, natural language processing, and other cognitive technologies, has undergone an evolution in the past decade. Legacy systems are incapable of performing the data processing tasks necessary to keep pace in a competitive business landscape. Modern business challenges require updated systems that simplify operations, improve the customer experience, and increase the speed and value of data processing.
With the help of AI, technology that is now more accessible than ever, businesses have the power to “measure and therefore manage more precisely than ever before,” turning decisions that once relied on gut feel into stronger, more robust judgments informed by data. Together, these processing capabilities and the opportunities to provide more value make it imperative for businesses to move away from legacy technologies. Artificial intelligence is the only way for businesses to create order from chaos.