In the minds of machines: Fundamental change from deep analytics
Deep analytics and artificial intelligence are combining to provide new ways to leverage Big Data.
By Bill Marcus, contributing writer
(HPE INSIGHTS) In Munich, Germany, the technology conglomerate Siemens AG is betting $1.1 billion on digital technologies, including the proposition that machine learning can mine blockchain data to make the transmission of energy-trading data more secure. Siemens is inviting its employees and independent firms to bid for that money if they are willing to research how Siemens can develop businesses that use artificial intelligence.
Siemens is just one of many companies embracing machine learning, the combination of artificial intelligence and deep analytics that lets enterprises make predictions from large amounts of data and lets developers experiment by folding features such as speech and pattern recognition, along with statistical techniques, into their analysis. And HPE’s recent announcement of “machine learning as a service” (MLaaS) is aimed at helping the process really take off.
Speaking at HPE Discover Las Vegas 2016, HPE Executive Vice President Robert Youngjohns called machine learning and deep analytics the most fundamental change we’re ever going to see. This, he said, is a solution to a glut of data that’s only increasing. “Everything and everyone is producing data, whether it’s enterprise apps, mobile phones, wearable devices, thermostats, sensors, log files, and more and more,” said Youngjohns.
The importance of scale
Youngjohns noted that organizations large and small will be affected. That said, the scale of an organization and its data is key to how much the process will matter, according to James Howard, a Maryland-based data science consultant who provides high-level data, policy, and economic consulting to federal agencies and similarly sophisticated clients, including, at one time, the Board of Governors of the Federal Reserve System. “If you’re a small bank, machine learning is going to help get you through the day, but it’s not going to be something critical—whereas if you’re doing high-frequency trading, all you’re doing is machine learning,” he says.
By automating how an organization makes decisions, artificial intelligence and deep analytics enable Amazon, for example, to tell an online customer what product might be of interest, Howard says. “[It] provides the tools that Facebook uses to say, ‘Maybe you should be friends with this person’ [and for job sites] to say, ‘This is a job we think might be right for you.’”
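A minimal sketch of the arithmetic behind such suggestions, assuming a toy purchase matrix and hypothetical products (nothing here reflects Amazon’s or Facebook’s actual systems): items are scored by how similar their purchase histories are, and the closest matches become the recommendation.

```python
# Toy item-to-item recommender: products bought by the same customers
# are treated as similar. All data here is hypothetical and illustrative.
import numpy as np

# Rows = customers, columns = products; 1 means the customer bought the product.
purchases = np.array([
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 1, 1],
    [1, 1, 0, 1],
])

def recommend(product: int, top_n: int = 2) -> list:
    """Rank other products by cosine similarity of their purchase columns."""
    vectors = purchases.T.astype(float)              # one vector per product
    target = vectors[product]
    norms = np.linalg.norm(vectors, axis=1) * np.linalg.norm(target)
    scores = vectors @ target / np.where(norms == 0, 1.0, norms)
    scores[product] = -1.0                           # never suggest the item itself
    return list(np.argsort(scores)[::-1][:top_n])

print(recommend(0))  # products most often co-purchased with product 0
```

Production systems add far more signal, such as browsing history, context, and learned embeddings, but the core step of turning past behavior into a ranked prediction is the same.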
“Outlook on Artificial Intelligence in the Enterprise 2016,” a Narrative Science survey of 235 business executives, found that 88 percent of organizations are already relying on artificial intelligence technologies in the workplace, and 58 percent of respondents are already using predictive analytics. Driving the increase is the proliferation of data.
Taking off in manufacturing, technology, and business
Manufacturers who use predictive analytics can pinpoint where and how much product they need to produce, thus avoiding costly and inefficient stockpiling, says Kevin Lyons, associate professor in the Department of Supply Chain Management/Supply Chain Archaeology Lab at Rutgers Business School. “With IoT’s predictive analytics capabilities, you can pinpoint what production is needed, where, and how much, in real time,” he explained in a recent article.
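As a rough illustration of what such a prediction looks like in practice, the sketch below fits a simple regression on hypothetical plant history to estimate how many units to produce next; real IoT pipelines ingest far more sensor data, and every column and number here is invented for the example.

```python
# Toy demand-forecasting sketch for production planning.
# All features and figures are hypothetical, purely for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

# Each row of history: [units sold last week, machine utilization %, open orders]
history_X = np.array([
    [1200, 78, 300],
    [1350, 82, 420],
    [ 900, 65, 150],
    [1500, 90, 510],
])
history_y = np.array([1250, 1400, 950, 1580])  # units actually needed the following week

model = LinearRegression().fit(history_X, history_y)

# Given today's readings, estimate how many units the plant should produce.
current = np.array([[1300, 85, 400]])
print(f"Suggested production: {model.predict(current)[0]:.0f} units")
```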
Giants in the tech world are leaning heavily on these tools as well: Google has used machine learning to trim power use by 15 percent at its already-efficient data centers. At its Durathon Battery factory in Schenectady, New York, General Electric uses Big Data, software, analytics, and 10,000 sensors to “correlate the variations in manufacturing and usage to product performance, accelerate learning, and, ultimately, provide the customer with a better experience.”
Twitter is deepening its use of artificial intelligence and analytics through Magic Pony Technology, a Twitter subsidiary, to “understand the features of imagery,” the company says. Pitney Bowes, a multinational shipping and mailing technology company, uses machine-learning algorithms to figure out the most efficient way for client firms to meet the shipping and tariff requirements of more than 200 countries and 5,000 classification systems.
Teaching machines to learn
Challenges abound in making it possible for machines to learn from data. The neural networks used in everyday speech translation, for example, struggle to comprehend cultural nuances and metaphors in human-to-human communication. This makes translating languages difficult and occasionally leads to humorous results such as “children sandwiches” and “wife cake.” However, Google’s Cloud Natural Language API is helping to decode meaning.
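For developers, tapping that kind of meaning extraction can take only a few lines. The snippet below is a sketch using the google-cloud-language Python client to pull entities and sentiment from a sentence; it assumes credentials are already configured, and the exact call style can vary between client-library versions.

```python
# Sketch: entity and sentiment extraction with the Cloud Natural Language API.
# Assumes the google-cloud-language package and configured credentials;
# the sample sentence is invented for illustration.
from google.cloud import language_v1

client = language_v1.LanguageServiceClient()
document = language_v1.Document(
    content="The halftime ads fell flat, but the wife cake was a pleasant surprise.",
    type_=language_v1.Document.Type.PLAIN_TEXT,
)

entities = client.analyze_entities(request={"document": document})
for entity in entities.entities:
    print(entity.name, round(entity.salience, 2))   # what the text is about

sentiment = client.analyze_sentiment(request={"document": document})
print("Overall sentiment:", sentiment.document_sentiment.score)
```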
Also, older data isn’t necessarily ready for artificial intelligence, says Howard. A couple of years back, Howard was exploring the impact of deep analytics on a Federal Communications Commission complaint. Someone had a problem with the advertising during the halftime show of a Super Bowl game. Howard requested relevant data, which the FCC sent in paper form. “It was a minimal-quality scan and then, on top of that, it looked like a fax,” Howard says. “We gave up. That’s when you need somebody who is a data scientist or a data consultant to come in and say, ‘OK, what you have here is a mess, and this is how we can clean it up, and this is what we can get out of it,'” he says.
Data scientist scarcity
If only there were enough data scientists. Personnel shortages are also creating a challenge, says John Lucker, Deloitte’s Global Advanced Analytics and Modeling Market leader. A widely cited McKinsey report projected that by 2018 there could be as many as 190,000 unfilled positions for people with deep analytical skills. “I think it’s important for business leaders to pragmatically consider what it might mean to use a result from such tools as AI without human guidance,” Lucker says. “Expertise is required to interpret, explain, and visualize machine learning results.”
As Youngjohns said in his Discover presentation, wider acceptance of machine learning will change the way business is done. He noted that the paradigm of creating a business process, then measuring it, is going to flip to a data-first model. “This is spawning a new paradigm … where you start with the data and then you use the analytics, and then you create the business processes on top of that,” he said. And the resulting augmentation of human decision processes has the potential to transform business, and the world, he added.
HPE’s approach to MLaaS lets organizations leverage the power of these capabilities without the complexity, intense talent requirements, or cost. For more, read about Haven OnDemand.