Opportunities for Growth – Artificial Intelligence at Scale

Meredith Kaplan | State Street Corporation

October 11, 2017

Everywhere we look, Big Data is there for the sampling.

But what makes it useful are the algorithms that can process vast amounts of data (more than any human could ever consume), and tease out nuanced biases, patterns and trends that are only visible at scale. In an era dominated by personalization, does the secret to growth actually lie in collecting so much data about everything that we can uncover patterns never before seen?

There are two very different ways to scale artificial intelligence. One is to tap into the power of a broader network to overcome the limitations of individual sample size or processing capacity and see larger patterns. The other is to embrace the granular – capturing and intelligently using the valuable data points that exist immediately around you. It’s almost like using a telescope vs. a microscope: one shows the universe beyond us and the other shows the universe within us.

Making Small Big

For researchers trying to understand how animal species migrate as they search for food, mates, shelter and more, tracking one animal over the course of its lifetime was nigh impossible. Even if they could track a few animals, the small sample sizes made it difficult to statistically validate their behavioral theories. Yet the introduction of satellite-linked tags in the late 1970s changed everything. As satellite tagging has become more widespread (and cost-effective), small sample sizes can now be overcome by pooling research data across multiple studies, creating larger datasets that are statistically reliable. That data can then be mined to identify collective movement patterns across the entire species, within a specific population, and even at the individual level. Because researchers now know why the species does what it does, they can better judge whether an individual animal’s behavior is normal or not. Personalization is then found through global patterns.
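
To make that pooling step concrete, here is a minimal sketch in Python – with invented study tables and column names, not real tracking data – of how records from several small tagging studies could be combined into one dataset, a population-level pattern computed, and each individual compared against it:

```python
# Illustrative only: hypothetical satellite-tag records from two small studies,
# pooled into one dataset so population-level patterns become measurable.
import pandas as pd

# One row per tagged animal per month (all values invented).
study_a = pd.DataFrame({
    "animal_id": ["A1", "A2"], "month": [3, 3], "km_travelled": [412.0, 388.5]})
study_b = pd.DataFrame({
    "animal_id": ["B7", "B9"], "month": [3, 3], "km_travelled": [405.2, 990.0]})

# Pool the small samples into a single, larger dataset.
pooled = pd.concat([study_a, study_b], ignore_index=True)

# Population-level pattern: typical distance travelled in a given month.
monthly = pooled.groupby("month")["km_travelled"].agg(["mean", "std"])

# How far does each individual sit from the pooled monthly norm?
merged = pooled.join(monthly, on="month")
merged["z_score"] = (merged["km_travelled"] - merged["mean"]) / merged["std"]
print(merged[["animal_id", "km_travelled", "z_score"]])
```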

And as the New York Times reported1, IBM has spent billions purchasing analytics companies to serve its Big Data initiative. Its goal was to find "commonalities and overlapping interests" in a broader search for patterns that can be applied to multiple areas of study that may appear to have nothing in common. By collecting and comparing smaller sets of disparate data, IBM has found universally applicable configurations that can be used to explain hard-to-quantify issues. For instance, "students of the statistical computing language known as R have used methods of counting algae blooms to prove patterns of genocide against native peoples in Central America." By making small data sets bigger through a broader network of data, larger truths are unveiled.

Mining Every Last Point

A racing boat’s sensor array can measure over 300 variables on the boat – all in an attempt to shave microseconds off race times. The weight of every rope, the tautness of every sail, the distance between every bolt – all of it is measured and tracked and analyzed. But the biggest unknown, and the biggest performance factor, is the boat’s power source: the wind. It’s arguably the most important factor in a race, yet almost impossible to measure at the level of detail a racing team would want. Unlike a racecar engine that has been fine-tuned for precise power output, the wind comes and goes as it pleases, changing in speed and angle, varying at different heights, interplaying with the textures of the boat and more. The changes can be so minute (and almost never exactly the same) that measuring them with standard tools is impossible. But with Big Data and machine learning, algorithms can sort through the constantly changing data and uncover performance patterns at any given point with any set of variables, recognizing what really matters to performance.
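
As a rough illustration of that last step, here is a sketch – using synthetic numbers and invented sensor names, not real racing telemetry, and a generic model chosen only for illustration – of how a learner could rank which measured variables actually drive boat speed:

```python
# Illustrative only: fit a model to synthetic "sensor" readings and let it
# rank which variables matter most to a toy boat-speed signal.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 5_000
sensors = {
    "wind_speed":   rng.normal(15, 4, n),    # knots
    "wind_angle":   rng.uniform(0, 180, n),  # degrees off the bow
    "sail_tension": rng.normal(50, 10, n),
    "rope_weight":  rng.normal(2, 0.1, n),
}
X = np.column_stack(list(sensors.values()))

# Toy "boat speed": mostly wind, a little sail tension, plus noise.
y = 0.6 * sensors["wind_speed"] + 0.05 * sensors["sail_tension"] + rng.normal(0, 1, n)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Rank the variables by how much each contributed to the model's predictions.
for name, importance in sorted(
        zip(sensors, model.feature_importances_), key=lambda p: -p[1]):
    print(f"{name:>12s}: {importance:.2f}")
```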

In a sport where so many elements cannot be anticipated by the participants, measuring, tracking and mining every last detail gives us a way to understand the unknown and uncontrollable. We may not be able to change the speed of the wind, but we can learn how to adjust everything else for maximum performance.

At both ends of the scale, AI helps previously unseen patterns become clear. Those patterns are then turned into algorithms, which can be applied across wildly different scenarios. For instance, a farm's irrigation system, Beijing's traffic patterns and the electrical wiring in your phone all have an optimal "flow," so disparate companies may be able to use the same algorithm to address three very different systems. But the only way to find those patterns is to either scale up or get so granular that no data point is left behind.
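
As a toy example of one algorithm serving very different systems, here is a sketch – with invented networks, node names and capacities – that asks the same maximum-flow question of an irrigation layout and a road network:

```python
# Illustrative only: the same max-flow routine applied to two unrelated systems.
import networkx as nx

def optimal_flow(edges, source, sink):
    """Run the same max-flow algorithm regardless of what the network models."""
    graph = nx.DiGraph()
    graph.add_weighted_edges_from(edges, weight="capacity")
    value, _ = nx.maximum_flow(graph, source, sink, capacity="capacity")
    return value

# A made-up irrigation network (liters per minute).
irrigation = [("well", "pump", 40), ("pump", "field_a", 25), ("pump", "field_b", 20),
              ("field_a", "drain", 25), ("field_b", "drain", 20)]

# A made-up road network (vehicles per hour).
traffic = [("suburb", "ring_road", 900), ("ring_road", "bridge", 600),
           ("ring_road", "tunnel", 450), ("bridge", "downtown", 600),
           ("tunnel", "downtown", 450)]

print("irrigation capacity:", optimal_flow(irrigation, "well", "drain"))
print("traffic capacity:   ", optimal_flow(traffic, "suburb", "downtown"))
```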

There is real knowledge to be gleaned from Big Data sources. Our latest research on the financial industry underscores how investment institutions will need to choose the right kind of scale in their approach to machine learning, data collection and analysis, and asset intelligence if they are to succeed. The more data you collect and manage, the more likely you are to uncover new truths that give you an edge.

1. Hardy, Q. (2012, February 15). I.B.M.: Big Data, Bigger Patterns. Retrieved October 03, 2017, from https://bits.blogs.nytimes.com/2012/02/15/i-b-m-big-data-bigger-patterns/

CORP-3304

Topics: Fintech


Meredith Kaplan | State Street Corporation

Meredith Kaplan leads industry insights for State Street’s Global Marketing division. She oversees the creation and execution of institutional market surveys to support State Street’s lines of business and thought leadership campaigns, and conducts a range of industry and competitive analysis. Meredith is a classically trained flutist and aspires to write a novel set in Depression-era New England.