No matter how much has been written about edge computing, or how many presentations and panels have been devoted to it, the definition of edge is still a talking point. That’s where we started during the Data Center Austin Conference (DCAC), where I hosted a discussion with Ty Schmitt, VP and Fellow of Extreme Scale Infrastructure, Dell EMC, and Sharif Fotouh, Founder & CEO, Compass EdgePoint.
Why can’t we agree on what edge means? While there’s value to speaking in common terms, Schmitt thinks we shouldn’t try to define it. Though it’s human nature to want to put parameters and classifications around technology to solve a specific problem, in this case he believes the open-ended nature of edge is what makes it full of possibilities for the data center market.
“The lack of definition or consistency comes with massive opportunity,” he explained. “We should embrace the unknowns. That’s where we find opportunity galore. If edge was a constant, we wouldn’t be talking about it, and there’d be no industry competition.” He goes on to say that edge is not so much about what it is, but what it does. “Edge will be characterized by the usage model,” according to Schmitt.
Fotouh points out how edge has changed from mainly regional data centers 15 years ago (located in big cities on either coast) to the more recent 100 kW edge deployments popping up in Tier 2 markets. “Over the last year, edge has transformed into a new form factor,” he said. “It can be a shipping container, a box going to a cell tower, or an industrial complex. Edge is a spectrum that goes right down to the 3-watt devices in our pockets.”
Edge’s Trajectory: A Forcing Function in the Data Center Market
The edge is a convergence of many different spaces – network, infrastructure, IT, hyperscale, security. Each has value on its own, but they must be taken together for edge to be successful. Fotouh compares the state of edge to the early days of the internet, when, as the technology emerged, no players wanted to show their cards. Today, the internet is neutral and open. The same paradigm is needed for the future of edge.
Schmitt agreed. “Traditional data center elements were addressed discretely, but edge is forcing IT and infrastructure to come together. Edge must be solved upfront, so it’s bringing functions that historically did not work together to the table at once. Ultimately, this collaboration will drive greater efficiency.”
5G will also be a bit of a “forcing function,” Fotouh said. With the aggregation of fiber and increase in super cell towers, downstream points must be equally considered. Schmitt envisions both public and private edge applications with private driving the most deployments.
The Cloud’s Likely Influence on Edge Computing’s Future
Cities will get smarter, cars more autonomous, and the most data-driven industries of today (such as manufacturing and healthcare) will look at their current colocation models through a new lens of on-premises edge solutions, according to Fotouh.
“If we look back 5 years from now and the edge has not manifested itself to take advantage of the spans of locations to provide a level of resiliency not driven down to the individual site – it will be a huge miss,” Schmitt said. “The key is to not have to individually create a hardware level of resiliency – all the thousands of edge sites do not have to be independently redundant. That will play into liability and cost and will drive attractiveness and value for customers. If not done, adoption will be more difficult and prolonged.”
He wonders whether the cloud will be a great enabler of edge and usher in the next cycle of massive centralization. That’s just one of the unknowns in edge computing, which raises the question: What inning are we in? Fotouh said we’re still in line getting hot dogs. “The game has not started yet.”
“We are in the first quarter of a football game,” Schmitt joked. “We don’t really know what sport we’re playing yet.” While the edge computing game may just be getting started, we all agree that we want tickets. Advance your learning about edge computing with Schneider Electric’s edge solutions.