Today we are in the early stages of a transition in how we deploy, manage, and use computer systems. Like earlier transitions, such as the emergence of cloud and mobile computing, it’s first visible to those who are paying close attention or are working on it themselves.
We are witnessing the very beginning of a shift toward edge computing, a term that simply means storing and processing information close to where it’s used, on machines that belong to the end users.
Edge computing isn’t new. It was the dominant paradigm from the late 1970s until the mid-2000s. Back then it was called personal computing. A computer was something you kept under your desk or perhaps in your backpack, and it was where you stored all your data and ran your software.
Beginning in the late 1990s and reaching a tipping point around 2010, the PC era was brought to a close by the advent of ubiquitous Internet connections that made accessing larger remote systems fast and easy. The industry attached the term “cloud” to these invisible remote systems and the age of cloud computing was born.
The move toward cloud computing was driven by many advantages. Software in the cloud could run all the time and be accessed from anywhere. Invisible employees of cloud software and hosting companies magically managed IT resources and handled software updates behind the scenes. Huge amounts of data could be leveraged, and economies of scale allowed short bursts of massive processing power to take the place of long hours spent waiting for a personal computer to crunch data. Multiple users could collaborate on documents and data sets from anywhere.
The simultaneous emergence of mobile computing closed the cloud computing deal. Mobile phones and tablets are extremely portable and convenient, but their power is limited by size and battery life. Using these small devices as “thin clients” to access the cloud proved a logical way to expand their power without increasing their cost, and storing data elsewhere is particularly attractive when the local machine is something that can be stepped on or dropped in a toilet.
Cloud computing looked new to many, but it was really the return of something old and familiar. Prior to the PC era, a computer was a huge, expensive machine called a mainframe, which at minimum was the size of a room. Mainframes were managed by teams of technicians and accessed by way of desktop terminals that didn’t do much more than take input from a keyboard and show output on a screen.
Just replace mainframe with cloud, and terminal with mobile device. The cloud era, it turns out, is simply “mainframe computing 2.0.”
The history of computing is full of cycles. Operating systems, languages, and applications start simple, become complex, and then are rewritten to make them simple again. Computing functionality is moved from general purpose CPUs to specialized hardware and then back again. Platforms and software stacks oscillate between closed and open, bundled and unbundled, monolithic and modular.
To a casual observer it looks like an exercise in futility, but each iteration is a learning experience. When software goes from simple to complex to simple again, that newfound simplicity incorporates a new level of understanding of the problem domain. When platforms and software stacks bundle and unbundle, each turn of the wheel brings new capabilities arranged in new and usually more useful ways.
We think the move to edge is another one of those cycles. This new shift is being driven by many factors. Broadband Internet advances like the increasing deployment of fiber optics, 5G, and low-Earth-orbit satellites will conspire to close the network performance and reliability gap between the edge and the cloud. Dozens of CPU cores and terabytes of fast local storage are becoming easily affordable. Advances in distributed software engineering techniques like automated consensus and orchestration, continuous integration, intelligent peer-to-peer data replication, blockchains, and network virtualization are making it possible to realize cloud-like automation and fault tolerance everywhere. Meanwhile, concerns about privacy, security, cost, and data sovereignty are driving more and more users to reconsider the wisdom of storing and processing their most sensitive information in systems they do not control.
Here at Anorak Ventures we pride ourselves on investing in those rare and transformational ventures that will help to build the world of tomorrow. We’ve been watching this shift and looking for the companies best positioned to play a pivotal role in facilitating it.
The cloud may be conceptually similar to the mainframe, but it looks nothing like it, and its capabilities are vastly greater. We likewise do not believe this new era of edge computing is going to look or act like the PCs of old. It’s not going to take the form of beige boxes under desks, but of networks of directly connected personal computing devices. Systems located at remote cloud data centers won’t go away, but they will participate by offering specialized services rather than serving as the central point where everything occurs. The new era of edge computing will be the cloud, but everywhere, owned by everyone, and built on open-source protocols.
Old networking technologies like physical routers, firewalls, and VPN boxes are neither flexible nor ubiquitous enough to power this new world. We need new networking technologies that make it easy to connect anything to everything securely and with the performance and scalability only peer-to-peer networking can deliver. That’s why we’re announcing our investment in ZeroTier, a company founded to build the network layer for the second age of personal computing.
ZeroTier’s mission is to “directly connect the world’s devices” and in so doing to functionally erase the distinction between the cloud and the edge. ZeroTier’s software makes it possible to easily manage networks as if the entire planet is one cloud “region.” PCs, phones, tablets, special-purpose IoT devices, cloud virtual machines, and even discrete applications can now be joined directly as if they’re all plugged into the same Ethernet switch, even if they’re mobile or scattered across the world. ZeroTier’s open-source protocol has been around since 2011 and has seen strong adoption in recent years. Learn more at ZeroTier.com and get in touch if you’re interested in getting involved. They’re hiring.