Imagine what the world of work would have been like if the pandemic had begun in 2010 instead of 2020. To put it bluntly, we all would have been sent home and very little would have gotten done, or at least nothing of the caliber we were used to. Video conferencing, notably, would have been limited and rudimentary, because platforms like Zoom, Microsoft Teams, and Google Meet all debuted in the 20-teens. The connectivity and collaboration tools we enjoy today, and often take for granted, simply did not exist.
In 2020, people continued interacting on any number of software applications while working from home (or, if they were lucky, from beachfront hammocks) in disparate locations across the globe. This, too, would have been impossible a decade earlier. The widespread availability of high-bandwidth connectivity enabled a self-serve digital transformation and empowered individuals and enterprises to permanently redefine what a work environment is.
When looking at what systems enable productivity across every industry, it’s hard to imagine a more globally impactful force than connectivity itself. Connectivity refers, broadly, to the exchange of data among different parties. At the broadest scale, data are shared by individuals, enterprises, networks, clouds, and other digital platforms.
This exchange, like the internet itself or the smartphone, is akin to magic for the vast majority of users. Even to those with a general understanding of the underlying digital infrastructure, the mechanics can seem like technical wizardry: data transmitted over fiber or a wireless carrier, then handed off from one digital platform to another at a peering exchange or through a direct interconnection.
To demystify what’s happening behind the curtain slightly: A network provider might interconnect with a streaming content provider at a neutral interconnection facility by cross-connecting their edge routers via fiber. These so-called gateways, or on-ramps, are the entry point to the internet backbone that ultimately delivers content to end users. They function much like a home or office modem/router setup, only at vastly higher speeds. This is how organizations miles and miles apart communicate in a fraction of a second.
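One way to make the benefit of that direct cross-connect concrete is to compare it with sending the same traffic across several intermediate networks. The sketch below is a toy model only; the hop counts, per-hop delays, and distances are hypothetical figures chosen for illustration, not measurements from any real network:

```python
# Toy model of why a direct cross-connect at an interconnection facility
# cuts latency: fewer routers (hops) between the two networks means less
# per-hop processing and queuing delay. All figures are illustrative.

PER_HOP_DELAY_MS = 0.5          # assumed processing/queuing delay per router
PROPAGATION_MS_PER_KM = 0.005   # light in fiber covers ~1 km in ~5 microseconds

def path_latency_ms(hops: int, distance_km: float) -> float:
    """One-way latency for a path with a given hop count and fiber distance."""
    return hops * PER_HOP_DELAY_MS + distance_km * PROPAGATION_MS_PER_KM

# Direct cross-connect: both parties' edge routers sit in the same building.
direct = path_latency_ms(hops=2, distance_km=0.1)

# Transit path: the same traffic traverses several intermediate networks.
transit = path_latency_ms(hops=12, distance_km=800)

print(f"direct cross-connect:  {direct:.2f} ms")
print(f"multi-network transit: {transit:.2f} ms")
```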
If you look at the current state of research in academic institutions, hospitals, and labs, as well as much of software and other technological development, you’ll notice its foundation is open-source collaboration. To accelerate innovation, institutions and creators rely on the ability to interoperate so that, say, someone in India can update a file posted by someone in the United States and then flag someone in Brazil for peer review. This same iterative, distributed process underpins the machine learning work that serves as the foundation of modern AI.
Genomic sequencing and weather modeling are other areas that require the constant exchange of data. Weather models are complex: they’re produced by supercomputers making quadrillions of calculations per second, a rate measured in petaflops. The European model and the American model, which we typically hear about in the news when a major weather event is approaching, share the data from these calculations on an ongoing basis. This kind of open exchange of meteorological data allows us to predict the weather with far more accuracy, and to keep people safer as a result.
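For a sense of scale, a petaflop is one quadrillion floating-point operations per second. The short sketch below works out what that means over a day and illustrates, in deliberately simplified form, the idea of blending two models’ output; the forecast numbers are made up, and real ensemble forecasting is far more sophisticated:

```python
# One petaflop is one quadrillion (10**15) floating-point operations per second.
PETAFLOP = 10 ** 15

# A machine sustaining 100 petaflops performs this many operations in a day.
ops_per_day = 100 * PETAFLOP * 60 * 60 * 24
print(f"{ops_per_day:.2e} operations per day")

# Illustrative only: blending two hypothetical model outputs (say, a European
# run and an American run) into a simple consensus forecast.
european_temps_c = [18.2, 19.1, 17.6]   # made-up 3-day forecast
american_temps_c = [17.8, 19.5, 18.0]   # made-up 3-day forecast

consensus = [(e + a) / 2 for e, a in zip(european_temps_c, american_temps_c)]
print("consensus forecast (°C):", consensus)
```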
These types of collaborative interactions take place thousands of times a day all over the world and are emblematic of the tremendous increase in joint and team ventures that connectivity facilitates. Effectively, we have eliminated barriers of distance or country of origin, enabling the entire planet to converge and work together on ever more ambitious projects.
Today, data make up the raw material for any decision made at the corporate level, and many made at the individual level. Data drive our products, our services, our productivity, and our judgments about value. Sophisticated algorithms help companies do everything from suggesting songs for a Spotify playlist to compiling a quarterly report on inflation, a report that, the moment it is released to the public, will be processed by thousands of financial algorithms worldwide to inform investment decisions. Data are the underlying “currency” of almost every industry, and conducting business without data is nearly impossible. This will only become more pronounced with the advent and proliferation of AI.
Data centers have had to evolve to support the massive and growing volume of data that powers our digital existence. Outside, they’re still the same nondescript brick buildings they were 15 years ago. (They really are. Try finding a data center with a beautiful architectural façade!) But inside, the servers, switches, and routers have become a lot denser and heavier. That equipment calls for far more hardened buildings because it draws much more power and needs much more cooling. All of this necessitates more capital. In essence, as data currency has become more valuable and the amount of data being stored and processed has grown exponentially, the infrastructure necessary to facilitate data processing and exchange has become increasingly capital-intensive.
However, there is a solution that makes the business case for data transport. Companies are realizing that data don’t need to reside physically at the enterprise itself; they’re moving to third-party colocation data centers and using the cloud in a so-called hybrid IT architecture. Sharing electrical and cooling infrastructure with other tenants enables significantly cheaper storage, much higher throughput, and collaboration between departments or regions of an organization through physical and virtual networking. This, too, facilitates innovation by allowing companies to work at a much larger scale, faster and more cheaply than they could with an on-premises data center alone.
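As one concrete illustration of the hybrid pattern, an enterprise might process data on its own or colocated hardware and push archival copies to cloud object storage. The sketch below assumes the widely used boto3 SDK for Amazon S3; the bucket name and file paths are hypothetical:

```python
# Minimal sketch of one hybrid IT pattern: process data locally (on-prem or
# in a colocation cage), then archive results to cloud object storage.
# Assumes the boto3 AWS SDK; bucket and file names below are hypothetical.
import gzip
import shutil

import boto3

LOCAL_RESULTS = "daily_results.csv"       # produced by local systems
ARCHIVE_FILE = "daily_results.csv.gz"
BUCKET = "example-enterprise-archive"     # hypothetical bucket name

# Compress locally before shipping, to cut transfer and storage costs.
with open(LOCAL_RESULTS, "rb") as src, gzip.open(ARCHIVE_FILE, "wb") as dst:
    shutil.copyfileobj(src, dst)

# Push the compressed archive to the cloud tier of the hybrid architecture.
s3 = boto3.client("s3")
s3.upload_file(ARCHIVE_FILE, BUCKET, f"archive/{ARCHIVE_FILE}")
```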
Imagine an interconnected ecosystem of enterprises, networks, and clouds. If you’re a company using a colocation data center, you may be able to connect directly with a financial exchange, putting its data at your fingertips in real time. You may also benefit from connecting directly with a content delivery network, so your digital content can be streamed globally with low latency and high quality of service. A highly integrated data center is a hub that reduces latency because you don’t have to shuttle data back and forth from your own production facility to a data center to a third party. Basically, you can take the hypotenuse of the triangle instead of the two legs.
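That hypotenuse remark is just geometry applied to fiber routes, and the savings are easy to put numbers on. Here is a rough back-of-the-envelope sketch; the 100 km legs are hypothetical distances, and light in fiber travels at roughly 200,000 km per second:

```python
import math

# Light in fiber covers roughly 200 km per millisecond (about two-thirds of c).
FIBER_KM_PER_MS = 200.0

# Hypothetical geometry: your facility, an intermediate site, and a third
# party sit at the corners of a right triangle with 100 km legs.
leg_km = 100.0
two_legs_km = 2 * leg_km                      # route via the intermediate site
hypotenuse_km = math.hypot(leg_km, leg_km)    # direct interconnection

print(f"via two legs:   {two_legs_km / FIBER_KM_PER_MS:.2f} ms one way")
print(f"via hypotenuse: {hypotenuse_km / FIBER_KM_PER_MS:.2f} ms one way")
# Roughly 1.00 ms versus 0.71 ms: a meaningful saving when trades or video
# frames are decided in fractions of a millisecond.
```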
This leap to colocation data centers requires a degree of trust on the part of enterprises and a belief in the ability to safely communicate their data and intellectual property. Today, there are numerous authentication mechanisms for data exchanges. For instance, sending money over the internet is safer than ever. Could some of that security be subverted with the advent of AI? Possibly, so we have some work to do as an industry in that area. But, overall, there are many strong mechanisms in place to authenticate and encrypt infrastructure communication, so the networks are extremely secure.
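Much of that authentication and encryption rests on standard machinery such as TLS with certificate verification. As a minimal illustration, and not any particular provider’s setup, here is what establishing an encrypted, authenticated connection looks like with Python’s standard ssl module; the host name is a placeholder:

```python
# Minimal sketch: open an encrypted, authenticated TLS connection.
# Python's default SSL context verifies the server's certificate chain
# and host name before any application data is exchanged.
import socket
import ssl

HOST = "example.com"   # placeholder host, used purely for illustration
PORT = 443

context = ssl.create_default_context()  # loads trusted CA roots, enables checks

with socket.create_connection((HOST, PORT)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
        print("negotiated protocol:", tls_sock.version())
        print("peer certificate subject:", tls_sock.getpeercert()["subject"])
```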
One of the pressing questions of the day for data scientists, IT professionals, and enterprise leaders is how to balance the need to secure and safeguard data with the imperative to leverage and share it to drive decision-making and collaborative work. Advances in connectivity are a big piece of that puzzle, and even coming off the back of the pandemic, the conditions are in place for unprecedented innovation on a global scale.
Note: This article was previously published by the Forbes Technology Council.