Who’s Afraid of the Big Bad Cloud: What is cloud and where did it come from?


Cloud computing has reached a tipping point in enterprise adoption, with the global market surpassing $200 billion and nearly 60% of enterprises deploying some form of cloud solution. The cloud offers scalability and cost savings that have driven rapidly accelerating adoption in recent years. Some studies anticipate that as much as 83% of all enterprise workloads will be in the cloud by 2020.

The rapidly increasing volume, variety, and velocity of data facing legal practitioners, coupled with the at-times debilitating cost of keeping pace with infrastructure demands, make the ediscovery industry ripe for the leap to the cloud. The time to adapt is now, and those who lag will be stuck playing catch-up.


So… what exactly is the cloud? 

Cloud computing, despite all the hype and confusion, is at its core a very simple resource-sharing model. The cloud is simply on-demand computing resources, generally storage and processing power.

Cloud providers offer enterprises a pay-as-you-go, consumption-based model. This eliminates the burden of each individual enterprise building its own data center or on-premises infrastructure, staffing it with system administrators and other expensive IT professionals, and worrying about the ever-looming threat of cyberattacks. Basically, a well-executed cloud-based IT program allows enterprises to focus more on the business of doing business and less on the headaches of IT management.
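To make the consumption model concrete, here is a minimal sketch of the billing arithmetic. All prices and usage figures are hypothetical, chosen only to illustrate how pay-as-you-go differs from a fixed on-premises spend:

```python
# Hypothetical figures for illustration only; real cloud pricing varies
# by provider, region, and instance type.

HOURLY_RATE = 0.10            # assumed cost per server-hour in the cloud
ON_PREM_ANNUAL_COST = 50_000  # assumed fixed yearly cost of owning hardware

def cloud_annual_cost(avg_servers: float, hours_per_year: int = 8760) -> float:
    """Pay-as-you-go: you are billed only for the server-hours you consume."""
    return avg_servers * hours_per_year * HOURLY_RATE

# An ediscovery shop that runs 20 servers steadily, year-round...
steady = cloud_annual_cost(avg_servers=20)          # 20 * 8760 * 0.10 = $17,520

# ...versus one that bursts to 200 servers, but only 10% of the year.
bursty = cloud_annual_cost(avg_servers=200 * 0.10)  # same average, same bill

print(f"Cloud, steady load:  ${steady:,.0f}/yr")
print(f"Cloud, bursty load:  ${bursty:,.0f}/yr")
print(f"On-prem (fixed):     ${ON_PREM_ANNUAL_COST:,.0f}/yr")
```

The bursty shop pays the same as the steady one because billing tracks consumption, whereas an on-premises data center must be sized, staffed, and paid for at peak capacity even when it sits idle.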


Birth of cloud

Although there is quite a bit of buzz around cloud computing today, the underlying concept dates back to the mid-1950s.

The earliest versions of computing relied on something called a mainframe: an inconceivably expensive (millions of dollars in the 1950s) and massive central computer. Since it was not a scalable model to have a mainframe for every corporation, let alone every employee, multiple users accessed this central computer through "dumb terminals." These terminals possessed no standalone processing power; their sole purpose was to provide access back to the mainframe.

The UNIVAC I Mainframe from the 1950s

Eventually, the cost and space savings of virtual memory, along with the advent of personal computers, displaced the mainframe model. But this was not the end of the shared-resource approach to computing.

J.C.R. Licklider conceptualized a global, decentralized computer network he dubbed the "Intergalactic Computer Network." Licklider's expansive vision of interconnected communication and interaction with computers drove the development of the Advanced Research Projects Agency Network (ARPANET), the predecessor to the internet.

Despite the limits of its nascent technology, ARPANET allowed globally dispersed researchers to connect to the small number of available mainframes and supercomputers, and to each other. The system had the additional benefit of increased security, because no single compromised node would destabilize the whole ecosystem.

ARPANET started with four nodes and ballooned to hundreds over the decades that followed. But with the advent and adoption of the internet, it faded into obscurity and was decommissioned in 1990.

A map of ARPANET from 1973 (Wikimedia Commons)


Virtual renaissance 

The advent of the "virtual machine" (VM) operating system, introduced by IBM in the early '70s, changed the game completely. It allowed multiple complete (virtual) computers to operate simultaneously on the same piece of hardware. This effectively took the time-sharing model of the mainframe and elevated it to a shared-hardware model.

Telecom providers upped the ante by creating virtual private networks (VPNs): instead of building more infrastructure to support each individual user, they could drastically scale shared infrastructure and provide users fractional access. This shared-infrastructure approach allowed for rapid innovation in the computing space, from parallel and utility computing to software as a service (SaaS) and, eventually, the modern understanding of cloud computing.

The largest push in the adoption of cloud computing as a viable model for enterprise came from the entrance of Amazon. Its Elastic Compute Cloud (EC2) offering allowed companies and individuals to rent virtual computers on which they could run their own programs and applications. This gave large organizations on-demand access to computing infrastructure, especially data storage and processing power, without having to manage the cost, data security, or personnel necessary to support traditional on-premises IT infrastructure.
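For a sense of what "renting a virtual computer" looks like in practice today, here is a minimal sketch using boto3, the AWS SDK for Python. This is an illustration, not part of the original EC2 story: it assumes AWS credentials are already configured, and the AMI ID below is a placeholder rather than a real machine image.

```python
import boto3

# A minimal sketch of renting on-demand compute from EC2.
# Assumes AWS credentials and permissions are already configured.
ec2 = boto3.client("ec2", region_name="us-east-1")

# Ask EC2 for one small virtual machine, billed by usage.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI ID
    InstanceType="t3.micro",          # a small, inexpensive instance type
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched instance {instance_id}")

# When the work is done, release the hardware and stop paying for it.
ec2.terminate_instances(InstanceIds=[instance_id])
```

The entire lifecycle, from provisioning to decommissioning, happens in seconds through an API call rather than through a hardware purchase order, which is precisely what made the model so disruptive.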

The entrance of Google, Microsoft, OpenStack (a NASA and Rackspace collaboration), and countless other major players in the cloud computing space solidified the importance of this model in the continued evolution of the business of doing business.


The tipping point of cloud adoption

Today, upwards of 90% of consumers of digital content engage with the cloud, even if they do not know it. Nearly every application or web-based tool accessed via the internet is cloud-enabled. The elasticity and scalability of the cloud, without the exponential cost of building infrastructure, has led to rapidly increasing rates of adoption.

Nearly 60% of North American enterprises now rely on public cloud platforms, five times more than just five years ago. Gartner predicts that by 2025, 80% of organizations will have migrated away from on-premises data centers toward colocation, hosting, and the cloud. Cloud is the new normal.

In the next segment, we will discuss the ways organizations engage with the cloud and the types of cloud options available to migrating enterprises.
