10 years of Cloud Computing in a nutshell

March 15, 2021

In two weeks’ time I will have been with Red Hat for eight years. Even before joining Red Hat, I had been deeply involved in all things cloud at Sun Microsystems for many years. So I thought now might be a good time to reflect on what has happened in Cloud Computing over the last decade.

Who is this cloud guy?

There is a famous movie in Germany which has the line: “Jetzt stell’n wir uns ma’ ganz dumm un’ frag’n uns: Watt is’n en Dampfmaschien” (note the Rhineland dialect; translated, this reads: “Now let’s pretend to be really dumb and ask ourselves: What is a steam engine?”).

So, let’s start with: What is a cloud?

Someone once said that a cloud is nothing but someone else’s computer(s).

Someone else once coined a famous tagline: “The Network is the Computer”.

Put into context, these lines make sense, and together they capture what cloud computing really is and means today.

So I guess there is no need to describe what cloud computing means today.

So, how did I become cloudy and, one could even say, a Cloud Solution Architect?

I started in IT at the age of around 14 (some 42 years back now), playing with my friends’ Z80-based systems. I have never been much into coding or application development, although there may be code from me that still runs in some parts of the airport in Munich. I was, and still am, more of an administrator type for large infrastructures. That is why Cloud Computing has always been more interesting to me from the “how to build and run” angle than from the “how to use” angle. For me it developed naturally out of the combination of hosting, virtualisation, scaling, and automated rollout into what is now known as Cloud Computing. Still, it was very interesting to watch how the large clouds started and developed into what they are today.

The early beginnings of hyperscalers

For me, the last decade started when Oracle bought Sun. At Sun, and during the two years I still worked there under Oracle’s ownership, I mainly focused on data center topics. I was a long-time so-called data center ambassador (one of 70 people at Sun worldwide; there were 6 Ambassador Programs, making it between 400 and 450 Ambassadors in total) and even became a Principal Field Technologist (also a group of approx. 70 people), one level below a Distinguished Engineer at Sun. As a data center ambassador my main topics were SunCluster and everything around the “N1” acronym, which at Sun stood for everything you need to create, manage, and run (aka: automate) infrastructure. “N1” meant: “Manage N systems as if they were just 1”. “N1”, so to say, was Sun’s idea for managing the infrastructures that offer cloud computing. Sun even had a Cloud-DC ready to go into service, but Oracle (we all remember: Larry famously called clouds “water vapor”) decided at the time of the Sun acquisition that it didn’t want to pursue that angle.

Me in 2004 talking about N1 at a Sun Conference

So I was curious to see and follow how the first big cloud providers started.

Amazon

We all know what AWS stands for today and what it offers. For me, the important thing to note and keep in mind is how it got there. AWS decided it might be a good idea to take the virtualisation approach that VMware targeted on premises with vCenter to a new level. AWS started with what we came to call Infrastructure as a Service (IaaS) and over time added services on top of that.
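
The essence of IaaS is that a virtual machine becomes something you request through an API call instead of a purchase order. As a minimal sketch, here is AWS’s boto3 SDK for Python launching an instance; the AMI ID and region below are illustrative placeholders, not real values:

```python
import boto3

# IaaS in a nutshell: a virtual machine is one API call away.
ec2 = boto3.client("ec2", region_name="eu-central-1")  # example region

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI ID, not a real image
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)
print("Launched:", response["Instances"][0]["InstanceId"])
```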

Google

Google, on the other hand, decided early on to target a different market than AWS and created what we came to call PaaS (aka: Platform as a Service). Here Google offered components for creating more complex services, taking that burden away from the developer.

So what?

I perceived Amazon as far more successful with its approach, as it covered a much larger share of the use cases that were understood in those days. Administrators knew how to install an OS, how to manage an OS, and how to install COTS (Commercial Off-the-Shelf) software into their managed OS instances.

Google did not find as many developers willing to create new services out of its service components, so over time Google also started to offer IaaS in its cloud.

Over time, Amazon also added PaaS components to augment its offerings, and nowadays both large clouds offer similar capabilities, with AWS still the clear leader from a commercial perspective.

Microsoft

Microsoft was late to the game with its Azure cloud offering. But this is typical for Microsoft: they often let others pave the way, watch, and then decide how to enter the slightly paved field, thereby avoiding the errors and pitfalls that early adopters have already made or fallen into (one small example: Internet Explorer). Microsoft had the advantage of a huge ecosystem of use cases, users, and developers, and as late as they were in creating hypervisor-based virtualisation (aka: Hyper-V), they were able to create an IaaS + PaaS offering successfully almost overnight. And as they already had all the components on premises (IaaS: Hyper-V, PaaS: .NET), bringing these into an online cloud service offering was a huge success, which we can see today in Azure being the second-largest cloud after Amazon’s.

IT service providers needed to become “cloud-like”

So, when I moved from Sun to Red Hat, that was the stage on which I had to play.

First and foremost, there are other questions to look at, like:

  • What does Open Source have to do with cloud computing?
  • Why should a vendor of Enterprise Open Source be interested in cloud computing?

To answer these questions, let me try to describe what I did eight years ago and what I do today and, while strolling nonchalantly alongside history, give some insights into this problem space.

After a decade of public cloud offerings like the ones described above, which themselves are largely built on open source components, enterprises also wanted to get “into the cloud”. For many, the goal was even less precisely defined; it sounded more like “We need to do something with cloud!”. That led to a couple of different problem spaces that needed to be addressed a decade ago (yes, cloud computing is already more than 20 years old!). These are:

  • Private Cloud
  • Cloud-Readiness

Private cloud was born out of the idea that things available in the public cloud also need to be available in one’s own data center, and that, due to privacy or control concerns, not everything could or should be put into a public cloud. Today we know this well; a decade ago, these aspects were less clear and less clearly articulated. Some enterprises simply wanted to show off and boast to their customers that they had a large data center with many blinking LEDs behind walls of glass.

Another aspect that seeped into consciousness was the fact that, with cloud computing, control over internal IT diminished, and things like “shadow IT” happened more and more often. This was a consequence of the fact that the idea of “cloud computing” also introduced the “as a service” mindset into many people’s expectations, and therefore forced a more agile and, more specifically, quicker execution of procedures that in the past had been governed by meters and meters of ITIL-conformant documentation, including many very manual steps.

To stay relevant as an internal IT service provider, the need arose to become “cloud-like” as well. This led to the two aspects above: private cloud and cloud-readiness.

You needed to become cloud-ready in order to offer cloud-like services. You needed to have products that allowed you to build an internal (aka: private) cloud.

OpenStack

OpenStack was born out of the need for something capable of countering AWS in one’s own data center. In good Open Source tradition, a couple of companies and enthusiasts got together and started building this solution. It is still one of the largest Open Source projects, has been implemented in many companies’ data centers, and is the de-facto standard today for most of the telecom industry worldwide. It was a long way to get there, not without hurdles (we still remember the “pets vs. cattle” discussions), and it has not been adopted in all industries at the same level (not all workloads can easily be “cloudified”). The fact that Microsoft calls its on-premises version of Azure “Azure Stack” gives a hint of how important OpenStack is.

OpenStack solved one small aspect, namely the on-premises IaaS cloud product.
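
To make “IaaS in one’s own data center” concrete, here is a minimal sketch using the openstacksdk Python library; it assumes credentials for a cloud named “mycloud” are configured in a local clouds.yaml file:

```python
import openstack

# Connect to the private cloud; "mycloud" must match an entry in clouds.yaml.
conn = openstack.connect(cloud="mycloud")

# The same self-service primitives the public clouds offer, now on premises:
for server in conn.compute.servers():
    print(server.name, server.status)

for flavor in conn.compute.flavors():
    print(flavor.name, flavor.vcpus, flavor.ram)
```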

Still missing from the product portfolio was an on-premises PaaS solution. It was partially targeted by a project inside OpenStack called Magnum, but that never really took off. The reasons for that are manifold, and many will remember them quite well, as it is still an ongoing development, one that started with:

Docker

Much has been written about Docker, the technology, as well as Docker, the company, so suffice it to say:

Docker made it possible to create, transport, and manage computing artefacts that resemble “services” in the form of “containers”. This allowed the easy creation of more complex services, in a PaaS-like manner, on on-premises infrastructure.
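
As a minimal sketch of what “create, transport, and manage” looks like in practice, here is the Docker SDK for Python driving a local Docker daemon; the image name and port mapping are just examples:

```python
import docker

# Talk to the local Docker daemon.
client = docker.from_env()

# The image is the transportable artefact; running it yields a container.
container = client.containers.run(
    "nginx:alpine",           # example image
    detach=True,
    ports={"80/tcp": 8080},   # example port mapping
)
print("Started container:", container.short_id)

# Containers are managed objects: stop and remove them when done.
container.stop()
container.remove()
```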

Over time, this grew into what we know as Kubernetes, which is now the de-facto standard for PaaS-like infrastructures. At Red Hat, the product based on Kubernetes is known as OpenShift.
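
A small sketch of the Kubernetes API at work, using the official Python client; it assumes a local kubeconfig with access to a cluster (OpenShift included):

```python
from kubernetes import client, config

# Load credentials from the local kubeconfig file (~/.kube/config).
config.load_kube_config()

# List every pod the cluster is running, across all namespaces.
v1 = client.CoreV1Api()
for pod in v1.list_pod_for_all_namespaces(watch=False).items:
    print(pod.metadata.namespace, pod.metadata.name, pod.status.phase)
```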

So, now that the two basic technology components had been created and successfully brought to market, we were still missing a very important component, one that has something to do with tools and fools.

Tools are crucial for successfully adopting cloud computing

There is a great saying, which goes: “A fool with a tool is still a fool!”.

As mentioned above, Open Source solved two important aspects in the greater scheme of cloud computing. Still missing were things that help people understand what Cloud Computing is really about, what it means for their own company, and how to get there. In recent years, we have all read “The Phoenix Project” and loved the lessons and insights exhibited in a page-turning narrative.

So, to make good use of the available tools, we need to enable ourselves to no longer be fools.

Some companies created methodologies around this, the most well-known being the “twelve-factor app” methodology. It helps in getting a grip on how to develop software so that it becomes “cloud-ready”. In good Open Source practice, Red Hat developed a consultancy engagement that can be used to help customers become cloud-ready while carrying out a larger exemplary software development project. We call it the Red Hat Open Innovation Labs. This way we turn fools with tools into experts with tools.
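
One of the twelve factors (factor III, “Config”) demands that configuration live in the environment rather than in the code, so the same artefact runs unchanged in every stage. A minimal Python illustration; the variable names are made up for the example:

```python
import os

# Twelve-factor, factor III: configuration comes from the environment,
# so the identical code artefact runs in dev, test, and production.
DATABASE_URL = os.environ.get("DATABASE_URL", "sqlite:///dev.db")  # example name
DEBUG = os.environ.get("DEBUG", "false").lower() == "true"

print("Using database:", DATABASE_URL, "| debug:", DEBUG)
```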

What will happen during the next decade of Cloud Computing?

So, the last decade saw me helping my customers in the DACH area understand the deficiencies they needed to overcome in order to become cloud-ready, understand the benefits of tools like OpenStack and OpenShift, decide what approach to take in using these tools, and become successful cloud providers to their internal customers. Over time, we added more components to the solution set, like the Ansible Automation Platform, which allows our customers not only to create internal clouds once, but, by using Infrastructure as Code (IaC) methods, to treat clouds like a service and create Software-Defined Data Centers (SDDC). We also added tools to manage multiple clouds as one, bringing all the visionary aspects of Sun’s “N1” idea close to reality, and in an Open Source way at that.
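
Infrastructure as Code in this sense means that building (and rebuilding) a cloud becomes a program run rather than a manual procedure. A sketch using the ansible-runner Python library, which drives Ansible playbooks programmatically; the directory and playbook names are illustrative:

```python
import ansible_runner

# Treat infrastructure rollout as a repeatable program run:
# execute a playbook that describes the desired state of the data center.
result = ansible_runner.run(
    private_data_dir="/tmp/sddc-demo",  # illustrative working directory
    playbook="site.yml",                # illustrative playbook name
)
print("Status:", result.status, "| return code:", result.rc)
```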

Summary, or words of wisdom

Having had the chance, as a Solution Architect for Cloud, to help you all become cloud-ready, understand the tools needed to build private clouds, and thereby shape your way into the cloud over the last decade, I am curious what the next decade will bring us in Cloud Computing.

Just as the utility industry moved from individual to centralized tooling (for example: from power generators in every house to centralized power plants for large regions or even countries), we now notice some trends moving back to more individual setups (solar panels on the roofs). In our industry, this might be driven by IoT (Internet of Things) and edge aspects, and it might require additional changes, adoptions, and adaptations of products, processes, and people. The nice thing is that this industry is innovative and collaborative, and in an Open Source fashion it helps make things better.

Outlook

The last decade saw a development from public cloud usage to the creation of private clouds. This ranged from creating new applications based on microservice architectures in the cloud to introducing more agile software development methods in large enterprises.

The next decade will see a more radical re-architecture of legacy monolithic applications to cloud-native applications. One example here in Europe is the whole SAP portfolio.

We will also see the convergence of cloud and on-premises ideas. The term “hybrid cloud” only provides a glimpse of what that might look like. From elements on the assembly lines of large plants connecting to on-premises or even public clouds, to cars and other “IoT” devices using information from the cloud and uploading information into it, the idea of ubiquitous computing will become more and more pervasive. So, after now close to twenty years of building cloud and cloud-like computing infrastructure, the time has come to make broad use of these technologies.

I’ll make a mark in my calendar to write a successor to this post in another eight years, to reflect on what the second decade of being a Cloud Solution Architect will have brought us.

Thanks for reading this far. And I know, there is still an open question about which movie I was talking about in the introduction… I’ll make it easy, it’s: https://www.imdb.com/title/tt0036818/
