We are in the early stages of what Gartner calls the second decade of cloud computing. Over the last decade, cloud adoption has increased steadily across the industry, driven by benefits such as near-instant availability of compute resources, scalability, and pay-as-you-go pricing.
Cloud platforms help organizations move faster towards their business goals. The complexity of managing relationships with datacenters and hardware vendors shifts to the public cloud operator, which gives organizations the flexibility to move forward faster in an uncertain business environment. Business project iterations have a much faster turnaround.
Increasingly visible cloud computing trends across businesses are:
Cloud-First Strategy
A Cloud-First strategy envisages cloud computing as the primary compute option. It empowers project managers who have limited budgets in the initial stages of their projects.
Traditional datacenter deployments, whether on-premises or colocated, are no longer considered the first option due to the complexity of execution, longer-term contractual commitments, and hardware amortization costs.
The Cloud-First approach is also chosen because it enables deployment of various workloads as semi-managed microservices on the cloud, resulting in a faster time to market. These semi-managed microservices also provide self-service management levers.
MultiCloud using Open Source is the New Normal
According to the 2019 State of the Cloud Survey by RightScale, organizations use almost five clouds on average, across both public and private.
There are many reasons for adopting a MultiCloud approach. Multiple teams with skills fragmented across clouds, or inorganic acquisitions, can result in organizations running and supporting workloads on several cloud platforms. Other reasons include the need for fully isolated disaster recovery (DR) setups and the avoidance of vendor lock-in.
Cloud Native now means MultiCloud native. Function-as-a-Service (FaaS) now means open source FaaS software running on open source container orchestration and control planes across multiple public clouds in an interoperable fashion.
Mature open source tools like Terraform, Rundeck, and Rancher, which allow organizations to manage MultiCloud environments natively, are seeing significant adoption in the DevOps community. The broad trend catching on is that organizations trust Commercial Open Source Software to deploy their MultiCloud Native stacks.
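As a minimal sketch of what managing a MultiCloud environment natively with a tool like Terraform can look like, the fragment below declares comparable virtual machines on two public clouds from a single configuration. All concrete values here (regions, the project ID, the AMI ID, machine types, and the disk image) are illustrative placeholder assumptions, not working values.

```hcl
# Illustrative Terraform sketch: one configuration spanning two clouds.
# Regions, project ID, AMI, and machine types are placeholder assumptions.

provider "aws" {
  region = "us-east-1"
}

provider "google" {
  project = "example-project" # hypothetical project ID
  region  = "us-central1"
}

# A small VM on AWS
resource "aws_instance" "web" {
  ami           = "ami-0123456789abcdef0" # placeholder AMI
  instance_type = "t3.micro"
}

# A comparable VM on Google Cloud
resource "google_compute_instance" "web" {
  name         = "web-1"
  machine_type = "e2-micro"
  zone         = "us-central1-a"

  boot_disk {
    initialize_params {
      image = "debian-cloud/debian-12"
    }
  }

  network_interface {
    network = "default"
  }
}
```

Because both resources live in one state, a single plan-and-apply workflow provisions and tracks infrastructure across both providers, which is the self-service, vendor-neutral lever such tools offer.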
Artificial Intelligence and Machine Learning
IDC forecasts that spending on AI and ML will grow from $12B in 2017 to $57.6B by 2021.
Organizations are increasingly exploring Deep Learning techniques to enhance their existing rules-driven systems with Artificial Intelligence. Their Machine Learning Compute Workloads on the public clouds have grown rapidly as a result.
Cloud platforms offer Compute Nodes with GPUs specifically targeting Machine Learning workloads such as natural language text processing, speech recognition, and image recognition and manipulation, including for video streams.
Deep Learning workloads require a large number of powerful GPUs working in parallel, and many of these workloads run only for short durations, which does not justify a 3-year hardware amortization period. On E2E Networks, a cloud provider in India, high-performance Compute Nodes with dedicated GPUs such as Nvidia's Tesla V100 can be had on a 30-day commitment at a price much lower than acquiring and running GPU infrastructure in-house.
Edge Computing
Edge computing is about moving the physical location of cloud services closer to the devices they interact with, to reduce latency and improve service quality.
According to Million Insights, “the global edge computing market size is expected to value at $3.24B by 2025.”
Edge computing augments cloud computing, with a greater focus on physical location and economic feasibility. Organizations will implement edge computing strategies to gain benefits such as improved quality of service delivery.
Internet of Things (IoT)
“Worldwide technology spending on the Internet of Things to reach $1.2T in 2022, attaining a CAGR of 13.6% over the 2017-2022 forecast period,” according to IDC.
With the Internet of Things (IoT), organizations may need to manage a massive number of devices. Managing IoT devices requires a sophisticated approach, typically a specialized IoT DevOps stack operating on a public cloud, to handle frequent loss of power or connectivity on the devices. Provisioning, monitoring, and sending commands to connected IoT devices, along with intelligence at the edge powered by low-power SoCs, require specialized microservices from public cloud operators, all of whom are keen to service these workloads. A large number of physical appliance developers, smart LED lighting manufacturers, and electric vehicle makers are interested in using IoT software designed for the public cloud to embed intelligence in their products.
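A core pattern behind such IoT stacks is tolerating intermittent connectivity: commands for an offline device must be queued and flushed when it reconnects. The sketch below is a minimal, self-contained illustration of that server-side pattern; the class, method names, and device/command strings are all hypothetical, not a real cloud API.

```python
# Minimal sketch of server-side command queuing for intermittently
# connected IoT devices. All names here are illustrative assumptions.
from collections import defaultdict, deque

class DeviceCommandBroker:
    def __init__(self):
        self.online = set()                 # currently connected device IDs
        self.pending = defaultdict(deque)   # device_id -> commands queued while offline
        self.delivered = defaultdict(list)  # device_id -> commands actually delivered

    def send(self, device_id, command):
        """Deliver immediately if the device is connected, else queue it."""
        if device_id in self.online:
            self.delivered[device_id].append(command)
        else:
            self.pending[device_id].append(command)

    def connect(self, device_id):
        """On reconnect, flush commands queued while the device was offline."""
        self.online.add(device_id)
        while self.pending[device_id]:
            self.delivered[device_id].append(self.pending[device_id].popleft())

    def disconnect(self, device_id):
        self.online.discard(device_id)

broker = DeviceCommandBroker()
broker.send("lamp-42", "set_brightness:70")  # device offline: command is queued
broker.connect("lamp-42")                    # queued command flushed on reconnect
broker.send("lamp-42", "power_off")          # device online: delivered immediately
print(broker.delivered["lamp-42"])           # ['set_brightness:70', 'power_off']
```

Real IoT platforms implement this idea at scale (often called a device shadow or desired-state queue), but the ordering guarantee shown here, queued commands are delivered before new ones, is the essential behavior.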
Most organizations see a MultiCloud future in which they run their traditional Compute Workloads alongside AI/ML jobs. Newer use cases built on edge computing can deliver a consistent application experience to global audiences. Increasingly smart connected devices are bringing IoT much closer to reality, and CIOs should watch this trend closely, as it will present some of the most significant opportunities for their organizations.