Transcript
This transcript was autogenerated. To make changes, submit a PR.
Greetings everyone.
I'm Surya Busi, coming with 18-plus years of IT experience, and I have been working as a technical lead in the automotive industry for the past 11 years.
So I have been closely following and working with the emerging cloud technologies.
The topic for today's session is how to be future ready for the cloud, which includes discussions of different architectures, such as monolithic and microservice architectures, and the concept of edge computing. We will also look into the future, at emerging technologies at scale.
With this, my agenda for today starts with the cloud evolution: how the landscape has shifted from monoliths to today's distributed systems. Next we'll look into microservices, the difference they have made, and their adoption trends, benefits, and implementation challenges.
Later we'll look into edge computing, which is the practice of bringing data processing and analysis closer to the source of the data instead of relying on centralized cloud data centers.
Toward the end, we'll look into future technologies like hybrid clouds, serverless architecture, and quantum computing.
With all this in view, let's begin our presentation.
Join me for actionable insights from real-world implementations and practical strategies for your organization's cloud journey.
With this, we'll move into the very first part of our agenda, which is the cloud native evolution.
For years we have been stuck with monolithic applications: software systems where all the features and functionality are developed as a single unit. The user interface, the business logic, and the data access layers are all built together and shipped as one executable application.
This single unit makes development and deployment simple. Over time, however, it impacts the maintainability of the application and any technology upgrades, because everything is tightly coupled as one unit.
The key characteristics of the monolith are a unified codebase, tight coupling, a single deployment, and simplified development.
Looking at these, the advantages of monoliths are simplicity, performance, and centralized management.
But it has disadvantages as well: limited scalability and limited flexibility. Being a single unit of application, any technology upgrade has to be applied to the entire application, which can cause significant delays in releases and challenges in maintenance, and which also adds deployment risk, again because it is a single unit.
The typical use cases of monolithic applications include small applications with little complexity, and startup applications where a simple prototype is paramount.
With all these challenges of the monolithic architecture, we move on to microservices, which offer greater advantages. A microservice architecture is basically a collection of autonomous units, each with its own business responsibility and its own data store, so the complexity is divided into multiple units that can be maintained independently.
These services communicate through different protocols, be it HTTP, message queues, or other channels. Being smaller units, they enable easy maintenance and give us flexibility, scalability, and other advantages when compared to legacy monolithic applications.
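To make the idea of small autonomous services talking over HTTP concrete, here is a minimal sketch of one such service in Python. The service name, port, endpoint, and payload are hypothetical, and a real service would typically use a web framework and service discovery rather than the raw standard library.

```python
# Minimal "orders" microservice exposing one HTTP endpoint (illustrative only).
# The service name, port, and payload are hypothetical examples.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Each microservice owns its data; a tiny in-memory store stands in for it here.
ORDERS = {"1001": {"id": "1001", "status": "SHIPPED"}}

class OrderHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Route: GET /orders/<id>
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "orders" and parts[1] in ORDERS:
            body = json.dumps(ORDERS[parts[1]]).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    # Another service (or an API gateway) would call this endpoint over HTTP.
    HTTPServer(("0.0.0.0", 8081), OrderHandler).serve_forever()
```

The point of the sketch is only that each unit exposes a small, independently deployable interface over the network.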
The typical use cases of microservices are large and complex applications, where that complexity can be dealt with more easily. The key use cases, as I mentioned, include enterprise systems, e-commerce platforms, and streaming services where high availability and resilience are critical.
The next upcoming technology is edge computing, which involves processing data closer to the source of generation, such as IoT devices, sensors, or local servers, rather than relying solely on centralized cloud infrastructure. We'll look into it closely in a later part of the presentation, but for now, edge computing means keeping the data and its analysis close to the source of the data.
With this, I'll move to the next slide, which covers the benefits we have seen with the microservices architecture and how it is driving enterprise agility. Let's look into the various advantages of the microservices architecture in detail here.
With widespread adoption, we have seen a huge surge in the industry: microservices adoption has increased from 45% in 2019 to a dominant 91% in 2024, clearly indicating the industry shift.
With all the proven advantages of microservices, as we explained earlier, these smaller units enable faster deployments. We have seen a significant 76% reduction in deployment time, enabling faster iteration cycles and quicker delivery of new features.
This also includes cost efficiency: organizations can achieve up to a 50% reduction in cost, since we scale only the resources we need, the services being independent of each other.
Then comes enhanced scalability, which contributes to a 62% improvement in resource utilization, allowing high-demand components to scale precisely as needed. Enhanced scalability and cost efficiency go hand in hand: we scale only the units receiving heavy traffic, and avoid wasting resources on components that do not need to be scaled.
Then comes better fault isolation as a critical advantage: we have seen an 83% reduction in system-wide failures. The failure of one individual service can easily be pinpointed, since the business logic is split across different services.
With all these advantages, we'll move to the next slide. Having seen the pros, we now have to look into the cons of the microservice architecture: its implementation challenges.
The first is technical debt: 67% of enterprises are stuck with legacy code, where understanding the code and trying to divide it into simple units for a microservice architecture is complex. This huge technical debt is the impediment to moving to a microservice implementation. Then comes service decomposition.
Since the business logic in a microservice architecture is divided into autonomous units, it is highly critical to understand the business well enough to set the boundaries optimally and divide it into units. Without good business knowledge and well-set domain boundaries, it is difficult to design the microservices properly.
With all this in mind comes operational complexity: 71% report increased complexity in monitoring, debugging, and maintaining the service dependencies. Microservices are separate services that communicate with each other and can grow in number as the complexity of the application grows, and the difficulty of maintaining them grows as well. That is how the operational complexity increases.
Also, with different services having different data stores, data management becomes a great difficulty: 64% struggle with data consistency across these units, with distributed transactions, and with data accuracy between the services. That is what we understand about the implementation challenges of the microservices architecture.
Now, with all this knowledge, let's move on to the next step, which is the architectural foundations needed for a microservice architecture. This involves understanding the infrastructure components and the different design patterns available today to better implement the microservice architecture.
Coming to the infrastructure components, we have container orchestration, which makes managing and scaling these different business units at greater scale possible. As the application grows complex, it is very key to be able to deploy and scale with accuracy and good performance, so Kubernetes container orchestration is one of the key components when it comes to the infrastructure.
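As a small illustration of how orchestration is usually driven through an API rather than by hand, here is a sketch using the official Kubernetes Python client, assuming it is installed and a kubeconfig is available; the deployment name and namespace are hypothetical.

```python
# Sketch: scaling a (hypothetical) deployment through the Kubernetes API.
# Assumes `pip install kubernetes` and a valid kubeconfig on the machine.
from kubernetes import client, config

def scale_deployment(name: str, namespace: str, replicas: int) -> None:
    config.load_kube_config()              # use local kubeconfig credentials
    apps = client.AppsV1Api()
    # Patch only the replica count; the orchestrator handles rollout and placement.
    apps.patch_namespaced_deployment_scale(
        name=name,
        namespace=namespace,
        body={"spec": {"replicas": replicas}},
    )

if __name__ == "__main__":
    # Example: scale the hypothetical "orders-service" to 5 replicas.
    scale_deployment("orders-service", "default", 5)
```

In practice this kind of scaling is often automated through autoscalers rather than called directly.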
Similarly, with the huge traffic levels we expect to scale to, it is important to have proper traffic routing, load balancing, and observability between the services. In this scenario, a service mesh is a good infrastructure option for traffic management, and proper routing enables good performance and delivery of results.
API gateways for client-to-service communication are another infrastructure component. They give us a controlled way of organizing requests, routing them, and handling them with the authentication necessary for proper security.
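As a rough sketch of what a gateway does, the snippet below checks an auth token and then forwards the request to the owning service. The route table, token check, and service URLs are purely illustrative; a real deployment would use a managed gateway product or a dedicated proxy.

```python
# Toy API gateway: authenticate, then route the request to the owning service.
# Route table, tokens, and backend URLs are illustrative placeholders.
import urllib.request

ROUTES = {
    "/orders": "http://orders-service:8081",
    "/payments": "http://payments-service:8082",
}
VALID_TOKENS = {"demo-token"}  # stand-in for a real identity-provider check

def handle(path: str, token: str) -> bytes:
    if token not in VALID_TOKENS:
        raise PermissionError("401 Unauthorized")        # reject before routing
    for prefix, backend in ROUTES.items():
        if path.startswith(prefix):
            with urllib.request.urlopen(backend + path) as resp:
                return resp.read()                       # proxy the backend reply
    raise LookupError("404 No route for " + path)
```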
Then comes centralized logging and monitoring, which is very key to managing this complexity properly and providing accurate information. This means aggregating the logs and metrics from all the microservices to enable debugging and operational visibility at a higher level.
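One common building block here is structured logging: each service emits JSON log lines tagged with its name and a correlation ID, so an aggregator (for example an ELK or Loki stack) can stitch one request together across services. This is a minimal sketch, with field names chosen for illustration.

```python
# Minimal structured-logging sketch: JSON lines that a log aggregator can index.
import json
import logging
import sys
import time
import uuid

class JsonFormatter(logging.Formatter):
    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "ts": time.time(),
            "service": "orders-service",                     # hypothetical service name
            "level": record.levelname,
            "message": record.getMessage(),
            "trace_id": getattr(record, "trace_id", None),   # correlate across services
        })

handler = logging.StreamHandler(sys.stdout)
handler.setFormatter(JsonFormatter())
log = logging.getLogger("orders")
log.addHandler(handler)
log.setLevel(logging.INFO)

# Every log line for one request carries the same trace_id.
log.info("order accepted", extra={"trace_id": str(uuid.uuid4())})
```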
Having seen the infrastructure components, let's move on to the design patterns, starting with domain-driven design for service boundaries. Since the business complexity has to be divided into operable units, domain knowledge is very key to setting the boundaries, developing the individual services, and maintaining them properly while keeping the complexity in check.
Similarly, event-driven architecture decouples services using events, as we mentioned, where responses need not be synchronous. It helps with loose coupling, increased performance and scalability, and improved responsiveness.
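Here is a minimal in-process sketch of the idea: producers publish events and subscribers react independently, so neither side blocks on the other. Real systems would use a broker such as Kafka, RabbitMQ, or a cloud message queue; the event names are made up.

```python
# Tiny in-process event bus illustrating event-driven decoupling.
# In production the bus would be a broker (Kafka, RabbitMQ, a cloud queue, ...).
from collections import defaultdict
from typing import Callable, DefaultDict, List

Handler = Callable[[dict], None]

class EventBus:
    def __init__(self) -> None:
        self._subscribers: DefaultDict[str, List[Handler]] = defaultdict(list)

    def subscribe(self, event_type: str, handler: Handler) -> None:
        self._subscribers[event_type].append(handler)

    def publish(self, event_type: str, payload: dict) -> None:
        # The publisher does not know or wait for the consumers.
        for handler in self._subscribers[event_type]:
            handler(payload)

bus = EventBus()
bus.subscribe("OrderPlaced", lambda e: print("billing service charges", e["order_id"]))
bus.subscribe("OrderPlaced", lambda e: print("shipping service schedules", e["order_id"]))
bus.publish("OrderPlaced", {"order_id": "1001"})   # both consumers react independently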
These are key patterns when designing a microservice architecture. Then there is CQRS for complex data: as scalability increases, the data being operated on becomes complex, and separating read and write operations to manage that data while keeping performance is what CQRS achieves.
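Here is a minimal sketch of CQRS under simplifying assumptions: commands mutate a write model, and queries are served from a separately maintained read model. For brevity the read model is updated synchronously in the same call; real systems usually project it asynchronously from events.

```python
# CQRS in miniature: the write model and the read model are kept separate.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class OrderWriteModel:
    events: List[dict] = field(default_factory=list)                 # append-only write side

@dataclass
class OrderReadModel:
    by_customer: Dict[str, List[str]] = field(default_factory=dict)  # query-optimized view

def place_order(write: OrderWriteModel, read: OrderReadModel,
                order_id: str, customer: str) -> None:
    # Command side: record the change.
    write.events.append({"type": "OrderPlaced", "order_id": order_id, "customer": customer})
    # Projection: keep the read side denormalized for fast queries.
    read.by_customer.setdefault(customer, []).append(order_id)

def orders_for_customer(read: OrderReadModel, customer: str) -> List[str]:
    # Query side never touches the write model.
    return read.by_customer.get(customer, [])

w, r = OrderWriteModel(), OrderReadModel()
place_order(w, r, "1001", "alice")
print(orders_for_customer(r, "alice"))   # ['1001']
```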
As we all know, with this many touch points and network hops in a microservice architecture, a proper circuit breaker is all the more important for resilience: it contains failures at a certain point instead of letting them cascade into dependent systems.
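Below is a minimal circuit-breaker sketch to illustrate the pattern: after a number of consecutive failures, calls are short-circuited for a cool-down period instead of hammering a failing dependency. The thresholds and timings are arbitrary, and a production system would typically rely on an established library or a service mesh feature.

```python
# Minimal circuit breaker: stop calling a failing dependency for a cool-down period.
import time
from typing import Any, Callable, Optional

class CircuitBreaker:
    def __init__(self, max_failures: int = 3, reset_seconds: float = 30.0) -> None:
        self.max_failures = max_failures
        self.reset_seconds = reset_seconds
        self.failures = 0
        self.opened_at: Optional[float] = None   # None means the circuit is closed

    def call(self, fn: Callable[..., Any], *args: Any, **kwargs: Any) -> Any:
        if self.opened_at is not None:
            if time.time() - self.opened_at < self.reset_seconds:
                raise RuntimeError("circuit open: failing fast")   # protect the caller
            self.opened_at = None                                  # half-open: try again
        try:
            result = fn(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.time()                       # trip the breaker
            raise
        self.failures = 0                                          # success resets the count
        return result
```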
With these architectural foundations needed for the microservice architecture, we'll move to the next slide.
Now we understand the challenges with legacy monolithic applications, what we overcome with the microservice architecture, and the different challenges of implementing it. We'll move on to the next emerging technology, which is edge computing. The huge volumes of data involved in today's applications need to be processed efficiently, and edge computing brings the data and its analysis closer to the source of the data, which has proven to deliver strong results.
Here are some numbers proving its efficiency. With the data and its analysis being close to the source, there is a 45% latency reduction, meaning a decrease in processing delays for time-critical applications, along with a 42% improvement in real-time processing and enhanced data handling capabilities for mission-critical systems such as healthcare and manufacturing. It also comes with 78% bandwidth savings: with the data close to the source, the reduction in network hops greatly increases the bandwidth savings.
Response acceleration has also proven to be 56% faster for emergency response wherever it is needed. As I mentioned with the healthcare and manufacturing applications, it need not be limited to those; it can be applied across multiple domains like banking and e-commerce, wherever response time is critical.
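To give a feel for where the bandwidth and latency savings come from, here is a simplified sketch of edge-side preprocessing: raw sensor readings are aggregated and filtered locally, and only a compact summary plus any anomalies is shipped to the central cloud. The readings, thresholds, and field names are invented for illustration.

```python
# Edge-side preprocessing sketch: summarize locally, send only what matters upstream.
from statistics import mean
from typing import Dict, List

def summarize_window(readings: List[float], alert_threshold: float = 90.0) -> Dict:
    """Collapse a window of raw sensor readings into a small summary payload."""
    anomalies = [r for r in readings if r > alert_threshold]   # keep only critical points
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "anomalies": anomalies,          # full detail only where it is actually needed
    }

raw = [71.2, 70.8, 93.5, 72.1, 71.9, 95.0]      # e.g. one window of temperature samples
payload = summarize_window(raw)
# Instead of every raw point, only this compact summary crosses the network:
print(payload)
```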
Having looked into the numbers, we'll move on to the next slide.
This covers edge computing applications across different industries, with the two examples mentioned here: manufacturing and healthcare. In manufacturing, huge volumes of data are processed to run the production lines efficiently.
There has been a 33% efficiency boost, real-time quality control has increased defect detection by 87%, and predictive maintenance has reduced costs by 41%. That's what we see in the manufacturing industry.
Coming to healthcare, there has been a significant improvement in response: up to 56% faster emergency response with 93% accuracy, which is very key where the life of a patient is on the line and immediate data processing is critical for the medical staff to take decisions.
Edge computing has proven very valuable in such environments, where time-sensitive data is key to day-to-day operations. Moving to the next slide: with everything we have learned about edge computing, there are definitely some architectural considerations to be made. With different sources of data and the quality and quantity of data being processed, it is key to understand the resource constraints.
Given the hardware limitations of CPU and storage, it is key to understand the amount of hardware needed to process such data. As of today, K3s and MicroK8s are lightweight Kubernetes distributions designed for these environments, optimized to give strong results.
The same goes for connectivity: given the urgency and criticality of the data to be presented, it is important to have reliable, high-performing network connectivity. Kubernetes-based architectures have proven to be a solution with all of these considerations in mind.
As we mentioned, with huge volumes of critical data spread out, the attack surface increases, and security becomes paramount given the amount of PII and sensitive PII on the network. Robust security measures such as zero trust models, hardware-based security features, and strong encryption are very much needed for all these communications over the network.
When it comes to edge computing, these considerations need to be made when designing the applications. Now we move to the next slide.
Looking at the current solutions available in the market today for hybrid and multi-cloud approaches, Amazon Web Services, Microsoft Azure, Google Cloud, and on-premises infrastructure each provide great results in different business areas: AWS for production-grade, scalable services; Azure for enterprise apps and identity integration; Google Cloud for data analytics and ML workloads; and on-premises for latency-sensitive workloads and legacy systems. Understanding the solutions available helps us better design our applications today.
With that, we'll move to the next slide.
With all this available in the market, it is easy to focus development on the key business logic rather than worrying about the infrastructure, which is provided as a service by many organizations in the marketplace. This leads to consistent cost efficiency: with pay-per-use pricing for all the services provided, if a service is not in use, we need not pay for it.
As we mentioned, there is zero cost for idle resources, and with the load starting small and growing with the market, auto-scaling plays a key role. It is hugely enabled by the flexibility of the pay-per-use models available, which scale instances automatically depending on the traffic.
Organizations using serverless architectures report 61% shorter development cycles and 45% reduced operational overhead.
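As an illustration of the pay-per-use model, here is the shape of an AWS Lambda-style function handler in Python: the platform runs it only when a request or event arrives, scales copies with traffic, and bills per invocation. The event fields used here are hypothetical.

```python
# Shape of a serverless (AWS Lambda-style) handler: no servers to manage,
# the platform invokes it per event and scales instances with traffic.
import json

def lambda_handler(event, context):
    # 'event' carries the trigger payload (API request, queue message, ...);
    # the field names below are hypothetical.
    order_id = event.get("order_id", "unknown")
    return {
        "statusCode": 200,
        "body": json.dumps({"order_id": order_id, "status": "accepted"}),
    }
```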
With all these emerging technologies, we move on to another very futuristic area, which is quantum computing. Quantum computing makes use of the concepts of quantum mechanics to process information using quantum bits, or qubits, which enables computers to perform certain computations at much higher speed than classical computers.
It has great potential and can help in various key areas like materials science, pharmaceuticals, and machine learning by solving problems at much higher speed.
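As a small taste of how quantum programs are expressed today, the sketch below builds a two-qubit entangled (Bell-state) circuit with Qiskit, assuming the library is installed. It only constructs and prints the circuit; running it on real quantum hardware requires access to a provider.

```python
# Building (not executing) a two-qubit Bell-state circuit with Qiskit.
# Assumes `pip install qiskit`; real hardware runs need a provider account.
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)   # two qubits, two classical bits
qc.h(0)                     # put qubit 0 into superposition
qc.cx(0, 1)                 # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])  # read both qubits out into the classical bits

print(qc)                   # ASCII drawing of the circuit
```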
Coming to the disadvantages of quantum computing, it is still under development and still emerging; it is in the very early stages. And for the amount of speed we are expecting, the infrastructure involved comes at a high cost. These services, as we understand, are still emerging.
With all of this being in the cloud, there are certain concerns with the security of these distributed systems. Proper mechanisms need to be in place, such as zero trust architecture, runtime protection, secret management, service mesh security, and supply chain security.
Coming to zero trust architecture, it means least privilege being granted and fine-grained identity controls being set in place to avoid security concerns. Secret management is the encryption and storage of the necessary secrets in a more secure place, with least-privileged access; GCP Secret Manager is one example of this.
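For the Secret Manager example, a minimal read might look like the sketch below, assuming the google-cloud-secret-manager client library is installed and the caller's identity carries only the secret-accessor role (least privilege). The project and secret names are placeholders.

```python
# Reading one secret version from GCP Secret Manager (names are placeholders).
# Assumes `pip install google-cloud-secret-manager` and credentials whose IAM role
# grants only secret-version access (least privilege).
from google.cloud import secretmanager

def read_secret(project_id: str, secret_id: str, version: str = "latest") -> str:
    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{project_id}/secrets/{secret_id}/versions/{version}"
    response = client.access_secret_version(request={"name": name})
    return response.payload.data.decode("utf-8")   # secret held only in memory, at use time

# Example (placeholder identifiers):
# db_password = read_secret("my-project", "orders-db-password")
```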
Similarly, service mesh security works at the network layer, with TLS and a great level of certificate management automation. Runtime protection detects behavioral anomalies in the container workloads, with timely health checking and dashboards for the analysis. And supply chain security means verifying the builds and identifying dependency vulnerabilities in advance to avoid any disturbances.
With all this on security in distributed clouds, we'll move to the next slide.
We have now understood legacy monolithic applications, the microservice architecture, hybrid cloud, edge computing, and the very much emerging quantum computing. Now that we have all these tools in our hands, let's understand how we can apply all of this to build a future-ready cloud strategy.
With all these technologies available and on our radar, we can assess our organizational requirements and use the proper tools to design for them. With this, we can enable progressive adoption patterns: adopting in less critical areas first, using the tools to prove their value, and then expanding what we learned to different areas of the organization, which gives us progressive adoption instead of a bulk migration.
That also involves investing in platform engineering: a developer platform where we can put all the learnings in one place, enabling the rest of the development teams to utilize what we learn and enforcing best practices, so that the learning curve is greatly reduced. But it is not just about learning.
Performance definitely needs to be measured. Measuring what matters with the DORA metrics, measuring cost efficiency, and setting up key performance indicators enables us to build a truly future-ready cloud strategy.
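As a small illustration of "measuring what matters", the sketch below computes two of the DORA metrics, deployment frequency and mean lead time for changes, from a list of deployment records; the record format is invented for the example.

```python
# Computing two DORA metrics from (hypothetical) deployment records.
from datetime import datetime
from typing import Dict, List

deployments: List[Dict[str, str]] = [
    # commit_at = when the change was committed, deployed_at = when it reached production
    {"commit_at": "2024-05-01T09:00", "deployed_at": "2024-05-01T15:00"},
    {"commit_at": "2024-05-03T10:00", "deployed_at": "2024-05-04T11:00"},
    {"commit_at": "2024-05-07T08:30", "deployed_at": "2024-05-07T12:30"},
]

def parse(ts: str) -> datetime:
    return datetime.fromisoformat(ts)

days_observed = (parse(deployments[-1]["deployed_at"])
                 - parse(deployments[0]["deployed_at"])).days or 1
deployment_frequency = len(deployments) / days_observed               # deploys per day
lead_times_h = [
    (parse(d["deployed_at"]) - parse(d["commit_at"])).total_seconds() / 3600
    for d in deployments
]
mean_lead_time_h = sum(lead_times_h) / len(lead_times_h)

print(f"Deployment frequency: {deployment_frequency:.2f}/day")
print(f"Mean lead time for changes: {mean_lead_time_h:.1f} h")
```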
I hope this all helps. With this, we come to the key takeaways of this presentation: what we learned and what our next steps should be.
Coming to what we learned: microservices adoption has reached 91%, delivering proven benefits, and with those huge benefits it has been adopted and has kept growing widely throughout the industry.
Edge computing is delivering 45% latency reductions in very time-sensitive industries like healthcare and manufacturing. Hybrid cloud approaches are reducing downtime by 41%, as the infrastructure is less of a burden now and the focus on key business logic is increased. And we understand the emerging technologies like serverless architecture and quantum computing, which greatly benefit research areas and help growing industries.
With all these understandings, our next steps include assessing the current requirements of our industry, designing better applications using the emerging technologies and cloud native patterns, identifying opportunities for data analysis using edge computing, and setting up the right tools and the right technology radar within the organization. This helps us develop much more business-focused, high-performing applications, and with this it helps organizations grow at a great level.
With this, I end my presentation and thank you for this opportunity.