Transcript
Hello everyone.
My name is Gobin Kaman.
I'm a data architect enabling digital transformations through data strategy and enterprise architecture. I focus on the energy industry sector and specialize in oil and gas.
Today I'll be presenting on how AI is reshaping master data governance in an SAP environment, and I'll also take you through a small use case, which is an incident management application. When we talk about data such as customer master, suppliers, materials, enterprise asset management, or HR data — for example, employee data — these data are considered the bloodline of any enterprise application. When this data is inaccurate, inconsistent, or outdated, it affects operational SAP processes such as supply chain, order to cash, or even finance.
And it doesn't stop there. It also impacts the downstream applications that consume and use this data. Any change to the data is not just going and updating the data; it needs an incident management application where you have to submit an incident to correct the data. The data stewards who are responsible for it then have to put in their effort to correct the data and replicate it to the respective applications.
Today I'll be taking you through the journey from a traditional MDM to an AI-driven cognitive MDM, with practical SAP use cases implementing that approach. We will discuss how AI brings actionable, measurable results and actionable insights, and a forward-looking view on implementing those solutions.
Let's start with the governance approach and see how this can be mitigated with incident management applications. As we all know, data is the backbone of any SAP application — rather, any ERP application, right? It connects the various business processes and operations such as procurement, finance, or supply chain. So data is the DNA of the organization. If the DNA is corrupted, everything will fall apart. You cannot compromise on this data — incorrect data, right?
So how has this been done traditionally? Traditionally, our systems were built on and relied on manual rules. That can be manageable when the volume is limited, or when proactive measures can be taken through manual processes, right? Gone are those days — now things are changing with globalization and the inclusion of multi-system environments, and rule-based approaches are fragile, slow, and error-prone. To understand this challenge better, let's get into a few master data maintenance limitations.
How can cognitive MDM address this challenge, right? Let's look at the traditional MDM limitations so that you all understand the key challenges of using a traditional MDM. As we all know, rule-based validations and manual cleansing cause traditional MDM to be slow, as it cannot keep up with the pace when velocity increases or the variety of data changes, right? Key challenges include static rules that do not adapt to business scenarios such as consolidations or harmonization, and that also do not support extension to a new product line.
The majority of traditional MDM is batch managed — batch processes that delay detection until it is too late. That is a huge challenge; it's more of a reactive approach. The struggle is in the distributed environment, where the correction needs to be started from the MDM application and then replicated to the consumer applications.
Research shows that poor data maintenance in an enterprise application on average costs $13 million annually, right? And it does not stop with the cost: 41% of data professionals' working time is getting wasted — time that could be used for innovation.
Now it's time to shift through an AI transformation. AI transforms MDM from deterministic to cognitive, right? AI brings machine learning that enables pattern recognition and anomaly detection. Natural language processing unlocks the unstructured data and improves usability — sentiment and reviews can be read out of that unstructured text, and based on that, an incident can be created automatically, which is where incident management can help. With predictive analytics in place, the shift in governance from reactive to proactive can happen. When AI augments MDM, organizations move from a firefighting mode to a preventing mode, right?
I can give a few real-time examples. Instead of a duplicate vendor being corrected in the first place, what happens right now is that we wait for the payments to fail, and only then does the system flag the duplication. Instead, if we were able to do that detection well in advance, the correction could happen early as well.
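As an illustration — not from the talk itself — here is a minimal sketch of that early duplicate-vendor detection, using simple fuzzy string matching; the vendor records, field names, and the 0.9 threshold are invented assumptions:

```python
# Minimal sketch: flag likely duplicate vendor records before payments fail.
# Field names ("name", "city") and the 0.9 threshold are illustrative assumptions.
from difflib import SequenceMatcher
from itertools import combinations

vendors = [
    {"id": "V1001", "name": "Acme Industrial Supplies", "city": "Houston"},
    {"id": "V2044", "name": "ACME Industrial Supply",   "city": "Houston"},
    {"id": "V3090", "name": "Globex Drilling Services", "city": "Calgary"},
]

def similarity(a: str, b: str) -> float:
    """Normalized fuzzy similarity between two strings."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Compare every pair; in production this would be blocked by city/country first.
for v1, v2 in combinations(vendors, 2):
    score = similarity(v1["name"], v2["name"])
    if score > 0.9 and v1["city"] == v2["city"]:
        print(f"Possible duplicate: {v1['id']} vs {v2['id']} (score={score:.2f})")
```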
So how did the evolution happen from a traditional to a cognitive brain? Since we got a good handle on traditional MDM — as we all talked about, it is built on rule-based workflows and rule-based definitions — we know traditional MDM has its limitations. Now let's do a quick comparison of traditional MDM against cognitive MDM.
In traditional MDM, as we talked about, the majority of the cases are all rule-based. The majority of them are generally centrally managed, right? Which means that somebody has to come and correct the data in a manual, intensive process. It's all reactive, batch processes, right? Which means that you have to wait for the failure to happen and somebody has to manually go and correct the data. That is an impact, right?
In contrast, when we do cognitive MDM, machine learning algorithms, natural language processing, and predictive analytics can help improve these processes. You take proactive measures, adapt, and address real-time issues directly. It is also capable of handling a large volume of data in sub-second time. This move from a traditional MDM to a cognitive MDM is not just incremental — you are not shifting from one system to another. It's a whole paradigm shift: data governance evolves from static guardrails to intelligent, self-improving systems.
So the challenge shows that there is a need for a smarter way — adaptive governance through cognitive MDM. This shifts how organizations approach data governance within the SAP ecosystem, from fundamental MDM to an AI-powered MDM solution, right?
Now let's understand the impact when master data goes wrong — when an incident needs to be created and then ripples through the SAP landscape. There are four areas to focus on when an incident needs to be treated, right? The major areas include data quality issues — inconsistencies in the data that spread across modules — and impacted areas like data analytics, where there are direct impacts on identifying the right details or on reporting to the leadership or to the customer.
Personally, I have seen many such challenges that involve heavy human intervention to correct the data. Obviously this costs time and money for any data project; the majority of data projects involve data quality corrections.
It also doesn't stop there, just with the data. It cascades — it propagates into the integration points. Data that is created or updated in the traditional MDM is sent to integration points such as operational SAP systems, or in fact to the downstream applications. For example, if EAM (Enterprise Asset Management) data replicates to a performance management system, the rules in the performance management system are calculated based on the data it receives. If the data goes bad, obviously the rules will not match, and then the performance calculation will go wrong. This would hit the preventive or proactive maintenance or the reactive maintenance process, wherein the process can go bad and then create a lot of issues in the preventive maintenance process, especially in the oil and gas sector.
There are also velocity changes. When the volume and velocity change in a global instance — such as global master data replication — the issue doesn't stop in one location; it spreads to the other areas as well. This is very concerning for centrally managed objects such as the customer master, which replicates the data to all the subsequent applications that this data impacts.
It also impacts operational resilience, which depends on preventing that failure early. A single corrupted supplier record today will not just stop the supply chain; it also impacts foundational data that cannot be compromised. The effects can propagate through multiple layers of the enterprise architecture, as you all know — from data ingestion to the operational systems to the data consumption systems.
Speaking to all these challenges: with new tools and technologies, the challenge can be addressed via a solution that enhances resilience by making data governance proactive, right, instead of reactive or firefighting mode.
There are three pillars that can drive this transformation: machine learning, natural language processing, and predictive analytics. These are the three key pillars through which AI can drive and help implement cognitive MDM. Machine learning, for example, can detect and correlate subtle related data patterns, analyze anomalies, and automate classifications, and it gives good results — for instance, an anomaly detection capability that far exceeds human analysis.
In today's world, what data stewards are doing is taking a report out of Power BI or Excel and scanning through the data to find anomalies. This is very cumbersome; it's not easy to do a manual interpretation and try to find that data, and it costs a lot of time and energy. Instead, this can be handled by writing ML models, where all those challenges can be addressed and anomaly detection can be done within sub-seconds, as you all know.
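A minimal sketch of that kind of ML anomaly detection, using scikit-learn's IsolationForest; the material-master features and values here are simulated assumptions, not data from the talk:

```python
# Minimal sketch: unsupervised anomaly detection on master data attributes,
# replacing the manual scan of a Power BI / Excel export.
# The features (unit price, lead time, order quantity) are hypothetical.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
# Simulated "normal" material master attributes plus a few injected outliers.
normal = rng.normal(loc=[100.0, 14.0, 500.0], scale=[10.0, 2.0, 50.0], size=(500, 3))
outliers = np.array([[950.0, 90.0, 5.0], [1.0, 0.5, 99999.0]])
X = np.vstack([normal, outliers])

model = IsolationForest(contamination=0.01, random_state=0)
labels = model.fit_predict(X)          # -1 = anomaly, 1 = normal

anomaly_rows = np.where(labels == -1)[0]
print(f"Flagged {len(anomaly_rows)} suspicious records: rows {anomaly_rows.tolist()}")
```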
Next, NLP — natural language processing, right? There is unstructured data that resides in customer data; it can live in desktop applications from the users, or in data management applications such as a DMS. Any of this unstructured data enables conversational governance by leveraging its value through text extraction, wherein NLP will read that data and pull the text out. You can do sentiment analysis based on the users' responses — "okay, this is good data," "this helped to remediate the process," all those things. Based on their interactions with the systems, these sentiments can be read, and, looking forward, the system can correct on its own.
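As a toy illustration of sentiment-driven incident creation — the keyword lexicon and the incident payload shape are invented assumptions, not a real SAP API:

```python
# Minimal sketch: read free-text user feedback, score sentiment, and open an
# incident automatically when it looks negative. The keyword lexicon and the
# incident payload shape are illustrative assumptions, not a real SAP API.
NEGATIVE = {"wrong", "missing", "duplicate", "outdated", "failed", "incorrect"}
POSITIVE = {"good", "clean", "correct", "resolved", "helpful"}

def sentiment_score(text: str) -> int:
    words = set(text.lower().split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

def maybe_create_incident(record_id: str, feedback: str) -> dict | None:
    """Auto-create an incident payload for negative feedback on a record."""
    if sentiment_score(feedback) < 0:
        return {"record": record_id, "type": "DATA_QUALITY",
                "priority": "HIGH", "note": feedback}
    return None

print(maybe_create_incident("CUST-8812", "Address is outdated and tax id is wrong"))
print(maybe_create_incident("CUST-9001", "Data looks clean and correct now"))
```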
Then there is predictive analytics, right? It helps to identify the risk well in advance and to prioritize and focus on the potential issues — where these data issues can happen — so that an incident can be triggered well in advance, right? It is based on time series analysis together with historical data analysis.
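A minimal sketch of that time-series idea: smooth the historical error counts, project one step ahead, and raise the incident early when the projection crosses a tolerance. The weekly counts and the tolerance of 25 errors are made-up numbers:

```python
# Minimal sketch: use a simple exponentially weighted trend over historical
# data-error counts to flag risk before an incident materializes.
# The weekly counts and the tolerance of 25 errors are made-up numbers.
import pandas as pd

weekly_errors = pd.Series(
    [8, 9, 11, 13, 16, 19, 23, 28],
    index=pd.date_range("2025-01-06", periods=8, freq="W-MON"),
)

trend = weekly_errors.ewm(span=3).mean()        # smoothed recent trend
slope = trend.iloc[-1] - trend.iloc[-2]         # week-over-week drift
forecast_next = trend.iloc[-1] + slope          # naive one-step projection

TOLERANCE = 25
if forecast_next > TOLERANCE:
    print(f"Raise incident early: projected {forecast_next:.1f} errors next week")
else:
    print(f"Within tolerance: projected {forecast_next:.1f} errors next week")
```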
Together, these create a cognitive layer over SAP MDM, SAP MDG, SAP HANA, Datasphere, and so on. From the 2025 SAP roadmap, SAP is adding MDG on BTP, which supports AI and ML capabilities, right? AI-driven analytics is part of the SAP MDG solution, with data quality controls, as they make AI core to the solution.
Looking forward, with all the foundation laid out, now let's jump in and understand how operational AI is supporting this in MDM — how this can be implemented, right? Cognitive MDM for incident readiness is classified into four different stages. First, as we all know, we start with the assessment and find the readiness. The assessment on any data project starts with the profiling of data, right? Evaluate the current process maturity, data infrastructure readiness, and other parameters.
This can be done by building a small LLM-based solution to drive the data profiling work, where users can build profiling dashboards from live data, perform appropriate, actionable cleansing effort at the source, and advise the transformation logic to be implemented.
I personally have implemented a small solution of this kind — an LLM that reads live data from the systems or the legacy systems and generates a profiling dashboard for the users. This helps the users go and figure out whether data cleansing is required when there is a transformation project in place, and then this helps them across the transformation project.
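As a much simpler stand-in for that LLM-driven profiler, here is a sketch of the kind of profiling summary such a dashboard starts from, using plain pandas; the column names and sample data are hypothetical:

```python
# Minimal sketch: basic data profiling over a live extract, the raw material
# for the kind of profiling dashboard described above. Column names and data
# are hypothetical; the real solution layers an LLM on top of summaries like this.
import pandas as pd

df = pd.DataFrame({
    "customer_id": ["C1", "C2", "C2", "C4", None],
    "country":     ["US", "US", "US", "DE", "DE"],
    "tax_id":      ["11-1", None, None, "33-3", "44-4"],
})

profile = pd.DataFrame({
    "null_pct":   df.isna().mean().round(2) * 100,
    "distinct":   df.nunique(),
    "duplicated": [df[c].duplicated(keep=False).sum() for c in df.columns],
})
print(profile)
print(f"\nFully duplicated rows: {df.duplicated().sum()}")
```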
How do we approach this, right? We cannot just go and do the whole deployment at once. Instead, we have to go with a modular architecture and deploy domain-specific or use-case-driven modules, okay? For integration, follow a phased framework — we don't want to boil the ocean with the integration, right? Our focus should be on priority and severity: take the key use cases with high priority and high severity and approach those as the priority-one items.
Obviously, there will be human interaction — human-AI collaboration is a must-have. As you all know, the AI/ML model needs to be trained on the use cases given by the data stewards and within the SAP infrastructure framework. Focus on the objects with highly complex logic — the complex objects — so that the AI model can be trained on them and then enrich its capabilities going forward.
Change management is one of the key areas you need to focus on across the whole process, right? It's not just a tool that derives or determines the solution; the mindset of the people also needs to change. They have to upskill and redesign the process where necessary, such as by implementing smaller solutions.
For example, take a ship-to customer. Today it is a manual process — integrating this, creating a workflow process, and all those things. Tomorrow we can devise a plan saying that if the region is this one and the sold-to is this one, all I need to do is automate it so that the ship-to can be created, as in the sketch below.
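A minimal sketch of that ship-to automation, deriving the ship-to partner from region and sold-to via a rule table instead of a manual workflow; the rule table and record shapes are invented for illustration, not an SAP API:

```python
# Minimal sketch: auto-derive a ship-to partner from region + sold-to rules,
# replacing the manual workflow step described above. The rule table and
# record shapes are invented for illustration, not an SAP API.
SHIP_TO_RULES = {
    ("NA",   "SOLD-100"): "SHIP-110",
    ("EMEA", "SOLD-100"): "SHIP-120",
    ("NA",   "SOLD-200"): "SHIP-210",
}

def derive_ship_to(region: str, sold_to: str) -> str | None:
    """Return the ship-to partner for a region/sold-to pair, if a rule exists."""
    return SHIP_TO_RULES.get((region, sold_to))

order = {"region": "EMEA", "sold_to": "SOLD-100"}
ship_to = derive_ship_to(order["region"], order["sold_to"])
if ship_to:
    print(f"Auto-created ship-to {ship_to} — no manual interaction needed")
else:
    print("No rule matched; route to data steward for manual review")
```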
We don't need manual interaction for that. Without that structured framework, AI becomes just a bolt-on tool — we don't want that, so we need to have a structured approach. With it, AI becomes transformational and helps to build up the enterprise, and the data can be set right.
Unlike periodic checks of the quality reports by the users, an AI model can be trained to monitor continuously and to continuously evolve with the data. It can also assess the data and remediate automatically as needed.
How do we do the real-time monitoring and prevention, right? Given a business category like the asset maintenance process, real-time monitoring represents one of the most significant capabilities in asset management applications — things such as stream analysis, adaptive thresholds, and pattern correlation. In EAM, the object types and the maintenance plan can drive a lot; based on the object type and the maintenance plan, insights about the asset objects can be collected and correlated to identify the pattern. With this, incidents can be caught before they escalate, right? For example, a sudden spike of rejected purchase orders might be flagged in real time by tracing it to a faulty vendor master. This can then be corrected — and if you do not correct it on time, it's going to flow into the financial process. AI-based solutions can help alert and prevent by assessing the business impacts, prioritizing the risk, and escalating dynamically with trend analysis and seasonal patterns. Predictive issue identification helps to correlate the analysis and identify the scenarios, and the respective capabilities can be achieved.
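To illustrate the adaptive-threshold idea — flagging a sudden spike in rejected purchase orders and pointing back at the vendor master — here is a minimal rolling-statistics sketch; all numbers are made up:

```python
# Minimal sketch: adaptive-threshold monitoring of rejected purchase orders.
# A spike beyond 3 standard deviations of the trailing window raises an alert
# that points back at the vendor master. All numbers here are made up.
import statistics
from collections import deque

WINDOW = 12   # trailing observations used to adapt the threshold

history = deque(maxlen=WINDOW)

def check(rejected_count: int, vendor: str) -> None:
    if len(history) >= 5:                        # need a little history first
        mean = statistics.mean(history)
        std = statistics.pstdev(history) or 1.0  # avoid divide-by-zero
        z = (rejected_count - mean) / std
        if z > 3:
            print(f"ALERT: PO rejection spike (z={z:.1f}) — inspect vendor {vendor}")
    history.append(rejected_count)

for count in [2, 3, 2, 4, 3, 2, 3, 19]:          # last reading is a spike
    check(count, vendor="V2044")
```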
Now, we talked about the benefits of proactive monitoring. Next, let's see how the outcomes help — how AI enhances master data outcomes. AI capabilities can accelerate root cause analysis by providing intelligent investigation support, and automated correlation analysis and pattern recognition can be forward-looking.
How do we accelerate the root cause analysis, right? AI accelerates root cause analysis by automating the lineage tracing, which shows where the error occurred, where it started, and how the data flowed. NLP interfaces allow the data steward to ask, "Why did this error occur?" This lets the data steward identify and analyze the failure through the failure curve and predictive analysis. Based on the analysis, the data steward can remediate the data and replicate it as needed. This can be good training for the data model as well. Multidimensional correlation identifies cross-system impacts — such as identifying the impacts where the magnitude is very high — which helps to prioritize the investigation in root cause analysis.
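A minimal sketch of that automated lineage tracing: walk a hypothetical lineage graph upstream from the failing system to find where the bad value entered the landscape. The graph and the bad-value check are invented; a real system would read lineage metadata from the landscape:

```python
# Minimal sketch: trace a failing record upstream through a lineage graph to
# find where the bad value was introduced. The graph and check are
# hypothetical; a real system would read lineage metadata from the landscape.
LINEAGE = {  # system -> upstream sources
    "PerformanceMgmt": ["SAP_ECC"],
    "SAP_ECC": ["MDG"],
    "MDG": [],
}

BAD_VALUE_PRESENT = {"PerformanceMgmt": True, "SAP_ECC": True, "MDG": False}

def find_error_origin(system: str) -> str:
    """Walk upstream while the bad value is still present; the last such
    system is where the error was introduced."""
    origin = system
    for upstream in LINEAGE.get(system, []):
        if BAD_VALUE_PRESENT.get(upstream):
            origin = find_error_origin(upstream)
    return origin

print(f"Error entered the landscape at: {find_error_origin('PerformanceMgmt')}")
```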
The result of this analysis is not just time — it reduces resolution from days to hours. Resolution becomes smarter by enhancing the detection and analysis, as you all know, right? You detect early so that the analysis can be done early, and then the capability to resolve the incident can be achieved. Many data-related incidents can be addressed without human intervention. This can free up 50% of the data stewards' time so that they can focus on innovating data solutions rather than wasting time correcting data.
How does the intelligent automation of resolution happen? When we say resolution becomes smarter, within the intelligent automation process there are certain processes, such as autonomous data correction, that need to be established. For example, the system may automatically correct a misclassified material code; this can be notified to the approvers while ensuring that the compliance logs are updated.
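Here is a minimal sketch of that autonomous-correction flow — reclassify the material code, notify the approver, and append a compliance log entry; the classifier is a stub standing in for the trained ML model, and the payload shapes are illustrative:

```python
# Minimal sketch: autonomous correction of a misclassified material code with
# approver notification and a compliance log entry. The classifier is a stub
# standing in for the trained ML model; payload shapes are illustrative.
from datetime import datetime, timezone

compliance_log: list[dict] = []

def classify_material(description: str) -> str:
    """Stub classifier; in practice this is the trained ML model."""
    return "PIPE_FITTING" if "elbow" in description.lower() else "UNKNOWN"

def auto_correct(material: dict) -> dict:
    predicted = classify_material(material["description"])
    if predicted != "UNKNOWN" and predicted != material["material_code"]:
        old = material["material_code"]
        material["material_code"] = predicted
        compliance_log.append({
            "material": material["id"], "old": old, "new": predicted,
            "ts": datetime.now(timezone.utc).isoformat(), "notified": "approver",
        })
    return material

record = {"id": "MAT-77", "description": "90-degree elbow, steel",
          "material_code": "RAW_STEEL"}
print(auto_correct(record))
print(compliance_log)
```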
Predictive strategies based on the historical impacts will help with resource optimization — refining the algorithms, predicting timelines, and modeling the estimated resolutions.
Key benefits include routing the workflow dynamically and notifying the stakeholders to support compliance and audit integration. With the 2025 release and beyond, SAP is investing very heavily in AI transformations and forward-looking AI solutions.
SAP is simplifying the architecture, as you all know, by introducing AI and ML capabilities in areas such as SAP HANA in-memory computing, where ML libraries and real-time replication are available, and in SAP MDG solutions, where workflow enhancements are happening and semantic improvements are added, including analytics. Predictive analytics reporting has been built, okay, but in today's world a predictive analytics dashboard cannot be done for data quality reports out of the box — it has to be built manually, and it has to be built outside of MDG.
By introducing Datasphere into the ecosystem, semantic models and cross-system integration can be easily achieved. As we all know, things are moving towards the cloud in the majority of cases, and its services and security can be very helpful here. This ecosystem — SAP HANA, MDG, Datasphere, and the cloud — enables AI-driven cognitive MDM to scale to enterprise value.
Speaking of the functionality and benefits of AI and ML thus far — now, how can a business or a customer benefit out of this, right? How can we measure the success of bringing in such tools? It's not just that you implement the tool; the business or the customer has to benefit from it, right? How do we measure success based on implementation experience in several industries? In the metrics I have captured below are the traditional data quality baseline, the data quality improvements, and the unique value proposition of the cognitive capabilities.
Here are some of the KPIs from the implementations. 75 percent of the cleansing effort can be reduced over the period while achieving 95 percent clean data, okay? When the data model matures, 90 percent classification accuracy can be achieved through the ML algorithms, right? Achieving such high-accuracy data, as you all know, is critical for any data product or data application.
60 percent of the time can be saved by implementing cognitive MDM solutions. How is this reduced? Where a user would otherwise take a manual approach — finding an anomaly, running a predictive analysis, all those things — the system can do it in 60 percent less time. And 80 percent of incidents can be prevented well in advance through predictive analysis. These are some of the key things which I have learned from implementing such solutions.
Again, AI MDM is not static, right? It has to keep evolving. When the industry grows or the compliance landscape grows, it has to grow appropriately.
How do we address the future implications? Looking ahead, there are some areas we need to focus on when implementing such things, right? Autonomous governance solutions have to be implemented. An autonomous governance solution can mean self-correcting data — with quantum computing integration, the data automatically corrects itself or creates an incident as and when needed, without manual intervention.
Industry-specific implementations can be a great help, as we all know — such as healthcare, finance, retail, or oil and gas. This can help preventive or reactive processes.
Implementing this can be heavy, as we all know, right? It's not something small. And it needs to be supported by regulators through transparency and ethics, and global standards will need to be adopted. It can't just live on its own or be individually managed; a global standard — a data standard — has to be established before you propose the change.
I would suggest, before you propose any of this change, make sure that you bring organizational alignment, as this is key when you implement any MDM solution, right? Because you need to define new rules and new structures, and culturally people have to adapt.
With this, we can take a look into real-world examples of how these results transpired and how things evolved from one stage to another — the proof points, right?
With an oil and gas global retailer where I have implemented AI and ML solutions, 60 percent less manual effort is now being spent. This was implemented in the very recent past, and right now the data stewards are not spending much time on data correction. This not only reduced the data correction effort, it also improved supply chain efficiency by 15 percent — and in turn, that means saving dollar value for you, right?
Some examples from the healthcare industry, which is very critical: the accuracy of the data correction needs to be very stringent; being off by even 0.1 or 0.2 percent cannot be tolerated. Achieving this through a manual or traditional MDM is not easy, right, because it's a manually intensive, reactive process. By implementing an AI/ML solution, 95 percent or more of patient records have been improved, and compliance has been very happy with these solutions as well. Then in the financial sector — like the banking sector, right?
70 percent of the customer onboarding time has been cut. Okay — earlier, they had to start from Salesforce all the way through the lead, to the pursuit, to creating the customer, and with the vetting process and all those steps, it takes a long duration. By implementing AI/ML MDM solutions, that time is reduced, and the process can happen within a few minutes. This helped a revenue gain of about $5 million, which I have seen very recently.
These use cases demonstrate AI in MDM. It's not just theory, right? As I was explaining, it delivers tangible outcomes, as you all know. Now, since we talked about AI and ML advancements on the SAP platform for conversational governance, right — such adoption, along with quantum-inspired optimization, can be a helpful way forward. Organizations that can seamlessly integrate artificial intelligence into their fundamental business processes create adaptive and intelligent business processes and operations, respond to incidents immediately, and address these challenges while continuously improving performance.
In today's world, creating a data incident does not just take time; it impacts the cost, and it disrupts the other processes as well. Traditional governance cannot keep up with this. The way forward is an AI-driven MDM solution that makes governance predictive, proactive, and adaptive.
Real-world results show that implementing AI-driven MDM delivers cost savings, compliance improvements, and agility gains. The integration of AI in MDM brings a transformation opportunity for organizations that seek to enhance their incident management capabilities, so that incident management can be immediate and very quick — they don't need to wait for a reactive process to happen. Based on predictive analysis or time series analysis, incidents can be created immediately whenever the system understands that there is a process issue or a data issue.
This clearly shows a paradigm shift: data evolves from a liability to a strategic advantage. With that, I would like to thank you for spending your time with me on this AI MDM journey. Thank you, everyone.