KINGLAND.COM
TOPIC
Introduction to Data Management Maturity Models
PRESENTED TO:
Webinar
July 28, 2016
Discover the Confidence of Knowing.
Today’s Agenda
Agenda Topics
Review of the key points from the first Webinar
Overview of Capability Maturity Models
Discussion of Data Management Maturity (DMM) Model
Discussion of Data Management Capability Assessment Model (DCAM)
Model Usage Considerations
Data Management
Data Management Maturity: Defined
Data Management
• The business functions that develop data and/or execute the plans, policies, practices, and projects that control, protect, deliver, and enhance the value of data.
Data Management Maturity
• The ability of an organization to precisely define, easily integrate, protect, effectively retrieve, and deliver data that is fit for purpose for both internal applications and external purposes.
Metadata is data too, and must be proactively managed.
Current State of Data Management Maturity
Data Management Maturity is relatively new, and without it, quality is generally poor
• Virtually no formal measures of data management maturity, though some measures of data management program implementation
  • No more than ~33% of organizations have an active, formal data management program at some level of implementation [1]
  • Nearly 50% of existing formal data management programs are 1 year old or less [1]
• Data quality measures, as a proxy for mature data management activities, indicate a strong need for improvement
  • Measured data quality indicates that ~25-30 percent of organizations have data quality issues [2]
  • The number of companies reporting data quality issues is increasing [2]
• Business demand and regulatory pressures are driving recognition that data management is a business issue and needs to be improved under formalized programs
  • Business demand for Master Data Management, Data Science, and Predictive Analytics requires foundational improvement in the proactive management of data from origination through the entire data flow and lifecycle
  • Industry regulations are requiring certain data governance and oversight capabilities
  • Surveys show measured improvements in the ability to reduce risk, increase business agility, and increase revenue through formalizing a data management program [3]
1. EDM Council, “Data Management Industry Benchmark Report”, 2015; Financial Information Management Report, “Modernizing Data Quality & Governance”, 2016
2. Experian, “The Data Quality Benchmark Report”, 2015; Blazent, “The State of Enterprise Data Quality”, 2016
Mature Data Management Program Success Matrix
• Data Management Strategy + Governance Structure + Data Quality Strategy + Operational Control Environment + Funded Implementation = Data Fit for Purpose
• Remove any one building block and the outcome degrades:
  • Missing Data Quality Strategy → Confusion
  • Missing Operational Control Environment → Dissatisfaction
  • Missing Governance Structure → Exasperation
  • Missing Funded Implementation → Frustration
  • Missing Data Management Strategy → Inconsistency
Capability and Maturity Models
Capability and Maturity Models – what are these things?
• Designed on the premise that the quality of a system or product is highly influenced by the quality of the process used to develop and maintain it
• A compendium of objective statements of activities designed to provide guidance for organizations to progress along a measured path of improvements for a particular set of business activities
• Typically ~5 levels of increasing capability or maturity
• Developed over a period of time, leveraging subject matter experts with a range of experience
• Designed to be universally applicable for any type or size of organization
• Define the what, not the how
“All Models are Wrong, But Some are Useful” (George Box, Statistician, 1978)
Subject of a paper written for a statistics workshop, arguing that the ‘real world’ “cannot be exactly represented in a model”, but that models can still be “illuminating and useful”
• As true for Capability and Maturity Models as it is for statistical models
• Capability Maturity Models have been in use since the early 1990s
  • First CMM commercially developed by Carnegie Mellon University through DoD funding, related to software engineering
  • CMMI Model currently used globally by thousands of organizations of all types and sizes
• Organizational applicability
  • Requires detailed understanding of the expectations articulated in the models
  • Requires understanding of the goals and rationale of the activities
  • Ability to interpret the models to the specific culture and needs of the organization
  • Content is presented in a topical structure, not an operational or implementation sequence
How Models are Used
• Capability versus Maturity
  • Capability: the validated achievement of performing individual functions
  • Maturity: a defined level of relative collective capabilities within a specific domain of work, and the degree of optimization of those capabilities
• Useful for benchmarking
  • Objective measurements of achievement provide measurements of organizational capabilities or maturity
  • Useful for tracking progress against improvement objectives
  • Useful for comparison against peers
• Different levels of assessment
  • Affirmation/sentiment-based assessment: “I believe we do that.” Useful for initial benchmarking and gap analysis
  • Evidence-based assessment: objective, third-party evaluation of direct evidence of the execution of each activity statement in the model. Required for formal reporting and benchmarking against peers
Measuring Data Management Maturity
DCAM™
• Released by the Enterprise Data Management (EDM) Council in 2015
• Designed to guide organizations to a mature data management program
DMM℠
• Released by CMMI Institute in 2014
• Designed to encompass all facets of data management
Kingland is the only firm currently certified to consult on both models
Data Management Maturity (DMM℠) Model
DMM Model History
• March 2009: EDM Council and Kingland Systems pitch the concept to SEI (developer and steward of CMM/CMMI at the time)
• Sep 2010: EDM Council initial working group formed for content development
• Feb 2012: Content turned over to SEI (now CMMI Institute) for transition into an objective model
• Feb 2013: Initial model completed and pilot engagements initiated (Microsoft engaged in the 1st pilot)
• 2013 – 2014: Model underwent 3 additional major revisions and peer review; pilot engagements continued
• August 2014: V1.0 released
Data Management Maturity (DMM℠) Model
Guidance for the complete data management continuum
• Data Management Strategy: Data Management Strategy (DMS), Communications (COM), Data Management Function (DMF), Business Case (BC), Program Funding (PF)
• Data Governance: Governance Management (GM), Business Glossary (BG), Metadata Management (MM)
• Data Quality: Data Quality Strategy (DQS), Data Profiling (DP), Data Quality Assessment (DQA), Data Cleansing (DQ)
• Data Operations: Data Requirements Definition (DRD), Data Lifecycle Management (DLM), Provider Management (PM)
• Platform & Architecture: Architectural Approach (AA), Architectural Standards (AS), Data Integration (DI), Data Management Platform (DMP), Historical Data, Retention and Archiving
• Supporting Processes: Measurement and Analysis (MA), Process Management (PRCM), Process Quality Assurance (PQA), Risk Management (RM), Configuration Management (CM)
• Over 400 functional statements of practice
• Focuses on the ‘state of activities’ vs. state of the art
• Infrastructure support practices for organizational instantiation
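For readers who want to work with this structure programmatically, a minimal sketch follows that captures the categories and process areas listed above as a plain Python dictionary. The grouping mirrors the slide; the dictionary itself is illustrative and is not an artifact published with the DMM.

```python
# Sketch: the DMM categories and process areas listed above, captured as a
# simple dictionary for reference or tooling (e.g., building an assessment
# checklist). Names and abbreviations are taken from the slide.
DMM_CATEGORIES = {
    "Data Management Strategy": [
        "Data Management Strategy (DMS)", "Communications (COM)",
        "Data Management Function (DMF)", "Business Case (BC)", "Program Funding (PF)",
    ],
    "Data Governance": [
        "Governance Management (GM)", "Business Glossary (BG)", "Metadata Management (MM)",
    ],
    "Data Quality": [
        "Data Quality Strategy (DQS)", "Data Profiling (DP)",
        "Data Quality Assessment (DQA)", "Data Cleansing (DQ)",
    ],
    "Data Operations": [
        "Data Requirements Definition (DRD)", "Data Lifecycle Management (DLM)",
        "Provider Management (PM)",
    ],
    "Platform & Architecture": [
        "Architectural Approach (AA)", "Architectural Standards (AS)",
        "Data Integration (DI)", "Data Management Platform (DMP)",
        "Historical Data, Retention and Archiving",
    ],
    "Supporting Processes": [
        "Measurement and Analysis (MA)", "Process Management (PRCM)",
        "Process Quality Assurance (PQA)", "Risk Management (RM)",
        "Configuration Management (CM)",
    ],
}

# Example: count process areas per category.
for category, process_areas in DMM_CATEGORIES.items():
    print(f"{category}: {len(process_areas)} process areas")
```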
DMM Model Process Area Construct
DMM Levels
Designed to provide guidance for, and the ability to measure, increased data management maturity across all aspects of data management
1. Activities are informal and ad hoc, dependent on heroic efforts and lots of cleansing
2. Activities are deliberate, documented, and performed consistently at the business-unit level
3. DM practices are aligned with strategic organizational goals and standardized across all areas
4. DM practices are managed and governed through quantitative measures of process performance
5. DM processes are regularly improved and optimized based on changing organizational goals; the organization is seen as a leader in data management
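As a quick reference, the five-level ladder can also be expressed as a lookup table. The wording below is paraphrased from the descriptions above, and the helper function is illustrative only.

```python
# Sketch: the five DMM maturity levels as characterized on this slide.
DMM_LEVELS = {
    1: "Informal and ad hoc; dependent on heroic efforts and lots of cleansing",
    2: "Deliberate, documented, and performed consistently at the business-unit level",
    3: "Aligned with strategic organizational goals and standardized across all areas",
    4: "Managed and governed through quantitative measures of process performance",
    5: "Regularly improved and optimized based on changing organizational goals",
}


def describe_level(level: int) -> str:
    """Return the slide's characterization of a DMM level (1-5)."""
    if level not in DMM_LEVELS:
        raise ValueError(f"DMM levels run from 1 to 5, got {level}")
    return DMM_LEVELS[level]


print(describe_level(3))
```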
Functional Practices
Functional Practice Statements
• Statements designed specifically to describe functional capabilities within the topical subject of the Process Area (PA)
• Each practice is presented as a Practice Statement with Elaboration Text (example shown from the Data Integration Process Area)
• Higher-level functional statements build on lower-level practice expectations
• Level 3 functional statements were designed as the minimum target state
Infrastructure Support Practices (ISPs)
• Activities designed to enable and sustain the manifestation of the process area activities into the culture across the organization
• Part of the control ecosystem
• Every ISP is expected as part of every Process Area at the designated levels (examples shown for Level 2 and Level 3)
DMM Capability and Maturity Requirements
Capability Measures
• Scored by Process Area (PA)
• All capability statements within a PA up through a particular level
• Example: Capability Level 3 in the Data Profiling Process Area requires performance of all Level 1, Level 2, and Level 3 practice statements in the PA
Maturity Measures
• Scored by Process Area (PA), by category, or for the whole model
• All capability statements within a PA up through a particular level, plus full implementation of all ISPs for the appropriate level (illustrated in the sketch below)
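To make the scoring rules concrete, here is a minimal sketch under the rules stated above: capability for a Process Area is the highest level at which that level and every lower level of practice statements are met, and maturity additionally requires the ISPs at each level. The data structures and field names are hypothetical illustrations, not structures defined by the DMM.

```python
# Minimal sketch of the DMM scoring rules described above. The capability rule
# follows the slide; treating maturity as "capability plus the ISPs implemented
# at each level" is a reading of the slide, not official DMM mechanics.
from dataclasses import dataclass, field


@dataclass
class ProcessAreaAssessment:
    name: str
    # practices_met[level] -> True if ALL functional practice statements at
    # that level were evidenced for this process area.
    practices_met: dict[int, bool] = field(default_factory=dict)
    # isps_met[level] -> True if ALL infrastructure support practices expected
    # at that level were evidenced.
    isps_met: dict[int, bool] = field(default_factory=dict)


def capability_level(pa: ProcessAreaAssessment, max_level: int = 5) -> int:
    """Highest level L such that the practices at every level 1..L are met."""
    level = 0
    for lvl in range(1, max_level + 1):
        if pa.practices_met.get(lvl, False):
            level = lvl
        else:
            break
    return level


def maturity_level(pa: ProcessAreaAssessment, max_level: int = 5) -> int:
    """Capability, additionally gated by the ISPs implemented at each level."""
    level = 0
    for lvl in range(1, max_level + 1):
        practices_ok = pa.practices_met.get(lvl, False)
        isps_ok = pa.isps_met.get(lvl, True)  # no ISP expectation recorded -> not limiting
        if practices_ok and isps_ok:
            level = lvl
        else:
            break
    return level


# Example mirroring the slide: Data Profiling meets all Level 1-3 practices,
# so its capability level is 3; Level 3 ISPs are not yet fully implemented,
# so its maturity level stays at 2.
data_profiling = ProcessAreaAssessment(
    name="Data Profiling (DP)",
    practices_met={1: True, 2: True, 3: True, 4: False},
    isps_met={2: True, 3: False},
)
print(capability_level(data_profiling))  # 3
print(maturity_level(data_profiling))    # 2
```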
Data Management Capability Assessment Model (DCAM™)
DCAM History
• March 2009: Origin, with the pitch for a maturity model to SEI (developer and steward of CMM/CMMI at the time)
• Sep 2010: EDM Council initial working group formed for content development
• Feb 2012: Content turned over to SEI (now CMMI Institute) for transition into the DMM Model
• Jan 2014: Work initiated by the EDM Council on DCAM, out of a desire for a different type of model
• 2014 – 2015: Model underwent 3 major revisions and peer review; pilot engagements with banks
• July 2015: V1.1 released
Data Management Capability Assessment Model (DCAM™)
Guidance for a data management program
• Focused on capabilities to establish, enable, and sustain a mature data management program
• 37 prescribed capabilities with 115 sub-capabilities
• Measurement criteria leading to an optimized program
DCAM Component Construct
Capabilities, Sub-capabilities and Capability Objectives
Capability Statements
• Affirmatively worded statement of the state of something that should exist
Sub-capability Statements
• Singularly focused statement of something that must be accomplished or in place in order to achieve the parent capability statement
• Includes amplifying narrative and capability objectives
• Accomplishment is measured based on sub-capabilities
(Example shown from Data Management Strategy: a capability statement and its sub-capabilities)
DCAM Implementation Levels
Designed to provide guidance for, and to measure, the journey towards implementation of a control environment supporting data management
1. Not Initiated: things happen (sometimes); no defined process or controls
2. Controls Conceptualized: awareness of needs; concepts and conversations about how
3. Controls in Development: a strategy to develop processes and controls is underway, with documentation started
4. Controls Validated: stakeholders have validated the documented guidance
5. Controls Implemented: the strategy, processes, and controls for the governance program are in place and being followed
6. Controls Enhanced: deliberate changes are occurring to enhance the program
Initial target
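The six implementation levels can be treated as an ordered scale. The sketch below expresses them as an enumeration so that assessed levels can be compared against a target; the numbering follows the order above, and the example target is a placeholder rather than DCAM guidance.

```python
# Sketch: the six DCAM implementation levels above as an ordered enumeration,
# with a trivial helper for comparing an assessed level to a chosen target.
from enum import IntEnum


class DcamLevel(IntEnum):
    NOT_INITIATED = 1
    CONTROLS_CONCEPTUALIZED = 2
    CONTROLS_IN_DEVELOPMENT = 3
    CONTROLS_VALIDATED = 4
    CONTROLS_IMPLEMENTED = 5
    CONTROLS_ENHANCED = 6


def meets_target(assessed: DcamLevel, target: DcamLevel) -> bool:
    """True if the assessed implementation level is at or above the target."""
    return assessed >= target


# Example target is a placeholder, not DCAM guidance.
print(meets_target(DcamLevel.CONTROLS_VALIDATED, DcamLevel.CONTROLS_IMPLEMENTED))  # False
```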
DCAM Capability Measures
• Scored at Sub-capability level
• Roll-up to capability and component levels
• Each Sub-capability has defined criteria for each level
• Not all are scored to level 6 (Enhanced)
Examples from Business Case and Data Governance components
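A small sketch of how sub-capability scores might roll up, following the bullets above: scoring happens at the sub-capability level and rolls up to capability and component. The aggregation rule used here (take the minimum, i.e. the weakest link) and all names are assumptions for illustration; the slide does not specify how DCAM roll-ups are calculated.

```python
# Sketch: rolling sub-capability scores (1-6) up to capability and component
# level. DCAM scores at the sub-capability level; the min() aggregation below
# is an assumption for illustration, and the capability/sub-capability names
# are placeholders, not DCAM text.
component = {
    "Data Management Business Case": {        # capability (placeholder name)
        "Business case is documented": 5,      # sub-capability scores, 1-6
        "Stakeholders have approved funding": 4,
    },
    "Data Management Program": {
        "Program roadmap is maintained": 3,
        "Program communications are routine": 4,
    },
}


def capability_score(sub_scores: dict[str, int]) -> int:
    """Roll sub-capability scores up to a single capability score (assumed: min)."""
    return min(sub_scores.values())


def component_score(capabilities: dict[str, dict[str, int]]) -> int:
    """Roll capability scores up to a component score (assumed: min)."""
    return min(capability_score(subs) for subs in capabilities.values())


for name, subs in component.items():
    print(f"{name}: capability score {capability_score(subs)}")
print(f"Component score: {component_score(component)}")
```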
Model Usage Considerations
DMM Model v DCAM
DMM Model
• Designed to provide detailed guidance via a ladder of increasing capabilities across all activities
• Specific activity guidance for all aspects impacting data management, including the data management program
• Measures the level of capabilities across the organization
DCAM
• Designed to measure progress towards full implementation of a data management program
• Focus on program development and implementation
• Measures the level of program implementation
Both models address expectations for data governance and stewardship, but have substantial differences
Both models support use as a means to measure current state and to make objective measurements of progress against the content guidance contained in the respective models
Scoping and use of the Models
Data Management Cycle
The work defined by the top components is intended to drive the activities performed by the bottom components
The guidance and controls from the data management program should inform and influence all the day-to-day activities of data management
Key Considerations About the Models
• Both models help clarify the roles of stakeholders and reinforce collaboration between business and IT through shared understanding
• Both models provide guidance on the necessary components of data governance and a data management program
• “Which model should I use?”
  • Not an easy, binary decision; considerations include:
    • Current state
    • Primary organizational driver
    • Intended use for the model chosen
    • Level of organizational buy-in and support
    • Ease of accepting change
    • Organizational size and complexity
    • Operational expertise related to all things ‘data management’
    • Types of data domains (DCAM written predominantly for financial services)
  • The ‘three bears’ soup’ problem: DCAM is 55 pages, DMM is 230 pages
  • Both require training and expertise to fully understand and apply to be ‘just right’
• DCAM
  • Focused on measuring progress towards implementation of a program
  • Solely interested in the program content and implementation
  • EDM Council membership
• DMM
  • Evaluates whether specific organizational capabilities are being performed
  • Program expectations are interspersed throughout the model, injected into certain operational expectations
How the Models are Being Used
• Training
  • Identifying necessary participants in the organization
  • Education on model expectations
  • Establishing shared understanding and vision
• Workshops
  • Same as Training, plus…
  • Focused discussions on content within the organizational context
  • Affirmation-based baseline and gap analysis for a clear path forward
• Assessments
  • Program scope validation
  • Affirmation-based for an indicative gap assessment
  • Evidence-based assessment for an unambiguous risk posture against expected capabilities
  • Identified strengths and weaknesses
  • Formalized benchmark for peer comparison (if evidence-based) or improvement initiatives
• Self-directed
  • Acquire and read the model
  • Self-assess gap analysis
  • Initiate improvement plans
Next Webinar (last in the series)
• Deeper dive into scoping your use of the models
  • Which model and what type of use
• Case study discussions of different organizations’ use of the models
  • Large enterprise B2B example
  • Mid-sized financial industry example
  • Small, focused data repository example
• Discussion of specific values achieved
KINGLAND.COM
For more information on data governance and maturity: http://www.Kingland.com/data-maturity-overview
jeff.gorball@kingland.com
Kingland Systems. Discover the Confidence of Knowing.
INDUSTRY SOLUTIONS
Kingland has been delivering industry-specific solutions to leading global enterprises for more than 23 years.
SOLUTION PLATFORM
The Kingland Strategic Solution Platform means continuously smarter technology to deliver today and into the future.
EXPERT SERVICES
Kingland brings deep data and software expertise to every solution, helping you realize benefits swiftly — and with less risk.
Our clients know that Kingland Systems delivers faster, smarter, more reliable solutions.