
Management for Professionals

Prakash Sah

Defining Enterprise Data and Analytics Strategy
Pragmatic Guidance on Defining Strategy Based on Successful Digital Transformation Experience of Multiple Fortune 500 and Other Global Companies
Management for Professionals
The Springer series Management for Professionals comprises high-level business and management books for executives. The authors are experienced business professionals and renowned professors who combine scientific background, best practice, and entrepreneurial vision to provide powerful insights into how to achieve business excellence.
Prakash Sah
Tata Consultancy Services
Thane, Maharashtra, India

ISSN 2192-8096 ISSN 2192-810X (electronic)


Management for Professionals
ISBN 978-981-19-5718-5 ISBN 978-981-19-5719-2 (eBook)
https://doi.org/10.1007/978-981-19-5719-2

© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature
Singapore Pte Ltd. 2022
This work is subject to copyright. All rights are solely and exclusively licensed by the Publisher, whether
the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse
of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and
transmission or information storage and retrieval, electronic adaptation, computer software, or by similar
or dissimilar methodology now known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication
does not imply, even in the absence of a specific statement, that such names are exempt from the relevant
protective laws and regulations and therefore free for general use.
The publisher, the authors, and the editors are safe to assume that the advice and information in this book
are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or
the editors give a warranty, expressed or implied, with respect to the material contained herein or for any
errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional
claims in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Singapore Pte Ltd.
The registered company address is: 152 Beach Road, #21-01/04 Gateway East, Singapore 189721,
Singapore
To my father, who left for his heavenly abode
five decades ago, but whose life story
continues to inspire me even today
Preface

Why Did I Write This Book?

While advising various Fortune 500 companies across the globe on their data and
analytics strategy, I have often searched for a good book or a toolkit that I could
share with my client executives. Even while coaching consultants in my practice
team or mentoring B-school/engineering college students who want to build a career
in data and analytics, I wanted to refer them to a book that could help them
understand the real-world challenges in data and analytics and guide them
on practical approaches to wade through those challenges.
However, I could not find any book in the market that I could endorse. I did find
many good books on specific topics of data and analytics, such as technology
and architecture, artificial intelligence/machine learning techniques, data
governance, and so on. However, I could not find any book that could guide an
enterprise or a data and analytics leader on a practical approach to defining data
and analytics strategy. I wrote this book to address this white space. I believe
the book will serve as a good reference for senior executives of large enterprises,
data and analytics leaders (Chief Data Officer, Chief Analytics Officer, Chief Data
and Analytics Officer, Vice President—Data and Analytics, Director—Data and
Analytics, or whatever other title the leader goes by), B-school students,
and engineering college students specializing in data and analytics.

How Did I Write This Book?

Over the last three decades of professional experience, I have been lucky to have
interacted with business leaders of hundreds of large and complex enterprises. This
helped me get deep insight into their decision-making process, analytical needs,
and challenges faced. I have been advising many of these leaders on their data
and analytics strategy. I have seen many data and analytics initiatives fail even
though many enterprises went for best-in-class technologies and hired highly
qualified teams to run the program. The success rate of data and analytics programs
is estimated to be less than 50%. Because of the failure of the majority of data and
analytics initiatives, concerns are often raised around the return on investment (ROI)
from data and analytics programs. I wrote this book purely based on my experience,
deriving learnings from both the successes and failures of enterprises in defining and
implementing data and analytics strategy.
Acknowledgements

Writing a book such as this one involves many different people. It is difficult to
recall each one of them and the influence that they had on my thinking process.
Even though I can recall many names, it is almost impossible to thank each one
of them personally, since the list would be extremely long. Hence, instead of
acknowledging individual persons and organizations, I would like to acknowledge
the following groups of people and a few specific names to whom I am deeply
indebted for their contribution and support.
First, I want to thank the leaders of hundreds of large enterprises with whom I
have been interacting over the years. Each interaction gave me new insights into
the complexities and challenges of large and global enterprises.
Second, I want to thank colleagues at my past employers as well as my current
employer, Tata Consultancy Services (TCS). Especially at TCS, I have had the
privilege to work with some of the best consultants in data and analytics. I also
got opportunities to advise multiple global clients, which deeply enriched my
knowledge in the field of data and analytics.
Third, I want to thank various analysts from organizations such as Gartner,
Forrester, and International Data Corporation (IDC). My interactions with them
have been in the form of both discussions in meetings/one-to-one sessions and
listening to them in conferences/events. Each interaction has always been very
enlightening because of the wide variety of experience and research expertise that
they bring in.
Fourth, I want to thank the professors and students of all the B-schools and
engineering colleges with whom I interacted, be it for delivering a lecture as
a guest faculty or for delivering talks in conferences/events. The discussions and
questions asked during those interactions have always triggered a fundamental
thinking process in me, leading me to question some of my assumptions.
Fifth, I want to thank two professors who spared their valuable time to patiently
hear my book idea and validate the same. They are Prof. Sanjiv Vaidya, who retired
from Management Information Systems Group at IIM Calcutta, India, and Prof.
Indranil Bose, Professor in Information Systems at IIM Ahmedabad, India.
Sixth, I want to thank the entire team of Springer Nature, including Nupoor
Singh, Samrat Chatterjee, Lokeshwaran M, and Fermine Shaly, for their guidance
and support in publishing this book.


Seventh and last, but not least, I thank my family without whose inspiration
and support I could not have written this book. My mother, Bimla Devi, and
sister, Prema, always motivated me to realize my potential. My wife, Priti, put
up with months of my writing and lost weekends. My daughters, Anjali and
Anisha, supported me in this venture and regularly enquired about the progress
I was making. Anjali, who is a computer science graduate and a data and analytics
enthusiast, painstakingly reviewed the initial draft of the book and provided very
candid feedback that proved to be extremely valuable in making the book easy to
understand.
Contents

1 What Is Data and Analytics Strategy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1 Data and Analytics Strategy and Its Criticality to Drive
Enterprise Digital Initiatives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 Data and Analytics Strategy: A Case in Point . . . . . . . . . . . . . . . . . . . . 2
1.3 Five Elements of Data and Analytics Strategy . . . . . . . . . . . . . . . . . . . . 4
1.3.1 Business Capabilities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
1.3.2 Technology and Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
1.3.3 Team, Processes, and Governance . . . . . . . . . . . . . . . . . . . . . . . 6
1.3.4 Organizational Change Management . . . . . . . . . . . . . . . . . . . . . 6
1.3.5 Value Measurement Framework . . . . . . . . . . . . . . . . . . . . . . . . . 6
1.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
2 First Element of Strategy—Business Capabilities . . . . . . . . . . . . . . . . . . . . 9
2.1 Aligning with Organization’s Business Priorities . . . . . . . . . . . . . . . . . 9
2.2 Establishing Enterprise Performance Management
Framework . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
2.2.1 A Brief Historical Perspective on Performance
Measurement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
2.2.2 Key Performance Indicators (KPIs): Lagging
and Leading Indicators . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.2.3 KPI Trees to Drive Enterprise Performance
Management . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.2.4 Challenges of Implementing Enterprise KPI
Framework . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
2.2.5 KPI Framework Defined for Scenario 1
(Organization A) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
2.3 Driving Enterprise Digital Strategy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
2.3.1 Approach for Scenario 2 (Organization B): Digital
Transformation Leveraging Data and Analytics . . . . . . . . . . 19
2.4 Approach for Defining Data and Analytics Strategy, Starting
with Business Capabilities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
2.4.1 Step 1: Enterprise Churning—“Samudra Manthan” . . . . . . 24
2.4.2 Step 2: Defining Required Business Capabilities
and Other Strategy Elements . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28

2.4.3 Step 3: Prioritizing and Creating an Integrated
Roadmap . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
2.5 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
3 Second Element of Strategy—Technology and Architecture . . . . . . . . . 37
3.1 How Not to Define Technology and Architecture Strategy? . . . . . . . 37
3.2 Understanding Non-functional Requirements to Define Data
and Analytics Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
3.2.1 Data Sources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
3.2.2 Mode of Delivery/Access (of Data) . . . . . . . . . . . . . . . . . . . . . . 42
3.2.3 Temporal . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
3.2.4 Data Security . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
3.2.5 Data Type . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
3.2.6 Data Atomicity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
3.2.7 Latency . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
3.2.8 Data Quality and Integrity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
3.2.9 Business Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
3.2.10 Data Usage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
3.2.11 Metadata . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
3.3 Defining Data and Analytics Architecture . . . . . . . . . . . . . . . . . . . . . . . . 49
3.4 Selecting Relevant Technologies After Defining Data
and Analytics Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
3.5 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
4 Third Element of Strategy—Team, Processes, and Governance . . . . . . 57
4.1 Why Data and Analytics Organization and Processes Need
to Be Different from Other IT Functions? . . . . . . . . . . . . . . . . . . . . . . . . 57
4.2 Choosing the Right Data and Analytics Organization Model . . . . . . 58
4.2.1 Decentralized Organization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
4.2.2 Centralized Organization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
4.2.3 Federated Organization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
4.3 Defining Data and Analytics Organization and Processes . . . . . . . . . 62
4.3.1 Governance Tower . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
4.3.2 Business Tower . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
4.3.3 Technology and Architecture Tower . . . . . . . . . . . . . . . . . . . . . 71
4.3.4 Solution Delivery Tower . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
4.3.5 Service Delivery Tower . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
4.4 Week-In-The-Life of Data and Analytics Team . . . . . . . . . . . . . . . . . . . 78
4.5 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80
5 Fourth Element of Strategy—Organizational Change
Management . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
5.1 Need for Change Across the Enterprise . . . . . . . . . . . . . . . . . . . . . . . . . . 81
5.1.1 Till 2010—A Brief History of MIS Era . . . . . . . . . . . . . . . . . . 81
5.1.2 The 2010s—Data Visualization Becomes
All-Pervasive Across Enterprises . . . . . . . . . . . . . . . . . . . . . . . . 82

5.1.3 The Latter Half of 2010s—Advent of Digital
Technologies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 82
5.1.4 Why Organizational Change Management . . . . . . . . . . . . . . . 83
5.2 Driving Change—Key Focus Areas and Objectives . . . . . . . . . . . . . . . 85
5.2.1 Changing Business Environment . . . . . . . . . . . . . . . . . . . . . . . . 85
5.2.2 Four Key Focus Areas . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
5.2.3 Organizational Chaos Theory and Three Key
Objectives of OCM Strategy . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
5.2.4 Inter-relationships Between the Focus Areas
and Key Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
5.3 Driving Change—Twelve Elements of OCM Strategy . . . . . . . . . . . . 91
5.3.1 1A. User Persona Focus . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
5.3.2 1B. Collaboration and Motivation . . . . . . . . . . . . . . . . . . . . . . . 94
5.3.3 1C. Communication . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 95
5.3.4 2A. New Ways of Working . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
5.3.5 2B. Innovation Process . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99
5.3.6 2C. Interaction Model with Different Functions . . . . . . . . . . 101
5.3.7 3A. Training on New Technologies . . . . . . . . . . . . . . . . . . . . . . 105
5.3.8 3B. Exploration of Fit-for-Future Technologies . . . . . . . . . . 107
5.3.9 3C. Institutionalization of New Technologies . . . . . . . . . . . . 108
5.3.10 4A. Data Literacy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109
5.3.11 4B. Data Thinking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112
5.3.12 4C. Data Democratization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117
5.4 Stages of Change and Importance of Change Leadership . . . . . . . . . 120
5.4.1 Stage 1: Prepare and Initiate . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121
5.4.2 Stage 2: Scale-Up . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121
5.4.3 Stage 3: Institutionalize . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121
5.4.4 Importance of Change Leadership . . . . . . . . . . . . . . . . . . . . . . . 122
5.5 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 124
6 Fifth Element of Strategy—Value Measurement Framework . . . . . . . . 127
6.1 The Need for a Value Measurement Framework . . . . . . . . . . . . . . . . . . 127
6.1.1 Data and Analytics Efficiency-Value Matrix (EV
Matrix) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 128
6.2 Defining and Measuring Business Value . . . . . . . . . . . . . . . . . . . . . . . . . 131
6.2.1 First Impact Area: Revenue Increase . . . . . . . . . . . . . . . . . . . . 131
6.2.2 Second Impact Area: Cost Reduction . . . . . . . . . . . . . . . . . . . . 135
6.2.3 Third Impact Area: Business Risk Mitigation . . . . . . . . . . . . 142
6.2.4 Fourth Impact Area: Company’s Image Building . . . . . . . . . 144
6.2.5 Business Value Measurement: Correlation Does Not
Necessarily Mean Causality . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 148
6.3 Defining and Measuring Operational Efficiency—Continuous
Improvement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 149
6.3.1 People Performance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 149
6.3.2 Process Effectiveness . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 150

6.3.3 Technology Capability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 151
6.3.4 Data Maturity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 151
6.3.5 Operational Efficiency and Maturity Assessment . . . . . . . . . 153
6.4 Calculating ROI from Data and Analytics Investment . . . . . . . . . . . . 154
6.4.1 Calculating Benefits . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 154
6.4.2 Calculating Costs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 155
6.4.3 Calculating ROI . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 156
6.5 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 159
7 The Profile of a Data and Analytics Leader . . . . . . . . . . . . . . . . . . . . . . . . . . 161
7.1 Key Skills That Any Enterprise Data and Analytics Leader
Must Possess . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 161
7.2 Hard Skills . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163
7.2.1 Technology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163
7.2.2 Data Science . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163
7.2.3 Business . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 164
7.3 Soft Skills . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 165
7.3.1 Dealing with Ambiguity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 166
7.3.2 Team Leadership . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 167
7.3.3 Innovation and Risk Taking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 169
7.3.4 Organizational Change Management . . . . . . . . . . . . . . . . . . . . . 170
7.3.5 Design Thinking and Empathy . . . . . . . . . . . . . . . . . . . . . . . . . . 172
7.3.6 Marketing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 173
7.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 174
About the Author

Prakash Sah has over three decades of multi-faceted consulting and leadership experience in data and analytics. His experience spans multiple industries and cultures. While working in 13 different countries, he helped CXOs and other leaders of many Fortune 500 companies in defining their enterprise data and analytics vision, strategy, and roadmap.
Prakash has been invited as a speaker/panelist at various international conferences and innovation events. He has also been delivering guest lectures at a few B-schools, and has been mentoring many students and professionals in academia and industry. He has authored many papers on enterprise data and analytics.
Prakash is a mechanical engineer from IIT Kharagpur and an MBA from IIM Calcutta—two premier institutes in India. He is currently working as a Managing Partner at TCS (Tata Consultancy Services), one of the world's leading consulting and IT services companies. Prakash is based out of Thane in the Mumbai metropolitan region, India, where he stays with his wife, Priti, and daughters, Anjali and Anisha.
LinkedIn profile: https://www.linkedin.com/in/prakash-sah-9581964/

1 What Is Data and Analytics Strategy
Key Elements That Should Be Part of the Strategy

1.1 Data and Analytics Strategy and Its Criticality to Drive Enterprise Digital Initiatives

Most enterprises today have either embarked on various digital initiatives or are
planning to invest heavily in digital technologies to gain competitive advantage.
Data forms the core of any digital initiative. Whether an enterprise aspires to
improve customer experience, launch new products, improve productivity, or manage
risks better, having the right data at the right time and in a usable form is an
important first step towards achieving its digital aspiration.
An enterprise must also apply analytics to the data to get insights that would
help it in executing relevant digital initiatives. For example, if the aspiration of
an enterprise is to improve customer experience, it must collect and connect all
the relevant data of the customer—their past buying pattern, their preferences,
their demographic profile, their changing needs, and so on. This interconnected
data and the insights derived from it would help the enterprise come up with
a customized plan (i.e., a plan for each individual customer) for improving customer
experience. The same applies to any other digital aspiration of an enterprise. Since
data and analytics form the core of digital strategy, failure of the data and analytics
initiative also leads to the failure of other digital initiatives of an enterprise.
Despite being so critical, the ground reality is that the success rate of data and
analytics initiatives in enterprises is very low. Various analysts have come up with
different estimates of success rates. While there are some differences in their actual
percentages, all the estimates are consistent on the fact that more than 50% of data
and analytics initiatives fail to achieve the business objectives that were expected
of them.
In my experience, most data and analytics initiatives fail largely due to organizational
complexities, both political and structural, rather than due to technology
or people capabilities. I have discussed these challenges in this book along with
recommendations on how to wade through these complexities. To avoid failure as


well as to derive maximum benefit from data and analytics investments, I strongly
recommend that an enterprise should define data and analytics strategy upfront,
before embarking on other digital initiatives.
When attempting to define data and analytics strategy, most enterprises struggle
to define a strategy that is:

• Aligned with its business strategy.
• Capable of taking care of business dynamics, be it something as simple
as a changing product/service portfolio or something more complex such as a
changing business model or mergers and acquisitions.
• Flexible to adapt to a dynamic and fast-changing technology ecosystem in the data
and analytics space.
• Practical in defining a data and analytics organization model and structure that
can work with a complex business organization structure spanning functions,
business units, and geographies (which is typical for most large and global
enterprises).
• Comprehensive to take care of both global and local business needs.
• Agile to deliver quick business value and drive innovation.
• Secure to enable borderless flow of information and avoid data espionage.

In this book, I have come up with prescriptive guidance on how enterprises
can define a holistic data and analytics strategy that can help them become more
data-driven and gain competitive advantage.

1.2 Data and Analytics Strategy: A Case in Point

Let me share an interesting experience from a large multinational corporation.

I had a meeting with the global head of data and analytics of the company. In the meeting,
he wanted to share his newly formulated data and analytics strategy and seek my suggestions.
He started the discussion by mentioning that he and his team had spent six months
coming up with a strategy and roadmap that would provide immense value to his enterprise.
He then went on to share an overview of the technology/tools that existed in the enterprise,
and a detailed plan on how the technologies could be rationalized to have a minimal and
standardized technology footprint.

He mentioned that because of the large number of data and analytics technologies present in
various functions, business units and geographies, they had to incur a huge cost in licenses
and maintenance. Hence, he wanted to standardize technologies so that this cost could come
down. He presented a three-year roadmap that consisted of multiple technology projects to
standardize the technologies. In the roadmap, there were a few basic business projects as
well, such as sales reporting, order and order backlog analysis, and others.

When he finished sharing his strategy, I was disappointed. My concern was that, in his
entire plan, there was no business project that was focused on building any major business
capability for the enterprise. Further, even his initiative to reduce cost by standardizing
technologies had a very low chance of success. The reason was that business stakeholders,
who were used to working with certain technologies for years, would not agree to learn and
start using new technologies unless they saw additional value or business capabilities
that these new technologies would give them. And, if they refused to adopt the
new technologies, the data and analytics head would end up maintaining both the old
and new technologies. That would increase the overall technology cost, instead of bringing
it down.

I had a few other concerns as well on the strategy that the data and analytics head shared.
These concerns related more to what the strategy did not cover. I am summarizing them below.

• The strategy did not talk about the business challenges that the enterprise faced or the
market opportunities that the enterprise would like to capitalize on, and how data and
analytics could help with them.
• The strategy did not talk about how a data and analytics team would be built with multi-disciplinary
competencies covering technology and business knowledge.
• The strategy did not include a plan for establishing data governance and other key
processes, which are critical for an enterprise to be successful in data and analytics.
• The strategy did not cover a plan for organizational change management, which is one
of the most important factors that determines the success or failure of a data and analytics
program.
• And finally, the strategy did not cover how the business value delivered by data and analytics
would be measured and how continuous improvement of the data and analytics program
would be ensured.

I did share my concerns and suggestions in as diplomatic a manner as possible. However,
I felt that he was still keen to pursue his technology rationalization and standardization
agenda first and worry about other aspects later. I could sense that he was more comfortable
dealing with technology, while the other aspects of strategy that I talked about were not in
his comfort zone.

The experience that I described above is not unique. I have had similar experiences
with many other data and analytics leaders, who prefer to pursue a purely
technology-focused strategy. Even within that narrow focus, many of them try to
remain in the comfort zone of a select few technologies that they are conversant
with and are reluctant to try out other technologies. In the field of data and analytics,
technology innovation is happening at an exponential rate. It is, therefore,
important for all data and analytics professionals to keep abreast of new developments
in technology and try out proofs of concept for solving complex business
problems using new technologies.
4 1 What Is Data and Analytics Strategy

In summary, to deliver real business value to an enterprise, no data and analytics leader can afford to: (a) have a purely technology-focused strategy, and (b) remain in the comfort zone of a select few legacy technologies that she/he is expert in.

1.3 Five Elements of Data and Analytics Strategy

In this chapter so far, I have highlighted the importance of defining a data and analytics strategy to drive digital initiatives in an enterprise. I also mentioned that the strategy should not be focused on technology alone. Let me now explain what a data and analytics strategy really should comprise. But before doing that, let me state the definition of strategy. The Oxford dictionary defines the word “strategy” as a “plan of action designed to achieve a long-term or overall aim”. Hence, like any other strategy that an enterprise defines, a data and analytics strategy needs to comprise elements that help the enterprise achieve its long-term vision.
I have summarized all the required components of data and analytics strategy into five elements, as illustrated in Fig. 1.1.

Fig. 1.1 Five elements of enterprise data and analytics strategy



In the figure, you will notice that all five elements are interconnected. There is a reason for this, which I will explain in subsequent chapters. For now, it suffices to state that the starting point for defining data and analytics strategy is the first element, i.e., the business capabilities required by the enterprise. The other four elements of the strategy are driven by the first. However, all five are also interdependent.
I have dedicated one chapter to each element, where I will dive deep into what these elements really mean and how to define the strategy for each of them. In the following paragraphs, let me briefly describe the five elements.

1.3.1 Business Capabilities

The first element of data and analytics strategy is all about understanding the business capabilities required by an enterprise, over both the short-term and long-term horizons. The data and analytics leader of the enterprise must consider questions such as the following while defining the first element:

• How does the enterprise plan to grow? For example, is the CEO planning for
a new business model? If yes, what are the data and analytics capabilities that
would be required to enable the new business model?
• What are the various business priorities of the CEO and what do they mean
from data and analytics capabilities perspective?
• What specific capabilities are required to drive various digital transformation
initiatives of the enterprise?

1.3.2 Technology and Architecture

Once the required business capabilities are understood and analyzed, one must define the technology and architecture strategy. Technology and architecture form the second element of data and analytics strategy. This is a critical foundational element that enables various transformational initiatives. It should focus on the following:

• Understanding the existing technology and architecture landscape of the enterprise.
• Analyzing the limitations of the current landscape.
• Defining the technology and architecture required to meet the business capabilities that were identified as part of the first element of the strategy.
• Preparing a transition plan to move from the existing technology landscape to the required one.

1.3.3 Team, Processes, and Governance

Large and global enterprises are complex. Therefore, one must define the right data and analytics organization, along with processes and governance, to successfully navigate organizational complexity. The third element of the strategy should focus on:

• Identifying the right data and analytics organization model, especially in a global context.
• Defining the data and analytics organization structure, with clearly laid-down skills and responsibilities for each role.
• Defining key processes and governance for running the data and analytics program.

1.3.4 Organizational Change Management

Despite having all the right building blocks, a data and analytics program can
fail if its acceptance within the enterprise is not wide and deep. Understanding
the organizational dynamics and accordingly defining an organizational change
management strategy is, therefore, extremely critical for the success of a data
and analytics program. Hence, it forms the fourth element of data and analytics
strategy. It should focus on:

• People: Understanding the cultural, geographical, and demographical diversity of employees and their different analytical needs.
• Processes: Defining new processes and new ways of working to drive innovation at scale across the enterprise.
• Technology: Training people on new technologies and exploring future data and analytics technologies that can make the enterprise future-ready and help maintain competitive advantage.
• Data: Changing the way data is treated, viewed, and managed in enterprises. Defining a data literacy plan for all user personas and inculcating “data thinking” amongst all employees of the enterprise, so that data can be democratized.

1.3.5 Value Measurement Framework

A data and analytics program may be perceived as successful by some executives, but not by others. Invariably, questions will be asked regarding the return on investment (ROI) from data and analytics initiatives. Establishing a value measurement framework is, therefore, an important element of data and analytics strategy. The data and analytics leader must focus on:

• Defining a framework to measure and improve the value delivered to the business.
• Identifying areas where the data and analytics team needs to continuously improve its operational efficiency, so that it can deliver business value in a consistent and efficient manner.
• Establishing a methodology to calculate the financial returns of any data and analytics investment.
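As a rough illustration of the last bullet, the financial return of an initiative reduces to simple arithmetic once benefits and costs have been quantified. The sketch below uses invented figures purely for illustration:

```python
def roi_percent(total_benefits: float, total_costs: float) -> float:
    """Return on investment, expressed as a percentage of total cost."""
    return (total_benefits - total_costs) / total_costs * 100

# Hypothetical initiative: $1.2M of delivered business value
# against $0.8M total cost of ownership.
print(roi_percent(1_200_000, 800_000))  # 50.0
```

The hard part in practice is not this formula but agreeing on what counts as a benefit and over what period, which is exactly what a value measurement framework is meant to settle.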

1.4 Summary

Defining data and analytics strategy should be the starting point for any digital initiative in an enterprise. While most enterprises would agree with this point, many of them continue to struggle to define a holistic data and analytics strategy. Often, data and analytics leaders define a strategy that is focused primarily on technology and architecture. This leads to the failure of a majority of data and analytics initiatives across enterprises. To define a holistic strategy, there are five key elements that need to be defined—(a) business capabilities, (b) technology and architecture, (c) team, processes, and governance, (d) organizational change management, and (e) value measurement framework. These five elements are discussed in detail in the next five chapters.
2 First Element of Strategy—Business Capabilities
Taking a Top-Down Approach to Align with Business Strategy

2.1 Aligning with Organization’s Business Priorities

As discussed in the previous chapter, understanding the business capabilities required to grow or transform an enterprise is an important starting point for defining data and analytics strategy. However, there is no one-size-fits-all approach for this. Let me exemplify this point by sharing two scenarios in the following paragraphs. The two organizations referred to in the respective scenarios are not hypothetical but are actual large enterprises.

Scenario 1: Organization A, a manufacturing company, was going through a rough patch, with profitability declining consistently over the previous five years. Its revenue and market share (vis-à-vis competition) were steady, but costs were increasing, which was attributed to two factors: (a) the operational efficiency of the organization had gone down due to various reasons, and (b) a new product, in which huge investments were made in research and development as well as the subsequent product launch, heavily underperformed post launch. As a result, the CEO lost his job and a new CEO was hired, with a clear expectation that he would ensure continual improvement of EBIT (Earnings Before Interest and Taxes) and cash flow, quarter-on-quarter, for the next three years. Given the market dynamics and shareholders' expectations, the major focus area for the new CEO over the next three years was sustaining revenue at the same level while improving profitability. This was extremely important for the survival of the organization. The CEO was worried about how he was going to achieve this difficult task.
Scenario 2: The CEO of organization B saw a huge opportunity not just to grow the organization's market share through differentiated product and service offerings, but also to diversify into an altogether new product leveraging digital technologies. The latter would create a substantial revenue stream for the organization. However, to be successful, it required quite a lot of new capabilities to be built, in addition to adapting to certain new ways of working. The CEO wanted to make sure that all the right capabilities were built on time and organizational change management was handled smoothly, so that the organization could achieve the aspirational goal of taking a quantum leap in business performance.

© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2022
P. Sah, Defining Enterprise Data and Analytics Strategy, Management for Professionals,
https://doi.org/10.1007/978-981-19-5719-2_2

While there were a lot of initiatives that both CEOs (mentioned in the two scenarios above) needed to take to achieve their respective objectives, data and analytics promised to play a vital role in these initiatives. Hence, it was important to define a data and analytics strategy upfront in both scenarios. The question that arose, then, was how the data and analytics leaders of these organizations should define strategies to help achieve their CEOs' (and their organizations') business priorities. Their strategies would obviously be different and could not be defined in the same manner. I was advising both of these organizations and, working together with their leaders, defined their respective strategies, which proved to be very effective. Later in this chapter, I will discuss the approach adopted for each organization. Before that, however, it is important to give an overview of enterprise performance management and the importance of data and analytics in driving it.

2.2 Establishing Enterprise Performance Management Framework

2.2.1 A Brief Historical Perspective on Performance Measurement

The terms performance management and performance measurement are sometimes used interchangeably. This is not correct. While performance measurement is a necessary prerequisite for performance management, there are differences between them. The latter is much broader and includes the actions one must take to manage performance better. Let me share a brief historical perspective on performance measurement and management. Enterprises have focused on measuring business performance since the early nineteenth century, when traditional management accounting-based performance measures were used. These proved inadequate, and therefore, in the early 1920s, DuPont Corporation developed measures such as return on investment (ROI), return on assets (ROA), and return on equity (ROE). While these are still used today, many other performance measurement models and frameworks were created over the years. Models such as the balanced scorecard, the Cambridge performance measurement design process, the business excellence model of the European Foundation for Quality Management (EFQM), and a few others became popular. Increasingly, the focus of the newer models was to help enterprises evolve from just measuring performance to managing it effectively. In the last few decades, a great deal of research has been done by various expert groups, and recommendations have been made on how enterprises should define and measure key performance indicators (KPIs), so that KPIs can be used not just to measure past performance but also to manage future performance better.

2.2.2 Key Performance Indicators (KPIs): Lagging and Leading Indicators

While performance measurement is more about measuring and tracking certain metrics to understand how the business has been doing, performance management is more holistic and includes even the actions that decision makers should take to improve performance in the future. To manage performance better, enterprises need to define and track not just lagging indicators (i.e., key performance indicators or metrics that show the past business performance of the enterprise, e.g., daily sales) but also leading indicators (i.e., key performance indicators or metrics that suggest whether the business performance of the enterprise will improve in the future, e.g., sales pipeline). Looking at lagging indicators is akin to looking in the rear-view mirror. While lagging indicators are useful, one cannot drive a car by looking only in the rear-view mirror. Enterprises must define relevant leading indicators as well, and should always define the right and relevant combination of lagging and leading indicators to manage performance. Following are some more examples of lagging and leading indicators.

• Lagging indicators: On-time delivery (to customer, as per promised date), num-
ber of safety incidents, energy consumed per unit of production, total emissions
per unit of production, etc.
• Leading indicators: Number of suggestions to customer to improve product
usage, number of safety improvement actions taken, R&D spend on green
initiatives, green products produced as a percentage of total products, etc.

Each KPI needs to be defined with a precise calculation formula, along with underlying rules and exceptions, which vary within as well as across enterprises. I will soon discuss some of the practical challenges faced in this regard. Before that, however, let me share the concept of the KPI tree and the dynamic nature of KPIs.
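To make the idea of a precise KPI definition concrete, one could capture each KPI as a small record holding its type (lagging or leading), its calculation formula, and its business rules. This is only a sketch; the field names and the example rule are invented for illustration:

```python
from dataclasses import dataclass, field
from enum import Enum

class IndicatorType(Enum):
    LAGGING = "lagging"   # measures past performance, e.g. daily sales
    LEADING = "leading"   # suggests future performance, e.g. sales pipeline

@dataclass
class KPI:
    name: str
    indicator_type: IndicatorType
    formula: str                               # precise calculation formula
    rules: list = field(default_factory=list)  # business rules and exceptions

# Illustrative definition of on-time delivery, one of the lagging
# indicators mentioned above:
on_time_delivery = KPI(
    name="On-time delivery",
    indicator_type=IndicatorType.LAGGING,
    formula="orders delivered by promised date / total orders delivered",
    rules=["exclude orders where the customer moved the requested date"],
)
print(on_time_delivery.indicator_type.value)  # lagging
```

Writing the formula and exception rules down explicitly, in whatever form, is what forces the definitional debates discussed later in this section into the open.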

2.2.3 KPI Trees to Drive Enterprise Performance Management

For better performance management, enterprises define not just a set of KPIs but create KPI trees, because the top-level KPIs of an enterprise depend on lower-level KPIs. For example, a level 1 CEO KPI (such as sales growth %) can be broken down into a few level 2 KPIs, each of which, in turn, can be broken down into multiple level 3 KPIs, and so on. Normally, lower-level KPIs are for lower levels of management. This cause-and-effect relationship between lower- and higher-level KPIs necessitates that enterprises focus not just on individual KPIs but create and analyze KPI trees. It is also important to note that KPI trees should consist of both lagging and leading indicators. Figure 2.1 illustrates a KPI framework cascading from the top-level CEO's KPIs across the depth and breadth of an enterprise. It needs to be created with a unified vision of common measurement, common definition, and reuse of business data/information across levels, processes, functions, business units, and geographies of an enterprise.
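The cascade from a level 1 KPI down to level 2 and level 3 KPIs can be pictured as a simple tree in which each node knows its children. The sketch below only shows the structure; the KPI names are made up:

```python
from dataclasses import dataclass, field

@dataclass
class KPINode:
    name: str
    level: int
    children: list = field(default_factory=list)

    def add(self, child: "KPINode") -> "KPINode":
        """Attach a lower-level KPI and return it for further chaining."""
        self.children.append(child)
        return child

    def walk(self, indent: int = 0) -> None:
        """Print the tree, indenting one step per KPI level."""
        print("  " * indent + f"L{self.level}: {self.name}")
        for child in self.children:
            child.walk(indent + 1)

# An illustrative cascade from a CEO-level KPI:
sales_growth = KPINode("Sales growth %", level=1)
new_customers = sales_growth.add(KPINode("Revenue from new customers", level=2))
new_customers.add(KPINode("Qualified leads converted", level=3))
sales_growth.walk()
```

In a real framework each node would also carry its indicator type and calculation formula, and the cause-and-effect links would be analyzed, not merely printed.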
Let me explain the KPI framework illustrated in Fig. 2.1 layer by layer, starting
from the top.

• Level 1—Management—CEO: At the tip of the pyramid are the CEO's KPIs. My experience is that the number of KPIs of a CEO is small, typically between 5 and 10. These KPIs are available (or presented) to the CEO in the form of dashboards that she/he can monitor daily. The CEO as well as executive management want a global and consolidated view of business performance across the value chain (the illustrative value chain depicted in the figure is that of an agriculture and food processing industry, with a value chain of source-transport-process-transform-distribute-market and sell).
• Level 1—Management—other executive management: The next level, i.e., COO, CFO, etc., takes the CEO's KPIs and translates them into their specific KPIs. For example, a CFO would have KPIs focused more on cash flow, working capital, etc. Normally, data and analytics systems publish a daily update of each CXO's KPIs in their respective executive management dashboards, also commonly referred to as CXO dashboards.
• Level 2—Management—functional management: The next layer of the pyra-
mid comprises functional heads at corporate and business units. These include
head of supply chain, head of quality, head of sustainability, head of HR, etc.
While their KPIs are related more to their respective functional areas, many of
these KPIs would cut across other functional areas as well. This is because busi-
ness processes in an enterprise are inextricably linked to each other, spanning
across multiple functions. For example, purchasing process involves many func-
tions—purchasing function, requisitioning function (that can be any function
of the enterprise), quality control function for checking quality of purchased
material/service, finance function for all payment related processes, and so on.
Functional heads are keen to get cross-functional and global view of KPIs. Also,
while generally they want to view KPIs at a summarized level, they very often
want to drill down into detailed metrics/data for root cause analysis. Functional
management dashboards from data and analytics systems would take care of
these needs. Data in these dashboards need to be refreshed once a day or, for certain KPIs, every few hours.
• Level 3—Operational: At level 3 are the functional managers, who are respon-
sible for day-to-day operations. Their KPI/information needs are more at
operational level. Like level 2, they also need cross-functional data since business processes cut across functions. They need to understand bottlenecks in
the business process (within their function or in other functions) that could be
reducing their operational efficiency and productivity. Their KPIs (which are
more commonly called metrics for their level) are defined accordingly and are
more operational in nature. Operational dashboards from data and analytics sys-
tems would meet their needs. Data in these dashboards need to be refreshed on a
Fig. 2.1 Achieving a unified vision of KPI framework to drive enterprise performance management

real-time basis or near real-time basis (i.e., every 15 min or so), since functional
managers must take decisions as soon as any process bottleneck occurs.
• Level 4—Detailed (transactional level): At the bottom of the pyramid are the people below the managers, who have operational information needs to take care of individual transactions. Examples include a cashier, accountant, machine operator, salesperson, and so on. They need easy and quick access to relevant data to perform their tasks efficiently. They measure their performance by certain operational-level metrics, more at a task level, e.g., throughput. Their data and metrics needs are typically catered to directly by operational or transactional systems such as Enterprise Resource Planning (ERP), Customer Relationship Management (CRM), and other systems present in the enterprise. They need data on a real-time basis, which these systems are designed to deliver.
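The refresh cadences described for the four layers could be captured in a small configuration table. The concrete intervals below (e.g., four hours for "every few hours") are illustrative assumptions, not prescriptions:

```python
from datetime import timedelta

# Dashboard refresh cadence per layer of the pyramid described above.
REFRESH_CADENCE = {
    "level_1_executive": timedelta(days=1),        # daily CXO dashboards
    "level_2_functional": timedelta(hours=4),      # daily, or every few hours
    "level_3_operational": timedelta(minutes=15),  # near real-time
    "level_4_transactional": timedelta(0),         # real-time, served by ERP/CRM
}

for layer, interval in REFRESH_CADENCE.items():
    print(f"{layer}: refresh every {interval}")
```

Making the cadence explicit per layer helps size the data pipelines: a near real-time level 3 feed has very different architectural implications from a daily level 1 batch.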

As mentioned earlier as well, while defining KPI trees and dashboards across
the levels, enterprises strive to meet the objectives of ensuring common measure-
ment, common definition, reuse of data/information, standardized look and feel of
dashboards, and overall consistency across all functions, business units and geogra-
phies. Irrespective of which function, business unit or geography an employee
moves to, there should be no major difference in the way one sees and analyzes
any KPI. However, there are a lot of challenges that enterprises face in meeting this objective. I will discuss these challenges in detail in the next section.
I would like to highlight here one major mistake that I have seen many enterprises or consultants commit while defining KPI trees: they define the tree once for an enterprise and then never change it. The reason why this is a mistake is that all enterprises go through a maturity evolution path in performance management. An enterprise that has just started its performance management journey would need more lagging indicators/KPIs to measure performance. With time, as the enterprise moves up the learning curve, new KPIs, which are more leading in nature, can be introduced. The proportion of leading indicators (as compared to lagging indicators) can be slowly increased over time.
If one tries to define a highly mature KPI tree (that is relevant for an enterprise
that has high performance management maturity) for an enterprise that has low
performance management maturity, it will lead to various challenges. Some of
the challenges include inability to measure complex KPIs (due to various system
and non-system issues) and demotivating managers (since their performance is
judged based on KPIs). I will discuss these and various other challenges in the
next section.

2.2.4 Challenges of Implementing Enterprise KPI Framework

Theoretically, it is not very difficult to define a KPI framework for an enterprise. There are a lot of useful reference KPI trees available, by industry, with various consulting companies, that one can use as a starting point. However, in practice, when it comes to making a KPI framework contextual to an enterprise, it is never an easy task. One would face the following challenges while defining and implementing an enterprise KPI framework.

• Enterprise maturity: As mentioned in the previous section, KPIs need to be aligned with enterprise maturity. If they are not aligned, it will lead to both measurement and management issues.
• Multiple business units: Most large enterprises have multiple business units (BUs), each having its own unique business model. Hence, the same set of KPIs may not apply to all BUs.
• Multiple geographies: Most large enterprises are present in multiple locations/geographies, each having unique market characteristics. Hence, defining the same set of KPIs for every geography may not work.
• KPI definition variances: While there are certain standard high-level industry
definitions of most KPIs used in different enterprises, when it comes to mak-
ing a definition specific with all the required details, including calculation and
business rules, one would find wide variations across different enterprises. Even
within the same enterprise, one KPI may have many different definitions across
business functions. These variations arise for various reasons, such as a local country's regulatory reporting requirements, the way a KPI has always been defined in a particular function/geography, or a new leader who, upon joining a function, redefined a KPI based on her/his past experience, and so on.
• Stakeholder alignment: In cases where the definition of a particular KPI varies without any valid reason (or for a reason that can be overcome), enterprises often undertake an exercise to align stakeholders on a common definition that is agreeable to all. Such standardization always helps the CEO and other executives analyze and manage their business better. With standardized definitions, executives can better compare the numbers across different businesses or geographies—they can compare “apples to apples” rather than “apples to oranges”. However, KPI alignment exercises are never easy. One would face a lot of resistance to change. My personal experience from having driven many such alignment workshops is that one must be extremely tactful in driving change. In one alignment workshop that I was driving, even for a simple KPI such as on-time delivery (of product to customer), different leaders from the same business unit were not ready to change their respective definitions. The finance leader felt that the definition used in the finance function was more accurate, while the sales and marketing head felt that the definition his team was using was more practical. Both leaders were right from their respective points of view. However, it was important to drive home the point that agreeing on a common definition would help both of them and the enterprise in the long run. In the last chapter of this book, in which I talk about the soft skills of a data and analytics leader, I will touch upon the skills required to drive changes such as this.
• Cultural differences: Cultural sensitivity is another major consideration while defining KPIs. The interpretation of KPIs varies for employees from different cultural backgrounds. Let me share an interesting example of this. I was in a business planning meeting with leaders from many cultures together in a room. A discussion was going on to finalize KPIs and set targets for them for the next fiscal year. Amongst those present were a German leader and an American leader. The German leader said that for him, once a KPI and its target were finalized, he would go all out to achieve it. If, at the end of the year, he fell short of the target by even 5%, he would consider that a failure, since that is how he had been brought up in German culture. The American leader, however, believed that KPIs and targets should be aspirational, i.e., something that is extremely high and difficult to achieve. He believed that such aspirational KPIs and targets would challenge a person to get as close to the target as possible and would also encourage people to think out-of-the-box. There are no easy solutions for addressing cultural differences such as this. One must account for such sensitivities while identifying KPIs, defining them, and setting targets for them.
• System and data challenges: Finally, KPI frameworks are mostly defined using a top-down approach. However, limitations of the data available from various enterprise systems may not support calculation of all the KPIs, rendering the whole KPI exercise futile. Hence, while creating a KPI framework, it is important to understand the sources of data (i.e., the systems that contain the required transactional data) for each KPI. Next, one should analyze issues
with quality of data available from each source and come up with possible solu-
tions to address the issues. One should also identify the right technologies and
define an architecture that can help in automatically collecting, cleaning, and
organizing data to make it useful for consumption in various dashboards.

2.2.5 KPI Framework Defined for Scenario 1 (Organization A)

Having discussed the KPI framework in detail, let me now describe how we approached the problem faced by organization A, described in scenario 1 at the beginning of this chapter. As mentioned earlier, the focus of the CEO was primarily to improve EBIT (earnings before interest and taxes) and cash flow, so that he could take the organization out of a very difficult situation. To achieve this, it was important to establish a well-defined KPI framework that could help him identify cost-saving opportunities, drive cost consciousness, and improve performance management across the enterprise. To do this, a value driver framework, as illustrated in Fig. 2.2, was defined as a starting point. This framework was then used to define the KPI tree.
To explain this framework, let me first give a brief explanation of EBIT (earnings before interest and taxes) for those who do not have a finance background. EBIT after cost of capital is the difference between EBIT and the cost of capital. EBIT is calculated as the difference between contribution margin (which is the difference between sales and variable costs) and fixed costs. The cost of capital can be derived by applying a cost of capital percentage to invested capital, which in turn can be calculated as the
Fig. 2.2 Value driver framework to prepare KPI tree

sum of fixed assets and working capital. To summarize in layman’s terms, EBIT
is the profitability calculated as sales minus all expenses, while cost of capital is
the cost of getting funds (equity or debt or both) to run a business.
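The calculation chain just described can be written out step by step. The figures below are invented purely to illustrate the arithmetic:

```python
def ebit_after_cost_of_capital(
    sales: float,
    variable_costs: float,
    fixed_costs: float,
    fixed_assets: float,
    working_capital: float,
    cost_of_capital_pct: float,
) -> float:
    """EBIT after cost of capital, following the value driver framework."""
    contribution_margin = sales - variable_costs       # sales minus variable costs
    ebit = contribution_margin - fixed_costs           # profitability before interest/taxes
    invested_capital = fixed_assets + working_capital  # capital tied up in the business
    cost_of_capital = invested_capital * cost_of_capital_pct
    return ebit - cost_of_capital

# Hypothetical numbers, in millions:
print(ebit_after_cost_of_capital(
    sales=100, variable_costs=60, fixed_costs=25,
    fixed_assets=40, working_capital=10, cost_of_capital_pct=0.08,
))  # 11.0
```

Real accounting treatments are far more involved, as the text notes; the point of the sketch is only to show how each value driver feeds the next.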
While there are complex accounting principles that need to be taken into consideration in all the above calculations, from a purely performance management perspective, the focus here was to understand how these key components of EBIT could be used to define the KPIs (or, to be more precise, KPI trees) for various functional leaders and managers within the enterprise. The objective of the exercise was to ensure that everyone's focus and behavior were aligned with the CEO's objective of improving EBIT. Let me elaborate this point through the example of “sales”. Each function within an enterprise influences sales in a different way. The sales team can influence sales by articulating product features in a way that makes a compelling proposition to potential customers. The service team can influence repeat sales to existing customers by providing world-class service. The product development team can influence sales by adding a differentiated feature to an existing product. Thus, if one were to cascade sales-related KPIs down to the managers of each of these functions, the KPIs would be very different.
With the above approach, the KPI framework, including various KPI trees, was defined. While doing this, the feasibility of measuring the defined KPIs throughout the enterprise was evaluated. With the KPI framework in place, the next step was to design and implement a solution to collect, cleanse, and organize data for calculating the KPIs in an automated manner. Once that was completed, the CEO was able to monitor the progress of various KPIs on a daily, weekly, and monthly basis, and could take quick corrective actions wherever he found deviations beyond acceptable limits. Similarly, subsequent levels of executive and functional management were able to monitor and manage the performance of KPIs in alignment with the CEO's focus areas. Within a couple of quarters of implementing this solution, the organization was able to overcome its ongoing business challenges and meet shareholders' expectations of improving EBIT and cash flow.

2.3 Driving Enterprise Digital Strategy

While enterprise performance measurement and management are important for improving business outcomes, CEOs are always looking for opportunities for non-linear business growth. Over the last decade, with the advent of innovative digital technologies, CEOs see an opportunity to beat their competitors by various means, such as improving customer experience, coming up with new business models, and so on. To do this, most CEOs are defining a digital strategy and implementing a few game-changing digital solutions. Data and analytics form the core of any digital initiative in an enterprise. Hence, while defining a digital strategy, CEOs need to define a data and analytics strategy first.
Earlier in this chapter, I described a global manufacturing organization B (referred to in scenario 2 at the beginning of this chapter), whose CEO saw huge growth opportunities in leveraging digital technologies. Data and analytics formed the core of, and an essential prerequisite for, realizing his vision. Let me describe below the framework that was defined to realize the CEO's vision.

2.3.1 Approach for Scenario 2 (Organization B): Digital Transformation Leveraging Data and Analytics

As mentioned earlier, the CEO of this organization saw two opportunities: (a) grow
market share through differentiated product and service offerings, and (b) diversify
into an altogether new product leveraging digital technologies. To capitalize on
both these opportunities, various brainstorming workshops were conducted with
their business leaders across the globe and multiple initiatives that needed to be
undertaken were identified. I will discuss the approach used for this (or for that
matter a typical approach that can be used for any such exercise) in the next section
of this chapter. For now, let me describe the framework that was created for the
organization, as illustrated in Fig. 2.3.
We can broadly divide the framework into three layers—top (mega business
value drivers), middle (analytical levers across the value chain), and bottom (enter-
prise data lake with underlying enterprise applications landscape). Let me describe
them progressively.

1. TOP LAYER: Mega business value drivers: At the top of the figure, one can
see eight mega business value drivers such as capacity management, enterprise
cost management, and so on. These were identified as the focus areas for digital
transformation of the organization. Instead of explaining all of them, let me
elaborate on just two of the drivers:
– Customer experience, or Customer 360: It was felt that better customer
experience management would be critical to help meet the organization’s
objectives mentioned earlier, viz (a) grow market share through differenti-
ated product and service offerings, and (b) diversify into an altogether new
product leveraging digital technologies. Customer experience needed to be
completely transformed, by putting customer at the center of all sales, mar-
keting, and service initiatives. To make it happen, creating a global customer
360 view, leveraging data and analytics, was critical. This would enable
redefining customer processes and experience. To enable this, various
initiatives were identified. For all these initiatives, data and analytics
formed the core.
– Supply chain excellence: The supply chain of the organization was no longer
linear. Material and information traveled amongst complex nodes in a range of
networks that linked its multiple suppliers and customers across the globe,
making the supply chain susceptible to failure. Global events such as a
pandemic, an earthquake, or a flood exposed the vulnerabilities of the
organization. Hence, supply chain excellence was identified as another mega
business driver in their digital transformation journey. Here again, various
specific digital initiatives were identified, for which data and analytics
formed the core.

Fig. 2.3 Illustrative digital transformation framework leveraging data and analytics
2. MIDDLE LAYER: Analytical levers across the value chain: In the middle of
the illustrated framework is an array of analytical levers across the value chain,
covering the entire product life cycle, in a complex web of business units (BUs)
and business functions, which would enable the mega business value drivers
identified in the top layer. Let me explain this in a little more detail below.
– Analytical circles: The four circles in the figure correspond to the four
stages of value chain, starting from new product introduction and going all
the way to deliver (sales) and after sales.
– Analytical areas: Within the annular rings of each analytical circle, there
are various analytical areas such as capacity planning, enterprise spend, risk
management, and so on. One would notice that many analytical areas, for
example, risk management, are present in multiple analytical circles. This
means that to better manage risks, you need to investigate all these analytical
circles in tandem.
– Analytical levers: Each analytical area has multiple analytical levers,
depicted by circle sectors. To improve in an analytical area, one needs to
use all the analytical levers in a synchronized manner. For example, if one
needs to improve in the analytical area of supplier performance, one must
investigate the three constituent analytical levers, viz cost of material, quality
of material, and timeliness of delivery.
– Information thread: With the above overview, it is obvious that, to enable
any mega business value driver, multiple analytical levers from across the
value chain need to work together in tandem. For example, for supply chain
excellence, one needs to work with analytical levers of supplier performance
analytical area, analytical levers of the inbound logistics analytical area,
analytical levers of the outbound logistics analytical area, and a few others.
This means that the data and information of various analytical levers across
the value chain need to be connected through what can be called "information
threads".
Without these connected information threads, you cannot enable a mega
business value driver.
– Information fabric: When I talk about information threads, I am not just
talking about linear threads running across the value chain. I am talking
about threads that run across various business functions and units of the
organization, spanning the multiple geographies in which the organization
operates. So, using the term "information fabric" would be apt to depict
the web of information threads that is cross-functional, cross-BU and cross-
geography. This is the scenario in any large global enterprise. Figure 2.4
illustrates this in a matrix that includes the four dimensions–(a) BUs, (b)
geographical regions, (c) value chain, and (d) business functions.

Fig. 2.4 Illustrative information fabric framework (conceptual view)

3. BOTTOM LAYER: Enterprise data lake with underlying enterprise applications
landscape: At the bottom of the framework is an enterprise data lake, or a
data platform, that helps decouple decision making from the complexity of the
systems underneath it. The data lake enables creation of the information
fabric that is vital for driving the analytical levers and, consequently, the
analytical areas. The two sub-layers in this bottom layer are described below:
– Data lake: A common data platform (single, or a combination of multiple—I
will discuss more on this in the next chapter, where I will cover the
technology and architecture aspects of data and analytics strategy) that can
model and structure enterprise as well as external data, both structured and
unstructured, in a manner that makes them easy for the analytical levers to
understand and consume.
– Internal data (from enterprise applications) and external data: Any large
enterprise would have a plethora of applications belonging to each business
unit as well as to the corporate functions. In addition, enterprises subscribe
to a lot of external data from various sources, including social media, to
understand customers or the competition better. The role of the data lake is
to ingest all relevant data from these different sources and create the
information fabric. Of course, there are a lot of challenges that one would
face while doing this. I will discuss these challenges, as well as the means
to address them, in the subsequent chapters.
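The data lake's ingestion-and-linking role can be sketched as a toy example. All source names, record fields, and keys here are hypothetical, and a real data lake would of course add schema mapping, cleansing, and entity resolution:

```python
# Toy sketch of the data lake's ingestion role: raw records from internal
# applications and an external feed are linked on a shared key to form one
# "information thread" per customer. Source names, fields, and keys are
# hypothetical illustrations, not organization B's actual landscape.
from collections import defaultdict

def build_information_threads(*sources):
    """Group raw records from many (name, records) sources by customer_id."""
    threads = defaultdict(list)
    for source_name, records in sources:
        for record in records:
            threads[record["customer_id"]].append({"source": source_name, **record})
    return dict(threads)

crm    = [{"customer_id": "C1", "segment": "enterprise"}]
erp    = [{"customer_id": "C1", "open_orders": 3}]
social = [{"customer_id": "C1", "sentiment": "positive"}]

threads = build_information_threads(("crm", crm), ("erp", erp), ("social", social))
print(len(threads["C1"]))  # prints 3: the three source records form one thread
```

The per-customer groupings are, in effect, the "information threads" of the fabric; the real-world challenges in producing them, such as master data mismatches and data quality, are the ones discussed in the subsequent chapters.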
Establishing a top-down digital transformation framework, such as the one
described above, is a critical first step towards defining data and analytics strategy,
especially when an enterprise has an ambitious plan to grow using digital technolo-
gies. Organization B, for which this framework was defined, benefitted immensely
from it. This framework became the guiding principle behind all the digital ini-
tiatives that they subsequently embarked on. The organization has already added
hundreds of million dollars to their top-line and bottom-line, as a result. In the
next section, I will describe a proven approach that one can adopt for creating a
framework such as this.

2.4 Approach for Defining Data and Analytics Strategy, Starting with Business Capabilities

Understanding the business capabilities required to grow or transform an
enterprise is an important starting point for defining data and analytics
strategy. Figure 2.5 illustrates a typical approach one needs to adopt to
define business capabilities and subsequently the rest of the strategy
elements.
As can be seen from the figure, the approach is a three-step process:

• Step 1: Enterprise churning—"Samudra Manthan".
• Step 2: Defining required business capabilities and other four strategy elements.
• Step 3: Prioritizing and creating an integrated roadmap.

Let me elaborate each of these three steps.



Fig. 2.5 Typical approach to define data and analytics strategy

2.4.1 Step 1: Enterprise Churning—“Samudra Manthan”

In Indian mythology, "samudra manthan", i.e., churning (manthan) of the ocean
(samudra), is the story of gods and demons joining hands to churn the
celestial ocean of milk (often alluded to as the Milky Way galaxy) to obtain
the nectar of immortality. A mountain was used as the churning rod and a giant
serpent was used as the rope for churning. The churning required the combined
strength of both the gods and the demons and went on for years before the
nectar surfaced. Along with the nectar, a lot of other substances, including
poison, also surfaced.
There are a lot of interesting stories related to "samudra manthan". However,
the reason why I call step 1 of the approach to define enterprise data and
analytics strategy "enterprise churning" or "samudra manthan" is that, if one
were to transform an enterprise leveraging data and analytics, one must do an
initial churning of the thoughts, ideas, and emotions existing across the
enterprise. During such an exercise, a lot of brilliant ideas as well as
constraints and negative emotions will surface. To achieve substantial digital
transformation of the business, one must shake the enterprise. While doing so,
there are various organizational change management considerations that one
needs to take care of, which I will discuss in detail in Chapter 5. For now,
let me talk about the four key steps of the "enterprise churning" exercise,
listed below.

• Executive communication: The kick-off of any large strategic initiative in
an enterprise should be with a context-setting communication from the CEO. I
have often seen that a communication message, regarding the strategic intent
behind a data and analytics initiative, is sent by either the data and
analytics leader or by the CIO of an enterprise. I do not recommend this,
since it would not help in getting a large and diverse set of business
leaders, spread across functions, BUs, and geographies, aligned and providing
full support to such an initiative. A communication from the CEO, on the other
hand, helps grab greater attention from the leaders, and conveys the message
that the CEO is behind the initiative. This would help garner greater support
for the initiative from all stakeholders.
• Socialization with leaders: Once the CEO communicates, it is important to
start socializing the initiative with key leaders, to explain the initiative
in detail and address any concerns that the leaders may have. Business leaders
would have a lot of questions and concerns regarding the initiative. By
addressing them upfront, one can avoid miscommunication and ensure consistency
of the message that cascades down the enterprise. The data and analytics
leader should take the lead in this and would need to schedule one-to-one
meetings with key leaders.
• Dipstick surveys: Once both the steps mentioned above are completed (which,
in my experience, may take up to two months for a large enterprise), the
ground is set to go all out across the breadth and depth of the enterprise to
engage with a large set of stakeholders. A good practice is to run an initial
dipstick survey before conducting workshops. Dipstick surveys can be run
online (through a portal) or offline (through email). The objective of the
survey is to get an initial pulse from the stakeholders with regard to their
information and analytical needs, challenges, ideas, and so on. This would
help in preparation for the workshops.
I would like to share a word of caution on surveys, based on my experience.
The survey should be objective in nature and should be kept brief. It should
take around 15 to 30 minutes for a respondent to complete, and not any longer.
One often gets greedy to garner more information and makes the survey very
exhaustive. My experience is that the lengthier a survey is, the fewer people
respond to it, thereby defeating the basic purpose of the survey.
Surveys should be sent to all key leaders, functional managers, and analysts.
The number of potential respondents could run into a few thousand in a large
enterprise. One needs to keep the numbers reasonably large so that the
voice-of-customer thus captured is a good representation of organizational
diversity. Also, not everyone is going to respond to the survey. My experience
is that if 50% or more people respond, one should consider oneself lucky; I
have never seen more than 70% of people respond.
A final suggestion on the dipstick survey is that it should not be one
standard set of questions for all target respondents. Instead, depending on
the role or function/BU of the target respondent, the survey should be
tweaked. The more contextual the survey is for each category of respondents,
the better the quality of responses will be.
• Design thinking workshops: The final step in the "enterprise churning"
exercise is conducting workshops with key stakeholders across BUs and
geographies. These stakeholders should be a subset of the people to whom the
dipstick survey was sent. Ideally, one would want to conduct workshops with
all the people to whom the survey was sent, but practically that may not be
possible in large enterprises.
As far as the method for conducting such workshops is concerned, I recommend
adopting a design thinking approach. Design thinking focusses on people and
their needs, empathizing with their day-to-day challenges, and developing
human-centric solutions in a creative and iterative manner. It is a very
useful approach for conducting brainstorming workshops for defining data and
analytics strategy.
Beyond the approach to be adopted for the workshops, I have some additional
suggestions for planning and conducting such workshops. I have summarized them
below, based on my experience and learnings over years of conducting several
such workshops.
– The number of workshop participants should not be too high. For example,
thirty people in one workshop may not be very effective. I normally recommend
5 to 15 participants in a workshop (besides the facilitators).
– Ensure right representation of stakeholders from different functions or
geographies in a workshop, depending on the workshop agenda. If a work-
shop’s agenda is to ideate on financial transformation, for example, one
needs to have the right representation of finance leaders. However, if the
workshop agenda is alignment of common KPIs used by say, finance and
supply chain functions, you would need participants from both the functions
in a common workshop.
– Sequencing of workshops based on the interdependencies of different
workshops is also important. For example, one may need to conduct the workshop
with HR leaders after conducting the workshop with the operations team. This
is because the operations team may raise a number of HR concerns, which can
then be discussed with the HR leaders.
– Sequencing of workshops based on organization hierarchy needs to be
planned. Sometimes, it may be better to conduct workshops with senior lead-
ers prior to those with the next level. Workshops with senior leaders can help
in understanding top-down vision and generating ideas, which can then be
discussed in detail with the next level to understand hurdles that one may
face on the ground while implementing those ideas.
– In-person workshops are more effective than virtual workshops. While, with
the COVID-19 pandemic, many meetings and workshops across the globe have
become virtual, one must not underestimate the effectiveness of in-person
workshops, especially when one needs to brainstorm on out-of-the-box ideas.
During an in-person workshop, you can study the body language of participants
and ensure that everyone contributes effectively. During a workshop, there
would always be some people who talk a lot and some others who remain
reserved. As the facilitator, one needs to moderate the workshop in a manner
that brings everyone's ideas to the table. Also, better human-to-human bonding
happens when people are face-to-face with each other. One may also get the
opportunity to talk informally in pre-workshop and post-workshop huddles,
while having coffee or lunch together. Such interactions and bonding go a long
way, not just in making the workshop more effective, but in subsequent
interactions as well.
– Sending the agenda before the workshop is another important, and often
overlooked, activity. One should send out the workshop agenda in advance to
all participants and highlight the preparatory work that they should do prior
to the workshop. My experience is that, once you do that, at least half the
participants will come to the workshop properly prepared. This helps in making
the workshops more effective.
– The duration of a workshop needs to be decided based on the agenda,
participants, and objective of the workshop. In general, I have found three
hours to be the optimum duration for each design thinking workshop for
defining data and analytics strategy. One is often tempted to keep the
duration longer, but that does not work out, since you would lose the focus of
participants. Also, participants need to take care of their routine day-to-day
work, so they cannot spend a lot of focused time. On the other hand, a shorter
duration, say two hours or less, generally turns out to be insufficient for
conducting such brainstorming workshops.
– The conduct of the workshops themselves should be planned carefully.
Workshops should be held in three stages. In the first stage, the facilitator
should do a quick recap of the overall objective of the data and analytics
initiative. She/he should then state the objective of the workshop and
describe the approach that would be adopted for conducting it. This should be
followed by summarizing the survey responses received. Participants should
then be encouraged to add to or amend this summary as well as share any
missing facts. They should also be encouraged to share all the concerns and
challenges that they face on a day-to-day basis. One important factor for a
successful workshop is to get participants' emotions out early. Unless they
get their emotions out by sharing all their concerns about things that are not
working for them, and unless they get a feeling that the workshop facilitator
is empathetic towards them, they will not be willing to ideate and come up
with creative ideas. Once their emotions are out, the facilitator should move
to stage 2 of the workshop. It is important that the facilitator completes
stage 1 in one-third to less than half of the total workshop duration.
In the second stage, the focus should be on generating ideas, including some
"out-of-the-box" ones. The purpose of this stage is to leverage the collective
knowledge and experience of participants to generate ideas on how business can
be done differently, and how data and analytics can help in this. The
facilitator should ensure that, while generating ideas, no boundaries are laid
down. Participants should think of ideas without applying any constraints that
may exist within the enterprise. Once ideas are generated, the facilitator
should ensure that they are evaluated, at a high level, for their potential
benefits to the enterprise. This evaluation should be done by way of
discussions during the workshop. All the ideas, along with their benefits
evaluation, should be summarized on a whiteboard (or any other suitable
medium). Once this is accomplished, the facilitator can move to the third and
final stage of the workshop. From a timing perspective, by the end of stage 2,
the facilitator should ensure that the time available for stage 3 of the
workshop is between one-fourth and one-third of the total workshop duration.
In the third stage, participants should critically review each of the listed
ideas for the practical challenges that would be encountered while
implementing them. While generating ideas in stage 2, I had suggested not
applying any constraints. However, in stage 3, it is important to discuss the
practical challenges of implementing the ideas and prioritize them
accordingly. Prioritization of ideas should be done based on potential
benefits on one hand and the challenges of implementing them on the other.
Once prioritization is done, the workshop should be closed by summarizing the
discussions, listing out the next steps, and thanking all participants for
their contribution.
– Sending workshop minutes and following up after the workshop is very
important. I have seen that many participants do read the minutes thoroughly
and send back an email if they find that an important point has been missed or
that a point captured in the minutes was misinterpreted. Other than sending
out minutes and incorporating any feedback received on them, one needs to
follow up on the agreed next steps and track them to closure.

2.4.2 Step 2: Defining Required Business Capabilities and Other Strategy Elements

Step 1 was all about "enterprise churning", to help understand business
challenges, needs, and potential ideas. The step culminates with design
thinking workshops. The inputs received in step 1 are vital to define the key
business capabilities required for the enterprise to drive business growth.
Some of the ideas generated in step 1 can be incremental, while others can be
breakthrough in nature, with the potential to completely change the way
business has been conducted for years. Some breakthrough ideas can be so
transformational that one may need to completely change the business model of
the enterprise. Once all such ideas are captured, the next step (step 2) needs
to focus on defining the required business capabilities in detail. Figure 2.6
illustrates a typical approach that one can adopt for the same.
The approach illustrated in the figure comprises the following key steps:

Fig. 2.6 Typical approach to define the required business capabilities

• Root cause analysis of pain areas: Through dipstick surveys and design think-
ing workshops, lots of emotions and pains of stakeholders from across the
enterprise would surface. Root cause analysis for them should be done in detail.
Such analysis would be very useful while defining all the five data and analytics
strategy elements and in coming up with a roadmap that is grounded.
• Evaluation of incremental and transformational ideas: During the design
thinking workshops, various ideas emerge and the practical challenges of
implementing them are discussed. However, one would have limited time in the
workshop to evaluate each idea in detail. Also, all the information required
for a detailed evaluation may not be available at that time. Hence, after the
workshops, the exercise of detailed evaluation of all ideas must be done. This
would require collecting a lot of additional information and having follow-up
discussions with some stakeholders. Typically, the more transformational an
idea is, the more time one would need for its detailed evaluation.
• Definition of digital transformation framework: After detailed evaluation of
the ideas, the finalized ones can be grouped under key digital transformation
themes, which can then be used to create the digital transformation framework
for the enterprise. I discussed creating a digital transformation framework
earlier in this chapter, while explaining the approach for scenario 2 (the
example of organization B). As mentioned there, these digital transformation
themes would require a lot of changes across the enterprise, beyond just data
and analytics. For example, one may need to initiate a major business process
change exercise, or a complete restructuring of certain business functions, to
enable a theme. All such required changes should be understood, analyzed, and
deliberated upon while creating the digital transformation framework.
• Definition of enterprise performance management framework: Complementary to,
though not mutually exclusive of, the digital transformation framework is the
exercise of defining an enterprise performance management framework, which
includes a KPI framework. Again, I discussed this in detail earlier in this
chapter, while explaining the approach for scenario 1 (the example of
organization A). During that discussion, I highlighted the challenges one
would face while establishing this framework.
• Evaluation of enterprise change readiness from data and business process
perspective: For both the digital transformation framework and the enterprise
performance management framework, it is important to evaluate enterprise
change readiness on two primary dimensions—data and business processes. On
data, one should understand and analyze data availability and data quality for
the relevant data entities required for each of the initiatives. Similarly, on
business processes, one should understand the ease or difficulty of changing
business processes to drive each initiative. Both data and business processes
are inextricably linked to each other; hence, they should be analyzed in
tandem and not in isolation.
• Listing down of business initiatives/projects: Once the above step is
complete, all the identified initiatives should be broken down into
easy-to-manage business projects. These then need to be prioritized to define
the roadmap, which I will discuss in detail in Step 3 (Prioritizing and
creating an integrated roadmap).

Once the required business capabilities are defined using the approach
described above, one can start defining the other four data and analytics
strategy elements. As mentioned earlier, business capabilities dictate the
strategy for the other four elements. I have covered this aspect in detail in
the following chapters:

• Chapter 3—Technology and Architecture.
• Chapter 4—Team, Processes, and Governance.
• Chapter 5—Organizational Change Management (OCM).
• Chapter 6—Value Measurement Framework (VMF).

While business capabilities dictate the strategy for the other four elements,
it is important to note that there is a good degree of interdependence between
these four elements as well. This is depicted in Fig. 2.7.
In the figure, you will notice a few examples of interdependence mentioned on
the connecting lines. Let me explain them below (although these are covered in
detail in the upcoming chapters).

Fig. 2.7 Interdependence of data and analytics strategy elements

• Technology and architecture: The required business capabilities help
determine various non-functional requirements that need to be considered for
defining technology and architecture strategy. This is discussed in Chap. 3.
Further, the planned technology and architecture also help in defining team
skills, as well as processes and governance around the chosen technologies
(Chap. 4—Team, Processes, and Governance). Also, one needs to identify
considerations around technology and architecture change management
(Chap. 5—Organizational Change Management) and ensure continual improvement of
technology and architecture (Chap. 6—Value Measurement Framework).
• Team, processes, and governance: Based on the business capabilities
required, one needs to design the team, processes, and governance that would
enable those capabilities. Processes and governance are also required for the
other three strategy elements—technology and architecture, organizational
change management, and value measurement framework. In Chap. 4, I will talk
about how processes should be defined and governed.
• Organizational change management (OCM): Based on the desired business
capabilities, various organizational change considerations, such as interaction
model between data and analytics team and various business stakeholders,
should be deliberated. OCM strategy needs to be defined accordingly. Com-
munication is an important element of OCM. One example of interdependence
of OCM and VMF is about how to communicate business value (delivered by
data and analytics team) to various executives/business leaders. Communicat-
ing value is important to establish credibility of data and analytics organization
and encourage greater adoption of analytics in decision making across the
enterprise. I will discuss these in detail in Chap. 5.

• Value measurement framework (VMF): While defining business capabilities, it
is important to understand how each capability or initiative would deliver
value to the enterprise. This becomes an important input, in the form of value
drivers, for defining the value measurement framework. I will discuss this in
detail in Chap. 6, where I will also cover continuous improvement of all
aspects of data and analytics—technology, architecture, processes, etc.

2.4.3 Step 3: Prioritizing and Creating an Integrated Roadmap

Once all five strategy elements are defined, the next logical step is to lay
out a roadmap that can be followed to implement the strategy. Just as Rome was
not built in a day, a data and analytics roadmap should span a few years. My
experience is that a "big bang" approach does not work in large enterprises. I
normally recommend preparing a five-year roadmap that is reviewed and revised
every quarter. An illustrative roadmap that one would prepare is shown in
Fig. 2.8.

Fig. 2.8 Typical data and analytics roadmap and maturity curve

Let me talk about the key considerations while defining a data and analytics
roadmap.

• Roadmap for all five strategy elements: Projects/initiatives for each of the
five elements of data and analytics strategy need to be laid down explicitly.
Often, enterprises make the mistake of including only business and technology
projects
and miss out the other three streams.
• Interdependence of projects: Many projects/initiatives would be
interdependent, within as well as across the five streams. For example,
establishing the technology and architecture foundation upfront would be a
pre-requisite for executing most business projects. All such interdependencies
should be taken into consideration while preparing the roadmap.
• Prioritization of projects: An important exercise while laying down the
roadmap is to prioritize the various business projects. As mentioned at the end of
step 2, all the identified initiatives should be broken down into easy-to-manage
business projects. Most of the time, there would be more projects than the data
and analytics team can execute in one go. Hence the need to prioritize.
One would face various dilemmas during the prioritization exercise: Which project
should be taken up early, and which should be kept for a later date? How would the
stakeholders whose projects have been deferred react? To resolve such dilemmas,
prioritization should be done in a structured, objective manner. A business case
should be prepared for each project based on objective evaluation of two
parameters (and their respective sub-parameters): first, the benefits that the
project would deliver to the enterprise, and second, the complexities, including
cost, involved in implementing it. A few examples of sub-parameters (under the
complexity parameter) are data readiness, technology complexity, business process
change complexity, organization structure change required, and so on. One should
then score each project on each sub-parameter on a relative scale. A parameter
score can be derived as the weighted average of all its sub-parameter scores,
after which the overall weighted average score of each project can be calculated.
The weights used for each parameter and sub-parameter can be decided based on
their relative importance. Finally, prioritization can be done by comparing the
weighted average scores of the projects.
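To make the scoring arithmetic concrete, here is a minimal sketch of the weighted-average method described above. The projects, sub-parameters, weights, and the benefit/complexity split are all illustrative assumptions, not prescriptions; a real exercise would use scores and weights agreed with stakeholders.

```python
# Sketch of the weighted-average project scoring described above.
# All project names, sub-parameters, and weights are hypothetical.

def parameter_score(sub_scores, sub_weights):
    """Weighted average of sub-parameter scores (weights need not sum to 1)."""
    total_weight = sum(sub_weights.values())
    return sum(sub_scores[s] * w for s, w in sub_weights.items()) / total_weight

# Relative scores (1 = low, 5 = high) for each project on each sub-parameter.
projects = {
    "Customer churn analytics": {
        "benefit":    {"revenue_impact": 5, "cost_savings": 2},
        "complexity": {"data_readiness": 2, "technology": 3, "process_change": 2},
    },
    "Supply chain dashboard": {
        "benefit":    {"revenue_impact": 3, "cost_savings": 4},
        "complexity": {"data_readiness": 4, "technology": 2, "process_change": 3},
    },
}

benefit_weights    = {"revenue_impact": 0.6, "cost_savings": 0.4}
complexity_weights = {"data_readiness": 0.4, "technology": 0.3, "process_change": 0.3}

def project_score(p):
    benefit    = parameter_score(p["benefit"], benefit_weights)
    complexity = parameter_score(p["complexity"], complexity_weights)
    # Higher benefit and lower complexity rank earlier; the 0.7/0.3 split
    # between the two parameters is an arbitrary illustrative choice.
    return 0.7 * benefit + 0.3 * (6 - complexity)  # invert the 1..5 complexity scale

ranked = sorted(projects, key=lambda name: project_score(projects[name]), reverse=True)
for name in ranked:
    print(f"{name}: {project_score(projects[name]):.2f}")
```

The point of the sketch is only the mechanics: sub-parameter scores roll up into parameter scores, parameter scores roll up into one comparable number per project, and the ranked list becomes the defensible basis for the roadmap sequence.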
When prioritization is done in a structured manner such as the one described
above, and the outcome is communicated effectively to all business stakeholders,
one would find much better understanding and acceptance of the roadmap by
all. While communicating, the rationale should be explained in detail. Wherever
required, communication should be done in person or over the phone, instead of
only by email. During the "enterprise churning" phase, stakeholders took
out a good amount of time to share their inputs with the data and analytics team.
Hence, at this stage, it is only fair to go back to them and explain how their inputs
were taken into consideration while preparing the overall strategy and roadmap.
Organizational change management is critical, and I will talk about various
aspects of it in Chap. 5.
• Duration of each project: Just because a five-year roadmap is prepared, it does
not mean that each project should be of long duration. Instead, each project
should be broken down into smaller chunks and delivered in an iterative/agile
manner. The duration of each project/sub-project should not be more than three
months. Even within these three months, there should be multiple sprints with
intermittent deliverables, so that business feedback is received on a continuous
basis. One important aspect to note is that, in general, the duration of business
projects would be longer during the first couple of years. However, as the data
lake gets enriched with more enterprise data and the overall maturity of the data
and analytics organization increases, project durations in subsequent years would
progressively become shorter. By year 5, most of the enterprise data would
already be available in the data platform, and if the data is architected the right
way, business projects can be executed extremely fast.
• Period of disillusionment: As mentioned in the previous paragraph, during the
first couple of years the duration of projects would be longer. There would be a lot
of foundational activities required during that time. These would be time
consuming, but important. The slow initial pace of business value delivery often
leads to disillusionment amongst the business stakeholders. During the strategy
definition phase, a lot of hype gets created, and the expectations of stakeholders
become extremely high. They start expecting some sort of "magic" to happen
within the next year that would completely change their decision-making process.
However, when they don't see results fast, they lose hope and sometimes even
stop supporting the data and analytics initiatives.
To manage this period, organizational change management plays a critical role.
Right from the strategy definition phase through ongoing execution, one needs to
educate the business stakeholders on how certain foundational activities would
take time but would be very useful to them in the medium to long run. The nature
of these foundational activities should be explained to them, along with the
message that shortcuts taken beyond reasonable limits early on would prove
disastrous later. This change management exercise needs to be handled well to
set realistic expectations. Enterprises that are not able to successfully go past the
"period of disillusionment" ultimately fail to leverage data and analytics. They
lose most of the initial investments made in the data and analytics program.
The worst consequence of such failure is that an atmosphere of distrust for such
initiatives gets created across the enterprise, and it therefore becomes very
difficult to drive such an initiative again in the future.
• Maturity evolution: Because of the reasons mentioned in the previous two
bullet points, the data and analytics maturity curve of an enterprise is not
linear. It evolves at a slower rate initially, followed by a sharper increase
later, as illustrated in the figure. Without the right building blocks in
all the five streams of the roadmap, the risk and chances of failure are extremely
high initially. Hence, while defining the roadmap, one should plan some of the
complex projects for later, even though they may promise high business value.
Of course, there could be exceptions to this rule due to business exigencies. All
such exceptions should be carefully planned.
Sometimes, data and analytics leaders are put under tremendous pressure by
executives to deliver results quickly or face the axe. Enterprises that have never
invested strategically in data and analytics cannot transform themselves into
data-driven enterprises overnight by simply investing a few million dollars. It
entails a lot of foundational work and organizational change management, which
takes time.
• Funding of projects: Finally, while defining a roadmap, due consideration
needs to be given to how the projects are going to be funded. Typically, for
business projects, funds should come from the respective business unit or business
function, and for technology and other projects, funds should come from the
CIO budget. However, I have a suggestion for the funding of business projects
during the early years of the roadmap. When one starts architecting a new data
platform, one needs to bring data from various enterprise systems. This would be
done in an incremental manner, starting with the systems that provide key master
and transactional data for the initial business projects. However, the dilemma one
faces while fetching data from a system is whether to bring only the data required
for the current project or to also bring other data that would be used in future
business projects.
I always recommend bringing all the relevant data (required for the immediate
project as well as for potential future needs) from a particular source system
and architecting it in the data platform in a manner that creates a strong
data foundation for the future. In addition to helping create a business
process-oriented data architecture, this also ensures reusability of data ("bring
once, use multiple times") across various projects.
While the above approach is good for an enterprise, it requires more effort
and budget. The question then is: why should the sponsor of the initial business
projects fund this extra budget? Every business unit or function works under
budget constraints, and any proposition to shell out extra money for the
larger good of the enterprise would not work out. To manage this concern, it
is better for the data and analytics leader/CIO to fund a portion of the business
projects, especially during the first couple of years, from her/his budget. If
required, approval from the CEO should be taken by presenting a business case
for the initial investment.

2.5 Summary

Understanding the business capabilities required to grow or transform an enterprise
is an important starting point for defining data and analytics strategy. Different
enterprises have different market challenges/opportunities and, therefore, differ-
ent business priorities. Data and analytics strategy must be aligned with business
priorities. Business capabilities can broadly be grouped into two categories: (a)
enterprise performance management, and (b) digital transformation. These two,
however, are not mutually exclusive. At a foundational level, an enterprise requires
a strong data backbone—an information fabric that integrates data from across
multiple business units, functions, and geographies of the enterprise, to drive all
data and analytics initiatives.
A structured three-step approach is required to define enterprise data and ana-
lytics strategy. Step 1 is "enterprise churning", which helps in identifying the
required business capabilities. Step 2 is about defining the business capabilities
in detail, based on which the other four strategy elements can also be defined,
taking into consideration the interdependencies between them. Finally, step 3 is
all about creating an integrated roadmap to implement the strategy. While lay-
ing down the roadmap, inter alia, it is important to understand the typical nature
of data and analytics maturity evolution within an enterprise, and accordingly set
stakeholders' expectations and drive organizational change management.
3 Second Element of Strategy—Technology and Architecture

Establishing Technology and Architecture Foundation That Is Futuristic and Flexible

3.1 How Not to Define Technology and Architecture Strategy?

During the first half of the decade 2010–2019, big data was at the peak of its hype
cycle, with every enterprise having inflated expectations of big data technologies.
There was a race amongst CIOs to have a big data plan and infrastructure in
place. Any CIO who did not have such a plan was perceived as being out of touch
with the latest technologies. I remember an incident where the CIO of a large
multinational bank decided to put a large on-premises big data infrastructure in
place, investing millions of dollars in it. When he was asked by a leading analyst
about his plan to leverage the infrastructure, his response was that he was yet to
figure that out and was discussing with various business leaders of the enterprise
in this regard. This is an example of how not to establish a technology and
architecture foundation for data and analytics. Let me share another experience below.

The data and analytics director of a large multinational services company was a dynamic
and aggressive person, known for his quick execution capabilities. Due to his excellent past
performance and good articulation skills, he was able to convince the top management of
the company to allocate a budget of around 5 million dollars for him to set up a big data
platform that would be transformational in nature. This was in the year 2014. His request
was approved.
His plan was simple: set up a big data platform (data lake) and quickly ingest data from
most of the enterprise transactional systems across the globe into this data lake. He
didn't want to spend time during the initial period establishing a strong data architecture,
because he felt that once he had most of the enterprise data in the data lake, business
stakeholders would be enticed to come to him to meet their data needs. While, in theory, this
sounded like a good strategy, it turned out to be the biggest failure of his professional career.
There were various reasons for this failure, the top three being: (a) he did not engage with
the business stakeholders early on to understand their needs and pain areas, (b) in a hurry
to establish the data lake with enterprise data within it, he did not bother to address data
quality issues, and (c) he did not give much thought to how to architect and model data
within the data lake to make it easy to consume and to ensure reusability.

© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2022
P. Sah, Defining Enterprise Data and Analytics Strategy, Management for Professionals,
https://doi.org/10.1007/978-981-19-5719-2_3
Because of the flawed approach, whichever business stakeholders worked with him initially
got frustrated with the quality of data and other issues that cropped up. As a result, they
lost trust in him. Word spread quickly across the enterprise, and anyone else who was
planning to work with him also dropped the idea. He learnt the hard way that data
and analytics strategy cannot be driven by simply setting up a state-of-the-art technology
platform into which one dumps all enterprise data without a well-thought-through strategy
in place.

The above examples raise the important question of how one should define tech-
nology and architecture strategy for data and analytics. The focus of the rest of
this chapter is to answer this question.

3.2 Understanding Non-functional Requirements to Define Data and Analytics Architecture

A logical first question that one needs to ask while defining an enterprise-wide
data and analytics architecture is: What is the intended purpose of the architecture?
Let me give an analogy of an architect who needs to create a blueprint
of a civil construction. The first step towards this is to understand what needs
to be architected. Is it a museum, dam, road, bridge, airport, canal, shopping
mall, commercial office building, factory, or a residential society? Depending on
what needs to be architected, the architect would ask the next level of detailed
questions to understand the needs of the owner, users, or occupants of the proposed
construction. Based on the responses to these questions, the architect would go
ahead and create an architecture that takes care of all the key requirements.
These requirements would determine various design criteria—concrete foundation,
structural material, aesthetics, drainage, utilities, and so on.
While this first step, as adopted in the civil construction industry, sounds very
logical, a similar step is either skipped or done hastily in the IT industry, especially
in data and analytics. If you ask the data and analytics leader or her/his
team why they skipped this step, their typical responses would be as below.

“We follow agile methodology in our organization. Speed is most important. We cannot
spend so much time in defining strategy”.

“We are using a proven architecture, so we don’t need to worry”.

“I used the same architecture in my last company, and it worked perfectly fine. Hence, I did
not feel the need to again define the architecture for my current company”.

“Our strategic cloud vendor, that is a global leader in data and analytics, advised us to use
this architecture. Hence, we did not feel a need to change anything”.

It is important to understand that the needs of different enterprises vary widely.
I discussed this in the previous chapter on business capabilities. The data and
analytics architecture of any enterprise needs to take into consideration the
specific needs emerging out of the CEO's strategy and business plan. I also talked
about how these needs (or ideas) would have to be translated into specific business
projects over a period of a few years. In accordance with the defined roadmap, the
proposed architecture would also need to evolve during this period. It does not
make sense to invest early in capabilities that may be required only in, say, the
fourth year.
Now, I come to the point of how non-functional requirements should be captured
so that data and analytics architecture can be defined. In the previous
chapter, I talked about the exercise of "enterprise churning", through which various
business needs and ideas emerge. There are two dimensions of such needs or
ideas:

1. Functional requirements: Functional requirements are the various business
needs or ideas that can be translated into key performance indicators (KPIs),
such as customer satisfaction score, or digital use cases (i.e., business scenarios
for data and analytics), such as customer service improvement.
2. Non-functional requirements: Non-functional requirements are the system
requirements that enable the realization of functional requirements. Hence, it is
very important that while we capture needs and ideas from the "enterprise
churning" exercise, we keep documenting all the non-functional requirements as
well, because these become the starting point for defining data and analytics
architecture.

The mind-map illustrated in Fig. 3.1 shows how one can start the process of defining
data and analytics architecture by understanding the various non-functional
requirements of business stakeholders across business units, functions, regions, and
levels within the enterprise.
A mind-map can help to understand and analyze non-functional requirements
in a structured manner, to ensure that all the architectural building blocks are taken
care of while defining data and analytics architecture. Let me explain the mind-map
with two approaches: moving inside-out, and moving in a clockwise direction.
Starting with the first approach, the mind-map can be divided
into three annular layers moving from the inside (center of the figure) to the outside:

• Layer 1: In this layer, the various categories of non-functional requirements are
listed within oval boxes—data sources, mode of delivery/access, temporal, and
so on. I would like to highlight here that the list in the illustration is not
necessarily exhaustive. In some enterprises, a few additional categories can crop
up. Also, the importance of each of these categories would vary for different
enterprises. Further, the "importance" dimension (i.e., relative importance of
different non-functional requirements) has not been captured in this mind-map.
It is taken care of later during the architecture options evaluation exercise, which
I will discuss later in this chapter.
Fig. 3.1 An illustrative mind-map to understand non-functional requirements of data and analytics

• Layer 2: Moving outwards, one can break down each category of non-functional
requirement into its next level of constituents. For example, "data type" can
be broken down into "structured", "unstructured", and "semi-structured" data.
Again, the list in the illustration is not necessarily exhaustive.
Also, while I have not broken layer 2 into further levels of constituents, one
can break them down further, if required.
• Layer 3: In this layer, which is the outermost layer of the mind-map, I have
given one example of a business use case (functional requirement) that each
category of non-functional requirement would typically cater to. This is done
to illustrate typical requirements that would come from business stakeholders,
and how they need to be translated into non-functional requirements. There
would be a plethora of such requirements, which one needs to analyze in detail
before arriving at all the non-functional requirements and the mind-map. Also,
it is important to note that, in most enterprises, one must take into consideration
the requirements of not just internal stakeholders, but also of stakeholders
from outside the enterprise, such as suppliers or customers. As the supply chain
touch points of enterprises become more integrated, spanning beyond
enterprise boundaries, it is important that the information needs of stakeholders
outside the enterprise are well understood to define data and analytics
architecture.

With the above annular, inside-out explanation of the mind-map, let me now
explain it while moving in a clockwise direction, taking one non-functional
requirement category at a time, starting with "Data Sources".

3.2.1 Data Sources

One of the reasons why implementing any data and analytics solution is difficult
in large enterprises is the presence of a wide variety of systems in which
enterprise data resides. These include various ERPs, CRMs, PLMs, etc. Typically,
multiple versions and instances of each exist within different business units or
geographies. These might have been implemented at different points of time and
might have been customized extensively, to meet local needs or to cater to non-
standardized business processes. All such factors lead to lots of challenges from
a data perspective. To add to this complexity, enterprises today are looking to
capture data coming from IoT (Internet of Things), social media, external research
reports, and other sources to drive various digital initiatives. From a data and
analytics architecture perspective, one needs to understand the business criticality
of different types of data and accordingly provision the required capabilities in the
architecture. Let me explain this point through the following example.

I was interacting with the business leaders of an organization in the food packaging
industry. They produce and sell packaging products that are commodity in nature. They
charge a premium price for their products because of a strong brand name. They also provide
advisory services to their customers, all of whom are in the food industry. Their advice is
mostly on the type of product and packaging that the customers should focus on, as deter-
mined by market trends and future forecasts. Their customers treat the insights thus received
as extremely valuable. The organization focused on maintaining a very high quality of
advisory services, as they believed that it was key to maintaining a high customer retention
rate.

Over the last few years, the organization had been facing a stiff challenge from Chinese
competitors, who had started to produce products very similar to those of this organization.
The Chinese products were of almost equivalent quality but were being sold at nearly half
the price. This was posing a threat to the market share of the organization. During my
discussion with their business leaders, they mentioned that, despite their strong brand name,
the only way they could continue to charge a premium price was by further strengthening
the quality of their advisory services. They did not want to get into a price war with their
Chinese competitors, as that would jeopardize the survival of their organization in the long
term.

To maintain the high quality of advisory services, the organization needed to (a) derive
their advice from even deeper analysis of market research data, and (b) churn out advice
quicker so that their customers would benefit early. To meet both these objectives, the
organization needed to establish a data and analytics architecture that could combine data
obtained from various enterprise systems with that obtained from market research reports
received in various formats—word documents, pdfs, ppts, excels, and pictures, in addition
to the data obtained from social media. The ability to extract relevant information from
varied data sources (of structured and unstructured data) and intelligently combine them, to
derive meaningful insights, was one of the most critical non-functional requirements for the
success of the organization.

3.2.2 Mode of Delivery/Access (of Data)

The objective of data and analytics platforms in most enterprises today is not just
to provide information for reporting and analytical needs, but also to provide an
integrated source of enterprise data to drive various digital initiatives. To cater to
all such analytical and digital needs, one must plan for various modes of delivery
of data/insights to different information consumers, within as well as outside the
enterprise. Let me explain this point through an interesting example from the oil
and gas industry.

I was advising a large chemical company on their enterprise data and analytics strategy. One
of the divisions of this company supplies and services chemical programs that support oil
and gas production globally. Selecting an optimal chemical treatment program is critical for
the health of oil wells, and to avoid their failure. Therefore, each day, trucks of this company
would move around the oilfields, injecting chemicals into wells and pipelines. To determine
the type and quantity of chemicals needed, their employees would take regular samples from
the oilfields and test them in their field laboratories. This data would then get matched with
other chemical treatment data (such as production data, chemical dosage data, and past
failure data) to determine the optimal chemical treatment program. Recommendations would
accordingly be made to their customers (i.e., oil and gas producers).

The above process not only required integrating a lot of internal and external data, but also
required a good amount of interaction between the company's employees and their customer
counterparts. Process workflows needed to get approved as soon as all the data was
integrated and the recommendation engine provided a recommendation. The CEO of
the company wanted this whole process to be automated. This was part of his digital vision
and would give him competitive advantage. Through this initiative, his company could
provide a premium and differentiated service that could help the company increase its
market share in various oilfields across the globe.

To achieve the above objective, from a data and analytics architecture perspective, it
required, inter alia, automation of the data consolidation and aggregation process, applying
various business rules and logic, taking pictures of any oil leakages in pipelines and
uploading them to the data and analytics platform on a real-time basis, applying artificial
intelligence and machine learning to support production and chemical treatment, designing
a recommendation engine, and defining workflows that would need to be accessed by the
employees of both the companies (the chemical company and the oil and gas producer). All
of this needed to happen on a real-time basis, during day-to-day operations.

From a mode of delivery/access perspective, the critical non-functional requirements that had
to be considered while defining the data and analytics architecture were: self-service
analytics, delivery of various information to consumers/applications, the ability to upload
images and other data through mobile applications to the data and analytics platform, access
to various recommendations from the recommendation engine, and access by workflows to
various data.

3.2.3 Temporal

Temporal requirements relate to time durations pertaining to data. They form another
important category of non-functional requirement. From an architecture standpoint, one
needs to account for various aspects of data duration. Would one need to manage and store
data daily or on an immediate (real-time) basis? Would one need to store a lot of
future projections in the data and analytics platform? Would one need to retain a lot
of historical data, and, if yes, how many days/years of history would be required for
various data types? Which historical data would be required for legal/auditing
purposes and which for analytical purposes? There are costs (both
visible and latent) associated with all these requirements. There is a misconception
that with the cloud infrastructure options available today, the costs are very low. This
is not true. While the overall cost of infrastructure has come down with cloud, there
are costs associated with each data storage or data processing requirement. Hence,
a cost–benefit analysis needs to be done for each requirement.
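As a rough illustration of such a cost–benefit calculation, the sketch below estimates the monthly storage cost of one retention requirement. The daily volume, retention window, per-GB rate, and hot/cold tiering split are entirely hypothetical assumptions, not actual cloud pricing; the point is only that each retention decision has a quantifiable cost attached to it.

```python
# Back-of-the-envelope cost of a historical data retention requirement.
# Volumes and the per-GB-month rate are illustrative assumptions only.

def retention_cost(daily_gb, retention_years, usd_per_gb_month,
                   hot_fraction=1.0, cold_discount=0.2):
    """Estimated monthly storage cost once the retention window is full.

    hot_fraction of the data stays in frequently accessed (hot) storage;
    the rest is tiered to cold storage at cold_discount of the hot rate.
    """
    total_gb = daily_gb * 365 * retention_years
    hot_gb = total_gb * hot_fraction
    cold_gb = total_gb - hot_gb
    return hot_gb * usd_per_gb_month + cold_gb * usd_per_gb_month * cold_discount

# Keeping 7 years of 50 GB/day, with only the most recent year kept hot:
cost = retention_cost(daily_gb=50, retention_years=7,
                      usd_per_gb_month=0.02, hot_fraction=1 / 7)
print(f"Estimated monthly storage cost: ${cost:,.0f}")
```

Running such a calculation per requirement, and per data type, makes the "latent" costs visible early, so that legal retention needs and analytical retention needs can be priced and justified separately.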

3.2.4 Data Security

An extremely critical architectural consideration is data security. I was interviewing
the president of a global enterprise that I was advising on data and analytics
strategy. While he talked about various business imperatives and what needed to be
done to achieve them, from a non-functional requirement perspective, he gave a very
brief requirement: "I need a system that is simple to use and enables boundary-less
flow of information. At the same time, it should have stringent monitoring for corpo-
rate espionage. Control is required on who can see what information". While this
was a short requirement from him, it was by no means an easy one to accomplish.
There are various factors that make this task complex. I am listing a few key ones
below.

• The data and analytics platform needs to be accessed not just by internal employees
of the enterprise, but by employees of the partner ecosystem as well, such as
suppliers or customers.
• The number of modes of accessing the data and analytics platform is increasing
today, since the platform meets not just the reporting and analytical needs of the
enterprise, but also drives a lot of digital initiatives, for which the respective
digital applications need to regularly access data residing in the platform.
• Data and analytics platforms are moving outside of enterprise boundaries to
public clouds.
• While we refer to the enterprise data and analytics platform as a singular noun, any
global enterprise has an enterprise platform that is a virtual sum of multiple
platforms, which are inter-connected and work in tandem (this is commonly referred
to as a "data mesh", and there are various architectural principles for creating a
good data mesh architecture). Often there are one or more platforms in each coun-
try/region to take care of local data needs of the enterprise. Local platforms are
required not just to meet the local business stakeholders' requirements, but also
to meet legal and statutory requirements of the local government.
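To make the president's "who can see what" requirement concrete, the sketch below filters data rows based on a user's role and region. The roles, the policy rules, and the data are entirely hypothetical; a real platform would enforce such policies in its security layer (row-level security, entitlements, etc.) rather than in application code, but the logical shape of the control is the same.

```python
# Minimal sketch of "who can see what": role/attribute-based row filtering.
# Roles, regions, and the policy itself are hypothetical examples.
from dataclasses import dataclass

@dataclass(frozen=True)
class User:
    name: str
    role: str        # e.g. "executive", "analyst", "supplier"
    region: str      # region the user belongs to

rows = [
    {"region": "EMEA", "metric": "margin", "value": 0.21},
    {"region": "APAC", "metric": "margin", "value": 0.18},
    {"region": "EMEA", "metric": "sales",  "value": 1200},
]

def visible_rows(user, rows):
    """Executives see everything; analysts see only their own region;
    external partners (suppliers) never see margin data."""
    out = []
    for r in rows:
        if user.role == "executive":
            out.append(r)
        elif user.role == "analyst" and r["region"] == user.region:
            out.append(r)
        elif user.role == "supplier" and r["metric"] != "margin":
            out.append(r)
    return out

print(len(visible_rows(User("ceo", "executive", "EMEA"), rows)))   # all 3 rows
print(len(visible_rows(User("ana", "analyst", "EMEA"), rows)))     # 2 EMEA rows
print(len(visible_rows(User("sup", "supplier", "APAC"), rows)))    # 1 non-margin row
```

The complexity listed in the bullets above comes from multiplying this simple idea across partner ecosystems, multiple access modes, public clouds, and country-specific legal requirements, all of which add dimensions to the policy.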

3.2.5 Data Type

Data can broadly be divided into three categories, listed in decreasing order of
ease of management: structured, semi-structured, and unstructured. Structured data
has a pre-defined format or structure, and is hence easy to manage and analyze. Semi-
structured data, such as a CSV file, has some structure to it, which makes it easier
to manage than unstructured data. Unstructured data, which is the most
complex to manage, does not have any pre-defined format. Audio files, video files,
social media posts, word/pdf documents, and PowerPoint presentations are a few
examples of unstructured data. Some enterprises want a data and analytics
platform only for structured data, and want to manage their unstructured data,
such as documents, through a content management system alone. For them, the
business needs to combine structured/semi-structured data with unstructured data
are not many. However, in many other enterprises, the need to combine various
types of data is very critical for decision-making. I explained this point while
citing the example of the company in the food packaging industry above (under the
non-functional requirement of "data sources"—the first category in this list).

3.2.6 Data Atomicity

Depending on the purpose of use of data, one would need either detailed data (i.e.,
data that is granular at the lowest level of transactional atomicity) or summarized
data (i.e., data that is summarized for a particular customer, product, etc., to under-
stand, say, sales to a particular customer or of a particular product in a month).
From a data and analytics architecture perspective, one needs to plan to store a vast
amount of data, not all of which would be required at a granular level.
One of the key reasons for keeping data at a granular level is that executives or
managers often want to do root cause analysis if a certain KPI is not performing
as per expectation. While senior executives may want to drill down a couple of
levels to understand the root cause, managers may want to drill down to the lowest
(transactional) level of data. As an example, let me quote below a non-functional
requirement provided by the CEO of a company.

The CEO of a global automotive company gave the following requirement to the data and
analytics team. He had a set of strategic KPIs that he wanted to monitor daily, first thing
in the morning, through a simple and intuitive dashboard. He expected these KPIs to be
updated based on data refreshed by the end of business hours of previous day from all the
countries, across the globe, where the company had business operations. Further, he wanted
that the dashboard should give a clear indication of which of his strategic KPIs were not
performing up to the mark, and for such KPIs that were under-performing, he wanted to
understand root cause of the problem in maximum two clicks on his laptop or mobile device.

Based on the root cause analysis, the CEO wanted to understand who was responsible for
taking corrective action for under-performance of his KPIs, so that he could pick up his
phone and talk to the person responsible. He believed that even if the person was a couple
of levels below him in the organizational hierarchy, picking up the phone and talking
directly to the person responsible always brought faster results.

To meet the above requirement of the CEO, it was important that while defining data and
analytics architecture, the architecture team, inter alia, created a data architecture that had
the right amount of detailed and summarized data, integrated in a manner that would enable
all such drill-downs efficiently.

3.2.7 Latency

Latency here means the frequency at which data is ingested from each of the data
sources. It is another important consideration while architecting a data and
analytics platform. For a data and analytics platform, latency would typically vary
from real time (i.e., immediately on completion of a transaction or an activity in a
system) on one hand to monthly (i.e., once in a month, on a particular day of the
month) on the other. For many decision-making needs, a latency of once in a
day is good enough.
Low latency (i.e., real-time or near real-time) architecture comes with an associated
high cost and therefore should be planned carefully. I have seen many cases
where enterprises get lots of data on a near real-time basis, but their business
46 3 Second Element of Strategy—Technology and Architecture

stakeholders do not need to use it till the next day. On the other hand,
there are cases where the need is to take decisions on a real-time basis, but data is
refreshed only every few hours or once in a day. Hence, planning for the right latency,
based on decision-making needs, is very important. During the “enterprise churning”
exercise, described in the previous chapter, one would get a good understanding
of latency needs for various business scenarios, based on discussions in business
workshops.

In the example of the company in the oil and gas industry that I quoted earlier in this
section of the chapter, under the non-functional requirement category “Mode of
delivery/access (of data)”, data was required to be ingested on a real-time basis into the data and
analytics platform, so that the entire cycle from oil sample collection to recommendation could
be completed quickly and necessary actions taken to avoid any oil rig or pipeline failure.
Any delay meant huge loss of money and, sometimes, could even lead to fatality. Hence,
latency was amongst the most critical non-functional requirements for the success of the
company’s digital initiative.

3.2.8 Data Quality and Integrity

During my interactions with business leaders of different enterprises, I have very
frequently come across the concern that they do not trust the data available in the
enterprise data and analytics platform(s). Because of this, they tend to rely more
on their gut feeling, instead of the data from the platform, for taking decisions. In
my experience, data quality is one of the top reasons why many data and analytics
projects fail. Let me share an example below, in this regard.

I met a delegation of data scientists from the research and development (R&D) function of
a multinational corporation that is in the business of manufacturing proprietary chemical
and agricultural products. R&D was extremely important for the company and, therefore,
they had a large team of highly qualified data scientists focusing full-time on data mining.
During my meeting with them, they talked about various challenges that they faced while
developing complex algorithms for data mining. However, they mentioned that their biggest
challenge was the bad quality of the data available. As a result, they had to spend more time
in cleaning the data, to make it usable, rather than in developing algorithms for data
mining and analyzing results. They mentioned that, on average, they had to spend 80%
of their time in data cleaning. This was under-utilization of the skills of a highly qualified
team. Other than the team getting frustrated, this issue led to delays in projects and loss of
credibility of the results obtained.

The pain of bad data quality is felt not just in the R&D function, but in almost every
other function within an enterprise. One function that is almost always unhappy
about data quality is finance. For them, tallying up and reconciliation of numbers
is a must. When the numbers from the data and analytics platform do not match
the financial consolidation or other systems, they end up spending hours and days
manually working out the numbers in spreadsheets to resolve the differences. It
is a painstaking process.

From a data and analytics architecture perspective, a lot of thought needs to be
given to how data quality would be managed in the data and analytics platform.
Processes need to be established not just for cleaning and harmonizing the data, but
also for regularly monitoring data quality. Efforts should be made to continuously
improve data quality, and for this, one should define and measure KPIs such as a data
quality index. Data and analytics architecture should be defined in a manner that
enables executing all the above processes in an automated manner. The platform
should regularly publish a data quality dashboard that highlights all the data
quality issues and shows how the quality has been improving over a period.
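A data quality index such as the one mentioned above can be made concrete with a small sketch. Everything below is illustrative, not a standard definition: the three dimensions (completeness, validity, uniqueness), their weights, and the sample values are assumptions that a real enterprise would replace with its own standards.

```python
# Hypothetical data quality index over three common dimensions.
# Weights and rules are invented for illustration.

def completeness(values):
    """Share of records that are not missing."""
    return sum(v is not None and v != "" for v in values) / len(values)

def validity(values, is_valid):
    """Share of non-missing records that pass a business rule."""
    present = [v for v in values if v not in (None, "")]
    return sum(is_valid(v) for v in present) / len(present) if present else 0.0

def uniqueness(values):
    """Share of records that are distinct (duplicates lower the score)."""
    return len(set(values)) / len(values)

def data_quality_index(values, is_valid, weights=(0.4, 0.4, 0.2)):
    scores = (completeness(values), validity(values, is_valid), uniqueness(values))
    return sum(w * s for w, s in zip(weights, scores))

emails = ["a@x.com", "b@x.com", None, "bad-email", "a@x.com"]
dqi = data_quality_index(emails, is_valid=lambda v: "@" in v)
print(round(dqi, 2))  # → 0.78
```

Tracked over time per table or per attribute, a score like this is exactly what the data quality dashboard described above would plot.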

3.2.9 Business Model

As stated earlier, large enterprises are complex. They comprise multiple
business units and functions and are present in different geographical regions
across the globe. Their business models vary across the enterprise. Even within
a business unit or a geographical region, where an enterprise has been operating
with a business model for years, digital technologies are compelling the enterprise
to rethink and redefine its existing business model.
An interesting example of business model change is the servitization of
products, through which enterprises are moving away from a B2B to a B2B2C model.
Let me describe it below.

Servitization of products is becoming an often-used strategy for many enterprises to
differentiate themselves from competition and provide better value to customers. They do this by
providing a service through a product instead of selling the product upfront. The definition of
a product itself is changing in many cases. Everyone is aware of Netflix and how it is delivering
entertainment content as a service rather than selling DVDs. A similar trend is being seen
in the airline industry, where airplane tire manufacturers are selling tires-as-a-service instead
of selling tires upfront to airlines. For each landing of an airplane, the tire manufacturer
charges a certain amount of money that is agreed contractually with the airline. Beyond that,
the airline does not have to bother about any upfront payment or repair and maintenance of
the tires. Tire manufacturers regularly monitor wear and tear of tires and decide to either
retread the tires or replace them with new ones, as deemed fit. Data from the sensors in the
tires can also tell the tire manufacturers about the health of the tires.

If one were to look across enterprises that are doing very well today, one would
find many other examples of enterprises driving business model change. From
a data and analytics architecture perspective, what this means is that one must take
into consideration the variety of existing as well as evolving business models of
an enterprise. One must understand the flexibility required in the architecture to
accommodate data and analytics needs emerging from the current business models
and the potential future ones.

3.2.10 Data Usage

The next category of non-functional requirements is all about understanding various
data usage patterns within an enterprise. Different business stakeholders in an
enterprise have different needs—some need cross-functional data, while others
need global data. Some, such as data scientists, may want to work with raw data,
while others may want to work with harmonized data. In the previous chapter,
I covered this aspect in detail, while discussing enterprise performance
management, the KPI framework, and digital strategy.
It is a well-known fact that the most valuable insights come from cross-functional
data. Let me share an interesting example to explain this point.

If one does a correlation analysis between data on employees’ absenteeism (obtained from HR
systems) and data on the quality of product/service delivered (obtained from quality
management systems), one may find a correlation between them. One may infer from this that a greater
absenteeism rate led to poor quality of product/service, possibly because the existing team
became overloaded with work. Some further analysis and discussions with relevant
stakeholders would then be required to confirm this hypothesis. Hypotheses such as this, when
validated by data, give a lot of valuable insight into problem areas, thereby improving, inter
alia, quality of product/service and productivity of people.
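The correlation analysis described in this example can be sketched in a few lines. The monthly absenteeism and defect-rate figures below are invented purely for illustration; a real analysis would pull them from the HR and quality management systems.

```python
# Pearson correlation between two cross-functional measures,
# computed from scratch on invented monthly figures.
from math import sqrt

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

absenteeism_pct = [2.1, 3.5, 2.8, 5.0, 4.2, 3.1]   # from HR systems (illustrative)
defect_rate_pct = [0.8, 1.4, 1.0, 2.1, 1.7, 1.2]   # from quality systems (illustrative)

r = pearson(absenteeism_pct, defect_rate_pct)
print(round(r, 2))  # a value close to +1 makes the hypothesis worth probing further
```

As the text cautions, a high coefficient only flags a hypothesis; confirming causation still needs discussion with the relevant stakeholders.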

Cross-functional data is required not only for correlation analysis or for analyzing
various hypotheses but also for measuring various KPIs. Business processes
in enterprises cut across various business functions, and for measuring many
KPIs, one needs data from across various functions/business units/geographies.
From a data and analytics architecture perspective, one must architect data in a way
that meets all such cross-functional needs, whether they were stated during business
workshops or not stated explicitly but could possibly arise in future.

3.2.11 Metadata

Last in the sequence of non-functional requirements in the figure is metadata,
which, by definition, is “data about data”. Metadata can broadly be divided
into business metadata and technical metadata. Broadly speaking, business metadata
relates to business terminology, rules, and data traceability, while technical
metadata relates more to data formats, structures, and how data moves across
tables/systems (technical lineage). Metadata management is all about managing
data in a way that makes it easier to understand, use, and analyze. Let me share
below an interesting example on traceability of business data.

I was talking to a senior leader from an organization that is in the agriculture and food
industry. This organization is present in many different countries and has a global supply chain.
The leader was sharing with me his data needs and challenges. One of the challenges he
talked about was the need for traceability. He mentioned that their customers were keen to
understand the traceability of food products “from the farm to the fork”. When customers
buy a food item from the shelf of a retail outlet, they would like to scan the product using
their smart phone and check traceability. He mentioned that customers today are not only
interested in becoming aware of the nutritional facts of the product they are buying but are
also keen to ensure that the product was made in a sustainable manner. This need has been
increasing over the last few years, as customers are becoming more conscious of what they
buy.

The reason why the leader shared with me the above trend was to discuss on how to establish
a data and analytics platform that, inter alia, could connect all the data together in a way that
would provide end-to-end traceability for every stock-keeping unit (SKU) available on the
shelf of a retail outlet or on an e-commerce site.

Requirements such as the above are not just limited to the food industry. Many
executives and managers across various enterprises want to have better traceability of
various data. In the previous chapter, while discussing enterprise performance
management, I talked in depth about the importance of having common
definitions of data and KPIs. Managing all such definitions also falls under the
purview of metadata management. Understanding all such metadata-related non-functional
requirements, both the generic ones as well as the enterprise-specific
ones, is very important while defining data and analytics architecture.
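As a concrete flavor of the “farm to fork” requirement, lineage can be modeled as a graph that is walked upstream from a finished SKU to every source that contributed to it. All node names and edges below are invented for illustration; a real platform would populate such a graph from supply chain and metadata management systems.

```python
# Hypothetical lineage graph: each node maps to its upstream parents.
lineage = {
    "SKU-1001": ["batch-77"],
    "batch-77": ["mill-A", "mill-B"],
    "mill-A": ["farm-12"],
    "mill-B": ["farm-34", "farm-35"],
}

def trace(node, graph):
    """Walk upstream and return every source that contributed to `node`."""
    seen, stack = set(), [node]
    while stack:
        current = stack.pop()
        for parent in graph.get(current, []):
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return sorted(seen)

print(trace("SKU-1001", lineage))
# → ['batch-77', 'farm-12', 'farm-34', 'farm-35', 'mill-A', 'mill-B']
```

The same child-to-parents structure serves both business lineage (batches, farms) and technical lineage (tables, systems) mentioned above.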

3.3 Defining Data and Analytics Architecture

Understanding all the non-functional requirements, as discussed in the previous
section, is the most critical step in defining enterprise data and analytics architecture.
Once all the requirements are understood and analyzed in detail, one should
proceed to identify possible architectural components that would be needed to
satisfy these requirements. In view of the wide variety of disparate non-functional
requirements that would emerge from different business entities of any large enterprise,
the exercise of identifying architecture components needs to be done in a
bottom-up manner, generally starting at the business function level within each
business unit of a geographical region/country and aggregating upwards later.
These components would belong to one of the following layers of a data and analytics
architecture (a good data and analytics architecture is always multilayered,
broadly categorized as below):

• Data integration layer, that takes care of integrating or ingesting varied datasets
from a wide variety of transactional applications/data sources into a common
data and analytics platform.
• Data management and storage layer, that takes care of harmonizing and
modeling data in a manner that would fulfill all types of data demand of the
enterprise.
• AI/ML, analytics, and information delivery layer, that fulfills all data,
information, and analytical needs of the enterprise.
• Other layers, within as well as cutting across all the above layers, such
as the data security management layer, metadata management layer, data quality
management layer, and so on, that ensure governance of the data and analytics
platform.

I do not intend to get into technical details of the above layers, as the objective of
this book is not to get into technical aspects of various architectures but is rather
to explain how one should approach defining data and analytics strategy, including
architecture.
With this context, let me share an illustration, in Fig. 3.2, that depicts how one
can create summary matrices to map architectural components required by each
business function to satisfy various non-functional requirements.
The approach shown in the figure would vary based on how an enterprise is
structured. For each cell in the matrix, one should highlight specific non-functional
requirement(s) and the corresponding architectural component(s) that would be
needed to meet the different business use cases emerging from “enterprise churn-
ing” exercise. For example, a use case such as predictive maintenance has the
non-functional requirement of real-time IoT analytics, for which one needs to pro-
vision architectural component(s) for ingestion of real-time IoT streaming data in
the “data integration” layer along with components in the other layers.
Using the above approach, once all the required architectural components are
identified for each of the layers, the next step is to create an integrated end-to-end
view of the data and analytics architecture containing all the required components
that would meet all the non-functional requirements. When one tries to do that, it
may emerge that there is more than one possible architecture option. Sometimes,
there could be three or four possible architecture options, each having its own pros
and cons. In such cases, an objective evaluation of the various options needs to be done.
Before talking about how to do an objective evaluation of multiple options, let me
share an interesting experience from a company that had a strong need for managing
customer master data for the purpose of analytics and other digital initiatives,
and to fulfill this requirement, needed to decide between two architecture
options.

One of the Fortune 500 companies had an extremely complex landscape of enterprise
systems. This complexity resulted in various data-related issues. The CEO of this company
had laid out a digital vision that comprised multiple digital initiatives to improve the
company’s top line and bottom line substantially. He was, however, unable to kickstart these
initiatives due to data issues. He believed that if some of the top data issues could be
addressed, he could start many initiatives that he had envisaged.

One of the top data issues was around customer master data. While the company had
implemented a single customer master data management (MDM) solution many years ago, it had
multiple shortcomings. One of the shortcomings was that not all the relevant customer master
data attributes across the customer life cycle were being captured by this system. The reason
for this was that, over the years, their business had evolved, and new systems had been
implemented that were capturing some useful additional customer master data,
but this data was not coming to the central MDM system. Even the data that existed in the
central MDM system had data quality issues that resulted from several process and governance
gaps.

Fig. 3.2 Typical approach to map architectural components to business functions



Fig. 3.3 An illustrative evaluation of architectural options based on key architecture principles

The company had more than 300 applications that needed various customer master data.
This need was being fulfilled by a complex maze of data flows cutting across various
systems. This obviously led to both high cost and data quality issues, which often had adverse
business impact, such as bad customer service.

While defining data and analytics architecture for the company, the above challenge
emerged as a major non-functional requirement that needed to be taken care of. To address
this, two options were identified, which are described below.

The first option was to enhance the existing MDM solution. To do this, many other systems also
needed to be changed. Further, it required changing some business processes and
implementing the changes across the globe. This option was the ideal solution, but required high
cost and, more importantly, a huge change management drive. A lot of concerns were raised
about the practical possibility of implementing this option. However, the cons needed to be
weighed against the benefits.

The second option was to leave the existing MDM solution as-is and create an operational data
store within the data and analytics architecture to manage customer master data. This data
store could become the central platform to feed customer master data to the more than 300
enterprise applications that needed the data. This option did overcome some of the cons of
the first option but had its own limitations.

Hence, the next step was to do a thorough objective evaluation of these options, so that the
right decision could be made. I will discuss an objective evaluation approach shortly;
for this company, the second option was selected, based on a quantitative evaluation of
the options against multiple parameters.

While the above example is limited just to customer master data, through it I
wanted to demonstrate the business criticality of evaluating architecture options.
A bad option, if chosen, can result in substantial loss to the business.
Returning to the topic of end-to-end data and analytics architecture finaliza-
tion, once all the architecture options are defined, one needs to evaluate them and
choose the one that is most suited. It is important that this evaluation is done in an
objective manner, based on quantitative scoring of parameters (“key architecture
principles”) and sub-parameters, as needed. An illustrative evaluation is shown in
Fig. 3.3.

As shown in the figure, one should score, as per a pre-defined scale, each
parameter (or key architecture principle) and sub-parameter (if any suitable
sub-parameters are identified). Key architecture principles are critical dimensions or
elements of any IT architecture. These principles need to be taken into consideration
while defining the architecture. Further, the relative weightage of each
principle depends on its importance in the context of the objective of the initiative
for the enterprise. For data and analytics architecture, one can assign weights
based on the importance of each principle as compared to the others. For example,
if flexibility is very important to take care of both frequent data and analytics
technology changes and the dynamic nature of the analytical needs of business
stakeholders, then one should assign a higher weight to flexibility as compared to, say,
maintainability. In such a case, the business benefit of high flexibility may outweigh the
cost resulting from poor maintainability.
In practice, evaluation of architecture options is not as easy as it appears.
Unlike in technology tools evaluation (which I will discuss in the next section of
this chapter), where the degree of subjectivity is less, in architecture options
evaluation the degree of subjectivity is very high. Most large enterprises have more
than one stakeholder involved in deciding the architecture. There is, generally, a
difference of opinion on what weight should be assigned to each evaluation
parameter (i.e., key architecture principle) while calculating the overall weighted average
score of the evaluation. To resolve such decision-making challenges, one of the best
approaches is to use the “Analytic Hierarchy Process (AHP)”. AHP is a systematic
mathematical approach to arrive at the weights to be assigned to various attributes
in a decision-making process. I will not describe the AHP calculation process here
as there are numerous resources available in the public domain that explain AHP in
detail. Using an approach such as AHP helps in arriving at a final score that would
be agreeable to all. No one can refute a scientific basis for arriving at an objective
score.
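For readers who want a concrete feel for the weight-derivation step without consulting the full AHP literature, here is a minimal sketch using the common geometric-mean approximation of the principal eigenvector. The three criteria and the pairwise judgements (on Saaty's 1–9 scale) are invented for illustration.

```python
# Minimal AHP weight derivation via the geometric-mean approximation.
# Pairwise judgements below are illustrative, not from any real evaluation.
from math import prod

def ahp_weights(pairwise):
    n = len(pairwise)
    # geometric mean of each row, then normalize so the weights sum to 1
    geo = [prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(geo)
    return [g / total for g in geo]

# pairwise[i][j] = how much more important criterion i is than criterion j;
# the matrix is reciprocal: pairwise[j][i] = 1 / pairwise[i][j]
pairwise = [
    [1,   3,   5],    # flexibility
    [1/3, 1,   2],    # maintainability
    [1/5, 1/2, 1],    # cost
]
weights = ahp_weights(pairwise)
print([round(w, 2) for w in weights])  # → [0.65, 0.23, 0.12]
```

A full AHP exercise would also check the consistency ratio of the judgements; the sketch omits that step.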
With the overall approach described above, one can choose the data and analytics
architecture option that would work best for an enterprise. I would like to highlight
one last point—once an option is finalized, some enterprises prefer to take the opinion
of an independent third-party expert (who can be either from an IT analyst firm or
from an IT consulting firm) before investing large sums of money into it. I think
this is always a wise thing to do.

3.4 Selecting Relevant Technologies After Defining Data


and Analytics Architecture

Having finalized the architecture for data and analytics, the next logical step is to
select tools and technologies that would satisfy the defined architecture. This is
analogous to selecting building materials for a civil construction once the archi-
tectural blueprint of a building is ready. There are multiple good data and analytics
tools available in the market today. These tools and technologies are from mega
vendors such as Microsoft, Amazon, Google, IBM, SAP, and Oracle in addition to
those from niche vendors, such as Qlik, who have just one or a select few tools in
their portfolio. While there is no dearth of tools and technologies available in data
and analytics for every layer of the architecture, one needs to adopt a very structured
and objective approach while selecting the ones that would be the right fit for the
enterprise. Figure 3.4 shows an illustrative parametrized approach for the evaluation
of tools.
In the example shown in the figure, the following steps were used:

1. Potential tools for each architectural component (in this case data visualization)
were shortlisted. Seven tools were shortlisted in this example.
2. Various evaluation parameters, which were important for the enterprise, were
identified for scoring of the shortlisted tools.
3. These parameters were bucketed into three categories based on importance—
high, medium, and low.
4. Weights for each of the three categories were assigned based on their rela-
tive importance for the enterprise. “Analytic Hierarchy Process” approach, as
discussed in the previous section, was not used in this case as there was no
difference of opinion on weights. However, one can use AHP, if needed.
5. For each parameter, multiple sub-parameters were identified (sub-parameters
are not listed in the figure). Each sub-parameter was defined in such a way that
there would be no subjectivity while scoring a tool on that sub-parameter.
6. Once all the above steps were completed, experts started scoring each of the
seven tools on each sub-parameter.
7. Next, an overall score for each parameter was calculated as a simple average
of the scores of all the constituent sub-parameters.
8. Finally, the overall tool evaluation score (for each of the seven tools) was calculated
as a weighted average of parameter scores, using the weights decided earlier for the
three categories of parameters.
9. The tool with the highest overall tool evaluation score was recommended as the
top choice.
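Steps 7 and 8 above boil down to a simple average followed by a weighted average. The sketch below shows the arithmetic with invented tools, category weights, and scores; only two tools are compared for brevity.

```python
# Hypothetical tool scoring: sub-parameter scores averaged per parameter,
# then weighted by the parameter's importance category.

CATEGORY_WEIGHTS = {"high": 0.5, "medium": 0.3, "low": 0.2}  # illustrative

def parameter_score(sub_scores):
    return sum(sub_scores) / len(sub_scores)          # step 7: simple average

def tool_score(params):
    # params: list of (importance_category, [sub-parameter scores])
    total = sum(CATEGORY_WEIGHTS[c] * parameter_score(s) for c, s in params)
    norm = sum(CATEGORY_WEIGHTS[c] for c, _ in params)
    return total / norm                               # step 8: weighted average

tool_a = [("high", [4, 5]), ("medium", [3, 4, 4]), ("low", [2])]
tool_b = [("high", [3, 3]), ("medium", [5, 4, 5]), ("low", [4])]
scores = {name: round(tool_score(p), 2) for name, p in [("A", tool_a), ("B", tool_b)]}
print(max(scores, key=scores.get), scores)            # step 9: pick the top tool
```

In this invented example, tool A's stronger showing on the high-importance parameters outweighs tool B's better medium- and low-importance scores.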

I recommend using the above approach for selecting a tool for each architecture
component. Let me share a few additional suggestions as well. First, in the
above evaluation, you will notice that cost is one of the parameters of evaluation
under the “medium importance” category. Sometimes, cost carries a higher weightage if
that is a critical consideration for an enterprise. Second, sometimes a proof of
concept is done using the top tool or the top few tools to ensure that unforeseen
issues do not crop up later. Most tool vendors are more than happy to facilitate
and support such a proof of concept as part of their sales process. Third, almost
all enterprises have an existing set of tools in their landscape. While doing the
above evaluation, one should include the existing tool(s) in the shortlist. If the
top-scoring tool turns out to be different from an existing tool, a deeper analysis
of the pros and cons of switching tools should be done. In most cases, one will go with
the new tool if it scores the highest. However, if the risks of switching are high
and/or the difference in scores is not enough to justify switching, one may decide to
continue with an existing tool.

Fig. 3.4 Illustrative parametrized comparison of tools for each architecture component

3.5 Summary

A business-driven approach for defining data and analytics architecture is always
more successful in enabling an enterprise’s required business capabilities than a
purely technology-driven approach. There are many examples of failures of enterprises
that adopted a purely technology-driven approach, even though they used
best-in-class tools and technologies. The following steps should be followed to define
enterprise data and analytics architecture and choose relevant technologies for the
same.

1. Understand non-functional requirements of business stakeholders across business units, functions, regions, and levels within the enterprise.
2. Analyze these requirements using techniques such as mind mapping.
3. Create matrices to map architectural components required by each business
function, across business units and regions/countries as well as the corporate,
to satisfy various non-functional requirements.
4. Using architectural components thus identified, define an overall/end-to-end
data and analytics architecture. While doing so, one would often arrive at more
than one architecture option.
5. Evaluate the architectural options based on key architecture principles and
rank the options. Choose the highest ranked one.
6. Shortlist suitable tools and technologies for each architecture component of the
chosen architecture.
7. Evaluate the shortlisted tools and technologies based on objective parameters
and sub-parameters. Choose the highest ranked tool for each architecture component.
Sometimes, one may need to do a proof of concept before deciding. Also,
one needs to take into consideration the tools that already exist in the enterprise
before taking a final decision.
4 Third Element of Strategy—Team, Processes, and Governance
Establishing Building Blocks, Including an Agile Team, for Success

4.1 Why Do Data and Analytics Organization and Processes Need to Be Different from Other IT Functions?

In Chap. 2, I discussed business capabilities, which are the first element and
the starting point for defining enterprise data and analytics strategy. Business
capabilities are enabled by underlying architecture and technologies (the second element
of the strategy), which I discussed in Chap. 3. The third element of strategy
is about the data and analytics team, processes, and governance, which are equally
important for a successful data and analytics program. In this chapter, I will discuss
this element in detail.
Many enterprises structure their data and analytics organization and processes in the
same way that they do for other IT functions. This is not recommended, as the
approach with which a data and analytics project needs to be executed is different
from how an ERP or a CRM project is executed. This is because, inter alia, the
level of ambiguity in data and analytics projects is generally higher. A decision
maker does not always know what decision needs to be taken. Based on the
patterns or insights that data unearths, a decision maker needs to decide on the areas
(a key performance indicator, information, or an analytical use case) that need to be
further analyzed. Hence, detailed requirements are rarely easy to define in a data
and analytics project, unlike in an ERP project, where one can define the lowest level
of business processes based on how employees execute their day-to-day
work. Of course, in an ERP project, one needs to spend time on standardization of
business processes, which entails a different type of complexity.
I have seen that when a data and analytics project team reaches out to the business
stakeholders to understand their business requirements, they often get frustrated
when the stakeholders are not able to state their requirements clearly. This is because
of the ambiguity that I talked about in the previous paragraph. Instead of trying
to get a prescriptive set of requirements, the project team need to understand the
business context better. They need to put themselves in the shoes of the business
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2022 57
P. Sah, Defining Enterprise Data and Analytics Strategy, Management for Professionals,
https://doi.org/10.1007/978-981-19-5719-2_4
58 4 Third Element of Strategy—Team, Processes, and Governance

stakeholders and empathize with the challenges they face. Only then will the
project team be able to design a system that can manage the inherent ambiguities.
The team also needs to coach business stakeholders on how to use the data and
analytics system effectively to meet their analytical needs. To realize all of this,
inter alia, it is important to design the data and analytics organization and
processes in a manner that helps manage the inherent ambiguities and challenges
of data and analytics projects.
To align the data and analytics team better with business stakeholders, over the
last few years many enterprises have brought their data and analytics organization
under a business leader instead of keeping it under the CIO. In many cases, this
business leader is the CFO, since the finance function is one of the major consumers
of enterprise data. Further, to give due focus to data and analytics, many enter-
prises have also created roles such as Chief Data Officer, Chief Analytics Officer,
Chief Data and Analytics Officer, or Chief Digital Officer to lead the data
and analytics organization. These roles typically report to the CEO or another CXO.
However, merely bringing the data and analytics organization under a business
leader instead of the CIO does not solve the challenges I discussed in the
previous paragraphs. Irrespective of which leader the data and analytics organiza-
tion reports to, there are certain guiding principles and best practices for the
design of the organization and processes that are required for its success. I have
summarized these in this chapter. These recommendations are based on my experience
from several successful enterprise data and analytics programs, where data and
analytics delivered exponential business value to the respective enterprises.

4.2 Choosing the Right Data and Analytics Organization Model

The first step towards defining the data and analytics organization in any global
enterprise is to decide whether it should be decentralized, centralized, or
federated. Figure 4.1 illustrates these three models.
A federated model is often the recommended option for managing the complex
organizational dynamics that are inherent in a large global enterprise. For
enterprises that are not too complex, a centralized or decentralized organization
may be more suitable. However, there are many other considerations, such as the
enterprise organization structure and business model, that need to be investigated
while deciding on an optimum model. With this context, let me now elaborate on
these three data and analytics organization models and highlight the pros and
cons of each.

4.2.1 Decentralized Organization

Fig. 4.1 Various data and analytics organization models in global enterprises

Most global enterprises provide a lot of autonomy to their business entities in
different regions/countries. They want the local entities to take most decisions
locally to meet the local market needs. Since the local entities are accountable for business
results for their respective region/country, they are given the required autonomy.
If they deliver results that are better than the competitors' in the local market,
the headquarters does not interfere much. It lets the local entities define their
own organization structure, including that of IT. However, the headquarters does
centralize certain functions and processes to derive economies of scale, thereby
providing better support and value to the local entities. For example, the
headquarters often centralizes the purchase of commodities used by multiple
regions, especially those supplied by global suppliers. This helps the headquarters
secure a better bargain due to the global scale of operations. Similarly, for IT,
central purchasing can get a better deal on hardware and software when buying
for all regions/countries centrally.
Since local business entities have a lot of autonomy, they often choose to build
their own IT team that works under the complete control of the local business head
(i.e., the region/country head of the enterprise). In such cases, the data and
analytics organization in each region/country also works independently, except for
some of the central purchasing activities that I mentioned in the previous
paragraph. Such a decentralized data and analytics organization has its own
independent team, processes, technologies, and governance within each
region/country. It reports directly to the local business head and works very
closely with the local business stakeholders to cater to their data and analytics
needs.

The advantage of a decentralized data and analytics organization is that the team
understands local business needs much better and caters to them as per the
priorities set by local business leaders. A decentralized organization can take
decisions quickly, without needing to seek approval from the headquarters.
The disadvantage of the decentralized model is primarily missing out on economies
of scale. Economies of scale result not just from central purchasing but from many
other areas as well. For example, over the last few years, a lot of innovation has
happened in data and analytics technologies. To deliver high value to the business,
it is important for the data and analytics organization to keep track of all the
innovations and, using some of the new technologies, come up with breakthrough
business solutions. Such initiatives can create high business value but require a
good amount of investment of time and money in research and exploration. They also
require a lot of trial and error, because of which the success rate is always low.
Hence, driving these initiatives in each region/country can prove to be expensive
and, therefore, not viable. Another example of losing out on economies of scale in
a decentralized model is each region/country establishing and maintaining its own
processes, standards, and governance. Losing out on all such economies of scale
often makes the local data and analytics organization expensive, inefficient, and
ineffective.

4.2.2 Centralized Organization

As against a decentralized data and analytics organization that is spread across
the globe, a centralized organization is generally located at the headquarters. I
have, however, seen a few enterprises with a centralized data and analytics
organization based out of one of the regions instead of the headquarters.
This generally happens when the regional team has very strong data and analytics
capabilities. In one enterprise, I saw that this strong capability resulted from
the acquisition of a company in the region that brought with it a strong data and
analytics team. Hence, the headquarters decided to build further upon that
capability instead of building the organization at the headquarters. However, in
all such cases (where the centralized team is based out of a regional office
instead of the headquarters), the headquarters retains a good amount of control
over the data and analytics organization. Hence, for the purpose of discussing the
centralized model, I will refer to the central team as the one belonging to the
headquarters.
In the centralized model, all data and analytics needs of businesses anywhere
across the globe are met by this central team. The central team has access to data
from all the enterprise systems and consolidates it into one or more data
warehouses or data lakes, located in one or more geographical locations.
In theory, a centralized data and analytics organization model is the most
efficient, since it has the best economies of scale. However, in practice, it often
results in dissatisfied data and analytics consumers in different regions/countries.
Only the headquarters would generally be happy with this team, as the team focuses
more on the needs of the headquarters. The central team is often accused of being
very bureaucratic in its working style and is perceived to be far removed from
ground reality. This is true in many cases, since the central team does not always
understand the nuances of local business entities. Further, it tends to become so
process-oriented that it loses agility in catering to business needs, especially
those of the regions. When this happens, business functions within regions start
creating their own local data marts, so that dependency on the central team is
reduced to the minimum. They hire local IT talent to develop and maintain these
data marts. Over time, such data silos mushroom across the enterprise to such an
enormous extent that the core objective of the centralized model (high economies
of scale) is defeated. In my experience, I have rarely seen a completely
centralized data and analytics organization succeed, especially in large and
complex global enterprises.

4.2.3 Federated Organization

A federated data and analytics organization has a core central team and multiple
regional teams aligned to it. This organization model is the most practical one
for any large and complex global enterprise. It draws upon the advantages of both
the centralized and decentralized models, while minimizing their disadvantages.
The federated organization model centralizes many activities such as core
technology purchases, research, core processes, and the laying down of standards.
It shares best practices and learnings with all local teams and ensures a culture
of collaboration. The central team takes care of all data and analytics needs of
the headquarters, while it takes care of limited needs of the local entities. Many
needs are common to central as well as local business stakeholders, for example,
information on daily sales, orders, order backlogs, and so on. Local stakeholders
need a local view of such information every day, while stakeholders at the
headquarters need a global view, aggregated from each region. Hence, for such
needs, there can be a common solution for the local as well as the central team.
This reduces the cost of developing solutions, ensures consistency in data
quality, and helps standardize the way data is visualized and analyzed across the
enterprise.
However, there are many needs that are unique to the headquarters as well as to
local businesses. For example, certain corporate functions such as treasury have
unique data and analytics needs. Similarly, local businesses have unique needs
such as detailed sales performance of each sales territory or information sharing
with government bodies to comply with a local regulatory guideline. Local data
and analytics teams should take care of such local needs, since they can develop
better solutions by virtue of being closer to the local stakeholders. While doing
so, they should comply with the overall standards, guidelines, and best practices
provided by the central team. In my experience, the distribution of responsibility
for solution development (between central and local teams) varies across
enterprises, based on various business dynamics and the organization structure.

While the federated model has many advantages, it also has its own set of
challenges, related mostly to people aspects. The federated model often gives rise
to a power tussle between the central and local teams, with the central team
wanting to deliver more projects centrally so that it can get a greater budget as
well as more recognition. Similarly, local teams like to cater to all local needs
themselves. Another common challenge is that the local teams get sandwiched
between the demands of local business leaders and those of the central team (since
local teams have to support some central projects as well and share relevant
information for them). The central team wants the local teams to give higher
priority to the information needs of the headquarters, while the local business
leaders want the local teams to focus primarily on local needs.
Despite all the challenges, the federated model always scores higher than the
other two models for large and complex global enterprises. Of course, one would
need to manage the human aspects tactfully, and that is why the role of the global
data and analytics leader is very important. In the last chapter of this book,
I will discuss the soft skills that a data and analytics leader must possess.
In addition to having a strong data and analytics leader, there are other measures
required to minimize the disadvantages of a federated model. I will discuss these
in the next section of this chapter.
Finally, I want to highlight a practical aspect of the federated model. It is not
a single model but comes in many different forms, based on the degree of
federation. On one hand, a federated model can be skewed more towards the
headquarters in terms of scope and responsibilities; on the other hand, it can be
skewed more towards the regions (with possibly a different skew for each region).
The level of skew is dictated by the enterprise organization structure, business
model, political dynamics, and maturity of each local data and analytics team.
Accordingly, the local teams would be leaner or heavier.

4.3 Defining Data and Analytics Organization and Processes

Once the data and analytics organization model (decentralized, centralized, or
federated) is decided for an enterprise, the next step is to define the data and
analytics organization structure and processes, along with the required
governance. I have not seen many global enterprises that have defined an
organization structure and processes conducive to managing the inherent
complexities and ambiguities of a data and analytics program.
One of the major reasons for the failure of a data and analytics program is the
loss of business stakeholders' trust in the data and analytics organization.
During my interactions with various business leaders, they shared several concerns
in this regard. I quote a couple of them below.

Our data and analytics team in IT department does not understand what we need. Their
understanding of our business imperatives is poor.

To meet any of our requirements, the data and analytics team comes up with a six month or
one-year project plan. We do not have that much time and patience.

The above concerns are often caused by the absence of a proper organization
structure and processes. This leads business stakeholders to develop their own
siloed data marts, which ultimately results in the mushrooming of data silos
across the enterprise. I discussed this while elaborating on the centralized model
in the previous section. While these data silos give greater autonomy and agility
to the business stakeholders, they are not optimal solutions. Business functions
neither have the required skills for managing data and analytics complexities nor
can they benefit from the economies of scale that a specialized and shared team
can bring. This results in high cost and low value from the investment in data
silos/marts.
The above discussion raises a few pertinent questions. How should one define a
data and analytics organization structure and processes that are agile and
business friendly? How does one gain greater proximity to the business and
understand its imperatives? How does one maintain the balance between strong data
governance/security and high flexibility (to let business stakeholders play around
with multiple datasets, within and outside their business functions, and
accordingly test various hypotheses that they may have)?
Let me answer these questions by sharing a few best practices that always work.
Let me start with the recommendation that the data and analytics organization
should always be logically structured into five towers, each with a clearly laid
down objective, as illustrated in Fig. 4.2.
Please note that the five towers are logical towers. In practice, two or three
towers can be merged into one with a common head, and/or people belonging to one
tower can be shared with other tower(s). However, even in such cases, the
objective, processes, and responsibilities of each logical tower remain the same.
Later in this chapter, while describing the service delivery tower, I highlight an
increasing trend towards merging the solution delivery and service delivery towers
into one, with a common DevOps team.
Each of the five towers should own certain key processes, which I will describe
soon. Further, each tower needs a leader who is accountable for it. The tower
leader should have a team sized to the workload of that tower. All five tower
leaders should report to the data and analytics head.

Fig. 4.2 Five towers of an ideal data and analytics organization



In the previous section, I talked about three organization models for data and
analytics. Each of the three models has a different way in which data and
analytics teams are spread across the globe and different governance around them.
So, one might wonder how the five towers of the data and analytics organization
would look for each of the three models. Let me address this below.

1. Decentralized organization model: In this model, since all the teams in differ-
ent geographies work independently, each team can structure its organization
independently, as per the five-towers approach described above. Of course, the
level of maturity of some of the smaller geographies (i.e., geographies that are
small in terms of the enterprise's business volume) would be lower, as they
may not have the necessary budget and economies of scale. In such cases, many
of the team members may have to wear multiple hats, within the tower or across
towers.
2. Centralized organization model: In the centralized model, since there is only
one team, it can structure itself as per the five-towers approach described
above. However, a central team does not mean that all team members must
physically be present in one geographical location. In fact, even in the
centralized model, I always recommend spreading out the team, especially the
business tower team, across different geographies. The control of the team,
though, remains centralized. I shall discuss this aspect in more detail later
in this chapter. A centralized organization model enjoys economies of scale and
a higher budget, so it should strive for a higher level of maturity.
3. Federated organization model: Of the three models, structuring the data and
analytics organization in this model is the trickiest. While I recommend the
same five-logical-towers approach for this model as well, how you constitute
the five towers and how you establish governance in the federated model are
questions that need due consideration. Let me elaborate. The central team in
this model needs to have all five logical towers. However, the regional teams
can have different variants: some can have all five towers, with a greater
degree of independence, while others can have only two or three towers,
depending on the central team to take care of the responsibilities of the
remaining towers. In the previous section of this chapter, I discussed how this
model can have different skews based on the nature and needs of the enterprise.
The organization structure also needs to be designed accordingly. A federated
organization model, if managed and governed well, can have the highest level of
overall maturity.
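To make the federated set-up above concrete, the division of tower ownership between central and regional teams can be sketched as a small data model. This is an illustrative sketch only: the region names, tower subsets, and function names are my assumptions, not prescriptions from the text.

```python
# Hypothetical model of tower ownership under a federated organization,
# as described above: the central team has all five towers, while each
# regional team may own only a subset and rely on the central team for
# the rest. Region names and subsets below are illustrative assumptions.

FIVE_TOWERS = {"governance", "business", "technology_and_architecture",
               "solution_delivery", "service_delivery"}

regional_towers = {
    "central": FIVE_TOWERS,
    "emea": FIVE_TOWERS,                        # high-maturity region: all towers
    "apac": {"business", "solution_delivery"},  # leaner region: two towers only
}

def responsible_team(region: str, tower: str) -> str:
    """A region handles the towers it owns; the central team covers the rest."""
    owned = regional_towers.get(region, set())
    return region if tower in owned else "central"

print(responsible_team("apac", "business"))    # apac handles its own business tower
print(responsible_team("apac", "governance"))  # governance falls back to central
```

The "skew" discussed earlier corresponds simply to how large each region's tower subset is relative to the central team's.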

Irrespective of the organization model, there are certain key roles and processes
that each of the five towers must have. This is illustrated in Fig. 4.3.
Let me now discuss each of the five towers in detail, covering their key
processes, roles, and responsibilities.

Fig. 4.3 Ideal data and analytics organization structure and processes


4.3.1 Governance Tower

This tower is responsible for formulating the charter, policies, and processes for
the enterprise's data and analytics program. It is the custodian of all processes
and ensures compliance with them. Let me elaborate on some of the key
responsibilities of this tower.

• Strategy and roadmap: This tower anchors both the definition of the data and
analytics strategy and the subsequent implementation of projects (as per the
defined roadmap). Even though the actual work may be done by internal/external
expert groups that are not part of this tower, ensuring that strategy definition
and implementation proceed as per plan is this tower's responsibility. This
tower should also periodically review both the strategy and the roadmap for
their relevance. Such an exercise should be done jointly with the business tower
and the technology and architecture tower. Even after the initial roadmap is
laid down, new requests will keep coming to the data and analytics team (as
elaborated in the next paragraph under "demand management"). Accordingly,
required changes in the strategy and/or roadmap need to be made.
• Demand management: The data and analytics team in any enterprise will keep
getting a plethora of requests/demands (small, medium, or big) from various
functions within the enterprise. These requests can vary from a need for an
operational report on one end to something transformational on the other. In
between these two ends of the spectrum, there is a wide variety of requests,
communicated through various channels: emails, phone calls, in-person
discussions, and indirect communication through someone else. To review and
manage all such diverse requests, the governance tower should lay down a
structured demand management process. Such a process ensures that all requests
are reviewed as per laid-down norms and are accordingly prioritized and planned
for implementation with a suitable approach.

The demand management process should comprise the following steps.

1. Understand the varied requests coming from across the enterprise through
various channels.
2. Evaluate each request in detail and take a GO/NO-GO decision. Let me explain
this step further. Any enterprise data and analytics charter should lay down
criteria for deciding which types of request should be catered to by the data
and analytics team and which should be catered to by some other team. For
example, if a request is for an operational report that is very tactical and
best built within a transactional application, the data and analytics team
should direct the request to the relevant transactional application team.
3. Evaluate the business impact of a request. Requests that have the potential to
create a bigger business impact or have higher business criticality should be
given higher priority.
4. Evaluate the complexity of developing a solution to fulfill a request. This
evaluation should help in estimating the cost and time required to implement
the solution. A formal business case, taking into consideration this and the
previous step, should be prepared for any mid- to large-size request.
5. Prioritize the request based on steps 3 and 4 above and, based on the budget
and resources available, update the data and analytics roadmap with project(s)
to implement the solution.
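The triage logic in these steps can be sketched as a small routine. The scoring scale, thresholds, and request names below are illustrative assumptions of mine; an actual charter would define its own criteria.

```python
from dataclasses import dataclass

# Hypothetical sketch of the demand management steps described above.
# Scales (1-5), weights, and thresholds are illustrative assumptions.

@dataclass
class Request:
    title: str
    is_tactical_operational_report: bool  # step 2: best built in a transactional app
    business_impact: int                  # step 3: 1 (low) to 5 (high)
    complexity: int                       # step 4: 1 (simple) to 5 (complex)

def triage(request: Request) -> str:
    """Return a routing decision for one request (steps 2-5)."""
    # Step 2: GO/NO-GO - redirect tactical operational reports.
    if request.is_tactical_operational_report:
        return "redirect to transactional application team"
    # Steps 3-5: favor high impact and low complexity when prioritizing.
    score = request.business_impact * 2 - request.complexity
    if score >= 6:
        return "high priority: add to roadmap"
    elif score >= 3:
        return "medium priority: prepare business case"
    return "low priority: backlog"

for r in [Request("Daily sales dashboard", False, 5, 2),
          Request("Shift-wise machine log", True, 2, 1),
          Request("Churn prediction model", False, 4, 4)]:
    print(r.title, "->", triage(r))
```

The point of the sketch is only that impact and complexity are evaluated separately and then combined for prioritization, exactly as steps 3-5 prescribe.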

I normally recommend a weekly demand review meeting, with representatives from
the governance tower, business tower, and technology and architecture tower within
the data and analytics organization. In these meetings, it is also advisable to
invite the person who put forth the request, especially for mid- to large-size
requests, so that details and clarifications can be sought. In-person (physical or
virtual) discussions always help in understanding the request better. They also
help in developing a cordial working relationship with the requestor, which will
be useful later. Based on the demand review, a decision can be taken on how to
process the request further.

• Program management: The governance tower is responsible for data and analytics
program planning and management. Planning of resources (people, hardware, and
software) as per the roadmap should be done in discussion with the other tower
heads. This tower needs to manage budgets and keep a tab on overall performance
by conducting periodic reviews. It needs to review risks in various projects
and initiate the necessary mitigation measures.
• Change management: The governance tower should also ensure that the required
organizational change is managed as per the defined strategy. I will discuss
organizational change management (the fourth element of data and analytics
strategy) in detail in Chap. 5. While all towers of the data and analytics
organization have an important role to play in this, the governance tower keeps
a tab on overall organizational change management.
• Continuous improvement: As mentioned earlier, the governance tower should
ensure regular review of the data and analytics program. To do this, it should
define key performance indicators (KPIs) for evaluating the program's
performance. It should set quarterly and yearly goals for the KPIs and measure
achievements against them. The objective of this exercise is to ensure that the
data and analytics program continuously improves in maturity and delivers
increasingly more business value each year. I will discuss this aspect in
detail in Chap. 6, which is on the value measurement framework, the fifth
element of data and analytics strategy.
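The continuous-improvement cycle of setting KPI goals and measuring achievements against them can be sketched as a minimal review routine. The KPI names, targets, and figures below are illustrative assumptions, not KPIs prescribed by the book.

```python
# Minimal sketch of the quarterly KPI review described above: define
# goals, record achievements, and flag KPIs that fell short. All KPI
# names and numbers are illustrative assumptions.

kpi_goals = {
    "user_adoption_pct": 70.0,
    "avg_request_turnaround_days": 15.0,  # lower is better for this KPI
    "data_quality_score_pct": 95.0,
}
LOWER_IS_BETTER = {"avg_request_turnaround_days"}

def review(achieved):
    """Return the list of KPIs that missed their goal this quarter."""
    missed = []
    for kpi, goal in kpi_goals.items():
        value = achieved[kpi]
        met = value <= goal if kpi in LOWER_IS_BETTER else value >= goal
        if not met:
            missed.append(kpi)
    return missed

q1_results = {
    "user_adoption_pct": 62.0,
    "avg_request_turnaround_days": 12.0,
    "data_quality_score_pct": 96.5,
}
print("KPIs missing goal:", review(q1_results))
```

A real programme would track such results quarter over quarter to demonstrate the rising maturity and business value the text calls for.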

The governance tower is generally a small team reporting to a governance head. The
governance tower head should be someone who possesses vast experience in large and
complex data and analytics programs. The person should be very structured in
approach and have a process-oriented mindset. She/He should also have good
knowledge of and experience in quality management systems, such as ISO 9000.
Finally, the person should have a knack for organizational change management.
If the overall data and analytics organization is not big, the data and analytics
head can sometimes take up the additional responsibility of governance head. This
is normally the case during the initial stages of the data and analytics
organization's set-up, when the team is small and there are only a limited number
of project requests to cater to. However, in most large enterprises, it is better
to have a full-time person in the role of governance head.

4.3.2 Business Tower

This is one tower that I often find missing in the data and analytics
organizations of many enterprises. Some data and analytics leaders believe that
since business knowledge resides within the business functions, it is not
necessary to have a dedicated group of business experts within the data and
analytics team. They believe that the technical team can talk to the business
stakeholders to understand their business requirements. This is a big mistake;
the absence of a focused business tower within the data and analytics organization
is often one of the main reasons why many data and analytics programs are not
successful.
This tower comprises people who not only have good knowledge of the business but
also a very good understanding of data and analytics. Therefore, one of their key
responsibilities is co-innovating with their business partners. They play an
advisory role to their business partners and co-innovate with them to address
complex business problems. Let me quote an example below to explain this point.

The CEO of a multinational conglomerate felt that their sales were not growing to full
potential, primarily because the enterprise had more than two hundred different sales incen-
tive processes across the globe. Having so many different processes led the sales team to
believe that some sales representatives were rewarded better than others, even when their
sales performance was similar. This was creating dissatisfaction within the team, thereby
demotivating them from putting in the extra effort to drive further sales growth. The CEO,
on becoming aware of this concern, tasked his global sales leader with addressing the problem.

The sales leader arranged a brainstorming workshop, to which he invited various members
of his sales team, in addition to representatives from a few other relevant functions such as
HR and IT. One of the persons invited was from the data and analytics team. This person had
extremely good knowledge of both the company's business and its existing enterprise data
and analytics platform (data lake).

During the workshop, everyone agreed that rationalizing the various sales incentive processes
into one or a very few common processes would help address the concerns of the sales team.
However, any such business process harmonization exercise across this large conglomerate
was too complex and time-intensive. It required not just a change in processes, but also
building a tool that could bring together sales and other data from hundreds of systems
across the enterprise. This would be a time-consuming and painstakingly complex task that
could span several years.

The person from the data and analytics organization, who was part of all these discussions,
provided a very practical solution. Based on his knowledge of the existing data lake, he
brought forth the fact that almost 75% of the data that would be required to drive the new
process was already available in the enterprise data lake, in an integrated data model. He
could state this with conviction since he had a very good sense of what data would be
required to drive such a business process harmonization exercise. When he shared the details
with the workshop's participants, everyone agreed that the data already available in the data
lake would provide a big head start for developing the tool/solution, if it were built on top
of the data lake. A solution based on the data lake would also address data security, one of
the key considerations for the solution, since security was already in place for the existing
data lake. Hence, the tool could access sensitive data in a secure manner, ensuring that only
people with the required authorization could access the data they were authorized to view.

The suggestion put forth by the data and analytics representative was critically reviewed in
greater detail. Ultimately, it was unanimously approved as the best solution. What was
initially assumed to be a project that would span a few years turned out to be feasible to
implement in one year. The sales leader took the proposal to the CEO, who also approved it
immediately. He then went ahead and implemented the solution, which started delivering
incremental value to the company very early.

I have seen many such examples, where someone from the data and analytics
organization, with good knowledge of data and analytics as well as of the
enterprise's business, provided innovative and quick solutions to complex business
problems. Someone who does not have a detailed understanding of the capabilities
existing within the data and analytics organization of an enterprise cannot
provide such value. This is why I mentioned earlier that the business tower is a
must-have in any enterprise data and analytics organization.
Let me now describe this tower in more detail. This tower comprises a group
of people who not only have an extremely good understanding of the business
priorities and challenges of the enterprise but also a very good understanding
of the following.

1. What does it take to define and deliver a data and analytics solution?
2. What data and analytics capabilities are available within data lakes/data
warehouses available within the enterprise?
3. What are the limitations of data available within the enterprise systems? What
are the specific data quality and other issues?
4. What could be the level of reuse of an existing data and analytics solution (that
was delivered to one business function) for meeting a new requirement coming
from another business function with a few similar required functionalities?
5. With changing business dynamics of the enterprise (mergers and acquisitions,
business model change, and so on), what data and analytics capabilities would
be required in future to help the enterprise achieve its business objectives?
70 4 Third Element of Strategy—Team, Processes, and Governance

Business tower should be headed by someone who has spent a good amount of time
in multiple business functions across business units and various geographies of the
enterprise. The person should have an outgoing personality—someone who enjoys
developing good working relationships with various leaders. She/He should love
solving complex business problems and must have a very good knack for data and
analytics. In my experience, amongst all five tower heads, finding a suitable
candidate for the business tower head role is the most difficult.
Other than the head, this tower should have roles such as business analysts, data
scientists, usability experts, and trainers. These people should be engaged by the
solution delivery tower during requirements and design phases of projects as well
as for user acceptance testing and training of end users. They ensure that business
requirements are well understood and translated for the technical team. They also
ensure that the solution developed by the project team is well understood and used
by the business users, who often need initial handholding and training on tools.
The difference between general business analysts (often drawn from a common pool
within IT or from a business function of an enterprise) and the business analysts
coming from the business tower of the data and analytics organization is that the
latter are highly skilled in data and analytics, with a deep understanding of the
capabilities existing within the enterprise's data and analytics landscape. Hence,
they provide much more value in the development of data and analytics solutions.
They also think from a long-term perspective during the solution development
cycle. This means that they would provision for certain additional data and
functionality that the business stakeholders might not have asked for in the
current project but may potentially need in the future.
Having discussed the key roles and responsibilities of the business tower, let me
address two questions that I am frequently asked while defining the business
tower of a data and analytics organization.

1. What specific business skills (such as supply chain, finance, or others) should
the team possess?
2. For the business tower, since proximity to the business partners is important,
where should the team be located geographically?

Let me take the first question. While it is important that the business tower
team has good overall business skills, among themselves the members must have a
good mix of complementary expertise in certain specific business areas/functions. For
example, if there is someone who has deep knowledge in finance and accounting,
there should be someone else with deep knowledge in supply chain (if it is relevant
for the enterprise). If there are five members in the business tower, all five should
have, as far as possible, core expertise in different business areas. This mix would
help bring in different perspectives during brainstorming sessions. It would also
ensure that the members collaborate with each other and benefit from each other’s
expertise.
4.3 Defining Data and Analytics Organization and Processes 71

As regards the second question, it is important that most of the team is spread
across the globe, with one or more members in each region/country. They should
co-locate with the business partners within various business functions. They should
be perceived by their business partners as people who are part of their own team,
and not as someone coming from the headquarters with limited understanding of the
ground reality. The business tower team needs to get a firsthand feel of a
day-in-the-life of their business partners, so that they understand their love
points and pain points.
They need to discuss and brainstorm with their business partners on possible data
and analytics solutions that can help address the pain points faced regularly. Such
an approach would not only help in coming up with the best possible solutions but
would also create an atmosphere of mutual trust. With this approach, one can
develop solutions in an agile manner by trying out quick proofs of concept together
and then taking up a select few ideas for full-blown solution development. This
approach is aligned with design thinking process, which talks about empathizing
with customers and exploring quick solutions to help solve their problems.
IT practitioners are taught about agile methodologies such as scrum, kanban,
lean development, extreme programming, and so on. In the data and analytics
context, being agile is much more than just using a methodology. It is about an
organization structure, mindset, and approach that can solve a business problem
quickly. Agile is not about first developing an end-to-end solution (i.e., the
whole nine yards of data engineering, data management, data quality, data security,
and so on) on a highly secure data and analytics platform. That can wait.
Agile, in the data and analytics context, is about delivering insights to the
business first and hardening the architecture later. Let me explain this further. To solve a
complex business problem, one needs to come up with various hypotheses and
test them to shortlist the ones that would work. This requires lots of exploration,
brainstorming, and quick prototypes. Once an optimum solution is ascertained,
one can go ahead and plan for hardening of the end-to-end architecture, which would
take some time to implement. Meanwhile, business partners can continue using the
interim solution developed, so that they do not face any delays. This is what true
agile execution means, and the business tower plays the most critical role in realizing it.

4.3.3 Technology and Architecture Tower

This tower has the responsibility of defining and managing the overall data and
analytics architecture. It also has the responsibility of conceptualizing, designing,
and governing data and analytics solutions. If one were to check for the existence
of such a tower across various large enterprises, one would broadly notice three
scenarios. These are listed below.

1. This tower does not exist in data and analytics organization. Architects engaged
in executing various projects own the architecture and design of respective solu-
tions, leveraging technologies that are available within the enterprise (owned
either by the IT or by any business function).
2. The tower exists but is too thin. There may be just one or two architects
in the tower. However, they are busy most of the time in either project
design/development activities or in project management. In the limited amount
of extra time that they get, they try to create a few standards and templates that
can be used across different projects. The tower may not have a head. In cases
where a head exists, the role may be only part-time.
3. A well-defined tower exists, with a leader at the helm. The leader has a few
senior architects who spend a reasonable amount of time in activities beyond
just project execution. I will discuss the various activities/focus areas of this
tower shortly in this section.

This tower, even if it exists within an enterprise (scenarios 2 and 3 above), often
becomes too tactical. One would find varying levels of maturity of this tower across
different enterprises (even in scenario 3). The towers existing in scenario 2 (and
even the low-maturity cases of scenario 3) limit the value that data and analytics
technologies can deliver to the enterprise. One may wonder—how does this tower
add value to the enterprise? What should be the responsibilities of this tower?
What are the key activities that the architects belonging to this tower should
perform on a day-to-day basis? Let me answer these questions in the following
paragraphs, under the major focus areas of this tower.

• Data governance: Data governance at an enterprise level is a very vast topic.
I do not intend to deep dive into it in this book. Instead, I just want to
highlight the role that the data and analytics organization (especially the
technology and architecture tower, with some support from the business tower)
needs to play in enterprise data governance. But first, let me quote an example
of a business challenge that resulted from lack of data governance in a
multinational company.

I was advising this company on their enterprise data governance strategy. The company
is in the service sector. It was facing multiple challenges in maintaining information
related to their customers. There were no standards, guidelines, procedures, or other
directives available for managing customer information in a consistent way. Customer
master data was created, stored, and maintained in multiple applications. Also, the def-
inition of “customer” varied across different business units. The company believed that
this was one of the main reasons why the quality of customer service was below their
targeted levels, leading to adverse impact on top-line and bottom-line performance.
Through an enterprise data governance initiative, starting with customer master data, the
company wanted to solve this business challenge.

While the example quoted above relates to a specific business challenge resulting
from lack of data governance, I have seen many other business challenges in
various enterprises that are unable to govern their key data entities well.
The objective of data governance is to formally manage an enterprise's key data
entities through well-defined policies, processes, and organization structure.
The data governance organization should have all the required roles, such as data
stewards and data owners, whose responsibilities and accountabilities should be
clearly defined. At the helm of data governance organization should be a data
governance council, comprising senior leaders of the enterprise. Without exec-
utive focus and sponsorship, data governance can never be successful. I have
seen in many enterprises that the CIO is the primary owner of data governance.
This never works. IT can only play a supporting role in data governance. The
primary owners of data are the business stakeholders. Hence, the data governance
council should always be headed by a business leader, preferably one of the
CXOs (other than CIO).
The data and analytics organization plays an important role (even though in a
supporting capacity) in data governance. It must define processes and policies as
well as use relevant tools to manage certain key aspects of data (as illustrated
in Fig. 4.4), as per the directives of the data governance council. Overall, the
primary responsibility of the data and analytics organization is to make data
useful and secure for consumption for analytical purposes. The data and analytics
organization should also send regular feedback and suggestions to the data
governance council, to help the continual improvement of business processes and
the data maturity of the enterprise. In Fig. 4.4, I have summarized the key data
governance responsibilities of the data and analytics organization, under four
categories. As mentioned earlier, the technology and architecture tower (with
some support from the business tower) should be responsible for all four of these
areas of data governance within the data and analytics organization.
• Enterprise information fabric (or enterprise data model): In Chap. 2, I
discussed in detail the enterprise information fabric and its importance for
driving digital initiatives in an enterprise. As I mentioned earlier, data is at
the core of all digital transformation for enterprises. A disciplined approach
towards creating a blueprint of data goes a long way in achieving the digital
ambition of an enterprise. This tower is responsible for creating and managing an
enterprise information fabric (or enterprise data model), with the support of the
business tower.

Fig. 4.4 Data governance responsibilities of data and analytics organization
For those of you who do not have much understanding of a data model, let me
give a brief overview. A data model is the business blueprint of data. Just as a
process model is a diagrammatic representation of business process steps, a data
model is a diagrammatic representation of business data. A very simple
illustration of a conceptual data model, containing a few data entities (without
attributes) and their relationships, is shown in Fig. 4.5.
Let me explain this a little further. A data model shows various data entities,
the data attributes within each entity, and the relationships between different
entities, in an easy-to-understand, business process-oriented manner. A data
entity is a data category such as purchase order or sales order, while data
attributes are specific attributes of that data entity, such as purchase order
number, supplier name, item number, and quantity ordered. Relationships show how
different data entities are related to each other. For example, a one-to-many
relationship between two data entities means that for each data record of the
first entity, there can be multiple data records in the second entity. In Fig. 4.5,
“sales region” and “sales territory” have a one-to-many relationship, implying
that each sales region of an enterprise (e.g., western India) has multiple sales
territories (e.g., Mumbai, Pune, etc.) within it.
Data entities cut across various business functions, geographies, and business
processes. A conceptual data model, such as the one shown in Fig. 4.5, does
not contain data attributes, so that it can be easily comprehended by anyone.
The conceptual data model is the starting point for creating an enterprise
information fabric. Once the conceptual data model is created (for one business
function, one business process, or the entire enterprise), the next step is to
create a logical data model, which contains all the data attributes within each
data entity. Based on the business case, one can decide whether to create a big-bang enterprise logical
data model or to create it for one business area and evolve it later for the entire
enterprise.

Fig. 4.5 A very simple conceptual data model with relationships

I normally recommend adopting the latter approach, i.e., an incremental one, so
that business value can be delivered quickly. Some enterprises,
that go for the big-bang approach end up spending a year or more to create
an enterprise data model. By that time, either the business stakeholders lose
patience or the business of the enterprise itself goes through some change, e.g.,
a merger or acquisition, that requires a revision of the data model. An
incremental approach, with the required flexibility in the data model to
incorporate business changes, is therefore better.
Both conceptual and logical data models exist on a “piece of paper”, i.e., they
are not physically deployed in any database. When one develops a data and
analytics solution, the logical data model needs to be translated into a physical
data model in a database. There is a common misconception that a data model is very
technical in nature. At a conceptual or logical level, a data model has nothing to
do with technology. I have seen quite a few good conceptual/logical data models
being presented to business leaders. The leaders not only quickly understand
what is being talked about, but also give valuable feedback on improving the
model, since they understand their business and data much better than anyone
in IT. It is only when one gets into deploying a logical model into a physical
one that technology comes into play.
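To make the logical-to-physical translation concrete, here is a minimal,
hypothetical sketch (the table and column names are illustrative, not from the
book) using Python's built-in sqlite3 module. It shows how the one-to-many
relationship between “sales region” and “sales territory” from Fig. 4.5 becomes a
foreign key once the model is deployed in a database.

```python
import sqlite3

# Deploy a tiny physical data model: one region has many territories,
# so sales_territory carries a foreign key to sales_region.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")

conn.execute("""
    CREATE TABLE sales_region (
        region_id   INTEGER PRIMARY KEY,
        region_name TEXT NOT NULL        -- e.g., 'Western India'
    )""")
conn.execute("""
    CREATE TABLE sales_territory (
        territory_id   INTEGER PRIMARY KEY,
        territory_name TEXT NOT NULL,    -- e.g., 'Mumbai', 'Pune'
        region_id      INTEGER NOT NULL REFERENCES sales_region(region_id)
    )""")

# One region record, many territory records: the 1:N relationship
# of the conceptual/logical model realized as rows.
conn.execute("INSERT INTO sales_region VALUES (1, 'Western India')")
conn.executemany(
    "INSERT INTO sales_territory VALUES (?, ?, ?)",
    [(1, "Mumbai", 1), (2, "Pune", 1)],
)

# Traverse the relationship from region to its territories.
rows = conn.execute("""
    SELECT r.region_name, t.territory_name
    FROM sales_region r JOIN sales_territory t USING (region_id)
    ORDER BY t.territory_id
""").fetchall()
print(rows)  # [('Western India', 'Mumbai'), ('Western India', 'Pune')]
```

Only at this physical stage do technology choices (here, SQLite) enter the
picture; the conceptual and logical models above it remain technology-neutral.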
The technology and architecture tower needs to be the creator and custodian of
the enterprise data model. It needs to work very closely with the business tower
to create as well as regularly update the model, since a lot of business knowledge
is required. The model forms the data backbone for analytics and other digital
initiatives. One can promote reuse of enterprise data across various projects if
an enterprise data model is in place and maintained well. It not only helps in
executing projects in an agile manner, but also helps in ensuring better quality
of data.
• Research and innovation: Today, innovation is happening in data and analytics
technologies (both in data engineering and in artificial intelligence/machine
learning) at an exponential rate. New technologies, from start-ups as well as
from established players such as Microsoft, Amazon, and Google, keep emerging
and making a mark periodically. It is important that the technology and
architecture tower keeps track of all such developments and tries out various
proofs of technology to explore how some of the new technologies can solve
specific business problems of the enterprise. Exploiting the latest technologies
is extremely critical for ensuring that the enterprise stays ahead of the
competition. For this endeavor, the technology and architecture tower should work
closely with the business tower, since the latter has a better understanding of
the on-the-ground challenges that business stakeholders are facing.
Let me share below an interesting example to show how keeping abreast of
latest technologies can help an enterprise gain competitive advantage.

I was talking to a technology leader of a company that manufactures large earth
moving equipment. Their human-operated equipment was used at various sites, including
mines where environmental conditions are very hot and humid. Such conditions made
the equipment operators tire very quickly. The company did not want any accidents
to happen because of human error resulting from operator fatigue. Although the shift
durations of operators were kept short to prevent mishaps, the company believed that
this was not good enough. They wanted to explore whether they could use technology to
manage operator fatigue risk better.

Towards the above goal, the technology leader of the company started working with a
start-up that had a proprietary video analytics technology. This technology could be used
to develop a solution to raise an alert about operator fatigue, based on facial expressions
and body movements of the operator. Sometimes, if an operator was not in the best of
health, fatigue could set in even within a couple of hours of work. In such cases, it was
best to give the person rest and get a replacement. A solution like this, when installed
in an earth moving equipment, would be invaluable to the company from human safety
and productivity perspective. It would give the company a competitive advantage.

• Architecture review and governance: This tower is responsible for laying
down technology and architecture standards and ensuring that all projects
comply with them. The standards and guidelines should be flexible enough to
cover all scenarios of project delivery. Even in projects where shortcuts must
be taken to deliver to business quickly, a phase of subsequent hardening of the
architecture should be ensured by this tower. This tower also ensures that best
practices and learnings from earlier projects are leveraged, in addition to
reusing data or other assets already created as part of earlier projects. To do
all this, the technology and architecture tower must work closely with the
governance tower in setting up an “architecture review board”, which is
responsible for reviewing and approving the architecture and design of any
project that is taken up for implementation.
• Other responsibilities: In addition to the responsibilities mentioned above, this
tower must also take care of activities such as infrastructure planning, product
vendor collaboration, and others, as needed, from a technology perspective.
This tower also supports all the other four towers—(a) governance tower for
demand management and architecture review governance, (b) business tower
for quick prototypes and co-innovation, (c) solution delivery tower for review-
ing and guiding on technical/architectural aspects of solution delivery, and (d)
service delivery tower for guiding on technical aspects of service delivery. Some
of these have been discussed earlier as well.

The technology and architecture tower must be headed by someone who has
extensive experience of delivering large and complex data and analytics solutions.
The person must have an extremely good understanding of the capabilities and
limitations of various technologies and must be able to relate them to the
business context. The tower
needs to have a core team of solution architects, data architects, and other required
technology experts. The team from this tower should be utilized for a reasonable
amount of time in projects, especially during design phase. However, they should
have a good amount of time available to take care of all the other responsibilities
that I discussed in the preceding paragraphs.

4.3.4 Solution Delivery Tower

This tower is responsible for executing projects based on the project pipeline
emerging from the demand management process. The project teams in this tower are
responsible for the entire lifecycle of a project, until the fully tested and
running applications are handed over to the next tower, viz. the service delivery
tower, for ongoing maintenance/support.
The solution delivery tower will have a varying load at different periods of time.
Sometimes, there could be many concurrently running projects, while at other
times, there could be only a few. To manage this varying workload, the team within
this tower needs to ramp up or ramp down as per need. Other than the tower head
and a select few core members, the rest of the team can come in or move out.
However, such movements carry risk: any new member coming in would not be aware
of the processes being followed by the tower. To mitigate this risk, there needs
to be a well-defined onboarding process for anyone joining, irrespective of
whether the person comes from within the organization or from an external
contracted agency. Similarly, before anyone moves out of the tower, there needs
to be a well-defined knowledge transfer process, so that any knowledge acquired
during the project is transferred back to the core team, either within the
solution delivery tower or to other towers, as deemed fit. Both onboarding and
exit processes must be documented, along with pre-defined templates and checklists.
This tower works closely with all the other four towers—(a) governance tower
for ensuring that the plan, policies, and guidelines are complied with, (b) busi-
ness tower for ensuring that requirements are well captured and understood by
the project team, (c) technology and architecture tower for ensuring that the
defined architecture principles are followed and design is reviewed and approved
by them, and (d) service delivery tower for ensuring that application knowledge is
transferred seamlessly to the team for ongoing maintenance/support.
The solution delivery tower should be headed by someone who has vast experience
in executing many data and analytics projects across multiple technology
platforms. The person should have excellent project management skills. Strong
people skills are especially important. The core team of this tower should also
comprise people with good project experience. They should have expertise in various
project execution methodologies. They also need to continuously cross-skill and
up-skill themselves in emerging technologies, so that they can execute any projects
that come up in these technologies.

4.3.5 Service Delivery Tower

This is the fifth and last tower, but by no means less important than any of the
others in the data and analytics organization. This tower needs to ensure that all
the data and analytics solutions developed to date and in use are up and running,
with data getting refreshed into them from various transactional systems at the
defined frequencies. The responsibilities of this tower also include resolving support tickets,
taking care of minor enhancements to the applications, and carrying out continuous
improvements based on analysis of issues that keep coming up regularly.
With agile methodology, DevOps is becoming popular to reduce cycle time
of solution development and deployment. In brief, DevOps is a practice of inte-
grating the processes for development (Dev) and operations/services (Ops) so as
to improve collaboration, increase efficiency, and become more effective. I do not
plan to cover DevOps or various agile methodologies (such as scrum, kanban, lean
development, and extreme programming) in this book. There are many books as
well as resources available in public domain on these topics. For our discussion
of this tower, it is important to highlight that both solution delivery and service
delivery towers need to work very closely for continuous development, deploy-
ment, and operations of data and analytics solutions. In fact, there is an increasing
trend towards merging the solution delivery and service delivery towers into one,
with common DevOps teams. In such a case, each DevOps team (or scrum/squad
team, as they are generally referred to in agile methodology) needs to take care of
both development and operations responsibilities on a day-to-day basis.
The service delivery tower should be headed by someone who has a wide range of
experience across multiple data and analytics technologies. The person should have
worked in both solution delivery and service delivery and should have a service-
oriented mindset. If the tower is merged with solution delivery tower, the common
head of both towers also needs to have these qualities.
Unlike the solution delivery tower, where the team size goes up or down as
per project workload, the service delivery tower needs to have a steady team that
can increase in size as the number of applications increases, though not
proportionately. The team should be a mix of seniors and juniors, to take care of
the complex as well as simple service requests that the tower needs to cater to.
In a DevOps model, a common team takes care of both solution delivery and service
delivery. In such a case, a scrum/squad team will generally have a good mix of
skilled associates to take care of both sets of activities on a day-to-day basis.

4.4 Week-In-The-Life of Data and Analytics Team

In the previous section, I discussed in detail what a world-class data and analytics
organization should look like and what the key roles and responsibilities of the
data and analytics team should be. Let me now summarize, with the help of Fig. 4.6,
what a typical week-in-the-life of the team would look like. The figure illustrates
multiple processes working in tandem, as various projects at different stages get
executed and all five towers work together to achieve the team objectives. Do note
that the processes/activities mentioned in the figure are not exhaustive.

For a well-oiled data and analytics program to work seamlessly, the team would
typically perform the following processes/activities in tandem.

Fig. 4.6 Typical week-in-the-life of data and analytics organization

• Filter and categorize various types of demand, coming from diverse stake-
holders, through “demand management funnel”, as per laid down demand
management process.
• Take forward the qualified demand through applicable information/architecture
governance process.
• Execute projects as per defined processes and deploy the developed solutions.
• Transfer the knowledge of deployed solutions to service delivery team for
maintenance and support.
• Conduct solution training for the business stakeholders to ensure that the
solutions/applications are understood and used without any challenges.
• While the projects are getting executed, keep developing quick prototypes
and continue solution research to come up with new ideas to drive business
performance. Co-innovate with business stakeholders on new ideas.
• Conduct regular program and project reviews to ensure that all laid down
guidelines are followed. Also ensure that change management, knowledge
management, and continuous improvement, including measurement of the business
value delivered, go on as per plan.

4.5 Summary

The structure of data and analytics organization needs to be different from the
other IT functions of an enterprise. There are inherent ambiguities in most of the
high business value data and analytics initiatives, because of which the way a data
and analytics project needs to be implemented is different from the way an ERP
or a CRM project is implemented. Hence, the need for a different organization
structure. Before defining data and analytics organization and its processes, it is
important to choose the right organization model (decentralized, centralized, or
federated) that would work for the enterprise. Each of these models has its
pros and cons, which must be weighed carefully before choosing the best-suited
one. Having chosen the optimum model, the next step is to define a data and
analytics organization structure and processes that are agile and business
friendly. A proven approach is to structure the data and analytics organization
into five logical towers—(1) governance, (2) business, (3) technology and
architecture, (4) solution delivery, and (5) service delivery. In a DevOps model,
the fourth and fifth towers (i.e., solution delivery and service delivery) can be
merged into one. All these towers need to work in close collaboration. Each of
them needs to have well-defined roles and responsibilities, as well as processes
that would help in delivering high business value to the enterprise.
5 Fourth Element of Strategy—Organizational Change Management

Driving and Managing Change Across the Enterprise to Ensure Success

5.1 Need for Change Across the Enterprise

Understanding the evolution of data and analytics over the last few decades,
especially in the 2010s, helps one appreciate why organizational change management
has become such an important element of enterprise data and analytics strategy today.

5.1.1 Till 2010—A Brief History of MIS Era

Prior to 2010, data and analytics within enterprises was perceived more as an MIS
(management information system) function, with standard reports getting delivered
at a pre-defined frequency to various managers and executives. Enterprises used
to have limited capabilities for slicing and dicing of data. Such capabilities were
available to a limited set of “power users”, typically analysts working for senior
leaders, through tools such as Cognos, BusinessObjects, and Oracle BI. Advanced
analytics was done in standalone systems, using tools such as SAS, SPSS, and
MATLAB. The owners of these systems took a data dump from IT periodically,
typically once a month, and developed algorithms for sales forecasting or other
common advanced analytics needs. A very small group of experts, with specialized
skills in statistics and data mining, worked on developing these algorithms. Not
many people in the enterprise were aware of the details of the work that these
experts were doing. These standalone systems remained “black boxes” that faced
both scalability and data quality challenges. These challenges resulted in very
limited use of advanced analytics for decision-making in enterprises.

© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2022
P. Sah, Defining Enterprise Data and Analytics Strategy, Management for Professionals,
https://doi.org/10.1007/978-981-19-5719-2_5

5.1.2 The 2010s—Data Visualization Becomes All-Pervasive Across Enterprises

The 2010s saw the meteoric rise of data visualization tools such as QlikView and
Tableau, developed by small companies. These tools created a niche for
themselves by providing business users with a very friendly and interactive interface.
Users needed very little training to use these tools. They could easily visualize data
and derive intuitive business insights. These tools also had some built-in advanced
analytics capabilities.
Due to the power and flexibility that these tools provided, the dependency of business
users on IT reduced drastically. IT needed only to provision the data
required by the business users, in addition to taking care of technical maintenance
and support of the tools. Of course, some governance issues did crop up, for which
IT needed to intervene. The advent of these “self-service” data visualization tools
transformed BI (Business Intelligence) from “MIS” into “data and analytics”.
As a result of the above, data and analytics started getting democratized across
the breadth and depth of enterprises and caught the fancy of senior executives.
Everyone, from top to bottom, wanted access to data visualization tools
so that they could monitor and manage their business better. Analytics was no
longer restricted to a select few “power users”, as was the case prior to 2010. The
ability of these tools to render dashboards on mobile devices only augmented their
demand. While all this was happening in the first half of the 2010s, the advent of
digital technologies brought a further revolution in data and analytics.

5.1.3 The Latter Half of the 2010s—Advent of Digital Technologies

The increased adoption of data and analytics seen in the early 2010s gained huge
momentum in the latter half of the decade because of the advent and maturing of
digital technologies such as big data, cloud, AI/ML, and IoT. As a result,
data and analytics not only got further democratized but also became a board-level
agenda item. It started driving enterprise business strategy.
With the variety of data increasing to include newer sources such as IoT
data and social media data, information consumers got excited thinking about what
they could do with all this data. With cheap “big-data” processing capabilities
(something that was not economically feasible earlier), information consumers
started expecting data and analytics to deliver not only descriptive analytics,
but also diagnostic, predictive, and prescriptive analytics. They expected
analytics to be embedded deeply into business processes, so that the processes
could become more intelligent and self-optimizing in nature and would require
zero to minimal human intervention.
Advanced analytics was revolutionized through the open-source and cloud-created
machine learning ecosystem, which automated a good amount of data scientist
activities. Developing traditional algorithms became so easy that even a layperson,
with basic training, could develop them.
5.1 Need for Change Across the Enterprise 83

Enterprises started aspiring to create new revenue streams through “data
monetization”. With the huge amount of customer data that enterprises possessed, many
CEOs started exploring how they could monetize the data by creating value-added
services based on it. In the earlier chapters, I quoted quite a few examples
of such aspirations of CEOs. I also talked about some CEOs trying to change
(or develop a new) business model leveraging data and analytics.
While all this hype was being created, enterprises soon realized, inter alia, the
criticality of organizational change management to realize their dreams.

5.1.4 Why Organizational Change Management

Enterprises learned (often the hard way) that, while a few pilot successes may be
easy to achieve, truly democratizing data and analytics to drive digital
transformation of the entire enterprise requires, inter alia, a very strategic
organizational change management approach.
After the advent of ERPs and other transactional systems, which were focused on
making business processes more efficient, enterprises became used to working in
a certain way. However, to drive digital transformation, enterprises are disrupting
their existing business models and therefore need to adopt new ways of working.
They need to change or reengineer existing business processes. Data and analytics
are the key enablers for any digital transformation initiative and, therefore, for the new
ways of working. Hence, a focused organizational change management strategy is
required for data and analytics. Enterprises have employees belonging to different
age brackets and coming from different cultural and educational backgrounds. Not
all employees may be technology savvy. The ability and willingness to learn new
technologies (or start using new tools) vary amongst employees. All such
factors call for careful change management planning.
Driving change in large enterprises is never an easy task. Let me quote an
interesting experience below.

I was conducting an ideation workshop with business leaders of a large global manufactur-
ing company, which is more than a hundred years old. These business leaders, representing
various business units, geographies, and functions of the company, had huddled in a large
conference room for the workshop.
During the workshop, we spent quite some time in brainstorming to arrive at various trans-
formational ideas. We then started evaluating each idea in detail. The topmost idea was
related to completely changing how the company managed manufacturing operations in
different regions. The idea involved, inter alia, moving away from local management of
operations to global management. Let me explain this point further in the following para-
graphs.
The company had multiple product lines, with a complex supply chain within and outside
the company. For example, there were cases where a part produced in the company’s plant
in, say, Germany was supplied as raw material to another of its plants in, say, France, where
the part was used to produce a semi-finished product, which was then shipped to the com-
pany’s plant in, say, the US, where the semi-finished product was converted into the final product.

To add to this complexity of supply chain, the company sub-contracted low-value adding
manufacturing operations to multiple tier-1 and tier-2 suppliers across the globe.
In the existing manufacturing operations planning approach, each plant was managing and
optimizing its own operations. This approach was efficient at a local, i.e., plant, level. How-
ever, the company was losing out on immense synergy benefits that could be derived by
planning operations at a global level, especially because there was so much interdependence
in the supply chain across business units, geographies, and plants of the company. This was a
classic case of local vs global optima.
During the workshop, we calculated the rough order of magnitude of expected benefits that
the company could get from this idea. It was estimated to be around five hundred
million dollars. Implementing this idea, however, required drastic changes in the way their
business was being run - many business processes needed to change, many existing roles
needed to go away, a few new roles needed to be created, etc.
When this idea was discussed during the workshop, many business leaders were quite
enthused by it. However, the most senior vice president in the room (a business leader who
had been with the company for almost thirty years) said, “If this was possible, we would
have already done it by now”. He went on to elaborate that the idea had been discussed many
years ago as well. However, it was found to be technologically impossible to
implement, because of which they never went ahead with it. He was right. Before the advent
of the digital technologies of the 21st century, it would have been impossible (or too complex and
expensive) to implement such an idea. The idea required real-time processing of data from
multiple systems located across the globe, and then deriving and embedding insights
from the data into business processes for global operations planning. All this was techno-
logically complex and risky to implement in the 20th century. However, with the power of
digital technologies, it was possible to implement it without major technology risks and at
a cost that could result in a high return on investment. I tried to explain this to him, but he
didn’t seem too convinced.
Other than the vice president who had raised the concern, I could sense reluctance in quite a
few other leaders as well to go ahead with the idea. And the cause of this reluctance did not
appear to be the fear of technological risk alone. Many leaders were concerned that the idea
required too many changes, for which they and many others would need to get out of their
comfort zone. They would need to put in a good amount of extra effort during the imple-
mentation phase. They felt that people were used to a certain way of working and trying to
change that drastically would be very risky. Overall, I could sense a fear of “big change”
amongst many of them.
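As an aside, the “local vs global optima” point in the story above can be made concrete with a toy sketch. Everything below is hypothetical and purely illustrative: two plants share one planning decision, and letting each plant minimize only its own cost leaves synergy benefits on the table compared with minimizing the combined, enterprise-level cost.

```python
# Toy model: plant A can ship q units of an intermediate part to plant B.
# Shipping more raises A's cost but lowers B's expediting cost even faster
# (all cost functions and numbers are made up for illustration).

def cost_a(q):
    return 100 + 2.0 * q                    # plant A's weekly cost

def cost_b(q):
    return 500 - 4.0 * q + 0.02 * q * q     # plant B's weekly cost

def total_cost(q):
    return cost_a(q) + cost_b(q)

quantities = range(0, 201)

# Local optimization: plant A minimizes only its own cost, so it ships nothing.
q_local = min(quantities, key=cost_a)

# Global optimization: minimize the combined cost across both plants.
q_global = min(quantities, key=total_cost)

print(f"local plan:  q={q_local},  enterprise cost={total_cost(q_local):.0f}")
print(f"global plan: q={q_global}, enterprise cost={total_cost(q_global):.0f}")
```

Each plant’s plan is locally rational, yet in this toy example the enterprise pays 600 under local planning versus 550 under global planning; scaled across many plants, products, and geographies, that gap is exactly the kind of synergy benefit the workshop idea targeted.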

The above is a good example of why an organizational change management strategy
is so important when you want to implement any transformational idea in an enterprise.
There will always be some people who resist change. There will
be many others who resent the speed at which change is happening. And
there will be a vast majority who are concerned about the risks and uncertainties
that change brings with it. Hence, from an enterprise standpoint,
for any digital transformation initiative to be successful, it is important to define,
inter alia, an organizational change management strategy upfront. Otherwise, no
amount of digital technology investment can guarantee success.

5.2 Driving Change—Key Focus Areas and Objectives

Organizational change management (OCM) is a very broad topic. It is taught in
all B-schools, and there are many books and white papers available on it.
However, in the context of enterprise data and analytics programs, I have not come
across any good book, white paper, article, or guide on organizational change
management. In my experience, there are four key focus areas of change and three
key objectives for change that one must consider and plan for, to drive an enterprise
data and analytics initiative at scale. This is depicted in Fig. 5.1.

5.2.1 Changing Business Environment

As discussed earlier in this chapter as well as in previous chapters, data and analytics
is a key enabler not just for adapting to a changing business environment,
but also for capitalizing on opportunities emerging from environmental changes.
The speed at which the business environment is changing today has never been seen
before. A few examples of such changes, and how data and analytics can enable
them, are listed below.

Fig. 5.1 Four key focus areas and three key objectives for driving change in data and analytics

• Mergers, acquisitions, and divestitures: The number of mergers and acquisitions
(M&A) has increased substantially in the twenty-first century, as
enterprises aspire to grow their businesses inorganically. Many enterprises are
also divesting their non-core businesses to focus on and grow their core and more
profitable businesses. In any M&A or divestiture, data and analytics plays a
crucial role during both the pre-deal phase (i.e., deal origination to deal closure)
and the post-deal phase (i.e., deal closure to deal exit). For example,
during the pre-deal phase of an acquisition, the acquiring company is interested
in understanding and analyzing, inter alia, the customer segments, products, and markets
of the company being acquired. Analytics plays an important role in this.
Post-deal, the role of data and analytics increases multifold. It helps in negotiating
with suppliers for cost-saving opportunities, identifying opportunities to
increase revenue, etc. In fact, right from day 1 of deal closure, data consolidation
from the acquired company plays a vital role for regulatory compliance as
well as for achieving the synergy benefits that formed the basis for the M&A decision.
Hence, for any M&A, data and analytics planning must be done well before
deal closure. Most enterprises (that plan for M&A or divestiture) form a focused
data transformation team, comprising experts from both IT and business, for
this purpose. Integrating transactional systems such as ERPs and CRMs may
take quite long (often many years) post deal closure, due to inherent systems
complexities. However, by integrating data from the two companies, one can
start understanding and analyzing the business of both from day 1. In my experience,
unless the data from both companies is integrated with good data quality
within the first hundred days (post deal closure), the benefits from M&A cannot
be realized to their full potential. Shareholders would start getting impatient,
since they would expect the company to start meeting its committed M&A
synergy targets in the first hundred days.
• Business model changes: In this book, I have been discussing enterprises
that are constantly trying to evolve their business models to deliver more value
to their customers, thereby driving growth. I talked about enterprises moving
from a B2B to a B2B2C model. Any business model change requires renewed
focus on data and analytics. Data and analytics play both a direct and an indirect role
in driving growth through business model changes. I will discuss this in the
next chapter, i.e., Chap. 6.
• Organizational restructuring: Enterprises restructure their organization for
various reasons—M&A, business model change, increasing operational efficiency,
increasing customer focus, etc. Such restructuring involves changing not just
the organization structure but also business processes. An MNC that
my data and analytics team was working with recently restructured its
organization to give more independence to each of its global business units. In
doing so, it reduced the staff at its corporate office drastically. The objective was
twofold—(a) reduce the cost of the corporate office and (b) empower business units to
focus more on their key products and customers, with minimal corporate interference.
Empowering the business units also meant greater accountability for
their business results. However, any organizational restructuring increases “organizational
chaos”, and the transition period can be very difficult. Data and analytics
can play an important role in ensuring that all the data required for day-to-day
business operations is available and that the transition period is short. It also helps
in getting early insights on whether the intended objective of the restructuring is
being achieved and, if not, what the potential reasons might be.
• Entry into a new market and/or launch of a new product or service: These are
growth drivers that enterprises always explore. For new market entry, both
before and after entry, a lot of market research needs to be done, where data and
analytics helps. The same applies to launching any new product or service.
Even post entry into a new market or launch of a new product or service,
performance needs to be monitored continuously and closely, and corrective
measures need to be taken on time in case of underperformance. Data and
analytics play an extremely critical role in this.

While the above examples of a changing business environment are not exhaustive,
the points that I want to highlight are that (a) the business environment in enterprises
is changing at a much faster rate than ever before and (b) data and analytics
play a critical role in helping enterprises gain competitive advantage amid these
changes. However, to realize benefits from data and analytics, the four key focus areas
and three key objectives of change need to be understood and planned for. Further,
change leadership plays an important role, which I will discuss later in this
chapter. Let me first talk about the four key focus areas of change from a data and
analytics perspective.

5.2.2 Four Key Focus Areas

For data and analytics OCM strategy, there are four key focus areas: People,
Processes, Technology, and Data. While the primary focus area is always “people”,
it is important to focus on the other three areas (processes, technology, and data)
as well, so that people are enabled, empowered, and motivated to adopt changes.
While I will get into the specific elements of OCM strategy for each focus
area later, let me give an overview of each of them below.

1. People: This is the most important focus area of change, since ultimately it is
the people, primarily the employees of the enterprise, who are responsible for
institutionalizing an organizational change. Talking of people, it is important to
understand the cultural, geographical, and demographic diversity of employees
within an enterprise. Beyond diversity, it is also important to understand the
wide spectrum of information consumers (user personas) that exist in any enterprise,
and their diverse analytical needs. A change management strategy needs to
be developed for each user persona so that they embrace new technologies and
new ways of working.
2. Processes: In Chap. 4, I discussed defining key processes for the data and
analytics team. From a process change management perspective, the focus is not just
on changes to these processes, but also on the business processes
and new ways of working that result from any transformational large-scale digital
initiative in an enterprise. Since data and analytics play an important role
in such transformations, it is important to focus on key process changes (both
internal and external to the data and analytics team) and define a change
management strategy for them.
3. Technology: While new digital technologies bring a lot of promise, adopting
and implementing them across an enterprise can become a nightmare if they
are not managed well. Hence, from a technology change management perspective,
it is important to focus not just on exploring future data and analytics
technologies (that can make the enterprise ready for the future and maintain competitive
advantage), but also on planning to train people on new technologies and to
implement the technologies across the enterprise, beyond initial pilot projects.
4. Data: Today, everyone is talking about “data as the new oil” and highlighting
the importance of treating data as a strategic asset. However, not many enterprises
have a clue about how to realize the full potential of data. From a change
management perspective, it is important to change the way data is treated,
viewed, and managed in enterprises. It is important to have a data literacy plan
for all user personas and to inculcate “data thinking” amongst all employees of
the enterprise, so that data can be democratized.

While focusing on the four change areas described above, it is important that the OCM
strategy for data and analytics is always driven by three key objectives. These are
illustrated in Fig. 5.1 and described below.

5.2.3 Organizational Chaos Theory and Three Key Objectives of OCM Strategy

Chaos theory, the study of complex systems that appear to behave randomly even
though they are driven by deterministic rules, was advocated by Henri
Poincaré, a French mathematician, in the 1880s. Subsequently, over more than
a century, it has been applied in various fields such as weather prediction,
biological models, macroeconomic analysis, and stock market analysis.
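To see what “deterministic yet apparently random” means, here is a minimal illustrative sketch (my own addition, using the logistic map, a textbook example of a system driven by a simple deterministic rule):

```python
# Logistic map: x_{n+1} = r * x_n * (1 - x_n). The rule is fully
# deterministic, yet at r = 4 two almost identical starting points
# diverge completely within a few dozen iterations.

def trajectory(x0, r=4.0, steps=40):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = trajectory(0.2000)
b = trajectory(0.2001)   # starting point perturbed by just 0.0001

gaps = [abs(x - y) for x, y in zip(a, b)]
print(f"initial gap: {gaps[0]:.4f}, largest gap over 40 steps: {max(gaps):.4f}")
```

This sensitivity to initial conditions is the hallmark of chaotic systems: tiny differences in the starting state produce wildly different outcomes, which is why rigid long-range prediction fails and adaptive behavior becomes necessary.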
One pertinent application of chaos theory is the analysis of today’s large, global
enterprises. Traditionally, most of these enterprises have worked in rigid
hierarchical organization structures. Such structures were good when it was all
about strict discipline to achieve process standardization, and efficiency to achieve
predictable outcomes. However, with the advent of the digital economy and the Internet-of-
everything, organizational designs are getting disrupted. Organizational boundaries
are blurring. Competitors of an enterprise can also be its partners in certain scenarios.
Suppliers can also be customers. Business models are evolving rapidly.
It is difficult to predict where an enterprise is going to be in the next ten years.
This increased level of uncertainty makes today’s large enterprises more complex
systems than they were a few decades ago.
In such a “chaotic” scenario, one cannot expect the managers to do all the
planning, thinking, and innovating. A good case in point, from a data and analytics
perspective, is that it is no longer enough to have business analysts in the offices
of the CEO, CFO, and CMO do all the number crunching and decide on the key decisions
to be taken to improve business performance. Decisions must be delegated
to all levels of the enterprise. A culture of innovation must be nurtured amongst
all employees across the breadth and depth of the enterprise. However, this is
easier said than done. Moving traditional “command and control” enterprises
to more decentralized, innovative enterprises requires them to become learning
organizations, thereby driving innovation across all levels. Hence, from a data and
analytics OCM perspective, three key objectives are important for each of the four
focus areas (people, processes, technology, and data) discussed earlier. These are
described below.

A. Become a learning organization: Once we start viewing enterprises as natural
systems as per “chaos theory”, the need to make them “learning organizations”
becomes a logical choice. A learning organization is one where everyone is
involved in analyzing and solving problems, thereby continuously improving
the capability and competitiveness of the enterprise. Without becoming a learning
organization, where, inter alia, information and knowledge are freely shared
across the enterprise, any improvement or change will remain short-lived. Data
and analytics play a key role in helping enterprises become learning organizations.
However, to realize this objective, certain specific actions are required
in all four OCM focus areas (people, processes, technology, and data). I
will discuss these in the next section.
B. Innovate: Closely related to making an enterprise a “learning organization”
is innovation. To inculcate a culture of innovation across an enterprise, it is
important that it becomes a learning organization. For data and analytics to
drive innovation at scale in an enterprise, in addition to taking it down the path
of becoming a learning organization, specific interventions are
required in all four focus areas. I will discuss these in the next
section.
C. Institutionalize change: Once enterprises start becoming learning organizations
and start innovating by using the latest or fit-for-future technologies, it is
important to make certain additional interventions, in all four focus areas,
to ensure that changes are permanent and widespread across the enterprise. I
will discuss these interventions, from a data and analytics perspective, in
the next section.

5.2.4 Inter-relationships Between the Focus Areas and Key Objectives

Before I go to the next section and discuss in detail the various interventions
and initiatives required to drive OCM strategy, I would like to highlight that the
four focus areas for change and the three OCM objectives are not mutually exclusive.
In fact, they are inextricably linked to each other. In Fig. 5.2, I have attempted
to illustrate their inter-relationships.

Fig. 5.2 Inter-relationships between the focus areas and key objectives
One way to look at the four focus areas is to keep “people” at the center, with
the other three focus areas around it. While, for effective change management, one
needs to plan for all four focus areas in tandem, “people” is always at the core.
This is because any changes in processes, technology, and data must be reviewed
in the context of the people who are going to adopt the changes. To illustrate this
“human-centricity” aspect, I have depicted people at the center. The other three
focus areas (processes, technology, and data) are also closely linked to each other.
I will explain these linkages in the next section.
Moving to the right side of the figure, the first objective, becoming a learning
organization, should be the beginning of the journey towards change. As an
enterprise traverses this path, it creates opportunities for innovation, especially
if a conducive environment and incentives are provided. It is not possible for an
enterprise to become innovative across the breadth and depth of the organization
without embarking on the path of becoming a learning organization. Hence, the
second objective encompasses the first in the figure. Finally, the third objective,
institutionalizing change, which entails making changes permanent and widespread
across the enterprise, encompasses the other two objectives, which are necessary
prerequisites for institutionalizing change. One important point to note is that
the three objectives are not sequential, i.e., they can be pursued in parallel, with
results flowing in small increments.

5.3 Driving Change—Twelve Elements of OCM Strategy

Taking the Cartesian product of the four focus areas for change and the three OCM
objectives discussed in the previous section, a set of twelve OCM strategy
elements emerges. This is illustrated in Fig. 5.3.
Let me discuss each of these twelve elements in detail and explain how an OCM
strategy for an enterprise data and analytics initiative should be defined for each of
them. I will go sequentially by focus area, i.e., I will start with the
first focus area, people, and discuss the three elements under it—1A, 1B, and 1C.
Then I will move on to processes (2A, 2B, and 2C), technology (3A, 3B, and 3C),
and data (4A, 4B, and 4C)—in that order.
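The twelve-element grid can be enumerated mechanically; the short sketch below simply builds the labels 1A through 4C as the Cartesian product of the four focus areas and the three objectives, mirroring the layout of Fig. 5.3:

```python
# Enumerate the 4 x 3 grid of OCM strategy elements as labeled in Fig. 5.3:
# rows 1-4 are the focus areas, columns A-C are the objectives.
focus_areas = ["People", "Processes", "Technology", "Data"]
objectives = ["Become a learning organization",
              "Innovate",
              "Institutionalize change"]

elements = {
    f"{i}{letter}": (area, objective)
    for i, area in enumerate(focus_areas, start=1)
    for letter, objective in zip("ABC", objectives)
}

print(len(elements))      # 12
print(elements["1A"])     # ('People', 'Become a learning organization')
print(elements["4C"])     # ('Data', 'Institutionalize change')
```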

5.3.1 1A: User Persona Focus

People and their data and analytics needs vary widely across an enterprise. While
defining data and analytics strategy, one often ignores this aspect and treats
everyone in the same way, which is a big mistake. In fact, while doing the “samudra
manthan” (enterprise churning) exercise that I discussed in Chap. 2, one
must endeavor to map the various user personas based on their data and analytics
needs. Figure 5.4 illustrates the wide spectrum of information consumers across
an enterprise based on their needs.
In the figure, as you move from left to right (i.e., from standard reports on one
end to artificial intelligence on the other) and/or from bottom to top (i.e.,
from structured data at the bottom to unstructured data at the top), the level of
ambiguity of the need, and the consequent complexity of the required data and
analytics solution, increases. This means that the solutions in the bottom left corner
would be the least complex, while those in the top right corner would be the
most complex. In the figure, I have shown examples of typical analytical scenarios
(within rectangular boxes) to demonstrate this point.
The starting point for defining an organizational change management strategy is
to understand the different user personas and their needs and pain points across the above
spectrum. For example, someone from the finance function, who needs a standard
profit and loss report at the end of every month, with all numbers reconciling to
the last decimal place, has a very different need from someone in supply
chain who is responsible for operational risk management, taking into consideration
factors such as weather data, the possibility of an earthquake or a pandemic,
the possibility of a supplier going bankrupt and thereby choking the supply of material
or services, or any other supply chain eventuality. These two personas need
very different types of data and analytics solutions. For the first, data
quality and accuracy are most important; for the second, the quality of the algorithm
and its comprehensiveness in covering all types of supply chain
eventualities matter more.
Fig. 5.3 Twelve OCM strategy elements for data and analytics

Fig. 5.4 Information consumers with diverse data and analytics needs across an enterprise

Continuing with the same two user personas described above, if you plan to
undertake any change (of technology, for example), you need to consider the
impact of such change on each of them separately. For example, if you plan to
change a data visualization tool that the two personas have been using for years,
you must address their apprehensions about the new tool in different ways. The
finance user would be happy if you can convince the user that finance reports,
such as the profit and loss report, that were earlier generated at the end of every
month using the old tool would continue to be generated in the same manner and
format with the new tool, but with even higher data accuracy. For the
supply chain user, however, such an explanation (of delivering regular supply chain reports
in the same manner as in the past) would not suffice. You would need to work
together with the supply chain user to test out various business hypotheses using
sample data (related to supply chain disruption or other critical use cases) with
the new tool and demonstrate how the tool would help in the various use cases—
for example, how the new tool would predict supply chain disruption with higher
accuracy.
Once all categories of user personas across an enterprise are mapped and well
understood, all aspects of change (communication, training, etc.) need to be
tailored per persona. Since persona mapping is so central to the entire OCM
strategy, all the other OCM strategy elements revolve around it.
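As a sketch of what persona-driven tailoring might look like in practice (the persona records and activity names below are hypothetical, chosen to mirror the finance and supply chain examples above):

```python
# Hypothetical persona registry: each persona carries the attributes needed
# to tailor communication, training, and validation activities for a change.
from dataclasses import dataclass, field

@dataclass
class UserPersona:
    name: str
    function: str
    analytics_need: str               # "standard reports" .. "advanced analytics"
    change_activities: list = field(default_factory=list)

personas = [
    UserPersona("month-end reporter", "Finance", "standard reports"),
    UserPersona("operational risk manager", "Supply Chain", "advanced analytics"),
]

for p in personas:
    if p.analytics_need == "standard reports":
        # Report users mainly need reassurance of continuity and accuracy.
        p.change_activities += ["new-tool walkthrough", "report parity validation"]
    else:
        # Advanced-analytics users need hands-on proof on their own use cases.
        p.change_activities += ["hypothesis-testing workshop", "sample-data pilot"]

for p in personas:
    print(f"{p.name}: {p.change_activities}")
```

The point of the sketch is only the shape of the mapping: once personas are captured explicitly, every change activity (communication, training, validation) can be looked up and tailored per persona rather than applied uniformly.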

5.3.2 1B: Collaboration and Motivation

Most innovations leveraging data and analytics happen through collaboration across
different organizational entities and functions. In the first section of this chapter, I
discussed the example of a manufacturing organization with an innovative idea of
moving from local management of operations to global management. Both the
ideation and implementation of such innovations require experts
from various functions, business units, and geographies of the enterprise to collaborate,
understand process and data challenges, and come up with a solution that
is practically feasible for the enterprise.
Hence, from an OCM strategy perspective, one must consider how to create an
environment of collaboration and how to motivate people to work towards a
common goal. While I will discuss various elements that help create an
environment conducive to collaboration (such as defining the right data and
analytics organization, formalizing the innovation process, exploring
fit-for-future technologies, driving data literacy, and inculcating data
thinking) under the respective elements of OCM strategy, let me highlight here
the importance of motivation for driving collaboration.
As part of OCM strategy, one must have a well laid-down plan for rewarding
and recognizing people. While financial rewards are one means of motivation, I
have seen non-financial approaches to be equally effective. Just appreciating an
achiever or publishing her/his name on a company's internal portal can provide a
big boost to the person's morale. For a digital transformation initiative, one
must plan how to reward and recognize performers and early adopters. Such a plan
should be broad-based rather than covering only a few star performers. The plan
should also include

multiple categories of achievements, such as successful completion of a
training, certification in a new digital technology, contribution to
implementing a project that led to a good business outcome, and so on.
Publishing these achievements, through emails, internal portals, or company
magazines, would not only recognize the achievers but would also motivate others
to strive for similar goals. Taking such achievements into consideration for
career progression/promotion of employees will encourage them to support
enterprise digital initiatives. They will be more eager to learn, collaborate,
and create innovative solutions.

5.3.3 1C: Communication

Preparing a communication plan is one of the most under-focused areas while
defining enterprise data and analytics strategy, as well as afterwards. I have
seen many issues arise in enterprises because of the absence of a
well-thought-through communication plan. Data and analytics initiatives are, in
most cases, good for everyone—the enterprise as well as its employees. They
increase people's productivity and aid in better decision-making. However, this
fact is often not communicated effectively, giving rise to various concerns
amongst employees. Global enterprises have employees from various cultures,
which makes an effective communication plan even more important. A communication
plan that covers the entire breadth and depth of an enterprise goes a long way
in institutionalizing change.
During the early stages of a new data and analytics initiative (the strategy
definition phase and the initial pilot project implementation phase), I
recommend bringing in a change management specialist. Such a specialist, who
possesses a deep understanding of various cultures and has driven organizational
change management for large initiatives in various enterprises in the past, can
support and guide the data and analytics head in the overall change management
plan, including the communication plan. Primary responsibility for the plan,
though, must reside with the data and analytics head.
Following are some of my suggestions on effective communication.

(a) What to communicate? “What” refers to the content of a communication
message. While communicating any change to employees, the message from some
leaders often focuses mainly on how the change would benefit the enterprise.
While it is important to talk about that, the message must also cover how the
change would benefit different employees, i.e., different data and analytics
user personas. This part of the message is commonly referred to as “WIIFM”,
i.e., “What’s-In-It-For-Me”. The WIIFM content should come early in the
communication message, so that it grabs the attention of the target
audience/recipients. Also, the message should be customized for each category of
user persona rather than being a generic message for all user personas. This
applies to all modes of communication, i.e., email, flyer, presentation, or any
other channel.

(b) Who should communicate? The person who sends a communication is as
important as the communication message itself. I remember an incident where the
CIO of a global enterprise got approval from the CEO to start an
enterprise-wide digital transformation initiative. With all enthusiasm, the CIO
sent an email to business executives across the globe to (a) inform them about
the initiative, and (b) request their active support, including a few hours of
their time to participate in design thinking workshops that were to commence
immediately. However, he was disappointed to see a very poor response to his
email. The business executives felt that this was yet another IT initiative and,
because of their busy schedules, they did not commit any time to it. In
retrospect, the CIO felt that it would have been better had he requested the CEO
to send the communication email.
This incident is a good example of the importance of identifying the right
communicator for a message to be effective. “Who should communicate” should be
decided based on the type of message, the objective of the message, and the
target audience/recipients. This applies not only to communications from the
senior leadership of an enterprise but to all other communications as well. For
example, if the governance tower head of the data and analytics organization is
planning to send a communication to the team members of the business tower, the
governance tower head should take a moment to ponder whether it would be more
effective for the communication to be sent by the business tower head.
(c) When to communicate? Timing is another important aspect of the
communication plan. If you do not communicate at the right time, the
communication may lose its effectiveness. When you plan to define enterprise
data and analytics strategy, it is better to inform key stakeholders upfront and
then send them regular updates. This helps align them better with the
initiative. Both timing and frequency of communication are important
considerations in the communication plan. An example of bad timing is sending a
one-time announcement, after the enterprise data and analytics strategy has been
defined, stating that the corporate office has created a strategy that everyone
needs to adopt. Communications such as this will never win you wholehearted
support from stakeholders.
(d) How to communicate? By “how”, I mean the mode/channel of communication.
Email is the most frequently used form of communication in global enterprises.
The second most used form is probably the virtual presentation. While both are
effective channels, when the intended change is big, these two channels alone
may not suffice. One may have to think about organizing events where key
stakeholders from across the globe, who are going to be closely engaged in the
data and analytics program, come together and discuss in detail the data and
analytics strategy that their enterprise should have. Such events should be held
early on, preferably as a kick-off of the “enterprise churning” exercise that I
discussed in Chap. 2 of this book.
One can organize evening socialization during such events, where people get
to know each other better and discuss informally the impact that the changes
would bring to their professional lives. When you are driving change in an
enterprise that has people from multiple geographies and cultures, events such
as these always help. Relationships developed during such events go a long way
in establishing a culture of collaboration, thereby making the data and
analytics program more successful. The data and analytics head should strive not
only to get executive sponsorship for such events, but also to request the
executives to participate in them to communicate the importance of the
initiative.
(e) 3Cs of communication: Finally, I want to talk about the three Cs of
communication, viz. clear, concise, and compelling. Reviewing any communication
for these 3Cs always helps in making it more effective.
The first C is clarity. One should focus on creating a communication in
simple and easy-to-understand language. The target audience/recipients should
not feel the need to read between the lines to understand it, nor should
different people draw different inferences from it.
The second C is conciseness. Some people have the habit of writing very
lengthy emails to ensure that they do not miss any important point and that all
the points are explained in detail. However, if the email is too verbose, the
readers may miss the core message of the email. They may not even read the full
email. Hence, checking all communications for conciseness is important.
The third C is about making the communication compelling. A communication may
be both clear and concise, but if it is not compelling enough to enthuse the
target audience/recipients, it will fail to meet its objective. People may not
even remember it later. One of the means of making a communication compelling is
to focus on “What’s-In-It-For-Me”, which I discussed earlier under “what to
communicate”.
Every communication (through email, presentation, flyer, or any other channel)
must be critically reviewed for each of these three Cs. While this appears to be
very basic advice, I have found many emails and presentations, even from senior
executives, failing at least one of the three Cs. Spending a few extra minutes
to critically plan and review a communication always helps in making it more
effective.

5.3.4 2A: New Ways of Working

To become a learning organization, new ways of working need to be established.
In Chap. 4, I discussed in detail how to structure the data and analytics
organization and its processes to manage the high level of ambiguity inherent in
data and analytics projects. The five-tower approach and the new ways of working
that I recommended would help address the needs of all persona types. The
business tower plays an important role in this. I talked about spreading the
business tower across different geographical locations, so that its members can
spend more time on the ground with different business users. I also talked about
the importance of business stakeholders perceiving these members as part of
their own team instead of as belonging to a central data and analytics
organization. All such considerations in organization and process design are
very important from an organizational change management perspective.
While I discussed all the above aspects in detail in the previous chapter, let
me discuss an additional point here: managing change within the data and
analytics organization itself. Once you decide to adopt new digital technologies
to replace legacy technologies, the existing data and analytics team, which has
expertise in the legacy technologies, will have concerns about its future in the
organization. Team members may feel that they will no longer be relevant and may
therefore have job security concerns. Let me share an interesting experience in
this regard.

I was advising an enterprise on their digital strategy, with a small team of consultants
working with me on the exercise. In the data and analytics organization of the enterprise,
there were a lot of senior technology experts. On average, these experts had more than
fifteen years of experience, predominantly in a couple of legacy technologies. My team was
defining a new architecture to drive various digital initiatives, in which these legacy
technologies were going to be made obsolete. On learning this, a lot of resistance came from
the group of senior technology experts, who cited multiple reasons why the old technologies
should not be made obsolete. They quoted examples where these technologies were inextricably
linked to many critical enterprise applications, and said that changing the technologies
would put those applications at high risk, which could result in adverse business impact.
On learning about these concerns, I requested my team to do a thorough analysis of the
points that the experts were raising. On evaluating the concerns in detail, my team found
that all the challenges the experts were raising could easily be taken care of by proper
planning. There were no undue risks involved. Overall, the pros of replacing the legacy
technologies far outweighed the cons.
I then decided to have detailed one-to-one discussions with each of the senior experts.
After talking to them, I discovered that the root cause of their resistance was their sense
of insecurity, which needed to be addressed urgently. I also realized that this team of
senior experts had valuable contextual knowledge of the enterprise, which would be of
immense value both during the strategy definition phase and during subsequent implementation
of projects. Hence, unless their concerns were addressed urgently, getting their
wholehearted support would be impossible, which could ultimately lead to the failure of the
data and analytics program and the overall digital initiative of the enterprise.

Addressing insecurities such as the one quoted above is important in order to
ensure that people adopt new ways of working. Skill in a technology is just one
dimension of the value that experienced employees bring to the table. Their
knowledge of the enterprise, understanding of various systems, insights into the
available data, working relationships with various IT and business stakeholders,
etc., are other dimensions of value that are extremely useful. Someone who has
good knowledge of one technology and a good technical mindset can be quickly
cross-skilled in newer technologies. Such cross-skilling plans should be part of
the change management plan.
In view of the above, I always recommend that, while defining the data and
analytics strategy and the new organization structure with its roles, the
existing team be assessed for their skills and capabilities. Based on this
assessment, (a) their learning paths should be defined, and (b) they should be
mapped to the roles defined in the new organization structure. This should be
done in discussion with them, preferably in a one-to-one mode. During the
discussion, it should be highlighted how each of them would deliver higher value
to the organization in the new structure and, consequently, how that would lead
to a better career path for them.
The existing data and analytics team, which has spent many years in the
enterprise, understands the various business stakeholders much better than any
outside consultants or new hires. When a new data and analytics strategy is
defined, this team needs to play the role of change agents, driving change
across the spectrum of information consumers in the enterprise and helping
everyone adopt new ways of working—new ways in which data can be analyzed,
visualized, and used for decision-making. An organizational change management
consultant can only guide this team but cannot drive change himself/herself.
Hence, unless the existing data and analytics team is excited about the change
that the new data and analytics strategy is going to bring to the enterprise and
to them, they will not play the role of change agents. And you cannot bring in a
new set of people (with expertise in new technologies) and make them change
agents. The existing team, with its contextual knowledge, is best placed to
drive change across the breadth and depth of an enterprise. This must be
considered while implementing the new data and analytics organization structure
and new processes/ways of working.

5.3.5 2B: Innovation Process

I discussed the importance of collaboration for driving innovation under
“people”, in Sect. 5.3.2. I also discussed the need for a well laid-down plan
for motivating (rewarding and recognizing) people. However, to drive innovation,
several other interventions are required. While I will discuss some of these
under the technology and data focus areas, in this section let me talk about the
need to establish and adopt a process for driving innovation.
While every enterprise talks about innovation, not many lay down a formal
process for it. Employees may generate a plethora of ideas. While some may not
be worth pursuing, a few could be transformational in nature and many others
could result in incremental improvements. Hence, it is important to lay down a
structured process that defines how ideas are captured, evaluated, and
implemented. Not having a structured process under the pretext of creativity
does not help an enterprise.

Fig. 5.5 Typical steps in data and analytics innovation process

With the recent advancements in digital technologies (cloud, machine learning,
IoT, etc.), there are huge opportunities for enterprises to innovate by
leveraging data and analytics. Enterprises can capitalize on these opportunities
by adopting a structured process. In Fig. 5.5, I have summarized a typical
innovation process for data and analytics that enterprises should follow.
The process shown in the figure may vary a little based on the type of idea.
However, broadly, enterprises can use these nine steps to drive innovations
leveraging data and analytics. The “How” part of the nine steps mentioned in the
figure is self-explanatory (though it may not be exhaustive). Hence, I am not
elaborating on them further. I would, however, like to highlight below a few
important considerations that enterprises should keep in mind during the whole
process.

• In my experience, in data and analytics, co-generation of ideas (i.e., ideas
generated jointly by the data and analytics team while interacting with other
functions) is the most effective means of idea origination. In Chap. 4 as well
as in Sect. 5.3.4, I talked about how the data and analytics team needs to
spread across the enterprise—geographically as well as across different business
units/functions. One of their focus areas should be to work with their business
partners to generate ideas on novel ways of solving business problems and to
identify innovative solutions for capturing business opportunities.

• During steps 1–3, never disregard people's gut feelings. At a later stage, the
hypotheses would anyway be tested with data.
• Always keep the idea originator in the loop. Engage the originator as much as
possible in the entire process.
• During the initial analyses, one may have limited time and resources. Hence,
instead of focusing on completeness, it is more important to focus on the top
hypotheses. In my experience, the 80/20 rule applies to hypotheses—80% of
hypotheses turn out to be false. Hence, if one's instincts suggest that certain
hypotheses are in the top 20%, one should do detailed analyses of those to
decide whether to continue pursuing an idea.
• The data gathering and preparation activity (in step 4) is often
underestimated. Without the right quality of data, hypothesis testing would be
of little value. Hence, the right experts from within the data and analytics
team should be involved in this. My experience is that within the business tower
and architecture tower of any data and analytics team, there is immense
knowledge and understanding of data that can not only save a lot of effort but
also ensure the availability of good quality data.
• In step 5, while defining the detailed solution, it is important to define all
aspects of the solution, not just those limited to data and analytics. For
example, if the solution requires changes in certain business processes to
achieve the best possible outcome, one must analyze and define those as well.
• While preparing the business case (step 6), ROI should be calculated for at
least three scenarios—normal case, worst case, and best case. In Chap. 6, on the
value measurement framework, I will discuss this aspect in detail.
• Marketing the solution/business case is an art. I always recommend using the
approach of “storytelling with data” to capture the audience's imagination.
Later in this section, under Sect. 5.3.7, I will describe this approach in more
detail.
• In most cases, especially when a project is expensive and time-consuming, it
is advisable to implement a pilot project before implementing the complete
solution. Further, one should use an iterative approach to implementation, so
that ongoing learning can be applied for continuous improvement.
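To illustrate the three-scenario business case in step 6, here is a minimal sketch. The cost and benefit figures are entirely hypothetical; the point is simply that ROI should be reported for the worst, normal, and best cases rather than as a single number:

```python
def roi(benefit, cost):
    """Return on investment as a fraction: (benefit - cost) / cost."""
    return (benefit - cost) / cost

cost = 500_000  # hypothetical implementation cost
scenarios = {   # hypothetical annual benefit estimates per scenario
    "worst case": 450_000,
    "normal case": 750_000,
    "best case": 1_200_000,
}

for name, benefit in scenarios.items():
    print(f"{name}: ROI = {roi(benefit, cost):.0%}")
# worst case: ROI = -10%
# normal case: ROI = 50%
# best case: ROI = 140%
```

Presenting all three figures side by side makes it explicit that the idea can lose money in the worst case, which is exactly the kind of transparency a credible business case needs.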

An immense amount of research and technology advancement is happening in
analytics and artificial intelligence. However, it may take enterprises decades
to understand the areas where analytics and AI can be applied and to realize
their full potential. Enterprises that innovate using the new technologies
earlier will gain competitive advantage. Adopting a structured innovation
process goes a long way in achieving this objective.

5.3.6 2C: Interaction Model with Different Functions

From the “process” focus area perspective, I talked about new ways of working
for the data and analytics team (under 2A) and establishing a structured
innovation process (under 2B). However, for the process changes envisaged under
2A and 2B to be institutionalized, it is important to define the way the data
and analytics team should interact and work with other IT and business
functions. This helps in driving analytics at scale, i.e., institutionalizing
data and analytics across an enterprise. Figure 5.6 illustrates the various
functions within an enterprise that the data and analytics organization
typically interacts with.

Fig. 5.6 Various IT and business functions that data and analytics team interacts with
Let me describe the nature of the interaction and working model that the data
and analytics team needs to have with each of these teams, starting from the top
center of the figure (IT applications teams—ERP, CRM, etc.) and moving clockwise
through all the other teams shown in the figure.

(a) IT applications teams (ERP, CRM, etc.): Since the majority of enterprise
data required for analytics comes from enterprise systems such as ERPs and
CRMs, the data and analytics team must work very closely with these application
teams. The data and analytics team must align with them on data and analytics
topics, understand their respective applications and future roadmaps, and
communicate the data and analytics roadmap to them. As mentioned in Chap. 2, a
lot of operational reports in enterprises are delivered directly from
transactional systems such as ERPs and CRMs. The data and analytics team needs
to understand what is being delivered and what issues exist with the data and
operational reports from these systems. This helps in understanding the
transactional systems much better. It also helps in formulating the policy for
deciding what types of information needs should be met by transactional systems
and what should be delivered by the data and analytics team. Overall, developing
a conducive working relationship with the teams that take care of transactional
systems helps not only in mutual alignment but also in getting the required
support from them. Their support is vital for the success of the data and
analytics program.
(b) IT business relationship managers (BRMs): Many large enterprises have
formal IT business relationship manager (BRM) roles. These people belong to
either IT or business functions, but irrespective of where they belong, they
have a very good understanding of both IT and business, which is why they are
assigned the role of acting as a conduit between the two. The data and analytics
team needs to work very closely with them. BRMs can help the data and analytics
team understand the various challenges that business stakeholders are facing and
the key initiatives being undertaken to address them. BRMs are also an important
channel through which a lot of data and analytics demand flows. Hence, they are
important stakeholders in the demand management process. Keeping them up to date
with data and analytics capabilities and the roadmap is important, so that when
they interact with business stakeholders across business units, functions, and
geographies, they can be spokespeople for the data and analytics team.
(c) Business teams (finance, supply chain, etc.): These are the end customers
of the data and analytics team. In the previous chapter, I discussed in detail
what the working model between the data and analytics team and the business
stakeholders should be, so I will not repeat it here.
(d) Legal: The data and analytics team needs to work with the legal team to
get approval on contracts with various partners—service providers, hardware
suppliers, or software vendors. Often, there are special clauses that partners
may wish to put in the contracts. The legal team needs to review all contracts,
for regular as well as special clauses, and give approval. This is required to
avoid any unforeseen future obligations for the enterprise. The data and
analytics team needs to understand the enterprise's legal policies and processes
and work closely with the legal team to speed up the contracting process.
(e) Human resources: Across the entire employee life cycle of the data and
analytics team (from hiring to exit), there are multiple employee engagement
touchpoints where close work with the HR function is required. Examples include
hiring people with relevant skills into the data and analytics organization,
career planning, internal/external training, and so on. A close working
relationship with HR is important from an employee engagement and satisfaction
perspective.
(f) Procurement (for IT): For procuring any service, hardware, or software,
the data and analytics team needs to work with the procurement function. Vendor
contracting, price negotiation, master service agreements, etc., are some of the
areas that require close work with them.
(g) Chief information security officer (CISO): Many large enterprises have a
formal “chief information security officer” role to address intellectual
property and cyber security concerns. The CISO is responsible for both laying
down policies and guidelines for information security and conducting regular
audits to ensure compliance. The policies and guidelines apply to all functions
within an enterprise, since the cost of any security breach may prove very high
for the enterprise. It is important that the data and analytics organization
aligns with the CISO and all the policies and guidelines. It is also recommended
that, in addition to the CISO audit, the data and analytics team conduct its own
internal security audit. This is important, as the data and analytics team deals
with a lot of sensitive enterprise data.
(h) IT enterprise architecture team: In most large enterprises, there is an
enterprise architecture team within IT. This team is responsible for laying down
enterprise architecture standards, policies, and guidelines, which all IT teams,
including the data and analytics team, need to follow. However, if certain
deviations are required for genuine business reasons, the deviations should be
discussed with the enterprise architecture team and their approval sought.
(i) IT infrastructure team: Another important team within IT is the
infrastructure team. With cloud becoming more popular, this team is gradually
shrinking in most enterprises. However, it will continue to exist to (a) support
infrastructure associated with legacy applications, and (b) ensure governance
for cloud infrastructure. The data and analytics team needs to work with the
infrastructure team for all its infrastructure-related support requirements.

In general, maintaining a healthy and effective working model with all the above
teams is important for the data and analytics team to be successful. To do this,
inter alia, it is important to develop an understanding of the focus areas,
priorities, and constraints of all the teams. The data and analytics team should
request all the functions/teams to include data and analytics representative(s)
in their periodic steering committee meetings, so that the representative(s) can
both understand the perspective of these teams and put forth suggestions in
areas that have data and analytics touchpoints. This approach helps the teams
understand each other better and, therefore, avoid conflicts and disharmony in
day-to-day working.
I have sometimes come across the concern that getting aligned with all these
teams takes a lot of effort, which may not always be worthwhile. While I agree
that an investment of time is required, I do not agree that the investment is
not worthwhile. When an enterprise is small, all functions, including IT, are
small, close-knit teams. Very little time is spent in getting aligned with each
other. People know each other reasonably well. They often meet, formally or
informally, and develop a bond. However, as an enterprise grows and becomes
global, structures and processes need to be put in place to manage complexity.
While many benefits come with large size, one must invest more time (compared to
what is needed in a small enterprise) to follow processes and manage
organizational complexities. One cannot choose to enjoy only the benefits (such
as those arising out of having various focused groups within IT) and not invest
time in developing good working relationships with those groups. This applies
not just to the data and analytics team, but to every other function/team in a
large enterprise.
The head of data and analytics of a large enterprise must ensure that the
relationship between her/his team and the other functions/teams of the
enterprise is cordial. In the last chapter of this book, I discuss the key soft
skills that a data and analytics leader must possess. Those skills are very
relevant in this context.

5.3.7 3A: Training on New Technologies

To become a learning organization, it is important that employees are trained to
use new data and analytics technologies. One of the reasons people resist change
is “fear of the unknown”. If you can replace “fear” with “excitement” in their
minds, you will find a drastic change in their attitude. One of the means of
achieving this is through training and motivation.
When any new data and analytics initiative is undertaken in an enterprise, many
business stakeholders are apprehensive about the new data visualization/
analytics tools that will be brought in to replace the existing ones. Over the
years, people develop a comfort level with the old tools. They have concerns
about whether the new tools will have the same functionalities. They also fear
that they will need to learn the new tools from scratch. Even within IT, there
are apprehensions amongst the existing team about their own relevance in the
organization once the new tools/technologies are brought in.
When a digital/data and analytics initiative is transformational in nature, there
would be many changes beyond just tools and technologies. Business processes
may change. New ways of working may have to be learned. Organization structure
may change. All these create fear amongst employees. Would they be able to adapt
to the new ways of working? How much extra time they would have to spend to
catch up? Would they be able to learn the new tools/technologies? What would
be their role in the new organization structure? Would their jobs remain secure?
Questions such as these keep bothering them. To address such concerns, inter
alia, training plays an important role. Hence one needs to plan for it diligently.
Following are my suggestions in this regard.

(a) Who and what of data and analytics training? It is important to identify and
segment the people to whom training is to be imparted. The training content
and mode of training depend on this. Following are the key considerations
for segmentation.
– Age group: People in the older age bracket are more comfortable with a tra-
ditional style of training (such as classroom training), while the younger
generation prefers online self-learning courses that they can go through
on their mobile devices or laptops at their own pace.
– Training IT versus business stakeholders: For IT stakeholders, training
content must be in-depth and hands-on in the new technologies, while
for business stakeholders, the focus should be more on how to apply the
new toolset in various business contexts. Further, within IT itself, there are
different groups (as discussed in Sect. 5.3.6, earlier in this chapter), and
training/awareness sessions must be planned according to the needs of each
group. Similarly, for business stakeholders, the training should be based on
user personas, as elaborated in the next paragraph.
– Nature of analytical work: In Sect. 5.3.1, earlier in this chapter, I talked
about a wide spectrum of information consumers in an enterprise. On one
end of the spectrum are people who use a reporting tool to view standard
financial or other reports. They do not have deep analytical needs. They
do not need to do a lot of slicing and dicing of data. Such people need
basic training to create awareness on aspects such as how to use the tool;
how to log in (what URL to use, which browser to use); the key differences
in look and feel/usability of reports compared to the earlier tools; the
advantages they would get from the new tool; and whom to reach out to if
they face any issue. On the other end of the spectrum are data scientists/
analysts who spend a considerable amount of time exploring various types
of data, uncovering hidden patterns, and developing algorithms for various
business problems. The training approach for such people obviously needs
to be very different and much more detailed. They need to be made aware of
the workings of the new tools in detail, through demonstration of the new
functionalities that the old toolset did not possess. They need to be coached
on how to use these new functionalities. It would be easier to excite this
group if one can demonstrate solving complex business problems using the
new tools. Once they start appreciating that by using the new tools their
work becomes more interesting and they can be more productive, they will
be happy to embrace the tools.
(b) How to impart training?—The importance of storytelling. Training can be
provided through various modes: classroom sessions, virtual sessions, online
self-study courses (text/audio/video), and on-the-job tasks. The relevant mode
can be chosen based on trainee segmentation, as discussed earlier. Irrespective
of the mode, I have always found the storytelling approach to be very effective
in data and analytics training. Storytelling helps in engaging the audience
better. For business stakeholders, storytelling with data, using data visualization
tools, can be intriguing. Let me elaborate “storytelling with data” below.

It is a well-known fact that more than 50% of the human brain is dedicated to processing
information visually. Some studies have suggested that this number is as high as 80%.
Irrespective of the actual percentage, all experts agree that visual learning is the best
mode of learning for the majority of people. The human brain has a remarkable ability
to remember visual images. The left side of the brain has cognitive (thinking, verbal, and
analytical) abilities while the right side has perceptive (visual, intuitive, and creative)
abilities. Both sides are tied together by nerve fibres and work together. The greater the
visual stimulus, the faster the brain interprets it. So, when one trains business
stakeholders through a storytelling approach, using good data visualizations, one can
vividly demonstrate how the new technologies can help them take faster decisions and
be more productive.
As an example, while conducting training for a supply chain manager, one can start by
sharing an understanding of the typical day in the life of the manager, e.g., how she/he
would like to start the day by understanding the fleet movement of inbound/outbound
logistics, how she/he would like to locate certain trucks/ships that are on the critical path,
how she/he would like to pre-empt delays in the supply chain, and so on. One can then go
on to demonstrate, using the new data visualization tool, how the manager can view
fleet movements on a geographical map, locate specific shipments, and combine such
visuals with weather data on the same map to identify links in the supply chain that
could potentially be disrupted by bad weather. The trainer can go on to demonstrate
how the tool can propose alternate routes to mitigate supply chain disruption risk. Once
the supply chain manager sees how the new data and analytics initiative can make
her/his life much better, she/he would not only embrace the change, but would also go
all out to provide any support required to make the change happen.
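The persona-based segmentation discussed under (a) above can be sketched as a simple rule lookup. The sketch below is purely illustrative: the age threshold, persona names, and plan labels are my own hypothetical examples, not prescriptions.

```python
# Illustrative sketch: choosing a training plan per trainee persona.
# The age threshold, roles, and plan labels are hypothetical examples.

def training_plan(persona: dict) -> dict:
    """Pick a training mode and content depth for one trainee persona."""
    # Mode: younger trainees often prefer self-paced online courses,
    # while others tend to prefer classroom sessions (a broad heuristic).
    mode = "online self-learning" if persona["age"] < 40 else "classroom"

    # Depth: IT staff and data scientists need in-depth, hands-on content;
    # casual report viewers need only basic awareness sessions.
    if persona["role"] in ("data scientist", "IT engineer"):
        depth = "in-depth hands-on"
    elif persona["role"] == "report viewer":
        depth = "basic awareness (login, navigation, support contacts)"
    else:
        depth = "applied use of the new toolset in business scenarios"
    return {"mode": mode, "depth": depth}

plan = training_plan({"age": 29, "role": "data scientist"})
```

A real segmentation would, of course, be richer (preferred language, prior tool experience, and so on); the point is that segmentation rules should be explicit and reviewable.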

5.3.8 3B: Exploration of Fit-for-Future Technologies

In 1B, I talked about the importance of collaboration and motivation, and in 2B, I
talked about the importance of establishing an innovation process to drive innovation
in an enterprise. The third important element needed to drive innovation in an
enterprise is continuous exploration of upcoming technologies that can support
innovative ideas.
The technology and architecture tower of the data and analytics organization
should be primarily responsible for exploring fit-for-future technologies. I discussed
this in Chap. 4, under the “research and innovation” responsibility of the tower.
I also discussed how innovation is happening in data and analytics technologies
(both in data engineering and in artificial intelligence/machine learning) at
an exponential rate, driven both by start-ups and by established players such as
Microsoft, Amazon, and Google. I discussed the importance of keeping track of all
such developments and trying out various proofs-of-technology to explore how
some of the emerging technologies can solve specific business problems of the
enterprise, thereby ensuring that the enterprise stays ahead of the competition. I
also shared an interesting example in this context.
Following are my suggestions on exploring fit-for-future technologies.

• Senior people from the technology and architecture tower should be continu-
ously engaged in boundary spanning for new technologies. Boundary spanning
should include, inter alia, searching the Web, attending relevant conferences,
joining open-source communities (since these are often the first to adopt new
technologies), and exploring what enterprises in the same or other industries
are doing.
• It is important to collaborate with academia and start-up ecosystem while
exploring potential future technologies. It is a win–win situation for all the
parties.
• While boundary spanning can be limitless from a time and effort perspective,
early identification of potential technologies that could be important in the
future, in the context of the enterprise’s business, is important. Research
pursued purely out of academic interest, without converging on a potential
application of the technology in the enterprise’s business, may not stay
funded for long.
• While identifying potential future technologies, the technology and architec-
ture tower should work closely with the business tower of the data and analytics
organization to identify potential business application areas.
• While some business applications of potential future technologies could
directly address certain business challenges, others could be truly transformational
in nature—something that can even help change the business model of the
enterprise. Hence, while identifying potential future technologies, the data and
analytics team needs to think out of the box, without getting constrained by
organizational boundaries and challenges.
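To make the convergence point above concrete, boundary-spanning output can be funneled through a simple prioritization filter. The candidate technologies, scores, and threshold below are hypothetical; the only point is that candidates with low business applicability drop out of the proof-of-technology queue.

```python
# Illustrative sketch: shortlisting scouted technologies for
# proofs-of-technology. Candidate names and scores are hypothetical.

candidates = [
    {"tech": "graph analytics engine",  "business_fit": 8, "maturity": 6},
    {"tech": "quantum annealer",        "business_fit": 3, "maturity": 2},
    {"tech": "streaming feature store", "business_fit": 9, "maturity": 7},
]

# Fund exploration only where a business application is in sight; pure
# academic interest (low business_fit) may not stay funded for long.
shortlist = sorted(
    (c for c in candidates if c["business_fit"] >= 6),
    key=lambda c: (c["business_fit"], c["maturity"]),
    reverse=True,
)
poc_queue = [c["tech"] for c in shortlist]
```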

While it is important to explore new technologies to drive innovation and take the
enterprise on the path of change, it is equally important to institutionalize these
changes, i.e., implement the new technologies at an enterprise level once their
potential value in the context of the enterprise’s business is demonstrated. Let me
discuss this next.

5.3.9 3C: Institutionalization of New Technologies

Any new technology change is difficult, especially when it comes to institutionalizing
it across an enterprise. Based on my experience, I would suggest the following
to ensure that technology change management is a smooth process.

• Articulate the business case very well. It will go a long way in demonstrating the
value of the new technology to all stakeholders.
• Align the new technology with the defined enterprise technology and architec-
ture strategy for data and analytics that I discussed in Chap. 3. If the
business case demands revisiting the defined technology and architecture strat-
egy, it should be done with due consideration to all the principles that I talked
about in that chapter.
• Select the team responsible for implementing the new technology very care-
fully. Some of the considerations for choosing the team are diversity, good
technology skills, good business knowledge, good leadership skills, and the
ability to manage ambiguities/uncertainties. The team would face a lot of
challenges, both technological and organizational. Having a team that enjoys
challenges is key to success.
• Start implementation of the new technology with a pilot project, using an MVP
(minimum viable product) approach. An MVP, in a new technology context, is a
product (an IT application) that has just enough “features” to satisfy initial
users/customers. It helps in verifying the assumptions made and checking the use-
fulness of the new technology. Feedback from the initial users of the MVP helps
drive improvements before all the “features” are developed. A small point worth
mentioning here is that an MVP is not the same as a prototype. The latter is a
mock-up and not a fully functional product, whereas an MVP is fully
functional, albeit with select “features”.
• Communicate the value of the new technology to each user persona clearly. Focus
especially on the “What’s-In-It-For-Me” message that I discussed in 1C
of this section.
• Create a training plan for the new technologies for each user persona. I discussed
this in 3A of this section. This is a critical aspect of technology change
management.
• Recognize and reward the implementing team as well as initial users (early
adopters).
• Publish the success story of the MVP widely, across the enterprise. This will create
excited anticipation amongst everyone about the new technology.
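The MVP idea above can be sketched as a simple feature gate: pilot users exercise a minimal feature set, while the remainder stays deferred until their feedback is in. The feature names and the "pilot" user group below are hypothetical examples of mine, not from any real product.

```python
# Illustrative sketch: an MVP ships only a minimal feature set to pilot
# users; the remaining features stay gated until pilot feedback is in.
# Feature names and the "pilot" group are hypothetical.

MVP_FEATURES = {"standard_reports", "basic_filters"}
FULL_FEATURES = MVP_FEATURES | {"what_if_simulation", "custom_dashboards"}

def enabled_features(user_group: str) -> set:
    """Pilot users get the MVP; the full set waits for the enterprise rollout."""
    return set(MVP_FEATURES) if user_group == "pilot" else set()

# Features deferred beyond the MVP, to be prioritized from pilot feedback.
gated = FULL_FEATURES - MVP_FEATURES
```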

5.3.10 4A: Data Literacy

To make an enterprise a learning organization, one important focus area is to make
it data driven, and for that to happen, one of the first steps the enterprise needs
to take is to drive data literacy amongst all employees. While every enterprise
would agree with this statement, most enterprises in the world today are not fully
successful at it.
Let me explain what I mean by data literacy. It is the ability of all employees
(both data professionals and non-data professionals) in an enterprise, irrespective
of country, culture, educational background, gender, age, etc., to converse in a
common language of data, in the context of the business of the enterprise they
work in. Being data literate does not mean that a person should be able to do
statistical analysis or be an expert in data mining. Instead, it requires the
person to possess the following understanding/knowledge at a high level.

• General understanding of the different types of data in the enterprise.
• General understanding of the different systems in the enterprise where data is
generated.
• General understanding of the enterprise’s data ecosystem, i.e., different tech-
nologies/platforms used by the enterprise, not just for creating data but also for
analyzing data.
• General understanding of the enterprise’s data language, i.e., general defini-
tions of common data entities such as customers, suppliers, products, services,
purchase orders, and sales orders, and of common performance metrics such as
revenue, gross margin, and carbon footprint.
• Ability to do basic data analysis and interpret results, e.g., the ability to read
various types of charts and graphs, and understand what they mean and, more
importantly, what they do not mean. Often, people misinterpret an analysis or
a data visual if its business context is not well understood.

Fig. 5.7 Steps to drive data literacy in an enterprise

• General understanding of data quality and related issues in an enterprise’s data
ecosystem, and how these issues can distort any data analysis.
• Ability to use basic data visualization tools and techniques. In every enterprise,
Excel is the most common tool for data analysis and, therefore, at a min-
imum, every employee should be able to use the data analysis features of Excel
comfortably.
• Awareness of the various data and analytics teams in the enterprise whom
one can reach out to for help.
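The point above about data quality distorting analysis can be shown in a few lines. The records below are hypothetical: a duplicated load of one sales order inflates the revenue total until the data is deduplicated by order id.

```python
# Minimal illustration (hypothetical numbers): duplicated sales records
# inflate a revenue total until they are deduplicated by order id.

sales = [
    {"order_id": "SO-1001", "revenue": 500.0},
    {"order_id": "SO-1002", "revenue": 750.0},
    {"order_id": "SO-1001", "revenue": 500.0},  # duplicate load of SO-1001
]

naive_total = sum(row["revenue"] for row in sales)            # 1750.0 (wrong)
deduped = {row["order_id"]: row["revenue"] for row in sales}  # keep one per id
clean_total = sum(deduped.values())                           # 1250.0
```

A data-literate employee need not write such code, but should know that the 1750 figure is suspect and why.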

Driving data literacy in an enterprise should be an iterative process, as depicted in
Fig. 5.7.
Let me describe the steps to drive data literacy in the following paragraphs.

• Initial data literacy assessment and gap identification: Before planning a
data literacy initiative in an enterprise, it is important to assess the current state
of employees’ data skills. This assessment should be done for each user persona
vis-à-vis their analytical needs. I discussed user persona mapping (as per
analytical needs) in Sect. 5.3.1 earlier in this chapter. The assessment
would help in identifying the gaps that exist for different user personas. The
assessment should be done by the data and analytics team (especially the business
tower and the technology and architecture tower) since they interact with almost
all business units and functions of the enterprise across the globe. A structured
questionnaire should be prepared for this assessment. The questionnaire should
be shared with a representative sample of employees—people chosen carefully
from amongst the different user personas across the enterprise. Assessment
of data skills and identification of gaps (with respect to the desired state)
should be based both on the results of the survey and on the practical experience
of the data and analytics team from working with the various user personas.
• Prepare a data literacy plan: Once the gaps are identified, a data literacy plan
should be prepared for each user persona. The plan should consist of a combi-
nation of one or more of the following: self-learning courses, training programs,
knowledge-sharing sessions, and engagement in data and analytics projects. For
self-learning/training courses, enterprises can leverage some of the standard
trainings available from popular professional course providers. However, it is
important to complement these with specific content prepared in the context
of the enterprise. Such learning/training should cover enterprise-specific con-
tent such as the common business vocabulary and an overview of the enterprise’s
data ecosystem. Further, all the considerations that I discussed in Sect. 5.3.7
earlier would apply here as well. Overall, my recommendation is that enter-
prises should endeavor to establish a data and analytics academy as part of
the data and analytics organization. The primary responsibility for this academy
can reside with either the business tower or the technology and architecture tower.
• Execute the plan: Once the data literacy plan is prepared, it needs to be executed.
The starting point of execution is to communicate the objective and plan of the
data literacy drive to all stakeholders. Such communication should always be ini-
tiated by the executive leadership of the enterprise and should highlight the benefits,
especially WIIFM (“What’s-In-It-For-Me”), which I discussed earlier in this
chapter. Further, all the other suggestions that I discussed in Sect. 5.3.3
should be followed. Once the communication is sent and the buy-in of employees is
secured, trainings can be conducted, in addition to other interventions as per
the data literacy plan. Faculty for conducting trainings should be chosen from
the business tower and/or the technology and architecture tower. They should be
data analysts or architects who are excellent coaches and skilled in storytelling
with data.
• Evaluate effectiveness: There are various means by which the effectiveness
of executing the data literacy plan can be evaluated. Some of these are feedback
from employees, feedback from faculty, formal evaluation of participants,
external certification of employees in specific data and analytics competencies,
and feedback from supervisors on the on-the-job improvements observed
after a training program. All such feedback should be reviewed critically to
(a) evaluate the progress made, (b) identify the gaps that still exist in employees’
data skills, and (c) make changes to the data literacy plan so that it can be made
more effective. This should be an ongoing exercise.
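The assessment and gap-identification step above can be sketched as a per-persona comparison of desired versus observed skill levels. The 0–5 scale, persona names, and scores below are hypothetical examples.

```python
# Illustrative sketch: per-persona data literacy gap from survey results.
# The 0-5 skill scale, personas, and target levels are hypothetical.

desired = {"report viewer": 2, "business analyst": 4, "data scientist": 5}
observed = {"report viewer": 2, "business analyst": 2, "data scientist": 4}

# Gap = how far each persona's observed skill falls short of the target.
gaps = {
    persona: max(0, desired[persona] - observed.get(persona, 0))
    for persona in desired
}

# Personas with a positive gap get a data literacy plan.
needs_plan = sorted(p for p, g in gaps.items() if g > 0)
```

The same computation, rerun after each training cycle, supports the "evaluate effectiveness" loop: gaps should shrink iteration by iteration.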

I would like to add a final point on driving data literacy in an enterprise. Learning
is a journey, not a one-time activity. Hence, the zeal with which an enterprise
drives data literacy, in its effort to become data driven, should not die down after the
initial push. Both the executive leadership and the data and analytics organization
of an enterprise need to maintain this zeal on an ongoing basis.

5.3.11 4B: Data Thinking

Data literacy is all about developing, amongst all employees, basic skills in
understanding and analyzing enterprise data, so that they understand the enterprise’s
data ecosystem and converse in a common data language as they strive to become
data driven in their day-to-day working. While a data literacy initiative would make
an enterprise a learning organization, driving innovation at scale requires more
than data literacy. If an enterprise aspires for data and analytics to drive its
business strategy through innovative digital solutions, it needs to inculcate data
thinking amongst a large population of the enterprise. Let me explain what I mean
by data thinking and how it can be nurtured in an enterprise.
What is data thinking? Data thinking is an approach to view, analyze, and trans-
form an enterprise through data-driven solutions. It encompasses the following.

• Understanding business strategy and value chain of the enterprise.
• Understanding customer needs, competitive landscape, and other business
dynamics of the enterprise.
• Understanding key business processes of the enterprise.
• Understanding data blueprint of the enterprise—a blueprint that cuts across
business processes, business functions, business units, and geographies.
• Understanding data ecosystem, including technology landscape of the enter-
prise.
• Taking both top–down and bottom–up approach to think out-of-the-box and
transform the enterprise with data-driven solutions at all levels—operational,
tactical, and strategic.
• Applying design thinking principles to data science to conceptualize and
implement innovative data and analytics solutions with latest/fit-for-future
technologies.
• Collaborating with key stakeholders across the enterprise to achieve the above
objectives.
• Driving enterprise business strategy by doing all the above.

Data thinking goes much deeper than data literacy in terms of skills required.
Hence, while data literacy is targeted towards every employee in the enterprise,
data thinking is targeted towards all employees who are data savvy and have
immense passion for data.
How can data thinking be nurtured in an enterprise? Before discussing how
an enterprise can nurture data thinking, let me summarize below the key hard
skills required for data thinking. In addition to these, various soft skills, such as
the ability to collaborate with others, are required for developing any innovative
solution; I have discussed these in multiple places in this book and, therefore, will
not discuss them here again.

• Data literacy (pre-requisite).
• Data science—artificial intelligence/machine learning algorithms.
• Technology.
• Business strategy, value chain, and processes.

One may argue that these skills can reside only in the data and analytics organi-
zation and that it would be very difficult for anyone outside that organization to
be engaged in data thinking. However, my experience is that, in any enterprise,
there is a large community of people who are data savvy and have a reasonably
good understanding of data science and technology. These people, with focused
training and engagement, can prove to be very useful in coming up with innovative
data and analytics solution ideas. Of course, the data and analytics team needs to
work closely with them. In this regard, in Chap. 4 of this book, I discussed the
importance of the business tower of the data and analytics organization working
closely with various business stakeholders.
For data thinking, the overarching framework depicted in Fig. 5.8 is extremely
useful.
Before describing this framework, let me give an overview of the terminology/concepts
used in the framework.

Fig. 5.8 Overarching data thinking framework for an enterprise

• Value chain: Michael Porter, in his famous 1985 book, Competitive Advantage,
introduced the concept of the value chain, which takes a process view of enterprises.
He broke an enterprise down into nine activities, under two categories—primary
activities (inbound logistics, operations, outbound logistics, marketing
and sales, and service) and support activities (procurement, technology, human
resource management, and infrastructure). Many variants of this model have
been created for enterprises in different industries. Value chain analysis has
become very popular over the years, since it takes a systems view of enterprises,
breaks processes into activities, understands the inputs/outputs/transactions for
each activity, evaluates what value each activity creates for the customer,
and identifies how value can be maximized at each point of the chain. As enter-
prises have become more complex and global over the last few decades, value chain
analysis has proved very useful in driving their business strategy and
improving their business performance.
• System-agnostic enterprise data model: An enterprise data model is a busi-
ness blueprint of enterprise data. It is agnostic of the transactional or other systems
that exist in the enterprise. I discussed the enterprise data model in Chap. 4.
I explained the difference between conceptual, logical, and physical data
models. I also explained the advantages of creating and maintaining an easy-to-
understand, business-process-oriented enterprise data model, and mentioned
that the technology and architecture tower of the data and analytics organization (in
collaboration with the business tower) needs to be the creator and custodian of the
enterprise data model. The key takeaway of that discussion was that a disciplined
approach towards creating a blueprint of data (the enterprise data model) goes a
long way in achieving the digital ambition of an enterprise.
• Product data model: Before explaining the product data model, let me define
a product. Earlier in this chapter (in Sect. 5.3.9), I talked about an MVP (minimum
viable product) as an IT application that has just enough “features” to satisfy
initial users/customers. In general, a digital product (and that includes a data and
analytics product as well) can be hardware or software or a combination of
both. For example, a sales analytics software application built on cloud infras-
tructure and accessible through a mobile/desktop device is a product that can
be used by a sales team to analyze sales to customers. When I talked about the MVP
earlier, I was referring to an iterative/agile process of product development, so
that value could be delivered to users in a shorter time (incrementally) and regular
feedback could be taken for continuous improvement of the product. Moving
on to the data model: any analytics product requires a comprehensive and reliable
data model that can enable the various analyses users would want to do to
answer various business questions. Hence, the product data model is a core and
critical component of an analytics product.

With the above overview of the terminology/concepts used in the overarching data
thinking framework, let me now explain the framework itself. The top portion of the
triangle in Fig. 5.8 is the “system-agnostic enterprise data model”. It should be based
on the value chain and business processes of an enterprise. When one starts creating
such an enterprise data model, one faces two major dilemmas, as listed
below.

1. Should the model be built in a big-bang approach, or should it be built
incrementally? I addressed this point in Chap. 4, where I recommended an
incremental approach.
2. What should be the level of completeness of the enterprise data model? This
dilemma arises from the fact that any large enterprise has a lot of systems across
its business units and geographies. Further, many local regulatory and other
business nuances apply to each geographical business. If one tries to take all
of those into consideration while defining the data blueprint, it becomes a
herculean task. Hence, I always recommend building an enterprise data model
that is close to 80% complete from a data coverage perspective. The remaining
20% would typically be about local data specifics that should be taken care of
during the projects for developing analytics products. I will elaborate this point
further as I explain the framework in the following paragraphs.

The bottom portion of the triangle in Fig. 5.8 is about the product data model,
which should be developed based on the following principles.

1. The product data model should be derived from the enterprise data model as a
starting point.
2. It should then be enhanced with both the system-specific data attributes and the
local geographical data attributes required for the purpose of the product.
3. The product data model, once successfully completed, should flow back into the
enterprise data model, which gets enriched with the new data entities and
attributes that were added to the product data model.
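The three principles above can be sketched as set operations on entity/attribute maps. The entities and attributes below are hypothetical; a real enterprise data model would, of course, be far richer.

```python
# Illustrative sketch of the three principles: derive the product data model
# from the enterprise model, enhance it, and flow the enhancements back.
# Entities and attributes are hypothetical.

enterprise_model = {
    "Supplier": {"supplier_id", "name", "country"},
    "PurchaseOrder": {"po_id", "supplier_id", "amount"},
}

# 1. Derive: start the product data model from the entities the product needs.
product_model = {e: set(attrs) for e, attrs in enterprise_model.items()}

# 2. Enhance: add system-specific and local geographical attributes.
product_model["PurchaseOrder"] |= {"erp_doc_number", "local_tax_code"}

# 3. Flow back: enrich the enterprise model with the new attributes.
for entity, attrs in product_model.items():
    enterprise_model.setdefault(entity, set()).update(attrs)
```

Each product built this way starts richer than the last, which is what makes the incremental, roughly-80%-complete enterprise model workable in practice.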

I have illustrated this process in Fig. 5.9, taking the example of the P2P (procure-to-
pay) process and an illustrative product for “maverick spend analytics”.
(Maverick spend is the amount of money spent purchasing certain products or
services from suppliers on an urgent basis, due to business exigencies, without
following the standard laid-down purchasing process. It leads to higher costs for an
enterprise, which may miss out on supplier discounts on these purchases. It often
gets misused and, therefore, many procurement heads want to analyze maverick
spend so that it can be minimized.)
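As a minimal sketch of the metric itself, maverick spend can be computed as the share of purchase value that bypassed the standard process; here, with hypothetical records, an invoice with no approved purchase order reference counts as maverick.

```python
# Illustrative sketch: maverick spend as the share of purchase value made
# outside the standard purchasing process (here: no approved PO reference).
# Records are hypothetical.

purchases = [
    {"invoice": "INV-1", "amount": 4000.0, "approved_po": "PO-77"},
    {"invoice": "INV-2", "amount": 1000.0, "approved_po": None},  # maverick
    {"invoice": "INV-3", "amount": 5000.0, "approved_po": "PO-78"},
]

total_spend = sum(p["amount"] for p in purchases)
maverick_spend = sum(p["amount"] for p in purchases
                     if p["approved_po"] is None)
maverick_ratio = maverick_spend / total_spend
```

A real product would slice this ratio by supplier, category, and business unit, using the product data model described above.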
Salient features/key advantages of using the overarching data thinking framework
are listed below.

• Agile development: While an initial effort is needed to develop the enterprise
data model, the development of each product would be much faster.
• Reuse of data: The same data can be reused across multiple products. This
ensures consistency and better quality of data across products.
• Common business vocabulary: Using a common enterprise data model would
ensure common terminology and definitions of data across the enterprise.
Fig. 5.9 Developing a product leveraging enterprise data model (illustrative)

• Drive innovation across the enterprise: An enterprise data model would enable
data thinking. Viewing a business-process-oriented enterprise blueprint of data
would help in coming up with ideas for data-driven innovations. It would also
help in testing these ideas for feasibility and implementing the qualified ones.
• Business process transformation: Insights from data (by various means,
including applying data science-based process mining techniques to understand
business process performance) can help in business process reengineering as
well as robotic process automation (popularly referred to as RPA). There are
various process mining and RPA tools available today that can be leveraged.
• Redefine value chain and business strategy: The framework can help generate
a few insights and ideas that are so transformational in nature that
the enterprise may consider redefining its value chain, thereby changing its
business strategy.

In this book, I have shared quite a few experiences of enterprises that came up
with innovative data-driven solutions. However, if an enterprise wishes to truly
institutionalize data thinking, it needs to do more than just develop data literacy
and data thinking skills. This is what I will discuss in the next (and last) element
of OCM strategy—data democratization.

5.3.12 4C: Data Democratization

Data democratization is a state of an enterprise in which there is boundary-less flow
of information, complete trust in data, a common data language between employees,
seamless use of data in day-to-day decision-making by everyone, and a culture
of innovation using data-driven solutions. When data democratization is not
limited to the enterprise but extends to its entire partner ecosystem, the enterprise
reaches a state of “data nirvana”. This is depicted in Fig. 5.10.

Fig. 5.10 Data democratization within and beyond the enterprise—“data nirvana”

As illustrated in the figure, both data literacy and data thinking initiatives are
required to support the data democratization aspiration of an enterprise. However,
data democratization requires much bigger institutional changes, the most important
being data governance—you cannot democratize data without strong enterprise
data governance. In Chap. 4, I discussed the need for enterprise data governance
to formally manage an enterprise’s key data entities through well-defined policies,
processes, and organization structure. I also highlighted the role of the technology
and architecture tower (with support from the business tower) of the data and
analytics organization in driving enterprise data governance, especially in the four
key areas listed below.

• Metadata management,
• Master data management,
• Data security management, and
• Data quality management.

Since I discussed these in Chap. 4, I will not repeat them here. I would just like to add that enterprises today have realized, often the hard way, the need to mature their data governance to succeed in their digital initiatives. Hence, data governance has started getting due attention from the executive leadership of most enterprises. However, maturing enterprise data governance is an uphill task and requires persistent focus.
While data governance is an important focus area for data democratization, another important aspect, especially in the context of democratizing (and possibly monetizing) data beyond the boundaries of the enterprise (i.e., involving customers, suppliers, and other partners), is establishing a data marketplace. A data marketplace is a data exchange platform where data providers and data consumers can exchange data (or data products, to be more specific), either free of cost or for a fee. This is illustrated in Fig. 5.11.
Salient features of (and considerations for) a data marketplace are listed below.

• Scope: It is important to define the scope of the data marketplace. The first aspect to define is whether it is for internal use (data exchange only within the enterprise) or for both internal and external use. In the case of external use, in addition to employees, data providers and data consumers can be customers, suppliers, and other partners, who can collaborate in a common data marketplace. The second aspect to define is the scope of data and enabling applications, if any. This relates to the purpose of the data marketplace. Let me share an example here. When COVID-19 was spreading
in India in the year 2020, the government of India wanted COVID-19 testing to
be accessible to all its more than one billion citizens. For a large country like
India, this was a herculean task, especially because the supply chain for test-kits
had not matured. There were quite a few tier-1, tier-2, and tier-3 manufactur-
ers of test kits, enzymes, etc., across the globe. To resolve this challenge, a
data marketplace was established where test-kit supply data of tier-2 and tier-3
manufacturers of kits, enzymes, etc., could be accessed by tier-1 suppliers. This helped in increasing COVID-19 diagnostic test-kit production capacity to a million test kits a day. There are other examples of data marketplaces in different industries, such as nutrition data marketplaces and financial data marketplaces, that are gaining traction.

Fig. 5.11 An illustrative data marketplace
• Incentive: The most important aspect to consider while establishing a data marketplace is the value it will provide to the participating players. Unless there is an incentive for both data providers and data consumers, no one will participate to share or consume data in the marketplace. One incentive for an enterprise is data monetization, wherein the enterprise can share and monetize different types of data that provide good value to various data consumers. I will discuss data monetization further in Chap. 6, while covering the value measurement framework.
• Platform: A data marketplace can be hosted on any cloud platform. Quite a few platforms are available, both from the major cloud vendors (Microsoft, Amazon, and Google) and from other vendors that host their data marketplace platforms on Microsoft, Amazon, or Google cloud.
• Technologies: To establish a data marketplace, other than the platform, various technologies are needed to prepare a data catalog, ensure data security and privacy, enable search and discoverability of data, provide self-service capability, and enable signing of digital data contracts. Various other enabling technologies, such as those for data virtualization and knowledge synthesis (using graph technologies), can enhance the capabilities of a data marketplace. Blockchain technology can be used, if needed, to establish data trust between data-sharing partners. In general, with recent technological advancements, technology is not a constraint for establishing a data marketplace.
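To make these building blocks concrete, here is a minimal sketch of a catalog entry and a digital data contract, reusing the COVID-19 test-kit example. All class names, field names, and values are my own illustrative assumptions, not a reference to any vendor's actual API.

```python
from dataclasses import dataclass, field
from datetime import date

# Illustrative sketch only: a minimal catalog entry and digital data
# contract for a data marketplace. Every name here is an assumption
# made for illustration, not any platform's real interface.

@dataclass
class DataProduct:
    name: str                 # searchable catalog name
    provider: str             # data provider (internal team or partner)
    description: str          # supports search and discoverability
    tags: list = field(default_factory=list)  # catalog metadata
    price_per_month: float = 0.0              # 0.0 => free of cost

@dataclass
class DataContract:
    product: DataProduct
    consumer: str
    start: date
    end: date

    def is_active(self, today: date) -> bool:
        # A contract governs access only within its validity window
        return self.start <= today <= self.end

# Usage: a tier-2 supplier publishes supply data as a data product;
# a tier-1 manufacturer signs a contract to consume it.
supply_data = DataProduct(
    name="test-kit-component-supply",
    provider="Tier2SupplierCo",
    description="Weekly supply capacity for test-kit enzymes",
    tags=["supply-chain", "healthcare"],
)
contract = DataContract(supply_data, "Tier1ManufacturerCo",
                        date(2020, 4, 1), date(2021, 3, 31))
print(contract.is_active(date(2020, 6, 15)))  # True within the window
```

A real marketplace would layer security, privacy, and self-service capabilities on top of such records; the point here is only that catalog entries and contracts are simple, structured artifacts.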

While data marketplaces are being used effectively by a few enterprises, overall they are still at an early stage of adoption. From an OCM strategy perspective, a data marketplace, in conjunction with the other initiatives discussed earlier, can help enterprises democratize data and become truly data-driven.

5.4 Stages of Change and Importance of Change Leadership

Any transformational change, such as the one resulting from a strategic enterprise-
wide data and analytics program, goes through various stages of implementation
and adoption. Any enterprise would face resistance from employees, in addition
to other hiccups, as it goes through the change process. There would be a learning
curve for the enterprise as well.
At a broad level, there are three stages that the enterprise goes through, in
terms of implementation and adoption of change across the breadth and depth of
the enterprise. This is illustrated in Fig. 5.12.
Let me give an overview of the three stages and explain why the relative importance of the three OCM objectives (and consequently of the twelve OCM strategy elements discussed in the previous section), as depicted in the figure, changes over time.

Fig. 5.12 The three stages of organizational change



5.4.1 Stage 1: Prepare and Initiate

This stage starts with defining the OCM strategy (as part of the overall data and analytics strategy), covering all twelve elements. Preparation should also include, inter alia, creating a plan, defining KPIs for measuring success, identifying potential bottlenecks, devising a mitigation strategy for these bottlenecks, identifying the resources required, and securing executive sponsorship. Once preparation is complete, change can be initiated.
An important aspect of initiation is communication. I discussed communication strategy in Sect. 5.3.3 of this chapter. It is important that employees appreciate the need for change; getting initial buy-in from employees will help reduce resistance later. A few quick-win data and analytics projects should also be delivered at this stage. In Sect. 5.3.9, I discussed implementing a new technology with a pilot project, using an MVP (minimum viable product) approach. Early success through quick wins helps establish the credibility of a new data and analytics initiative.
While all three OCM objectives are important, the greater focus at this stage is on becoming a learning organization, followed by innovating and institutionalizing change, in that order. This is depicted in the figure accordingly.

5.4.2 Stage 2: Scale-Up

Once a few quick wins are implemented, it is time to scale up the planned changes across the enterprise. A few select large, high-business-value data and analytics projects should be undertaken at this stage. One would naturally encounter greater resistance from employees. In Sect. 5.3.1 of this chapter, I discussed how to focus on different user personas, based on their analytical needs, to address their concerns. Such an approach will help minimize resistance. Communication and training play a critical role in this stage.
It is important to collect and analyze feedback from employees regularly. It
will help in taking necessary corrective measures. Results and successes should be
communicated across the enterprise. Once some large data and analytics projects
are successfully implemented and success demonstrated to executives as well as
to a large part of the enterprise, and most employees start perceiving change as
positive, one can move on to the next stage.
From the perspective of the three OCM objectives, all three are equally important at this stage, as depicted in the figure. Overall OCM effort increases considerably in this stage, compared to the previous one.

5.4.3 Stage 3: Institutionalize

At this stage, one can go all out to drive enterprise-wide transformational data
and analytics projects. Once this stage reaches a reasonable amount of maturity,

data would be truly democratized across the enterprise. Data literacy and data
thinking efforts undertaken in the first two stages would start yielding results.
Data-driven collaboration would increase, and innovative solution ideas would
come from across various functions of the enterprise (and not necessarily only
from the data and analytics team). New technology adoption would increase and
create substantial business value. Employees would start embracing new ways of
working.
To reach the state of maturity described above, OCM efforts at this stage would be substantially greater than in the earlier stages. More effort would be required on all three OCM objectives. However, comparatively, the greatest effort would be required on the third objective, followed by the second and then the first, as depicted in the figure.
An important point to note at this stage is that some employees will tend to go back to their old ways of working. Hence, even once the desired changes are achieved, the focus should be on sustaining them until they become part of the enterprise’s culture and practices.

5.4.4 Importance of Change Leadership

The importance of change leadership in any transformational change cannot be overstated. In the previous figure, I illustrated how change leadership becomes more critical as an enterprise moves successively from the first stage to the third stage of change. In the following paragraphs, let me discuss change leadership in a bit more detail.
Implementation of any large-scale change takes months, if not years. During this period, employees go through various emotional states. In general, there is always some initial resistance to change. Some employees would be angry with the proposed change and would go to the extent of outright rejection. Others may resist initially but would quickly adapt. In general, the emotional response of employees and the time they take to accept change vary widely. This is illustrated in Fig. 5.13.
The degree and duration of the emotional response to a transformational change (as illustrated in the figure) would vary amongst employees based primarily on three factors, listed below.

1. Background (education, age, culture, etc.) of the employee: The background of an employee, to a large extent, determines how she/he will respond to various types of change. For example, if a change requires learning a new data and analytics tool, people in a younger age group or with a technology background will generally adapt more quickly. Similarly, if a change requires adapting to a new business process, someone who is new to the enterprise will be less reluctant to adopt it than someone who has been working in the enterprise for a long period.

Fig. 5.13 Typical emotional response to change amongst various employees

2. Analytical needs of each user persona: I discussed this aspect in detail in Sect. 5.3.1 of this chapter. If someone’s analytical needs are very basic and the person is not much impacted by, say, a new data and analytics tool, she/he would not resist much if the data and analytics team explains the features of the new tool in an easy-to-understand manner.
3. Position of the employee in the organizational hierarchy: In the figure, the delay in the start of the emotional response is primarily attributed to the level of an employee in the organization structure. Top leaders are generally informed about a transformational change earlier than others. These leaders then communicate it to the middle managers, who in turn communicate it to the rest of the employees. There is often a lag between these communications. Further, the impact of change on various employees may also occur at different times. All these factors lead to a delay in the start of the emotional response, as shown in the figure. One challenge with such lags is that by the time top leaders are in “acceptance” mode, the rest of the enterprise may still be struggling to accept the change. The leaders, instead of empathizing with the others, sometimes fail to understand why others are still struggling. Impatience and inability to empathize may create adverse conditions for the changes to be accepted.

When all the employees are going through various emotions, the role of the change leader(s) becomes critical. A change leader needs to be good not only at OCM planning but also at managing the emotions of employees, to succeed in leading the enterprise on a transformation journey. While I have come across various leadership styles in different enterprises, my experience is that a “transformational leadership” style works best when it comes to data and analytics organizational change management. I see “charisma” as an inherent characteristic of a transformational leader; hence, we can call such a leader a charismatic leader as well.
A transformational or charismatic leader is very passionate, can articulate a compelling vision of the future, can make strong connections with others, and can influence them to overcome their emotional barriers and accept changes quickly. There are various other leadership skills that such a leader must possess; I will discuss these in detail in Chap. 7, under Sect. 7.3.

5.5 Summary

Until 2010, data and analytics in enterprises was treated more as an MIS (management information system) function. In the early 2010s, with the popularity of data visualization tools, data and analytics started getting democratized across the breadth and depth of enterprises. In the latter part of the 2010s, due to the advent of digital technologies, data and analytics became the core focus of digital transformation and a boardroom agenda. However, enterprises soon realized that if they aspired to truly democratize data and analytics and become data-driven, they needed, inter alia, a very strategic organizational change management approach. Such an approach should have four key focus areas, as listed below.

1. People: OCM strategy should focus on understanding the cultural, geographical, and demographical diversity of employees and their different analytical needs.
2. Processes: Here, the focus should be both on defining new processes and new
ways of working to drive innovation at scale across the enterprise.
3. Technology: It is important to focus on both training people on new technolo-
gies and exploring future data and analytics technologies that can make the
enterprise ready-for-future, to maintain competitive advantage.
4. Data: It is important to change the way data is treated, viewed, and managed
in enterprises. It is important to have a data literacy plan for all user personas
and to inculcate “data thinking” amongst all employees of the enterprise, so
that data can be democratized.

Further, OCM strategy should be driven by three key objectives, as listed below.

A. Become a learning organization: A learning organization is one where everyone is involved in analyzing and solving problems, thereby continuously improving the capability and competitiveness of the enterprise.
B. Innovate: Parallel to becoming a learning organization, it is important to
inculcate a culture of innovation across the enterprise.
C. Institutionalize change: To ensure that changes are permanent and widespread
across the enterprise, it is important to take necessary steps to institutionalize
changes.

Taking a Cartesian product of the four focus areas and the three OCM objectives, a set of twelve OCM strategy elements emerges. An enterprise needs to have an OCM plan for each of these twelve elements. Further, the ability to implement change is as important as good planning. Implementation of change goes through three stages: prepare and initiate, scale up, and institutionalize. The relative importance of the three objectives changes as the enterprise goes through these stages. Overall, OCM effort increases considerably during the journey, and the criticality of change leadership increases exponentially. A change leader needs to be good not only at OCM planning but also at managing the emotions of employees during the implementation process. For that, enterprises need a transformational or charismatic leader.
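The Cartesian product of focus areas and objectives described in the summary can be sketched in a few lines; the labels below are taken from the lists above.

```python
from itertools import product

# The four OCM focus areas and three OCM objectives from the summary above.
focus_areas = ["People", "Processes", "Technology", "Data"]
objectives = ["Become a learning organization", "Innovate",
              "Institutionalize change"]

# Their Cartesian product yields the twelve OCM strategy elements,
# one per (focus area, objective) pair.
elements = list(product(focus_areas, objectives))

for area, objective in elements:
    print(f"{area} / {objective}")

print(len(elements))  # 12 elements in total
```

Each of the twelve printed pairs corresponds to one element for which an enterprise needs an OCM plan.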
6 Fifth Element of Strategy—Value Measurement Framework

Establishing a Framework to Define and Measure Value of Data and Analytics Program

6.1 The Need for a Value Measurement Framework

Whenever an enterprise embarks on a strategic data and analytics initiative, there is an initial excitement amongst executives. Expectations from the initiative are high. As a result, getting budgetary approval for a few million dollars may not be difficult. However, a year or so after the start of the initiative, executives start questioning the value they got. They often feel that the value was not commensurate with the investment. There are three main reasons why such concerns arise. These are listed below.

1. Once a strategic data and analytics initiative is started, there is a maturity path that an enterprise goes through, even when the initiative is started with the best possible approach. It takes some time for the right foundational components of data and analytics to be established. Once the foundation is in place, which takes quite some time early on, the enterprise can move up the maturity path faster. I discussed this in Chap. 2, where I talked about the period of disillusionment in the data and analytics roadmap, and the importance of organizational change management and expectation management during this phase.
2. Despite putting together the best possible data and analytics team, there is a
learning curve that the team goes through in the context of the enterprise.
Hence, there would be fewer successes during the first year. The learning
curve can be made steeper to some extent by defining a cross-skilling plan for
the existing data and analytics team early on and leveraging their contextual
knowledge, as discussed in Chap. 5 on organizational change management.
3. Even when projects with potential to make high business impact are imple-
mented, measuring business value is a tricky task. In some cases, there is no
way to measure value directly. Hence, even if a project led to improvement
in business performance, demonstrating business value to executives becomes
difficult.

© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2022
P. Sah, Defining Enterprise Data and Analytics Strategy, Management for Professionals,
https://doi.org/10.1007/978-981-19-5719-2_6

To address the third point above, it is important to establish a value measurement framework that continuously captures and communicates the business value delivered (or planned to be delivered) by various data and analytics projects. This helps sustain the initial euphoria created by a strategic data and analytics initiative and ensures that the initiative does not lose executive sponsorship.
Getting executive sponsorship and budget is not the only reason for establishing a value measurement framework. Another major reason is to ensure that the data and analytics team strives not only for technical, architectural, and operational excellence, but also for achieving the strategic business objectives of the enterprise. In fact, the latter is more important than the former. I have often seen data and analytics teams measure their performance solely based on how efficiently they run their operations. While measuring operational efficiency is important, it is more important to measure performance based on business value delivered.

6.1.1 Data and Analytics Efficiency-Value Matrix (EV Matrix)

Based on the two dimensions of the operational efficiency of the data and analytics team and the business value that the team delivers, I have created a data and analytics efficiency-value matrix (EV matrix), as depicted in Fig. 6.1. Any enterprise can be plotted in this matrix based on the maturity of its data and analytics team on these two dimensions.
In the figure, the x-axis represents business value and the y-axis represents operational efficiency. For the sake of simplicity, I have segmented both axes into “low” and “high” buckets, thereby creating four quadrants, and given each quadrant a unique name. Let me elaborate on them below.

Fig. 6.1 Data and analytics efficiency-value matrix (EV matrix)

• Infant: Enterprises whose data and analytics teams work in silos, operating more as MIS (management information system) functions and delivering basic reports to business stakeholders, fall in this quadrant. MIS functions in an enterprise are treated as cost centers: they just deliver standard reports and some basic analytical capabilities to business stakeholders. Often, business stakeholders request a data dump from such a team and use it to develop their own siloed analytical systems. In today’s digital era, any enterprise whose data and analytics team works mostly as an MIS function will have low team morale, since the team does not get exposure to new digital technologies. Talented people may leave and take up jobs in enterprises where they can gain experience in the latest digital technologies.
Enterprises with their data and analytics team in this quadrant will lag their competitors in gaining competitive advantage through data and analytics. Do note that in some enterprises, the MIS function is very efficient, working as an integrated global entity rather than as silos across BUs/functions/geographies. In such cases, the enterprise falls in the “introvert” quadrant.
• Introvert: I call this quadrant “introvert” because enterprises that belong here have data and analytics teams that are more inward-focused. They strive to improve only internal efficiency and technical excellence. These teams may be working as an efficient MIS function or might even have invested in digital technologies. However, even in the latter case, they are focused only on technical excellence, with very limited focus on how the digital technologies can deliver substantial business value.
There are two major concerns regarding teams in this quadrant. First, they are perfectionists who search for a perfect technical solution for every business problem they come across. They do not realize that certain problems cannot be solved through technical solutions; there are various political and other issues in an enterprise for which a non-technical approach, such as organizational change management, is needed.
Second, for these perfectionists, internal efficiency becomes so important that the larger purpose of their existence, which is to deliver business value, is lost. I am not undermining the need for technical excellence and operational efficiency. But any effort that does not ultimately align with business priorities goes in vain, from an enterprise point of view.
• Hero: Often, in a data and analytics team that has been operating as an “infant”, there emerge a few “heroes” who are keen to deliver some high-business-value projects. They go out of their way to understand business challenges in an area and work closely with the relevant business stakeholders to create a solution that addresses the business problem. Such heroes are driven more by their passion than by any organizational mandate. Enterprises that have such heroes in their data and analytics teams fall in the “hero” quadrant.

If enterprises aspire to institutionalize business value delivery and truly democratize data and analytics, they need more than just a few heroes in the data and analytics team. They need to define a well-thought-through data and analytics strategy and drive execution based on it. They need a data and analytics charter that is derived from the enterprise business strategy.
• Leader: Enterprises in the “leader” quadrant have data and analytics teams that strive to continuously deliver greater business value by becoming both business-focused and operationally efficient. Such enterprises are well on course to become analytical leaders. In the previous chapters, I discussed in detail how the data and analytics team should work closely with their business partners and ideate/innovate with them to solve complex business problems. Such an approach is an absolute must; there is no alternative for achieving business excellence.
Having said that, while the team focuses primarily on delivering business value, the speed, quality, and cost with which they deliver that value should also improve year-on-year. This improvement is brought about by their additional focus on improving operational efficiency and technical excellence. Such a balanced approach helps in continuous improvement of return on investment (ROI) year-on-year. In the last section of this chapter, I will talk about ROI in more detail.
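The quadrant logic of the EV matrix can be sketched as below. It assumes each dimension has already been assessed as "low" or "high" for an enterprise; making that assessment is itself a separate (and harder) exercise.

```python
# Sketch of the EV-matrix quadrant logic described above. The mapping
# follows the quadrant descriptions in this section; the assessment of
# each dimension as "low"/"high" is assumed to be done beforehand.

QUADRANTS = {
    ("low", "low"): "Infant",      # siloed, MIS-style teams, basic reports
    ("low", "high"): "Introvert",  # efficient but inward-focused
    ("high", "low"): "Hero",       # value driven by a few passionate people
    ("high", "high"): "Leader",    # business-focused and operationally efficient
}

def ev_quadrant(business_value: str, operational_efficiency: str) -> str:
    """Map (business value, operational efficiency) to an EV-matrix quadrant."""
    return QUADRANTS[(business_value, operational_efficiency)]

print(ev_quadrant("high", "low"))   # Hero
print(ev_quadrant("high", "high"))  # Leader
```

The aspiration described next, moving from bottom-left to top-right, corresponds to moving from the ("low", "low") key to the ("high", "high") key.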

In the data and analytics efficiency-value matrix, enterprises should aspire to move from the bottom-left quadrant to the top-right quadrant. The path of this transition will vary for different enterprises. Some will traverse upwards before moving rightwards, while others may take a different route. However, irrespective of the path that an enterprise takes, it is important to appreciate that moving up the maturity path is always a journey and not a one-time “big-bang” leap. Sometimes, executives are willing to invest a large sum of money in a data and analytics program to achieve a giant leap. However, it never works that way. One can only accelerate the speed of change, not take a quick giant leap, because of the inherent complexities of a large, global enterprise that I have been discussing in this book.
Irrespective of the speed with which an enterprise traverses its data and analytics maturity path, establishing a value measurement framework helps ensure that the journey is always steered in the right direction, is continuously well oiled through sponsorship, and creates competitive advantage for the enterprise. In the next two sections of this chapter, I will discuss each of the two dimensions of the EV matrix in detail, including how a value measurement framework can be established. In the final section, I will explain how to calculate the return on investment from data and analytics investments.

6.2 Defining and Measuring Business Value

In a data and analytics project lifecycle, the process of defining and measuring business value needs to start right from the demand management process, i.e., when a request comes to the data and analytics team from a business function or when the data and analytics team proactively takes a proposition to their business partners. In many cases, a formal business case is prepared during the demand qualification stage, wherein potential business benefits (quantified and/or unquantified) are identified and evaluated. However, even for projects for which an initial business case was prepared, I have not seen many enterprises put in the effort to systematically capture business value during project execution or after it. While enterprises do diligently track projects for timely and on-budget execution, a formal means of capturing and measuring business value is often missing. This lack of focus is one reason why many enterprises move only vertically upwards, instead of moving in the northeast direction, in the data and analytics efficiency-value matrix.
When I asked data and analytics leaders of various enterprises why they do not formally capture the business value resulting from data and analytics projects, they gave several reasons. One common reason, however, was that it is very difficult to quantify such business value. In my personal experience too, quantifying the business value that can be attributed to a data and analytics project is never a straightforward task. However, if one adopts a systematic approach, one can measure and demonstrate value in a reasonably accurate manner. In this section, I will discuss such an approach.
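One possible shape for such systematic capture is a simple value-tracking record per project, maintained from the business case through post-implementation. The structure, project names, and numbers below are purely illustrative assumptions.

```python
from dataclasses import dataclass

# Illustrative sketch of a value-tracking record: planned value is
# captured at demand qualification, realized value is updated during
# and after execution, so business value is measured across the full
# project lifecycle. All names and figures are invented for illustration.

@dataclass
class ValueRecord:
    project: str
    impact_area: str          # e.g. revenue increase, cost reduction
    planned_value: float      # from the business case (currency units)
    realized_value: float = 0.0   # updated during and after execution

    def realization_ratio(self) -> float:
        """Fraction of planned business value realized so far."""
        return self.realized_value / self.planned_value

portfolio = [
    ValueRecord("cross-sell analytics", "revenue increase", 2_000_000, 1_500_000),
    ValueRecord("dynamic pricing pilot", "revenue increase", 1_000_000, 250_000),
]

total_planned = sum(r.planned_value for r in portfolio)
total_realized = sum(r.realized_value for r in portfolio)
print(f"Portfolio realization: {total_realized / total_planned:.0%}")  # 58%
```

Rolling such records up at the portfolio level gives executives a continuously updated view of planned versus realized value, rather than a one-time business case.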
In Fig. 6.2, I have illustrated four areas where data and analytics projects can make a positive business impact in an enterprise, along with a few examples in each area.
Before I go into detail on each of these four areas, I want to highlight that most of the examples mentioned in the figure create a positive impact on more than one area. For example, while a “sustainability” initiative helps in a company’s image building, it can also positively impact one or more of the other three areas, viz. revenue increase, cost reduction, and business risk mitigation (depending on the type of sustainability initiative). You will notice this multi-area impact in many of the other examples as well, in the following paragraphs. However, to keep it simple, I have classified each example under the area on which it has the biggest impact.

6.2.1 First Impact Area: Revenue Increase

Everyone in an enterprise agrees that analytical solutions such as cross-sell analytics and sales-incentive analytics have a direct and positive impact on revenue. However, when it comes to quantification, there would be a difference of opinion on what percentage of the revenue increase can be attributed to data and analytics initiatives. There would be cases where the degree of correlation between an analytics solution and a revenue increase can be established with conviction. But there would be many other cases where it may be difficult to establish causality for the correlation.

Fig. 6.2 Four areas where data and analytics projects make a positive business impact
In cases where there is a difference of opinion on the exact percentage of revenue increase that can be attributed, one should calculate the revenue impact for three scenarios: best case, worst case, and normal case. I will describe this with an illustration in the last section of this chapter, while explaining how to calculate return on investment for data and analytics projects.
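As a sketch of the three-scenario calculation, the snippet below computes attributed profit and ROI under worst, normal, and best-case attribution. All numbers are invented purely for illustration; in practice, the attribution percentages would be agreed between the data and analytics team and business stakeholders.

```python
# Three-scenario revenue attribution, as described above. Every number
# here is an illustrative assumption, not data from any real project.

investment = 500_000            # cost of the analytics project
revenue_increase = 20_000_000   # total observed revenue increase
margin = 0.3                    # profit margin on the incremental revenue

# Agreed share of the revenue increase attributable to the project
attribution = {"worst": 0.05, "normal": 0.10, "best": 0.20}

for scenario, share in attribution.items():
    attributed_profit = revenue_increase * share * margin
    roi = (attributed_profit - investment) / investment
    print(f"{scenario:>6} case: attributed profit = {attributed_profit:,.0f}, "
          f"ROI = {roi:.0%}")
```

Presenting all three scenarios side by side lets executives see the range of plausible outcomes instead of debating a single disputed attribution figure.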
Let me now explain, with the help of a few examples (also illustrated in the figure), how one should approach quantifying benefits to a reasonable level of accuracy. The examples I have chosen are high-focus areas for most enterprises today.
Dynamic pricing: Pricing (of a product or a service) plays an important role in
maximizing both top-line and bottom-line of an enterprise’s financial performance.
Enterprises are always interested in understanding how a purchaser would respond to
an alternate price for a product or a service. If they have a good understanding of
6.2 Defining and Measuring Business Value 133

this, they can price their products and services in a manner that can help improve
their profitability, without losing any customer. Because of its criticality to busi-
ness performance, pricing strategy is formulated/reviewed at an executive level in
all enterprises. Data and analytics play a key role in formulating pricing strat-
egy. They especially help in developing dynamic pricing models, which almost all
enterprises are either adopting or aspiring to adopt. For those of you who are not
familiar with dynamic pricing, let me describe it below, starting with traditional
pricing models.

Traditionally, enterprises have been using a cost-plus pricing approach, wherein they calculate
their standard costs (for manufacturing and/or selling a product, or for providing a service
to a customer) and then add their desired profit margin to the total cost to arrive at a list price.
They then give standard or special discounts, based on the customer, as needed to sell their
product/service. Sometimes, they get into long-term contractual agreements with cus-
tomers who agree to purchase a high volume of product/service, and offer such customers
payback schemes. Enterprises have been using various means like these to ensure that
their customer churn rate (i.e., the rate of losing customer base) is low and that new customers
are also attracted. Many times, such decisions (of offering discounts or other schemes) are
taken by the sales team based on gut feeling.
Enterprises realized that the traditional cost-plus pricing approach does not take into con-
sideration the fact that different customers have different price elasticities, i.e., the willing-
ness to pay a price for a certain product varies from customer to customer. This is based
on the supply-and-demand microeconomic model for price determination. To come up with
a price elasticity model, demand models are developed based on the nature of demand.
However, due to complexities such as data unavailability and high data volume, often the
models are developed at a customer-segment level rather than the individual-customer level.
Also, these models do not adapt and change on a real-time basis.
Over the last few years, many CEOs have been aspiring to move towards a dynamic pricing
model, wherein they can determine price elasticity of each customer on a real-time basis
using AI/ML (artificial intelligence/machine learning) models and set price accordingly.
The airline and hotel industries were pioneers in this, but enterprises in most other industries
today are aspiring to adopt (or have already adopted) a dynamic pricing model to benefit
from real-time data-driven pricing optimization. Data and analytics play the most important
role in dynamic pricing. There is a common misconception that dynamic pricing is
only relevant for B2C (business-to-consumer) companies. This is not true. Dynamic pricing
is equally relevant today for B2B (business-to-business) and B2B2C (business-to-
business-to-consumer) companies.

Calculating the revenue impact of a pricing model enabled by data and analytics is
straightforward mathematics. To do this, the data and analytics team should leverage
the data that they already possess. After implementing a solution for dynamic
pricing model, the team can calculate the price at which a product/service would
have been sold to a customer (had there been no dynamic pricing model in place)
in a cost-plus model. They can do this based on historical data analysis. They
can then compare the calculated/derived price with the actual price at which the
product/service was sold using the dynamic pricing model. The difference between
these two is a direct measure of the revenue increase resulting from the dynamic pricing
solution. No one will raise a concern when such a calculation is fact/data-based.
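The comparison described above can be sketched in a few lines of Python. This is a minimal illustration, not the book's own code; all field names and figures below are hypothetical.

```python
# Sketch: revenue uplift from dynamic pricing versus the old cost-plus model.
# For each historical sale, compare the actual (dynamic) price with the price
# the cost-plus model would have charged (standard cost + target margin).

def cost_plus_price(standard_cost, target_margin):
    """Price the old cost-plus model would have set."""
    return standard_cost * (1 + target_margin)

def revenue_uplift(sales, target_margin=0.20):
    """Total extra revenue attributable to dynamic pricing."""
    uplift = 0.0
    for sale in sales:
        baseline = cost_plus_price(sale["standard_cost"], target_margin)
        uplift += (sale["actual_price"] - baseline) * sale["quantity"]
    return uplift

# Hypothetical historical sales records
sales = [
    {"standard_cost": 100.0, "actual_price": 128.0, "quantity": 50},
    {"standard_cost": 100.0, "actual_price": 125.0, "quantity": 80},
]
print(round(revenue_uplift(sales), 2))
```

In practice, the baseline price would come from the enterprise's standard cost data, and the comparison would run over the full historical sales record rather than two sample rows.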

New business model: Enterprises keep exploring new business models to remain
relevant in the marketplace. One such example in recent years is of B2B com-
panies that, to become more customer-centric, are shifting from a B2B to a B2B2C
business model. They do this by, amongst other things, becoming more data driven.
Capturing end-customer data not only enables initiatives such as dynamic pricing,
which have a direct revenue impact, but also helps to capitalize on various oppor-
tunities to provide value to customers. Through such opportunities, enterprises
can either increase an existing revenue stream or create a completely new one.
However, it may require changing the existing business model.
In Chap. 3 of this book, I shared an example of business model change, while
talking about servitization, where tire manufacturers are selling tires-as-a-service
instead of selling tires upfront to airlines. This not only provides more value to
airlines, but also helps generate more revenue for the tire manufacturers over the
lifecycle of a tire.
To quantify the benefit in such cases, where the revenue increase is solely
because of the business model change enabled by data and analytics, one can
calculate the increase in revenue for each item (a tire in this case) sold. Based
on historical data, one can calculate the revenue that the tire manufacturer would
have got had the tire been sold one time with an upfront invoice to the customer.
Thereafter, one can calculate the actual revenue over the life cycle of a tire, based on
invoices raised on selling tires as a service. The difference between the two will
give a direct measure of revenue increase for each tire. On adding up this increase
for all the tires, the total revenue impact can be calculated. One may argue that getting
actual revenue over the life cycle of a tire will take several years (that the tire lasts)
and waiting for so many years to quantify revenue increase benefit may not excite
executives. This is a valid argument. Hence, one should use forecasting techniques
to project the revenue over the life cycle of a tire, so that quantified benefit can be
demonstrated early on.
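The per-tire calculation above can be sketched as follows. The upfront price, monthly fee, and tire life are hypothetical figures, and a real model would use a proper revenue forecast rather than a flat monthly fee.

```python
# Sketch: revenue increase per tire under a tires-as-a-service model.
# upfront_price: what a one-time sale would have fetched (from historical data).
# monthly_fee and expected_life_months: used to project lifecycle revenue early,
# instead of waiting years for actual invoices. All figures are hypothetical.

def lifecycle_revenue(monthly_fee, expected_life_months):
    """Projected service revenue over the life of one tire."""
    return monthly_fee * expected_life_months

def uplift_per_tire(upfront_price, monthly_fee, expected_life_months):
    """Projected extra revenue per tire versus a one-time upfront sale."""
    return lifecycle_revenue(monthly_fee, expected_life_months) - upfront_price

# Hypothetical figures: $1,500 upfront vs $40/month over a 48-month tire life
print(uplift_per_tire(1500.0, 40.0, 48))
```

Summing this uplift over all tires sold as a service gives the total revenue impact described in the text.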
Let me share another interesting example of business model change from
automotive industry.

This example is not just a business model change but is about getting into an altogether new
business itself. Automotive companies today are planning to create a new revenue stream by
getting into the car insurance business, an area that has traditionally always been taken care of
by insurance companies. Automotive companies believe that they can provide better value
to customers, by providing insurance at a cheaper cost to drivers whose driving styles are
safe.
More and more electronics are getting embedded into cars that are being manufactured
today. These cars have a lot of sensors that regularly monitor various parameters while the
car is in motion. The data from these sensors are often collected online (using IoT tech-
nologies) or offline (when the car arrives at a service centre). By applying analytics to the
data thus collected, one can get a good understanding of the driving style of a driver. Of
course, one needs customer consent to collect and use this data. Customers are happy to
give consent once they see the ensuing value. Lower insurance cost is one such value for
good car drivers. This is a win-win situation for both automotive companies and drivers. It
is also good for enhancing road safety in general, since it would provide an incentive to all
drivers to drive safely.

Experts believe that, for the automotive companies, revenue from selling car insurance will
very soon become a major portion of their total revenue. Since this new revenue stream will
be generated primarily by data and analytics, one can calculate the direct revenue impact
without any problem.

Data monetization: Data monetization in a broader context can mean leveraging
data to improve business performance. Examples include cross-sell analytics,
dynamic pricing, etc., that I discussed earlier. However, if one were to use
the term “data monetization” in its literal sense, it means selling data to generate
a new revenue stream. Here, I am referring to “data monetization” in its literal
sense.
One of the best examples of data monetization is of market research compa-
nies that sell market research reports based on analysis of data collected from
various sources. Another interesting example of direct data monetization is from
the telecommunication industry, where some telecom companies sell analyses done
on customer data (collected from Web-browsing history, app usage, and location
history of their customers) to third parties for developing custom products or for
targeted advertisements. Calculating direct revenue impact in data monetization
cases is a straightforward task. No one can argue against the revenue impact that
data and analytics have in such cases.

6.2.2 Second Impact Area: Cost Reduction

Till now, I have discussed how, though it is difficult to calculate revenue increase
resulting from a data and analytics project, it is not impossible to do so if one
adopts a structured approach. When it comes to calculating cost reduction
resulting from a data and analytics project, the task is comparatively easier. Let me
explain this with the help of a few examples.
Productivity gain: There are two broad areas where productivity gains result from
a data and analytics project. These are listed below.

1. Time saved in manual data crunching: If you ask any mid- or senior-level
employee in an enterprise about the amount of time that they spend in crunching
data manually, you will learn that they spend a substantial number of hours per
week in data collection, cleaning, standardization, and summarization. They do
most of these activities in spreadsheets. For various meetings as well as for
reporting upwards, the data or reports that they get from the existing systems
are generally not good enough to be used as-is. Hence, they need to spend time
in preparing data and reports to suit their needs.
While advising my customers on their data and analytics strategy, I generally
conduct an initial dipstick survey to get the pulse of various business stakehold-
ers. I talked about dipstick surveys in Chap. 2, while describing the approach
for enterprise churning (“samudra manthan”). In the various surveys that I do
for executives, business managers and analysts, I always include a question
requesting them to share the approximate number of hours per week that they spend
in crunching data manually. An analysis of the past responses that I got from
various enterprises shows that the business stakeholders spend between 4 and
16 h per week, i.e., between 10 and 40% of their productive time (assuming 40
working hours per week). In many cases, even executives were spending more
than 8 h per week in crunching data manually. This is a waste of their precious
time.
For the same enterprises for whom I defined data and analytics strategy, I did a
repeat survey (for some of the enterprises) after a year or so of implementation
of a few pilot projects. Such surveys were done for business stakeholders for
whom the projects were implemented, as per the roadmap defined in the data
and analytics strategy definition phase. I included the same question that was
asked earlier, i.e., the approximate number of hours per week that they need
to spend post the implementation of the new data and analytics solution (that took
care of most of the manual work that they were doing earlier). An analysis of
their responses showed that the manual time spent in crunching data came down
to between 5 and 10% of productive time. This means that, on average, they saved
between 50 and 75% of the time that they spent earlier in crunching data manually.
To calculate the cost reduction benefit (in dollar terms) in such cases, one can do
simple mathematics, i.e., (approximate hourly wage for each level of business
stakeholders) × (average number of hours saved for the level) × (approximate
number of business stakeholders at the level). Since the approximate hourly
wage for each level varies by region/country, one can do this calculation for
the stakeholders in each region/country. The business stakeholders can use the
extra time that they thus gain in various productive activities.
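The formula above can be sketched as a small script; the wage rates, hours saved, and headcounts below are hypothetical, and a real calculation would use the enterprise's own figures per level and region.

```python
# Sketch of the cost-reduction formula above:
# (hourly wage for the level) x (hours saved per week for the level)
# x (headcount at the level), summed per level/region and annualized.
# All figures are hypothetical.

def annual_savings(levels, weeks_per_year=48):
    """Dollar value of manual data-crunching time saved per year."""
    total = 0.0
    for lvl in levels:
        total += (lvl["hourly_wage"]
                  * lvl["hours_saved_per_week"]
                  * lvl["headcount"]
                  * weeks_per_year)
    return total

# Hypothetical stakeholder levels by region
levels = [
    {"region": "NA", "level": "analyst",   "hourly_wage": 40.0,  "hours_saved_per_week": 6, "headcount": 120},
    {"region": "NA", "level": "manager",   "hourly_wage": 70.0,  "hours_saved_per_week": 4, "headcount": 40},
    {"region": "EU", "level": "executive", "hourly_wage": 120.0, "hours_saved_per_week": 3, "headcount": 10},
]
print(annual_savings(levels))
```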
2. Making business processes more efficient: Making the right data/insight avail-
able at the right time helps in saving non-value adding steps in a business
process. Let me explain this point with the example of breakdown maintenance
of equipment. If any equipment, say a diesel generator installed in a manu-
facturing site or a customer site, goes out of order, a technician goes to the
equipment, opens it up, and tries to diagnose the cause of the failure. Typically,
to diagnose the cause, the technician goes through a pre-defined checklist and
based on it, verifies various components for damage. For a large piece of equipment,
this exercise can take hours, which not only leads to a high cost of maintenance
but also results in a loss of productivity of the equipment.
To address the above challenge, one can develop an AI/ML-based self-learning
data and analytics solution. The solution collects and stores data for all the past
failures of the equipment or other similar equipment. Such data includes the
problem(s) that occurred, process parameters when the equipment failed, the
cause of failure that was diagnosed subsequently, and the remedial action(s)
taken to help rectify the problem(s). Such historical data is then used to build
an AI/ML-based recommendation engine that creates relationship between dif-
ferent data patterns and recommends the best possible actions, in a specified
order of priority, that can help in rectifying the failure quickly. With such a
solution in place, diagnosis and corrective action can be completed much faster
than what a technician can do using a standard checklist. It can help in get-
ting rid of all the non-value adding steps of going through the entire checklist.
Most such AI/ML-based solutions being developed today have a self-learning
algorithm, i.e., as more and more equipment data is collected, the algorithm
becomes more accurate in its recommendations.
In the above example (or other similar examples), it is very easy to quantify
the cost saving and other benefits resulting from data and analytics. A few key
potential benefits are summarized below.
– Maintenance cost saving, which equals (number of hours of maintenance
cost saved) × (standard maintenance cost per hour, that includes labor and
other costs, as defined by the company).
– Avoidance of equipment downtime cost, which is the cost to the company
for each hour of downtime of the equipment. This is the non-maintenance
cost that is incurred by the company when an equipment is down. Examples
include loss of production, penalty to a customer, etc.
– Non-quantifiable benefits, such as increase in customer satisfaction (if the
equipment was in customer’s site). Such benefits, even if they cannot always
be measured in dollar terms, should be captured for value articulation.
Quantifiable costs above can be calculated by comparing MTTR (mean time
to repair) data, with and without a data and analytics solution. Since historical
data is generally available with the data and analytics team, they can check
MTTR for each type of failure, when there was no data and analytics solution
in place. They can then compare the same with MTTR for the corresponding
failure after an AI/ML-based data and analytics solution was implemented. The
difference can help in quantifying various benefits.
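The MTTR-based quantification described above can be sketched as follows; the failure types, rates, and counts are hypothetical stand-ins for an enterprise's historical data.

```python
# Sketch: maintenance savings from MTTR reduction, per failure type.
# mttr values are in hours, before vs after the AI/ML diagnosis solution.
# maint_rate = standard maintenance cost per hour (labor and other costs);
# downtime_cost = non-maintenance cost per hour of equipment downtime
# (lost production, customer penalties, etc.). All numbers are hypothetical.

def savings_per_failure(mttr_before, mttr_after, maint_rate, downtime_cost):
    """Dollar saving for one failure incident, from repairing it faster."""
    hours_saved = mttr_before - mttr_after
    return hours_saved * (maint_rate + downtime_cost)

def total_savings(failures):
    """Sum savings over all failure types, weighted by incident count."""
    return sum(
        savings_per_failure(f["mttr_before"], f["mttr_after"],
                            f["maint_rate"], f["downtime_cost"]) * f["count"]
        for f in failures
    )

failures = [
    {"type": "pump seal",  "mttr_before": 8.0, "mttr_after": 3.0,
     "maint_rate": 90.0, "downtime_cost": 500.0, "count": 12},
    {"type": "controller", "mttr_before": 5.0, "mttr_after": 2.0,
     "maint_rate": 90.0, "downtime_cost": 500.0, "count": 7},
]
print(total_savings(failures))
```

Non-quantifiable benefits, such as customer satisfaction, would be captured separately for value articulation, as noted above.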
While I used the above example (of equipment breakdown maintenance) to
explain how making the right data/insight available at the right time helps in
saving non-value adding steps during the maintenance process, I have seen
many similar examples in other business processes across various industries. I
described one such example in Chap. 3, where I explained how data and
analytics helped a chemical producer in selecting an optimal chemical treatment
program for oil wells during day-to-day business operations of oil fields.
In recent times, a new technology area called RPA (robotic process automa-
tion) has emerged that focuses on automating non-value-adding/repetitive
tasks in various business processes by using software robots, also popularly
known as bots. Further, newer focus areas such as intelligent automation (that
uses, inter alia, artificial intelligence and machine learning) promise to not only
save costs by making business processes faster, but also to make the processes
more effective. They promise to automate decision-making itself, with minimal
human intervention. All such innovative solutions require a very strong data and
analytics foundation. From value measurement perspective, one needs to cap-
ture how data and analytics is enabling cost-saving and other benefits through
all these new areas.

Quality improvement: Quality of products or services delivered to customers is
a key focus area for any enterprise. Various six sigma and other initiatives are
undertaken to ensure that the quality of product/service meets the standards laid
down by the enterprise and/or by the customer. Data and analytics play an impor-
tant role in improving quality of both products and services. Better quality means
lower wastes, lesser rejections, higher productivity, and lower cost to company.
This is the reason why I have put quality improvement under the category of cost
reduction. However, better quality (of product/service) also has both direct and
indirect impact on the other three areas, viz. revenue increase, business risk mit-
igation, and company’s image building. For those of you who are not conversant
with cost of quality, let me give an overview below, that also explains how data
and analytics help in improving quality and reducing cost of quality.
All enterprises measure and track cost of quality as an important KPI. This
is done not just to control costs, but also to improve customer perception about
the product/service delivered. For both products and services, cost of quality is
commonly categorized into four types, as listed below.

(a) Prevention cost, which is the cost incurred to prevent/minimize failures.
(b) Inspection/detection cost, which is the cost incurred to conduct various
inspections, assessments, and audits.
(c) Internal failure cost, which is the cost incurred due to failures/defects before
a product/service is delivered to the customer.
(d) External failure cost, which is the cost incurred due to failures/defects after
a product/service is delivered to the customer.

While the sum of “a” and “b” is referred to as cost of good quality (COGQ), the
sum of “c” and “d” is referred to as cost of poor quality (COPQ). Cost of quality
(COQ) is the sum of all the four costs, i.e., sum of COGQ and COPQ, as shown
by the equation below.

COQ = COGQ + COPQ

As enterprises invest more in prevention and detection measures, COGQ
increases and failure costs, viz. COPQ, reduce. On the other hand, if prevention
and detection investment is reduced, COGQ goes down and COPQ increases. This
relationship is quite intuitive.
High COPQ, especially if it results from external failure costs, has an adverse
impact on an enterprise’s reputation. Therefore, adequate investment in COGQ is
required to be made to minimize COPQ. However, if one strives to reach zero
COPQ, the investment required in COGQ grows exponentially. Hence, there
is always an endeavor to arrive at an optimal quality level of a
product/service, at which total cost of quality (COQ) is minimized.
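The trade-off between COGQ and COPQ can be illustrated with a toy model; the cost curves below are illustrative assumptions (rising COGQ with spend, falling COPQ), not figures from any real enterprise.

```python
# Sketch of the COQ trade-off: as prevention/detection spend rises, COGQ rises
# and COPQ falls; total COQ = COGQ + COPQ has a minimum at some quality level.
# The curve shapes and constants below are illustrative assumptions.

def coq(spend):
    """Total cost of quality for a given prevention/detection spend."""
    cogq = spend                              # cost of good quality: the investment
    copq = 1_000_000 / (1 + spend / 10_000)   # failure costs fall as spend rises
    return cogq + copq

# Scan a range of spend levels and pick the one that minimizes total COQ
best_spend = min(range(0, 500_000, 1_000), key=coq)
print(best_spend, round(coq(best_spend)))
```

With these illustrative curves, the minimum total COQ sits well above zero spend and well below the maximum, which is exactly the "optimal quality level" argument made above.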
Data and analytics help in reducing both COGQ and COPQ, thereby reducing
total COQ. They do so in broadly two ways, as mentioned below.

1. Monitoring processes by capturing relevant data in the entire value chain (from
product/service design to delivery to customer) and identifying areas (within
each of the four cost categories—“a”–“d”) where costs can be reduced, or waste
can be avoided. Techniques such as statistical process controls help in quality
control by collecting process data on a real-time basis (using sensors and IoT
technologies) and getting insights to avoid any poor quality/waste. Using these
insights, various process improvement initiatives can also be undertaken for
quality improvement.
2. Conducting root cause analysis of failures/wastes and providing insights to pre-
vent them in future. For example, for a manufacturing industry, one can collect
all the relevant process parameters and quality data from various processes—
product design, raw material inspection, in process inspection, final inspection,
post-sales warranty issues resolution, service reports, and customer complaints,
and use them to identify root cause(s) of failures and rejections. Such analyses
can provide very specific insights. I am quoting a couple of examples below.
– In company A, customer complaints about bad quality of a product were being
raised regularly. On initial investigation, it appeared that due care was not
being taken while manufacturing the product. However, despite tightening qual-
ity control at the shop floor, the issue was not resolved. Hence, a detailed
root cause analysis using data and analytics was done. The analysis revealed
that the quality of a raw material, received from an approved supplier and
tested as per the defined sampling plan, was the root cause of the problem.
Once this came to light, the raw material sampling plan for incoming inspec-
tion was made more stringent and the issue was taken care of with very little
extra effort. Additionally, the matter was discussed with the supplier, who
also took necessary preventive measures.
– In company B, a premium product failed to perform in certain environmen-
tal conditions for no apparent reason. On detailed root cause analysis using
data and analytics, it emerged that there were certain design flaws in the
product. Making necessary changes in the design took care of the issue
permanently.
Quantification of benefits resulting from quality improvement initiatives is pos-
sible, even though it can be a little difficult sometimes. The data and analytics
team normally has all the past defect/failure data. Hence, the best way to quantify
benefits is to use this data to extrapolate and then compare with the reduced
defects/failures resulting from data and analytics initiatives. This can help in
calculating cost reduction in dollar terms. Other direct and indirect benefits
should also be summarized while doing this exercise.
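The extrapolate-and-compare approach can be sketched as follows; the production volume, historical defect rate, and cost per defect are hypothetical.

```python
# Sketch: quantifying quality-improvement savings by extrapolating the
# historical defect rate over current production volume and comparing with
# the actual defects observed after the initiative. All figures hypothetical.

def quality_savings(units_produced, historical_defect_rate,
                    actual_defects, cost_per_defect):
    """Dollar saving from defects avoided versus the historical trend."""
    expected_defects = units_produced * historical_defect_rate
    return (expected_defects - actual_defects) * cost_per_defect

# 500k units; historical 2% defect rate; 6,000 actual defects; $35 per defect
print(round(quality_savings(500_000, 0.02, 6_000, 35.0), 2))
```

The cost per defect here would itself be built from the internal and external failure cost categories ("c" and "d") defined above.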
I want to highlight an important point regarding the difference between prod-
ucts and services. Between them, it is always more difficult to understand and
measure cost of quality of services. This is because of the intangible nature of
services. An interesting difference between product and service is that service
is consumed simultaneously with production, making preventive intervention (for
quality improvement) very difficult. Let me share an interesting experience, from
a multinational company, on how service quality was improved using data and
analytics.

This multinational company is in the business of providing hygiene solutions and services to
restaurants, hospitals, food retailers, food processors, and various other establishments. The
company also provides consulting services to them to improve the hygiene of their facilities,
thereby helping them to comply with the local regulatory requirements.
The nature of the company’s business demands a highly qualified field service force, who
can advise customers on hygiene practices and provide them solutions, leveraging premium
products that the company manufactures. Hence, many employees in the field service force
hold Ph.D.s in industrial hygiene, food hygiene, and other similar areas.
Since field service is so critical for their business, the company always tries to improve their
service and provide more value to their customers. Towards this objective, in the late 2010s,
the company identified and implemented various initiatives, as listed below.

(a) They established a data platform called customer 360, that entailed creating a 360-degree
view of customer data by bringing in data from across various enterprise transactional
systems.
(b) They built new field service applications on top of the customer 360 data platform to pro-
vide easy access to all relevant customer data for the field service team during any cus-
tomer interaction. This meant that the team did not have to waste any time in collecting
data and had all the required data to serve their customers quicker and better.
(c) They created functionality in the new field service applications that allowed the team
to capture more data related to specific customer pain areas and the solutions that were
provided during their field visit to the customers. Earlier, in the old applications, all
such data was captured as free-flowing text in service reports, which did not provide
much value from an analytics perspective. The structured data captured in the new field
service applications was brought into the customer 360 data platform, so that analytics
could be done on it. This helped the company in several ways, such as providing quick
reference to service teams across different territories on similar customer pain areas and
the corresponding solutions provided, helping the corporate contracting team to demonstrate
value delivered by the company during annual contract renewal with customers, etc.
(d) They added new value-added services for customers, such as conducting food safety
audits as per regulatory norms, providing hygiene trainings, etc. All this was possi-
ble because relevant data and content, including procedures, guidelines, and templates,
were now stored in the centralized customer 360 data platform.
(e) They developed a centralized/global service level commitment engine. Earlier, service
level (i.e., number of service visits to be paid, types of services to be performed, etc.) for
various customers was being decided by the local service teams within each territory,
based on the information that they had. However, with the customer 360 data platform, it
was possible to develop a centralized engine to recommend service levels, based on dif-
ferent types of customers. This was a more scientific approach and helped the company
in maximizing the overall service level within the available resource constraints.
(f) Finally, they defined and measured additional key performance indicators (KPIs) for
improving service as per globally laid down standards. Since a lot of additional data
was now being captured, it was possible to define and measure these new KPIs. Also,
these KPIs could now be tracked daily, since all the service data was getting refreshed
in the data platform regularly.

As you can see, data and analytics formed the core for driving all the above initiatives.
The initiatives helped in redefining customer processes and experience, leading to both cost
reduction and high-quality value delivery to customers. Each initiative delivered both tan-
gible and intangible benefits to the company. The tangible ones were quantified, and a fair
proportion of the value was attributed to data and analytics. During such an exercise (of
attributing value), there can be some difference of opinion on agreeing to one single per-
centage number. However, when the value delivered is high and data and analytics form the
core enabler of the initiative, even attributing 50% of value (which is on the lower side)
provides enough justification for return on investment in data and analytics.

Working capital reduction: CFOs in capital-intensive industries are always keen
to reduce working capital to lower the cost of capital for the enterprise. Data and
analytics play an important role in identifying elements within current assets and
current liabilities (which are the two constituents of net working capital) that can
be optimized to reduce net working capital. Data and analytics can provide specific
insights and recommendations to optimize certain elements of current assets and
current liabilities, so that net working capital can be reduced without putting the
business at risk.
A couple of examples of working capital reduction are inventory optimization and
accounts receivables management, where data and analytics help in identifying
obsolete inventory, recommending optimal safety stock, providing insights into
accounts receivables aging, and so on. Let me share an interesting example of
inventory optimization from automotive industry.

The company is a global manufacturer of automotive vehicles. For any automotive company,
selling service parts after the vehicles are sold is both a profitable business and an impor-
tant focus area for ensuring customer satisfaction. Following are some of the statistics for
this company for its after-market business.

• Each vehicle model contains thousands of parts and the company had many models in
the market. Hence, they had to manage more than 2 million SKUs (stock keeping units)
to meet demand of various service parts in different regions across the globe.
• The company sourced most of the service parts from more than 45,000 suppliers across
the globe.
• On the distribution side, they had 25 warehouses across the globe to stock service parts
that were ordered/consumed by more than 6000 retailers/service centers in different
countries.
• There were more than 600,000 different transportation routes in their supply chain.
• As the average age of their vehicles in the market was increasing, spare parts and service
business, as a percentage of total revenue, was also increasing.

For the company, it was getting very difficult to predict demand for service parts, since there
was a high degree of randomness in the demand pattern. This randomness meant
that high buffer stock had to be maintained at the warehouses or with the retailers/service
centers for more than 2 million SKUs. If buffer stock was insufficient, parts would run out of
stock leading to low customer satisfaction. In such scenarios, where an important part went
out of stock and a customer needed it urgently, the company had to ship the part quickly by
air freight. Both the options (i.e., maintaining high buffer stock and shipping by air freight)
were expensive for the company. The company ended up maintaining a high level of buffer
stock, which increased their working capital substantially.

To address the challenge of balancing cost and customer satisfaction at an optimum level, a
data and analytics solution was developed that collected the various data required to develop a
simulation model that could help in better service parts planning for the company’s global
distribution network. The simulation model identified optimal planning parameters for max-
imizing service level (availability of service parts) and minimizing inventory holding costs.
It helped in planning buffer stocks smartly for each SKU, based on various market dynam-
ics. Hundreds of terabytes of data were processed by the simulation model to come up with
the recommendations.
The above solution helped the company save more than 5 million dollars per year. Calcu-
lating this saving was easy. One needed to calculate the reduction in inventory carrying and
other costs for maintaining the same service level. All the data required for the calculation
was available in the data platform. It only required someone within the data and analytics
team, with very good business understanding, to do the mathematics.
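The calculation described above can be sketched in a few lines. All figures and rates below are hypothetical, not the company's actual numbers; in practice they would be derived from SKU-level data already in the data platform.

```python
# Hypothetical sketch of the inventory-saving calculation described above.
# All figures are illustrative.

def annual_saving(old_inventory_value, new_inventory_value,
                  carrying_cost_rate, old_air_freight, new_air_freight):
    """Saving = reduction in inventory carrying cost plus reduction in
    expedited air-freight spend, at the same service level."""
    carrying_saving = (old_inventory_value - new_inventory_value) * carrying_cost_rate
    freight_saving = old_air_freight - new_air_freight
    return carrying_saving + freight_saving

# Example: average inventory value falls from $120M to $100M at a 22%
# annual carrying-cost rate, and air-freight spend falls from $3.0M to $1.6M.
saving = annual_saving(120e6, 100e6, 0.22, 3.0e6, 1.6e6)
print(f"Estimated annual saving: ${saving / 1e6:.1f}M")  # → $5.8M
```

The arithmetic is trivial; the hard part, as noted above, is having all the cost and service-level data connected in one platform so that someone with good business understanding can plug in the real numbers.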

6.2.3 Third Impact Area: Business Risk Mitigation

While it is well known that data and analytics initiatives directly impact
revenue increase and cost reduction, there are many examples from various
enterprises where data and analytics helped mitigate various types of financial
and non-financial business risks. While it is tricky to quantify the business value
resulting from mitigation of these risks, there are ways to come up with a
ballpark estimate of the financial and non-financial losses that an enterprise would
face in the eventuality of a risk occurring. The approach for such quantification
revolves around extrapolating historical risk data. Let me explain this with the help
of three compelling examples.
Supply chain risk mitigation: Most large global enterprises have a very complex
supply chain. The risk of a flood or an earthquake leading to supply chain
disruption is high. There are other supply chain risks too, such as an important
supplier going bankrupt. There are multiple examples of enterprises whose businesses
were adversely affected by supply chain disruption. The year 2020 is known for
various supply chain disruptions caused by COVID-19. Not many enterprises are
ready to manage such disruptions.
Data and analytics help manage supply chain risks better through on-time
insights. They provide end-to-end supply chain visibility on a real-time basis,
helping supply chain managers take timely actions to avoid possible disruptions.
Sensors on fleets, along with IoT technologies, make it possible to track exact
fleet movements and understand how delays can impact planned fulfillment of
customer orders. Advanced analytics can help predict adverse future events that
pose a high risk to the supply chain, such as a supplier going bankrupt, political
instability in a country where a supplier is based, or adverse weather. A lot of
internal enterprise data needs to be combined with external data, such as weather
forecasts, suppliers' financial reports, and macroeconomic data, to make the
predictions.
6.2 Defining and Measuring Business Value 143

When it comes to quantifying the benefits resulting from data and analytics (to
manage supply chain risks), I recommend using past loss data as a starting
point. One can use the past data to extrapolate losses to the current and future
years, with the assumption that if the enterprise did nothing (i.e., it did not invest
in a data and analytics initiative), it would continue to incur similar losses in
the future as it did in the past. The difference between the extrapolated loss and the
actual loss incurred (as and when the data for the actual loss is available) is the
amount of loss avoided due to data and analytics. This is how the benefit can be
quantified. Of course, computing past losses itself requires connecting a lot of
enterprise data from across various systems and applying various business rules.
Most executives would not refute that doing nothing is not a solution and that
losses similar to those in the past would continue if no action is taken. However,
quantifying the benefits resulting from the avoidance of losses helps in justifying
the investment made in data and analytics.
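The extrapolation approach described above can be sketched in a few lines. This is a minimal sketch with hypothetical figures; a real model might fit a trend line rather than use a simple average of past annual losses.

```python
# Hedged sketch of the loss-extrapolation approach described above.
# All figures are illustrative.

def extrapolated_loss(past_losses):
    """Counterfactual annual loss if nothing had been done: here, the
    average of the past annual losses."""
    return sum(past_losses) / len(past_losses)

def loss_avoided(past_losses, actual_loss):
    """Benefit attributed to the initiative = extrapolated (counterfactual)
    loss minus the actual loss measured after the initiative."""
    return extrapolated_loss(past_losses) - actual_loss

# Disruption losses ($M) in the three years before the initiative, and the
# actual loss measured in the first year after go-live.
past = [8.0, 9.5, 9.0]
print(round(loss_avoided(past, actual_loss=4.5), 2))  # → 4.33 ($M avoided)
```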
Fraud prevention: The threat of fraud has increased considerably in the twenty-
first century, in not only the financial sector but also other sectors. The threat
continues to grow every year as new types of fraud emerge. Hence, enterprises have
been investing in AI/ML-based data and analytics solutions for fraud detection and
prevention. It is well known that if enterprises, especially in the financial
sector, do not continuously invest in better algorithms to prevent various types of
fraud, their losses will be huge.
To quantify the benefits resulting from such investments, one can follow an
approach like the one suggested in the previous example, i.e., supply chain risk
mitigation. While the approach includes extrapolating the growth rate of fraud
based on past data, in the case of fraud prevention it must also take into
consideration the new fraud types that keep emerging. Thus, the total impact of data
and analytics, in terms of the dollar value of fraud-related losses avoided, can be
calculated as the difference between the extrapolated losses (in the hypothetical
scenario of zero investment in data and analytics) and the actual losses measured
after fraud prevention initiatives were undertaken.
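A sketch of the same idea, this time compounding a growth rate estimated from past data, since fraud losses tend to grow year over year. All numbers are illustrative, and a real model would also add an allowance for newly emerging fraud types.

```python
# Hedged sketch: project fraud losses forward at their historical growth
# rate to build the "zero investment" counterfactual. Figures are made up.

def growth_rate(past_losses):
    """Average year-over-year growth of historical fraud losses."""
    ratios = [b / a for a, b in zip(past_losses, past_losses[1:])]
    return sum(ratios) / len(ratios) - 1.0

def counterfactual_loss(past_losses, years_ahead):
    """Last observed loss projected forward at the historical growth rate."""
    g = growth_rate(past_losses)
    return past_losses[-1] * (1.0 + g) ** years_ahead

past = [10.0, 12.0, 14.4]                            # losses in $M, growing ~20%/yr
projected = counterfactual_loss(past, years_ahead=1)  # ≈ 17.28
actual = 9.0                                          # measured after prevention went live
print(f"Fraud losses avoided: ~${projected - actual:.2f}M")  # → ~$8.28M
```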
Accident prevention: Safety at the workplace is of paramount importance to all CEOs
because the cost of an accident or injury at the workplace is extremely high. On one
hand, it leads to financial losses for the enterprise, such as wage loss, productivity
loss, and medical expenses; on the other, it causes physical, social, and environmental
harm to employees and society. Major accidents such as a fire or chemical leakage
not only lead to huge financial and non-financial losses for everyone but also tarnish
an enterprise's reputation. In some cases, they can even put an enterprise out of
business.
To avoid or reduce this risk, enterprises broadly adopt five key measures.

1. First, they plan for safer office infrastructure and sophisticated equipment for
   manufacturing/material handling so that the risks are reduced.
2. Second, they lay down standard operating procedures at workplaces.
3. Third, they train employees on safety best practices and conduct safety drills.
4. Fourth, they install more sensors at the office, manufacturing site, or construction
   site to sense a potential hazard.
5. Fifth, they use advanced analytics to predict safety incidents/accidents and take
   preventive measures to avoid such incidents from happening.

Data and analytics play a vital role in the fourth and fifth measures mentioned
above. AI/ML is being used extensively today to develop algorithms, using past
safety data collected from various systems and sensors, to predict potential
incidents/accidents. Real-time data from various sensors can help raise alerts quickly
when there is a possibility of a safety incident/accident occurring.
With regard to quantifying the business benefit of data and analytics initiatives
undertaken to prevent accidents, an approach like the one suggested for the other
two risk mitigation examples discussed earlier (supply chain risk mitigation and
fraud prevention) can be used.

6.2.4 Fourth Impact Area: Company’s Image Building

A company's image in society is important not just for maintaining its brand
but for its very survival. Customers today are increasingly sensitive to
aspects such as how environmentally friendly a company is, whether it uses child
labor, whether it is doing enough in CSR (corporate social responsibility), and so
on. Often, these considerations are important influencers of their buying decisions.
Further, any non-compliance with regulatory requirements, environmental or
otherwise, may lead to stiff penalties and erosion of the company's image. Most CEOs,
therefore, put a good amount of focus on maintaining a healthy company image.
Data and analytics play an important role in ensuring that the initiatives
undertaken to improve a company's image yield results and that any deviations
or non-conformities are detected in a timely manner, so that adequate preventive
measures can be taken. To some extent, this impact area (i.e., company's image
building) overlaps with the third impact area (i.e., business risk mitigation).
However, building a company's image is more than just mitigating day-to-day
business risks and has quite a few different focus areas.
Let me talk about this impact area with the help of three examples.
Sustainability: Today, everyone across the globe is talking about sustainability and
its importance in maintaining the ecological balance of the earth from a long-term
perspective. All large enterprises are under pressure from various stakeholders
(government, society, shareholders, employees, and customers) to demonstrate that
their business is run in a manner conducive to sustainable development. As a result,
most enterprises have created a dedicated role of "Sustainable Development Head" in
their corporate offices. This role typically has a small team that collects
sustainability-related data from all the offices/sites of the enterprise across
different countries. They analyze this data and create sustainability reports that
are published to the external world. Each office/site of the enterprise generally
has a person identified as the "Sustainability Officer", who is responsible for
implementing good sustainability practices at their respective office/site and
sending the related data to the corporate office.
While conducting periodic sustainability reporting, most enterprises are
realizing that the data they collect in the process can be used not only for
complying with environmental and other regulatory norms, but also for identifying
various cost-saving and other opportunities. Let me share an interesting example
from the cement industry to explain this point.

Before explaining how sustainability data can help a cement manufacturer save cost,
let me briefly describe the cement manufacturing process. Crudely speaking, cement
manufacturing is a two-step process. In step 1, a batch of limestone, obtained from mines,
is used as the primary ingredient to produce a batch of clinker, which is an intermediate
product in cement manufacturing. In step 2, the batch of clinker is mixed with gypsum, fly
ash, and other additives, before grinding them together to produce cement.
Cement is classified into different grades, based on its properties. The most common cement
in the world is ordinary Portland cement, which generally comes in three grades, namely
33-grade, 43-grade, and 53-grade. The grade denotes the compressive strength of the cement.
During production of cement of any grade, a manufacturer can use a certain amount of fly
ash as an alternative raw material, provided that amount does not compromise the minimum
required strength for that grade. Fly ash is a waste by-product of power plants and is
available almost free of cost. It can substitute for limestone, which is an expensive
raw material. An additional benefit of using fly ash is that it increases the corrosion
resistance of cement. However, if used in excess, fly ash can compromise the minimum
compressive strength requirement for the respective grade, making the cement unusable.
Thus, the challenge in using fly ash as a cheap, substitutable raw material is to
determine the optimum proportion that a cement manufacturer can use. Determining this
optimum quantity is not easy, since it depends on (a) the chemical properties of every
batch of clinker and other raw materials used in manufacturing each batch of cement,
and (b) the manufacturing process/equipment settings for production of that batch.
Many variables/parameters come into play. Experts have tried to establish a
mathematical relationship between these variables and the corresponding strength of
cement, with the objective of predicting the strength based on raw material test
results and process parameters. If the strength of every batch can be predicted, one
can determine the maximum amount of fly ash that can safely be added to each batch.
However, because of the inherent complexity, no expert could establish such a
mathematical model.
To address this challenge, one can use an artificial neural network (ANN). ANNs can
be used in situations where establishing a mathematical relationship between input and
output variables is not possible. An ANN can help predict cement strength and,
accordingly, determine the optimum amount of fly ash that can be added to each batch.
However, developing an ANN model requires substantial historical and current data for
all the variables. This is where the data collected as part of the sustainability
initiative comes in handy. It is an excellent example of how sustainability data can be
leveraged beyond sustainability reporting.

Calculating the benefit of using data and analytics in this case is easy. Other than the
non-quantifiable benefit of sustainability compliance, one can calculate the percentage of
additional fly ash that could be used due to the ANN model, resulting in a reduction of
total raw material cost.
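As a rough sketch of the optimization step (not the actual model from the case): a trained ANN would supply the strength predictor, so a made-up monotonic stand-in function is used here purely to show how the maximum safe fly-ash proportion could be selected. All numbers are illustrative, not real cement chemistry.

```python
# Hedged sketch of the fly-ash optimization described above. A trained ANN
# would supply predict_strength(); a made-up function stands in for it here
# so that the selection logic can be shown. All values are illustrative.

def predict_strength(fly_ash_fraction):
    """Stand-in for the ANN: predicted compressive strength (MPa),
    falling as the fly-ash share of the raw-material mix rises."""
    return 56.0 - 30.0 * fly_ash_fraction

def max_safe_fly_ash_pct(min_strength, max_pct=35):
    """Highest fly-ash percentage (scanned in 1% steps) whose predicted
    strength still meets the grade's minimum requirement."""
    best = 0
    for pct in range(max_pct + 1):
        if predict_strength(pct / 100.0) >= min_strength - 1e-9:
            best = pct
    return best

# 53-grade cement must retain a compressive strength of at least 53 MPa.
print(max_safe_fly_ash_pct(53.0))  # → 10 (i.e., up to 10% fly ash)
```

In practice, this scan would run per batch, with the ANN's prediction driven by the raw material test results and process parameters for that batch.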

Another example of leveraging sustainability data is energy optimization. Enterprises
collect energy and emissions-related data for sustainability compliance, and this data
can be used beyond compliance alone. Let me explain this with the example of an office
or site of a company that generates electricity through a captive power plant or diesel
generator. For power generation equipment, companies must collect emissions data for
carbon dioxide, carbon monoxide, etc., for reporting to local environmental authorities
for statutory compliance. This emissions data can be analyzed to understand whether the
power generation equipment is running at optimal efficiency. If emission levels are
higher than normal, it is an indication that the equipment may not be running
efficiently. One can then examine it and take corrective actions.
Business value measurement from data and analytics initiatives in sustainability
can be classified into the following three categories (an approach that is used in
many other areas as well).

• Direct benefit quantification: There are cases in sustainability where the benefits
  resulting from a data and analytics initiative can be directly quantified. The case
  of raw material cost savings from the use of fly ash in cement manufacturing,
  discussed earlier, is a good example. Calculating the cost saving is easy in such
  cases.
• Indirect benefit quantification: There are cases where the benefit cannot be
  attributed directly to a data and analytics initiative in sustainability, since many
  factors other than data and analytics might also have contributed to it. In such
  cases, one can use an approximation approach and provide the rationale for the
  approximation. One way to do this is by extrapolating past data, as discussed in
  the section on the third impact area, business risk mitigation. While using this
  approach, one must clearly state all the assumptions made. I also recommend
  providing a range of dollar values for the benefit (between best-case and worst-case
  scenarios) instead of a single dollar value. The case of energy optimization of
  power generation equipment discussed earlier is one such example, where many
  factors other than data and analytics can contribute to an increase in equipment
  efficiency.
• Non-quantifiable benefits: In cases where there is no possibility of quantifying
  benefits, either directly or indirectly, one should summarize the benefits that the
  initiative resulted in, without mentioning any dollar value against them. As an
  example, an increase in a company's sustainability rating resulting from improved
  sustainability compliance gives a major boost to its image. It is an important
  aspect from a business continuity perspective, even though the benefit cannot be
  quantified.
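The range-based indirect quantification recommended above can be sketched as follows. The attribution shares are illustrative assumptions, not derived figures; in a real exercise they would come from the stated rationale and assumptions.

```python
# Hedged sketch: when a benefit cannot be attributed precisely, report a
# range. A share of the outcome is attributed to the data and analytics
# initiative under worst-case, neutral, and best-case assumptions.
# The shares below are illustrative assumptions.

def benefit_range(total_outcome_value, attribution_shares=(0.2, 0.4, 0.6)):
    """Return (worst, neutral, best) dollar benefit, given assumed shares
    of the business outcome attributable to the initiative."""
    return tuple(total_outcome_value * s for s in attribution_shares)

# A $10M efficiency gain, of which 20-60% is plausibly due to analytics.
worst, neutral, best = benefit_range(10e6)
print(f"${worst/1e6:.0f}M - ${best/1e6:.0f}M (neutral ${neutral/1e6:.0f}M)")
# → $2M - $6M (neutral $4M)
```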

Whenever executives review benefits, all quantifiable and non-quantifiable
benefits should be stated together, so that one gets a complete view of how the
data and analytics initiative helped the enterprise.
Regulatory compliance: Every company must comply with various financial and
non-financial regulatory norms. The cost of non-compliance can be as high as complete
closure of the business. Non-financial regulations relate to the environment (covered
under sustainability), labor, etc. Financial regulations, on the other hand, relate
more to corporate governance. The twenty-first century has seen a series of financial
scams that resulted in stringent regulatory frameworks being enforced. In the USA,
the SOX (Sarbanes–Oxley) Act of 2002 was brought in to improve corporate governance
and investor confidence after major scandals such as those of WorldCom and Enron.
The International Financial Reporting Standards (IFRS) were adopted by the EU in 2005
for all EU-listed companies, and many other countries adopted IFRS subsequently.
The global financial crisis of 2008, which had its origin in the US housing market
bubble, exposed excessive risk-taking by banks. The crisis accentuated the need
for better regulation and supervision of the financial sector, and almost all
countries responded with stricter regulations. The international Basel Committee
introduced the Basel III norms in 2010, which have since been adopted by many
developed and developing countries.
The purpose of mentioning all these regulations and norms that came up in the
twenty-first century is to highlight that the purview of regulatory compliance has
been growing and the cost of non-compliance has become very high. Data and
analytics play an extremely critical role in ensuring and demonstrating compliance.
In the last couple of decades, almost all large global enterprises have spent large
sums of money on data management for regulatory compliance. Quantifying business
benefits from such investments is not easy (except for calculating the productivity
gain achieved by financial analysts and auditors who may need to spend fewer hours
on data management and reviews). Hence, the business value resulting from investment
in data and analytics in most types of regulatory compliance initiatives should be
demonstrated in non-dollar terms, using parameters such as reduction in audit
non-conformities and avoidance of potential non-conformity costs/risks.
Customer perception: Just as beauty lies in the eyes of the beholder, a company's
image lies in the minds of its stakeholders. Customers are among the most important
stakeholders, and how they perceive a company is crucial for its growth. Customer
perception depends on multiple factors, such as the quality of the product/service,
the brand image of the company, and the charitable work it does. One of the ways
customer perception is created or changed is through word of mouth, and social media
has made word-of-mouth communication very widespread.
Enterprises spend a lot of money and effort on marketing and publicity, both
online and offline, to market their products/services and create positive customer
perception. Hence, it is important to analyze how customer perception has been
changing amongst different customer segments because of these efforts, so that
money can be spent more effectively in the future. Data and analytics play an
important role in this. They help in understanding customer sentiment by gathering
and analyzing data from social media and other sources, such as customer surveys
and customer feedback. They can help identify customer pain points or concerns that
the company may be missing. Using natural language processing and other techniques,
analytics can not only analyze customer concerns but also recommend actions to
address them.
Hence, many data and analytics projects are directly or indirectly linked to
improving customer perception and satisfaction. From a value measurement
perspective, one can attempt to gather credible data-based evidence that helps
correlate a data and analytics initiative with a change in customer
satisfaction/perception. One can also evaluate how data and analytics projects
contributed to improving projected customer lifetime value (i.e., the amount of
money a customer is expected to spend on your company's services or products).
While these are means of indirect quantification, direct quantification of benefits
is difficult in the case of customer perception. Hence, it is important to summarize
how data and analytics initiatives contributed to analyzing customer sentiment,
understanding customers better, and improving customer perception.

6.2.5 Business Value Measurement: Correlation Does Not Necessarily Mean Causality

As discussed in this section, while there are many data and analytics projects
for which direct, quantifiable business benefits can be established and calculated,
there are quite a few others for which this is difficult. In the case of the latter,
one can attempt to quantify indirect benefits by studying the correlation between
the analytical scenarios implemented (as part of data and analytics projects) and
the business outcomes that those scenarios can potentially impact. For example,
cross-sell analytics may lead to the business outcome of "revenue increase".
Similarly, marketing campaign analytics can help achieve the business outcome of
"increase in market share". However, the pertinent question is: can one attribute
the entire business outcome, or a part of it, to data and analytics simply by
showing positive correlation?
It is important to understand that correlation does not necessarily mean causality.
While trying to demonstrate business benefits through correlation, the data and
analytics team may face counterarguments that benefits such as revenue increase
or market share increase were due to other factors, such as a major business
process change or a new marketing campaign. Hence, the data and analytics team
(especially the business tower of the data and analytics organization) should do a
holistic evaluation of the correlation. Justification for attributing a portion of
the business outcome/benefit to data and analytics project(s) should be supported
by facts and examples. During interactions with various business stakeholders
(before, during, and after a data and analytics project), one can capture such
facts/examples for later corroboration. Further, analytics (such as extrapolation)
on historical and current data can also be done, as discussed in various examples
earlier in this section, to establish and demonstrate causality.
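A small synthetic illustration of why correlation alone is weak evidence: two series that merely share an upward trend correlate strongly even when neither causes the other. The data below is made up; no real project figures are implied.

```python
# Synthetic sketch: a shared upward trend produces a high Pearson
# correlation between two series even with no causal link between them.

def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Quarters since an analytics project went live vs. revenue ($M). Both
# simply grow over time (say, in a booming market), so r comes out close
# to 1, yet that alone proves nothing about causality.
quarters = [1, 2, 3, 4, 5, 6, 7, 8]
revenue = [100, 104, 103, 109, 112, 115, 114, 120]
print(round(pearson_r(quarters, revenue), 3))
```

This is why the corroborating facts, stakeholder inputs, and extrapolation analyses mentioned above are needed on top of any correlation figure.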
To summarize this section, the data and analytics team needs to define a
comprehensive framework and adopt a disciplined approach to systematically capture
the business value resulting from data and analytics projects. Any demonstration of
value should be based on credible evidence proving that the value truly resulted
from the projects. In cases where the precise dollar value is debatable, one can
come up with three estimates instead of one: optimistic, pessimistic, and neutral.
There are, however, quite a few non-quantifiable business benefits as well that
result from data and analytics projects; these should also be summarized for value
demonstration. Finally, all benefits should be discussed and mutually agreed with
the concerned business stakeholders.

6.3 Defining and Measuring Operational Efficiency—Continuous Improvement

While delivering business value is of paramount importance for the data and analytics
team, the team also needs to continuously improve its operational efficiency, so that
business value is delivered in a consistent and efficient manner. The governance tower
of the data and analytics organization needs to define and measure operational
efficiency in discussion with the other towers. There are four key dimensions of
operational efficiency to focus on, as depicted in Fig. 6.3.
Let me talk about each of the four dimensions, citing examples of specific KPIs
(Key Performance Indicators) that can be used to measure operational efficiency.
These KPIs should be measured periodically, and efforts should be made to ensure
that they improve continuously.

6.3.1 People Performance

By people, I mean the core team constituting the data and analytics organization.
It includes the data and analytics head, the tower heads, and all the team members
belonging to the five towers. The organization's operational efficiency improves
when everyone, irrespective of role, continuously upskills and cross-skills, and
comes up with innovative ideas to do their work more efficiently. All ideas, big or
small, must be recognized and rewarded. In the long run, small ideas leading to
incremental improvements are as important as big ideas leading to breakthrough
innovations.
Defining and tracking KPIs to measure people's performance and, accordingly,
rewarding high performers is important to encourage a performance-oriented culture.
A few KPIs that one can use are "training effectiveness", "new skills added",
"number of new ideas proposed and accepted", "innovation score (based on a
pre-defined formula)", "number of customer appreciations received from business
stakeholders", etc.

Fig. 6.3 Four dimensions of data and analytics operational efficiency

6.3.2 Process Effectiveness

In Chap. 4, I talked about the various processes that the data and analytics
organization needs to define and implement to run its operations efficiently. I also
mentioned that the governance tower should be the custodian of all the processes.
Process documents such as manuals, guidelines, templates, and checklists should be
created, maintained, and periodically reviewed for effectiveness.
The effectiveness of any process mentioned in Chap. 4 can be reviewed broadly on
three parameters, as listed below.

• Cost of running the process: For certain processes, especially project execution,
  cost is an important consideration. The budget for some projects runs into millions
  of dollars; hence, tracking cost is important. One can use KPIs such as "%
  projects delivered as per planned efforts" and others for tracking cost.

• Quality of the process: The quality of a process can be judged by its outcome. For
  tracking quality, KPIs such as "first time right %", "end user satisfaction", and
  "% increase in users/usage" can be used.
• Cycle time of the process: Timely completion of a process is another important
aspect to be tracked. KPIs such as “on-time delivery of projects” and “cycle
time for resolution of issues” can be used.
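The process KPIs above can be computed mechanically once per-project records are captured. A minimal sketch with made-up field names and data:

```python
# Hedged sketch of computing the process-effectiveness KPIs named above
# from per-project records. Field names and data are illustrative.

projects = [
    {"planned_effort": 100, "actual_effort": 105, "on_time": True,  "first_time_right": True},
    {"planned_effort": 200, "actual_effort": 260, "on_time": False, "first_time_right": False},
    {"planned_effort": 150, "actual_effort": 150, "on_time": True,  "first_time_right": True},
    {"planned_effort": 120, "actual_effort": 118, "on_time": True,  "first_time_right": False},
]

def pct(flags):
    """Percentage of records for which the flag is true."""
    return 100.0 * sum(flags) / len(flags)

# "% projects delivered as per planned efforts" (here: within a 10%
# tolerance, an assumed threshold), "on-time delivery of projects",
# and "first time right %".
within_effort = pct([p["actual_effort"] <= 1.1 * p["planned_effort"] for p in projects])
on_time = pct([p["on_time"] for p in projects])
ftr = pct([p["first_time_right"] for p in projects])
print(within_effort, on_time, ftr)  # → 75.0 75.0 50.0
```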

6.3.3 Technology Capability

In Chap. 3, I discussed in detail how the data and analytics organization
should strive for technology and architecture excellence. I talked about the
importance of continuously exploring new technologies and evaluating how they can
enable new art-of-the-possible solutions for the business. The team should also try
to standardize technologies (in case multiple non-standard technologies exist in
the enterprise), consolidate tool vendors, and maintain the technology
infrastructure efficiently. Even after a robust data and analytics architecture is
defined and technologies are selected, there will always be pockets within an
enterprise that continue to use old technologies. The data and analytics team
should convince those users to switch to the standardized technologies as part of
the organizational change management process.
Examples of KPIs for evaluating improvement in technology capabilities are
"number of proofs-of-technology done on technologies of the future", "number of
ideas generated for solving a business problem using futuristic technology", "%
non-standard tools in use", "system availability", "report/data query performance",
etc.

6.3.4 Data Maturity

In Chap. 4, I discussed data governance at an enterprise level. I also talked
about the importance of a data governance council headed by a business leader. In
that discussion, I mentioned that the data and analytics organization must define
processes and policies, as well as use relevant tools, to manage the key aspects of
data under the directives of the overarching data governance council. The key
aspects of data that I talked about are (a) metadata management, (b) master data
management, (c) data quality management, and (d) data security management. Overall,
the primary responsibility of the data and analytics organization is to make data
useful and secure for consumption across the enterprise. The data and analytics
team needs to continuously improve the data maturity of the enterprise by defining
short-term, medium-term, and long-term goals.
I would like to highlight two KPIs that are very relevant for the data and
analytics team from a data maturity perspective.

• Data quality index: This KPI is a quantitative score that measures the quality of
  business data based on various quality parameters, such as accuracy, completeness,
  and uniqueness. For common and important data domains, such as customer master
  data, a process should be established to calculate the data quality index. This
  KPI should be reviewed regularly, and necessary steps should be taken to improve
  it. Data and analytics teams should strive to reach close to a 100% data quality
  index for all the important data domains.
• % Reuse of data: This KPI measures what percentage of the data required for a new
  project is reused from data already existing in the data platform. In Chap. 2, I
  briefly talked about the importance of data reuse. Let me elaborate on it here.
  In any enterprise, various common data entities are required by multiple business
  functions. This means there is a good amount of common data that can be reused by
  various projects once it is available in the data platform. Architecting the data
  platform in a manner that ensures reusability (i.e., "bring once, use multiple
  times") helps deliver projects faster, even though the effort required during the
  initial stages of establishing the data platform is higher. More importantly, it
  ensures data consistency and quality. Architecting the data platform to ensure
  reusability of data across various projects requires three key architectural
  principles to be followed, as listed below.
– The first principle is to model data at the lowest level of granularity. Granular
  data means data at the lowest possible level of detail. For example, for purchase
  data, one aspect of granularity is having the data of each individual line item
  (of the purchase order) in the data platform. Once you have granular data, you can
  use it to answer different types of business questions without reaching out to the
  transactional system every time a new question is asked. For example, questions
  such as "who are my top ten suppliers?", "for which item was our spend highest
  last year?", "for which item were the maximum number of purchase orders placed?",
  and "what was the average lot size of the different items purchased?" can all be
  answered from granular data. However, if you instead keep only summarized data
  (such as spend per supplier for each item in a month), you cannot answer many
  business questions without additional data engineering work.
– The second principle is to model data in a business process-aligned approach.
  In Chap. 3, I talked about the importance of this. Let me reproduce it here
  verbatim: "Business processes in enterprises cuts across various business
  functions, and for measuring many KPIs, one needs data from across various
  functions/business units/geographies. From data and analytics architecture
  perspective, one must architect data in a way that meets all such cross-functional
  needs, that were either stated during business workshops or not stated explicitly
  but could possibly arise in future". If the data model is aligned to business
  processes, catering to various functional and cross-functional needs becomes easy,
  thereby promoting reuse of data.
6.3 Defining and Measuring Operational Efficiency—Continuous … 153

– The third principle is to bring all relevant data from a transactional system when
you connect to that system for the first time. Every time you reach out to the
team owning a transactional system, there is a lead time to source data. This
lead time arises because of activities such as talking to the system owners,
discussing various technical details, and understanding data tables. To avoid
this, one should get all relevant data (or a large portion of it), required
for immediate needs or for potential future use, in one go. Often,
a team working on one data and analytics project tends to request only
the data that is required (from a system) for that project. They do not want
to spend any extra time and effort to get other relevant data that may be
required for future projects. However, this extra effort, if put in, pays off in
the long run. This is where the architecture tower and governance tower of the
data and analytics organization need to play an important role, ensuring that
the extra effort is planned and additional budget is sought to design the data
platform for reusability.
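To make the granularity principle concrete, here is a small illustrative sketch in Python. The purchase-order line items, supplier names, and amounts below are hypothetical; the point is that several different business questions can be answered from the same granular dataset, whereas a pre-summarized table could answer only the question it was built for.

```python
from collections import defaultdict

# Hypothetical purchase-order line items, kept at the lowest granularity:
# one record per line item, never pre-aggregated.
line_items = [
    {"po": "PO-1", "supplier": "Acme",   "item": "bolt",   "qty": 500,  "amount": 250.0},
    {"po": "PO-1", "supplier": "Acme",   "item": "nut",    "qty": 500,  "amount": 150.0},
    {"po": "PO-2", "supplier": "Zenith", "item": "bolt",   "qty": 200,  "amount": 120.0},
    {"po": "PO-3", "supplier": "Acme",   "item": "washer", "qty": 1000, "amount": 90.0},
]

# Question 1: who are my top suppliers (by spend)?
spend_by_supplier = defaultdict(float)
for li in line_items:
    spend_by_supplier[li["supplier"]] += li["amount"]
top_suppliers = sorted(spend_by_supplier, key=spend_by_supplier.get, reverse=True)

# Question 2: for which item was our spend highest?
spend_by_item = defaultdict(float)
for li in line_items:
    spend_by_item[li["item"]] += li["amount"]
top_item = max(spend_by_item, key=spend_by_item.get)

# Question 3: what was the average lot size per item? This is answerable
# only because quantity survives at line-item granularity.
total_qty, order_count = defaultdict(int), defaultdict(int)
for li in line_items:
    total_qty[li["item"]] += li["qty"]
    order_count[li["item"]] += 1
avg_lot_size = {item: total_qty[item] / order_count[item] for item in total_qty}
```

Had the platform stored only monthly spend per supplier per item, the third question (and any future question needing quantities) would require going back to the transactional system.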

6.3.5 Operational Efficiency and Maturity Assessment

The governance tower should periodically review the KPIs of all four dimensions
of operational efficiency, as discussed in the previous paragraphs. The objective
of such an exercise is to ensure that the operational efficiency of the data and
analytics organization increases continuously.
Other than the periodic review of operational efficiency, I recommend conducting
a holistic maturity assessment of enterprise data and analytics, preferably once
a year. In Chap. 2, I talked about data and analytics maturity evolution while
explaining how to define the data and analytics roadmap. An enterprise data and
analytics maturity assessment takes into consideration not only operational efficiency
but also various other factors, such as the business benefits delivered and the
customer feedback received. Such assessments help in evaluating how enterprise
data and analytics is moving up the maturity path, year on year. In this book, I am
not planning to cover a detailed parameterized maturity assessment model, since it is
a detailed topic and there are quite a few standard maturity models available, from
leading analysts and consulting companies, that an enterprise can leverage.
The maturity assessment should be done by an independent consulting agency once
a year, so that the evaluation is objective and unbiased. Additionally, the data
and analytics team should also assess their maturity periodically, say every quarter,
by themselves to track their own progress.
Finally, I want to highlight one parameter from which the data and analytics
team can get a sense of their effectiveness and success in the long run: how
demand for their services is increasing across the enterprise.
If the demand is increasing every year, and if the percentage of money spent on
projects, as compared to money spent on application maintenance and support, is
increasing, it implies that more and more business functions are securing their services.
This in turn reflects that the data and analytics team is delivering good business
value to more and more stakeholders every year.

6.4 Calculating ROI from Data and Analytics Investment

In this chapter, I have so far discussed defining and measuring the business
value delivered by data and analytics, and the operational efficiency of the data and
analytics organization. Let me now talk about an important question that is often asked
from a value measurement perspective, viz. how to calculate ROI (return on investment)
for a data and analytics program/project? ROI needs to be calculated
for three reasons.

1. First, it helps in justifying past investments made in an enterprise data and
analytics program. Executives often ask the data and analytics leader to show
the ROI for past investments.
2. Second, before embarking on a new data and analytics project, especially when
the required budget is high, one needs to prepare a business case, including
potential ROI, and take a go/no-go decision accordingly.
3. Third, calculating ROI helps the data and analytics leader run her/his organization
more as a profit center than a cost center. This helps her/him never
lose focus on delivering business results to the enterprise.

Let me now describe a practical approach to calculating ROI. Each of its two
components, i.e., benefits and costs, must be calculated meticulously, as described
in the following paragraphs.

6.4.1 Calculating Benefits

Calculating actual or potential business benefits, about which I discussed in
detail earlier in this chapter, is an important first step towards calculating ROI.
I explained why this is a difficult task and suggested how to adopt a disciplined
approach to systematically capture business value resulting from data and analytics
projects. Without repeating all that in detail, let me summarize below the steps
that one should take to calculate benefits.

• Bucket all business benefits (both direct and indirect) into two categories—
revenue impact and cost savings. Include items with quantifiable benefits from
“business risk mitigation” and “company’s image building” also in these two
categories.

• Evaluate benefits for each line item, i.e., each actual/potential benefit, and put
a dollar value on it. While doing so, always come up with three numbers—a
best-case scenario, a worst-case scenario, and a normal-case scenario. Creating
these three scenarios creates a benefit corridor that takes into consideration
the ambiguities involved in the assumptions. This approach leads
to greater acceptance of the benefits calculation by various stakeholders. Also note
that benefits will be spread across multiple years. Hence, calculate all the
values, for each line item, for each year.
• List all the IT benefits and put a dollar value on each of them. For certain
projects, there will be IT benefits such as a reduction in hardware/software
costs. All such benefits should be evaluated. Quantify the benefits for the three
scenarios (best-case, worst-case, and normal-case) for each year.
• Calculate total quantifiable benefits. This is the sum of all business and IT
benefits, for each year, as calculated above.
• Mention all the assumptions made during calculation of benefits. The assump-
tions should be as detailed as possible, so that everyone understands the
rationale behind the calculations.
• State all the non-quantifiable benefits. I discussed earlier various areas,
such as regulatory compliance, where it may not be possible to quantify all
benefits. Anyone reviewing the business case/ROI calculation should take a
holistic view that includes non-quantifiable benefits as well, rather than looking
only at the quantitative value of ROI.
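As an illustrative sketch (the benefit line items and dollar figures below are hypothetical assumptions, not from a real business case), the per-year, three-scenario structure described above can be captured as follows; summing all line items per scenario per year yields the benefit corridor:

```python
# Hypothetical benefit line items, each estimated per year (years 1-3, in $M)
# under three scenarios, forming a "benefit corridor".
benefit_line_items = {
    "reduced stock-outs (revenue impact)": {
        "worst":  [0.2, 0.5, 0.8],
        "normal": [0.4, 0.9, 1.2],
        "best":   [0.6, 1.4, 1.8],
    },
    "lower inventory carrying cost (cost savings)": {
        "worst":  [0.1, 0.3, 0.3],
        "normal": [0.2, 0.5, 0.6],
        "best":   [0.3, 0.7, 0.9],
    },
}

def total_benefits(line_items, scenario, years=3):
    """Sum all benefit line items for one scenario, year by year."""
    return [
        sum(item[scenario][year] for item in line_items.values())
        for year in range(years)
    ]

benefit_corridor = {
    scenario: total_benefits(benefit_line_items, scenario)
    for scenario in ("worst", "normal", "best")
}
```

Each assumption behind a line item (for example, the expected reduction in stock-outs) should be documented alongside the numbers, as stated above, so that stakeholders can see the rationale for the corridor.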

6.4.2 Calculating Costs

Once benefits calculation is complete, one needs to calculate projected/actual
project costs. I recommend the following steps for the same.

• Calculate incremental hardware and software costs. One should include only
the incremental/additional hardware and software costs for implementation of
the project. Any costs that were incurred prior to the project should be excluded.
Just like benefits, costs may be spread across multiple years. Hence, calculate all
the costs, for each line item, for each year.
• Calculate professional services charges. Enterprises often hire external professional
service providers for projects. Their charges (actual, quoted, or
estimated), as applicable to the project, should be calculated for each year.
• Estimate the cost of employees. In addition to external professional services, various
stakeholders (both business and IT) who are employees of the company invest
their time in a project. All such time should be calculated in terms of the number
of hours and converted into a dollar value, using an approximate employee cost per
hour. To keep things simple, the approximation can be done for three levels of
employees—junior, middle, and senior, based on the average hourly employee
cost for each of these three levels. Like other costs, employee costs should also
be totaled for each year.
• Add other applicable costs. During any project, various costs other
than people costs are incurred. These can relate to travel, workshops,
training, certifications, etc. All such costs should be calculated per year.
• Calculate total costs. This is the sum of all the costs, for each year, as calculated
above.
• Mention all the assumptions made during calculation of costs. The assumptions
should be as detailed as possible, so that everyone understands the rationale
behind the calculations.
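The employee-cost step above lends itself to a simple sketch. The hourly rates and hours below are hypothetical placeholders; an enterprise would substitute its own average costs for the junior, middle, and senior levels.

```python
# Hypothetical average hourly employee cost (in $) for the three levels.
HOURLY_COST = {"junior": 40, "middle": 70, "senior": 120}

# Estimated hours each level spends on the project, per year.
hours_per_year = [
    {"junior": 2000, "middle": 800, "senior": 200},  # year 1: build phase
    {"junior": 500,  "middle": 300, "senior": 100},  # year 2: support phase
]

def employee_cost_per_year(hours_per_year, hourly_cost):
    """Convert estimated hours into a dollar value, totaled for each year."""
    return [
        sum(hours[level] * hourly_cost[level] for level in hours)
        for hours in hours_per_year
    ]

employee_costs = employee_cost_per_year(hours_per_year, HOURLY_COST)  # $ per year
```

The same per-year totaling applies to the other cost buckets, so that all costs can later be laid out as yearly cash flows.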

I want to highlight the importance of excluding all “sunk costs” from the above
calculations. A sunk cost is money that was already spent prior to the project, i.e.,
the money is “sunk”. Although excluding sunk costs is a common-sense approach
while calculating ROI, many enterprises make the mistake of including them
in their calculations. This gives a distorted view of the ROI. Especially when a
business case is being prepared for a new project, including sunk costs may make the
ROI very low. This can lead to a decision to shelve the project.
However, such a decision may be a bad one if, with sunk costs excluded, the ROI
turns out to be high. Irrespective of whether the new project is undertaken or not,
the sunk costs have already been incurred by the enterprise.

6.4.3 Calculating ROI

Once all the benefits and costs are calculated, the next step is to calculate ROI. I
recommend the following steps for the same.

• Prepare yearly cash flows of benefits and costs. While laying down the steps to
calculate benefits and costs, I suggested summing them for each year, since
benefits and costs typically span multiple years. While preparing yearly
cash flows of benefits and costs, I suggest using a time horizon of 5–10 years.
• Discount/compound cash flows to the current year. All future cash flows should
be discounted to the current year using the company’s standard cost of
capital. Discounting is required when calculations are done to prepare an initial
business case. Discounting future benefits and costs reduces the numbers,
since the present value of future cash is always lower. However, when calculations
are done for past benefits and costs (for projects implemented in the past),
the cash flows should be compounded to the current year instead of being
discounted. Compounding increases the numbers, because of the time value of money.
• Calculate NPV (net present value) of cash flows. NPV of cash flows is the
difference between present values (i.e., discounted/compounded values) of all
benefits and costs. This means that to calculate NPV of all cash flows, calculate
PV (present value) of benefits, calculate PV of costs, and find their difference.
Many executives are more interested in the NPV of a project than in its ROI. A
positive NPV means that the project was/is worthwhile, while a negative NPV means
it was/is not. The more positive the NPV of a project, the more valuable it is
for the enterprise.
• Calculate ROI. Once the above calculations are done, ROI can be calculated
simply by dividing NPV of all cash flows by PV of costs. Multiply this output
by 100 to calculate ROI in percentage terms.

While ROI and NPV are two commonly used measures for evaluating the attractiveness
of an investment, executives are often interested in two additional measures,
listed below.

• Payback period: It is the time required to recover an investment, i.e., to reach
the breakeven point. The shorter the payback period, the more attractive the
investment. I have seen a few strategic data and analytics projects where the
payback period was as low as one year. However, it is common to have projects
with payback periods between 2 and 3 years.
• IRR or internal rate of return: It is the discount rate that makes the NPV of all
cash flows zero. IRR is equivalent to the compounded annual rate of return
earned by an investment. The higher the IRR, the more attractive the investment.
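To make the four return measures concrete, here is an illustrative Python sketch. The cash flows, the 10% cost of capital, and the bisection bounds are hypothetical assumptions, not figures from any real project; the formulas, however, follow the definitions above (ROI as NPV of all cash flows divided by PV of costs, payback as the year in which cumulative net cash flow turns non-negative, and IRR as the rate at which NPV is zero).

```python
def present_value(cash_flows, rate):
    """Present value of yearly cash flows; index 0 is the current year."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

def npv(benefits, costs, rate):
    """NPV = PV(benefits) - PV(costs)."""
    return present_value(benefits, rate) - present_value(costs, rate)

def roi_percent(benefits, costs, rate):
    """ROI (%) = NPV of all cash flows / PV of costs, times 100."""
    return npv(benefits, costs, rate) / present_value(costs, rate) * 100

def payback_year(benefits, costs):
    """First year in which cumulative (benefits - costs) becomes non-negative."""
    cumulative = 0.0
    for year, (b, c) in enumerate(zip(benefits, costs)):
        cumulative += b - c
        if cumulative >= 0:
            return year
    return None  # investment not recovered within the horizon

def irr(benefits, costs, low=-0.99, high=10.0, tol=1e-6):
    """Discount rate that makes NPV zero, found by bisection (assumes costs
    come first, so NPV falls as the rate rises)."""
    while high - low > tol:
        mid = (low + high) / 2
        if npv(benefits, costs, mid) > 0:
            low = mid   # NPV still positive: the true IRR is higher
        else:
            high = mid
    return (low + high) / 2

# Hypothetical normal-case cash flows in $M, years 0-4.
benefits = [0.0, 0.8, 1.2, 1.5, 1.5]
costs = [1.5, 0.4, 0.3, 0.3, 0.3]
rate = 0.10  # company's standard cost of capital

project_npv = npv(benefits, costs, rate)
project_roi = roi_percent(benefits, costs, rate)
project_payback = payback_year(benefits, costs)
project_irr = irr(benefits, costs)
```

For a business case, one would run this for each of the three scenarios and, as noted above, compound (rather than discount) the cash flows when evaluating a project implemented in the past.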

Figure 6.4 illustrates how one can summarize all four return measures for a
data and analytics project. In the figure, project ABC is a hypothetical project for
which a business case has been prepared. Cash flows (both benefits and costs) have
been calculated for all three scenarios (best-case, worst-case, and normal-case)
for ten years. NPV has been calculated using these cash flows and a discounting
rate of 10%. The graph shows cumulative NPV, i.e., NPV for a particular time horizon.
Cumulative NPV increases with time, since most of the costs are generally incurred
during the initial period and, in subsequent years, only benefits follow. Finally, as
mentioned earlier, any financial analysis of return should be supported with
detailed calculations, including assumptions, and complemented with a list of all
non-quantifiable benefits.
Having explained how to calculate ROI, I want to share a few more practical
suggestions in the context of ROI and the business case.

• When an enterprise is in the initial stage of establishing a strong foundation to
implement a new data and analytics strategy, the ROI of projects will be low or
even negative. During this stage, it is important to seed an initial investment
without expecting a high return on it. It is important for executives to
understand that this investment will lead to projects with high ROI in the
future (after a year or so). I discussed this in Chap. 2 as well, while
talking about the “period of disillusionment” in the data and analytics roadmap.
• During the early days of a new data and analytics organization, the focus should
be on delivering a few quick-win business projects that are small to medium
in size, fast to implement, and deliver good business value. This helps the
data and analytics organization establish its credibility. Having a few happy
internal customers initially helps in spreading the message about the excellence of
the new data and analytics organization to other business stakeholders through
“word-of-mouth”.

Fig. 6.4 Illustrative financial analysis of return from data and analytics investment
• When demand for projects is high and resources are limited, the data and
analytics team needs to prioritize projects. While doing so, there is often a
tendency to take up projects from business stakeholders who have higher budgets.
This may not always be a wise approach, as a high budget does not necessarily
mean a high return on investment for the enterprise.
• For projects where the complexity and level of ambiguity are high, developing a
proof-of-concept may help. Though some investment is required to develop
a proof-of-concept, it helps in deciding whether a particular idea will
work or not. If the proof-of-concept fails, it saves the enterprise from
losing a much larger investment.
• When ROI calculation is being done for preparing a business case, it is
important to summarize all the project risks as well, so that anyone review-
ing the business case and ROI gets a complete picture. This helps in better
decision-making.
• ROI calculations are generally done as part of the initial business case preparation,
especially when the required investment is high. However, I have seen in many
cases that, once a project starts, only costs are tracked. Benefits are generally not
captured during or after project implementation. As highlighted at the beginning
of this chapter, it is important to capture business value during and after project
implementation as well.

6.5 Summary

Measuring the business value delivered by a data and analytics initiative is a difficult
task. In many cases, it is not possible to measure the value directly. This often
leads to executives questioning the return on their investment. One can address
this challenge by establishing a value measurement framework that helps capture
and demonstrate both quantifiable and non-quantifiable business benefits. Data
and analytics initiatives help enterprises achieve broadly four business outcomes—(a)
revenue increase, (b) cost reduction, (c) business risk mitigation, and
(d) company’s image building. While delivering business value is of paramount
importance, the data and analytics team also needs to continuously improve its
operational efficiency, so that it can deliver value in a consistent and efficient
manner. For this, it needs to focus on four key areas—(a) people performance,
(b) process effectiveness, (c) technology capability, and (d) data maturity. Finally,
to calculate the financial return of any data and analytics investment, one needs to
calculate/estimate cash flows of benefits and costs (for both business and IT). The
financial measures commonly used are NPV (net present value), ROI (return on
investment), payback period, and IRR (internal rate of return).
7 The Profile of a Data and Analytics Leader
Key Skills of a Leader Who Can Lead the Enterprise to Success

7.1 Key Skills That Any Enterprise Data and Analytics Leader Must Possess

An enterprise can define the best possible data and analytics strategy, either with
the help of good consultants or by leveraging internal experts, but to implement
the strategy one needs a team that not only has the zeal for execution but also
the right skills required to succeed. Most importantly, this team needs a leader
who has all the required hard and soft skills to lead them to success. Throughout
this book, I talked about the organizational complexities (political, structural, etc.) that
the leader needs to manage. I dedicated an entire chapter to organizational change
management and discussed its importance. Having a data and analytics leader who
possesses the necessary skills to manage the inherent complexities and ambiguities of
a strategic data and analytics initiative is therefore extremely important. This is the
reason why I decided to dedicate the final chapter of the book to this topic.
Out of curiosity, I have been reading the job postings of large enterprises that plan
to hire a data and analytics leader. I noticed that, in many postings, around 75%
of the “desired competencies” are focused on technology skills that the person must
possess. They want someone who has hands-on experience in various technologies.
I could never understand why an enterprise requires a data and analytics leader with
such deep hands-on technology expertise and so few other skills. I wonder if
they are looking for a leader who can herself/himself develop data and analytics
solutions and conduct hands-on technology training for her/his team. While I am
not discounting the importance of good technology expertise in a data
and analytics leader, I believe one reason why many enterprises fail in a strategic
data and analytics initiative is that they do not have a leader who possesses the other
required skills.
In different enterprises, there are various designations of a data and analyt-
ics leader—Chief Data Officer, Chief Analytics Officer, Chief Data and Analytics
Officer, Vice President—Data and Analytics, Director—Data and Analytics, and

© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2022
P. Sah, Defining Enterprise Data and Analytics Strategy, Management for Professionals,
https://doi.org/10.1007/978-981-19-5719-2_7

Fig. 7.1 Key skills that a data and analytics leader must possess

so on. However, the desired skills in the leader are the same. So, the obvious
question that would come to your mind is—what are the skills that the leader
must possess? While writing this chapter, I gave a good amount of thought to this
question. I recollected all my experiences with multiple data and analytics leaders,
both successful and unsuccessful. Accordingly, I listed nine key skills
(3 hard skills and 6 soft skills) that are required for the leader to be successful.
These are summarized in Fig. 7.1.
In the figure, I have illustrated each skill with a representative annular sector
within a semi-circle. You will notice that each sector has a different size. The size
of the annular sector indicates the relative importance of the skill it represents
(as compared to the other skills). However, the relative size does not indicate
the exact weightage of the skill. Let me explain this point through an example.
Between the two skills of technology and data science, their relative sizes only
indicate that data science is more important than technology, not that it is
approximately seven times more important (as the relative sizes of their annular
sectors would suggest).
Overall, as you can see from the figure, in any data and analytics leader, soft
skills are more important than hard skills. Having said that, it is important to
appreciate that possessing hard skills is not optional. The leader must possess
all three hard skills—technology, data science, and business. Possessing soft
skills alone cannot make the leader successful. With this overall context, let me
now elaborate on each of the nine skills.

7.2 Hard Skills

7.2.1 Technology

A data and analytics leader must have a good understanding of the relevant
technologies. While the leader does not need hands-on experience in them, she/he
should have a good understanding of what these technologies are capable of and
what their limitations are, i.e., a good understanding of the art-of-the-possible and
what-is-not-possible.
The leader should keep abreast of the latest developments in data and analytics
technologies and what these mean for the business of the enterprise. The leader
must proactively seek opportunities to learn about various technologies. There are
many avenues through which such learning opportunities are available. Technology
product vendors are always keen to explain the technical details of their new products.
They are eager to conduct proofs of concept for given problem statement(s),
to prove that their product is good. Another avenue for learning is attending
webinars or other events on data and analytics. Further, there is a wide variety of
online learning courses available today that the leader can go through at her/his
own pace. Most large enterprises have subscriptions to some of the leading online
course providers. Finally, in my experience, a lot of learning comes from interactions
with one’s peer group, subordinates, and seniors, provided one is keen to learn.
A good leader must have a passion and hunger for learning and must have a
self-learning plan to broaden her/his perspective. To do so, she/he should explore all
the possible learning avenues that I talked about in the previous paragraph. A few data
and analytics leaders have the misconception that, since they are in a senior role,
technology understanding is not important.
You may wonder, then, if technology understanding is so important, why
it carries the lowest weightage amongst all the nine skills that I have illustrated
in the figure. The reason for that is simple—the other skills are even more important
than technology skills.

7.2.2 Data Science

The second must-have hard skill is data science. While different people use different
definitions of data science, for me it is a broad term that encompasses all the
activities involved in the “data-to-insight” process. Data science includes data
engineering (extracting, cleansing, harmonizing, and architecting data) on the one hand,
and data analytics/mining (data visualization, artificial intelligence, and machine
learning) on the other.
A data and analytics leader must have a passion for data and must be thrilled by
the way one can uncover hidden patterns in data to derive insights. Data science
is more an art than a science. While there are certain data science skills that can be
taught to a person, there are others that cannot be taught if the person does not
have certain inborn traits. For example, someone who is not very detail-oriented by
nature would not generally have a passion for data. Similarly, data science requires
a very good imaginative ability in a person, a trait that is often inborn.
There is often an expectation that a data and analytics leader must have hands-on
expertise in developing AI/ML models. I strongly believe that the leader need
not necessarily have hands-on experience in writing complex algorithms or possess
deep knowledge of statistics. However, the leader must have a very good
appreciation of various AI/ML techniques, must understand the challenges and
complexities involved in developing AI/ML models, and must also understand
how these models can solve various business problems.

7.2.3 Business

In almost all the chapters of this book, I have stressed the importance of the data
and analytics team possessing good business knowledge. The leader of this team
must have an extremely good understanding of the business of the enterprise. Let
me share an interesting experience in this regard.

While defining enterprise data and analytics strategy for a Fortune 500 company, I had multiple
discussions with its data and analytics leader on the various elements of the strategy. One
such discussion was on how the data and analytics organization should be structured. I had
recommended having a business tower as part of the data and analytics organization (as per the
best practice that I discussed in Chap. 4). However, the leader said that he did not want to
spend a lot of time understanding the complexities of the business, as he had enough technology
challenges on his plate to manage on a day-to-day basis. He just wanted to focus on
solving technological complexities, because that is what he was good at and that is where
his passion lay. While I tried to argue against this point, he was adamant.
He proposed to his bosses (the CIO and the CFO) that he should have a counterpart from
business who would understand business needs and pass on well-defined requirements to his
team in IT. His team would then do all the technical work of developing and maintaining
solutions. Essentially, he proposed that the data and analytics organization for the enterprise
should be run as a two-in-a-box model. His proposal was accepted, and the CFO identified
a senior leader from her team as the counterpart of the data and analytics leader.
The first strategic project that these two leaders jointly embarked on was an enterprise
performance management project for one of the business units. In the very first month, the project
started to get into trouble. In six months, the project was a complete failure. Both leaders
blamed each other for not understanding their respective points of view. Having spent
more than a million dollars and having lost six months of valuable time, the CFO and CIO
decided to discard the two-in-a-box model and went back to my initial recommendation of
having the business tower as part of the data and analytics organization.

The above experience reinforces the point that the data and analytics leader must take
complete responsibility for ensuring that any project executed by the data
and analytics team is aligned with business needs. The primary accountability
for delivering projects that improve the business results of the enterprise lies with the
data and analytics leader.

This brings up an important question that I have often been asked—how does one
find such a leader, who possesses both technology and business expertise, given
that most data and analytics professionals come from a strong technology
background but have limited business knowledge? In my experience, I have seen two
categories of leaders who bring a good mix of both technology and business
knowledge. I have described them below.

1. The first category of leaders is those who, during their work experience, spent
considerable time in IT but had a keen interest in the business of the enterprise.
They always asked the question—how would the solution that we are developing
help the business? It was their natural inquisitiveness that led them to develop
business knowledge. Some of them even went on to pursue a part-time or full-time
MBA degree to hone their business skills. While having an MBA degree
is not mandatory, an inquisitiveness to understand business and a passion to
solve complex business problems (by leveraging technology and data science)
are a must.
2. The second category of leaders comes from a business background, i.e., they
worked in different roles in various business functions. Over the years, they
gained considerable knowledge of how the business works across the value chain.
Additionally, they had a passion for technology and data science, due to which
they got engaged in some transformational data and analytics projects. This
passion then drove them to switch to a full-time data and analytics role.

I have seen both categories of people become successful data and analytics leaders,
provided they possess the other required soft skills, which I will discuss
next.

7.3 Soft Skills

When I was listing the various must-have skills that a data and analytics
leader must possess, the easiest task for me was to list the three hard skills.
But when it came to the soft skills, the list went through a few iterations. I did
not want to put together a laundry list of soft skills. Instead, I wanted to call out
the specific skills that are very critical for the role of data and analytics leader. I
finalized the list of six soft skills (illustrated in the figure at the beginning of this
chapter) after a lot of deliberation.
I want to highlight the fact that there exists some degree of overlap between
these six skills. For example, “dealing with ambiguity” and “innovation and risk-taking”
overlap, because each requires a certain element of the other. You cannot
be innovative if you cannot deal with ambiguity, and vice versa. However, I kept
them separate, because both are extremely important skills that a data and analytics
leader must possess. With this context, let me now talk about each of the six
“must-have” soft skills.

7.3.1 Dealing with Ambiguity

A data and analytics leader should not only be comfortable dealing with ambiguity
but should also enjoy capitalizing on the opportunities that ambiguous scenarios present.
The reason why this skill is so important is that most complex business problems
never have clear-cut solutions. This applies to data and analytics solutions as well.
While developing an analytics solution for a complex business problem, a solution
thought process starts to emerge based on the hidden patterns that the data reveals.
A data and analytics leader should be comfortable traversing a maze of possible
solution options, dealing with the underlying uncertainties, making reasonable assumptions,
trying out multiple iterations, and then arriving at an optimal solution. If the leader is not
comfortable dealing with ambiguity, her/his team will also become nervous.
This creates an atmosphere that is not conducive to problem solving.
I recall the case of a data and analytics leader who had most of the skills required
(as illustrated in the figure earlier), except the ability to deal with ambiguity. This
turned out to be the reason for his failure. Let me share the details of this case
below.

A newly appointed data and analytics leader of a global enterprise, headquartered in
New York, US, was assigned the role based on his past success. Prior to taking up the
role, he was a successful leader in the finance function and reported to the CFO. During
that time, he was tasked with implementing a highly complex financial reporting project
that required consolidating data from multiple diverse systems. He worked closely with the
IT team at the headquarters and implemented the project not only successfully but also
ahead of the target completion date. He received appreciation from the CFO and other
executives. During that period, the enterprise was looking to hire a new data and analytics
leader, since the previous one had resigned and left. Because of the success of the financial
reporting project, this finance leader was offered the role of heading data and analytics.
Since he had a passion for data, understood the technological complexities of implementing
a large data and analytics project, and had a very good understanding of the business of the
enterprise, he accepted the offer and took up the new role.
However, a few months down the line, it turned out that he could not replicate the success
of the financial reporting project in other functions such as supply chain. He had all the
hard skills required for the role of data and analytics leader – a good appreciation of
technologies, knowledge of data science, and a deep understanding of the business. He had
quite a few of the required soft skills as well. However, one of the main reasons for his
underperformance was his discomfort with ambiguity.
One reason for this discomfort was his financial accounting background. Let me explain
why. In financial accounting, every process needs to be well defined to the lowest level of
granularity. Every penny needs to reconcile in the financial statements. There cannot be
any ambiguity in any task. These are inherent characteristics of any finance and accounting
function. This leader was used to working in such a disciplined environment. Hence,
he became very uncomfortable when, say, a supply chain leader came to him with a
complex supply chain problem for which even the problem statement was not clear.
To solve ambiguous problems, one may need to adopt a less structured, fail-fast-and-learn-
fast approach. Such an approach made this leader uncomfortable. He was used to working
in a manner where both problem and solution had to be meticulously well defined.

To be fair to him, he put in sincere efforts to work as the data and analytics head. However,
it did not work out. Hence, after two years, he gave up and went back to the finance function.

7.3.2 Team Leadership

The data and analytics leader of any global enterprise needs to manage a team with
varied skill sets, from diverse cultural backgrounds, and physically spread across
various countries. To manage such a team, the leader must have strong leadership
skills.
The market demand for skilled data and analytics experts, especially those
proficient in the latest technologies, is very high. Because of this, the employee churn
rate in data and analytics is high. Many experts with good proficiency in the latest
technologies are in the younger age bracket, where the churn rate is the highest. Hence,
retaining talent is a major challenge and a test of the leadership quality of a data and
analytics leader. The leader may not have much leeway to increase the monetary
compensation of the team; hence, motivating the team through other means is very
important.
Let me briefly describe some of the key attributes that a data and analytics
leader must possess from a team leadership perspective.

• Leading by example: This is a cliché that is almost always invoked when
talking about leadership skills. Hence, I will not talk much about it. It is
enough to mention here that in all aspects of work—self-learning, communi-
cation, problem-solving approach, etc.—the leader must demonstrate qualities
that inspire subordinates to follow her/him.
• Team building and conflict resolution: With the diverse team that is typical of
data and analytics in a large enterprise, many situations of conflict will arise.
Sometimes, to avoid conflict, an easy solution is to create teams of like-minded
people. However, this solution does not work when one is trying to develop a
transformative data and analytics solution that requires out-of-the-box thinking.
Developing innovative solutions requires bringing together a set of creative
people with diverse backgrounds who can brainstorm and put forth new ideas.
Such diversity invariably leads to conflicts, and as a leader one must ensure that
conflicts are managed in a healthy manner.
The leader must invest in team-building exercises, such as workshops at a fun
location or informal sessions. During such exercises, the team can discuss
everything other than work. This helps team members get to know each other
better and develop mutual trust and respect.
Even after conducting team-building exercises, conflicts will arise, and the
leader needs to adopt various techniques to resolve them. Sometimes, bringing
more clarity to the role of each team member will resolve some of the conflicts.
At other times, the leader may have to adopt other soft techniques.

Irrespective of the situation and type of conflict, the leader must have the ability
to sense and resolve conflicts within the team. Avoiding intervention, on the
assumption that the members will resolve a conflict themselves, is an approach
that does not work.
• Delegation and empowerment: To drive the digital transformation of an enterprise,
the data and analytics leader needs to spend a lot of time on organizational change
management and other activities, for which it is important that she/he frees up
her/his time. Therefore, the leader must learn to delegate a few key tasks and
decisions to her/his team.
Delegation and empowerment also convey to the team that their leader trusts
them. This induces a greater sense of responsibility and motivates them.
Subordinates start feeling that they are treated as partners instead of order-takers,
which encourages them to come up with new ideas and suggestions.
While delegating has many advantages, it is important that it is done prudently.
There are four key aspects that the leader must take care of. First, while delegating,
it is important to evaluate the capability of the person to whom a task or decision
is being delegated. Second, the person must be provided with the necessary
resources. Third, the leader must regularly monitor the person to whom a task is
delegated and provide prompt feedback. Fourth, the leader must not blame the
person if anything goes wrong. Accountability should always remain with the
leader: one can delegate a task, but not accountability. When a review happens
with senior executives, the leader should not pass on blame for any failures or
shortcomings to anyone on the team.
• Performance management of the team: Performance management of employees
is important from a career growth and motivation perspective. There are six key
aspects of team performance management that the leader needs to take care of.
First, it is important for the leader to understand the capability of each individual
and assign tasks accordingly. True capability does not necessarily depend on
the number of years of experience a person has. I have often seen young data
scientists develop very complex algorithms. It may take a leader some time to
understand the true capability or potential of an individual, but the leader must
put in the effort to understand it.
Second, goals for an individual should be set keeping in mind both the person’s
capability and her/his aspirations. Unless the goals are aligned with a person’s
aspirations, she/he will not be motivated to work.
Third, goals and expectations should be discussed with the person one-on-one.
Such discussions help avoid communication gaps and give both parties an
opportunity to understand each other better.
Fourth, progress on goals should be reviewed periodically and feedback provided.
Providing feedback on areas of improvement should not be a once-a-year
activity. Rather, it should be provided regularly and, sometimes, on specific
transactions, where the leader believes that the example from the transaction
can help the person take immediate corrective measures. Further, feedback
should not be delivered in a fault-finding style but in a constructive style that
the subordinate will appreciate. Feedback should be based on facts and should
cover both the positive and negative aspects.
Fifth, a leader should help and guide the subordinate in developing a person-
alized competency development plan. The leader must coach and mentor the
subordinate to help create a successful career path for her/him.
Sixth, the final key focus area from a performance management perspective is the
sensitive aspect of punishment. Punishment should be a last resort and should
be avoided as far as possible. However, it is sometimes required to correct
adverse behavior. When administering any punishment, it is extremely important
that it be done to improve behavior and not as a measure of retribution. If a
punishment leads to emotional side effects, it will not help the leader. Hence,
the manner and style of punishment are very critical.
• Stress management: When a leader tries to do something transformational for
an enterprise (using data and analytics), it entails, inter alia, doing things
differently, experimenting a lot, solving problems through trial and error, man-
aging change, and so on. All this creates a lot of stress for the leader and the
team. The leader must, therefore, be adept at dealing with stress. A high stress
level in the leader will invariably cascade down to the team, creating an
atmosphere that is not conducive to achieving the goals the leader aspires to.
To manage stress, the leader must take three actions.
First, the leader must monitor and analyze the stress levels of self and team. This
helps her/him understand the root cause of stress and take corrective actions
accordingly.
Second, she/he should adopt a healthy lifestyle, including a proper diet,
exercise, yoga, and meditation. This allows both the body and the mind to
release stress.
Third, she/he should develop good relationships at work and on the social front;
this helps reduce stress. To establish good relationships, the leader must have
good interpersonal skills and an honest approach.

7.3.3 Innovation and Risk Taking

Throughout this book, I have highlighted the importance of thinking out-of-the-box
to come up with solutions that solve complex business problems and prepare an
enterprise for the future. To make this happen, the data and analytics leader must be
very innovative and possess a risk-taking quality. The leader should also encourage
innovation and risk-taking behavior in her/his team. While trying to do this, the leader
must be able to manage dilemmas and complexities such as the ones mentioned
below.

• To govern data as a strategic asset, the leader needs to impose a good amount of
governance around data, especially on who can access what data. However, to
encourage innovation (especially co-innovation with business stakeholders),
a good amount of leeway to access various types of data needs to be provided to
both the data and analytics team and the business stakeholders. Unless they can
access various types of data, they will not be able to discover unknown relation-
ships between them. For example, to understand how product quality is linked
with absenteeism of employees, one needs to analyze the correlation between the
two data sets. Hence, the leader must lay down exceptions under which extraordinary
access to data can be provided. Further, she/he should also define governance
around such exceptions (such as the duration of access and what to do with the data
once experimentation is complete), so that there is no data security risk to the
enterprise. The leader must have a knack for such balancing acts (balancing
governance against freedom of access) without adversely impacting
security.
• To encourage the development of innovative solutions in data and analytics, it is
imperative to test various business hypotheses using data. In my experience, the 80–20
rule applies in hypothesis testing, i.e., approximately 80% of hypotheses do
not turn out to be correct. With the remaining 20% (those that do turn out to be correct),
one can develop some innovative, high-business-value solutions. However,
it takes time to test various hypotheses and prioritize the most important ones.
The challenge in most enterprises is that the data and analytics leader is always
under pressure from the executives to deliver quick results, and accord-
ingly sets tight, result-oriented goals for her/his team. However, if the team is
punished when satisfactory results do not come on time, they will not test all
hypotheses. Instead, they will pick some quick-to-deliver, low-risk solu-
tions for development, even if these solutions do not have the potential to deliver
high business value. The data and analytics leader should, therefore, have the capa-
city to absorb pressure from the executives without passing it on to her/his
team. The leader should also know when not to punish failure. In fact, if some-
one dares to try something out-of-the-box but does not get a favorable outcome,
the leader should appreciate and reward the person for having the courage to
take the risk.
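The hypothesis-screening idea above can be sketched in a few lines of code. The sketch below is illustrative only: all metric names and values are hypothetical, and real hypothesis testing would use far more data and proper significance checks. It simply computes the Pearson correlation between an outcome metric (a monthly product defect rate) and each candidate driver, and keeps only the hypotheses the data supports:

```python
# Illustrative sketch of screening business hypotheses against data.
# All names and values below are hypothetical, made up for this example.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length number lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

# Outcome metric: monthly product defect rate (%), six months of data.
defect_rate = [2.1, 2.4, 3.0, 2.2, 3.5, 2.0]

# One candidate driver per hypothesis (same six months).
candidate_drivers = {
    "absenteeism_pct": [4.0, 4.5, 6.0, 4.2, 6.8, 3.9],  # moves with defects
    "overtime_hours": [10, 12, 11, 13, 10, 12],          # essentially noise
}

# Keep only hypotheses with a strong correlation (threshold is arbitrary).
supported = {
    name: round(pearson(values, defect_rate), 2)
    for name, values in candidate_drivers.items()
    if abs(pearson(values, defect_rate)) >= 0.7
}
print(supported)  # only the absenteeism hypothesis survives the screen
```

With these made-up numbers, only one of the two hypotheses survives, mirroring the point that most hypotheses are discarded and a few strong ones remain for solution development.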

7.3.4 Organizational Change Management

Amongst all the skills illustrated in the figure at the beginning of this chapter,
I gave the highest weightage to “organizational change management”
(based on the relative size of its annular sector compared to those of the others).
This is the most important skill that a data and analytics leader must possess.
Organizational change management is also one of the five elements of data and
analytics strategy; accordingly, I dedicated an entire chapter of this book to it.
To define and implement any data and analytics strategy, the most difficult task
for the leader is to drive organizational change management. Driving change within
the data and analytics team is in itself a difficult task. However, this challenge is
nothing compared to driving change across the enterprise, where the leader does not
have direct command and control over people outside her/his team. That is where the
leader’s change management skills are really tested. Despite doing everything else
right, if the leader is not able to drive change, all the efforts and top management
investments will go in vain.
The following are some of the key qualities that the leader must possess to drive
change within her/his team as well as across the enterprise.

• Communication: I discussed this point in detail in the chapter on “Organiza-
tional Change Management”. The leader must possess a very good knack for all
five aspects that I talked about in that chapter, viz. “what to communicate”,
“who should communicate”, “when to communicate”, “how to communicate”,
and the “3Cs of communication—clear, concise, and compelling”.
• Listening: An important attribute that the leader must possess is the ability to
listen actively (as opposed to passively). An active listener is interested in
understanding the problem of a speaker and is completely focused on hearing
and asking questions to understand the problem in detail. A sincere effort to
listen to and understand the speaker is important. To give an example of insincerity:
during meetings I have sometimes seen people not putting their cell phones on
do-not-disturb mode. If they receive an incoming call, they take it, with the
excuse that it is very urgent. This conveys the message that the caller is, at that
point in time, more important to the leader than the speaker and her/his problems.
• Assertiveness: The data and analytics leader needs to be assertive without getting
aggressive. For the type of business problems that the leader needs to solve,
there are no easy solutions, and proposed solutions will often be critically
reviewed by business leaders. If the data and analytics leader is convinced that
a proposed solution is the best possible one under the circumstances, it is impor-
tant that she/he defends it with arguments. The leader should do this
diplomatically but assertively and should not get irritated if the discussion
does not go as planned. The leader should address all the concerns raised
by the business leaders with an honest approach and an open mind.
• Building relationships and gaining trust: The data and analytics leader must be
very good at building relationships with all stakeholders across the enterprise:
business leaders, IT leaders, peers, and subordinates. Such relationships create
an atmosphere of collaboration and help in co-innovating to develop
out-of-the-box solutions. To build strong relationships, the leader
must proactively invest a lot of time and effort. Unless the leader enjoys building
strong relationships, she/he will not be successful. The leader must also be able
to inspire trust in the various stakeholders across the enterprise. If the
stakeholders do not trust the leader, they will not be willing to invest their
time in working with her/him.
• Energy and passion: Finally, the data and analytics leader must have the high
energy and passion required to drive change across a large enterprise.
The leader needs to work against considerable odds. I have already discussed
many challenges and roadblocks in this book: new technologies, mindset change,
new ways of doing business, different stakeholders with varying priorities, and so
on. It requires great perseverance to overcome all odds and keep driving change,
and one needs both a high amount of energy and a passion for data and analytics
to do that.

7.3.5 Design Thinking and Empathy

Design thinking is a “human-centric” approach to solving problems. It involves
adopting an empathetic approach towards understanding the needs of a person or
customer first. In recent years, there has been a lot of focus on adopting the design
thinking approach in various enterprises, not only for data and analytics projects but
also for other business and IT projects. Many universities have started teaching
design thinking as a formal course. Many enterprises have also started training
their employees in the design thinking approach to problem solving.
In data and analytics, the design thinking approach is extremely useful. In the
chapter on “organizational change management”, while explaining “storytelling
with data”, I described an example of an approach that started with under-
standing a typical day in the life of a supply chain manager and then developing
a solution to make the manager’s life easy. I mentioned demonstrating
to the manager (using a data visualization tool) how the solution could help her/him
view fleet movements on a geographical map, locate specific shipments,
combine such visuals with weather data on the same map to identify links in the
supply chain that could potentially be disrupted by bad weather, and propose
alternate routes to mitigate supply chain disruption risk. Adopting such an
approach involves empathizing with the day-to-day challenges of the busi-
ness stakeholders with whom the data and analytics team works. The approach
involves experimentation, prototyping, testing, and trying out new ideas jointly
and iteratively. It helps both sides understand each other better and solve
complex, ambiguous problems effectively.
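The scenario described above lends itself to quick, iterative prototyping, which is exactly what design thinking encourages. The sketch below is a toy version of such a prototype, with entirely hypothetical shipment and weather data: it joins in-transit shipments to a simplified weather feed by region and flags the shipments at risk of disruption, which a planner could then reroute.

```python
# Toy prototype of the supply chain scenario: flag shipments whose region
# has severe weather. All IDs, routes, and severity values are hypothetical.

shipments = [
    {"id": "SH-01", "route": "Chicago -> Dallas", "region": "Midwest"},
    {"id": "SH-02", "route": "Newark -> Atlanta", "region": "East"},
    {"id": "SH-03", "route": "Denver -> Phoenix", "region": "West"},
]

# Simplified weather feed: severity per region on a 0-5 scale.
weather_severity = {"Midwest": 4, "East": 1, "West": 2}

SEVERITY_THRESHOLD = 3  # flag anything at or above this level

at_risk = [
    s["id"]
    for s in shipments
    if weather_severity.get(s["region"], 0) >= SEVERITY_THRESHOLD
]
print(at_risk)  # shipments a planner should consider rerouting
```

A rough, disposable sketch like this is often enough to show a supply chain manager the shape of the eventual solution and gather feedback before any serious engineering effort is invested.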
The data and analytics leader must have an empathetic approach to problem
solving. The leader must always put herself/himself in the other person’s shoes and
empathize. Even when a solution developed by her/his team turns out to be sub-
optimal, the leader should not become defensive. Instead, she/he should try to
understand how the solution failed to achieve its intended objective.
While the design thinking approach can be taught, unless the leader has an inherent
trait of empathizing with her/his stakeholders, she/he cannot be successful. And if
the leader does not have this trait, her/his team may not adopt an empathetic
approach either.

7.3.6 Marketing

The last soft skill that a data and analytics leader must possess is marketing. One
may wonder why marketing is such an important skill for a data and analytics
leader. The reason is that, to increase the adoption of data and analytics across an
enterprise, the leader must generate excitement about how data and analytics can
completely change the game for the business. Without a focused marketing
and credibility-building exercise, it will be difficult for the leader to help the
enterprise move up the data and analytics maturity path.
Let me explain this point further by describing how data and analytics adoption
evolves in a large enterprise. Broadly, there are three stages of data and analytics
maturity within an enterprise, as described in the following paragraphs. Marketing
plays no role in the first stage, a limited role in the second stage, and a very impor-
tant role in the third stage. Overmarketing during stage 2 and undermarketing in
stage 3 can have an adverse effect on maturity evolution.

Stage 1: An enterprise with very low data and analytics maturity normally does not have
an enterprise-level program on data and analytics. Each function within each business unit
normally develops solutions in silos to meet its day-to-day information needs.
Stage 2: When a push comes from the top management to move up the maturity path and drive
data and analytics more as a program (either at the enterprise level or at the level of an
individual business unit or geographical region), investments are made in, inter alia, hiring
new people and adopting new technologies. This is often the stage at which a data and analytics
leader is hired. It is also at this stage that enterprises define either a tactical (purely
technology-focused) or a strategic (business-vision-aligned) approach to data and analytics.
Irrespective of the approach adopted at stage 2, the first customers of the initia-
tive are typically business functions such as finance or sales, who need standard reports
comprising financial data or sales and order backlog data, consolidated across various busi-
ness units/regions. Business stakeholders are happy to slice and dice the data to derive
better insights than they were getting earlier. At this stage, the challenges for the data
and analytics team are more around data and systems complexity, i.e., collecting and harmo-
nizing structured data from various disparate systems into a common data platform. Once
a reasonable quantity and quality of data are available in the common data platform, business
stakeholders reach out to the central data and analytics team with requests for data to meet
their key information needs.
At this stage, the data and analytics leader needs to start marketing the capabilities and suc-
cesses of her/his team, so that more business stakeholders start coming to the common data
platform for their needs. Here, marketing is more about creating awareness, to onboard more
internal customers.
Stage 3: While stage 2 is all about meeting basic information needs, made possible by
having integrated data at the enterprise level, stage 3 is more aspirational in nature.
Enterprises move to stage 3 when they want to transform their business by leveraging data
and analytics. They start seeing how data and analytics can create substantial business value.
However, at this stage business stakeholders are choosier. They are free to decide whether
to drive the transformational initiatives themselves instead of leveraging the data
and analytics team. They often wonder whether the team would understand their complex
needs. They face the dilemma of whether to simply request a data dump from the data
and analytics team (and use it to develop their own innovative solutions, by hiring
a few IT experts from the market) or to work together with the data and analytics team
to co-innovate and develop solutions.
In the former case (i.e., developing transformational solutions on their own), the business
stakeholders have more flexibility to develop, change, and manage the solutions. In the
latter case (i.e., working with the data and analytics team), the business stakehold-
ers have to first explain the business problem to the team and then follow the governance
and controls the team imposes on the solutions. So, unless the stakeholders see value in the
data and analytics team, they will not partner with it. Earlier in this book, I discussed
the cons of the former approach. One such con is that, for certain complex busi-
ness problems, the business stakeholders may not even be aware that their problems can be
solved by leveraging certain new technologies. They may miss out on the expertise of a
central team.
To address this dilemma of the business stakeholders, the data and analytics leader
must establish initial credibility by co-innovating to solve some complex business problems.
But for the business stakeholders to be ready to co-innovate with the leader,
they need to trust the capability of the leader and her/his team. This is a classic catch-
22 situation. The only way to resolve it is by executing a couple of highly successful
pilot projects that help establish the credibility of the data and analytics leader. These pilot
projects can be done jointly with one or two business stakeholders. If required, the leader
should fund these projects from her/his own budget. Once the leader has
these early successes, she/he needs to market them aggressively across the enterprise,
so that other business stakeholders (across various functions, business units, or geographies)
also start reaching out to the leader to work together. While word of mouth will ensure a cer-
tain amount of marketing, a focused marketing plan is important to publicize the achievements
of the data and analytics team. This can help increase the adoption of data and analytics at
stage 3 and move the enterprise up the maturity path.

7.4 Summary

The data and analytics leader (Chief Data Officer, Chief Analytics Officer, Chief
Data and Analytics Officer, Vice President—Data and Analytics, Director—Data
and Analytics, or whatever other title the leader goes by) of a large, global
enterprise must possess three hard skills—(1) technology proficiency, (2)
data science expertise, and (3) business knowledge—in addition to six soft skills—
(1) dealing with ambiguity, (2) team leadership, (3) innovation and risk taking,
(4) organizational change management, (5) design thinking and empathy, and (6)
marketing. While hiring a data and analytics leader, enterprises often make the
mistake of focusing only on the hard skills, such as technology expertise. However,
the soft skills are more important than the hard skills. If the leader does not possess
all the required soft skills, data and analytics adoption and maturity in the enterprise
may remain low, despite large investments of time and money.
