Tuesday 16 February 2016
A Little Side Step into Norway: BI 2016 Forecast
The leading Norwegian blog on Business Intelligence published my view on BI in 2016.
Check the prognosis here and get back to me at the end of the year to see whether my crystal ball was clear on at least a few issues.
Sunday 22 November 2015
Book Review: Business Analysis, 3rd Edition, edited by Debra Paul, James Cadle and Donald Yeates
Preamble: the island and the continental species
When BCS, the Chartered Institute for IT, deems
a book worth publishing, it is certainly worth reviewing from a continental
point of view. Why? Because experience shows that the UK business analyst does not have exactly the same profile as the variety on the
mainland.
On the British Isles, a business analyst covers
a much wider scope: “One of the most important aspects of a business analysis
project is to decide what the focus is and which areas need to be investigated.
For example, on some projects the focus may be to explore possible improvements
on how part of the organization works. In this case, we might begin by
examining all of the current working practices, including the staffing and job
roles, and the work may focus on analysing and evaluating the options for the
future business system. Another project may focus on the IT system needs and
whilst understanding the situation and all of the stakeholder perspectives is
important, the potential for the use of IT to improve the business system will
dominate the analysis.” (p. 59)
Clearly, the island species covers a far
broader scope than the continental one. Of the hundreds of business analysts I
have met on projects, in training courses and seminars, ninety percent come
from an IT background. In the application or OLTP world,
I have met with dozens of ex-developers who became functional analysts and
expanded their horizon towards business analysis. In the OLAP or analytics
world, there is a dominant share of DBAs who became business analysts. Suddenly I
realise that I am more of an island species as I evolved from sales, marketing
and finance into business analysis and studied computer science to make sure I
can communicate with the designers and developers.
A comprehensive introduction
The editors
take you on a journey through the analysis practice, defining the concept, the
competencies and introducing strategy analysis, business analysis as a process,
touching the investigation techniques and introducing stakeholder analysis.
After modelling the business process, defining the solution and making the
business and financial case, the requirements are discussed as well as a brief
introduction to modelling the requirements and delivering the requirements and
the business solution.
Delivering
this body of knowledge in fourteen chapters on 280 pages indicates this book is
a foundation for practitioners.
Models, models and… models
The 280-page book is packed with models: 112 of them are illustrated and explained, as well as integrated in a logical process flow of the business analysis practice.
In that
sense, the foreword of president of the IIBA UK Chapter, Adrian Reed, hits the
spot when he calls it “an extremely useful resource that will be referenced by new
and experienced practitioners alike”.
Novice
analysts can use this book as an introduction to the business analysis practice
in the broadest sense while experienced business analysts will consider it a
valuable placeholder for useful frameworks, concepts and material for further
study. The Reference and Further Reading sections at the end of each chapter
contain extremely useful material. With
regard to “further reading”, there is a
caveat I need to share with you. It’s not about the book itself but more about
models in general.
A caveat about models
Let me tell
you a little story from my marketing practice to illustrate my point.
A very
familiar model in portfolio management is the Boston Consulting Group’s growth-share matrix. It is used on a strategic level
to analyse business units and in the marketing practice, the product portfolio
is often represented and analysed via this model.
For those
not familiar with the model, here’s a little reference to the theory: https://en.wikipedia.org/wiki/Growth%E2%80%93share_matrix
When I
worked for a multinational FMCG company I discovered what I called “Cinderella
brands”. These were brands with a small market share, low growth and considered
a dead end street for the marketer’s career. You could find product managers
with little ambition in that position, fixing up and manoeuvring to keep the
brand afloat while people higher up in the organization were waiting for the
right moment to axe the brand. I managed to convince the people with the axe
that an appropriate marketing approach could not just save the brand but grow it
into a profitable niche product, sometimes contributing more than their
so-called cash cows. We built the business case on processed cheese with a
budget on a shoestring and proved our point that a model can never take over
from thorough analysis and critical thinking. After that, nobody mentioned
“dogs” anymore; “Cinderella” became the household name for forgotten brands
with unrealized potential. (And we got much more business from the
multinational.)
The
illustration below from an academic author shows exactly what can go wrong when
models take over from scrutiny and
critical thinking.
These are
the questions to ask when you look at a growth-share matrix:
- Who says cash cows don’t need substantial investment to maintain their dominant market share and keep up with market growth? Ask Nokia if you doubt it.
- Who says dogs need to have a negative cash flow? Sure, if your marketing spend is based on the same mental models as those for stars and cows you will be right, but guerrilla marketing techniques may prove the opposite.
- Who says stars’ growth will continue for eternity? Ever read “Crossing the Chasm” by Geoffrey Moore? Especially in high-tech marketing, novelties may appeal only to the techies and never reach the mainstream market…
In fact,
question marks are in the only quadrant in the above model where some form of
nuance can be observed… Notice the
expression “analyse … whether…”
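For what it is worth, the quadrant logic itself fits in a few lines. The cutoffs below (relative share of 1.0, growth of 10%) are illustrative assumptions of mine, and the questions above show exactly why the labels deserve scrutiny rather than blind trust:

```python
def bcg_quadrant(relative_share: float, market_growth: float,
                 share_cutoff: float = 1.0, growth_cutoff: float = 0.10) -> str:
    """Classify a product in the growth-share matrix (illustrative cutoffs)."""
    high_share = relative_share >= share_cutoff
    high_growth = market_growth >= growth_cutoff
    if high_share:
        return "star" if high_growth else "cash cow"
    return "question mark" if high_growth else "dog"

# A "Cinderella" brand: the model says "dog", the analysis said "profitable niche".
print(bcg_quadrant(relative_share=0.3, market_growth=0.02))  # dog
```

The point of the anecdote above is precisely that this mechanical label is where the analysis should start, not where it should end.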
In
conclusion: follow the editors’ further
reading advice. It will help you to become a mature business analyst providing
your customers not only the “know what” and some of the “know how” as described
in the book, but also the “know why”. Wisdom may be harder to quantify but its
value is beyond doubt in the business analysis practice. By the way, from the
same editors, I recommend “Business Analysis Techniques” to increase your know-how.
Regular updates needed
The
business analysis practice evolves rapidly and the only criticism I can come up
with is the lack of an accompanying website with extra updates and reference
material. Let me add at least two of them: benefit maps and the Business Model Canvas are very much part of the business analysis practice today.
To conclude,
all you continental business analysts out there, buy the book and increase your
knowledge by an order of magnitude.
Available at http://shop.bcs.org, paperback ISBN: 978-1-78017-277-4
Friday 24 July 2015
The Future of Information Systems: Design from the Data
This third
post in a series of three on BI programme management looks at a new way of
designing systems for both transaction and decision support to improve
the organisation’s effectiveness further. I will examine the concept of BI
architecture further and give hints of how BI programme management can evolve
towards an ideal architecture which merges transaction and decision support
systems in a powerful ensemble, ready for the new economic challenges.
I propose
an “Idealtyp” knowing that no existing organisation can achieve this in less
than a decade for reasons like sunk cost fallacies, the dialectics of progress
and simply resistance to change.
But new
organisations and innovators who can make the change will notice that the
rewards of this approach are immense. They will combine architectural rigour
with business agility and improve their competitive power by an order of
magnitude.
Why a BI Architecture is Necessary
I am a fan
of Max Weber’s definition of “Idealtyp”[i],
which has direct links with architecture in information technology. BI
architecture is an abstraction of reality, and as such an instrument to better
understand a complex organisation of hardware, network topologies, software, data
objects, business processes, key people and organisational units. All these
components interact in what appears to outsiders to be a chaotic way. An
architectural framework brings order to the chaos and provides meaning to all
the contributors to the system.
Architecture
is used as a benchmark, a “to be” situation against which the present state
can be measured. It is a crisper and more manageable concept than CMM-like
models, which sometimes express maturity in rather esoteric terms. For a quick
scan, those will do, but for in-depth management of the above-mentioned BI assets,
an architectural framework is better for BI environments.
| CMM Level | BI symptoms | Principal risks |
| --- | --- | --- |
| Initial | A serious case of “spreadsheetitis”: every decision maker has his own set of spreadsheet files to support him in his battles with the other owners of spreadsheets. Everyday tugs of war over who has the correct figures. | Your project may never take off because of political infighting, and if it does, there will be a pressing need for change management of the highest quality and huge efforts will have to be invested in adoption tracks. |
| Repeatable | The organisation uses some form of project management, in most cases inherited or even a carbon copy of systems or application development. | The project management method may be totally inadequate for a BI project, leading to expensive rework and potential project failure if everybody sticks to his position. |
| Defined | The organisation has a standard procedure for the production of certified reports. These can connect with one or more source systems in a standardised way: direct connection to the source tables, import of flat files, or some form of a data warehouse. | Resistance to change. This depends on the way the organisation has implemented the data warehouse concept and how reversible the previous efforts are in a migration scenario. |
| Managed | The development processes are standardised and monitored using key performance indicators and a PDCA cycle. | The iterative and explorative approach of BI project management may frighten the waterfall and RAD fans in the organisation. Make sure you communicate well about the specifics of a BI development track. |
| Optimising | The development processes only need fine-tuning. | Analysis paralysis and infighting over details may hamper the project’s progress. |

Table 2 Example of the BI version of the Capability Maturity Model as described in Business Analysis for Business Intelligence on page 202. In the book, it is positioned as a tool to help the BA with identifying broad project management issues.
Why this "Idealtyp" is not Easy to Achieve
Proposing an
ideal BI architecture is one thing, achieving it, another. I will only mention
three serious roadblocks on the path towards this ideal BI architecture that
unifies transaction systems and decision support systems: the sunk cost
fallacy, the dialectics of progress and resistance to change.
The sunk cost fallacy is a powerful driver
in maintaining the status quo; organisations suffering from this irrational
behaviour consider they have invested so much effort, money, hardware,
training, user acceptance and other irretrievable costs that they should
continue to throw good money after bad. And sometimes the problem is compounded when
the costs were spent on technology from market leaders.
No one ever got fired for buying…
(fill in any market leader’s name)
No matter what industry you look at, market leaders fulfil their basic marketing promise: provide stability, predictable behaviour and a very high degree of CYA (google it) to the buyer. But that doesn’t mean the purchase decision is the best possible decision for future use. Market leaders in IT are also very keen on “providing” vendor lock-in, disallowing the client to adapt to changing requirements.
As a
footnote: today, buyers look more at the market cap or the private
equity backing of the Big Data technology providers than at their actual technical
performance and their fit with the organisation’s requirements. Yes, people
keep making the same mistakes over and over…
At the
other end of the spectrum are the dialectics
of progress: this law was discovered
by the Dutch journalist Jan Romein who noticed that gas lights were still used
in London when other European capitals already used electricity. This law suggests (and I quote an article on
Wikipedia) that making progress in a particular area often creates circumstances
in which stimuli are lacking to strive for further progress. This results in
the individual or group that started out ahead eventually being overtaken by
others. In the terminology of the law, the head start, initially an advantage,
subsequently becomes a handicap.
An explanation for why the phenomenon occurs is
that when a society dedicates itself to certain standards, and those standards
change, it is harder for them to adapt. Conversely, a society that has not
committed itself yet will not have this problem. Thus, a society that at one
point has a head start over other societies, may, at a later time, be stuck
with obsolete technology or ideas that get in the way of further progress. One
consequence of this is that what is considered to be the state of the art in a
certain field can be seen as "jumping" from place to place, as each
leader soon becomes a victim of the handicap.
(From: https://en.wikipedia.org/wiki/Law_of_the_handicap_of_a_head_start)
As always, resistance to change plays its role.
New tools and new architectures require new skills to be trained, new ways of
working to adopt and if one human species has trouble adapting to new
technologies it is… the tech people. I can produce COBOL programmers who will
explain to you that COBOL is good enough for object oriented programming or IMS
specialists who see nothing new in the Big Data phenomenon…
What is BI Architecture?
Here’s architecture
explained in an image. Imagine Christopher Wren had had modern
building technologies at his disposal. Then either the cathedral, based on the architecture “as
is”, would have looked completely different, with higher arches, bigger windows,
etc. Or the architecture could have evolved as modern technology
influenced Wren’s vision of buildings.
Exactly
this is what happens in BI architecture and BI programme management.
Figure 5
On the left: architecture, right: a realisation of architecture as illustrated
by Wren’s Saint-Paul’s Cathedral
Architecture
descriptions are formal descriptions of an information system, organised in a
way that:
- supports reasoning about the structural and behavioural properties of the system and its evolution;
- defines the components or building blocks that make up the overall information system;
- provides a plan from which products can be procured and subsystems developed, so that they work together to implement the overall system;
- enables you to manage your overall IT investment in a way that meets the needs of your business.
It is also
the interaction between structure, which is requirements based, and principles
applicable to any component of the structure.
What is the Function of BI Architecture?
BI Architecture
should reflect how the BI requirements are realized by services, processes, and
software applications in the day-to-day operations. Therefore, the quality of
the architecture is largely determined by the ability to capture and analyse
the relevant goals and requirements, the extent to which they can be realized
by the architecture, and the ease with which goals and requirements can be
changed.
Figure 6
The Open Group Architecture Framework puts requirements management at the
centre of the lifecycle management. The connection with business analysis for
business intelligence is obvious.
Reality Check: the Two Worlds of Doing and Thinking
Now we have
established a common view on BI architecture and programme management, it is
time to address the murky reality of everyday practice.
Although
Frederick Taylor and Henri Fayol’s ideas of separation between doing and
thinking have been proven inadequate for modern organisations, our information
systems still reflect these early 20th Century paradigms. You have the
transaction systems where the scope is simply: execute one step after another
in one business process and make sure you comply with the requirements of the
system. This is the world of doing and not thinking. Separated from the world
of doing is the world of thinking and not doing: decision support systems. The
business looks at reports, cubes and analytical results extracted from
transaction and external data and then makes decisions which the doers can
execute.
What if the
new economy were changing all this at a rapid pace? What if doing and thinking
came together in one flow? That’s exactly what the Internet is creating, and I
am afraid the majority of organisations are simply not ready for this
(r)evolution. Already in 1999, Bill Gates and Collins Hemingway[ii]
wrote about empowering people in the digital age when they gave us the
following business lessons:
- The more line workers understand the inner workings of production systems, the more intelligently they can run those systems.
- Real-time data on production systems enables you to schedule maintenance before something breaks.
- Tying compensation to improved quality will work only with real-time feedback of quality problems.
- Task workers will go away. Their jobs will be automated or combined into bigger tasks requiring knowledge work.
- Look into how portable devices and wireless networks can extend your information systems into the factory, warehouse and other areas.
I am afraid
this advice still needs implementation in many organisations. The good news is
that contemporary technologies can support the integration of doing and
thinking. But it will require new architectures, new organisational and
technological skills to reap maximum benefits from the technology.
Bringing IT Together: Design from the Data
What if we
considered business processes as something that can change in 24 hours if the
customer or the supplier wants it? Or if competitive pressure forces us to
change the process? What if information systems had no problem
supporting changing business processes because the true cornerstone, surviving
any business process is data? This could be a real game changer for industries
that still consider data as a product of a business process instead of the
objective of that process.
The schema
below describes a generic architecture integrating transaction and decision
support systems in one architectural vision. Let’s read it from left to right.
Any
organisation has a number of business drivers, for example as described by
Michael Porter’s generic strategies: be the cost leader, differentiate from the
competition or focus on a niche. Parallel with the business drivers are
decision making motives such as: “I want complete customer and product insight”
and finally, the less concrete but very present knowledge discovery driver to
make sure organisations are always on the lookout for unpredictable changes in
the competitive environment. These three drivers define a number of business
objects, both static and dynamic. These entities can be endogenous to the
organisation (like customer, channel, product, etc.) or external,
like weather data, currency data, etc. These business objects need to be
translated into data objects suitable for transaction and decision support.
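As a sketch of what “design from the data” could look like in practice, a single business object definition can drive both the transactional record (the “doing” side) and the decision-support view (the “thinking” side). All names and fields below are my own illustrative assumptions, not a reference implementation:

```python
from dataclasses import dataclass

@dataclass
class BusinessObject:
    name: str
    attributes: list          # endogenous facts (customer, channel, product, ...)
    external_feeds: list      # exogenous data (weather, currency rates, ...)

def transactional_schema(obj: BusinessObject) -> dict:
    """Columns needed to execute the business process."""
    return {"table": obj.name.lower(), "columns": obj.attributes + ["updated_at"]}

def analytical_schema(obj: BusinessObject) -> dict:
    """The same object, enriched for decision support (history + external data)."""
    return {
        "view": f"dim_{obj.name.lower()}",
        "columns": obj.attributes + obj.external_feeds + ["valid_from", "valid_to"],
    }

customer = BusinessObject("Customer", ["id", "segment", "channel"], ["currency_rate"])
print(transactional_schema(customer)["table"])  # customer
print(analytical_schema(customer)["view"])      # dim_customer
```

The design choice this illustrates: when a business process changes, only the functions deriving the schemas change, while the business object, the true cornerstone, survives.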
Conclusion:
an integrated view on transactions and decision making will improve BI
programme management supported by this architectural vision. The major and most
relevant BI programme management decision criterion will be the answer to the
question: “Which quality data yield the highest return in terms of competitive advantage?”
And thus, which project (whether on the transaction or the decision support side)
needs the highest priority in the allocation of resources?
[i]
According to the excellent website http://plato.stanford.edu/entries/weber/ this is the best description of Max Weber’s
definition:
“The methodology of “ideal type” (Idealtypus) is
another testimony to such a broadly ethical intention of Weber. According to
Weber's definition, “an ideal type is formed by the one-sided accentuation of
one or more points of view” according to which “concrete individual phenomena
… are arranged into a unified analytical construct” (Gedankenbild); in
its purely fictional nature, it is a methodological “utopia [that] cannot be
found empirically anywhere in reality”. Keenly aware of
its fictional nature, the ideal type never seeks to claim its validity in terms
of a reproduction of or a correspondence with reality. Its validity can be
ascertained only in terms of adequacy, which is too conveniently ignored by the
proponents of positivism. This does not mean, however, that objectivity,
limited as it is, can be gained by “weighing the various evaluations against
one another and making a ‘statesman-like’ compromise among them”, which is often proposed as a solution by those sharing Weber's
kind of methodological perspectivism. Such a practice, which Weber calls
“syncretism,” is not only impossible but also unethical, for it avoids “the
practical duty to stand up for our own ideals”.”
What is less known is that Weber used the concept also
in decision making theory when he analysed the outcome of the Battle of
Königgrätz, where Von Moltke defeated the Austrian-Bavarian coalition against
Prussia and its allies in 1866, an important phase in the unification of Germany.
[ii] “Business at the Speed of Thought”, Bill Gates and Collins Hemingway, Penguin Books, London, 1999, pp. 293-294
Tuesday 7 July 2015
The Eternal Business Intelligence Conundrum
Finding
an Optimum between a Manageable BI Architecture and Business Agility
This is the
second post in a series of three about programme management in Business
Intelligence (BI). In the previous post we positioned project- and programme
management in BI and the latter’s relationship with BI architecture.
In this
post, we discuss the universal and eternal problem, conflict, dialectics… (call
it what you want) between the business, who want a decision support solution
here and now, no matter what the consequences for the IT department are, and the
IT guys, who want to steer the team and the infrastructure into calm waters.
“Calm waters” meaning a strict architectural, TOGAF based approach to managing
the BI assets.
Head for the Cloud!
I won’t
describe the situations where the IT guys –according to the business- waste
time with the introduction of new tools and the business strike a direct deal
with a vendor, using the tool completely outside the managed environment.
Needless to
say, many cloud based solutions offer a solution with a small IT footprint:
all the business needs is a browser. Well, that’s what the business thinks.
Issues like data quality, data governance and data security are not always
handled according to corporate standards, and legislation on data privacy and
data security becomes stricter and more repressive every few years.
What Business Stakeholders Need
As I
pointed out in the section “Managing Strategy” (Business Analysis for Business
Intelligence p. 66 – 71) business stakeholders need decision support for their
intended strategy as well as emergent strategies (note the plural in the
latter). To support analysis and monitoring of intended strategies (i.e. the
overall business plan or a functional strategy as described in a marketing-,
HR- or finance plan for example) a balanced scorecard (BSC) does the job. If
well done, a BSC aligns all parties concerned around a well-designed causal
model breaking down strategic priorities into critical success factors and key
performance indicators as well as a project plan, a data model and an impact
study on the existing analytics architecture. But to capture, evaluate, monitor
and measure the impact of emergent strategies is a different ball game. The business
intelligence infrastructure needs an agile approach to produce insights on the
fly. Some vendors will suggest that all can be solved with in-memory analytics.
Others suggest the silver bullet is called “Self-Service BI” (SSBI). Yet even
the most powerful hard- and software is a blunt and ineffective weapon if the
data architecture and the data quality are in shambles.
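The causal chain of a well-designed BSC, strategic priorities broken down into critical success factors and key performance indicators, can be sketched as a simple data structure. The priorities and indicators below are invented examples, not from any real scorecard:

```python
# Hypothetical balanced-scorecard breakdown: each strategic priority maps to
# critical success factors (CSFs), and each CSF to key performance indicators.
scorecard = {
    "Grow profitable niche products": {
        "Retain high-margin customers": ["churn_rate", "margin_per_customer"],
        "Price for value, not volume": ["avg_discount_pct"],
    },
    "Operational excellence": {
        "First-time-right order handling": ["order_error_rate", "rework_hours"],
    },
}

def kpis_for(scorecard: dict, priority: str) -> list:
    """All KPIs that roll up to one strategic priority."""
    return [kpi for kpis in scorecard[priority].values() for kpi in kpis]

print(kpis_for(scorecard, "Grow profitable niche products"))
# ['churn_rate', 'margin_per_customer', 'avg_discount_pct']
```

This is what alignment means in practice: every KPI in the data model can be traced back through a CSF to a strategic priority, so the intended strategy and its measurements share one structure.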
Sometimes
new tools emerge, producing solutions for niches in finance, marketing or
production management, which cause the business to urge IT to adopt these
tools. This ends with either a mega vendor acquiring the niche player or the
niche player broadening its offerings and competing head on with the
established vendors. In any case, if the IT department happens to have
standardised on the “wrong” technology partner, there will be bridges to cross
for both…
Other than issues
with new software “interfering” with IT’s priorities, most of the troubles are
found in the data architecture. The reason is simple: if not all BI projects
are backed by an enterprise wide data architecture that is connected with BI
programme management, new information stovepipes will emerge. This is quite
ironic as the initial reason for data warehousing was to avoid the analytic
stovepipes on transaction systems. So here’s my advice to the business:
Whatever business you are in,
make sure you have an enterprise view on the major information objects for your
analytic projects
Without it,
you are destined to waste money on rework, on incomplete and even false information.
What IT Stakeholders Need
IT
management has many constraints to deal with. Keeping up with business
requirements while getting the biggest bang for their buck means pooling skills,
facilities and technology components to optimise licence costs, education and
training, and hard- and software performance. The final objective is to provide
high service levels and keep customers happy at a reasonable budget.
But if “happy customer” means acquiring new, exotic software, training new
skills and insourcing expensive tech consultants from the vendor to explore new
terrain without experience or knowledge of best practices, then IT management
may get the short end of the stick.
Take the
example of data visualisation ten years back. Business had a point that the
existing vendors weren’t paying much attention to good visualisation to
produce better insights into data. Even the most common tools had problems creating
a histogram, let alone sophisticated heat maps or network diagrams. Then came
along vendors like Tableau Software selling end user desktop licenses at affordable
rates, educating the business to enjoy the benefits of visual exploration of
data. The next step in this “camel’s nose” or “puppy dog approach” is getting
the organisation to acquire the server for better management, performance and
enterprise wide benefits of the technology.
So here’s
my advice for IT Management:
If a new technology becomes
available, it will be used. Make sure it is used in a managed and governed way instead
of contributing to information chaos.
Don’t fight
business intelligence trends that have a pertinent business case, fight BI
fads only.
A Governance Decision Model for Conflicting Interests
I don’t like
dogmatic thinking in management but when it comes to governance in BI, I will
defend this dogma till the bitter end: only duopolistic governance will produce
the best results in analytics.
That a
business monopoly won’t work was clear after a consulting mission where I found
a data warehouse with no less than six (6!) time dimensions. This extreme
situation can only be explained by what I call “the waiter business analysis
model”. Without any discussion, counterarguments nor suggestions, the
analyst-waiter brings the ordered tables, cubes and reports to please the
business. If the business funds the projects solely, then accidents will
happen.
Figure 3 The BI Waiter Model: don't argue with the customer, bring him what he wants, no matter what...
But IT
monopolies also are a recipe for failure in BI. At another client’s site, the
IT department repeats over and over “x unless…” (x is a well-known BI tool
provider). As it happens, this tool provider is lagging seriously in data mining
and visualisation functionality so the business is wasting money on external
service providers who do the analytics offline. Another source of waste is business managers
installing software on their private PCs to explore new ways of analytics at
home.
In a
duopolistic governance model, decision makers from both sides have to consider
five key governance decisions. This will result in a better mutual
understanding of each other’s concerns and priorities as well as provide a
roadmap towards a managed analytical environment.
The Five Key BI Governance Decisions
(from my book Business Analysis for Business Intelligence, pages 300-301)
1. BI principles decisions:
   a. In what measure do we value data quality in the transaction systems?
   b. If we have a trade-off between security issues and potential gains from better distribution of information, which direction do we choose?
   c. Do we choose a proactive or a reactive attitude towards our BI users, i.e. do we deliver only the required information or do we make suggestions for enhancements?
2. BI architecture decisions:
   a. Do we follow the general architecture policies or is there a compelling reason to choose an alternative route?
   b. If we need alternatives, where will they be of importance: in databases, ETL tools, BI server(s), client software,…?
3. BI infrastructure decisions:
   a. What are the shared IT services the data warehouse will use?
   b. What part of the infrastructure will be organised per department or business unit?
   c. What are the access methods for the information consumers: local client PC, PDA, web based, VPN,…?
4. Business application needs:
   a. Specify the business need.
   b. Specify the urgency.
   c. Present alternative solutions.
5. Prioritisation of investments in BI:
   a. How will we evaluate the priorities?
   b. Who will handle conflicting interests?
   c. Which user profiles will be served first?
   d. Which subject areas will be tackled first?
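One way to make the duopolistic model operational (a sketch under my own assumptions, not a prescription from the book) is to treat the five decision areas as a checklist that every BI proposal must answer before it reaches prioritisation, with an owner per area:

```python
# The five governance decision areas, with an owner per area: in a duopolistic
# model, business and IT each sign off on the decisions closest to them.
GOVERNANCE_AREAS = {
    "BI principles": "joint",
    "BI architecture": "IT",
    "BI infrastructure": "IT",
    "Business application needs": "business",
    "Prioritisation of investments": "joint",
}

def unanswered(proposal: dict) -> list:
    """Areas a BI proposal has not yet addressed; an empty list means go."""
    return [area for area in GOVERNANCE_AREAS if area not in proposal]

proposal = {
    "Business application needs": "self-service churn dashboard, urgent",
    "BI architecture": "follows general policy",
}
print(unanswered(proposal))
# ['BI principles', 'BI infrastructure', 'Prioritisation of investments']
```

A proposal that cannot fill in all five areas is exactly the kind of order the analyst-waiter would otherwise carry straight to the kitchen.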
In the next post I will take a look at the next generation of information design and architecture.
Comments are welcome!
maandag 29 juni 2015
Business Intelligence Programme Management: Optimising Time to Market with a Manageable Architecture
In this
series of three posts, I will address some typical aspects of programme
management for decision support. The first post will define programme and project
management in Business Intelligence (BI) and its relationship with IT
architecture.
The second
post will deal with some issues in governance, the tensions that arise between
business and IT and how to deal with them.
The third
post will propose a new way of designing systems for both transaction and decision support to improve the
organisation’s effectiveness further.
A project holds the middle ground between routine and improvisation to
produce a product
This is the
essence of project management: managing unknown and unfamiliar risks and
uncertainties using experience with proxies and lessons learnt to produce
something within a time frame, a budget and within a certain quality range
while delivering correct management information for the steering committee
about the progress and the resources needed.
In Business
Intelligence, this product orientation sometimes leads to overly focussing on
the deliverables while ignoring valuable opportunities along the way. Many BI
projects are described in terms of delivering x reports on y KPIs or delivering
OLAP cubes and reports on x sources to describe what I call “stocks and flows”
of business processes. I am not arguing against this approach, but project
steering committees should be aware that tunnel vision can be a costly
liability in BI.
As early as 1994, Séan Kelly (not the Irish cyclist nor the politician, but the
data warehouse manager at Telecom Éireann) stated that business users can't
formulate correct requirements at the start of a BI project and change those
requirements during the project's lifecycle. Twenty years later, many BI
project teams still haven't learned from Kelly's observation. Either they
stick to the initial product description or they reiterate the analysis-design-build
cycle at extra cost and throughput time. The major reasons are a lack of
BI maturity in the organisation and the lack of an overall BI programme vision,
or at best an incomplete one. In the next
section I describe BI programme management and how it contributes to more
effective BI projects.
Projects deliver products,
programmes have outcomes
There is
certainly some truth in this dictum on one condition: BI programme management
should focus on favourable outcomes for decision making. BI programmes can
focus on improving decision-making processes, on information objects like
customer, product or channel, or on broader outcomes like knowledge sharing,
improved positioning, improved competitive capabilities, etc.
The
decision on the programme scope is not trivial. Sometimes programmes can define
too broad an objective for the organisation’s maturity. Not everyone sees the
relationship between a new column store and improved competitive positioning.
Some
authors and practitioners see a BI programme as a collection of projects, a
higher hierarchical level from tasks via projects to programmes. I strongly
disagree. In my experience, BI programmes have links with other programmes on
HR, IT infrastructure, marketing and sales, and other functional or strategic
endeavours to produce a better outcome for the BI project portfolio.
The matrix
below shows a few examples of how BI programme management interacts with other
business programmes in the organisation.
| Functional strategies and programmes | BI Programme interaction | Dependencies outside the BI programme |
| --- | --- | --- |
| Marketing: improving customer knowledge | CRM and ERP systems deliver data to the customer analytics programme | The organisational question of customer ownership needs to be addressed when multiple divisions deal with the same customer. |
| Marketing: improving direct communication response rates | A customer MDM programme is initiated to improve data quality for BI and CRM processes | The customer logging processes (e.g. in the contact centre) need improvement initiatives |
| Finance: reducing days receivable outstanding | Within the customer analytics programme, a data mining project is initiated to predict payment behaviour | The invoicing process needs updating |
| HR: reducing absenteeism | HR and ERP systems deliver data about potential influencers on absenteeism, as well as customer analytics to examine the impact of customer behaviour on absenteeism | The organisation's management style needs adjustment for the new economy |
Table 1 How BI Programmes interact with other business
programmes
From these simple examples you can clearly see that the outcome of a BI
programme affects other programmes and is in turn affected by them. With BI
projects we can also distinguish dependencies on other projects, but these
dependencies are usually smaller in scope and easier to manage in the
steering committee. You will probably recognise some of these quotes:
"I can't get the data from the HR department, it's classified information."
"They say I have to take the matter to the architecture board."
"The infrastructure needs upgrading."
"The license negotiations are slowing down the project." Etc.
Now that the link with external programmes is clear, it is high time to look at
what is inside a BI programme. This is where the link with architecture becomes
apparent.
BI
programmes should have an overarching view, vision, business case and target
setting for the principal contributors to better decision making.
In Kimball
terms: the conformed dimensions, in Linstedt terms: the data vault’s hubs and
satellites. No matter which solution approach you choose, a vision on the
principal information objects is essential to the success of individual BI
projects.
From Information Objects
to BI Architecture
Thinking
about information objects like customer,
channel, product, location, is
thinking about the way they are created, stored, updated and deleted in the
various information systems of the organisation. It also relates to the
business processes using these information objects to produce context for
transaction registrations like information
request, complaint, order, payment etc..
Thus, answering the what, where, why and how questions, as the Zachman
Framework does, is talking architecture. The illustration below shows a
classical BI situation. With the advent of complex event processing and big
data technology on Hadoop, things are changing, but for 99% of organisations
classical BI is still the modus operandi.
Figure 1 A generic architecture of contemporary
information systems: transaction systems and decision support systems are
separated systems where information objects pass from the transaction systems
to the decision support systems via an Extract Transform and Load Process.
Some comments on the above schema in the Archimate modelling language:
Business
drivers such as “end to end support for the order to pay process” define
various business processes which in turn are supported by transaction
registration systems. These systems create transaction lines like order, order
confirmation, bill of material, manufacturing data, shipping bill, delivery
note, customer receipt registration, invoice,
customer payment registration etc…
All these
transactions relate to information objects like date and time, product,
customer, shipment mode, etc..
Via the
Extract Transform and Load process these objects are scrubbed, normalised and
made ready for publishing in analytical environments and reports. That’s when
they become decisional data objects: facts and dimensions are always the end
result, no matter what intermediate storage you use: a data vault, an anchor
model or an enterprise data warehouse in the third normal form. The facts are
the measures in the reports and the dimensions are the perspectives on these
measures. Usually you read the facts per dimension, i.e. the sales per region,
per outlet, per account manager,…
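As a toy illustration of "reading the facts per dimension", here is a minimal star-schema sketch using Python's built-in sqlite3 module. All table and column names are hypothetical, invented for this example; the point is simply that the measure (sales) in the fact table is sliced by an attribute (region) of a dimension.

```python
import sqlite3

# Hypothetical star schema: one fact table with a measure,
# one conformed customer dimension carrying the "region" perspective.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, region TEXT);
CREATE TABLE fact_sales (customer_key INTEGER, sales_amount REAL);
INSERT INTO dim_customer VALUES (1, 'Aalders', 'North'), (2, 'Bos', 'South'), (3, 'Claes', 'North');
INSERT INTO fact_sales VALUES (1, 100.0), (2, 250.0), (3, 50.0);
""")

# The facts per dimension: sales summed per region.
rows = con.execute("""
    SELECT d.region, SUM(f.sales_amount)
    FROM fact_sales f
    JOIN dim_customer d ON d.customer_key = f.customer_key
    GROUP BY d.region
    ORDER BY d.region
""").fetchall()
print(rows)  # [('North', 150.0), ('South', 250.0)]
```

The same shape of query answers "sales per outlet" or "sales per account manager": only the dimension attribute in the SELECT and GROUP BY changes.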
In data mining projects, the facts and dimensions are flattened into a matrix for
offline analysis and combined with semi-structured and unstructured data in
Hadoop file systems. In streaming analytics, temporary snapshots are compared
with a scoring model derived from the facts and dimensions as well
as the semi-structured and unstructured data in those Hadoop file systems.
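A minimal sketch of that flattening step, with purely illustrative data and field names: each fact row is denormalised with the attributes of its dimension, yielding the wide matrix (sometimes called an analytical base table) that a mining or scoring algorithm consumes.

```python
# Hypothetical fact rows (payment behaviour) and a customer dimension.
fact_payments = [
    {"customer_key": 1, "invoice_amount": 500.0, "days_to_pay": 12},
    {"customer_key": 2, "invoice_amount": 1200.0, "days_to_pay": 45},
]
dim_customer = {
    1: {"segment": "SME", "region": "North"},
    2: {"segment": "Corporate", "region": "South"},
}

# Flatten: every row carries its dimension attributes alongside the measures,
# ready for offline analysis or a payment-prediction model.
matrix = [{**row, **dim_customer[row["customer_key"]]} for row in fact_payments]
print(matrix[0])
```

In practice an ETL tool or SQL join does this at scale; the principle is the same.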
With new
Big Data technologies, new architectures will emerge but that is for another
post.
Suffice it to conclude that managing enterprise-wide facts and dimensions, as
well as semi-structured and unstructured data, is the task of both programme
management and BI architecture.
BI Architecture and Programme Management: See the Picture
Programme
managers need to see the entire picture to set priorities, look for synergy
between projects and initiate data management projects to fill the gaps between
the BI projects as required by the business.
An example can make this clear.
Let’s take
the first example from table 1: “Improving Customer Knowledge” driving a
programme to consolidate all static and dynamic information about customers and
their behaviour.
Conferring
with the enterprise architecture board, the programme manager discovers that
the geographical coordinates of each customer are valuable information for the
logistics department to optimise delivery schedules. Though it is outside the
scope of marketing, the programme manager will initiate a project to add
geolocation data to the customer dimension. Later in the process, the marketing
manager discovers the potential of geomarketing.
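In its simplest form, such a geolocation project amounts to enriching the customer dimension with coordinates. A sketch under hypothetical names; a real implementation would call a geocoding service rather than a stub lookup table.

```python
# Hypothetical customer dimension rows, keyed by city for this toy example.
dim_customer = [
    {"customer_key": 1, "city": "Gent"},
    {"customer_key": 2, "city": "Antwerpen"},
]

# Stub geocoding table (a real project would query a geocoding service).
geo_lookup = {"Gent": (51.05, 3.72), "Antwerpen": (51.22, 4.40)}

# Enrich each dimension row with latitude and longitude, so both logistics
# (delivery schedules) and marketing (geomarketing) can reuse the attribute.
for row in dim_customer:
    row["lat"], row["lon"] = geo_lookup.get(row["city"], (None, None))
```

Because the coordinates live on the shared customer dimension, the attribute added for logistics becomes available to every other consumer of that dimension, which is exactly the synergy the programme manager is after.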
Of course,
the interaction can also work the other way around: the BI architecture review
board evaluates programmes and projects and readjusts priorities and project
scope on the basis of the availability and cost of data capture. Sometimes the
replacement or adjustment of a source system can impact the BI programme
heavily. It always boils down to defining a business case that sticks. Excuse
me for hammering on this point over and over, but it all starts with business
analysis for business intelligence to set the scene. Too many BI projects and
programmes have failed because of a gung-ho approach by designers and builders
who cannot wait for the specs to be thought through after thorough analysis of
the strategy process and the data landscape.