Tuesday, 7 July 2015

The Eternal Business Intelligence Conundrum

Finding an Optimum between a Manageable BI Architecture and Business Agility

This is the second post in a series of three about programme management in Business Intelligence (BI). In the previous post we positioned project and programme management in BI and the latter's relationship with BI architecture.
In this post, we discuss the universal and eternal problem, conflict, dialectic,… (call it what you want) between the business, who want a decision support solution here and now, no matter what the consequences for the IT department are, and the IT people, who want to steer the team and the infrastructure into calm waters. "Calm waters" meaning a strict, TOGAF-based architectural approach to managing the BI assets.

Head for the Cloud!

I won't describe the situations where the IT department (according to the business) wastes time with the introduction of new tools, and the business strikes a direct deal with a vendor, using the tool completely outside the managed environment.
Figure 2 The data mining tool Data Maestro is an example of a powerful cloud-based tool

Needless to say, many cloud-based solutions offer a small IT footprint: all the business needs is a browser. Well, that's what the business thinks. Issues like data quality, data governance and data security are not always handled according to corporate standards, and legislation on data privacy and data security becomes stricter and more repressive every few years.

What Business Stakeholders Need

As I pointed out in the section "Managing Strategy" (Business Analysis for Business Intelligence, p. 66-71), business stakeholders need decision support for their intended strategy as well as for emergent strategies (note the plural in the latter). To support analysis and monitoring of intended strategies (i.e. the overall business plan, or a functional strategy as described in a marketing, HR or finance plan, for example), a balanced scorecard (BSC) does the job. If well done, a BSC aligns all parties concerned around a well-designed causal model breaking down strategic priorities into critical success factors and key performance indicators, as well as a project plan, a data model and an impact study on the existing analytics architecture. But to capture, evaluate, monitor and measure the impact of emergent strategies is a different ball game. The business intelligence infrastructure needs an agile approach to produce insights on the fly. Some vendors will suggest that all can be solved with in-memory analytics. Others suggest the silver bullet is called "Self-Service BI" (SSBI). Yet even the most powerful hardware and software is a blunt and ineffective weapon if the data architecture and the data quality are in shambles.
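To make that causal breakdown concrete, here is a minimal sketch, in Python, of how a strategic priority can decompose into critical success factors and KPIs. All names and targets are invented for illustration; a real scorecard would come out of the strategy exercise described above.

```python
from dataclasses import dataclass, field

@dataclass
class KPI:
    name: str
    target: float
    unit: str

@dataclass
class CriticalSuccessFactor:
    name: str
    kpis: list = field(default_factory=list)

@dataclass
class StrategicPriority:
    name: str
    csfs: list = field(default_factory=list)

# Hypothetical example: one strategic priority, one CSF, two KPIs.
priority = StrategicPriority(
    name="Grow recurring revenue",
    csfs=[CriticalSuccessFactor(
        name="Customer retention",
        kpis=[KPI("Churn rate", target=5.0, unit="% per year"),
              KPI("Net revenue retention", target=110.0, unit="%")],
    )],
)
```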
Sometimes new tools emerge, producing solutions for niches in finance, marketing or production management, which causes the business to urge IT to adopt these tools. This ends with either a mega vendor acquiring the niche player or the niche player broadening its offerings and competing head-on with the established vendors. In any case, if the IT department happens to have standardised on the "wrong" technology partner, there will be bridges to cross for both…
Other than issues with new software "interfering" with IT's priorities, most of the trouble is found in the data architecture. The reason is simple: if BI projects are not all backed by an enterprise-wide data architecture connected with BI programme management, new information stovepipes will emerge. This is quite ironic, as the initial reason for data warehousing was to avoid the analytic stovepipes on transaction systems. So here's my advice to the business:

Whatever business you are in, make sure you have an enterprise view of the major information objects for your analytic projects.
Without it, you are destined to waste money on rework and on incomplete and even false information.

What IT Stakeholders Need

IT management has many constraints to deal with. Keeping up with business requirements while getting the biggest bang for their buck means pooling skills, facilities and technology components to optimise license costs, education and training, and hardware and software performance. The final objective is to provide high service levels and keep customers happy on a reasonable budget. But if "happy customer" means acquiring new, exotic software, training new skills and insourcing expensive tech consultants from the vendor to explore new terrain without experience or knowledge of best practices, then IT management may get the short end of the stick.
Take the example of data visualisation ten years back. Business had a point that the existing vendors weren't paying much attention to good visualisation for producing better insights from data. Even the most common tools had problems creating a histogram, let alone sophisticated heat maps or network diagrams. Then along came vendors like Tableau Software, selling end-user desktop licenses at affordable rates and educating the business to enjoy the benefits of visual data exploration. The next step in this "camel's nose" or "puppy dog" approach is getting the organisation to acquire the server for better management, performance and enterprise-wide benefits of the technology.
So here’s my advice for IT Management:

If a new technology becomes available, it will be used. Make sure it is used in a managed and governed way instead of contributing to information chaos.
Don’t fight business intelligence trends that have a pertinent business case, fight BI fads only.

A Governance Decision Model for Conflicting Interests

I don’t like dogmatic thinking in management but when it comes to governance in BI, I will defend this dogma till the bitter end: only duopolistic governance will produce the best results in analytics.

That a business monopoly won't work was clear after a consulting mission where I found a data warehouse with no less than six (6!) time dimensions. This extreme situation can only be explained by what I call "the waiter business analysis model". Without any discussion, counterargument or suggestion, the analyst-waiter brings the ordered tables, cubes and reports to please the business. If the business alone funds the projects, accidents will happen.
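A single conformed time dimension, generated once and shared by every fact table, prevents exactly this proliferation. A minimal sketch in Python/pandas; the column choices are illustrative assumptions, not a prescription:

```python
import pandas as pd

# One conformed date dimension for the whole warehouse, instead of six
# project-specific time dimensions.
dates = pd.date_range("2015-01-01", "2015-12-31", freq="D")
dim_date = pd.DataFrame({
    "date_key": dates.strftime("%Y%m%d").astype("int64"),  # surrogate key
    "date": dates,
    "year": dates.year,
    "quarter": dates.quarter,
    "month": dates.month,
    "weekday": dates.day_name(),
})
```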

Figure 3 The BI Waiter Model: don't argue with the customer, bring him what he wants, no matter what...
But IT monopolies are also a recipe for failure in BI. At another client's site, the IT department repeats over and over "x unless…" (where x is a well-known BI tool provider). As it happens, this tool provider is lagging seriously in data mining and visualisation functionality, so the business is wasting money on external service providers who do the analytics offline. Another source of waste is business managers installing software on their private PCs to explore new ways of analytics at home.
In a duopolistic governance model, decision makers from both sides have to consider five key governance decisions. This will result in a better mutual understanding of each other’s concerns and priorities as well as provide a roadmap towards a managed analytical environment.

The Five Key BI Governance Decisions

(from my book Business Analysis for Business Intelligence, pages 300-301; a sketch of a decision log for tracking these decisions follows the list)

1. BI Principles decisions:
   a. To what degree do we value data quality in the transaction systems?
   b. If we have a trade-off between security issues and potential gains from better distribution of information, which direction do we choose?
   c. Do we choose a proactive or a reactive attitude towards our BI users, i.e. do we deliver only the required information or do we also make suggestions for enhancements?
2. BI Architecture decisions:
   a. Do we follow the general architecture policies, or is there a compelling reason to choose an alternative route?
   b. If we need alternatives, where will they be of importance: in databases, ETL tools, BI server(s), client software,…?
3. BI Infrastructure decisions:
   a. What are the shared IT services the data warehouse will use?
   b. What part of the infrastructure will be organised per department or business unit?
   c. What are the access methods for the information consumers: local client PC, PDA, web based, VPN,…?
4. Business Application needs:
   a. Specify the business need
   b. Specify the urgency
   c. Present alternative solutions
5. Prioritisation of investments in BI:
   a. How will we evaluate the priorities?
   b. Who will handle conflicting interests?
   c. Which user profiles will be served first?
   d. Which subject areas will be tackled first?
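One way to make these five decision areas operational is a simple decision log that records, for every decision, which side owns it and the agreed rationale. A hypothetical sketch:

```python
from dataclasses import dataclass
from enum import Enum

class Owner(Enum):
    BUSINESS = "business"
    IT = "IT"
    JOINT = "joint"  # duopolistic governance: both sides sign off

@dataclass
class GovernanceDecision:
    area: str        # one of the five areas above
    question: str
    owner: Owner
    rationale: str

decision_log = [
    GovernanceDecision(
        area="BI principles",
        question="Proactive or reactive attitude towards our BI users?",
        owner=Owner.JOINT,
        rationale="Deliver required information; review enhancement "
                  "suggestions quarterly.",
    ),
]
```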

Figure 4 More on BI Governance in this book, available in all major bookstores

In the next post I will have a look into the next generation of information design and architecture. Comments are welcome!

Monday, 29 June 2015

Business Intelligence Programme Management: Optimising Time to Market with a Manageable Architecture

In this series of three posts, I will address some typical aspects of programme management for decision support. The first post will define programme and project management in Business Intelligence (BI) and its relationship with IT architecture.
The second post will deal with some issues in governance, the tensions that arise between business and IT and how to deal with them.

The third post will propose a new way of designing systems for both transaction and decision support, to improve the organisation's effectiveness further.

A project holds the middle ground between routine and improvisation to produce a product

This is the essence of project management: managing unknown and unfamiliar risks and uncertainties, using experience with proxies and lessons learnt, to produce something within a time frame, a budget and a certain quality range, while delivering correct management information to the steering committee about progress and the resources needed.
In Business Intelligence, this product orientation sometimes leads to focusing overly on the deliverables while ignoring valuable opportunities along the way. Many BI projects are described in terms of delivering x reports on y KPIs, or delivering OLAP cubes and reports on x sources to describe what I call the "stocks and flows" of business processes. I am not arguing against this approach, but project steering committees should be aware that tunnel vision can be a costly liability in BI.
Already in 1994, Séan Kelly (not the Irish cyclist nor the politician, but the data warehouse manager from Eireann Telecom) stated that the business user can't formulate correct requirements at the start of a BI project and changes these requirements during the project's lifecycle. Twenty years later, many BI project teams haven't learned anything from Kelly's observation. Either they stick to the initial product description or they reiterate the analysis-design-build cycle at extra cost and throughput time. The major reasons are a lack of BI maturity in the organisation and the lack of an overall BI programme vision, or at best an incomplete one. In the next section I describe BI programme management and how it contributes to more effective BI projects.

Projects deliver products, programmes have outcomes

There is certainly some truth in this dictum, on one condition: BI programme management should focus on favourable outcomes for decision making. BI programmes can focus on improving decision-making processes, on information objects like customer, product or channel, or on broader outcomes like knowledge sharing, improved positioning, improved competitive capabilities, etc.
The decision on the programme scope is not trivial. Sometimes programmes can define too broad an objective for the organisation’s maturity. Not everyone sees the relationship between a new column store and improved competitive positioning.
Some authors and practitioners see a BI programme as a collection of projects, a higher hierarchical level running from tasks via projects to programmes. I strongly disagree. In my experience, BI programmes have links with other programmes on HR, IT infrastructure, marketing and sales, and other functional or strategic endeavours, to produce a better outcome for the BI project portfolio.
The matrix below shows a few examples of how BI programme management interacts with other business programmes in the organisation.

| Functional strategies and programmes | BI programme interaction | Dependencies outside the BI programme |
|---|---|---|
| Marketing: improving customer knowledge | CRM and ERP systems deliver data to the customer analytics programme | The organisational question of customer ownership needs to be addressed when multiple divisions deal with the same customer |
| Marketing: improving direct communication response rates | A customer MDM programme is initiated to improve data quality for BI and CRM processes | The customer logging processes (e.g. in the contact centre) need improvement initiatives |
| Finance: reducing days receivables outstanding | Within the customer analytics programme, a data mining project is initiated to predict payment behaviour | The invoicing process needs updating |
| HR: reducing absenteeism | HR and ERP systems deliver data about potential influencers of absenteeism, as well as customer analytics to examine the impact of customer behaviour on absenteeism | The organisation's management style needs adjustment for the new economy |
Table 1 How BI Programmes interact with other business programmes

From these simple examples you can see clearly that the outcome of a BI programme affects other programmes and is in turn affected by them. In the case of BI projects we can also distinguish dependencies on other projects, but these dependencies are usually smaller in scope and easier to manage in the steering committee. You will probably recognise some of these quotes:
  • "I can't get the data from the HR department, it's classified information."
  • "They say I have to take the matter to the architecture board."
  • "The infrastructure needs upgrading."
  • "The license negotiations are slowing down the project." Etc…
Now that the link with external programmes is clear, it is high time to look at what is inside a BI programme. This is where the link with architecture becomes clear.
BI programmes should have an overarching view, vision, business case and target setting for the principal contributors to better decision making.
In Kimball terms: the conformed dimensions; in Linstedt terms: the data vault's hubs and satellites. No matter which solution approach you choose, a vision on the principal information objects is essential to the success of individual BI projects.
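For readers less familiar with Linstedt's vocabulary, here is a minimal sketch of a customer hub with one satellite. The table and column names are illustrative only, not a complete Data Vault model:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Hub: the business key of the information object, and nothing else.
conn.execute("""
    CREATE TABLE hub_customer (
        customer_hk   TEXT PRIMARY KEY,   -- hash of the business key
        customer_bk   TEXT NOT NULL,      -- business key, e.g. customer number
        load_dts      TEXT NOT NULL,      -- load timestamp
        record_source TEXT NOT NULL
    )""")
# Satellite: descriptive context, historised by load timestamp.
conn.execute("""
    CREATE TABLE sat_customer_details (
        customer_hk   TEXT NOT NULL REFERENCES hub_customer (customer_hk),
        load_dts      TEXT NOT NULL,
        name          TEXT,
        segment       TEXT,
        PRIMARY KEY (customer_hk, load_dts)
    )""")
```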

From Information Objects to BI Architecture

Thinking about information objects like customer, channel, product and location is thinking about the way they are created, stored, updated and deleted in the various information systems of the organisation. It also relates to the business processes using these information objects to produce context for transaction registrations like information request, complaint, order, payment, etc.
Thus, answering the what, where, why and how questions, as the Zachman Framework does, is talking architecture. The illustration below shows a classical BI situation. With the advent of complex event processing and big data technology on Hadoop, things are changing, but for 99% of organisations classical BI is still the modus operandi.
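A classic analysis artifact that answers those create/read/update/delete questions compactly is a CRUD matrix mapping business processes to information objects. A hypothetical fragment:

```python
# Hypothetical CRUD matrix: which business process Creates, Reads,
# Updates or Deletes which information object.
crud = {
    ("Order entry", "Customer"): "CRU",
    ("Order entry", "Order"):    "C",
    ("Invoicing",   "Order"):    "R",
    ("Invoicing",   "Invoice"):  "C",
}
for (process, info_object), operations in crud.items():
    print(f"{process:12} -> {info_object:10}: {operations}")
```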




Figure 1 A generic architecture of contemporary information systems: transaction systems and decision support systems are separate systems, where information objects pass from the transaction systems to the decision support systems via an Extract, Transform and Load process.

Some comments on the above schema, modelled in the ArchiMate modelling language:
Business drivers such as "end-to-end support for the order-to-pay process" define various business processes, which in turn are supported by transaction registration systems. These systems create transaction lines like order, order confirmation, bill of material, manufacturing data, shipping bill, delivery note, customer receipt registration, invoice, customer payment registration, etc.
All these transactions relate to information objects like date and time, product, customer, shipment mode, etc.
Via the Extract, Transform and Load process these objects are scrubbed, normalised and made ready for publishing in analytical environments and reports. That's when they become decisional data objects: facts and dimensions are always the end result, no matter what intermediate storage you use: a data vault, an anchor model or an enterprise data warehouse in third normal form. The facts are the measures in the reports and the dimensions are the perspectives on these measures. Usually you read the facts per dimension, i.e. the sales per region, per outlet, per account manager,…
In data mining projects the facts and dimensions are flattened into a matrix for offline analysis and combined with semi-structured and unstructured data in Hadoop file systems. In streaming analytics, temporary snapshots are compared with the scoring model, which is derived from the facts and dimensions as well as semi-structured and unstructured data in Hadoop file systems.
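As a toy illustration of that flattening step, the sketch below joins a fact table with two dimensions into one analysis matrix; all tables and figures are invented for the example:

```python
import pandas as pd

# Star schema fragments (invented data): one fact table, two dimensions.
fact_sales = pd.DataFrame({"customer_key": [1, 2, 1],
                           "date_key": [20150601, 20150601, 20150602],
                           "amount": [120.0, 80.0, 200.0]})
dim_customer = pd.DataFrame({"customer_key": [1, 2],
                             "region": ["North", "South"]})
dim_date = pd.DataFrame({"date_key": [20150601, 20150602],
                         "month": [6, 6]})

# Flatten facts and dimensions into one wide matrix for offline mining.
matrix = (fact_sales
          .merge(dim_customer, on="customer_key")
          .merge(dim_date, on="date_key"))
print(matrix)
```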
With new Big Data technologies, new architectures will emerge but that is for another post.
Suffice it to conclude that managing enterprise-wide facts and dimensions, as well as semi-structured and unstructured data, is the task of both programme management and BI architecture.

BI Architecture and Programme Management: See the Picture

Programme managers need to see the entire picture to set priorities, look for synergy between projects and initiate data management projects to fill the gaps between the BI projects as required by the business.  An example can make this clear.
Let’s take the first example from table 1: “Improving Customer Knowledge” driving a programme to consolidate all static and dynamic information about customers and their behaviour.
Conferring with the enterprise architecture board, the programme manager discovers that the geographical coordinates of each customer are valuable information for the logistics department to optimise delivery schedules. Though it is outside the scope of marketing, the programme manager will initiate a project to add geolocation data to the customer dimension. Later in the process, the marketing manager discovers the potential of geomarketing.
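In miniature, such an enrichment project could look like the sketch below. The geocode helper is a hypothetical stand-in for whatever geocoding service the organisation licenses:

```python
import math
import pandas as pd

def geocode(address):
    """Hypothetical stand-in for a licensed geocoding service."""
    lookup = {"Main St 1, Ghent": (51.05, 3.72)}  # stub data
    return lookup.get(address, (math.nan, math.nan))

dim_customer = pd.DataFrame({"customer_key": [1],
                             "address": ["Main St 1, Ghent"]})
# Enrich the customer dimension: marketing gets geomarketing,
# logistics gets delivery-schedule optimisation, from the same columns.
coords = dim_customer["address"].apply(geocode)
dim_customer["latitude"] = coords.str[0]
dim_customer["longitude"] = coords.str[1]
```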
Of course, the interaction can also work the other way around: the BI architecture review board evaluates programmes and projects and readjusts priorities and project scope on the basis of the availability and cost of data capture. Sometimes the replacement or adjustment of a source system can impact the BI programme heavily. It always boils down to defining a business case that sticks. Excuse me for hitting the same nail over and over, but it all starts with business analysis for business intelligence, which sets the scene. Too many BI projects and programmes have failed because of a gung-ho approach by designers and builders who cannot wait for the specs to be thought through after thorough analysis of the strategy process and the data landscape.

Friday, 28 November 2014

The Stockholm Papers on Self-Service BI

I had the pleasure of moderating a peculiar kind of brainstorm session in Stockholm called speed geeking, a process which will remain unrevealed to those who weren't present. Never mind the "how". Let's talk about the "what". The "what" is a set of interesting replies to the three theme questions on Self-Service Business Intelligence (SSBI).

The three theme questions discussed were:

  • Why do you use SSBI?
  • What are the major problems encountered?
  • Will IT become obsolete? (a more challenging version of "How will SSBI affect IT's role?")

Why organisations use SSBI

The problems SSBI can solve are low BI service levels, eliciting better requirements for the data warehouse (as users will see the gaps in the available data) and providing a workaround for slow DWH/BI development tracks. But the majority of answers went in the direction of opening up new opportunities.
The number one reason for SSBI is time to market: support faster decision making, tap the organisation's creativity better, enhance flexibility, innovation, exploration,… it's all there. Some teams came up with deep thoughts about organisational development: "distributed fact-based decision making" was a very sharp one, as was "getting the right information to the right person at the right moment", although both motivations will need to be managed carefully. SSBI is not a matter of simply opening up the data warehouse (or other data lakes) to everybody: the paradox is that the more users are empowered, the more governance and data management are needed.

Conclusion: as always, two approaches to this question emerge: either it solves a problem or it creates an opportunity. My advice is to look for opportunities if you want a concept, a technology or a new business process to last. Problem solvers will limit the new technology to the problem perspective, which is a form of typecasting, whereas opportunity seekers will keep exploring the new possibilities of a technology.

What are the major problems encountered?

SSBI is a walk in the park for neither IT nor the business users.
Data quality management, as well as the related management of semantics and the governance of master data, are the principal bumps in the road. Lack of training is also high on the radar, as are performance, security and integrity. So far, no real surprises. But strangely enough, an issue like "usability" appeared. You'd think usability would be the main reason for developing an SSBI platform, but apparently it is also the main show stopper.

Conclusion: in this mixed audience of IT and business professionals I noticed few defensive strategies. "Yes, there are problems, but they can be solved" was the general mood I felt in the room. Maybe this is one of the reasons why Sweden is one of the most innovative societies in the world?

How will SSBI affect IT's role?

There was a general consensus between the IT and the business professionals: IT will evolve into a new role when SSBI is introduced. They will develop a new ecosystem, optimise the infrastructure for SSBI and act as an enabler to advance the use of SSBI.
Other interesting suggestions pointed towards new IT profiles emerging in this ecosystem, like data scientists, integrators of quality data, and managers of business logic from both internal and external systems. In short, the borders between IT and the business users will become vaguer over time. But one remark was a bit less hopeful: one group concluded that the CIO is still far away from the business perspective.

Maybe that's because many CIOs come from the technology curriculum, and there are still organisations out there that do not consider ICT a strategic asset. Every day I count myself lucky that I worked in a mail order company early in my career. The strategic role of ICT was never questioned there, and it was no surprise that the CIO of my company became the CEO, as customer data, service level data, financial data and HRM data were considered the core assets.

Sunday, 19 October 2014

Defining Business Analysis for Big Data


Introduction to an enhanced methodology in business analysis

Automating the Value Chain


In the beginning of the Information Era, there was business analysis for application development. Waterfall methods, Rapid Application Development, Agile methods,… all were based on delivering a functioning piece of information technology that supports a well-defined business process. There are clear signs of an evolution in the application development area.
Core operations like manufacturing and logistics came first with the automation of human tasks, and the IT department was called the "EDP department". Some readers will need to look up that abbreviation. I can spare them the time: Electronic Data Processing indicated clearly that the main challenge was managing the data from these primary processes.
Information as business process support becomes an enabler of (new) business processes

This schema gives a few hints on the progress made in the automation of business processes: the core operations came first: finance, logistics and manufacturing, which evolved into Enterprise Resource Planning (ERP). Later, sales, marketing and after-sales service evolved into Customer Relationship Management (CRM), which was later extended into Enterprise Relationship Management (ERM), incorporating employee relationship management and partner relationship management. Finally, ERP and ERM merged into massive systems claiming to be the source of all data. The increase in productivity and processing power of the infrastructure enabled an information layer that binds all these business processes and interacts with the outside world via standardised protocols (EDI, web services based on SOAP or REST).
The common denominator of these developments: crisp business analysis enabling accurate system designs was needed to meet the business needs.

The "Information is the New Oil Era"

Already in the mid-nineties, Mike Saylor, the visionary founder and CEO of MicroStrategy, stated that information is the new oil. Twenty years later, Peter Sondergaard from Gartner repeated his dictum and added "and analytics is the combustion engine". A whole new discipline, announced since the 1950s, emerged: Business Intelligence (BI): connecting all available relevant data sources to come up with meaningful information and insights to improve corporate performance dramatically.
The metaphor remains powerful in its simplicity: drill for information in the data and fuel your organization’s growth with better decision making.
Yet, the consequences of this new discipline for the business analysis practice remained unnoticed by most business analysts, project managers and project sponsors. The majority were still using the methods from the application development era. And I admit that in the late nineties I also used concepts from waterfall project management and approached the products of a BI development track as an application where requirements gathering would do the trick. But it soon became clear to me that asking for requirements from a person who has only an embryonic idea of what he wants is not the optimal way. The client changes requirements in 90% of cases after seeing the results of his initial requirements. That's when I started collecting data and empirical evidence on which business analysis approach leads to success. So when I published my book "Business Analysis for Business Intelligence" in October 2012, I was convinced everybody would agree this new approach is what we need to develop successful BI projects. The International Institute of Business Analysis' (IIBA) Body of Knowledge has increased its attention to BI, but the mainstream community is still unaware of the consequences for their practice. And now I want to discuss a new layer of paradigms, methods, tricks and tips on top of this one? Why risk leaving even more readers and customers behind? I guess I need to take Luther's pose at the Diet of Worms in 1521: "Here I stand, I can do no other." So call me a heretic, see if I care.
Luther at the Diet of Worms in 1521

The new, enhanced approach to business analysis for business intelligence deals, in a nutshell, with bridging three gaps. The first gap is the one between the strategy process and the information needed to develop, monitor and adjust the intended strategic options.
The second gap is the mismatch between the needed and the available information, and the third gap is between the available information and the way data are registered, stored and maintained in the organization.
Now, with the advent of Big Data, new challenges impose themselves on our business analysis practice.

Business Analysis for Big Data: the New Challenges

But before I discuss a few challenges, let's refer to my definition of Big Data as described in the article "What is really Big About Big Data". In short: volume, variety and velocity are relative to technological developments. In the eighties, 20 megabytes was Big Data, and today 100 terabytes isn't a shocker. Variety has always been around, and velocity is also relative to processing, I/O and storage speeds, which have evolved. No, the real discriminating factor is volatility: answering the pressing question of what data you need to consider persistent, both on a semantic and on a physical storage level. The clue is partly to be found in the practice of data mining itself: a model evolves dynamically over time, due to new data with better added value and/or because of a decay in the value of the existing data collection strategy.
Ninety percent of "classic" Business Intelligence is about "what we know we need to know". With the advent of Big Data, the shift towards "what we don't know we need to know" will increase. I can imagine that in the long run the majority of value creation will come from this part.
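A minimal sketch of watching for that decay, assuming each new batch of labelled data can be scored against the deployed model; the metric and the threshold are illustrative assumptions:

```python
def model_has_decayed(baseline_accuracy, recent_accuracies, tolerance=0.05):
    """Flag the model for retraining when recent performance drops more
    than `tolerance` below the accuracy measured at deployment time."""
    recent = sum(recent_accuracies) / len(recent_accuracies)
    return recent < baseline_accuracy - tolerance

# Hypothetical values: accuracy at deployment vs. the last four batches.
print(model_has_decayed(0.82, [0.80, 0.78, 0.75, 0.73]))  # True: retrain
```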
From “What we know we need to know” to
“What we don’t know we need to know”
is the major challenge in Business Analysis for Big Data
Another challenge is managing scalability. Your business analysis may come up with a nice case for tapping certain data streams which deliver promising results within a small scope, but if the investment can't be depreciated on a broader base, you are dead in your tracks. That's why the innovation adage "fail early and fail cheap" should lead all your analytical endeavors in the Big Data sphere. Some of you may say: "If you expect to fail, why invest in this Big Data thing?" The simple answer is: "Because you can't afford not to invest and miss out on opportunities." With any groundbreaking technology at the beginning of its life cycle, the gambling factor is large, but the winnings are also high. As the technology matures, both the winning chances and the prize money diminish. Failing early and cheap is more difficult than it sounds. This is where a good analytical strategy, defined in a business analysis process, can mitigate the risk of failing in an expensive way.
Business Analysis for Big Data is about finding scalable analytical solutions, early and cheap.
So make sure you can work in an agile way as I have described in my article on BA4BI and deliver value in two to three weeks of development. Big Data needs small increments.
Data sources pose the next challenge. Since they are mostly delivered via external providers, you don't control the format, the terms and conditions of use, ... In short, it is hard if not impossible to come to an SLA between you and the data provider. The next challenge related to the data is getting your priorities right. Is user-generated content like reviews on Yelp or posts in Disqus more relevant than blog posts or tweets? What about the other side of the Big Data coin, like Open Data sources, process data or IoT data? And to finish it off: nothing is easier than copying, duplicating or reproducing data, which can be a source of bias.
Data generates data and may degenerate the analytics
Some activist groups get an unrealistic level of attention, and most social media use algorithms to publish selected posts to their audience. This filtering causes spikes in occurrences, and this in turn may compromise the analytics. And of course, the opposite is also true: finding the dark number, i.e. the things people talk about without being prominent on the Web, may need massive amounts of data and longitudinal studies before you notice a pattern in the data. Like a fire brigade, you need to find the peat-moor fire before the firestorm starts.
The architectural challenge is also one to take into account. Because of the massive amounts of data and their volatility, which cannot always be foreseen, the architectural challenges are bigger than in "regular" Business Intelligence.
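One simple guard against such algorithm-induced spikes is to compare daily mention counts with a rolling baseline before they enter the analysis. A sketch with invented numbers:

```python
import pandas as pd

mentions = pd.Series([100, 95, 110, 105, 480, 102, 98])  # invented daily counts
baseline = mentions.rolling(window=3, min_periods=1).median()
# Flag days exceeding three times the rolling median as suspect spikes,
# to be inspected (or down-weighted) before the sentiment analysis runs.
suspect = mentions > 3 * baseline
print(mentions[suspect])  # day 4: 480 mentions
```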
Data volatility drives architectural decisions
There are quite a few processing decisions to make, and their architectural consequences greatly impact the budget and the strategic responsiveness of the organization. In a following article I will go into more detail, but for now this picture of a simplified Big Data processing scheme gives you a clue.

Big Data Architecture Options


Enabling Business Analysis for Big Data

We are at the beginning of a new analytical technology cycle and therefore, classical innovation management advice is to be heeded.
You need to have a business sponsor with sufficient clout, supporting the evangelization efforts and experiments with the new technologies and data sources.
Allow for failures, but make sure they are not fatal failures: "fail fast and cheap". Reward the people who stick out their necks and commit themselves to new use cases. Make sure these use cases connect with the business needs; if they don't, forward them to your local university. They might like to do fundamental research.
If the experiments show some value and can be considered a proof of concept, your organization can learn and develop further in this direction.
The next phase is about integration:

  • integrate Big Data analytics in the BI portfolio
  • integrate Big Data analytics in the BI architecture
  • integrate Big Data analytical competences in your BI team
  • integrate it with the strategy process
  • integrate it in the organizational culture
  • deal with ethical and privacy issues
  • link the Big Data analytical practice with existing performance management systems.


And on a personal note, please, please be aware that the business analysis effort for Big Data analytics is not business as usual.

What is the Added Value of Business Analysis for Big Data?

This is a pertinent question formulated by one of the reviewers of this article. “It depends” is the best possible answer.
The Efficiency Mode
It depends on the basic strategic drive of the organization. If management is in efficiency mode, they will skip the analysis part and start experimenting as quickly as possible. On the upside, this can save time and deliver spontaneous insights. But the downside is that this non-directed trial-and-error approach can provoke undesirable side effects. What if the trials aren't "deep" and "wide" enough and the experiment is killed too early? By "deep" I mean the sample size and the time frame of the captured data, and by "wide" the number of attributes and the number of investigated links with corporate performance measures.
The Strategy Management Mode
If management is actively devising new strategies, looking for opportunities and new ways of doing business rather than only looking for cost cutting, then Business Analysis for Big Data can deliver true value.
It will help you detect leading indicators of potential changes in market trends, consumer behavior, production deficiencies, lags and gaps in communication and advertising positioning, fraud and crime prevention, etc.
Today, the Big Data era is like 1492 in Sevilla, when Columbus went to look for an alternative route to India. He got far beyond the known borders of the world and didn't quite reach India, but he certainly changed many paradigms and assumptions about the then known world. And isn't that the essence of what leaders do?

Monday, 26 May 2014

Elections’ Epilogue: What Have We Learned?

First the good news: a MAD of 1.41 Gets the Bronze Medal of All Polls!

The results from the Flemish Parliament elections with all votes counted are:

| Party | Result (source: Het Nieuwsblad) | SAM's forecast |
|---|---|---|
| Christian democrats (CD&V) | 20,48 % | 18,70 % |
| Green (Groen) | 8,70 % | 8,75 % |
| Flemish nationalists (N-VA) | 31,88 % | 30,32 % |
| Liberal democrats (Open VLD) | 14,15 % | 13,70 % |
| Social democrats (sp.a) | 13,99 % | 13,27 % |
| Nationalist anti-Islam party (Vlaams Belang, VB) | 5,92 % | 9,80 % |

Table 1. Results for the Flemish Parliament compared to our forecast

And below is the comparative table of all polls against this result, with the Mean Absolute Deviation (MAD), which expresses the level of variability in the forecasts. A MAD of zero means a perfect prediction. In this case, with the highest score at almost 32 % and the lowest at almost 6 % over only six observations, anything under 1.5 is quite alright.
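The MAD is simply the average of the absolute forecast errors over the six parties; recomputing it from Table 1 (decimal commas converted to points) reproduces the 1.41:

```python
actual   = [20.48, 8.70, 31.88, 14.15, 13.99, 5.92]  # official results (%)
forecast = [18.70, 8.75, 30.32, 13.70, 13.27, 9.80]  # SAM's forecast (%)

mad = sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)
print(round(mad, 2))  # 1.41
```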

Table 2. Comparison of all opinion polls for the Flemish Parliament and our prediction based on Twitter analytics by SAM.

Compared to 16 other opinion polls published by various national media, our little SAM (Social Analytics and Monitoring) did quite alright on a shoestring budget: in only 5.7 man-days we came up with a result competing with mega-concerns in market research.
The Mean Absolute Deviation covers up one serious flaw in our forecast: the giant shift of voters from VB (the nationalist anti-Islam party) to N-VA (the Flemish nationalist party). This led to an underestimation of the N-VA result and an overestimation of the VB result. Although the model estimated the correct direction of the shift, it underestimated its proportion.
If we had used more data, we might have caught that shift and ended even higher!

Conclusion

Social Media Analytics is a step beyond the social media reporting that most tools do nowadays. With our little SAM, built on the Data2Action platform, we have sufficiently proven that forecasting on the basis of correct judgment of sentiment in even a single source like Twitter can produce relevant results in marketing, sales, operations and finance. Compared to politics, these disciplines deliver far more predictable data, as they can combine external sources like social media with customer, production, logistics and financial data. And the social media actors and opinion leaders certainly produce less bias in these areas than is the case with political statements. All this can be done on a continuous basis, supporting day-to-day management in communication, supply chain, sales, etc.
If you want to know more about Data2Action, the platform that made this possible, drop me a line: contact@linguafrancaconsulting.eu 

Get ready for fact-based decision making
on all levels of your organisation