Thursday, April 11, 2019

Is it time to reboot your Analytics Program? Try thinking like a Product Manager


It’s been a long time between posts, but I want to share a common theme from my recent consulting experience. I have spent much of my time working with organizations that either had no serious analytics capability at all or were getting little or no value from what they had. These engagements, mostly with smaller firms and non-profits, reminded me of my enterprise clients from a decade ago. They felt they had some level of business intelligence and analytics, but it was not working well for some or all of the following reasons:
  • They had plenty of static reports being generated, but most were largely ignored as obsolete
  • Presentation was almost entirely spreadsheet-based, with embedded logic compensating for the weaknesses in the source reports
  • There was no effective data governance effort, let alone a formal one with an accountable person running it. This created doubt about the reliability of the reporting
  • Some dashboards were being generated, but they existed mostly for the sake of having them and were failing for the same reasons the reports were
  • Dashboards and visualizations were being presented without the benefit of context or any level of storytelling
  • The reports and dashboards were undocumented, and the original authors long gone; or if they remained, they were a single point of failure
  • There would be a few scattered desktop purchases of visualization tools like Tableau and Qlik, but no real plan for fully adopting the technology
  • Those tools were being used to generate visualizations that were long on style and short on value
  • Data was not timely or updated automatically
  • Mobile delivery, if it existed, did not translate well to small screens with limited bandwidth

There was no plan in place to climb the BI capability ladder, so they decided to call for help.

The most common root cause of a failed BI program is the Big Lie the person who sold the tools often tells: that buying state-of-the-art BI tech will, by itself, solve a business problem out of the box.

Successful BI executives know that BI implementations are software products, not just tools or implementation programs. This implies that they must think like a product manager. Product managers know that great tech is necessary but not sufficient to create a successful product. Great products need marketing, production control, quality control, user support, documentation, and an achievable rollout, maintenance and enhancement plan.

All of these apply to BI programs that seek to create a great user experience defined by data quality, system reliability, accessibility, ease of use, and support when it is needed. The goal is to provide your users at all levels of the organization with a decision support capability. This requires people and process assets to go along with the technology.

One note about people: experienced BI pros are often scarce and overpriced. Try to find your data analysts and system admins from within. They already understand your business and provide context, and they are usually easy to train up on modern tech, which has become very usable for those with little or no coding background.

Process is where outside help can be most valuable. Experienced consultants can expedite the establishment of key IT processes that are vital to success, including:
  • Data acquisition across multiple source systems
  • Data governance and quality control (see the sketch after this list)
  • Strategic and tactical alignment: Fitting BI products to business decisions, target audiences, and use cases
  • Self-service capability and support
  • Prioritizing maintenance and enhancement requests as part of release management
  • Putting the necessary security in place
  • Marketing your BI products (this one is critical)
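
For the data governance and quality control item in particular, the automation does not have to be elaborate. Below is a minimal sketch in Python, assuming a pandas DataFrame loaded from a hypothetical daily sales extract; the file name, column names, and thresholds are placeholders to adapt to your own sources:

```python
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> list[str]:
    """Return a list of data quality issues found in a daily sales extract."""
    issues = []

    # Completeness: key fields should never be null
    for col in ("order_id", "order_date", "amount"):
        nulls = int(df[col].isna().sum())
        if nulls:
            issues.append(f"{col}: {nulls} missing values")

    # Uniqueness: order_id should behave like a primary key
    dupes = int(df["order_id"].duplicated().sum())
    if dupes:
        issues.append(f"order_id: {dupes} duplicate rows")

    # Validity: amounts should be positive and within a sane range
    bad = int(((df["amount"] <= 0) | (df["amount"] > 1_000_000)).sum())
    if bad:
        issues.append(f"amount: {bad} out-of-range values")

    # Freshness: the extract should include data through yesterday
    yesterday = pd.Timestamp.today().normalize() - pd.Timedelta(days=1)
    if df["order_date"].max() < yesterday:
        issues.append("extract appears stale: no rows for yesterday")

    return issues

if __name__ == "__main__":
    extract = pd.read_csv("daily_sales_extract.csv", parse_dates=["order_date"])
    for issue in run_quality_checks(extract):
        print("DATA QUALITY:", issue)
```

Even a handful of checks like these, run on every load and routed to an accountable owner, go a long way toward restoring trust in the reporting.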

If your organization already has a product management function, seek their advice. They know how to do this in the context of your business and can help you apply their best practices to your BI program.

Saturday, May 19, 2018

Is your digital presence failing? It’s probably not a technology problem


A few years ago, I wrote a post lamenting the fact that analytics failures are often blamed on technology when, in fact, the technology is not the problem and changing it solves nothing.

One of the main causes of failure I noted there was faulty data resulting not from a technology issue but from poor governance that rendered the analytics untrustworthy, and thus nearly useless. In other words, it’s a content problem, not a packaging problem.

As a follow-up to that post, I want to discuss an analogous issue that I see in digital marketing.

Lately, I have been implementing analytics on the digital properties of organizations that do not have much experience with digital marketing. These clients launch their websites and social presence with high expectations that they will generate interest and high levels of engagement with their intended audiences.

Often, I can report that they do… at first.

And then it all falls off. Sometimes very quickly. Traffic starts to wither and the visitors that do come no longer engage as well.

Storytelling is no fun when the data tells a sad story.

The reactions to this tend to follow a pattern: the technology must be to blame. First, they question the accuracy of the data. Once I assure them nothing has changed and the analytics are working properly, attention turns to the website design and the branding elements on the social platforms. What often follows is some modest experimentation with design changes that yields little or no improvement.

Soon, there is talk of an entire website redesign. It’s around this time that I suggest that maybe the technology platforms are not the problem. Perhaps the content that worked well at launch has become stale and needs regular updates to remain relevant and compelling.

Once that reality sets in, the idea of experimenting with fresh content in the form of messages, photos, video, and so on starts to look better than a time-consuming, and likely expensive, site redesign.

At this point, the problem becomes that the staffing plan does not include people dedicated to developing new content on an ongoing basis. This, in turn, creates resistance. A one-time design change is easier to pay for.

The only way I know to counter this flawed thinking is with a focused effort, even if it is temporary, to release new content and monitor the impact.

Results will vary, but fresh content nearly always results in a positive spike in all the important metrics. This improvement, however, can only be sustained with an ongoing effort and dedicated resources, along with an incremental design optimization program to maximize the impact of your content.
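
Measuring that impact does not require anything elaborate. Here is a rough sketch, assuming a daily metrics export from your analytics tool to CSV; the file name, column names, and dates are hypothetical:

```python
import pandas as pd

# Hypothetical daily export: date, sessions, avg_engagement_time, conversions
metrics = pd.read_csv("daily_site_metrics.csv", parse_dates=["date"])

refresh_date = pd.Timestamp("2018-04-15")   # day the fresh content went live
window = pd.Timedelta(days=28)              # compare four weeks before and after

before = metrics[(metrics["date"] >= refresh_date - window) & (metrics["date"] < refresh_date)]
after = metrics[(metrics["date"] >= refresh_date) & (metrics["date"] < refresh_date + window)]

for col in ("sessions", "avg_engagement_time", "conversions"):
    pre, post = before[col].mean(), after[col].mean()
    print(f"{col}: {pre:.1f} -> {post:.1f} ({(post - pre) / pre:+.1%})")
```

A simple before-and-after comparison like this, shown alongside the publishing calendar, is usually enough to make the case for dedicating resources to content.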

That is, people and process will matter more than technology in the long run.

Friday, May 4, 2018

Analytics Fail? It's Probably Not a Technology Problem

This post was originally published on LinkedIn - re-posting here as it generated quite a few responses.


There is a dirty little secret that those who sell analytics technology won’t tell you. More often than not, the data that initial applications yield just sits on the servers and is never used effectively, if it is used at all.

This probably will not come as a surprise to many of you reading this. If you are an experienced practitioner, you have likely seen it firsthand. So why are the expectations so artfully set in the technology sales cycle, with beautifully polished demos and proofs of concept, not fulfilled initially? Note: I say “initially” because expensive re-implementation efforts tend to follow, and those tend to be more successful.

So what happens when you roll out your brand new analytics technology (hopefully on time and budget ;-) to great fanfare and expectation, and you find out a month later that it is getting very little user traction?  Of course, you do a survey to discover what is wrong and you get responses like these:
  • I can’t find what I need; too much noise in the data
  • The tool is too hard to use
  • It does not tell me the right things
  • I don’t trust the data; it does not match what I had before
  • The data is inconsistent
  • I need more history for trending
  • I can’t slice it the way I want
  • I let my analyst get me what I need, and it takes them so long it is useless by the time I get it
  • I cannot compare it to anything
  • Data is too old
  • I can’t easily export to Excel
  • The graphics suck (see - export to Excel)
  • I can’t make notes or track external events or share with others
  • It's fine as far as it goes, but it's just telling me what happened, not why
  • It's just not actionable
These are just a few examples; I could go on for quite a while here. Feel free to comment with your own.

For many, the initial instinct is to blame the technology. This works really well if the folks who own the implementation are not the ones who selected the technology. In most cases, though, the technology is not at fault. This truth tends to be reinforced by that customer at the user conference (the one we got to attend for free last year) who proudly showcased all the wonderful things we cannot seem to do.

The real barrier to acceptance tends to lie in the failure to properly define, document, and act upon the true functional requirements of the application. 

This fact has not gone unnoticed by the community.  (See: How to Choose a Web Analytics Tool: A Radical Alternative and the comments for one take on technology selection failures and Why Analytical Applications Fail for a view on the difficulties inherent in defining requirements.)

In my experience as a consultant and a practitioner, I have found requirements definition failure generally results from one or more of the following mistakes:
  1. Assuming that purpose built niche applications like digital analytics are useful out of the box and do not require a formal set of requirements
  2. Thinking the requirements used to select the technology are sufficient to drive the implementation
  3. Defining requirements without any defined process, or using a methodology that is completely ill-suited to analytic applications
Anyone with experience in this area will tell you 1) and 2) are patently false apart from the simplest and most basic applications.  As for 3), it happens quite often.  Let’s break this down a bit by input, process and output:

Input: All analytic applications are hungry for data that needs to be captured within the underlying business process and technology.  For customer analytics, think website clicks, survey responses, call center interactions, etc. Event capture nearly always requires some custom coding beyond the plug-ins and templates that the tools provide. Software developers are not great mind readers; they need solid requirements to do that well.
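
To make that concrete, here is a minimal sketch of server-side event capture in Python. It is not tied to any particular vendor; the collection endpoint, event schema, and field names are hypothetical stand-ins for exactly the kind of details that documented requirements should specify:

```python
import requests
from datetime import datetime, timezone

# Hypothetical collection endpoint for a customer analytics platform
COLLECTION_ENDPOINT = "https://analytics.example.com/events"

def capture_call_event(customer_id: str, agent_id: str, disposition: str) -> None:
    """Send a call-center interaction event to the analytics collector.

    Which attributes to capture, what to call them, and how they map to the
    business process are requirements decisions, not developer guesses.
    """
    event = {
        "event_type": "call_center_interaction",
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "customer_id": customer_id,
        "agent_id": agent_id,
        "disposition": disposition,  # e.g. "resolved", "escalated", "abandoned"
    }
    requests.post(COLLECTION_ENDPOINT, json=event, timeout=5)

# Example: capture_call_event("C-1041", "A-207", "resolved")
```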

Process: Sticking with my customer analytics example, buying a tool in lieu of a custom build is an easy decision when the high-level requirements fit what is on the market. The tools themselves provide enough flexibility to satisfy most of the situations they encounter. That flexibility comes at the price of complexity, which takes the form of what can be a bewildering array of configuration options and combinations, any number of which can be applied to solve the same problem. Finding the optimal combinations, even for those with the deepest expertise in the tool, requires a well-communicated set of current and anticipated requirements.

Output: Again, purchased analytic applications usually come with a slew of predefined reports, but the utility of these is directly tied to the manner in which those report designs mesh with the application configuration and the preferences of the intended audiences, not to mention the devices those reports will be viewed on.  As such, they must also be implemented and modified based on documented requirements.

The takeaway here is, when analytic applications fail to gain adoption or prove to be ineffective, resist the temptation to blame the technology and replace it thinking that will solve the problem. Most of the time, we choose technologies that can do the job. Focus the effort on establishing a baseline set of requirements that can deliver value relatively quickly and validate the technology. From there, it is just a matter of delivering frequent incremental value and letting the application evolve as needs change and new opportunities present themselves.

Thursday, October 12, 2017

Why are Technology Managers walking away from Analytics?

As we all know, a robust business intelligence and analytics capability has become a major priority for global enterprises. Yet, in a very real sense, technology managers (IT) are actively distancing themselves from it. How do we explain this paradox? Is it a case of liberation or abdication?

The two megatrends in analytics are self-service, where IT pushes application development out to users, and cloud-based analytic platforms, where IT outsources hosting to a service provider or tool vendor. Both dramatically reduce the role of IT in analytic application development, delivery, and maintenance.

Why is this happening? There are several factors at work. One is something we call Data Democratization. This is just another way of saying that the massive amounts of data business now produces and highly values can become directly accessible to business consumers, freed from IT ‘tyranny’. IT customers love this idea. Business units gain the control they seek over their own budgets and priorities. Let’s call this Do-It-Yourself (DIY) analytics.

Recent developments in technology enable DIY analytics. The custom hardware and specialized skills once needed to acquire and warehouse data are being replaced by a wave of specialized products that can:

  • Acquire data using pre-defined connection software, with no technical expertise required (see the sketch after this list)
  • Store, process, and analyze very large volumes of structured and unstructured data using low-cost commodity hardware
  • Retrieve data, visualize results, and build applications quickly and efficiently without optimized data models, custom coding, or professional developers
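
As an illustration of the first point, here is a sketch of what data acquisition can look like when a standard connector is available. The connection string and query are placeholders, and most self-service tools hide even these few lines behind a point-and-click interface:

```python
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string for a hypothetical sales warehouse
engine = create_engine("postgresql://user:password@warehouse-host/sales")

# Pull a summarized result set straight into a DataFrame for analysis
orders = pd.read_sql(
    "SELECT order_date, region, SUM(amount) AS revenue "
    "FROM orders GROUP BY order_date, region",
    engine,
)
print(orders.head())
```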

Technology is not the only driver here. Organizational factors also make this model appealing for IT. When IT ‘liberates’ analytics, it has: 

  • Less budget to justify
  • One less source of backlogs and maintenance headaches
  • More time to focus on other issues like security and disaster recovery
  • A greater sense of agility within the enterprise, thanks to more responsive application development

The market for analytics technology has also become biased toward users and away from IT. Although the tools have become much easier to use, they have also become more difficult to select and administer as they grow more diverse and complex. They are expected to support everything from data warehousing to reporting, advanced visualization, collaboration, predictive modeling, and much more. Matching these capabilities with business requirements takes in-depth knowledge of business processes. Beyond that, the tools market has become more fragmented, with many startups and specialized, industry-specific products. Vendors, for their part, prefer selling to users in general, as those sales cycles tend to be faster.

Are these trends a good thing for your organization? It depends on both the situation and the quality of execution of a decentralized DIY BI strategy, but there are some important things to consider in every case. Governance is the elephant in the room. Striking the right balance between IT and user control is key. Risk management and overall cost controls are best managed centrally. Moreover, sound data management dictates that raw data are enterprise-level assets for all to use, and thus should be managed centrally, or at least through some kind of federated model to assure quality and integrity.

Another consideration is accountability. If a self-service analytics model fails, IT will likely get the blame. As such, IT needs to do all it can to assure success and a smooth and responsible transfer of responsibilities.

Here are some recommendations: Analytics is not a technology or an application. It is a capability with people, process, and technology components. Organizations within an enterprise become partners to create this capability by doing what each does best. For example, IT is usually best positioned to maintain and manage data quality, security standards, tools training, systems administration, and vendor management. The business units can then apply the technology to support their decisions and business processes most effectively. These organizations also need to cooperate and, to some extent, govern themselves with regard to things like: knowledge sharing, vendor relationships, data sharing and certification of results.

There is one other thing to keep in mind: The shifting of responsibilities for analytics is not a new phenomenon. Control of analytic applications has been something of a cyclical tug of war between users and IT since the early days of timesharing, the PC and spreadsheets. At this point, we are near the peak of user control in the cycle. History tells us this may reverse in time if governance fails and breakdowns in security and trust force a return to central control.


Tuesday, April 18, 2017

Data Scientists vs BI Analysts: Why is this a thing?

I came upon this popular LinkedIn post that attempts to define and draw distinctions between these two roles. It is one of several I have read recently.

Upon reading this, my initial reaction was how much I disagreed with these characterizations. My feeling is that, if we want to draw such distinctions at all, they should be between business analysts and data analysts. Data Scientist is just a newer title that combines some attributes of both business and data analysis. It nearly always includes a mastery of big data technologies and statistical methods, thus commanding higher compensation.

Then I realized that this is all beside the point. These role definitions are more about recruiting, HR job descriptions, org charts, and pay grades than what is actually required to succeed in an analytics program. What matters is having the necessary skill sets on the analytics team, regardless of what roles or organizations they come from.

As has always been the case in BI & Analytics, the critical skill sets can be considered using the classic Input → Process → Output model:

Input
  • Data sourcing & extraction (ETL/ELT)
  • Data preparation
  • Data quality
  • Data governance

Process
  • Data navigation & investigation
  • Data discovery
  • Business analysis
  • Modeling
  • Predictive analytics

Output
  • Reporting
  • Dashboards & KPIs
  • Visualization
  • Operational applications
  • Presentation/storytelling

BI/Analytics technology no longer respects the walls between these skill sets. The market has moved away from niche tools to suites that address the entire analytics capability set. For example, what were once pure visualization tools now offer data sourcing, transformation and modeling features. The impact of this has been to democratize the entire data supply chain in such a way that it has moved much closer to the business and completely obscured the role distinctions between data analysts, scientists and yes, decision makers. In fact, the overlap of these roles and the trend toward self-service BI tends to create organizational redundancy within larger organizations that can afford it.

The fact that the technology is available to many roles does not mean that individuals should be expected to have all the necessary skills to leverage it effectively. In fact, very few people do. Our trade has always placed a high value on those who can navigate data, develop actionable information, and present it effectively, because they are still rare. This won’t last. The generation now entering the workforce has a much higher level of data skills than its predecessors and will value the ability to develop its own stories and support its own decisions with data as it rises to executive positions.

If the goal is to leverage data most effectively and maximize decision support success, don’t look to your organization to create a new role. Look to your team to fill any skills gaps, preferably by expanding the roles already in place. The goal is to minimize the organizational distance, handoffs, and filters between your sources of data and those who directly put it to use in business processes.

Thursday, December 8, 2016

Agile is dead: Analytic Applications Edition - continued

In part 1 of this post, I wrote about how the software development community has speculated that Agile methodologies may have become over-hyped and the implications of this for BI/Analytics practitioners.

It’s now official.  Agile methodology for data and analytics has indeed reached the peak of the hype cycle. How do I know this? Because the venerable strategy firm McKinsey has blessed Agile as a data ‘transformation’ enabler.  See this white paper. I encourage you to read it in its entirety, but I will highlight some of the major points here.

The main thesis of the piece is that data has become a key strategic asset for their large enterprise clients (!) and that the diversity, volume, and velocity of that data require a high level of agility to leverage it quickly enough to effectively support decision making and opportunity discovery.

The key challenges that these clients face are integrating the data silos that their IT architecture creates and drawing a direct connection between data management success and quantifiable business benefits. This, in turn, makes it difficult to justify the significant investments required to manage diverse data at scale.

They point out that Agile methodology has been adopted by IT management to make application software development more responsive to business needs. They then posit that Agile can have a similar impact on the establishment of enterprise data management capabilities in the age of Big Data.

Wow! They are just coming to the realization that enterprise data management is a good thing and traditional IT practices in this sphere can be ponderous, excruciatingly slow, exceedingly expensive, and out of touch with their missions?

The analytics community has known this for decades. Typically, the fix has been for business areas to go ‘rogue’ and build out their own data capabilities to drive their analytics. This is an old issue, but it is now getting new currency with McKinsey’s IT clients as technology, in the form of cloud-based BI platforms, powerful self-service BI tools, and APIs, makes ‘roll your own’ analytics practical and relatively cheap at scale.

Of course, this is always the case when technology creates new-found BI ‘democracy’. Governance suffers and the data silos reach their limits quickly when business operations require true cross-functional analysis with integrated data.

McKinsey's solution approach has 4 major aspects:
  • Create and empower Agile cross-functional teams (scrums)
  • Update the technology infrastructure to support and integrate “Big” and legacy data assets
  • Emphasize new forms of communication to demonstrate value and discover new opportunities
  • Develop KPIs to measure success
What’s interesting to me about the suggested solution approach is that there is little news here for BI/Analytics professionals and only passing mention of Agile tenets like scrums, user stories, Kanban, and the rest. What they are really advocating is good old cross-functional engagement: starting with delivery of modest, high-priority value, constantly iterating, and doing a better job of demonstrating ongoing success.

Here is where I take issue with what they are saying: it makes perfect sense from a strategic perspective, but it can be very difficult to implement tactically. Agile works best when applied to a discrete software product with its own life cycle. Forming scrum teams that work in parallel sprints, churn out stories and epics, and then ultimately disband can be practical in this scenario.

Data management, however, is not a project, a product, or even a platform. It must be an ongoing capability if it is to work. To McKinsey, this requires drafting business experts to join their highly talented and experienced IT counterparts and walling them all off in a data lab. This, however, cannot be a short-term assignment. In fact, if these labs succeed in discovering new opportunities, they will create an ongoing need to remain in place. Even the largest organizations I have worked in cannot afford to take that talent from their native organizations and send them to a lab for long.

For blue-chip consulting firms, promoting this kind of transformation initiative makes for some very lucrative consulting opportunities (I know; I have worked on some of them). I believe what works better in practice is to take an entire line of business and build (or rebuild) it from the ground up to support not only a comprehensive data management capability, but a data-driven culture where everyone has some direct responsibility in their job description for acquiring, processing, deploying, and using data in their daily work. The success of that effort can then be used to propagate the culture across the other businesses in the enterprise.

Perhaps instead of thinking in terms of minimum viable products, we should set a minimum viable business unit as the initial data transformation goal. From that, we can deliver, iterate, improve, and expand by example.

Thursday, November 17, 2016

Agile is dead: Analytic Applications Edition

For those of you who are not involved in developing software applications: the Agile movement was, in essence, a revolt against the then-common practice of taking on the entirety of large software projects all at once, adhering to a methodology that stressed a strict sequential progression of stages beginning with requirements gathering, followed by programming and testing, and culminating in a ‘big bang’ implementation. This came to be known as ‘waterfall’ methodology. Among its many issues is the fact that requirements usually change over the course of these lengthy efforts, and what ends up being delivered often no longer adequately addresses the needs of the businesses that commissioned it.

Those of us who develop and manage analytic applications came to this conclusion decades ago out of necessity. One reason is that these applications support a unique business process: decision making. Unlike more conventional business processes such as booking airline reservations, decision making can take many paths as data is analyzed and new information is discovered. The idea that one can precisely define requirements in advance of implementation simply does not apply to all but the most structured decisions, like automated algorithm-based credit decisions. Even those require frequent updates as outcomes drive new learning and improved algorithms. Successful analytic application developers learned to use prototyping and relatively short incremental development cycles to keep their products relevant and achieve customer satisfaction. In essence, we adopted agility long before the Agile development hype cycle began. I went into more detail on this in a previous post on the impact of Agile on Analytics.

Since that time, there has been both a technology- and culture-driven boom in analytic applications development as businesses of all sizes and maturities adopt a more data-driven culture. At the same time, the tenets of Agile methodology have been zealously embraced by IT executives who bought into the hype as they sought to deliver better applications sooner and cheaper. In fact, Agile certification became a requirement to work on all projects in some shops. Collisions ensued as the realization set in that orthodox Agile methods were not developed with analytics in mind and often could not be applied successfully to Business Intelligence and Analytic application development projects.

BI/Analytic application project teams were put in a familiar and awkward position. They could either try to explain why a methodology designed for a different purpose does not apply, or create the illusion of compliance with Agile dogma, technology, and terminology that added little or no value to their efforts. Meanwhile, vendors and consultants in the Analytics space were all too happy to ride the wave, coining the term “Agile Analytics” in an attempt to reconcile Agile mandates with proven methods in the BI/Analytics discipline.

It now appears that the software developer community at large is having qualms about the Agile software ‘revolution’ and what it became. Even the original thought leaders of the Agile movement have reservations about what has come of it. There has even been something of a developer revolt against it. Then there are the chronicles of the Agile hype cycle and some very thoughtful pieces on how to move forward from Agile.

These critiques of the Agile movement as it is currently practiced have several points in common:
  • Agile has passed the peak of its hype cycle and benefits resumes and consultants more than software projects
  • Agile, as it is currently practiced, has become driven more by process than by objectives. This is exactly one of the faults it was designed to cure
  • Requirements definition (often in the form of vague ‘user stories’) has suffered to the point where it has degraded testing, compromised necessary documentation and caused a fall in overall quality of delivered products
  • Adoption of Agile in Name Only (AINO) practices where a waterfall mentality persists, development performance metrics remain a goal unto themselves, scrums become meetings, sprints become epics before defined value is actually delivered, and development teams remain as disconnected from the business as ever
  • Applications architecture suffers as semi-independent project teams ignore standards and governance to meet their time and cost constraints. This one in particular is deadly in the long run
Aside from sounding very familiar, what does this mean to us in the Analytics community? It means we need to stress that what works for more traditional business process applications often does not apply to the unique nature of decision support and analytics applications. We define capabilities, not user stories. We lay down a sustainable data platform before we attempt to build applications on it. We prove concepts and prototype before we make major investments, we govern our data relentlessly to preserve credibility, and we develop our environments with an eye toward how they will be maintained and enhanced. Most importantly, we must remain focused on the results we can achieve while they are still needed and not worry so much about how we achieve them.