25 Mistakes you are Making in your Advanced Analytics Program

Raise your hand if you are making more than 15!


  1. Day-dreaming that analytics is a plug-and-play magic wand that will bring short-term ROI. Well-executed basic Excel models might have brought quick wins in the 2000s, but advanced analytics requires time. Analytics is never plug and play: plugging data into models is extremely lengthy, learnings are not transferable across companies or markets, and it requires high OPEX in people and high CAPEX in systems.
  2. Solving problems that are not really worth solving, which wastes time and resources. Analytics is not about solutions looking for problems but problems looking for solutions. Questions such as “What can we do with blockchain?” do not make sense. The worst mistake a Chief Data Analytics Officer can make is not having an extremely clear view of the key challenges and opportunities each functional area is confronted with.
  3. Relying solely on vendors or consultants for analytics, especially for model creation. The post-mortem of how corporates fail to develop capabilities with consultants reads as follows: the client hires a consultant to deliver a project and, at the same time, develop internal capabilities. The client has unrealistic expectations about the impact of the project, and the consultants never say “No” and oversell it. The impact does not materialize, and one day the client tells the consultant: “If you do not deliver some impact in the next month, I will stop your contract.” That day, capability development officially dies, if it had ever existed. RIP. A few million dollars in the trash bin. In any case, analytics is the brain of your company. How could corporates even think they could outsource it?
  4. Not keeping the list of priorities short: you can only count to five on one hand. Management should therefore pick at most five metrics rather than making everything seem important.
  5. Saying yes to random management requests, such as pet projects or glamorous visualizations and reporting, which often results in analysis-paralysis.
  6. Assuming that abstaining from external data monetization or from the cloud is the solution to data privacy and security. While some industries face regulatory restrictions, and there are sometimes ethical limits, external monetization and cloud done properly do not necessarily involve any security risk.
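One concrete illustration of “done properly” is pseudonymizing direct identifiers before any dataset leaves the company. The sketch below is minimal and all field names, values and the salt are invented for the example; a real program would also cover key rotation, access control and contractual safeguards.

```python
import hashlib

def pseudonymize(record, pii_fields, salt):
    """Replace direct identifiers with salted, truncated SHA-256 digests
    so the record can be shared externally without exposing raw PII."""
    out = dict(record)
    for field in pii_fields:
        if field in out:
            digest = hashlib.sha256((salt + str(out[field])).encode()).hexdigest()
            out[field] = digest[:16]  # opaque token; raw value is not recoverable
    return out

# Hypothetical customer record (all values invented)
customer = {"msisdn": "60123456789", "name": "Jane Doe", "monthly_spend": 42.5}
safe = pseudonymize(customer, pii_fields=["msisdn", "name"], salt="rotate-me-quarterly")
```

The analytical fields (here `monthly_spend`) survive untouched, so the data stays useful for modelling while the identifiers do not travel.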


  7. Organizing analytics under functions that do not drive the business on a daily basis, such as IT or strategy. Analytics is only powerful if it is coupled organizationally with daily operations.
  8. Letting multiple analytics teams flourish with organizational siloes between them. Analytics needs to keep an integrated view of the business.
  9. Attracting talent through base compensation alone. Instead, it is necessary to build a sense of purpose, create a powerful employer brand and develop internal talent.
  10. Hiring a bunch of PhDs who strive to develop highly nuanced models instead of directionally correct, rough-and-ready solutions, and hence fail to deliver actionable insights. So don’t hire PhDs; hire highly coachable fast learners.
  11. Hiring a technical Chief Data Analytics Officer. Hiring a non-technical Chief Data Analytics Officer. The CDAO needs to be both: technical enough to coach the team and business-driven enough to understand business problems.
  12. Not bringing domain experts and internal business consultants into the analytics teams to bridge the gap between business leaders and analysts and to ensure an end-to-end journey from idea to impact.
  13. Neglecting the creation of a data-driven culture through active coaching across the whole organization, from sales agents to the CEO. Oh yes, especially sales agents and the CEO.
  14. Not being objective enough and remaining biased toward the status quo or leadership thinking. Analytics teams deeply embedded in business functions or BUs are more likely to have this problem than centralized ones, which is why some organizations create quality-control teams.


  15. Not embedding analytics in operating models and day-to-day workflows, which results in a failure to integrate technology with people. When analytics is part of their daily activities, users make data-focused judgements and better-informed decisions, build consumer feedback into solutions and rapidly iterate on new products; instead, many still rely on gut feeling and HiPPOs (Highest Paid Person’s Opinions).
  16. Not collocating data scientists with the business teams they support. Otherwise they will not talk to each other.
  17. Managing analytics projects in waterfall. The parameters of a model cannot be known upfront; they are determined through an iterative process that looks more like an art than a science. Analytics projects therefore need to be iterative, following, for example, the Agile framework.
  18. Not being able to scale analytics pilots up. Analytics often starts by piloting use cases, and companies often end up killing pilots as soon as they need to reallocate funding to other shorter-term initiatives.
  19. Neglecting data governance as a fundamental enabler. Data governance refers to the organization, processes and systems an organization needs to manage its data properly and consistently as an asset, ranging from managing data quality to handling access control or defining the data architecture in a standardized way.
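The point above about iterative, agile analytics can be made concrete with a toy sketch. The data, search ranges and sprint count below are all invented for illustration: each “sprint” evaluates a handful of candidate parameters, keeps the best one and narrows the search, rather than fixing the parameter upfront as a waterfall plan would.

```python
import random

random.seed(0)

# Toy data: y ≈ 3*x plus noise; the "right" slope is unknown upfront.
xs = [i / 10 for i in range(100)]
ys = [3 * x + random.gauss(0, 0.5) for x in xs]

def error(slope):
    """Mean squared error of a one-parameter model y = slope * x."""
    return sum((y - slope * x) ** 2 for x, y in zip(xs, ys)) / len(xs)

# Sprint-style loop: evaluate candidates, keep the best, zoom in, repeat.
low, high = 0.0, 10.0
best = None
for sprint in range(5):
    candidates = [low + (high - low) * i / 10 for i in range(11)]
    best = min(candidates, key=error)
    step = (high - low) / 10
    low, high = best - step, best + step  # refine the range around the best guess
```

After a few iterations `best` settles close to the true slope of 3, which no amount of upfront planning could have specified.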


  20. Trying to create data-science models without refining your data-engineering infrastructure: cleaned repositories, efficient engines and streamlined extract-transform-load (ETL) processes. Data engineering without real use cases to model is also wrong. Modelling and engineering must proceed in parallel or iteratively.
  21. Not using any of the following basic technologies: Hadoop, Spark, R, Python, an advanced visualization tool of your choice, and a granular self-service reporting system open to the whole organization.
  22. Having technological siloes among data repositories, which makes it difficult to integrate different kinds of data into a model. The power of analytics increases exponentially with the diversity of data.
  23. Not automating analytics with A.I., which can be an extremely smart assistant to data scientists, helping them cleanse data, check for correctness, deploy models, detect relevant prediction features and model obsolescence, or even generate hundreds or thousands of model variations. All in all, the analytics strategy of the business has to be a subset of the overall A.I. strategy, since the datasets need to feed the A.I. systems.
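The streamlined extract/transform/load processes mentioned above can be sketched end to end in a few lines. The sample CSV, column names and SQLite table below are invented for the example; real pipelines would run on the engines the article names (Hadoop, Spark), but the shape of the work is the same: extract raw data, normalize and deduplicate it, and load it into a cleaned repository that analysts can query.

```python
import csv, io, sqlite3

# Extract: raw CSV as it might arrive from a source system (sample data, invented).
raw = """customer_id,plan, monthly_spend
1001,Gold, 49.90
1002,silver,19.9
1002,silver,19.9
1003,GOLD,  55
"""

# Transform: normalize casing and whitespace, cast types, drop exact duplicates.
rows, seen = [], set()
for rec in csv.DictReader(io.StringIO(raw), skipinitialspace=True):
    row = (int(rec["customer_id"]),
           rec["plan"].strip().lower(),
           float(rec["monthly_spend"].strip()))
    if row not in seen:
        seen.add(row)
        rows.append(row)

# Load: into a cleaned repository (an in-memory SQLite table stands in here).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE spend (customer_id INT, plan TEXT, monthly_spend REAL)")
db.executemany("INSERT INTO spend VALUES (?, ?, ?)", rows)
gold_avg = db.execute(
    "SELECT AVG(monthly_spend) FROM spend WHERE plan = 'gold'").fetchone()[0]
```

Only after this cleaning step does a query like the average spend of “gold” customers return a trustworthy number; skipping it is exactly the mistake the section warns about.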


  24. Not allocating enough budget for analytics platforms while still keeping Shangri-La expectations. The opposite is also an error: allocating more money than can be tied to business outcomes.
  25. Not measuring the ROI of analytics initiatives. We know ROI materializes in the mid term, but that does not mean you should not measure it.
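Measuring mid-term ROI need not be complicated. A minimal sketch, with entirely invented figures: accumulate benefits and costs over the investment horizon and compare them, accepting that the first year may well be negative.

```python
def analytics_roi(benefit_by_year, cost_by_year):
    """Cumulative ROI over the horizon: (total benefit - total cost) / total cost."""
    benefit, cost = sum(benefit_by_year), sum(cost_by_year)
    return (benefit - cost) / cost

# Hypothetical use case that loses money in year 1 and pays back later ($M, invented).
roi = analytics_roi(benefit_by_year=[0.2, 1.5, 3.0],   # incremental margin
                    cost_by_year=[1.0, 0.8, 0.8])      # platform + team
```

Even this crude version forces the discipline the item asks for: an initiative whose cumulative ROI never turns positive over the horizon is a candidate for the chopping block, however glamorous the dashboard.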

Disclaimer: Opinions in the article do not represent the ones endorsed by the author’s employer.

About the author

Pedro URIA RECIO is a thought leader in artificial intelligence, data analytics and digital marketing. His career has encompassed building, leading and mentoring diverse high-performing teams, developing marketing and analytics strategy, commercial leadership with P&L ownership, leading transformational programs, and management consulting.


Special thanks to Shahid Shayaa, Kah Ming Lim and Vala Ali Rohani for sharing their ideas.


