Praxis invited me to write a blog on what project data analytics is and how it can help us. There is a fear that robots will take our jobs and make us all redundant. I see it differently: project data analytics has the potential to give us superpowers and an ability to see into the future, removing the burden of repetitive work and extracting insights from the patterns that we know exist in project data.
So what is project data analytics?
Project data analytics, at its simplest, is the use of past and current project data to enable effective decisions on project delivery. To be effective we need data, often in large volumes, which means keeping a keen eye on data harvesting and stewardship.
Imagine a world where we can use data from previous projects to predict where future projects are likely to diverge from plan, where we get better at understanding return on investment, estimating how to resource a project or where best to invest contingency. A lot of these insights are encapsulated in the experience that we accrue through projects.
We often distil this experience down into selective abstracts and call it lessons learned. Yet we all know that complex relationships and cause-and-effect chains exist that are difficult to summarise in a paragraph. We now have an opportunity to deploy the latest methods in data science to find patterns in project data, surfacing insights that would otherwise be incredibly difficult to uncover.
I spent around a year pulling together a dataset of 20,000 lessons learned. I expected to delve into the data and find the secret sauce to enable us to deliver projects on time and budget. The more I looked, the more reports I found on the top 10, 20, 100 reasons for project failure; or the inverse, reasons for project success. What I discovered was that the lessons learned were largely worthless. Let me explain why.
If we think like a bot, we need to codify those lessons so they can be flagged to the right person without anyone trawling through massive datasets. We need to know what each lesson relates to (a work breakdown element or a schedule activity, for example) and any constraints on where it applies. We also need the conditions at the time the lesson arose, so we know how much to adjust next time. I didn’t find any lessons learned datasets able to do this.
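To make that concrete, here is a minimal sketch of what a machine-readable lesson record might look like. The field names, structure and example values are entirely illustrative assumptions on my part, not drawn from any real dataset or standard:

```python
from dataclasses import dataclass, field


@dataclass
class Lesson:
    """A machine-readable lesson learned record (illustrative fields only)."""
    summary: str                    # the free-text lesson itself
    relates_to: str                 # e.g. a work breakdown element or schedule activity
    constraints: list[str] = field(default_factory=list)      # limits on applicability
    conditions: dict[str, str] = field(default_factory=dict)  # context when it arose


# Hypothetical example: a lesson tagged so a bot could surface it
# when similar conditions recur on a future project.
lesson = Lesson(
    summary="Groundworks overran by 3 weeks after late utility surveys",
    relates_to="WBS 2.1 Groundworks",
    constraints=["brownfield sites only"],
    conditions={"phase": "detailed design", "contract": "NEC4 Option C"},
)
```

With structure like this, a tool can match lessons to the current project's context instead of asking a human to read 20,000 paragraphs.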
Imagine an analogy between managing a project and driving a rally car at full speed. We don’t need to understand the performance of every rally car on every road in all conditions. We want to understand the areas of concern specific to the car, the driver, the road and the weather on the day. Which drivers lost time on a specific sector and why?
Instead, we run the project equivalent of that rally team by sitting in a meeting room at the end of the season, pontificating on what we could have done better, then writing it down in paragraphs in a spreadsheet. It is sheer madness.
Lessons learned is fundamentally misguided.
We expect project professionals to know about every potential issue, in every possible context, in real time. That requires superhuman powers. Alternatively, we can augment our knowledge with a computer: think Iron Man rather than Wikipedia. The basis of machine learning is to use algorithms to learn from data without relying on rules-based programming; isn’t this a good match for our use case?
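As a toy illustration of learning from data rather than writing rules, the sketch below "learns" word frequencies from a handful of invented lesson summaries and uses them to label a new one. Real systems use far richer models; every lesson and label here is a hypothetical example:

```python
from collections import Counter, defaultdict

# Toy labelled "lessons" — entirely invented, not real project data.
training = [
    ("late supplier deliveries delayed the schedule", "schedule"),
    ("milestone slipped due to resource gaps", "schedule"),
    ("scope creep inflated the budget", "cost"),
    ("contingency exhausted by rework costs", "cost"),
]

# "Training": count how often each word appears under each label.
# No hand-written rules — the associations come from the data.
word_counts = defaultdict(Counter)
for text, label in training:
    word_counts[label].update(text.split())


def predict(text):
    """Pick the label whose learned vocabulary best overlaps the new text."""
    words = text.split()
    return max(word_counts, key=lambda lbl: sum(word_counts[lbl][w] for w in words))


print(predict("the schedule milestone slipped"))  # → schedule
```

The point is the shape of the approach: add more labelled lessons and the classifier improves, with no one rewriting any rules.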
So where does project data analytics come into it?
If we accept that data helps to codify our experience, we can use data analytics to provide insights at the point of need, leveraging our hard-won experience.
The first step is to ensure that we are collecting the right data to answer the questions that we need answered as project professionals. The second step is to apply methods and tools to derive insights from that data.
Some organisations jump straight in and experiment with Power BI, Power Apps and Power Automate. This helps them build capability, awareness and traction.
Others will map out what they want to achieve and what data they need to do it, then design an ecosystem to deliver it. A big bang waterfall project is unlikely to succeed because the technology is moving at pace, but it is important to set out a roadmap and the key lines of development, from upskilling people through to data pipelines and tools.
A lot of organisations constrain their thinking to ‘their data’, but those who will outperform are beginning to consider project data end to end, across the lifecycle: how to capture, harness and leverage it.
How can it help us?
The opportunities are practically limitless. I have never seen such a wide open goal with such potential to make a difference.
Taking a few examples:
Using robotic process automation to remove the burden of repetitive processes; the things that we all dislike doing. Whilst doing so we also improve data quality.
Organisations such as nPlan are processing hundreds of thousands of schedules to understand where variance is likely to occur and define probabilistic out-turns.
We gain insights into which work packages are more predisposed to quality defects or compensation events and why.
We can understand team sentiment and divergence between leadership and the delivery team. We gain insights into lead indicators and areas of focus.
We can understand the likely shortfalls in benefits and the factors that are most likely to influence benefits realisation.
And these are just a few of the opportunities.
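The probabilistic out-turn idea above can be sketched as a simple Monte Carlo simulation. The activities, durations and the choice of triangular distributions below are invented for illustration and bear no relation to nPlan's actual methods or data:

```python
import random

random.seed(42)  # reproducible for the example

# Illustrative activities: (name, optimistic, most likely, pessimistic) in weeks.
activities = [
    ("design", 4, 6, 10),
    ("procurement", 3, 5, 12),
    ("construction", 10, 14, 24),
]


def simulate_outturn(n=10_000):
    """Sample each activity duration and sum the (single) path, n times."""
    totals = []
    for _ in range(n):
        totals.append(sum(random.triangular(lo, hi, mode)
                          for _, lo, mode, hi in activities))
    totals.sort()
    # Report percentile out-turns rather than a single-point estimate.
    return {"P50": totals[n // 2], "P80": totals[int(n * 0.8)]}


print(simulate_outturn())
```

Even this crude sketch shows the shift in mindset: instead of one deterministic finish date, we get a distribution, and can ask how much contingency a P80 commitment requires over the P50.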
We all need to start somewhere, and I would advocate engaging with the Project Data Analytics Community to understand the art of the possible, get inspired and learn from others. Also take a look at the Project Data Analytics Task Force White Paper, which puts more flesh on the bones. Then develop your own implementation strategy and plan.
Some organisations are already starting to invest heavily. Others are ambivalent. But we will see the divergence between early adopters and laggards rapidly accelerate throughout 2021. Each will develop their own superpower niche and begin to outperform. Hugely exciting.