You’ve spent weeks analyzing the data.
Through your analysis, you’ve found some interesting trends and even created some visualizations to highlight them. There are key messages you hope the audience will come away with, but all of a sudden you’re feeling a bit hesitant:
Is it my place to write a conclusion here? What if I’ve gotten something wrong? Do I even have enough data?
Providing data-driven insights is now something I consider an integral part of my analysis work.
This step is so crucial that we actually begin each immersive cohort curriculum with the foundational Thinking Like an Analyst course, which outlines the importance of data-driven insights as a key step in the Data Analysis Workflow. Providing the audience with data-driven insights helps ensure clear communication and can even help drive teams to action.
But this hasn’t always been something I’ve done. After a bit of reflection, I’ve identified a few barriers that held me back from including insights or recommendations in the past, along with some tips on how I’ve worked to overcome them.
1. Being afraid I didn’t have ‘enough’ data.
Some of my earliest work experiences in data analysis came out of a laboratory environment where data collection is a very controlled process. Writing formal protocols that specified what data to collect, how to evaluate it statistically, and the corresponding acceptance criteria was all part of standard practice. Plus, it was not uncommon to conduct experiments multiple times to check for both the repeatability and reproducibility of results.
When I moved out of the lab, I felt a bit overwhelmed working with real-world data for the first time. Real-world data is messy! Often I was pulling information from systems that were not purpose-built for the measurements needed. Sometimes there were gaps in the data. Other times we had to use estimates because the ideal measurement didn’t exist.
Overall, I quickly learned that the type of controlled environment I had come from in the lab did not (and could not) exist everywhere. As a result, I found myself second-guessing whether I had sufficient data to draw an appropriate conclusion or make a recommendation.
My comfort level in drawing conclusions from various types of data sets grew when I was able to add a risk-based lens to the reporting. Many times this meant providing clear statements to the audience about the limitations of the analysis or conclusions, given the available data and its overall quality. Other times it meant advocating for pausing and going back to the data collection process before proceeding.
Ultimately, having these conversations helped me break my own analysis paralysis and engage with the wider team to help keep the work moving forward.
2. Feeling like it wasn’t my place to make a recommendation.
Analytical skills are specialized and often in demand, so team members who have them can be highly sought after for project work.
In my experience, this has often resulted in being added to various types of project teams in a consultative role. By consultative, I mean having a role within the team to analyze and report on data but no specific responsibility or authority for directing process change. Having transitioned from an operational role where I did have decision-making responsibilities, I found myself stumbling at first, unsure where or when to add my voice.
This changed for me when I swapped the narrative in my own mind. I purposefully shifted my thinking from ‘What do I recommend?’ to **‘What is the data telling us?’ or ‘We might need to focus on…’**
This helped me center my communication on what the data was revealing about a process and use that to inform teams as they moved through a project. And I learned that this strategy was both influential and supportive!
Over time, teams would become more comfortable with integrating data-driven insights into their workflow, and before long certain teams would actually pause along the way and ask for analysis before proceeding.
3. Being afraid of getting it wrong because I lacked domain knowledge.
Transitioning from that operational role, where I had both technical and historical knowledge, made me aware that there were going to be elements I might not understand or identify simply by looking through a data set.
I was fearful that my lack of context would cause me to make a huge mistake in my analysis and lead to faulty conclusions…and then it finally happened!
I was in the process of sharing a series of graphs I had painstakingly prepared for a project team update when the team members began to look at me with confusion. Gently, they let me know that there was something very wrong with the values I had calculated. And of course, they were 100% right!
I had mistakenly used the wrong field in the database to calculate a time interval, resulting in nonsensical values being visualized. Without the domain knowledge of what an expected time interval could or should be, I had gotten all the way through preparing slides without even realizing the mistake.
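As an aside, one habit that now helps me catch this kind of mistake earlier is a quick sanity check on calculated values before they ever reach a slide. Here’s a minimal sketch in Python using pandas (the DataFrame and column names are hypothetical, not from the actual project):

```python
import pandas as pd

# Hypothetical ticket data with opened/closed timestamps
events = pd.DataFrame({
    "ticket_id": [101, 102, 103],
    "opened_at": pd.to_datetime(["2023-01-05", "2023-01-06", "2023-01-07"]),
    "closed_at": pd.to_datetime(["2023-01-08", "2023-01-04", "2023-01-09"]),
})

# Calculate the time interval we plan to visualize
events["days_open"] = (events["closed_at"] - events["opened_at"]).dt.days

# Sanity check: negative or implausibly large intervals often mean the
# wrong field went into the calculation, not that the process is broken
suspicious = events[(events["days_open"] < 0) | (events["days_open"] > 365)]
if not suspicious.empty:
    print("Review these rows with a domain expert before building visuals:")
    print(suspicious)
```

A check like this doesn’t replace a domain expert’s review, but it can flag obviously nonsensical values before anyone else sees them.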
So, while I was totally embarrassed at the time, this experience taught me two valuable lessons:
First, owning up to that mistake wasn't as big of a deal as I had made it out to be in my mind.
Once the error had been identified, instead of defending what I had done, I humbly acknowledged the error, went back to the calculations to make the correction, and provided a revised update. This was key to maintaining the team’s confidence in the analysis and in me as a teammate.
The second lesson… the power of showing the data or analysis to the people closest to the work.
Clearly, this is an error I wouldn’t repeat once I understood the proper definitions for each field in the dataset, but without domain knowledge, I would remain vulnerable to similar mistakes. Knowing that the people closest to the work were immediately able to tell something wasn’t quite right, I now build space into my review process for someone close to the work to provide feedback early in the analysis.
This experience reinforced for me that I needn’t rely solely on myself. Instead, I see my analysis skills as a complement to the domain knowledge and experience of subject matter experts, drawing them in (the earlier, the better) to add context and help all of us head off those avoidable mistakes.
Final Thoughts
Remember: no one gets everything perfect right away.
It’s taken me practice and experience to become more confident stepping into the space of drawing conclusions or making recommendations through data-driven insights, so it’s okay if it takes you some time, too!
It’s been a process of replacing the fallacies that initially hindered me with strategies that help me leverage my own skills, communicate effectively, and become more comfortable with what I can bring to the table to support my team.
With a little time and effort, you’ll get there.
Be passionate. Seek mastery. Learn with humility.
-Stacy
SUPER EARLY BIRD IS HERE!
For a limited time, save 25% on our upcoming Python & Power BI immersive programs!
Explore how our immersive programs with direct instructor access, weekly live sessions, and collaborative environments can elevate your skills and accelerate your career.
Stacy Giroux
Cohort Learning Lead
Stacy is a former Cohort Learning Lead for Maven Analytics, where she helped design, manage, and facilitate immersive bootcamp experiences for aspiring data professionals.