You might feel like building an analytic model is the goal of data science. You wouldn't be alone. Yet companies commonly report that around 70% of analytics projects fail, and the biggest reason is not poor predictions. It's that many teams don't know how to finish.
There are critical steps beyond creating predictive models.
In order for a business to realize the value of analytics, data science models must be delivered in useful ways to stakeholders and customers. The deployment process, where we make data science usable by a company or the world, is one of the final steps a company must take to fully leverage its data and realize measurable ROI. Yet many businesses struggle to get this crucial step right.
This Tech Tuesday is the first in a series of five articles in which I will show you how to get real business value from data science by leading you through a framework that establishes deployment as the final pillar of a complete analytics process.
I’ll share the proven methods we use for deployment, from ideation to optimization, with the objective of helping you understand what is required to effectively productize A.I.
Starting this week, I will show you why deployment should be one of the earliest conversations you have when starting an A.I. project, and how to work with stakeholders as end-users to ensure buy-in.
Next week, I’ll highlight the importance of choosing the right metrics to track deployment success and the difference between pre-deployment and post-deployment metrics. I’ll also cover what comes after deployment; there’s more to it than you might think!
Let’s look at where deployment fits in the overall data science process. It’s the last step, coming after data preparation and the building of machine learning models.
At a more granular level, here’s our tested and proven 10-step data science process (very similar to the open standard CRISP-DM):
- Business understanding
- Data assessment and understanding
- Choose model candidates
- Data preparation
- Model building
- Model evaluation
- Pipeline construction
- Deployment
- Monitoring and maintenance
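To make the deployment step at the end of this process concrete, here is a minimal sketch of what "shipping" a model can look like: the training side persists a fitted model as an artifact, and a separate serving context loads that artifact to score new records. The `ChurnScorer` class, its toy weights, and the file path are illustrative stand-ins I've made up for this sketch, not part of any specific process or product.

```python
# A minimal sketch of the "deployment" step: persist a trained model as an
# artifact, then load it in a separate serving context to score new data.
import os
import pickle
import tempfile

class ChurnScorer:
    """Toy stand-in for a fitted model: real weights would come from training."""
    def __init__(self, weights, bias):
        self.weights = weights
        self.bias = bias

    def score(self, features):
        # Linear score: bias + dot(weights, features)
        return self.bias + sum(w * x for w, x in zip(self.weights, features))

# "Training" side: fit (here: hard-code) a model and serialize the artifact.
model = ChurnScorer(weights=[0.4, -0.2], bias=0.1)
path = os.path.join(tempfile.gettempdir(), "churn_scorer.pkl")
with open(path, "wb") as f:
    pickle.dump(model, f)

# "Serving" side: load the artifact and score an incoming record.
with open(path, "rb") as f:
    deployed = pickle.load(f)

print(round(deployed.score([1.0, 2.0]), 2))  # 0.1 + 0.4*1.0 - 0.2*2.0 = 0.1
```

In a real deployment the serving side would typically live behind an API or a batch scoring job, but the core contract is the same: a versioned model artifact crosses the boundary between training and serving.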
Notice all the work that must be carried out before getting to the deployment stage. Essential tasks such as developing a strong business understanding and ensuring access to clean data weigh heavily on the success of the entire effort.
Imagine building a house without considering what it will be like to actually live there: to walk down the hallways, or to come home from a long day and put away the groceries. Can the refrigerator open while a cabinet is also open? Questions like these illustrate why you need a comprehensive understanding of both the early and late stages of a data science project’s lifecycle.
Also take note that model deployment isn’t the single “end goal”. Monitoring, maintenance, optimization, and testing dictate how long data science work remains viable. In other words, predictive models need babysitting to make sure they remain valid.
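One simple form that babysitting can take is a drift check: if we logged summary statistics of the training data, we can compute the same statistics on live scoring traffic and alert when they diverge. The sketch below assumes exactly that setup; the feature (customer age), the sample values, and the z-score threshold are all made up for illustration.

```python
# A minimal sketch of post-deployment monitoring: flag a feature whose live
# mean has drifted far from the mean seen at training time.
from statistics import mean, stdev

def drift_alert(baseline, live, z_threshold=3.0):
    """Return True when the live mean sits more than z_threshold
    baseline standard errors away from the training mean."""
    base_mean = mean(baseline)
    standard_error = stdev(baseline) / len(baseline) ** 0.5
    z = abs(mean(live) - base_mean) / standard_error
    return z > z_threshold

training_ages = [34, 41, 29, 38, 45, 31, 36, 40, 33, 39]   # logged at training time
live_ages_ok = [35, 37, 40, 32, 38]                         # looks like training data
live_ages_shifted = [61, 66, 59, 64, 63]                    # population changed under us

print(drift_alert(training_ages, live_ages_ok))       # False: model still valid
print(drift_alert(training_ages, live_ages_shifted))  # True: time to investigate/retrain
```

Production monitoring tracks many more signals (prediction distributions, label lag, business KPIs), but the pattern is the same: compare live behavior against a baseline and alert on divergence.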
In next week’s Tech Tuesday, article 2 of this series, I’ll dive into more detail about how deployment makes it possible to share valuable information and insights company- and world-wide. By sharing data-driven insights through deployment, companies realize value from their analytics dollars.
In the following weeks, I’ll cover all the essential components of successful deployment and what we do with our customers to make sure our predictive models deliver real business impact.
If you have any questions, hit reply! And if there’s anything specific you’d like to learn in this guide, feel free to let me know.
Here’s wishing you a great week,
Tired of Sitting Inside? You’re not Alone. Here’s an Analysis of Lockdown Fatigue
Google and Apple have released public mobility datasets. Here’s an analysis of the severity of European lockdowns and their observed mobility implications using this data, current through mid-May.
Did You Know You Can A/B Test Your Algorithms Too?
The same methodology used for testing whether a red or blue button will get more clicks can be used to test whether that fancy new predictive model will outperform the one you’re already using. Here’s a simple guide to A/B testing for data science:
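As a hedged illustration of that idea: route traffic to a champion and a challenger model, record a binary success per request (click, conversion, correct prediction), and run a two-proportion z-test on the results. The conversion counts below are invented for the example, and this is one common test choice, not the only way to A/B test models.

```python
# Sketch: A/B testing two deployed models with a two-proportion z-test.
from math import erf, sqrt

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Return (z, two-sided p-value) for H0: the two success rates are equal."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)        # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Champion model A: 520 conversions in 5,000 requests (10.4%)
# Challenger model B: 600 conversions in 5,000 requests (12.0%)
z, p = two_proportion_z(520, 5000, 600, 5000)
print(f"z = {z:.2f}, p = {p:.4f}")
if p < 0.05:
    print("Challenger's lift is statistically significant at the 5% level.")
```

The mechanics are identical to a red-button/blue-button test; the only difference is that the "variant" is which model scored the request.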
Artificial Intelligence Cannot Be an Inventor, U.S. Patent Office Rules
An A.I. system called DABUS “invented” two new devices, but the USPTO says only humans can do that.