
Supporting Your Digital Transformation

Digital transformation occurs when enterprise organizations use technology to drive an overhaul of their performance or reach. Planning for consistent radical change is a daunting task, but the reality is that digital transformation is the key to staying competitive in the modern world.

It’s now commonplace for companies to use data science to understand consumer needs, but only 22% of companies surveyed by Forrester last year reported they were leveraging their data well enough to get ahead of their competition. So what’s the missing piece? To drive a digital transformation, data science efforts must be seamlessly supported by IT.

You already know that you want your data scientists’ models to be reproducible and easy to deploy to a production environment, so don’t treat IT requirements like standardization and deployment functionality as an afterthought. It’s crucial to understand and prepare for common challenges associated with data management, governance, and access before they cripple a data-driven digital transformation at your organization.

Using software containers is one of the most impactful steps you can take to implement IT management best practices. These standardized development environments ensure that the hard work your data scientists put into building predictive models won’t go to waste when it’s time to deploy their code in a different environment. Without them, a data scientist launching a new environment must either wait for IT to build one for them each time, or launch one themselves using the unique combination of packages and resources they prefer to use for the problem at hand. There are two key issues associated with both of these approaches:


    They don't scale. When data scientists are individually responsible for configuring environments on a project-by-project basis, ensuring their work is reproducible becomes a nightmare. Containers solve this because they put the power in the hands of IT to standardize how environments are configured in advance.

    They're actually slower. Every ad-hoc environment is built from scratch, whether by IT or by the data scientist. With a little foresight, your IT team can instead use a container service like Docker to build templates called images, with standard sets of languages, packages, and frameworks, from which data scientists can launch and even customize environments in just a few clicks.

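As a sketch of what such a template might look like, here is a minimal Dockerfile an IT team could use to define a standard data science image. The base image, package choices, and versions are illustrative assumptions, not a prescription:

```dockerfile
# Standard data science base image, maintained by IT.
# Data scientists launch environments from this image instead of
# configuring packages by hand for each project.
FROM python:3.11-slim

# Pin an approved, reproducible set of packages.
RUN pip install --no-cache-dir \
    pandas==2.2.2 \
    scikit-learn==1.5.0 \
    jupyterlab==4.2.0

WORKDIR /workspace

# Default to a JupyterLab server the data scientist can reach in a browser.
CMD ["jupyter", "lab", "--ip=0.0.0.0", "--no-browser"]
```

A data scientist could then start a fully configured environment with a single command such as `docker run -p 8888:8888 yourorg/ds-base` (the image name here is hypothetical), and every project launched this way starts from the same reproducible baseline.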
Using containers to launch environments saves time for both your data science and IT teams, but it’s just the first step. For more IT management best practices, check out our latest article in Forbes about Data Science, IT, and Your Digital Transformation.
