
Free CORE Community Version Empowers Data Scientists to Focus on Innovation

New, no-cost community version helps advance the AI landscape.

Note: TDWI’s editors carefully choose vendor-issued press releases about new or upgraded products and services. We have edited and/or condensed this release to highlight key features but make no claims as to the accuracy of the vendor's statements.

The enterprise data science platform has released its community version, CORE, amid extended remote work and social distancing, to advance ML development and help the data science community leverage its model management and MLOps capabilities at no cost.

The data science community has been central to the rapid growth of AI and machine learning innovation. Before the industry-wide hype, data science was unexplored territory for enthusiasts seeking deeper insights by combining the power of mathematics, statistics, and computer science. What began as a way of tinkering with algorithms has since been widely embraced by the corporate world as a competitive strategy.

With the growing technical complexity of the AI field, the data science community has lost touch with the core of what makes data science such a captivating profession -- the algorithms. CORE opens this end-to-end solution to the community, helping data scientists focus less on technical complexity and DevOps and more on the core of data science.

The new platform offers a code-first workbench with workflow management tools and cluster orchestration for any on-premises or cloud computing environment. Built by data scientists for data scientists, CORE offers everything a data scientist needs to build high-impact ML models from research to production with:

  • ML workflow management: End-to-end tracking and monitoring capabilities
  • Cluster orchestration: Hybrid/multi-cloud-native Kubernetes and meta-scheduler 
  • AI frameworks in one click: Native integration with NGC GPU-optimized containers 
  • Full resource utilization: Advanced compute querying and autoscale 
  • Optimized resource management: Compute dashboard and advanced compute permissions

The platform can be installed on premises or in a cloud environment directly via the website. Users will be able to connect data sources, run ML experiments at scale, and deploy to production with full tracking and monitoring tools. 

More information is available at
