An open toolkit for composable, automatic, and scalable learning
Composable
To quickly assemble your applications
Automatic
To automatically tune your models
Scalable
To efficiently train your large models
For machine learning in the real world
Examples
Scale across GPUs with Minimal Code Changes
A novel TensorFlow training engine for distributed deep learning
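As a minimal sketch of what "minimal code changes" means for multi-GPU training, the example below uses stock TensorFlow's tf.distribute.MirroredStrategy (a standard data-parallel API, not this engine's own interface, which the page does not detail): model construction moves inside a distribution scope, and the rest of the script stays unchanged.

```python
import tensorflow as tf

# Data-parallel training across all visible GPUs; variables are
# mirrored on each device and gradients are all-reduced each step.
strategy = tf.distribute.MirroredStrategy()
print("Replicas in sync:", strategy.num_replicas_in_sync)

with strategy.scope():
    # The only change from single-GPU code: build and compile the
    # model inside the strategy scope.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=["accuracy"],
    )

# The training loop is unchanged single-machine Keras code.
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
model.fit(x_train, y_train, batch_size=256, epochs=1)
```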
CASL Updates
Latest updates, publications, and news about CASL
In Submission
Pollux: Co-adaptive Cluster Scheduling for Goodput-Optimized Deep Learning

Introducing Texar-PyTorch:
An ML Library Integrating the Best of TensorFlow into PyTorch

EMNLP 2020
A Data-Centric Framework for Composable NLP Workflows

ASYML
Machine Learning as Machine Assembly

AAAI 2021
BANANAS: Bayesian Optimization with Neural Architectures for Neural Architecture Search

NeurIPS 2020
A Study on Encodings for Neural Architecture Search

JMLR 2020
Tuning Hyperparameters without Grad Students: Scalable and Robust Bayesian Optimisation with Dragonfly

AAAI 2020
Tutorial: Modularizing Natural Language Processing