Wallaroo delivers push-button productionization of ML models today, but that wasn’t always the case. Before our pivot, our small team of engineers set out to build a next-generation stream-processing framework.
The Wallaroo team was clear that building another distributed computing framework in Java was not the answer. We built an initial prototype in Python to test some ideas, then evaluated C++, Pony, and Rust. Pony turned out to be a great fit for our initial goals: it let us solve hard problems quickly without building a concurrent runtime from scratch, and it enabled us to meet ambitious performance targets.
Meanwhile, data science and machine learning continued to explode, and we found that customer interest in deploying machine learning models far outstripped demand for conventional stream-processing workloads. By the time we shifted our focus to MLOps, the situation had changed: Rust had come a long way in a short period of time, and we now needed the more robust ecosystem of existing libraries that Rust could offer.
Rust offered several advantages:

- Rust provides most of the same benefits as Pony: it is designed for very high-performance systems programming, guarantees memory and data-race safety at compile time, can call into C/C++ libraries, and compiles to native code.
- Rust has a robust ecosystem of libraries, giving us ready-made support for a wide variety of integrations.
- There is a much larger pool of engineers who are eager to learn and work in Rust, or who already have significant Rust experience.
- There are more resources available for learning Rust, and more opportunities to participate in conferences and community events.
Our new Rust-based platform recently handled millions of inferences per second against a sophisticated ad-targeting model, with a median latency of 5 milliseconds and a yearly infrastructure cost of $130K. From a customer perspective, that’s a resounding success.