Pavel Nesterov’s Post

I like that OpenAI reminds us about this waste. I'm personally not exactly "honored for passing 1 trillion tokens". And btw, we have already burned several trillion, but there is no "award" for a bigger waste. If you want to help us reduce our dependence on external LLM vendors and move more of the load to in-house models, send your CV to pavel.nesterov@revolut.com. We are hiring Software/Infrastructure Engineers to work on our serving platform, and ML Engineers with experience in training multimodal models.

Hit us up at Hugging Face. We have open models that you can deploy on your cloud, our endpoints, inference providers, or even bare metal. hf.co

TLDR: Before reaching for ML/probability-based solutions again, consider deterministic workflows. A lot of your LLM activity can likely be replaced with sophisticated deterministic workflows. I've recently launched an open-source tool for that. I've sent you a connection request.

Is it possible to work remotely from the EU, specifically from the Czech Republic? By the way, I've run models locally.

This is like complaining about using too much electricity but insisting on keeping every light burning in your home 24/7. Calling out OpenAI for sharing a small token of gratitude for your custom isn't a good look in my humble opinion.

Congrats on the trillion-token non-award - how about an award for the fewest tokens burned? 😅 We build a compiler-based runtime (Inceptron.io) that auto-compiles your in-house models so they run lean.

Pavel Nesterov, can we have this printed on a t-shirt, please?

Why is it a waste? If it is a waste, it is the fault of the consumer, not the service provider. They assume you have put that one trillion tokens to good use!

Would you consider "outsourcing" part of this push? At *tabularis.ai*, we're building efficient in-house AI models tailored exactly for these kinds of workloads. Happy to chat about it :)

A very refreshing view of this award.
