Containers in the Age of AI: A Chat With New Docker President Mark Cavage

When a new CEO comes to a company, they usually bring their own posse.
So, perhaps it was not a surprise that in March, Docker named Mark Cavage as its new president. He arrived a month after Docker picked its new CEO, Don Johnson, who helped create Oracle Cloud Infrastructure with the help of Cavage, his colleague at Oracle. Johnson took the place of longtime CEO Scott Johnston.
Now he is helping Johnson run Docker, which, after selling the enterprise-focused side of its business to Mirantis in 2019, is focused on containerization tools strictly for the developer, namely Docker Desktop and Docker Hub.
We recently caught up with Cavage to get a better understanding of how he and Johnson will be serving the developer community in the years to come. In this interview, Cavage stressed Docker’s position as the de facto container company, with 20 million registered developers and billions of pulls in Docker Hub. He is focusing on three areas, he told us: developer productivity, security and AI integration, mentioning new tools like the MCP Catalog and Docker Model Runner. We also talked more about AI, including why Cavage doesn’t see AI as a huge disruptor to Docker, and how the containerized approach helps facilitate faster AI deployments.
This interview was edited for clarity and brevity.
You have a heavy operations background. How does that fit in with your role here?
I’ve been working with Docker for at least 10 years now. I started my career as an engineer, moved into management and operations roles, and have run things across engineering, product, sales, and marketing.
Why come here? When I was at Joyent [2011-14], we were effectively building container clouds, containers in the cloud, before Docker. Then Solomon [Hykes] and Docker came out with containers, and that effectively changed the game for everybody.
When I was at Oracle, I was working with the Docker team. Oracle was the first company to go put enterprise software in the Docker Hub and Docker catalog. I think it brought a lot of early validity to Docker. And so really, I’ve just watched the company on and off over the last 10 years, and it’s been near and dear to my heart.
In 2025, Docker is the de facto container company, with 20 million registered developers and billions of pulls in Docker Hub.
What are the duties of the president?
Well, it’s a pretty wide-spanning role. Don and I are partners in running the entirety of the company in terms of the org and responsibilities. It is largely looking across marketing, customer success, and, effectively, what it takes to operationalize and drive the business, drive customers and make them successful.
What are the hardest issues that developers are facing today, and what is Docker doing to help?
There are three things I’ll talk about: developer productivity, security and AI.
In developer productivity, there are the developers doing every single thing they can with the most cutting-edge container best practices. They’ve automated everything, and they’ve done everything it takes to make their product cloud-native, container-native, low cost, super high scale, etc.
That’s a small percentage of where the world is at. Our bread and butter is the full spectrum of developers, both new developers entering the market — in indie shops or inside of corporations — and developers who are struggling with brownfield applications they’re trying to get to the cloud.
To Docker, this is our bread and butter; this is what we do and have been doing. But I think we have a lot more to go do.
I’m really excited about some of the work we’re doing around Docker Build and Testcontainers Cloud, which helps developers go faster, takes the muck and the friction away, and really helps them modernize their apps.
In security, we’re making a lot of investments in helping developers manage their environment, so they don’t have to deal with security vulnerabilities and CVEs. We just take that pain away and help them be productive and ship securely and safely.
And then AI is what we’re most excited about. We just launched the Docker Model Runner. We’re fervent believers the world is going to be composed of gardens of LLMs. And I think developers need choices, and they need access to those things on their laptops. They need access to those things in production, whether it’s for privacy, cost, latency, or whatever the reason is. And so bringing all the Docker tooling and all the Docker friendliness and all the Docker things that developers love to the ability to go work with LLMs and, in the future, agentic apps is an amazing opportunity.
There must be a way for developers to experiment with AI, without going through large cloud providers…
Docker’s secret sauce has always been how we make it accessible and make it easy; we just take the pain away and make it fun.
We have over 20 million developers who love us. I got hugged in the elevator yesterday, just for being the new guy. And I think we owe it to them, actually, to give them the ability to get started in AI development, no matter where they are in their career, and help them ship agentic apps and LLMs as quickly as possible.
There’s a place for the OpenAI and Anthropic and [Google] Gemini and all the big frontier models. And then there’s a lot of use cases where — whether it’s privacy or just cost or simplicity — you just want to run that locally. And so we think Docker has the responsibility to give the developers the path from taking that from their laptops to production.
So you are working with the model providers?
We’re working with model providers like Mistral, Llama and Gemma. Those are all packaged up as OCI standard containers in Docker Hub. The way you work with models is the way you work with any other container. And it’s just, I think, a lot simpler and a lot more accessible for the developers who have previously been excited about this but scared to get into it, because it’s hard for them to interact with and hard for them to enter the ecosystem. We’re bringing all this just naturally to the ecosystem.
Just from a cost point of view, being able to run a local LLM in a Docker container easily, and effectively for free, takes away all the burden. Your laptop will get screaming hot, but it will run for free, without you paying for tokens. That is the definition of reducing the barrier and making it accessible to developers.
So you can work with the LLM via Docker. You can write your app via Docker. You can do all of that on your local machine. And when you’re ready to go to prod, you use the same Docker tooling and the same workflows.
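That laptop-to-prod continuity comes from Docker Model Runner exposing an OpenAI-compatible HTTP API. As a rough sketch of what that means for application code (the endpoint URL and model name here are illustrative assumptions, not documented specifics), the request body an app builds for a local model is the same shape it would send a hosted one:

```python
import json

# Hedged sketch: Docker Model Runner serves an OpenAI-compatible API, so the
# chat-completion request shape is identical locally and in production. The
# host/port and model name below are illustrative assumptions.
LOCAL_ENDPOINT = "http://localhost:12434/engines/v1/chat/completions"

def chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Only the endpoint URL changes when the app moves from laptop to prod.
body = chat_request("ai/smollm2", "Explain a container image in one sentence.")
print(json.dumps(body, indent=2))
```

In this sketch, swapping `LOCAL_ENDPOINT` for a production endpoint is the entire "same tooling, same workflows" move Cavage describes.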
Docker also released its MCP Catalog and Toolkit…
MCP is currently the most frothy topic on Reddit and Hacker News. And about 30% of the comments are, “I still don’t understand what MCP is.”
We’ve been working with Anthropic since early on. I think MCPs democratize how people make their LLM and their AI applications talk to their private data sources or their other services.
Now, when you look at all these MCP servers, and how your agentic app or how your LLM wants to talk to them, fundamentally they look like APIs, and they’re often talking to paywalled services, whether it’s like a Stripe or a Twilio or anything else, anything commercial, and they often have logic and filtering in them. And when you put that all together, you basically have versioning and packaging problems, and you have security and authentication problems.
Well, that sounds an awful lot like what Docker does. And so it’s a very natural thing for us to bring containerization to MCP. A lot of people are already doing this today, but this is the official, Docker-stamped way to do MCP in containers, making that available on the Hub. Docker Hub is just a huge distribution network for open source developers to get their software out to everybody else, and for those things to be versioned and managed the way anything else is. And then, yeah, we’re going to solve the authentication problem.
So much of this is about your LLM talking to your MCP tool, which in turn needs to talk to something else that has these credentials. Having that all be naturally part of the Docker flow is something we’re really excited about.
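The “they look like APIs” point has a concrete shape: MCP messages are JSON-RPC 2.0, and an agent invokes a server’s tool with a `tools/call` request. A minimal sketch of that envelope follows; the tool name and arguments are hypothetical, standing in for a paywalled service like Stripe or Twilio:

```python
import json

# MCP requests are JSON-RPC 2.0 messages; "tools/call" is the method an agent
# uses to invoke a tool on an MCP server. The tool name and arguments below
# are hypothetical examples, not any real server's API.
def mcp_tool_call(request_id: int, tool: str, arguments: dict) -> dict:
    """Build a JSON-RPC 2.0 envelope for an MCP tools/call request."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }

msg = mcp_tool_call(1, "create_invoice", {"customer_id": "cus_123", "amount": 4200})
print(json.dumps(msg))
```

Because each server is ultimately a process answering messages like this, packaging it as a versioned container image (and handling the credentials it needs) is the part that maps naturally onto Docker’s existing tooling.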
Thanks to AI, there are a lot of counterfeit images, since malicious images can now be produced a lot more rapidly than before. Is this something that Docker Hub is gearing up to guard against?
Docker already does a lot of this today. Docker official images constitute the vast majority of deployments out there, because people trust us, and so they know that we already go through a vetting process, we go through a quality assurance process, and they understand that if I get an upstream open source package that’s popular from a Docker official image, I can probably trust it, and I can work with that, and I understand it.
And then you take that all the way back down to the desktop, where we have things like Docker Scout, which lets you basically shift all the security scanning and all the policy enforcement left, down to the developer where they’re working.
A lot of the problems we find developers having now ultimately come down to managing vulnerabilities and policy enforcement around some of the details inside those images, and keeping up with it all. That’s where we’re making a lot of investments now, to make that problem Docker-simple, Docker-easy and accessible to the masses, so being secure is within reach.
I’ve been hearing a lot about “brownfield” applications lately. There’s still a lot of migration going on from the mainframe or from legacy apps in general. Are you finding that this is still a big task for your developers?
Absolutely. Just in the last couple of days, I’ve heard about folks trying to standardize [Oracle] VirtualBox migrations onto Docker as a compelling target to make the developer experience a lot better. When I think about that, I’m like, “Oh, that’s some long-ago technology.”
There’s a lot of software written in the 90s, the 2000s and the 2010s that is monolithic and often still on legacy environments. Sometimes they’re on Windows, sometimes they’re Unix ported to Linux. And I think a lot of those shops went through this migration of a lift-and-shift to get from on-prem to cloud, but then they didn’t really modernize their development practice and really bring velocity to iterate on those brownfield apps, which often constitute a pretty big part of revenue for a company.
And so we hear about this all the time. With Docker, you have an easy place to write your first app, and it’s also how we ultimately help developers get more productive inside the enterprises they’re working in with those legacy apps. This is bread and butter for what we do every day.
Docker has always been really strong with documentation. How will you bring this into the AI era?
It’s something we’re thinking a lot about. We’ve got an agentic AI interface on the Docker site where you can ask your questions, and it’s very helpful. We have an application agent called Gordon, where you can interact and it will help you with your Docker Compose files. It’ll help you with all the little image bits on the file system. So we’re already doing those things, but, yeah, this question comes up a lot. We’re thinking a lot about what’s going to happen when more web searches are done by agents, as opposed to humans. We don’t have all the answers yet, but we’ve got things in place already.
How can Docker help small businesses get up to speed with AI apps?
When you think about what it takes to build an AI or agentic app, yes, you have this new superpower that did not exist even two or three years ago: it can take unstructured text and just give you answers and generated content. It’s amazing. But then all the stuff around it still looks, smells and feels like the normal part of an application. You still have data storage problems. You have traditional questions: What do I do with application state? What do I do with my application logic?
Almost everything requires some level of back and forth from the agent to the LLM. You have this new way of working with a control flow. But fundamentally, your application is still code that you’re writing. And so when you take all that stuff together, it looks, smells and feels a lot like every other application. And so we think containers are not such a radical departure. It’s natural that Docker and containers are just how you continue to build AI apps.
I don’t think we see it as something where we need to change Docker completely to make it work with AI apps. I think it’s more a case of making it super easy and bringing these different patterns together into a cohesive experience.