Technology is a huge contributor to carbon emissions, and the problem is only going to get worse as we run more and more demanding workloads, such as AI.

By understanding your IT's actual environmental impact, you can make changes to lessen that impact and, at the same time, lower costs.

In the most recent episode of our podcast, Get With IT, Kai Wombacher, product manager at Kubecost, talked to us about how cloud computing impacts carbon emissions. 

Here is an edited and abridged version of that conversation:

How is technology itself affecting the carbon footprint? 

Nowadays, everybody is running their applications in the cloud, and it's become really, really easy to get access to the infrastructure, the CPU, the memory, whatever you need to run your application or host your website, and do whatever you need to do. So that, in a way, has been amazing. It's really democratized access, and it's led to this explosion of technology everywhere in our lives. But the downside, of course, is that there are these massive data centers all over the world that take tremendous amounts of power to run those applications, run those jobs on the computers, and also to cool those machines. And I think a lot of times, people overlook or don't consider the environmental impact that these applications or this technology is going to have.

So how did you get involved in this? Why did you start to measure this? How do you measure it, and what can we do about it?

Kubecost and OpenCost (OpenCost is really the open source version of Kubecost) have been focused on making cost visibility and cost monitoring much, much easier for teams in the Kubernetes and cloud computing space. One of the things we've seen is that a lot of teams not only weren't considering the dollar costs, but also aren't considering the environmental costs of their workloads. So we really had this idea and partnered with an open source project from Thoughtworks, and we've been able to give teams, just at a glance, "hey, here is the carbon impact of this workload or this application." We make it easier to understand what a workload is costing, not only in dollars and cents, but also what it's costing the earth.

As I go to different conferences like KubeCon or FinOps, and we demonstrate our Kubernetes or cloud optimization functionality, it's been amazing to see that developers are oftentimes far more interested in, and more receptive to, optimizing their applications for environmental impact than for the dollars-and-cents impact. So it's been a really impactful feature, I think, for our teams and users.

So tell me a little bit about OpenCost. When was that developed? How was that developed? Were you guys behind the project?

So we were the creators and primary maintainers of OpenCost. I want to say it's about four-plus years old now, and it's free and open source to the community. It's really easy to get started: it's just a simple copy-and-paste of the Helm command, and you are up and running with total visibility into all of your Kubernetes workloads, all the way down to the container level. And not only do you get insights into how much those are costing you in dollars, but also the carbon footprint, and you can also pull in the cost of workloads not associated with Kubernetes, if you have cloud costs or external costs coming in from tools like Datadog. It's really helping teams get all of their IT costs together in one place.
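For readers who want to try this themselves, OpenCost ships a Helm chart and, once running in a cluster, exposes a cost-allocation API. The sketch below is a minimal, unofficial example of pulling per-namespace costs from a locally port-forwarded instance; the port, deployment name, endpoint path, parameters, and response fields are assumptions drawn from the public OpenCost documentation and may differ by version, so verify them against the docs for your deployment.

```python
"""Minimal sketch (not an official Kubecost/OpenCost example): query per-namespace
cost allocations from a locally port-forwarded OpenCost instance.

Assumed setup, to verify against the OpenCost docs for your version:
  - OpenCost installed via its Helm chart into the `opencost` namespace
  - API port-forwarded locally, e.g.:
      kubectl -n opencost port-forward deployment/opencost 9003
  - allocation endpoint served at /allocation/compute on port 9003
"""
import requests

OPENCOST_URL = "http://localhost:9003"  # assumes the port-forward above


def namespace_allocations(window: str = "1d") -> dict:
    """Fetch cost allocations for the given window, aggregated by namespace."""
    resp = requests.get(
        f"{OPENCOST_URL}/allocation/compute",  # path is an assumption; check your version's docs
        params={"window": window, "aggregate": "namespace"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    result = namespace_allocations()
    # Field names below ("data", "totalCost") are assumptions from the public docs.
    for allocation_set in result.get("data", []):
        for name, alloc in (allocation_set or {}).items():
            print(f"{name}: ${alloc.get('totalCost', 0.0):.2f}")
```

The same API should also cover the container-level and external-cost views described above; per the docs, drilling down to pods or containers is just a different value for the aggregate parameter.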

When you hear the word Kubecost, you think immediately, well, what is the cost of Kubernetes in this climate problem? But actually, because of what you just said about how you can monitor it right down to the container level, Kubecost might actually be mitigating some of those problems.

I think that kind of visibility down to the container level helps you really understand your costs or environmental impact however you like, whether you want to see it for an individual team, an individual person, or an individual application. I think that's where the power really comes from, because nobody structures their infrastructure the same way.

On a macro level, you know, we have the Paris climate accords, and nearly every nation in the world has promised to reduce its carbon emissions by a certain percentage and what have you. And yet, it's business that has kind of opposed it a little bit, saying it's too restrictive, it's going to put too many requirements on us, it's going to cost us money, and so on. So how are you able to try to convince the people in the technology sector that this is important for the Earth, and how is that message getting across?

That's a great question. The nice thing about our space is that the relationship between saving money and saving carbon is pretty linear. So for us, it's like, hey, if you reduce the amount of memory or CPU that a given container or application is requesting, and you're able to do that on a systematic, ongoing basis, then you not only reduce the money you're paying for that application to run, but you also reduce the carbon footprint of that application. So for us, it's kind of the best of both worlds, where you can tell the developer who's maybe passionate about environmental justice, "hey, you really need to go in and reduce these container requests," and you also have the business justification of, "hey, you're going to save 10%, 15%, maybe even 50% or 60%, just by going in and right-sizing your container requests."
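To make that linearity concrete, here is a back-of-the-envelope sketch. Every unit rate in it is a made-up placeholder for illustration, not a Kubecost, OpenCost, or cloud-provider figure; the point is simply that cost and estimated emissions both scale with the CPU and memory a container requests, so right-sizing cuts both in the same proportion.

```python
"""Illustrative arithmetic only: every unit rate below is a made-up placeholder,
not a Kubecost or cloud-provider figure. It shows the linear relationship
described above -- cut a container's CPU/memory requests and both the bill and
the estimated CO2e shrink in the same proportion."""

HOURS_PER_MONTH = 730

# Hypothetical unit rates (swap in your provider's pricing and a real
# grid-carbon-intensity model).
PRICE_PER_VCPU_HOUR = 0.03      # USD per vCPU-hour (assumed)
PRICE_PER_GB_HOUR = 0.004       # USD per GB-hour of memory (assumed)
KG_CO2E_PER_VCPU_HOUR = 0.0006  # kg CO2e per vCPU-hour (assumed)
KG_CO2E_PER_GB_HOUR = 0.00008   # kg CO2e per GB-hour (assumed)


def monthly_footprint(cpu_request: float, mem_gb_request: float) -> tuple[float, float]:
    """Return (USD, kg CO2e) for one month at the given resource requests."""
    dollars = HOURS_PER_MONTH * (cpu_request * PRICE_PER_VCPU_HOUR
                                 + mem_gb_request * PRICE_PER_GB_HOUR)
    co2e = HOURS_PER_MONTH * (cpu_request * KG_CO2E_PER_VCPU_HOUR
                              + mem_gb_request * KG_CO2E_PER_GB_HOUR)
    return dollars, co2e


# Right-sizing example: a container requesting 4 vCPU / 8 GB that really needs 2 vCPU / 4 GB.
before_usd, before_co2e = monthly_footprint(4, 8)
after_usd, after_co2e = monthly_footprint(2, 4)
print(f"before: ${before_usd:.2f}/month, {before_co2e:.2f} kg CO2e")
print(f"after:  ${after_usd:.2f}/month, {after_co2e:.2f} kg CO2e")
print(f"saved:  {100 * (1 - after_usd / before_usd):.0f}% cost, "
      f"{100 * (1 - after_co2e / before_co2e):.0f}% CO2e")
```

With these placeholder rates, halving the requests halves both the monthly bill and the estimated emissions, which is the "best of both worlds" argument in miniature.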