Note: The views expressed here are the author’s own and do not reflect the views of Azolla Ventures or Prime Coalition.
Geothermal
I’m a fan of geothermal energy for a bunch of reasons. It’s a source of zero-carbon power and heat, it’s staggeringly abundant, it’s always on, and it uses oil and gas technology and talent. There are big drills and beautiful landscapes. Everything that nuclear is, but cooler.
In addition to decarbonization, I think geothermal could also be a useful tool for climate adaptation, via ground-source heat pumps whose coefficients of performance don’t vary with the weather outside. I could go on about the virtues of geothermal, and have done so elsewhere.
There are two big reasons geothermal isn’t everywhere yet: 1) exploration risk, and 2) resource development costs. For a while, I’ve felt strongly that we ought to chip away at these barriers by funding the development and deployment of new technologies. The societal payoff is massive, and if you craft a clever business model, the financial payoff can be too.
To that end, it’s been a big month for venture-backed geothermal startups.
Zanskar
(Full disclosure: Zanskar is a Prime Impact Fund portfolio company.)
Zanskar, an exploration company based in Salt Lake City, raised a $12MM Series A led by Union Square Ventures. Zanskar is tackling the exploration risk problem by bringing prospecting into the modern age. In the company’s words:
Geothermal’s upfront risk, and its even greater perceived risk, has led to it having amongst the highest soft costs and costs of capital of any power development. Soft costs are all of the costs other than the direct equipment costs, which can include transmission studies, engineering reports, financing costs, etc. And recall that the decline in solar and wind costs this past decade was two-fold: first a decline in the equipment costs, but the final victory was really a significant reduction in soft costs.
The inherent risk of exploration and the fact that there are no guaranteed outcomes is the key driver of soft costs in geothermal. It leads to:
failed wells which are expensive but contribute nothing to power generation;
high costs of capital, especially in the riskiest phases of exploration and development;
longer project timelines, which are especially penalizing given the higher costs of capital.
Because of the high perceived risk, geothermal developers tend to only be able to access capital at costs in the range of 10-20%, with smaller developers paying more. Compare this with the weighted average cost of capital (WACC) of 3-6% for utility-scale solar PV and onshore wind, depending on the region, and it becomes clear why geothermal has had a hard time competing. If geothermal development could be financed at 3-10% WACCs, it could generate electricity at significantly lower levelized costs of electricity (LCOE).
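The WACC numbers quoted above translate directly into electricity prices. Here's a minimal sketch of that effect, using a standard capital recovery factor and entirely illustrative plant numbers (the $4,000/kW capex, 90% capacity factor, and $0.01/kWh O&M are placeholder assumptions, not real project data):

```python
# Sketch: how cost of capital drives LCOE for a capital-intensive plant.
# All plant numbers are illustrative assumptions, not real project data.

def capital_recovery_factor(wacc: float, years: int) -> float:
    """Annual payment per dollar of upfront capital at a given discount rate."""
    return wacc * (1 + wacc) ** years / ((1 + wacc) ** years - 1)

def lcoe(capex_per_kw: float, wacc: float, years: int,
         capacity_factor: float, opex_per_kwh: float) -> float:
    """Levelized cost of electricity in $/kWh."""
    annual_capital = capex_per_kw * capital_recovery_factor(wacc, years)
    annual_kwh = 8760 * capacity_factor  # hours in a year x capacity factor
    return annual_capital / annual_kwh + opex_per_kwh

# Hypothetical geothermal plant: $4,000/kW, 30-year life, 90% CF, $0.01/kWh O&M.
for wacc in (0.05, 0.10, 0.15):
    print(f"WACC {wacc:.0%}: LCOE ${lcoe(4000, wacc, 30, 0.90, 0.01):.3f}/kWh")
```

Under these assumptions, moving from a 15% to a 5% cost of capital roughly halves the LCOE, which is the whole point of de-risking exploration.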
It’s hard to know what’s underground before you spend millions of dollars to do major drilling. This means there’s significant risk of failed wells, which increases both capital required (to drill more wells) and the cost of that capital (how risky an investment is it?). Zanskar uniquely knows how to de-risk geothermal resources before doing major drilling. Here’s how:
Leveraging big data and predictive modeling to discover new resources.
Leveraging stochastic geo-modeling and decision science to optimize the exploration workflow for de-risking known resources.
Leveraging advanced seismic characterization technologies to reduce dry-hole risk when drilling and to better understand reservoir characteristics prior to operations.
Notice how frequently the word “leverage” comes up. There’s a ton of value to be unlocked by adapting technology from elsewhere to this industry.
Fervo
(Full disclosure: I’m not an investor, but was involved in funding Fervo at ARPA-E.)
Fervo Energy, a geothermal energy developer, raised a $138MM Series C led by DCVC. Fervo is tackling the resource development cost problem. As they write:
Fervo has adapted innovations pioneered by the oil and gas industry, such as horizontal drilling and distributed fiber optic sensing, to make reservoirs of hot rock that exist beneath the earth’s surface into practical, economically viable, clean sources of energy. The new [$138MM Series C] funding helps Fervo complete power plants in both Nevada and Utah and evaluate new projects in California, Idaho, Oregon, Colorado, New Mexico, and internationally.
Adapt modern oil and gas well construction and reservoir operation techniques, and you can economically develop resources that incumbents have long ignored. Also note the wording "such as" in describing their technologies: Fervo doesn't have just one technology; they've crafted an integrated offering that not many in all of climate tech can match: 24/7 zero-carbon power.
Customers have taken notice, including East Bay Community Energy and Google. Here’s a graphic from Google showing why 24/7 clean power matters and how Fervo helps them achieve it.
Looking back and looking ahead
Zanskar and Fervo are both tech transfer stories: from oil and gas, from big tech, from telecommunications. There are also analogies to the shale revolution: perfect a new offering that unlocks a long-known but untapped resource, and you can revolutionize an industry.
But there’s still so much more to do. We need new drilling physics, new well and reservoir designs, new high-temperature electronics, new sources of capital to put holes in the ground, and most importantly, to 10x the flow of talent into the field by convincing 1% of the oil industry to make the switch. We don’t find new frontiers often, but in geothermal we have an old one ready to be reinvigorated.
In 2018-2019, I spent a few months getting to know the geothermal R&D community. A big part of that was going to conferences and listening – yes, for what people were talking about, but just as much for how they were talking about it. There was a nugget that stuck with me. After one presentation about an advance in drilling technology, an industry veteran raised his hand with a question – really, a comment – about how much he preferred 40-year-old technology to the new stuff. In that moment, a bunch of thoughts crystallized: Oh my god, we’re sitting on a major undeveloped resource,1 but A) the veterans aren’t willing or able to take big swings, B) the academics aren’t that interested in startups, and C) the oil and gas companies haven’t noticed yet.
It’s time for option D: disruption. Congratulations Zanskar and Fervo – I can’t wait to see what you all build and who follows your lead.
Life years
Climate change mitigation is a nice problem to work on. While it’s a hard thing to accomplish, it’s at least clear what we need to do: stop emitting greenhouse gases and draw down excess greenhouse gases already in the atmosphere. It’s relatively straightforward to quantify and more or less evenly distributed around the globe.2 This is convenient: as a funder of climate change mitigation technologies, I’m able to think in gigatons of CO₂-equivalent emissions. In the real world there are of course other factors to consider, but if we make that number go down, a lot follows.
Climate change adaptation is a less nice problem to work on. The effects of climate change are complex, regional, socio-political, and sometimes just weird. It’s often not clear what we need to do about them, and once we do figure out what to do, it's harder to measure the impacts. It doesn’t boil down to a magic number.
If we define climate adaptation in our usual terms:
Continuing the flourishing of the species as the planet changes around us.
Then as we try to quantify the impact of climate adaptation technologies, we might lean on existing measures of human flourishing: GDP per capita (going up?), economic losses (low?), population (growing?), life expectancy (going up?), happiness (improving?), Gini coefficient (going down?), and so on. These seem logical, and to varying degrees, we have ways to measure the effectiveness of interventions on them. But as indicators, they all strike me as a bit… macro, and lagging, for our purposes.
So I’m interested in how we might measure the effectiveness of climate adaptation interventions in a way that’s granular and fast, and doesn't mask data at the extremes.3
There are solution-specific metrics that are worthwhile to track – things like wildfires contained, floods correctly predicted, buildings not destroyed, all versus appropriate counterfactuals. But counterfactuals can be tricky, and every time you switch categories, you’ll need to deal with (or come up with) new metrics. It’s hard to compare across categories.
So how do we move forward from here? Well, let’s look for shortcuts. What other industry deals in the quantification of human flourishing?
That’s right, it’s everybody’s favorite: healthcare.
The healthcare industry deals with an incredibly broad range of assaults on the human body, but then has to prioritize among treatments based on 1) quantity of life, 2) quality of life, and 3) economics.
Enter QALYs:
The quality-adjusted life-year (QALY) is a measure of the value of health outcomes. Since health is a function of length of life and quality of life, the QALY was developed as an attempt to combine the value of these attributes into a single index number. The QALY calculation is simple: the change in utility value induced by the treatment is multiplied by the duration of the treatment effect to provide the number of QALYs gained.
and DALYs:
The disability-adjusted life year is a societal measure of the disease or disability burden in populations. DALYs are calculated by combining measures of life expectancy as well as the adjusted quality of life during a burdensome disease or disability for a population.
These are fairly well-developed metrics in the medical and public health arenas. Graphically, here’s what it looks like when an intervention increases QALYs:
Obviously, describing quality of life with a number is a fraught thing to do. Look at this table of “disability weights” (how severe a disability is in calculating DALYs), consider how much some of them changed in just six years, and tell me this is anything other than doctors stumbling around in the dark:
But they’re a start, a way to compare apples to oranges, and I think that’s the point. Better to have an imperfect system than no system at all.
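To make the QALY formula quoted above concrete, here's a toy numeric version. The utility values (0.6 before, 0.8 after) and the 10-year duration are made-up inputs for illustration only:

```python
def qaly_gain(utility_change: float, duration_years: float) -> float:
    """QALYs gained = change in utility (0 = dead, 1 = perfect health) x duration
    of the treatment effect, per the standard QALY calculation."""
    return utility_change * duration_years

# Hypothetical: an intervention lifts quality of life from 0.6 to 0.8 for 10 years.
gained = qaly_gain(0.8 - 0.6, 10)  # ≈ 2.0 QALYs
```

The fragility the disability-weight table exposes lives entirely in those utility inputs; the arithmetic itself is trivial.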
QALYs, DALYs, and climate
As we evaluate many possible forms of climate adaptation, we’ll need to compare apples to oranges. But is anyone working on applying QALYs and DALYs to climate? Upon a brief survey, it seems like we’re early on this journey. In particular, the Effective Altruism community seems well positioned to weigh in on adaptation, but I don’t see them talking about it yet.4
Nonetheless, it doesn’t seem crazy to me that we could arrive at something like this:
We could of course take this further and get into dollars per QALY, as healthcare does. On the “supply” side, $/QALY varies wildly by treatment. On the “demand” side, it quickly gets philosophical, controversial, and regional, but consensus in the US tends to be on the order of $50,000/QALY.5
What could $/QALY look like in climate adaptation? Here are a few very rough ideas:
Grid stability: Power outages kill people. I believe this is likely to happen more frequently as climate change continues and grids experience greater stress. The 2021 Texas outage killed somewhere between about 250 and 900 people, to say nothing of the millions whose quality of life was reduced for a few days. How much would it have cost to avoid this blackout and save 250-900 lives? I don’t know, but if we take a conservative view and only count the deaths, and then assume a willingness to pay of $50,000/QALY, this gives us a range of $12.5-45MM to work with.
Heat: Extreme heat killed about 6,300 people in 2020. We also know how to quantify the impacts of temperature on productivity. We have a mature technological solution for it called air conditioning. How can we quantify the $/QALY of electrifying and air-conditioning the developing world?
Crop disease: Climate-related crop disease is projected to be an increasingly large issue. Digging through a PhD thesis from the University of Queensland, it’s possible to model historical crop disease rates amid a changing climate and express them in terms of QALYs lost. Similarly, solutions could be expressed in $/QALY.
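The grid-stability arithmetic above can be spelled out in a few lines. Note the deliberately conservative assumption of one QALY lost per death (in reality each death forfeits many life-years, so this understates the value of prevention):

```python
# Back-of-envelope for the 2021 Texas outage figures above.
# Conservative assumption: one QALY lost per death, valued at $50,000/QALY.
deaths_low, deaths_high = 250, 900
value_per_qaly = 50_000

willingness_low = deaths_low * value_per_qaly    # $12,500,000
willingness_high = deaths_high * value_per_qaly  # $45,000,000
print(f"Justifiable spend: ${willingness_low:,} to ${willingness_high:,}")
```

Counting lost life-years per death rather than one QALY apiece would multiply that range considerably.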
What I’m driving toward is this: climate adaptation is going to grow into a mature field of study, with outposts in many industries. In order to get there, we’ll need to know what we’re up against, what options we have, and how cost-effective those options will be. In the same way that we have a marginal abatement cost curve for emissions reduction, we should be interested in building one for climate adaptation.
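An adaptation cost curve of the kind described above is, mechanically, just interventions ranked by cost per QALY, analogous to a marginal abatement cost curve's $/ton ordering. A minimal sketch, where every intervention name, cost, and QALY figure is a made-up placeholder:

```python
# Sketch of an adaptation "cost curve": rank hypothetical interventions
# by $/QALY, analogous to a marginal abatement cost curve.
# All entries below are invented placeholders, not real estimates.
interventions = [
    {"name": "grid hardening",     "cost_usd": 30e6, "qalys": 500},
    {"name": "cooling centers",    "cost_usd": 5e6,  "qalys": 300},
    {"name": "crop surveillance",  "cost_usd": 8e6,  "qalys": 100},
]

# Cheapest QALYs first, just like cheapest abatement first on a MAC curve.
ranked = sorted(interventions, key=lambda i: i["cost_usd"] / i["qalys"])
for item in ranked:
    print(f'{item["name"]}: ${item["cost_usd"] / item["qalys"]:,.0f}/QALY')
```

The hard part, of course, is not the sort; it's producing defensible cost and QALY estimates to feed into it.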
Is anyone working on this? Please reach out, I’d love to chat.
Elsewhere
Thanks for reading!
Please share your thoughts and let me know where I mess up.
1. And I do mean literally sitting on the resource. That’s true everywhere, but especially true where we were sitting, on top of the San Andreas Fault.
2. If you want your mind blown by global CO₂ movement, watch this video.
3. Climate change is felt in extremes more than averages, which is why I think expressing the extent of global warming in global average temperature increase is kind of weak. A few degrees warmer everywhere? Whatever. Melting roads in England? Oh shit.
4. Please shoot me a note if the EA community is on top of adaptation and I’m just not seeing it.
5. If this quantification of the value of human life makes your skin crawl, I’m not going to tell you you’re wrong. But it is happening whether we like it or not.