The unexpected role of software in catalyzing climate solutions

Monday, June 5, 2023

Software has a leveraged role to play in fighting climate change, but the problems worth solving are hidden from view for most software engineers. To reverse climate change, we have to build entire new industries and transform existing ones that are opaque to most of Silicon Valley. So rather than having modern AI wizards brainstorm in a vacuum, we wanted to put them in direct contact with the daily challenges of the climate tech companies at the frontier.

The story behind this list

Before starting Streamline, we spent months speaking with climate tech founders, operators, permitting officers, electricians, the DOE, activists, ESG investors, carbon traders, etc. 

Why spend all this time? It felt imperative to build a systemic perspective on climate change. The more we learned, however, the more we realized how much we didn't know. The risk here is to keep exploring indefinitely without ever acting.

But how will I know I’m working on the highest-leverage problem?

Reading Speed & Scale unlocked a new perspective: to solve the climate crisis, we need to do it all. Work on the problems you care most about, the ones that give you the dopamine-inducing feedback loops you need, and help attract others to join. (Plus, you learn more by doing than by learning through others.)

"Solve problems you’ve personally faced" is the classic advice aspiring founders receive. Equipped with domain knowledge and the conviction to make your life better, it’s easy to stay motivated and increase your odds of success. When it comes to climate at the speed and scale that’s necessary, we need to accelerate every human workflow, from energy auditors to heat pump installers.

To bridge this knowledge gap, we pulled together dozens of climate companies and over 100 builders for a 24-hour hackathon.

Climate companies presented “prompts” that detailed internal challenges they had, and stuck around to mentor AI experts and software engineers on how to best tackle them.

This kicked off SF Tech Week, and was an effort made possible by the collaboration of OpenAI, Lowercarbon Capital, Cerebral Valley, and our team at Streamline.

If history is any guide, we should be watching the winners. To continue the conversation and plant seeds for aspiring climate startup founders, we’re sharing the prompts publicly!

The Prompts

Life Cycle Analysis automation ♼ 

To address the urgent climate crisis, we need to both reduce emissions and remove CO2 from the atmosphere: even if we stopped all emissions today, we would still need to draw an estimated 10 Gt of CO2 out of the atmosphere per year. Various solutions exist, such as regenerative agriculture, direct air capture, and enhanced weathering, but each is energy intensive and requires supporting infrastructure like renewable energy generation, storage facilities, and transportation systems. We need to ensure they are end-to-end carbon negative. The measurement for this is called a Life Cycle Analysis (LCA).

The problem is that these LCAs are expensive, adding an extra $100k in cost to every carbon removal project.

Scope 1 emissions directly stem from the source and should be negative and verifiable for successful CO2 removal. This falls under the Measurement, Reporting, and Verification (MRV) scope and is a key input for the LCA.

Scope 2 accounts for carbon emitted from energy inputs, including grid energy, natural gas, and renewable sources. The carbon intensity of energy varies based on location, region, and technology. Additionally, renewable energy credits (RECs) complicate matters by offsetting dirty fuel with clean sources through credit purchases. The quality and impact of these credits also vary based on source and accounting method.

Scope 3 emissions consider the value and supply chain, including embodied emissions influenced by procurement, regionality, regulations, and technologies used. Scope 3 must account for all upstream and downstream activities, such as R&D, waste management, and employee travel.

Scope 2 and 3 emissions encompass all operational and capital inputs to a deployment and must be completely offset and surpassed by negative Scope 1 emissions from direct air capture (DAC) to make a significant impact.
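
To make that accounting concrete, here is a minimal sketch of the relationship described above: a deployment only counts as net-negative if its direct removals (Scope 1) outweigh its Scope 2 and 3 emissions. The class, field names, and numbers are illustrative, not a real LCA model.

```python
from dataclasses import dataclass

@dataclass
class DeploymentLCA:
    """Illustrative carbon accounting for one removal deployment, in tCO2e."""
    scope1: float  # direct emissions at the source; negative means CO2 removed
    scope2: float  # emissions from purchased energy (grid mix, RECs, etc.)
    scope3: float  # embodied / value-chain emissions (procurement, travel, waste)

    def net(self) -> float:
        # The deployment is only carbon-negative overall if Scope 1 removals
        # more than offset Scope 2 and Scope 3 combined.
        return self.scope1 + self.scope2 + self.scope3

# Illustrative numbers: a DAC plant that captures 1,000 tCO2 but spends
# 300 tCO2e on energy and 150 tCO2e on embodied emissions.
dac = DeploymentLCA(scope1=-1000.0, scope2=300.0, scope3=150.0)
print(dac.net())  # -550.0 -> net removal of 550 tCO2e
```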

Producing LCAs requires extensive research, data synthesis, and accurate modeling to predict the total carbon impact and its uncertainty. We need a way to make LCAs faster, easier, and cheaper.

Example Customers: Heirloom, Vesta, Lithos Carbon, Noya, Running Tide

Resources


Sourcing Electrical Power for CO2 Removal 🧾

As noted in the previous prompt, to power Carbon Dioxide removal, we need access to substantial amounts of electrical power. It’s currently incredibly challenging to determine optimal locations for renewable energy systems in a way that minimizes construction timelines and electricity costs. 

The problem is navigating the complex landscape of government, NGO, and public utility regulations, guidelines, and utility tariffs.

At the hackathon, Heirloom asked specifically for Utility Tariff PDF parsing, which is a crucial aspect of this endeavor. A utility tariff is a price structure or an electricity rate offered by a local utility and approved by the state's Public Utility Commission that allows eligible customers to source up to 100% of their electricity from renewable resources. They vary at the state, regional, district, and municipal levels.

To evaluate a deployment strategy effectively, we need information on permitting timelines, special requirements, power costs, and ideally, previous deployments in the same area.

One proposed solution is an AI tool capable of locating, assimilating, and comprehending the most suitable utility strategies, requirements, and key stakeholders for a specific region and its infrastructure; such a tool would greatly facilitate the deployment and scaling of DAC and related technologies.
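
As one hedged sketch of what such a tool could look like (not Heirloom's actual pipeline), the snippet below extracts the text of a tariff PDF and asks an LLM to pull out a few fields a siting team might care about. The field list, prompt, and file name are assumptions.

```python
from pypdf import PdfReader
from openai import OpenAI

FIELDS = ("utility name, rate/tariff name, eligibility, energy charge ($/kWh), "
          "demand charge ($/kW), renewable content (%)")

def parse_tariff(pdf_path: str) -> str:
    # Pull raw text out of the tariff PDF (layout quality varies a lot).
    text = "\n".join(page.extract_text() or "" for page in PdfReader(pdf_path).pages)
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": "You extract structured data from utility tariff documents."},
            {"role": "user", "content": f"Return {FIELDS} as JSON for this tariff:\n\n{text[:12000]}"},
        ],
    )
    return response.choices[0].message.content

# print(parse_tariff("example_green_tariff.pdf"))  # hypothetical file name
```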

Example Customers: Heirloom, Noya, Eion, and every other Carbon Dioxide Removal startup that has a large energy footprint.

Resources


Land Use Planning 🛰 

To deploy solutions on public land, you need to go through permitting. To plan projects, we need an understanding of permitting rules and regulations across the board. 

The problem is that rules vary between federal and state land, and even between local city governments, and we lack centralized databases of permitting requirements. This makes it difficult to build modeling or planning software for any sort of infrastructure solution.

Today, planning infrastructure projects costs significant time and money, and deciphering how best to use land under different regulatory regimes is a major bottleneck in our ability to quickly deploy new technologies and develop the resources we need in this country.

One place to start is Land Use Restrictions, the first major regulatory hurdle clean energy and minerals companies have to reference before pursuing onshore development opportunities. These documents are dense and site-specific, and they contain critical information that may restrict or enable a project’s success.
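
As a small illustration of what a centralized permitting database might store, here is a sketch of a possible schema; the jurisdictions, fields, and values are hypothetical examples, not real requirements.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class PermitRequirement:
    jurisdiction: str                      # e.g. a BLM district or a county (hypothetical)
    land_type: str                         # "federal" | "state" | "municipal"
    allowed_uses: list[str] = field(default_factory=list)
    review_process: str = ""               # e.g. "NEPA Environmental Assessment"
    typical_timeline_months: Optional[int] = None

# A centralized registry would aggregate these records across every jurisdiction.
registry: dict[str, PermitRequirement] = {}

def register(req: PermitRequirement) -> None:
    registry[req.jurisdiction] = req

def lookup(jurisdiction: str) -> Optional[PermitRequirement]:
    return registry.get(jurisdiction)

register(PermitRequirement(
    jurisdiction="BLM - Example District",      # hypothetical entry
    land_type="federal",
    allowed_uses=["geothermal", "solar"],
    review_process="NEPA Environmental Assessment",
    typical_timeline_months=18,
))
print(lookup("BLM - Example District"))
```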

(Blumen Systems is working on building a solution! If this sounds interesting, they recently raised a round and are hiring.)

Example Customers: Heirloom, Fervo, Nira Energy, Paces, Monarch Hydrogen, Cypress Creek

Resources


Rebate & Incentive Retrieval 📜

The Inflation Reduction Act will spend an estimated $800B in the US over the next 10 years, and it follows the CHIPS Act and other legislation that unlocked more capital for climate, for both consumers and businesses. Rebate and tax credit programs are not new, but this amount far surpasses any climate funding we’ve seen before.

In practice, utility rebate programs are exploding, covering everything from installing a heat pump water heater to purchasing an electric vehicle charger or upgrading a home’s insulation. There are thousands of these programs across the U.S.

The problem is that this information is not aggregated: it is scattered across electric utility websites, state energy offices, and nonprofits that promote clean energy and electrification. We can use these sources as a starting point, but we need to ingest as much information as possible, then sift through it and check it for accuracy.
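
A rough sketch of that ingest-then-verify loop might look like the following; the URL, field names, and validation rule are illustrative assumptions rather than a finished design.

```python
import json
import requests
from openai import OpenAI

REQUIRED = {"program_name", "administrator", "technology", "amount", "expiration"}

def extract_programs(url: str) -> list[dict]:
    # Fetch a rebate listing page and ask an LLM to structure every program on it.
    html = requests.get(url, timeout=30).text
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{
            "role": "user",
            "content": ("List every rebate or incentive program on this page as a JSON array "
                        f"of objects with keys {sorted(REQUIRED)}:\n\n{html[:12000]}"),
        }],
    )
    return json.loads(response.choices[0].message.content)

def needs_review(program: dict) -> bool:
    # Flag records with missing fields so a human can check them against the source.
    return not REQUIRED.issubset(program)

# programs = extract_programs("https://utility.example.com/rebates")  # hypothetical URL
# flagged = [p for p in programs if needs_review(p)]
```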

Example Customers: Eli, Upfront (and every white-glove home decarbonization service)

Resources


Carbon Registry Document Generator 📝 

In their ideal form, carbon markets allow companies to account for their emissions by funding carbon removal elsewhere. Purchasing carbon credits to offset emissions should only occur once all possible emission reductions have been made; that transition will take a while. To build trust in carbon credits, third-party registries and verifiers exist to ensure projects are high quality.

Registries provide a transparent and accountable framework for tracking and verifying the legitimacy of offset projects and their associated emissions reductions. They maintain comprehensive records of offset transactions and associated information, and enable the accurate quantification, reporting, and auditing of carbon offsets, ensuring their credibility and promoting trust among stakeholders. 

The problem is that these “comprehensive records” often take the form of long, manually generated PDF documents describing the project activities, how emissions reductions will be achieved, and independent validation of the project’s claims. The challenge is to use AI to assist in the generation (and optionally the review) of registry documentation. These documents are public on registry databases such as Verra and CAR, providing ample training/prompting data.

Think “TurboTax” for carbon registries.
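
A minimal sketch of that TurboTax-style flow: structured answers go in, a draft registry section comes out. The section names, project data, and prompt wording are assumptions; real registry templates (e.g. Verra VCS, CAR) define the exact required sections.

```python
from openai import OpenAI

PROJECT = {  # illustrative project data only
    "name": "Basalt ERW Pilot",
    "activity": "enhanced rock weathering on row-crop fields",
    "baseline": "conventional agricultural lime application",
    "monitoring": "soil sampling and porewater alkalinity measurements",
}

def draft_section(section: str, project: dict) -> str:
    # Turn structured answers into a draft of one registry document section.
    prompt = (
        f"Draft the '{section}' section of a carbon registry project description.\n"
        f"Project data: {project}\n"
        "Write in the formal style of public Verra/CAR project documents and "
        "flag any information that is missing."
    )
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# print(draft_section("Description of Project Activities", PROJECT))
```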

Example Customers: Perennial, Terradot (and every carbon project developer that wants to list credits on Verra/Gold Standard/etc.)

Resources

Field Boundary Editor 💡

Over 15 percent of annual emissions, or 9 gigatons, can be attributed directly to our food system; we must change agriculture and our food system from the ground up.

There are many ways to tackle rebuilding our agricultural system, and all of them would benefit from solving one shared problem:

One of the biggest and most difficult-to-automate challenges in the digital agriculture space is the collection and refinement of agricultural field boundaries. This is especially important for regenerative agriculture programs, because errors in the field boundary introduce errors in the effective area, leading to an under- or over-estimation of the climate benefit achieved by that farm. Field boundaries almost always include roads, buildings, trees, or non-planted areas that shouldn’t be there, since many are drawn by the farmers themselves. Polygon boundary refinement is also a common issue across practically all geospatial problems.
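
As a small illustration of the geometry step only, the sketch below corrects a hand-drawn field boundary by subtracting known non-planted features; the coordinates are made up, and a real pipeline would detect those features from imagery, for example with SAM via the segment-geospatial library listed in the resources below.

```python
from shapely.geometry import Polygon

field = Polygon([(0, 0), (100, 0), (100, 60), (0, 60)])   # farmer-drawn boundary
road = Polygon([(45, 0), (55, 0), (55, 60), (45, 60)])    # road crossing the field
barn = Polygon([(5, 50), (15, 50), (15, 58), (5, 58)])    # building inside the boundary

# Remove the non-planted features to get the effective planted area.
refined = field.difference(road).difference(barn)

print(field.area, refined.area)  # 6000.0 vs. 5320.0
```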

Example Customers: Perennial, UNDO, Eion

Resources

  • Example field boundaries from the 2008 Common Land Unit dataset with context images here.

  • segment-geospatial: a library for using SAM with geospatial data

  • Example notebook

Parsing Freeform Conversations into Structured Data 🗣

To solve the climate crisis, we need to collaborate with communities that predominantly do business through phone calls and in-person conversations. Farmers, for example, are a critical stakeholder in the green transition, since they are the ones implementing regenerative agriculture and several rock weathering solutions.

Despite the proliferation of precision agriculture, farming is fundamentally still face-to-face. That means we spend a lot of time in freeform conversation with farmers (i.e., phone calls). Farmers can tell you about all 15 fields they manage and the quirks of each, and also remember that on a specific field they put down 150 lbs/ac of nitrogen and 200 lbs/ac of potash, and that it rained for two weeks before planting.

A tool that uses AI to pull structured information out of these conversations, flag potential issues and blockers, run sentiment analysis, and even guide the conversation based on a text description or checklist would be extremely powerful.
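
One hedged sketch of such a pipeline (not Lithos' actual tooling): transcribe the call, then ask an LLM to return structured, field-level facts. The output keys and file name are illustrative.

```python
import json
from openai import OpenAI

client = OpenAI()

def transcribe(audio_path: str) -> str:
    # Speech-to-text on the recorded call (e.g. an mp4 from a phone recording).
    with open(audio_path, "rb") as f:
        return client.audio.transcriptions.create(model="whisper-1", file=f).text

def extract_facts(transcript: str) -> list[dict]:
    # Ask the model for structured, per-field facts from the conversation.
    prompt = (
        "From this call with a farmer, return a JSON array of records with keys "
        "field_name, input_applied, rate_lbs_per_acre, weather_notes, open_issues:\n\n"
        + transcript
    )
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
    )
    return json.loads(response.choices[0].message.content)

# facts = extract_facts(transcribe("farmer_call.mp4"))  # hypothetical recording
```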

Example Customers: Lithos Carbon, Yardstick, Eion, Perennial, Charm Industrial

Resources

  • Reach out to Henry from Lithos to get access to example conversations with farmers, including mp4 recordings and transcripts


Why Now?

We’re at the intersection of the fastest momentum we’ve ever seen for combating climate change and breakthroughs in AI, language models in particular. Now is the time to revisit a lot of what we’ve built and make it more resource-efficient.

Note that Climate x AI is not new (think Pachama, Perennial, Yardstick, etc.), but recent developments in AI unlock a new set of possibilities.

Some of the software tools that have eased that transition include products that make it easy to train models (Roboflow), find GPUs (Brev.dev), store embedding data (Chroma), and build on top of one of the most advanced language models (OpenAI’s GPT-4).
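
For the embeddings piece specifically, a minimal sketch with Chroma's default in-memory client looks like this; the documents and query are illustrative.

```python
import chromadb

client = chromadb.Client()  # in-memory; uses Chroma's default embedding function
collection = client.create_collection(name="climate_docs")

collection.add(
    ids=["tariff-1", "rebate-1"],
    documents=[
        "Example tariff: demand charges apply above 1 MW of peak load.",        # illustrative
        "Example rebate: $500 for qualifying heat pump water heater installs.",  # illustrative
    ],
)

results = collection.query(query_texts=["What rebates exist for heat pumps?"], n_results=1)
print(results["documents"][0][0])
```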

We at Streamline have benefited from the explosion of generative AI: within a few months, we were able to go from zero to an end-to-end grant sourcing, qualification, and drafting platform for climate companies. Our goal with the hackathon was to accelerate learning and get other impact-minded engineers into the weeds of the problem. The structure of the hackathon, with climate companies staying to mentor, increased project quality and actually produced useful hacks; for example, the winning team, AgriTalk, successfully addressed Lithos’ challenge with an end-to-end solution.

If you work in climate and have a problem you think LLMs can solve, or you’re an aspiring startup founder looking to work on one of these problems, please reach out to helena@streamlineclimate.com!

Thanks for reading! 💚

🙏 Special thanks to Rohan Nuttall, Shawn Xu, Douglas Qian, Jamie Wong, Vikrum Aiyer, Derek Gann, David Schurman, Henry Liu, Hannes Boehning, Josh Santos, Sam Steyer for reviewing drafts and contributing ideas.
