Aklimate
CTO, Co-Founder
The Problem
Climate change poses the most significant global challenge of our time. In the historic Paris Agreement of 2015, 197 nations pledged to limit global warming to "well below 2°C above pre-industrial levels" by the end of this century, with a concerted effort to constrain the increase to 1.5°C. This target reflects the understanding that keeping warming below 1.5°C may allow us to avoid the most catastrophic consequences of climate change. At the time of writing, we are at a ~1.27°C increase relative to the 1880 baseline.
To stay within this 1.5°C limit, businesses must cut their greenhouse gas emissions by 50% (relative to their 2018 emissions) by 2030. The Science Based Targets initiative (SBTi) has translated this goal into tangible, quantifiable climate targets for businesses. However, the problem goes beyond a few large corporate emitters: a staggering 80-90% of corporate emissions originate in supply chains. To effectively track and manage emissions, businesses must therefore gain insight into their suppliers' emissions. The challenge is that many suppliers do not collect this data, owing to the perceived cost and complexity of building a carbon accounting competency in-house.
Recognizing this, my two co-founders and I embarked on the task of finding a way to use software to unravel the complexity, cost, and delays hindering climate action in the business sector. This journey led us to build Aklimate. Below I share some of the insights gained along the way.
Navigating Towards Product-Market Fit
At the start of our journey, we believed that suppliers were not adopting carbon accounting primarily because of its inherent complexity. To test this hypothesis, we built a simple carbon calculator and approached small and medium-sized enterprise (SME) suppliers in the UK, across a variety of industries. Although a small number of environmentally-conscious businesses expressed interest, the majority of the SMEs we spoke with lacked a strong economic incentive to adopt carbon accounting by themselves.
Going back to the drawing board, we hypothesised that the powerful incentive suppliers currently lacked would arrive in the form of corporate customers wielding their buying power to encourage (and eventually push) their suppliers to provide regular carbon data. This prediction was grounded in upcoming legislation in the UK, Europe, and the US that pushes corporations to report supply chain emissions (through mandatory Scope 3 disclosures). Though the rollout of the new legislation will be gradual, this shift presented an opportunity to revamp our go-to-market strategy. Working with the corporate as the paying customer, instead of charging individual suppliers directly, brought three key advantages to the business:
- Reducing Sales Engagements, from N to 1. The corporate customer introduces us to the supplier, lending credibility and eliminating the need for individual sales pitches to each supplier.
- Incentivized Supplier Participation. Suppliers are more willing to measure their carbon footprint due to the preferred supplier status conferred by the corporate customer.
- Corporate ROI Realization. Charging license fees to the corporate customer, rather than the supplier, further reduces friction and gives the corporate a strong economic incentive to ensure supplier follow-through.
However, perhaps the most transformative aspect of our new approach is the network it creates between suppliers and corporates. As we onboard more corporates, suppliers already on Aklimate can grant customers they have existing relationships with access to their carbon data, without additional work for the supplier. Conversely, corporates at the outset of their journey are swayed towards choosing our solution by the prevalence of adoption among their supplier base. This creates inherent value and stickiness for both suppliers and corporates, which compounds as the network grows.
We made a video to explain our approach:
Adapting requirements
Pivoting toward large corporate supply chains introduced new demands and opportunities for the technical design of our application. I identified three new requirements:
- The most critical component of the application, the carbon calculation engine, needed to be modular, consistent, and testable. This would allow a greater level of transparency and predictability when composing the calculation logic for a larger range of companies.
- The data input section of the application needed to be a real-time, collaborative experience, similar to the one found in most modern editors (Notion, Google Docs, etc.). This would allow multiple stakeholders from the same organisation to work safely alongside each other without overwriting each other's changes.
- The interface design of the application needed to be extremely intuitive. Recognizing our constraints as a three-person team and the impracticality of providing personalized customer support to thousands of suppliers, we needed to drastically simplify the user interface to remove points of potential confusion.
Based on these requirements, I chose to build the backend in Elixir and Phoenix, GraphQL (with Absinthe) as the API layer, and the frontend in Next.js (TypeScript):
- I chose Elixir for two main reasons. Firstly, Elixir's lightweight process model (enabled by the BEAM virtual machine that runs all Elixir/Erlang code) is ideally suited to handling a large number of simultaneous connections. This provided a robust foundation for building out our real-time functionality.
- Secondly, as a functional language, Elixir is a perfect fit for building the calculation system. The calculation system is composed of dozens of modules that interact with the same input. By eliminating the possibility of modifying this input, or inadvertently adding global state in some other way, the calculation composition can be changed while guaranteeing consistent results.
- As the most popular web framework for Elixir, Phoenix provides a productive, Rails-inspired framework that minimises the number of non-critical decisions one has to make while building out the conventional parts of any application (controllers, routing, etc.). Phoenix's closeness to Rails has also helped us hire from the much larger pool of Rails developers. To capitalise on this, we run a practical course and mentoring system (using Exercism) for Rails developers who are new to Elixir / Phoenix, taking them from zero to productive Phoenix development in two weeks.
- TypeScript provides the type safety that lets us move faster while maintaining a baseline level of confidence in the integrity of the frontend. It complements our integration (Jest) and E2E (Cypress) tests by reducing our exposure to one very common category of errors.
- Lastly, React / Next.js was a natural choice given our use of GraphQL. Chakra UI allowed us to build a simple, accessible frontend quickly, which helped given the large surface area of the measurement forms. You can see how it turned out in this orientation voiced over by our CEO, Harry:
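To give a flavour of the calculation engine design described above, here is a minimal, hypothetical sketch of pure modules composed over the same immutable input. Module names and emission factors are illustrative only, not our production logic:

```elixir
defmodule Carbon.Electricity do
  # kgCO2e = kWh consumed * grid emission factor (illustrative factor)
  def calculate(%{electricity_kwh: kwh}), do: kwh * 0.207
  def calculate(_input), do: 0.0
end

defmodule Carbon.Travel do
  # kgCO2e = km driven * per-km emission factor (illustrative factor)
  def calculate(%{car_km: km}), do: km * 0.171
  def calculate(_input), do: 0.0
end

defmodule Carbon.Engine do
  # Every module receives the same input and cannot mutate it,
  # so modules can be added, removed, or reordered safely.
  @modules [Carbon.Electricity, Carbon.Travel]

  def total(input) do
    @modules
    |> Enum.map(& &1.calculate(input))
    |> Enum.sum()
  end
end
```

Because each module is a pure function of the input, `Carbon.Engine.total/1` simply sums the module results, and adding a new emissions category is just another entry in the list.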
Pattern Matching
To entice developers who are exploring Elixir for the first time, I'd like to spotlight just one of the language features that help to make Elixir a joy to work with: Pattern Matching.
In Elixir, functions can be defined multiple times with different "patterns" of arguments. These definitions are called function clauses (often likened to function overloading in other languages), and they are just one of a few different kinds of pattern matching in Elixir. They allow the language to choose which clause to call based on the shape and contents of the arguments you pass in.
For instance, consider the following simple module that calculates the length of something, handled differently depending on the type of variable you pass in:
defmodule CalculateLength do
# When the argument is a list
def length(arg) when is_list(arg), do: Enum.count(arg)
# When the argument is a string
def length(arg) when is_binary(arg), do: String.length(arg)
# When the argument is a map
def length(arg) when is_map(arg), do: map_size(arg)
# When the argument is a tuple
def length(arg) when is_tuple(arg), do: tuple_size(arg)
end
In this example, there are four different versions of the length function, each with a different argument pattern. All of the functions actually have the same number of arguments, but they are differentiated by guard clauses (when ...) that select for the type of the argument being provided. The clauses are evaluated from top to bottom until one pattern matches. If none of them match, you get a FunctionClauseError:
iex> CalculateLength.length([1, 2, 3])
3
iex> CalculateLength.length("Elixir")
6
iex> CalculateLength.length(%{a: 1, b: 2, c: 3})
3
iex> CalculateLength.length({1, 2, 3})
3
iex> CalculateLength.length(nil)
** (FunctionClauseError) no function clause matching in CalculateLength.length/1
In the real world, this language feature is particularly useful for unpacking many divergent code paths, based on the shape or presence/absence of critical information. In other languages, this is typically handled with a chain of if and elsif clauses, which can make the function long and unwieldy.
For example, this overloaded function check_access from our codebase helps us gate access to private data structures, by checking if an organisation is authenticated:
# Aklimate staff (admins) can access any property
def check_access(property, _context = %{current_user: %{role: :admin}}), do: {:ok, property}

# An organisation can access its own data
def check_access(property = %Org{id: org_id}, _context = %{org: %Org{id: current_org_id}})
    when org_id == current_org_id,
    do: {:ok, property}

# Fallback: nothing matched, so deny access
def check_access(unrecognised_schema, _ctx) do
  {:error, "Schema type #{unrecognised_schema.__struct__} not recognised as property."}
end
Taking each function from top to bottom:
- In the first function, we simply check whether the user requesting access is an admin (Aklimate staff member) by pattern matching on the user's role enum field. If so, they get access.
- In the second function, we first check that the requested data structure is an %Org{} (Organisation) by pattern matching on the name of the data structure's schema, while also destructuring the Org to get its ID, which we bind to org_id. We do a similar pattern match and destructuring on the organisation ID of the user making the request. The two IDs are then compared in the guard clause, so the function body is only evaluated if the user belongs to the organisation they are attempting to access.
- In the third function, we handle the default (or fallback) case, which runs when none of the previous clauses have matched. This function allows anything to match by not qualifying the parameters in any way, and returns an :error tuple, which conventionally signifies an unacceptable result in Elixir.
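To make these clauses concrete, here is a self-contained toy version you can run directly. The Org struct and the Authz module name are stand-ins for illustration, not our actual schema:

```elixir
defmodule Org do
  # Minimal stand-in for an Organisation schema (illustrative only)
  defstruct [:id]
end

defmodule Authz do
  # Clause 1: admins (staff) can access any property
  def check_access(property, %{current_user: %{role: :admin}}), do: {:ok, property}

  # Clause 2: an organisation can access its own data;
  # the guard compares the two destructured IDs
  def check_access(property = %Org{id: org_id}, %{org: %Org{id: current_org_id}})
      when org_id == current_org_id,
      do: {:ok, property}

  # Clause 3: fallback, deny anything that didn't match above
  def check_access(unrecognised, _ctx),
    do: {:error, "Schema type #{unrecognised.__struct__} not recognised as property."}
end
```

Calling Authz.check_access(%Org{id: 1}, %{org: %Org{id: 1}}) matches the second clause and returns an :ok tuple; the same property requested by a different organisation falls through to the fallback clause.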
In this way, Elixir lets you move much of a function's conditional logic into its signature. Although this is one of the more idiomatic features of Elixir, there are still, of course, plenty of control structures (cond, case, and even if) that let you evaluate conditions within the function body. Function overloading is not inherently faster, so the choice here is really a question of code readability. Moreover, because guard clauses permit only a limited set of operators and functions, you may find that certain conditions can only be written in the body anyway.
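For comparison, here is a sketch of the same CalculateLength logic written as a single function body using cond. The behaviour is equivalent; only the structure differs, with the dispatch moved from the signature into the body:

```elixir
defmodule CalculateLengthCond do
  # Single-clause version of CalculateLength: the type checks that
  # lived in guard clauses now live in a cond inside the body.
  def length(arg) do
    cond do
      is_list(arg) -> Enum.count(arg)
      is_binary(arg) -> String.length(arg)
      is_map(arg) -> map_size(arg)
      is_tuple(arg) -> tuple_size(arg)
      true -> raise ArgumentError, "unsupported type"
    end
  end
end
```

Both versions read clearly at this size; the multi-clause form tends to win as the number of cases, and the complexity of each pattern, grows.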
Creating a space where engineers thrive
Since joining Y Combinator and raising our seed round, we have had the pleasure of welcoming two engineers to our growing team. We have been 100% remote (UK, Europe) from the beginning, and we spend a lot of time thinking about what it takes to create a nurturing, calm environment where engineers can do their best work, wherever they choose to work from.
Culture of Trust and Autonomy
We understand that people perform at their best when they have a real sense of ownership over what they're building. In practice, this means we work with engineers who want to own the products and features they work on: measuring adoption among our users and helping to steer the roadmap. This integrated approach allows everyone to contribute meaningfully and to develop a holistic understanding of the product and its users.
Autonomy also applies to your working hours. We don't track time, and we don't expect you to be online all the time. We trust you to manage your own time and to take time off when you need it. We believe that a healthy work-life balance is essential to doing your best work.
Room for Learning
If you're not familiar with Elixir and/or TypeScript, that's not a barrier to joining our engineering team. We understand that once you know one or two programming languages, learning another one comes much quicker. As part of your role, we provide protected time and resources to familiarize you with these languages. The same applies to the softer, but equally important, skills of product management, design, and mentoring. We don't just hire for skills — we hire for potential, passion, and the willingness to learn.
Inclusion and Diversity
We care deeply about fostering an environment where everyone feels valued and included. In practice, this means we don't care about whether you studied Computer Science (or anything) at university, where you grew up, or anything else like that. What truly matters to us is your passion for learning, your problem-solving skills, and your ability to work autonomously.
Top Tooling
We also understand that great tools are the foundation of effective work, which is why we provide you with a brand-new MacBook Pro and a £500 budget for any home office gear that helps you stay comfortable and productive. The same applies to software. Our current workflow stack is:
- Linear for project management
- Tuple for pair programming
- GitHub for version control
- Figma for design and prototyping
We're always looking for talented individuals to join our team. If you want to take on the challenge of our times, and become part of an innovative, remote company that values trust and autonomy, we'd love to hear from you.