The Mission
The climate crisis demands urgent, data-driven action — and we’re building tools to make that possible. Our mission is to empower smarter, faster decisions in the face of wildfires and their cascading impacts. We’re developing products that provide actionable insights — tracking burnt areas, above-ground biomass loss, vegetation regrowth, CO₂ emissions, and infrastructure damage — for reforestation, carbon markets, insurance, and disaster response.
We are looking for a software engineer with a remote sensing background to help us establish and operationalise our Wildfire Impact & Recovery product. You’ll turn research and prototypes into resilient, production-grade systems that run globally, reliably, and automatically. If you’re motivated by building systems that make a real-world impact, this is the role for you.
Relevant Skills
We expect that you bring the following:
Proven professional experience (3+ years) in a software engineering role
Proven professional experience (1+ year) in developing remote sensing or computer vision solutions
Solid grasp of object-oriented programming (OOP) and a strong foundation in designing and implementing robust, scalable software
Strong Python skills, particularly with the scientific and geospatial ecosystem (e.g., xarray, Satpy, NumPy, pandas, rasterio, Celery)
Familiarity with Docker and the Cloud Native landscape
Experience maintaining production systems: you know what can go wrong in ops and how to prevent it
Strong Git knowledge and comfort working in collaborative development workflows
Strong problem-solving skills and the ability to work effectively in a collaborative team environment
Your work is thorough and you pay attention to both the details and the larger picture
You like to help others grow and succeed
Strong communication skills
Degree in Computer Science, Geoinformatics, Remote Sensing, or a related STEM field — or equivalent experience
Nice to Have
What skills and knowledge will make a difference:
Familiarity with geospatial formats and standards (Cloud-Optimised GeoTIFF, NetCDF, Zarr, STAC, OGC web services)
Experience with SQL, PostgreSQL/PostGIS, or other relational databases
Exposure to CI/CD pipelines and automated testing for data workflows
Contributions to open-source projects or maintenance of your own
Experience with workflow orchestration tools (e.g., Dagster, Airflow, Prefect)