Careers
At ERA, you will help operate one of the world’s largest, longest-running AI safety and governance talent programmes.
We are always looking for talented, ambitious, and motivated individuals to join our team. In 2025, we ran a Summer Research Fellowship, hosting 30+ speakers and workshops over its eight weeks; ran a Technical AI Governance Forum; and co-organised the 2025 Vegas AI Security Forum. You can see some of our past research here. We are uniquely positioned at the intersection of technical and governance research for mitigating risks from frontier AI, and we are looking for strong hires to help us build substantial field-building infrastructure in this space.
We just closed our Autumn 2025 hiring round, but still have one position open (deadline: 3rd October):
AIxBio Research Manager (1-3 Hires) – Apply Here
We will be running at least two iterations of our flagship Research Fellowship in 2026, with strong ties to the University of Cambridge, alongside other pilot programmes such as an AIxBiosecurity Research Fellowship. Over the past five years, we have supported over 120 early-career researchers from 10+ countries through our research fellowships and conferences, with high counterfactual impact on their careers. We provide our fellows with mentorship from organisations such as UK AISI, CAISI, RAND, GovAI, and Google DeepMind, and our alumni have gone on to lead work at impactful institutions in this space.
Please contact hello@erafellowship.org if you have any questions. If you might be interested in joining our team in the future, we'd love to hear from you through our General Expression of Interest Form. We will be in touch about relevant opportunities as they arise.
Application Process
Applications for our Autumn 2025 hiring round closed on 29th September (anywhere on Earth), and we expect to hire a significant number of people through this round. Here's a tentative timeline and overview of the application process:
- Applications closed at midnight (UK time) on Monday 29th September. More information (including job descriptions) can be found here. We're cognizant that this was a tight turnaround for applications, and we want to respect candidates' time; with this in mind, the initial application stage only requires candidates to submit their CV or LinkedIn profile, along with a few short questions about background and motivation. Applications will be reviewed on a rolling basis.
- Following the initial application screening, candidates progressing to the next stage will be invited to interview with the ERA team, also on a rolling basis. We aim to send all interview invitations by 3rd October and to complete all interviews by 10th October. We understand that this timeline might not work for all candidates; in some cases, we may be able to accommodate interviews in the following week as well.
- In parallel with conducting interviews, ERA will reach out to the professional references that you provide us.
- We aim to extend offers and communicate final decisions to new ERA team members by mid-October!
Life at ERA
We are a tight-knit team who all deeply care about our mission of mitigating catastrophic risks from frontier AI. ERA has been working on AI safety and governance since before the launch of ChatGPT, making it one of the world's largest and longest-running AI safety and governance talent programmes. You could help us 10x what we are doing, and we are really excited about bringing top talent onto all our teams.
ERA’s Culture & operating values centre around 3 core pillars:
Integrity: Doing the right thing, especially when no one is watching.
Intimacy: Building genuine, psychologically safe relationships so we can tackle the hard problems together.
Intensity: Operating with focused urgency, because the stakes are high and the time is limited.
Harrison Gietz, Programme Director at ERA:
“To me, ERA strikes a perfect balance between professionalism and entrepreneurial culture. It's incredibly empowering to work with such a high-velocity, motivated team; I have an exceptional degree of autonomy and ownership, but coupled with reliable, thorough support from the rest of the team. I've been consistently impressed with the quality and pace of work from this organisation.”
Genevieve Gaul, Programme Associate at ERA:
"There is a genuine interest in my own personal development, and in making sure that I can try new responsibilities, explore different skillsets, and have agency within the team. ERA has an environment which balances genuine challenge with rest and growth. Being a part of ERA, and learning about the way it operates, is definitely a valuable experience in itself.”
Marta Strzyga, Operations Manager at ERA:
“Working here has been an incredible opportunity to grow and develop my skills in operations. ERA moves fast and doesn't break things, which creates a great combination of agility and excellent governance. Every person on the team gets to contribute and influence the shape of the work we do: there is a strong sense of trust and mutual support. It's a pleasure to create the operational backbone for ERA's impact.”
Cameron Tice, Technical AI Governance Research Manager at ERA:
“Research managing for ERA allowed me to move from targeting fellowships to full-time roles within AI Safety. I learned how to unblock a variety of projects across the field of AI Safety, and was able to do this with some of the most incredible talent I've had the pleasure of working with.”