Own and run impactful workshops that introduce promising students to careers in AI safety and biosecurity.
Know someone who might be a good fit? If your referral gets hired and completes 6 months with us, we'll pay you a $5,000 referral bonus.
Global Challenges Project (GCP) is a series of workshops that help students explore careers in AI safety and biosecurity. Over the past few years, GCP has run dozens of three-day workshops for 400+ students, teaching participants to think critically about these fields. Most workshops have been hosted in Oxford, Boston, and Berkeley, with 5-6 workshops per year.
Kairos is excited to announce that the leaders of GCP have asked us to host the program going forward. Kairos is a nonprofit focused on accelerating talent into AI safety and policy. In just one year, we've served over 600 people through our flagship programs, SPAR and Pathfinder.
Our fellows have published at NeurIPS and ICML, been featured in Time, and gone on to lead impactful work across top research labs, think tanks, and policy orgs. We've helped dozens land AI safety roles and supported over a hundred meaningful actions in the space after our programs concluded.
We see ourselves as a portfolio of highly impactful projects in a fast-evolving field, and we're just getting started. By 2026, we plan to launch even more ambitious initiatives addressing critical gaps in the ecosystem and continue rapidly scaling up our efforts.
We're looking for a Director of Global Challenges Project to own and run the program, delivering impactful workshops to promising talent. You'll also lead our OASIS workshops for top AI safety university organizers (hosted at the Constellation office), support other Kairos programming, and potentially lead new programs.
GCP workshops introduce participants to foundational concepts like scope sensitivity and taking ideas seriously before diving into AI safety or biosecurity. Each topic gets one day, featuring an intro talk you'll deliver plus presentations from notable guest speakers at top orgs in these spaces, with social activities woven throughout.
Based on the data we've seen, GCP has been quite impactful in motivating students to work on these problems. While many opportunities exist for shallow engagement with these topics, most people don't get space to think critically about their role and potential impact unless they're accepted into competitive in-person research fellowships. We think GCP fills an important gap by giving people earlier in the pipeline a chance to engage deeply with high-context individuals.
You'll travel roughly every month or two (usually 5 to 6 days at a time) to facilitate GCP and OASIS workshops. As the primary person on the ground, you'll coordinate between participants, guest speakers, and staff, deliver core presentations on AI safety and biosecurity, and ensure events run smoothly. You'll also have latitude to make design improvements to these workshops and propose new events for Kairos to run.
You'll plan workshops (sometimes multiple concurrently), thinking through all details and requirements. Rather than handling logistics yourself, you'll supervise and coordinate external contractors who execute the nitty-gritty operational work: booking venues, managing catering, handling tech setup, and other tactical tasks. Your focus will be on event design, clear delegation, quality oversight, and ensuring contractors deliver against your specifications and timeline.
You'll review workshop applications, interview candidates, and make final selection decisions.
You'll drive recruitment efforts to ensure strong applicant pools for all workshops.
You'll contribute to the SPAR and Pathfinder programs, as well as other organizational priorities as needed.
You demonstrate strong internal motivation and ownership. You proactively upskill on complex tasks and reliably drive toward goals.
You have a strong understanding of the basic case for AI safety and at least some baseline familiarity with biosecurity. This is critical since you'll need to deliver presentations on these topics multiple times per year and field questions from sharp participants.
You're motivated by reducing catastrophic risks from AI and pandemics and ensuring a positive transition to transformative AI. You take individual responsibility while supporting the team. When challenges arise, you roll up your sleeves and contribute wherever you can make a difference.
You've organized multiple events, such as workshops or conferences, and feel comfortable making game-time decisions. You know how to be professional yet friendly and approachable during events. If you lack direct event experience, strong project management experience can substitute.
You manage multiple workstreams effectively, anticipate problems before they arise, and keep careful track of tasks and deadlines.
You'll travel regularly for this role. You should be able to travel easily in and out of the US and UK, and enjoy the experience.
$90,000–$150,000
This will depend on experience, seniority, and location, with the potential for additional compensation for exceptional candidates. We will also pay for work-related travel and expenses. If you work from an AI safety office in Berkeley, London, or Cambridge, MA, we'll cover meals and office expenses.
10% 401(k) contribution or equivalent pension contribution
We don't have strict location requirements, although we prefer candidates who can be available for meetings for most of a working day in the ET time zone. We can pay for access to office spaces in Berkeley, London, or Cambridge, MA, or for coworking space if you're based elsewhere.
We host biyearly all-team retreats to connect in person, collaborate, and build team culture.
Flexible working hours, competitive health insurance, dental and vision coverage, generous vacation policy, a flexible policy for productivity-related expenses, and a professional development budget.
Kairos is a small, dynamic, and high-trust team motivated by the urgent challenge of making advanced AI go well for humanity.
We also believe meaningful work should be enjoyable. We support each other's well-being, celebrate wins, and maintain a healthy sense of humor even when the work is challenging.
If you're passionate about our mission but unsure whether you meet every qualification, we encourage you to apply anyway. You can also submit a general expression of interest if you don't think you're a good fit for this role but would be interested in other future roles at Kairos.
If you're excited about supporting the next generation of AI safety talent and want to make a tangible impact in this critical field, we'd love to hear from you.