Seeking funding? We're kicking off grantmaking.ai with a $1M launch round. Learn more.

grantmaking.ai

We're building an AI safety funding platform, and we're kicking it off with a $1M grant round.

Find, evaluate, and fund AI safety work, without rebuilding the map yourself.

Learn more

We're a small non-profit team with experience building infrastructure for the AI safety community.

AISafety.com · Manifund

CeSIA

The French Center for AI Safety (Centre pour la Sécurité de l'IA) is a Paris-based non-profit think tank and research center working to reduce risks from AI through education, technical research, and policy advocacy in France and Europe.

Team: 6 (Medium), led by Charbel-Raphaël Segerie

Amount needed: $40,000 / $75,000 (Med)

Why we're building this

The ecosystem is about to receive billions. It isn't ready.

Dozens of new funders are about to deploy billions into AI safety. The current system wasn't built for them.

Smaller grants get missed

Projects asking for <$50K are often too small for larger funds and too hard to discover casually.

Due diligence gets repeated

Every donor rebuilds the same picture. Every grantee re-answers the same questions.

Funding status is unclear

It's hard to tell who is actively fundraising and what their runway looks like.

Discovery is insider-heavy

Good opportunities and honest takes on them are still mostly passed around through private networks.

We're building the coordination layer the ecosystem needs: comprehensive public data, living funding status, and trusted signal — all in one place.

What's in the database

A growing public map of AI safety funding opportunities.

Here's a preview of the 454 organizations in our broader database.

Existential Risk Observatory

A Dutch foundation that works to reduce existential risk by informing the public debate through media engagement, policy advocacy, research, and public events.

Team: 5 (Medium), led by Otto Barten

Amount needed: $8,000 / $25,000 (Low)

Seeking funding?

grantmaking.ai is built to help you get funded.

Seeking funding soon? Apply to our $1M grant round.

$5K–$50K grants · Focused on existential AI safety · Initial decisions in weeks

We're partnering with Manifund to distribute $1M in grants for AI existential safety work.

Applications will be opening soon, with more details to follow. Sign up to get notified!

Not fundraising right now? We still want your profile in the database! We're building a comprehensive AI safety database, so when you do need funding, your profile is already there, already current, already visible to every funder.

Step 1

Find or create your profile

We'll start with public information where we can, so you're not always beginning from a blank page.

Step 2

Add the context funders need

Claim the profile, fix anything outdated, and add the details funders can't easily see elsewhere.

Step 3

Stay discoverable to funders

grantmaking.ai

We're building a public coordination layer for AI safety funding — so donors can move faster, grantees can stay visible, and the ecosystem can share more of its knowledge.

Questions? Takes? Suggestions?

Email us. We want to hear from you.

hi@grantmaking.ai

© 2026. This site is released under a CC BY-SA license.



grantmaking.ai

An AI safety funding platform. A database of organizations, projects, and funds with a trust and signal layer on top.

Team: 3 (Small), led by Matt Brooks

Amount needed: $50,000 / $100,000 (Low)

See the whole landscape

Every public AI safety funding opportunity, and the context you need, in one place.

Follow trusted judgment

See what experienced evaluators think, not just what projects say about themselves.

Spot real funding gaps

See who is actively fundraising and where marginal dollars can help the most.

Fundraise efficiently

Put your request in front of dozens of donors and get feedback in weeks, not months.


Convergence Analysis

An international AI x-risk strategy think tank that conducts scenario research and governance analysis to mitigate risks from transformative AI technologies.

Team: 9 (Large), led by David Kristoffersson

Amount needed: $22,000 / $50,000 (Med)


Impact Academy Limited

A nonprofit that runs fellowships and educational programs to develop expert, mission-aligned talent for AI safety research and governance.

Team: 2 (Small), led by Diana Rendon

Amount needed: $55,000 / $100,000 (Low)

Pour Demain

A Swiss non-profit think tank that develops evidence-based policy proposals on AI safety, biosecurity, and emerging technologies, bridging science, politics, and civil society for Switzerland and beyond.

Team: 10 (Large), led by Patrick Stadler

Amount needed: $90,000 / $150,000 (Med)

Orthogonal

A non-profit AI alignment research organization focused on agent foundations, pursuing formal goal alignment approaches that would scale to superintelligence.

Team: 3 (Small), led by Tamsin Leake

Amount needed: $140,000 / $200,000 (High)

Your profile remains part of the database, so future donors can find you faster even after the initial round.
