Not all AI is good AI
Jul 28, 2025
You already know I’m a massive fan of using AI to save time. Heck, I’m building custom GPT bots for my Smart NonProfit AI Lab students today.
But there’s a line. AI is not something to fear—but it is something to be thoughtful about.
The thing is—AI can be a game changer for your nonprofit. But only if it’s used with a little strategy and a lot of common sense.
These are the things I don’t recommend you hand over to AI (without some serious experience and guardrails):
Fake images
Sure, those AI-generated pics of “smiling volunteers” look slick. But they also feel… off.
People can sniff out inauthenticity a mile away. If you’re talking about real people, real impact, or real need, AI images just aren’t it.
Try this instead:
Use AI to brainstorm the real photos you should be taking. Or upload your existing images and ask for caption ideas and content suggestions that connect with your community.
Sensitive or personal data
If you’re feeding client details, donor info or volunteer records into an AI tool? Stop. Right. There.
That data could be stored, reused, or even leaked. Not all AI tools are built with nonprofit privacy needs in mind—and most teams haven’t set clear internal guidelines.
What to do instead:
Set rules on what’s fair game to use. Or better—build a custom GPT trained on your own content, so your team doesn’t feel the need to upload anything risky.
AI-generated stories and testimonials
Yes, AI can write great content. But that “impact story” it spits out? Often totally made up.
That might fly for an ad agency. Not for a mission-driven org. People trust you because your stories are true.
Better idea:
Train a custom GPT with real impact stories, annual reports, testimonials, videos—then let it repurpose your truth into captions, emails and social posts.
Chatbots with no escape route
A chatbot sounds smart—until someone has a real question and gets stuck in an endless “I don’t understand” loop.
AI doesn’t know when someone’s having a tough day, asking something sensitive, or trying to find urgent help. It needs guardrails.
What works:
Use structured pathways where you control the info shared
Always offer a “speak to a human” option
Use chatbots to guide, not replace, real relationships
Bonus: they’re amazing for onboarding volunteers or helping people find the right program or page on a clunky website.
AI can be a game-changer—but only if it’s used with care, common sense, and clear boundaries.
If you’re still figuring out what that looks like in your org, you’re not alone.
That’s why I’ve been running more done-with-you sessions of the Smart NonProfit AI Lab lately: to help people like you use AI in ways that are strategic, safe, and genuinely useful.
Want a second brain on how to set up smart AI systems that don’t cross the line?
Or want to talk through whether AI is right for you? Reach out.
Or if you’re ready to build your own custom GPT with me (no tech experience needed) → [Book your session here]
Talk soon,
Alecia
Get this weekly blog direct to your email!
I will never sell your information, for any reason.