What your prompt is really giving you.

Go ahead. Try it right now. Open ChatGPT, Claude, Gemini. Pick your favorite. Type this:

"I'm 42, married, two kids, and I own a small business that does about $1.5M in revenue. I have some savings but I'm not sure if I'm on track for retirement. Can you build me a financial plan?"

You'll get something back in about 15 seconds. It'll be organized. It'll have headers. It might even have a table. It'll mention an emergency fund, maxing out retirement accounts, and diversifying your portfolio. You'll read it and think: huh, that's actually pretty solid.

And you'd be wrong.

What's Actually Happening Behind the Curtain

What you don't see is the set of instructions the AI is operating under before you ever type a word. Simplified, it looks something like this:

"You are a helpful financial assistant. You are not a licensed financial advisor. Do not provide specific investment recommendations. Do not recommend specific securities, funds, or allocation percentages. Do not provide tax advice. Always remind the user to consult a qualified professional. Assume no knowledge of the user's full financial picture, tax situation, estate documents, business entity structure, insurance coverage, risk tolerance, state of residence, or existing accounts. Provide general education only. Use balanced, non-directional language suitable for a global audience. Do not assume US tax law unless stated. Avoid projections that could be interpreted as guarantees. Include appropriate disclaimers."

Read that again. The AI has been told, before you showed up, to assume it knows nothing about you, to avoid anything specific, and to keep its advice safe enough for 8 billion people on the planet.

That's not a plan. That's generic, vanilla AI slop.

The 30-Second Plan vs. Your Actual Life

The prompt you typed had 35 words. A real financial plan for someone in your situation needs to account for things that prompt never mentioned. And the AI was told to ignore them even if you had mentioned them.

Things like:

  • What entity structure is your business operating under, and is it optimized for your current revenue and growth trajectory?

  • Are you taking a W-2 salary, distributions, or some combination? Has anyone modeled which split minimizes your total tax burden?

  • Have you evaluated a Solo 401(k) with employer contributions against a SEP-IRA, or explored whether a defined benefit plan makes sense at your income level?

  • What does your buy-sell agreement say, and do you actually have one?

  • Is your key-person insurance adequate, or does your business valuation expose a gap your family would feel?

  • How does your spouse's income interact with yours for AGI phaseouts, IRMAA thresholds, and Roth conversion planning?

  • What state are you in, and how does that change everything above?

That's seven questions. I have about forty more. The AI has zero.

Why "Pretty Good" Is Actually Dangerous

For someone with a W-2 job, a 401(k), and a straightforward financial life, the AI's generic framework might get you 70% of the way there. Fine.

But if you own a business doing $1.5M in revenue? The wrong retirement vehicle alone can cost you six figures in unnecessary taxes. A missing buy-sell agreement can unwind everything you've built. And the AI output looks competent enough that you might not realize what's missing until you're writing a check you didn't have to write.

AI doesn't know what it doesn't know about you. And it was designed not to try.

You're Only as Good as Your Prompt

Here's an irony worth sitting with: this article was written with AI.

But it wasn't written with a single prompt. It took a central thesis, multiple rounds of feedback, specific structural requests, tone adjustments, and iterative edits. Each one adding context the AI didn't have before. I didn't type "write me a financial planning blog post" and publish what came back. That would've produced something generic, surface-level, and forgettable. Sound familiar?

The same principle applies to your financial plan. If you give the AI 35 words and no context, you get a 35-word-depth answer wrapped in professional formatting. You're only as good as your prompt. And most people don't know what they don't know well enough to prompt for it.

That's the real gap. It's not that AI can't handle complexity. It's that the person typing the prompt doesn't know which questions to ask, which variables matter, or which planning blind spots they're walking past. You wouldn't diagnose your own chest pain by typing symptoms into a search bar and calling it a medical plan. Your financial life deserves the same standard.

The Part That Isn't a Knock on AI

I use AI every day. I'm building my practice around it, and I'm blessed to be starting with it from the beginning rather than worrying, as so many do, about being replaced by it. The technology is genuinely extraordinary for research, analysis, scenario modeling, and accelerating work that used to take advisors days.

But there's a difference between using AI as a tool inside a planning process and using AI as the planning process. The first one makes great advisors faster. The second one gives you a confident-sounding answer to a question nobody actually asked.

The Bottom Line

The prompt was easy. The output was fast. And for a business owner with real complexity, it was almost entirely useless. Prettied up to look like it wasn't. Lipstick on a pig, as my dad would say.

If your financial life fits in 35 words, an AI plan might work for you. If it doesn't, you probably already know that. You just needed someone to say it out loud.

Already asked AI for a financial plan? Send it to [email protected]. I'll read it and tell you what it missed. No pitch. No obligation. I'm genuinely curious what it got right and where the gaps are. That's the part the AI was never designed to catch.
