
Tomorrow Prompt

Forward-looking coverage of emerging AI capabilities — what you'll be able to prompt for tomorrow that you can't today.


Tomorrow Prompt 🔮

The future doesn't announce itself. It arrives mid-sentence.

One day you're typing prompts into a chatbot. The next, the chatbot is booking your flights, filing your taxes, and negotiating your cable bill — without being asked. That shift didn't happen overnight. But it happened faster than anyone predicted.

Tomorrow Prompt tracks the bleeding edge of AI capability — not the hype, not the fear, but the actual technology arriving in the next 1-5 years and what it means for how we live and work.


The State of AI — March 2026

| Capability | Status | Maturity |
| --- | --- | --- |
| Text generation (essays, code, emails) | Production-ready | 🟢 Mature |
| Image generation (photos, art, design) | Production-ready | 🟢 Mature |
| Video generation (short clips, ads) | Early production | 🟡 Emerging |
| Autonomous web agents (browse, click, transact) | Limited production | 🟡 Emerging |
| Real-time voice conversation | Production-ready | 🟢 Mature |
| Scientific discovery (protein folding, materials) | Research breakthrough | 🟡 Emerging |
| Robotics (physical world interaction) | Early prototype | 🔴 Nascent |
| Artificial General Intelligence (AGI) | Undefined / contested | 🔴 Theoretical |

What You'll Find Here

  • 📡 The Capability Tracker — What AI can do today vs. what's coming in 6, 12, and 24 months
  • 🔧 Tools & Platforms — The frontier tools most people don't know about yet
  • ⚡ Industry Disruption Map — Which industries are being reshaped and how fast
  • 🌊 Deep Dives — Long-form analysis of specific emerging technologies
  • ❓ FAQ — Separating signal from noise in AI predictions

The Three Waves of AI (2020-2030)

Wave 1: Generation (2020-2024)

AI learned to create. Text, images, code, music. The output was impressive, but the AI had no agency — it waited for your prompt and produced content. The human was the operator.

Wave 2: Action (2025-2027)

AI learned to do. Browse the web, fill forms, execute multi-step tasks, use tools. The AI gained agency but within narrow boundaries. The human is the supervisor.

Wave 3: Orchestration (2028-2030+)

AI learns to coordinate. Multiple AI agents working together, negotiating with other agents, managing complex projects with minimal human oversight. The human is the goal-setter.

| Wave | Example Prompt | AI Response |
| --- | --- | --- |
| Wave 1 | "Write a marketing email" | Produces the email text |
| Wave 2 | "Send a marketing email to our Q1 leads" | Writes, formats, and sends the email via your email service provider (ESP) |
| Wave 3 | "Increase Q1 lead conversion by 15%" | Designs the campaign, A/B tests subject lines, segments the audience, sends the emails, monitors results, and adjusts strategy |

Five Technologies to Watch in 2026

1. Autonomous AI Agents

What: AI that can operate a computer as well as a human — click, type, navigate, multi-task.

Who: OpenAI (Operator), Anthropic (Computer Use), Google (Mariner).

When: Now, with guardrails. Fully autonomous by late 2027.

Impact: Administrative and research jobs transform first. Every knowledge worker gets a "digital intern."
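Under the hood, these agents run a simple loop: observe the screen, decide on the next step toward the goal, act, repeat. Here's a minimal sketch of that loop — every function and the toy "environment" below are stand-ins invented for illustration, not any vendor's real API:

```python
# Toy sketch of an autonomous agent's observe -> decide -> act loop.
# The "environment" is just a dict and the "policy" a hard-coded rule;
# real agents replace choose_action with a model call and perform with
# actual browser clicks and keystrokes.

def observe(env):
    """Stand-in for taking a screenshot / reading page state."""
    return {"form_filled": env["form_filled"], "submitted": env["submitted"]}

def choose_action(goal, observation):
    """Stand-in for the model picking the next step toward the goal."""
    if not observation["form_filled"]:
        return {"type": "type", "field": "email", "value": "hi@example.com"}
    if not observation["submitted"]:
        return {"type": "click", "target": "submit"}
    return {"type": "done"}

def perform(action, env):
    """Stand-in for executing a click or keystroke in a browser."""
    if action["type"] == "type":
        env["form_filled"] = True
    elif action["type"] == "click":
        env["submitted"] = True

def run_agent(goal, env, max_steps=10):
    trace = []
    for _ in range(max_steps):
        action = choose_action(goal, observe(env))
        if action["type"] == "done":
            break
        perform(action, env)
        trace.append(action["type"])
    return trace

env = {"form_filled": False, "submitted": False}
print(run_agent("sign up for the newsletter", env))  # ['type', 'click']
```

The `max_steps` cap is the simplest form of the "guardrails" mentioned above: the agent can't loop forever, no matter what it decides.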

2. Multimodal Reasoning

What: AI that seamlessly combines text, image, audio, and video understanding.

Who: Google (Gemini 2.0), OpenAI (GPT-5 rumoured), Meta (Llama 4).

When: Shipping now. Major quality jump expected H2 2026.

Impact: AI can understand a photo of your whiteboard, a recording of your meeting, and a PDF of your requirements — all in one conversation.

3. On-Device AI

What: Powerful AI models running locally on phones, laptops, and edge devices.

Who: Apple (Apple Intelligence), Qualcomm (Snapdragon NPU), Google (Gemini Nano).

When: Shipping now. By 2027, on-device models are expected to match the performance of 2024's cloud models.

Impact: Privacy-first AI. Works offline. Zero latency. Your data never leaves your device.

4. AI-to-AI Protocols

What: Standardised ways for AI agents to communicate with each other and with software tools.

Who: Anthropic (Model Context Protocol), OpenAI (function calling), Google (Agent2Agent).

When: MCP adoption accelerating now. Universal standards by 2028.

Impact: Your AI assistant can talk directly to your bank's AI, your doctor's AI, your employer's AI — without you as the middleman.
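The common idea across these protocols is tool discovery: a tool advertises its name, purpose, and a JSON Schema describing its inputs, so any agent can find it and call it correctly. The sketch below shows that general shape — the `check_balance` tool, its fields, and the fake balance are all illustrative, not copied from the MCP or function-calling specs:

```python
import json

# Illustrative tool definition in the rough shape shared by function
# calling and MCP: a name, a human-readable description, and a JSON
# Schema for the arguments. Nothing here is a real bank API.
CHECK_BALANCE_TOOL = {
    "name": "check_balance",
    "description": "Return the current balance of an account.",
    "parameters": {
        "type": "object",
        "properties": {
            "account_id": {"type": "string"},
            "currency": {"type": "string", "enum": ["GBP", "USD", "EUR"]},
        },
        "required": ["account_id"],
    },
}

def handle_call(tool, arguments_json):
    """Toy dispatcher: check required fields, then 'execute' the tool."""
    args = json.loads(arguments_json)
    missing = [k for k in tool["parameters"]["required"] if k not in args]
    if missing:
        return {"error": f"missing arguments: {missing}"}
    return {"account_id": args["account_id"], "balance": 1234.56}  # fake result

print(handle_call(CHECK_BALANCE_TOOL, '{"account_id": "acct-42"}'))
```

Because the schema travels with the tool, your assistant doesn't need to be pre-programmed for your bank — it reads the description, fills in the arguments, and calls.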

5. Synthetic Data & Simulation

What: AI-generated data used to train other AI systems and simulate scenarios.

Who: NVIDIA (Omniverse), Google DeepMind, Anthropic research.

When: Already used heavily in training. Consumer applications by 2027.

Impact: AI can simulate a year of business operations in minutes. Strategic planning becomes experimental, not speculative.
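A toy version of that idea: generate a year of synthetic daily sales under two pricing strategies and compare revenue. Real pipelines use learned generative models; the hand-written noisy demand curve below is just a stand-in so the sketch runs anywhere:

```python
import random

def simulate_year(price, seed=0):
    """Simulate 365 days of synthetic sales at a given price.

    Demand is a made-up noisy linear curve (100 - 4 * price plus
    Gaussian noise) — purely illustrative, not fitted to any data.
    """
    rng = random.Random(seed)  # fixed seed keeps runs reproducible
    total = 0.0
    for _ in range(365):
        demand = max(0.0, 100 - 4 * price + rng.gauss(0, 5))
        total += demand * price
    return total

# "Experiment" with two strategies in milliseconds instead of a year.
print(f"price 10 -> revenue {simulate_year(price=10):,.0f}")
print(f"price 12 -> revenue {simulate_year(price=12):,.0f}")
```

That's the "experimental, not speculative" shift in miniature: instead of arguing about which price is better, you run both futures and look.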


The Big Question

Will AI take my job?

The honest answer, based on everything we track:

  • Some jobs will disappear. Mostly roles that are 90%+ routine information processing.
  • Many more jobs will transform. The work stays, the tools change dramatically.
  • New jobs will appear that don't exist yet. Prompt engineering is the early example, but it's the tip of the iceberg.
  • The transition speed is the real variable. Technology is ready faster than regulation, education, and culture can adapt.

Tomorrow Prompt — watching the future arrive, one capability at a time. 🔮