What you’ll learn:

This guide combines two essential aspects of prompting in Rocket:

  • The C.L.E.A.R. framework for writing strong, effective prompts.
  • Core prompting strategies like zero-shot, one-shot, few-shot, and chain-of-thought.

These tools help you go from simple tasks to deeply structured conversations - faster, smarter, and more reliably.


Why prompting principles matter

A prompt isn’t just a request - it’s your instruction manual for Rocket’s AI.
Strong prompts give you better structure, smarter behavior, and fewer surprises.

This guide introduces two core models that will level up how you prompt:

  • The C.L.E.A.R. framework for writing high-quality prompts.
  • A set of universal prompting strategies, including zero-shot, one-shot, few-shot, and chain-of-thought prompting.

The C.L.E.A.R. Framework

This framework outlines five qualities that make your prompt easier for Rocket to interpret and dramatically improve the quality of your output.

To show how this works in practice, we’ll use one example prompt across all five stages:

Goal: Create a user registration form.


1. Concise

Be brief but clear. Focus on what matters, and cut unnecessary wording.

Do:
Create a registration form with full name, email, and password fields.
Don’t:
I want a form that collects user info - like name or email.

Rocket parses prompts instantly. Extra words don’t help - it’s clarity that counts.


2. Logical

Group related elements. Organize the layout in the order a user would experience it.

Do:
Stack the inputs vertically: full name, email, password. Add a submit button at the bottom.
Don’t:
Add email and password fields, and also a button. Then include a name field.

Prompting out of sequence often leads to unexpected layouts or confusing flows.

Key term: Layout refers to how interface components are visually arranged. Logical order helps Rocket structure your app cleanly.


3. Explicit

Be specific about what you want to appear and how it should behave.

Do:
Add input placeholders, enable password toggle, and show validation errors when fields are empty.
Don’t:
Add a few inputs so users can register. It should work like a normal form.

Don’t leave details up to interpretation. The more precisely you describe the behavior, the more accurate the output.

Prompting works best when you imagine you’re writing instructions for a builder. Specific requests = precise results.
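
To make the contrast concrete, here’s a minimal sketch of the behavior the “Do” prompt pins down. It assumes Rocket outputs React-style TypeScript components, which may not match your project, and the component, state, and message names are purely illustrative.

```tsx
import { useState, type FormEvent } from "react";

// Sketch of the explicitly requested behavior: placeholders, a password
// visibility toggle, and validation errors when fields are left empty.
export function RegistrationForm() {
  const [values, setValues] = useState({ fullName: "", email: "", password: "" });
  const [errors, setErrors] = useState<string[]>([]);
  const [showPassword, setShowPassword] = useState(false);

  function handleSubmit(e: FormEvent) {
    e.preventDefault();
    // Surface a validation error for every empty field, as the prompt asks.
    const missing = Object.entries(values)
      .filter(([, value]) => value.trim() === "")
      .map(([field]) => `${field} is required`);
    setErrors(missing);
  }

  return (
    <form onSubmit={handleSubmit}>
      <input
        placeholder="Full name"
        value={values.fullName}
        onChange={(e) => setValues({ ...values, fullName: e.target.value })}
      />
      <input
        placeholder="Email"
        value={values.email}
        onChange={(e) => setValues({ ...values, email: e.target.value })}
      />
      <input
        placeholder="Password"
        type={showPassword ? "text" : "password"}
        value={values.password}
        onChange={(e) => setValues({ ...values, password: e.target.value })}
      />
      {/* Password visibility toggle */}
      <button type="button" onClick={() => setShowPassword((s) => !s)}>
        {showPassword ? "Hide" : "Show"} password
      </button>
      {errors.map((message) => (
        <p key={message}>{message}</p>
      ))}
      <button type="submit">Register</button>
    </form>
  );
}
```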


4. Adaptive

Tailor your prompt to the current goal - whether it’s design, logic, or debugging.

Do (Design):
Create a minimal registration form, centered on screen, with large inputs and white space.
Do (Logic):
When the form submits, validate inputs, store data in Supabase, and show a success message.
Do (Debug):
The “Register” button stays disabled even when fields are filled. What logic might be missing?
Don’t:
Build a form and connect it to the backend. Also make sure it’s styled well.

Being too general makes it harder for Rocket to know what to prioritize - layout, logic, or both.

Key term: Logic describes how elements behave - validation, interactions, and flows.
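
To ground the Logic example above, here’s a rough sketch of the kind of submit handler it describes, assuming the project uses the supabase-js client. The table and column names are assumptions for illustration, not something the prompt itself specifies.

```ts
import { createClient } from "@supabase/supabase-js";

// Assumed credentials and table name - replace with your own project's values.
const supabase = createClient("https://YOUR_PROJECT.supabase.co", "YOUR_ANON_KEY");

type Registration = { fullName: string; email: string; password: string };

// Mirrors the Logic prompt: validate the inputs, store the data in Supabase,
// then return a message the UI can show on success or failure.
export async function handleRegistration(form: Registration): Promise<string> {
  if (!form.fullName || !form.email || !form.password) {
    return "Please fill in all fields.";
  }

  // Note: in a real app, never store raw passwords; use Supabase Auth instead.
  const { error } = await supabase
    .from("registrations") // hypothetical table name
    .insert({ full_name: form.fullName, email: form.email });

  return error ? `Something went wrong: ${error.message}` : "Registration successful!";
}
```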


5. Reflective

Don’t settle for your first draft. Reflective prompting means reviewing, adjusting, or rephrasing based on results or what you learned from a previous attempt.

Do:
You tried a version that felt too vague, so you rewrote it:

Add a centered registration form with name, email, and password inputs. Validate all fields and show a success message after submit.
Don’t:
You kept using the same unclear prompt even after Rocket returned layout issues or missing behaviors.

Great prompting is iterative. If the output isn’t quite right, don’t just edit the result - rethink the instruction.

Reflection is what turns decent prompts into excellent ones. If the result wasn’t what you expected, revise the input - not just the app.


Prompting Strategies

Different prompts unlock different outcomes. Whether you’re exploring, instructing, or troubleshooting, the right strategy will get you there faster.


Zero-shot prompting

How it works:
You give Rocket one instruction, with no examples or setup. It uses context and training to respond.

Best used for:

  • Simple tasks
  • Common UI or logic
  • Fast one-off builds

Examples:
- Create a registration form with full name, email, and password.  
- Design a dashboard with a sidebar, a top bar, and three chart blocks.  
- When the form is submitted, save the data to Supabase.

One-shot prompting

How it works:
You give a single example to set expectations, then follow up with your actual task.

Best used for:

  • Showing structure or style
  • Setting format once for Rocket to follow
  • Lightweight customization

Examples:
- Provide a section layout for one feature, then ask for a second section in the same style.  
- Show how a card component should be styled, then generate another with different content.  
- Set up a single example of a success message, then reuse the format for different flows.

Few-shot prompting

How it works:
You show Rocket a few examples so it learns a pattern - then ask it to follow that structure.

Best used for:

  • Repeating layout patterns
  • Matching tone or behavior
  • Consistent UX or logic styles

Examples:
- Group form fields into sections with headers.  
- Add a testimonial block, then a pricing section, then a feature list.  
- Use icons next to each feature and a consistent layout for each section.

Chain-of-thought prompting

How it works:
You guide Rocket to reason step-by-step before giving the final output.

Best used for:

  • Logic-heavy prompts
  • Multi-step flows
  • Debugging and conditional logic

Examples:
- Let’s walk through this: Check if the user is logged in. If yes, show dashboard. If not, redirect to login.  
- First, generate the form. Then, add validation. Finally, link to the success screen.  
- Break this logic down - what needs to happen before the user sees a confirmation?
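
The first example above is a simple branching flow. Here’s a sketch of the equivalent logic in TypeScript, with hypothetical helpers (getCurrentUser, navigate) standing in for whatever auth and routing the app actually uses.

```ts
// Hypothetical helpers - swap in whatever auth and routing the app really uses.
declare function getCurrentUser(): Promise<{ id: string } | null>;
declare function navigate(path: string): void;

// The same steps the prompt walks through, in order:
// 1. Check whether the user is logged in.
// 2. If yes, show the dashboard.
// 3. If not, redirect to login.
export async function routeOnLoad(): Promise<void> {
  const user = await getCurrentUser();
  if (user) {
    navigate("/dashboard");
  } else {
    navigate("/login");
  }
}
```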

Chain-of-thought isn’t just for solving problems - it’s for teaching Rocket how to think through them.


Preventing hallucinations and inconsistencies

Even well-written prompts can generate off-track results. Here’s how to reduce drift, missing logic, or features you didn’t ask for.


1. Be specific about source data and constraints

Instead of:
Build a dashboard with all the user metrics.
Try:
Build a dashboard that shows active users, churn rate, and average session time from the `user_metrics` table.

Specific nouns (fields, collections, actions) reduce ambiguity and keep Rocket grounded.
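
For instance, because the “Try” prompt names a table and three fields, the resulting query is easy to verify. A rough sketch, assuming the data lives in Supabase and guessing column names from the prompt wording:

```ts
import { createClient } from "@supabase/supabase-js";

// Assumed credentials - replace with your own project's values.
const supabase = createClient("https://YOUR_PROJECT.supabase.co", "YOUR_ANON_KEY");

// Pull only the metrics the prompt names from the `user_metrics` table.
// The column names are assumptions inferred from the prompt wording.
export async function loadDashboardMetrics() {
  const { data, error } = await supabase
    .from("user_metrics")
    .select("active_users, churn_rate, avg_session_time");

  if (error) throw error;
  return data;
}
```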


2. Sequence complex logic into steps

Instead of:
Build a complete workflow that saves data and emails the user.
Try:
Step 1: Create a form with validation.  
Step 2: Save the data to the database.  
Step 3: Send a follow-up email.

Step-by-step instructions reduce skipped behavior and are easier to test.
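
Here’s a sketch of how those three steps might separate in code so each one can be checked on its own; the helper names and data shape are assumptions for illustration.

```ts
// Hypothetical helpers standing in for your real database and email calls.
declare function saveToDatabase(record: { name: string; email: string }): Promise<void>;
declare function sendFollowUpEmail(email: string): Promise<void>;

type SubmissionData = { name: string; email: string };

// Step 1: validation lives on its own, so it can be tested in isolation.
export function validateForm(form: SubmissionData): string[] {
  const errors: string[] = [];
  if (!form.name.trim()) errors.push("Name is required.");
  if (!form.email.includes("@")) errors.push("Email looks invalid.");
  return errors;
}

// Steps 2 and 3 run only after validation passes, in the order the prompt lists them.
export async function submit(form: SubmissionData): Promise<string[]> {
  const errors = validateForm(form);   // Step 1
  if (errors.length > 0) return errors;
  await saveToDatabase(form);          // Step 2
  await sendFollowUpEmail(form.email); // Step 3
  return [];
}
```

Because each step is its own function, a missing behavior - say, the follow-up email never sending - is easy to isolate and re-prompt for.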


3. Reflect and revise before re-prompting

If the output isn’t what you expect, don’t just rephrase - step back and clarify what Rocket misunderstood.

For example:
Review this screen’s logic. What assumptions did Rocket make that aren’t valid?

The more complex the task, the more your prompt becomes a design brief. Revising the instruction is often more effective than editing the output.