AI Tools Integration

Building with AI-powered editors like Cursor, Windsurf, or GitHub Copilot? Mock66 is designed to be AI-friendly.

If your AI assistant keeps "hallucinating" incorrect API endpoints or random JSON structures, you simply need to give it the right context. This guide provides optimized System Prompts and Context Snippets you can feed your AI.

The Golden Context

Stop the Hallucinations

To get accurate code generation, paste this block into your AI's chat window or save it as a rule file in your IDE (e.g., .cursorrules). It teaches the AI exactly how Mock66 works.

Copy & Paste into AI Context
# Mock66 API Documentation for AI Context

You are integrating with Mock66, a dynamic mock API generator.
Use the following specifications when writing code:

1. Base URL Pattern:
   https://{PROJECT_ID}.mock66.dev/{ENDPOINT_PATH}
   (Example: https://proj_123.mock66.dev/users/12345)

2. Authentication: 
   - Public Projects: No headers required.
   - Private Projects: Must include header "x-mock66-api-key: {YOUR_API_KEY}".

3. Response Format: 
   - Always returns standard JSON.
   - HTTP 200: Success (returns generated data).
   - HTTP 403: Project Inactive or Invalid Key.
   - HTTP 404: Endpoint not found.
   - HTTP 500: Server error.
   - HTTP XXX: Custom status codes as configured.

4. Workflow:
   - I define "Schemas" in the Mock66 dashboard.
   - I call the endpoint to get data generated based on those schemas.
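
With that context in place, requests from your own code follow the same pattern. Below is a minimal sketch of calling a private project with fetch; proj_123, the /users path, and YOUR_API_KEY are placeholders, not real values.

Example Request (TypeScript)
// Minimal sketch: call an endpoint on a private Mock66 project.
// Replace "proj_123", "/users", and "YOUR_API_KEY" with your own values.
async function fetchUsers(): Promise<unknown> {
  const response = await fetch("https://proj_123.mock66.dev/users", {
    headers: { "x-mock66-api-key": "YOUR_API_KEY" },
  });

  if (!response.ok) {
    // 403: project inactive or invalid key; 404: endpoint not found
    throw new Error(`Mock66 request failed: ${response.status}`);
  }

  return response.json();
}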

Generating Schemas

Don't write complex JSON schemas by hand. Describe your business logic to the AI and ask it to generate the schema for you to paste into the Mock66 dashboard.

  1. Define the Prompt

    Use this prompt to get a clean JSON object ready for Mock66.

    Prompt Template
    I need to configure a Mock66 endpoint for a {Feature Name}.
    Please generate a robust JSON Schema for a {Data Type}.
    
    Requirements:
    - Include fields: {List fields, e.g., id, name, status, avatar}.
    - Use realistic data types (uuids for IDs, internet names for usernames).
    - The 'status' field should randomly select from: ['active', 'pending', 'banned'].
    - Output ONLY the valid JSON object.
  2. Apply to Mock66

    Copy the JSON output from the AI and paste it into the Schema Editor in your Mock66 project.
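
For reference, the AI's reply to that prompt is typically a plain JSON Schema object along these lines (an illustrative sketch only; the fields are the ones from the template, and the Schema Builder page describes the exact syntax Mock66 expects):

Example AI Output
{
  "type": "object",
  "properties": {
    "id": { "type": "string", "format": "uuid" },
    "name": { "type": "string" },
    "status": { "type": "string", "enum": ["active", "pending", "banned"] },
    "avatar": { "type": "string", "format": "uri" }
  },
  "required": ["id", "name", "status", "avatar"]
}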

Fetching Data

Ask your AI to write the boilerplate data-fetching code. Since it already has the "Golden Context" from the first section, it will get the URL structure correct automatically.

React Hook Prompt
Create a reusable React hook called use{ResourceName}.

1. It should fetch data from my Mock66 project: {PROJECT_ID}.
2. The endpoint path is /{endpoint_name}.
3. Use fetch (or axios).
4. Handle standard loading and error states.
5. Define a TypeScript interface for the response based on this data shape:
   {Paste a small example of your expected JSON here}
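
With the Golden Context loaded, the hook the AI returns usually looks something like this sketch (illustrative only; the resource name, fields, and project ID proj_123 are placeholders, and it assumes a public project that needs no API key header):

Example Generated Hook (TypeScript)
import { useEffect, useState } from "react";

// Illustrative response shape; replace the fields with your own schema's output.
interface User {
  id: string;
  name: string;
  status: "active" | "pending" | "banned";
}

export function useUsers() {
  const [data, setData] = useState<User[] | null>(null);
  const [loading, setLoading] = useState(true);
  const [error, setError] = useState<Error | null>(null);

  useEffect(() => {
    let cancelled = false;

    fetch("https://proj_123.mock66.dev/users")
      .then((res) => {
        if (!res.ok) throw new Error(`Request failed: ${res.status}`);
        return res.json() as Promise<User[]>;
      })
      .then((users) => {
        if (!cancelled) setData(users);
      })
      .catch((err) => {
        if (!cancelled) setError(err as Error);
      })
      .finally(() => {
        if (!cancelled) setLoading(false);
      });

    // Ignore late responses if the component unmounts mid-request
    return () => {
      cancelled = true;
    };
  }, []);

  return { data, loading, error };
}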

Writing Tests

The AI Superpower

AI excels at writing integration tests when given a stable mock. Use Mock66 as the deterministic backend for your tests.

Testing Prompt
Write an integration test for my UserProfile component using React Testing Library.

- The component makes a GET request to Mock66.
- Mock the network request to return this specific JSON:
  {Paste your Mock66 schema output here}
- Test that the user's name is rendered and that the loading skeleton appears while fetching.
Tip: If you use Cursor, you can highlight your component code and the prompt above, then hit Cmd+K to generate the test file in seconds.
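
For reference, the generated test usually lands close to the sketch below (illustrative only; the UserProfile import path, the loading-skeleton test ID, and the fixture values are assumptions, and the toBeInTheDocument matcher comes from @testing-library/jest-dom):

Example Generated Test (TypeScript)
import { render, screen } from "@testing-library/react";
import { UserProfile } from "./UserProfile";

// Fixture mirroring the Mock66 schema output pasted into the prompt.
const mockUser = { id: "u_1", name: "Ada Lovelace", status: "active" };

beforeEach(() => {
  // Stub the network call so the test never hits the real endpoint
  jest.spyOn(global, "fetch").mockResolvedValue({
    ok: true,
    json: async () => mockUser,
  } as Response);
});

afterEach(() => {
  jest.restoreAllMocks();
});

test("renders the loading skeleton, then the user's name", async () => {
  render(<UserProfile />);

  // Skeleton shows while the mocked request is in flight
  expect(screen.getByTestId("loading-skeleton")).toBeInTheDocument();

  // Name appears once the mocked response resolves
  expect(await screen.findByText("Ada Lovelace")).toBeInTheDocument();
});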
