Make guide

How to use Make with systemprompt

Introduction

With systemprompt.io modules in Make, you can embed prompts with validated structured data as the output. This enables the use of chained prompts in scenarios, unlocking agentic flows and automations.

There is an easy-to-use graphical interface for creating and managing your prompts. Prompts can be defined in detail using state-of-the-art (SOTA) prompting techniques, static content can be saved and referenced, and the platform gives you control over the output data and visibility into usage.

We provide a comprehensive and growing library of prompts and scenarios to turbocharge your automations with reliable AI.

We are committed to our Make integration, enabling easy, no-code, AI-powered agentic workflows.

The use of this module and the provided prompts and scenarios requires a paid subscription on the ‘Execute’ plan (a 14-day free trial is available).

Connect SystemPrompt.io to Make

To establish the connection, you must:

(A) Obtain your API key from SystemPrompt.io.

(B) Establish the connection in Make.

(A) To obtain your API key from your SystemPrompt.io account:

  1. Log in to your systemprompt.io account (systemprompt.io login). If you are not a paid user, you will be asked to select the plan that best suits your needs; Make use cases require the ‘Execute’ plan, and there is a 14-day free trial that you can terminate at any time.
  2. Navigate to the dashboard.
  3. Click on API SETTINGS and copy the API key value shown.
  4. You will use this value in the API Key field in Make.

(B) Establish the connection in Make:

  1. Log in to your Make account, add a systemprompt.io module to your scenario, and click Create a connection.
  2. Optional: In the Connection name field, enter a name for the connection.
  3. In the API Key field, enter the API key copied above.
  4. Click Save.
  5. If prompted, authenticate your account and confirm access.

You have successfully established the connection. You can now edit your scenario and add systemprompt.io modules. If your connection requires reauthorization at any point, follow the connection renewal steps here.
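The API key you pasted into Make can also authenticate direct REST calls. Below is a minimal sketch of building such a request with Python's standard library; note that the base URL, path, and `api-key` header name here are assumptions for illustration, not documented values — check the systemprompt.io API documentation for the actual ones.

```python
import json
import urllib.request

API_KEY = "your-api-key"  # the value copied from API SETTINGS
BASE_URL = "https://api.systemprompt.io/v1"  # hypothetical base URL

def build_request(path: str, payload: dict) -> urllib.request.Request:
    """Builds an authenticated JSON POST request (header name is an assumption)."""
    return urllib.request.Request(
        url=f"{BASE_URL}{path}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"api-key": API_KEY, "Content-Type": "application/json"},
        method="POST",
    )

req = build_request("/prompt", {"title": "Birthday picker", "description": "Pick gifts"})
```

Inside Make, the connection handles this authentication for you; the sketch only shows what the module does on your behalf.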

Build SystemPrompt.io Scenarios

After connecting the app, you can perform the following actions:

Create Prompt
Creates a systemprompt.io prompt for embedding via API.

Execute Prompt
Executes a call to an LLM with structured request and response data.

Universal
Performs an arbitrary authorized API call.

Create Prompt

Create Prompt allows you to provide a title and a description of what you want your prompt to do; we then complete and autogenerate the full prompt in our system.

[
  {
    "name": "title",
    "label": "Title",
    "type": "text",
    "help": "The title of your prompt, our AI generation platform will do the rest",
    "required": true
  },
  {
    "name": "description",
    "label": "Description",
    "type": "text",
    "help": "The description of your prompt, our AI generation platform will do the rest",
    "required": true
  }
]

This will return a prompt ID that you can then use to execute.
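In a scenario, the prompt ID returned by Create Prompt is typically mapped into a subsequent Execute Prompt module. As a rough illustration of that chaining (the field names `id`, `promptId`, and `input` are assumptions for this sketch, not the modules' actual mapping keys):

```python
def chain_modules(create_result: dict, user_input: str) -> dict:
    """Maps the Create Prompt output into an Execute Prompt input,
    mirroring how Make passes data between chained modules."""
    return {
        "promptId": create_result["id"],  # ID returned by Create Prompt (assumed key)
        "input": user_input,              # the data for the LLM to process
    }

execute_input = chain_modules({"id": "prompt_123"}, "List of products to curate")
```

In the Make editor this mapping is done visually, by dragging the Create Prompt output pill into the Execute Prompt field.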

Your prompt is now available in the https://systemprompt.io/console dashboard for you to edit and test for more complex use cases.

For new users, we recommend starting by importing a prompt from our Prompt library: https://systemprompt.io/resource/prompt.

If you have a common use case that you can’t see in our library, reach out on our Discord channel and we may set up the prompt on your behalf.

Execute Prompt

After you have created or imported a prompt, you can execute it in your scenarios.

Executing prompts enables you to directly work with the result of the prompt after it has been processed by an LLM. In the Make scenario, you will receive the output of the prompt after it has been processed, with the structured data guaranteed to match the definition in your prompt entity.

For example, you might want to convert a list of products into a curated list of birthday presents for a target profile. You would define the output schema as:

{
  "type": "object",
  "$schema": "http://json-schema.org/draft-07/schema#",
  "required": ["products"],
  "properties": {
    "products": {
      "type": "array",
      "items": {
        "type": "object",
        "required": [
          "productLink",
          "title",
          "description",
          "price",
          "choiceMessage"
        ],
        "properties": {
          "price": {
            "type": "number",
            "description": "The price of the product."
          },
          "title": {
            "type": "string",
            "description": "The title of the product."
          },
          "description": {
            "type": "string",
            "description": "A brief description of the product."
          },
          "productLink": {
            "type": "string",
            "format": "uri",
            "description": "A URL link to the product page."
          },
          "choiceMessage": {
            "type": "string",
            "description": "A message explaining why this product was chosen."
          }
        }
      },
      "maxItems": 5,
      "minItems": 5
    }
  }
}

This means that you are guaranteed to receive a response that conforms to the defined schema and can be used in your scenario.
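To make the guarantee concrete, here is a stdlib-only sketch of what “conforms to the defined schema” means for the example above: exactly five products, each with all required fields of the right type. (A real validation setup would use a full JSON Schema validator such as the third-party `jsonschema` package; this is only an illustration.)

```python
# Required fields and their expected Python types, per the schema above.
REQUIRED = {
    "productLink": str,
    "title": str,
    "description": str,
    "price": (int, float),
    "choiceMessage": str,
}

def conforms(response: dict) -> bool:
    """Checks a curated-products response against the schema's constraints:
    a 'products' array of exactly 5 items, each with all required fields."""
    products = response.get("products")
    if not isinstance(products, list) or len(products) != 5:
        return False  # minItems/maxItems: exactly 5
    return all(
        isinstance(p, dict)
        and all(isinstance(p.get(key), typ) for key, typ in REQUIRED.items())
        for p in products
    )

sample = {"products": [{"productLink": "https://example.com/p", "title": "Gift",
                        "description": "A gift.", "price": 9.99,
                        "choiceMessage": "Fits the profile."}] * 5}
```

systemprompt.io performs this validation for you before the response reaches your scenario, so your downstream modules can map the fields without defensive checks.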

Don’t worry if you aren’t comfortable creating JSON Schemas, as with everything in systemprompt, “there is a prompt for that” and we have automations and flows that will do the technical work on your behalf.

This can be done in our dashboard, or you can get help on our Discord channel.

Universal

For advanced users, we provide an OpenAPI-compliant Swagger schema for our API, opening up a world of use cases on the systemprompt.io platform.
