This document outlines the standard practices and naming conventions that all app developers should adhere to in order to maintain consistency within Make.
You write and edit the custom app configuration with the VS Code extension as you would edit a JSON file on your PC. The custom app configuration is downloaded to your PC when you open it from VS Code and uploaded to Make when you save it. The custom app configuration is stored on your PC as temporary files. The temporary files are deleted after you close the configuration.
The custom app configuration is associated with your account in Make. Communication between the VS Code extension and your Make account is authorized with an API key. To set up the VS Code extension, you must provide an API key with the appropriate API key scopes. Generate a new Make API key if you don't have one:
Make supports writing custom apps in Visual Studio Code (VS Code) with the Make Apps Editor extension. You can get the Make Apps Editor extension from the Visual Studio Code Marketplace or install it from the Extensions tab in VS Code.
The Make UI allows you to browse example apps and the custom apps you previously created.
When developing an app, you can work with elements such as base, connections, webhooks, modules, RPCs, and custom IML functions.
Discover how to develop apps using the available features in the Make UI:
The VS Code extension supports all tasks available in the Make UI. Additionally, it empowers developers with the following capabilities:
Importing & exporting the apps
Cloning components within the app
Support for apps' local development, enabling:
pulling updates or new app elements to the local workspace and a repository
collaborative development
control over code contributions
Learn how to get the most out of the VS Code extension:
The Apps platform documentation is divided into topic groups. The topic groups are structured the same way as you would navigate inside the Apps builder in Make.
The section APP STRUCTURE describes how to define and configure the basic elements of a custom app. The basic custom app elements are listed in the top menu in the custom app configuration, for example: Base, Communication, Modules.
Every structure element is divided into blocks. Some of these blocks appear in multiple places and share the same purpose; others do not.
COMPONENTS are like bricks for APP BLOCKS. A block doesn't exist without components; multiple components form a block.
Pick a brick that suits your needs and use it. You can use multiple separate components or combine them together by using the `nested` property. Play around and find the best combination of components for you!
When a call requires you to provide a name and an email, you would use parameters with `"type": "email"` and `"type": "text"`.
When a call requires a name in case X and an email in case Y, you would use a parameter with `"type": "select"` and nested parameters `"type": "email"` / `"type": "text"` under each option.
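As an illustrative sketch of the second case (the parameter names, values, and labels here are invented for the example), a select parameter with nested parameters under each option could look like this:

```jsonc
[
    {
        "name": "caseSelector",
        "type": "select",
        "label": "Case",
        "options": [
            {
                "label": "Case X",
                "value": "x",
                // Shown only when "Case X" is selected.
                "nested": [
                    { "name": "name", "type": "text", "label": "Name" }
                ]
            },
            {
                "label": "Case Y",
                "value": "y",
                // Shown only when "Case Y" is selected.
                "nested": [
                    { "name": "email", "type": "email", "label": "Email" }
                ]
            }
        ]
    }
]
```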
When you create your app in the Make platform, you are using the web code editor. The web code editor has a bunch of handy features to make your app development easier:
Hints with links to the apps platform documentation with more details.
Code reformat button with a label showing the data format. The available editor box data formats are:
jsonc
json
javascript
markdown
Context-aware code suggestions for the `jsonc` data format. The web code editor gives you hints about which parameters or properties you can add to the custom app code. The code suggestions work even for RPCs.
Hints displayed when hovering over the custom app configuration.
Custom app code validation for the `jsonc` data format. If you misspell a code property name, or if you place a property in an invalid code section, the web code editor highlights the error.
One button to save changes in all editor boxes on the page.
Controls to expand and collapse the code. To view the controls, hover over the space between your code and editor line numbers.
The editor highlights the IML syntax so it's distinct from the rest of the code.
The web code editor is built on the same editor technology as Visual Studio Code. If you know your way around Visual Studio Code, you will find that some keybindings also work in the web code editor. Some of the most useful web code editor keybindings are:
For Windows users:
Ctrl + Shift + H: shows a cheat sheet with selected shortcuts
Ctrl + S: save your changes to all of the editor boxes on the page
Ctrl + F: search in the editor box content
Ctrl + H: search and replace in the editor box content
Ctrl + Space: show a list of code suggestions valid in the current cursor context
Shift + Alt + F: format code
Ctrl + /: toggle line comments
F1: display web code editor command and keybindings reference
For macOS users:
Ctrl + Shift + H: shows a cheat sheet with selected shortcuts
Cmd + S: save your changes to all of the editor boxes on the page
Cmd + F: search in the editor box content
Opt + Cmd + F: search and replace in the editor box content
Ctrl + Space: show a list of code suggestions valid in the current cursor context
Shift + Opt + F: format code
F1: display web code editor command and keybindings reference
Make sure your app icon meets the requirements outlined in the article below.
To view, set, or change the icon of the app, right-click the app name and choose the Edit icon option.
A new view will appear and you will see the preview of the current app's icon inside the app module.
You can change the icon by clicking the Change icon button. The file upload dialog will appear, and after you confirm the chosen icon, it will be uploaded to Make. The change icon view will close and the new icon will appear in the left tree.
Click the Make icon on the VS Code sidebar. Clicking the Make icon activates the Make Apps Editor. You get notified in a pop-up window that you haven't yet set up a development environment.
Click the Add environment button to launch the environment setup, or execute the `>Make Apps: Add SDK Environment` command from the command palette.
Fill in the API URL in the next pop-up window. The API URL depends on your Make zone. For example, the US1 Make zone has the API URL `us1.make.com/api`.
If the app you want to access originates from a different zone than your account, enter the app's zone to access its content.
Fill in the label for the environment in the next pop-up window. Press Enter to confirm the environment label.
The Make Apps Editor extension restarts with the environment configuration.
A new sidebar appears in VS Code with a list of your custom apps and Make open-source apps. If you previously created any, your custom apps are listed in the My apps block. The Make open-source apps are listed in the Open source apps field at the bottom of the VS Code sidebar.
The open-source apps' code is only available in the EU1 zone.
If your zone is different and you want to access their code, create a new environment with the `eu1.make.com/api` environment URL by following the steps starting from step 5.
In Make Apps Editor, you can work across multiple environments.
Easily identify the active environment by checking the indicator in the bottom status bar.
Upon clicking the environment indicator, you can switch among multiple environments.
You can add another environment by issuing the `>Add SDK environment` command again (follow the steps starting from step 5).
You can add extra settings in Extensions > Make Apps Editor > Extension Settings > settings.json file.
Here are some settings for better performance and experience:
Set `editor.formatOnSave` to `true` in VS Code settings. Source code will be formatted automatically when you save it.
Set `editor.quickSuggestions.strings` to `true` in VS Code settings. Keyword recommendations will automatically show up while you're typing in IML strings too.
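In `settings.json`, the two recommendations above can be expressed roughly as follows (note that `quickSuggestions` is an object setting, so the `strings` flag is nested inside it):

```jsonc
{
    // Format source code automatically on save.
    "editor.formatOnSave": true,
    // Show suggestions while typing inside strings (useful for IML strings).
    "editor.quickSuggestions": {
        "strings": true
    }
}
```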
You can log out using the `>Logout` command and log back in with the `>Login` command.
When you log out, the API key is removed from the `settings.json` file. To log in again, you will need to enter your API key.
Make Custom Apps documentation is a guide for developers looking to create their own apps for themselves or others to use on the Make platform. This documentation will walk you through how to use Make Apps Editor in Make UI and in Visual Studio Code to create and manage those custom apps, as well as best practices and common approaches for development.
When there is a service that you want to use in Make but the service is not yet available in Make, use the apps builder to create a custom app. The only requirement is that the service has to have an API.
In the apps builder, you write down a JSON configuration. This configuration is then used by the Make platform to generate all connections and app modules for you. If you are working on a complicated custom app, you can write a custom IML function with JavaScript.
There are two options directly supported by Make to write the custom app configuration:
The web interface of your Make account instance.
The Visual Studio Code (VS Code) extension.
The benefits of using VS Code over the Make account web interface include:
first-class support for JSON format, like syntax highlighting and completion,
automatic checking of the JSON configuration validity, notably in terms of parameter type checking and correct object context,
predefined project structure for every custom app you create.
If you want to write the custom app configuration in the Make web interface, navigate to Custom apps in the left sidebar menu of your Make account. You might have to click on the three dots at the bottom of the left sidebar to view the Custom apps option.
All modules can be tested directly in scenarios.
Changes to communication configs are immediately active.
You can see raw requests/responses in your browser's console.
Changes in parameters and interface require you to reload the scenario editor page.
We use JSONC (JSON with comments) in all sections except common data.
The Make Apps Editor VS Code extension contains support for Git.
To start editing the source code, find the item you want to edit in the left menu and click it. A new editor will appear and the current source will be downloaded from Make. You can edit it as a normal file. If your app contains some RPCs or IML functions, they will be provided to you.
After pressing the shortcut Ctrl+S, the source code will be automatically uploaded back to Make.
To add a new item, such as a module, connection, webhook, etc., right-click the corresponding folder and click the New <item> option.
A prompt will appear asking you to fill in information about the newly created item. Just go through it. Your new items will always appear under the corresponding folder.
To edit metadata (for example to change a label of a module), right-click the desired item and select the Edit metadata option from the menu.
The prompt will appear allowing you to change allowed values. If you don't want to change a value, skip the field by pressing the Enter key.
To change the attached connection or webhook of an item, right-click the item and select the Change connection (or webhook) option from the menu.
The prompt will appear allowing you to change the connection or webhook. If possible, there will also be an option to unassign the current connection without assigning a new one.
Also, a prompt to assign an alternative (secondary) connection will appear. Please note that the alternative connection should not be the same as the main connection. You can leave it empty.
To delete an item, right-click it and choose the Delete option.
You will be asked to confirm the deletion. If you answer Yes, the item will be deleted from the app.
Utilize the Interface Generator to generate an interface. Familiarize yourself with the Generator's functionality by following the steps described in the Interface Generator article below.
Hovering over shows their docs.
Hovering over provides a link to the function definition.
For a full reference of the web code editor keybindings, check the official and the official VS Code keybindings cards:
Install the Make Apps Editor. You can get the Make Apps Editor from the Visual Studio Code Marketplace or install it from the Extensions tab in VS Code.
Copy your Make API key in the last pop-up window. If you don't have an API key, follow the procedure to .
If you are new to custom apps on Make or want to refresh your knowledge of app development, enroll in the . It has 39 lessons and 4.5 hours of video content where you can learn all aspects of custom app development on Make.
Follow the instructions to configure the VS Code extension.
If you are developing a custom app for the first time, check out the section first.
Learn how to utilize feature to develop .
Deleting items is only possible in private apps. Once an app is published, the capability to delete items within the app is disabled. For further details regarding apps' visibility and the deletion of items, please refer to .
Click on the API tab.
Click the Add token button.
Set API scopes for the API key.
The required scopes for the Make Apps Editor are:
sdk-apps:read
sdk-apps:write
Click the Save button to confirm the selected permissions.
Copy your new API key to your clipboard and store it in a safe place. This is the only time you can see the whole API key. The API key is required to set up the VS Code extension.
You have created a new Make API key. All your Make API keys are listed in your Make profile in the API tab. In the API tab, you can view permissions for all your keys and delete unused Make API keys.
Navigate to in Make.
Add more API scopes to your API key if you want to use them. The description of the API scopes is documented in the .
Unlike online development within Make Apps Editor or web code editor, local development doesn't provide access to advanced features such as prefilled code templates, IML object suggestions, and seamless integration with Scenario Builder for continuous testing.
Instead, it offers a self-contained environment for app creation and modification, ideal for situations where internet connectivity is unreliable or where comprehensive testing within Scenario Builder isn't required. Additionally, local development allows synchronization with Git repositories and provides full search capabilities across the entire codebase.
If you don't have an app cloned yet, do so by following this manual:
The structure of the app is very similar to the one in Make Apps Editor. Each group of components, such as RPCs, modules, or custom IML functions, contains the component directories with corresponding code files.
If you need to edit the current code, click the file you want to edit and develop your changes in the opened tab. Save the changes.
If you need to create a new component, right-click the corresponding components' folder and select New Local Component: <component's name> (beta), for example, New Local Component: Module (beta).
Follow the dialog and enter the component's name, label, and other corresponding parameters as you are used to from Make Apps Editor.
Once all the questions are answered in the dialog, a new component with the corresponding files is created.
Write the code in the corresponding files and save the changes.
The changes in components or new components can be partially deployed to Make, or the app as a whole can be deployed to Make.
Follow the instruction below that fits your case.
In the dialog, select the origin where the changes should be deployed.
The changes or new components are now available in the Make app. The new app version can be thoroughly tested in Scenario Builder. If you utilize testing and production apps, you can deploy the changes or new components to the production, after the testing app passes the testing phase.
Please be advised that this feature is in beta, meaning it may exhibit occasional bugs or inconsistencies, so proceed with an awareness of potential functionality limitations.
To start the development of an app in a local directory or git repository, or start tracking changes in your app, you need to clone the Make app to the local workspace.
First, you need to open the folder where you intend to store the app, in Visual Studio Code.
Once the repository in Visual Studio Code is set, you can proceed to cloning the app to the local folder.
In the opened window of Visual Studio Code, go to Make Apps Editor and right-click the app you wish to save to your repository. Select Clone to Local Folder (beta).
Read the text in the dialog window and confirm reading by clicking Continue.
Enter the workspace subdirectory name where the app should be cloned to. If the subdirectory doesn't exist yet, it will be created. The default subdirectory is set to `src`. Press Enter.
Exclude (more secure) - Select, if your app contains sensitive data, such as Client ID and Secret. Common data will not be stored in your local workspace or a repository.
Include (for advanced users only) - Select, if you want to store the common data in your local workspace or a repository. Be aware that storing common data outside of Make could potentially expose the app to vulnerabilities.
The app is now cloned to the local folder.
To properly start the versioning of your app in the git repository, follow the steps below:
Go to the GitHub Desktop app and open the repository where you deployed the current version of your app. You will see a list of new files.
Enter the Summary of the commit and click Commit to main.
Now, the first version of your app is logged. Every subsequent edit or new component in the app will be considered a new change.
Optionally, click Publish branch.
Whenever you create a new package of changes, you can commit them to create a new version in the Git repository.
Go to the GitHub Desktop app and open the repository with the latest version of your app. You should see a list of new changes.
Enter the Summary, optionally the Description of the commit, and click Commit to main.
Now, the new version of your app is logged.
Please be advised that this feature is in beta, meaning it may exhibit occasional bugs or inconsistencies, so proceed with an awareness of potential functionality limitations.
One feature available to developers is the "Compare with Make" functionality. This tool allows developers to compare the code in their local app with the code in their Make app.
By leveraging this feature, developers can easily identify differences, track changes, and ensure consistency between their local and Make apps.
To compare the code of a local component with a remote component, right-click the local component block and select Compare with Make (beta).
After clicking, a view will appear with the local code on the right side and the remote code on the left side.
Once you finish the development of new changes and components in the Make app, you can pull the changes into the local app. To do so, follow the steps below.
Go to Visual Studio Code and open the local directory with the local app file.
Locate the makecomapp.json file and right-click it. In the pop-up menu, select the Pull All Components from Make (beta) option.
In the dialog, select the testing app origin.
All changes in the existing components will be pulled into the local components. If there is a new component, a dialog will appear. Either confirm the creation of the component by pressing Enter or click Ignore permanently/do not map with the remote option.
This tutorial will guide you through the steps to create a custom app with a module. You will use the Make "Virtual Library Demo API."
If you want to test the Virtual Library Demo API, click the link:
A new tab opens in your web browser with the API response in JSON format:
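For illustration only, the response has a shape along these lines (the actual payload may differ):

```jsonc
{
    "result": "Hello, world!"
}
```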
The development of a custom app in Make is divided into several steps. These steps are described in detail in the following subpages.
In the app development process, there are situations where it can be highly beneficial to push changes to multiple app origins. This is especially useful when managing different versions of an app, such as maintaining both a testing and a production app.
In this manual, the steps to create a new app origin for a testing app version are described.
If you haven't created a testing app in Make, do so by creating a new app in Make and naming it `<App Label> Testing` so it is obvious it is the testing version.
Go to the makecomapp.json file in the local app repository. Locate the `origins` array and add a new item by copying the code below and editing the values as instructed under the code.
If the local app is shared among multiple developers, the `origins` array will contain a record for each connection between a local and Make (remote) app. Do not edit the current records, to prevent breaking the connections.
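A new `origins` item is sketched below; all values are placeholders that you must replace with your own (see the key descriptions that follow):

```jsonc
{
    // Placeholder values - replace with your own.
    "label": "My App Testing",
    "baseUrl": "https://eu1.make.com/api",
    "appId": "my-app-testing"
}
```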
`label` - the label of the local origin
`baseUrl` - the URL of the origin's zone
`appId` - the name of the app
Save the changes in the makecomapp.json file.
When you want to create a custom app, you have to set up the custom app metadata, such as:
a unique name of the custom app
custom app label that Make displays in the scenario editor
color of the app's modules
app icon
This page will guide you through these settings.
Navigate to Custom apps in the left sidebar menu in your Make account. You might have to click on the three dots at the bottom of the left sidebar to view the Custom apps option.
You will see a list of sample custom apps. Click Create a new app in the top right corner.
In the Create an app pop-up, fill in:
Name: The unique identifier of your custom app. Make uses the custom app name internally. Note the requirements under the text box.
Label: The custom app label. Make will display the custom app label in the scenario editor. You can use any characters in the custom app label.
Description: Description of your custom app. This field is optional.
Language: The language of your custom app and its descriptions.
Audience: This parameter has no effect currently. The custom app is accessible regardless of the user's country.
App icon: Upload an icon. The icon upload is optional. Make will use this icon to create your custom app logo. The icon has to:
be a file in `.png` format
have 512 x 512 px dimensions
have a maximum size of 512 kB
Make processes the icon file so that:
All color pixels in the icon file are converted to white
All transparent pixels in the icon file are converted to your theme color
Click Save changes to confirm the custom app settings.
You created a custom app in Make. The next pages will guide you through setting up the app's base, connection, error handling, and modules.
These common settings are:
Base URL
Authorization
Error Handling
Sanitization
In this step, we will set up the Base for our custom app. The custom app uses the Virtual Library Demo API.
Navigate to your custom app settings.
Click on the BASE tab.
The first code block already contains a JSON snippet:
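The prefilled snippet is essentially a placeholder base URL, roughly like this (a sketch; the exact prefilled content may differ slightly):

```jsonc
{
    "baseUrl": "https://www.example.com"
}
```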
Replace the URL address `https://www.example.com` with the Virtual Library Demo API base URL `http://demo-api.integrokit.com/api/v1`. Press Ctrl+S to save changes.
The base settings of your custom app look like this:
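With the replacement applied, a minimal base configuration contains the demo API's base URL:

```jsonc
{
    "baseUrl": "http://demo-api.integrokit.com/api/v1"
}
```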
For example "Europe/Prague"
It's possible to write tests for your custom IML functions. You can use the `it` function and asserts as you may already know them from Mocha and other testing frameworks.
Our example function has the following code:
So, let's write a test for this function. We'll create two blocks.
As you can see, the `it` function accepts exactly two parameters: the name of the test and the code to run. In this code, we can verify expected outputs using the `assert.ok()` function.
To run a test on a specific function, right-click the function name in the tree and select the Run IML test option.
The test will start and you'll see the output in the IML tests output channel.
When you have successfully logged in and have the environment set, it's time to develop your first app. To start, click the `+` icon in the header of the My apps section or call the `>New app` command directly from the command palette.
First, you will be asked to fill in a label. That's the app's name the users will see in the Scenario builder.
Next, the app ID will be generated for you. It will be used in URL paths, so it should be clear which app it leads to. It has to match the `^[a-z][0-9a-z-]+[0-9a-z]$` regular expression.
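A quick way to sanity-check a candidate app ID against this pattern (a sketch in plain JavaScript; the example IDs are made up):

```javascript
// The app ID pattern: must start with a lowercase letter, the middle may
// contain digits, lowercase letters, and hyphens, and it must end with a
// digit or lowercase letter (so no trailing hyphen).
const appIdPattern = /^[a-z][0-9a-z-]+[0-9a-z]$/;

console.log(appIdPattern.test('my-demo-app')); // valid
console.log(appIdPattern.test('MyApp'));       // invalid: uppercase letters
console.log(appIdPattern.test('my-app-'));     // invalid: trailing hyphen
```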
Then, you'll be asked to enter the app's version. Currently, only version 1 is supported.
Next, you'll be prompted to enter the description of your app.
Then, enter a color theme for your app. This is the color the app's modules will appear as in scenarios. The theme is in hexadecimal format. For example, the Make app's color is `#6e21cc`.
The next prompt will ask for the app language. That is the language of the interface of the app. Most of the apps in Make are currently in English.
The last prompt is for countries where the app will be available. If you don't pick any country, the app will be considered global.
And that is it. After you confirm the last dialog, your brand-new app will appear in the My Apps view and you can start coding.
This page describes how to develop an application with the Make Apps Editor, managing production and test versions of the application.
The developer writes and tests the code with the test application in Make.
The developer tracks changes in the local `git` repository, pulling changes from the test application.
The developer pushes the changes to the production application from the local testing app when they finish the development and testing.
This process improves the maintenance and stability of the application because the development does not influence the production version of the application. In addition, all changes can be tracked in a `git` repository, providing a clear and organized development workflow.
To start the development of testing and production app versions, the following is needed:
The production version of an app in Make - If you have an app that is already in use, use it as the production app.
The testing version of an app in Make - Create a new app in Make, with no content, that will function as the testing version of the app.
Optional, version control with Git - To properly track all changes in the local app, it is recommended to use a Git repository, for example, GitHub.
Below is a diagram explaining how a developer can develop testing and production app versions.
First, you will need to create a testing version of the Make app. Follow the instructions in the article below to create a new origin for the Make testing app.
Once the origin for the testing app is successfully created, you will need to deploy the current code from the existing app, which we can call "Production".
Right-click on the makecomapp.json file and select Deploy to Make (beta).
In the dialog, select the app origin that represents the testing app.
The content of the local app will now be deployed to Make. Whenever a new component is about to be created, a dialog will prompt for confirmation. Press Enter to confirm the creation of the component. If you don't want to create the component, click the Ignore permanently/do not map with remote option.
The app has been deployed to Make.
Now, you can start developing new components or editing the current components in the Testing app in Make, and thoroughly test the app in Scenario Builder.
Once you finish the development of new changes and components in the testing app you can push the changes to the local app.
To synchronize changes made to the testing app with the production app, follow the steps outlined in the manual below.
To start the collaborative development, ensure that you have set up the testing and production app versions by following this article.
Owner of the production app - Every app in Make can be owned by a single Make account. The owner of the production app manages the deployment of the new local app version to the Make app.
Developers of testing apps - Each developer manages their own testing app in Make, which is connected to the local app from the Git repository.
Below is a diagram explaining how developers can collaborate on app development.
Modules are the key component of your app. They are basically wrappers around specific app/service functionality, which is exposed via an API endpoint.
There are three basic types of modules: Action, Search, and Trigger (polling).
Action: Use if the API endpoint returns a single response. Examples are Create a Book, Delete a Book, or Get a Book.
Search: Use if the API endpoint returns multiple items. An example is List Books, which will find specific books according to search criteria.
Trigger (polling): Use if you wish to watch for any changes in your app/service. An example is Watch a New Book, which will be triggered whenever a new book has been added to the library.
To create a new module, click the MODULES tab. The list of all modules your app consists of will be shown (empty for now). Click the large button with the plus sign and choose Create a new Module.
A dialog will pop up, where you can name your module, choose its type, and provide some description. Fill in the dialog as shown and click Save.
Make sure the tab Communication is active and replace the content of the text area with the following JSON and save the changes (Ctrl+S):
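At its simplest, the communication JSON for this module is just the endpoint path, assuming the `/helloworld` endpoint used in this tutorial:

```jsonc
{
    "url": "/helloworld"
}
```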
The `url` key specifies the API endpoint path. The URL will be joined with the `baseUrl` specified earlier to produce the full URL of the API endpoint: `http://demo-api.integrokit.com/api/v1/helloworld`.
Congratulations! You have just created your first module.
Now, we will test our new module. Open a new browser tab, log in to Make, choose Scenarios in the left main menu, and create a new scenario. Click the yet-undefined “questionnaire” module to bring up a list of all the apps. Search for your new app by typing its name in the Search field: Custom App. Click your app and a list of all its modules will be shown, currently just the newly created Hello World module. Click the module to select it. An empty module settings panel will pop up saying “There are no configurable options for this module.”
Close the panel and run the scenario. Click the bubbles above the module to pop up the panel with information about processed bundles. In case you have successfully followed this tutorial, you should see the following output of your new module.
Perfect! You just learned how to create a new module and make it work. But we can make it better! We can make it more user friendly. Continue reading below.
Since we now know the structure of the output, we can specify the Interface. Then we can set up our scenario right away, with no need to first execute the module to learn the output's structure.
Below, you can see an example of a list of available parameters to map from Shopify > Watch Orders module. Notice that the Shopify module wasn't executed yet (the bubble is not displayed).
In the panel with the output, click the button circled below and choose the Download output bundles option.
A new panel will appear with the original response from the endpoint. Copy the text to your clipboard.
Go back to the tab with your app and make sure you are in the settings of the Hello World module. Select the INTERFACE tab. You can see a JSON snippet:
In the upper right corner, click the Options button and choose Generator.
A new panel will appear. There, paste the previously copied JSON from your clipboard and click Generate.
A new data structure will be generated. Copy it to your clipboard and close the panel.
In the INTERFACE, replace the JSON structure with the new structure.
Perfect! You just learned how to work with the interface and automatically prepare the list of expected parameters from the module's output. Now, let's make it harder!
In Make modules, you can define what data should be passed to the API. So we can create a new record, or define what should be returned.
The Demo API endpoint `/helloworld` takes two parameters, `greeting` and `name`. Click the following link to open the API response in your browser:
The API should return the following JSON response:
So let's make our Hello World module configurable by adding the parameter `greeting`. Switch back to the Mappable parameters tab, replace the empty square brackets with the following JSON, and save the changes (Ctrl+S):
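As a sketch of what that parameter definition contains (one parameter named `greeting`, labeled Greeting, of type text, in line with the description given later in this guide):

```json
[
    {
        "name": "greeting",
        "type": "text",
        "label": "Greeting"
    }
]
```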
Switch to your scenario and refresh the browser window (F5). Click the module to pop up its settings panel. The panel now contains one text field labeled Greeting. Fill in Hi:
Press OK and run the scenario. However, if you click the bubbles above the module to pop up the panel with information about processed bundles, the module's output will be identical to the previous run.
This is what the log from the run looks like; focus on the Request Headers tab:
Notice that there is no parameter called `greeting` being sent.
That's because we need to specify the structure of the request in the module. Specifically, we have to pass the content of the `greeting` parameter to the API.
Switch back from the MAPPABLE PARAMETERS tab to the COMMUNICATION tab. To pass the parameter `greeting` to the API, you have two options:
You can either add the parameter to the `url` key:
Or you can add a new key, `qs` (query string), and add the parameter there:
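For illustration, the two variants could look roughly like this; these are sketches assuming the tutorial's `/helloworld` endpoint and the `greeting` parameter:

```json
{
    "url": "/helloworld?greeting={{parameters.greeting}}",
    "method": "GET"
}
```

or, with the `qs` collection:

```json
{
    "url": "/helloworld",
    "method": "GET",
    "qs": {
        "greeting": "{{parameters.greeting}}"
    }
}
```

The `qs` variant is generally preferable, because Make takes care of the URL encoding of the value for you.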
Save the changes (Ctrl+S), switch to your scenario, and run the scenario. Click the bubbles above the module to pop up the panel with information about processed bundles, where you should see the modified module’s output:
You can also check how clean the run looks in DevTool. Finally, the parameter `greeting` is passed in the QS (query string):
You can now practice your skills and try to add a parameter `name` to your Hello World module.
Congratulations! You have just learned how to pass data from a module to an API, and how to debug possible issues using Make DevTool. If you are ready to learn about setting up a connection in your app, continue below.
APIs usually employ some sort of authentication/authorization to limit access to their endpoints.
Make provides a list of the most commonly used connection types, each with prefilled code. You only need to adapt it to your needs and/or the API's requirements.
Mostly, you will need to change the URLs and the names of the parameters.
While the `/helloworld` endpoint was accessible without any authentication/authorization, the other endpoints of the Demo API require an API key. Let's try to call the `/books` endpoint, which should respond with a list of books in the library. Without providing the API key, the response will contain the following error:
To enable the users of your module to specify their own API key (assuming each user has their own API key to access the API), you need to create a Connection.
We have covered the basics of creating a simple module. Since the `/books` endpoint will always return an array of items, you will need to create a new module of type Search. Now, let's see how to update our search module with a variable API token for each user.
Click the Connections tab. The (probably still empty) list of all your connections will be shown. Click the large button with the plus sign and choose Create a new Connection. A dialog will pop up where you can name your connection and choose its type. Fill in the dialog as shown and click Save.
The new connection will appear in the list. Click it. A page with two tabs will be shown: COMMUNICATION and PARAMETERS.
A pre-configured communication will look like this:
Since our Demo API doesn't have any suitable endpoint, we will not use any.
The code should be as below:
Once you finish the connection configuration, go back to your search module and click Attach Connection.
An Attach Connection dialog will appear. There, select the newly created connection.
When setting up a connection, we should not forget about the Base!
Therefore, click the BASE tab and edit the code:
Awesome! You just learned how to add a new connection, attach it to an existing module, and map the connection data in Base. Now it is the right time to learn about error handling; continue below.
DevTool consists of 3 main modules - Live Stream, Scenario Debugger, and Tools. Each of them is described in detail in the next subpages.
There are options to debug your IML code locally on your computer.
The first option is to do automated testing with Mocha. The process and examples can be found in the link below.
Copy your IML code.
Go to VS Code -> File -> New Text File (shortcut: ⌘+N on macOS, Ctrl+N on Windows/Linux).
Paste your IML function. To execute the code, you will need three more things:
Call the function.
Since we are testing the function locally, we cannot run a scenario to execute it. To imitate scenario execution, we need to call the function by its name and pass it the input we specified before, for example: myFunction(input).
Remove all IML functions called inside.
All `debug()` and `iml.function()` calls should be removed or replaced by similar JavaScript functions.
Put breakpoints in your code (the red dots to the left of the line numbers).
In the upper menu, go to Run -> Start Debugging -> set Node.js as the debug environment.
Enjoy! On the left side of VS Code, you will see the variables being processed at the moment. Go through the code step by step and find your bug.
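The steps above can be sketched as follows. The function name and input shape here are hypothetical, standing in for your real IML function:

```javascript
// Hypothetical IML function under test (pasted from your app).
function greet(input) {
  // Mid-result you might otherwise inspect with debug() on Make:
  const message = input.greeting + ', ' + input.name + '!';
  return message;
}

// Input example, imitating the bundle a scenario would pass in.
const input = { greeting: 'Hello', name: 'World' };

// Call the function by its name to imitate scenario execution.
const result = greet(input);
console.log(result); // prints "Hello, World!"
```

Put a breakpoint on the line inside `greet` and step through it in the debugger to inspect `message`.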
Live Stream displays what is happening in the background once you've hit the Run once button in your scenario. It allows you to view the following information for each module in your scenario:
Request Headers (API endpoint URL, HTTP method, time and date the request has been called at, request headers, and query string)
Request Body
Response Headers
Response Body
After you run a scenario, click one of the tabs in the right panel of Make DevTool to view the desired information.
Searching Requests and Responses
Enter a search term into the search field in the left panel of Make DevTool to display only the requests that contain the search term.
Removing requests from the list
To clear the list of requests recorded by Make DevTool, click the trash bin icon in the top-right corner of DevTool's left panel.
Enabling console logging
Retrieving the request in the raw JSON format or cURL
cURL can be retrieved using the Copy cURL button next to the Copy RAW button.
If the API supports pagination, it should be implemented. To make sure the pagination works as intended, it is recommended to set the page size to a low number, if possible; see the example below:
Then you need to create enough records to fill one page, plus a few more. You can do so using the Flow Control > Repeater module, which also returns the ID of each repeat. Map this ID in the following module, as it will help you differentiate the records.
Once the testing records are created, you can test your search module. Thanks to the ID from the Repeater, you can see whether and how the records are ordered, and whether the records are correctly retrieved, e.g., that they are not duplicated (one page retrieved multiple times).
In the console, you can also effectively check every retrieved page and its size.
Possible pagination issues:
The stop condition (the `limit` set by the user) doesn't work, therefore all the records that exist in the account are retrieved (or only the first page, if there is also the issue from the point below).
The next-page condition isn't set correctly, so the next page isn't retrieved even though it should be, according to the user's limit and the content of the account.
The next-page condition isn't set correctly, so further pages are retrieved even though all records were already retrieved. They can either be duplicated or contain no records (without looking into the console, you will not spot them).
The pagination is not optimized, so even though there is a parameter saying “there is no other page to retrieve”, the module still retrieves the next, empty page (e.g., the developer implements offset-based pagination even though a cursor parameter is available).
The value in the `page` parameter is too low, therefore too many pages are being retrieved (= too many calls).
Not common - the records are duplicated because the pages overlap (1st page 1-100, 2nd page 100-199 instead of 1-100, 101-200).
During the development of your app, you will probably experience multiple issues with your app and/or API. Therefore, you will need to use Make DevTool and/or console in order to debug your app. Continue below in order to learn how!
You can easily debug your app using Make DevTool or Chrome DevTool's console. Learn how to use Make DevTool or Chrome DevTool's console below.
When the GET endpoint you are using supports pagination, you need to make sure the module uses it correctly. Learn how to test whether your pagination works correctly below.
The Make apps platform has a tool for testing the behavior of your RPCs, which makes testing them easy. Learn how to test your RPCs below.
Implementing a custom IML function might get complicated, so you will need to learn how to effectively debug your custom IML functions. Learn how below.
Since any API endpoint can return an error during the execution of a scenario, Make provides a way to handle such errors. All apps in Make, custom apps included, should have error handling implemented. This way, the modules are more transparent, because their users know exactly what caused an error. Also, error handling in the scenario can then be used.
When you were trying out your new search module, you probably experienced error 401, but you didn't know why unless you checked Make DevTool. Since Make offers advanced error handling, you should set up your app so that you can understand right away what is wrong with your module/scenario, and so that error handlers can be used.
Add your search module to your scenario and create a new connection with a random API key, if you haven't done so before. You should see the error:
In DevTool, you should see this output:
We need to make sure the error from the body of the response is returned in the module's output as well.
As you learned before, error handling belongs in Base, since Base is shared among all modules and remote procedures. Therefore, open Base and replace the current code with the code below:
Now, go to your scenario again, and execute the search module. You should now see the detailed wording of the error:
Voila! You just learned all the basics of custom app development in Make!
If you want to continue in development of your custom apps, you can explore our docs and learn more!
Click on an app in the Apps list. You will get a page with nine tabs. In this guide, we will focus on the custom app's main components:
Now you know the main components of a Make app. Your next step is to set up the custom app Base.
You can use `debug` inside your IML functions to print a message or intermediate results of your code.
During function execution, the debug messages are visible in your browser's console.
Open Developer Tools in Google Chrome.
Go to Sources -> Snippets. Create a new snippet.
Paste your code. Before execution, you will need three more things:
Call the function.
Since we are testing the function locally, we cannot run a scenario to execute it. To imitate scenario execution, we need to call the function by its name and pass it the input we specified before, for example: myFunction(input).
Remove all IML functions called inside.
All `debug()` and `iml.function()` calls should be removed or replaced by similar JavaScript functions.
Press ⌘ + Enter (on macOS), or Ctrl + Enter (on Windows/Linux).
Click on a line number to put a breakpoint and start debugging!
In Scope, you will see variables that you are using at a particular moment.
In Console, you will see all your `console.log()` messages.
If you are using an RPC inside your app, you might need to debug it. After this article, you will know how to debug RPCs on Make and be an ace at it!
The RPC debug tool can be found as follows:
Go to Custom apps tab.
Select your custom app from the list.
Go to the Remote Procedures tab inside the custom app.
Select an RPC you want to debug.
Click Test RPC.
Compare the tabs below to understand how things work inside RPC on Make.
By default, after creating a new RPC, you get a template of the communication code, which you should modify based on your needs.
There is also a Parameters tab, which is empty by default. Here you can add any parameter needed, in the same way as in a module's Mappable parameters.
The RPC debug tool works the same way modules do.
Specify the connection and other fields (i.e. parameters) if needed.
Click the Test button.
The call, which you specified before in the RPC communication, will be executed.
Sometimes you might get an empty array as a response. If that's not the response you expected, check that you correctly specified the path to the object you use inside `iterate`.
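For example, if the endpoint wraps its items in a `data` field (as the `/users` example later in this documentation does with `body.data`), the response section must point `iterate` at that array. This is an illustrative sketch, not a prescribed configuration:

```json
{
    "response": {
        "iterate": "{{body.data}}",
        "output": "{{item}}"
    }
}
```

If `iterate` points at `{{body}}` instead of `{{body.data}}` in this situation, the RPC typically returns an empty array.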
If you prefer app development in the App Editor, follow this , which describes online development, instead.
To deploy only a specific code file in a component to Make, right-click the code file and select Deploy to Make (beta).
To deploy a component to Make, right-click the component directory and select Deploy to Make (beta).
To deploy the whole app to Make, right-click the makecomapp.json file and select Deploy to Make (beta).
The 'Git Repository' section describes the development in the Git repository using the app. Yet it's not obligatory, any preferred or CLI can be used.
A dialog window asking whether should be included or not will pop up.
This manual describes the development in the Git repository using the app. Yet it's not obligatory, any preferred can be used.
If you are uncertain how to do so, simply follow the steps in the corresponding section in the .
To properly track the new version of your local app in a local workspace or git repository, follow the corresponding steps in this .
If you struggle to find the app's API, search the web for: API site:www.app-or-service.com. For example,
If you haven't cloned the production app to a local workspace, do so by following the .
Theme: The color that Make uses for your custom app modules and forms. Use the .
For detailed guidelines on how to create the app logo check the .
Every custom app has a piece of settings that are common to all custom app modules and . Use the BASE tab to specify settings that are inherited by all modules and remote procedures.
When using IML functions that work with date and time, remember to set the correct timezone in the extension settings. The accepted format is the .
Cloned production version to the local workspace - Clone the app to the local workspace, if you haven't done so yet, by following this .
The collaborative development is facilitated by the Git for apps feature in the VS Code extension. The supported operations are documented in this .
Make works with 6 types of modules, read more about them .
Click the Mappable parameters tab. The JSON on this tab enables you to specify the parameters of your module that will appear in the . Our module does not require any parameters, so erase the content between the square brackets, leaving just empty square brackets, and save the changes (Ctrl+S):
This JSON specifies that the module will have one parameter called Greeting of type `text`. More about parameters can be found in the documentation.
You can also use to check the original REQUEST and RESPONSE for further debugging. Learn more about debugging your app here.
Always use sanitization in base and connection! Learn more about sanitization .
You can read more about the `password` parameter type .
As you remember, Base should contain everything common to all modules and , e.g. baseUrl, authorization, sanitization, error handling, etc.
allows you to debug your Make scenarios by adding an extra pane to the . Using this new debugger pane, you can check all the manual runs of your scenario, review all the performed operations and see the details of every API call performed. Additionally, using Make DevTools you can see which module, operation, even which single response causes an error in your scenario. This helps you debug your scenario, and get your scenario back on track.
To get started, just install the extension from the .
Input example.
You can either type it manually, or use `debug()` inside the IML function on Make and copy the input from the developer console of your browser (more about `debug()` ).
If you're interested in exploring the VS Code debug tool, check their docs .
To enable logging to the console, click the computer icon () in the top-right corner of the DevTool's left panel. Logging to the console is enabled when the computer icon turns green.
To retrieve the raw JSON content of the request, click the Copy RAW icon () in the top-right corner of the DevTool's right panel.
By using `debug()` you can understand what data you are manipulating inside a function.
Input example.
You can either type it manually, or use `debug()` inside the IML function on Make and copy the input from the developer console of your browser (more about `debug()` ).
If you're interested in exploring Chrome DevTools, check their docs .
The Tools section features useful tools that can help you build your scenario. Detailed descriptions for each tool are available in the Make documentation.
The Base section should contain data that is common to all (or most) requests. At the very least, this should include the root URL, the authorization headers, the error-handling section, and the sanitization.
Make sure that the Base URL uses the URL of the API, which is shared among all modules or their majority.
In case of a request for approval of your app by Make, make sure that the base URL is a production endpoint with a domain that belongs to the app itself.
Apps with development or staging URLs, or apps with a domain belonging to a cloud computing service, will not be approved!
When the service has a different domain for each user, the domain should be asked for in the connection and then the value should be used in the Base tab.
An example from Mailerlite app:
An example from Freshsales app:
The Base section should also have authorization, which is common for all modules. This authorization should use the API Key, Access Token, or Username and Password entered in the connection. The sanitization should hide all these sensitive parameters.
Examples of possible authorization and sanitization:
Each service sends an error message when an error occurs. This most often happens when the request has wrong parameters or values, or when the service has an outage. That's why error handling is required.
In case of a request for approval of your app by Make, make sure error handling is correctly implemented!
The error handling code should correspond to the structure of the server response. Let's assume that the JSON response has the following format in the cases where something goes wrong:
Here's how the error section should (and should not) look:
The error object in our example contains the `code` and `message` fields, but not a `text` field. It is also important to show the status code of the error, which can be accessed using the `statusCode` keyword. So in the case of HTTP error 400, the error message could look like this:
[400] The company with the given ID does not exist. (error code: E101)
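Under that assumption about the response shape, an error section built only from the fields that actually exist could look like this sketch:

```json
{
    "response": {
        "error": {
            "message": "[{{statusCode}}] {{body.error.message}} (error code: {{body.error.code}})"
        }
    }
}
```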
The error object in our example contains the `code` and `message` fields, but not a `text` field.
Each app should have a universal module. This module allows users to call API endpoints that are not otherwise supported, using the connection they created for the app.
Make sure this module has:
correct label and description,
correct `url`, which starts with the base URL,
correct connection.
All of the modules should build on top of the `baseURL` from the Base section (simply by starting with a forward slash `/`). It is very unlikely that a single module will need a completely different URL than the rest.
The underlined part which is the same for each module should be in the Base tab.
A module's `url` should start with a forward slash `/`.
Every module should have a label that precisely describes the module's use. For each type of module, there is a standard naming convention. But it may change depending on the functionality of the module. The label should be composed of the verb expressing the intended action (Create, Update, Watch, etc.) and the name of the entity being processed (Customer, Invoice, Table, etc.).
Watch Events
Create a Report
Update a Record
List Photos
Search Files
Add Members to a Group
Get a Group's Info
Watch events
Create a report
Make a call
List photos
Search files
Add members to Group
Get group info
These modules watch for new data in a given service and return it. Compose the label according to the pattern Watch <watched node>.
Examples:
Watch Events
Watch Photos
Watch Deleted Files
These modules write data into a service, modify data in the service, or retrieve a single result. Compose the label using simple verbs like Create, Get, Update and Delete and the modified or created node. Use the naming convention of the service you are implementing.
Examples:
Create a Note
Update a File
Get a User
Delete a Task
These modules retrieve data from the service and allow retrieving one or more results. Compose the label using simple verbs like Search or List. Use the naming convention of the service you are implementing.
Examples:
Search Files
List Tasks
In a few words, describe the functionality of the module. Write the description in the third person and capitalize only the first letter of the first word in the description (like a normal sentence structure).
Example: Description for the module Update a Time Entry:
Updates a time entry for a specific user.
Update a Time Entry for a Specific User.
Triggers use "Triggers when..." in the descriptions. For example, the description for the module Watch New Users should be: "Triggers when a new user is created."
For labels, try to use the same names and conventions as in the integrated service. It helps users to use the app in Make and not be confused.
For variable names, use the same names that come from the service API. It helps when debugging for both advanced Make users and support agents. Ideally, the output of the module should be the same as the response from the API.
Example:
Be sure to use the correct uppercase for abbreviated words, such as ID, IDs, URL, GPS, VAT, etc.
While Live Stream can't display historical logs of a scenario, the Scenario Debugger can.
It displays the history of the scenario runs and enables you to search modules by their name or ID.
Searching modules by their name or description
To search for a module, enter the search term (the module's name or ID) in the search field in the left panel of Make DevTool, in the Scenario Debugger section.
Double-click the module's name to open its settings in the scenario editor.
View request details by clicking the desired operation.
Every connection should have a way to check whether the used API Key/Token is valid (a validation endpoint). That means each connection should include a part that tests the API Key/Token against some endpoint that requires only the API Key/Token to run, for example a User Info endpoint or any endpoint used to list data.
The validation endpoint is located:
OAuth1 and OAuth2 - in the `info` directive
API Key, Basic Auth, Digest Auth, Other - in the `url` in the Communication tab
It is recommended to use the `metadata` parameter to store the account's name or email. This allows users to easily distinguish their stored connections, especially if they don't name their connections well.
Notice the value in the brackets after the user's connection name. This value is taken from the `metadata` parameter.
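Putting both recommendations together, a connection Communication could look like the sketch below. The example.com URL, the `x-api-key` header, the `apiKey` parameter, and the `body.email` field are all illustrative assumptions; use your API's real user-info endpoint and authentication scheme (note that connections typically need the full endpoint URL, since the Base applies to modules and RPCs):

```json
{
    "url": "https://www.example.com/api/v1/users/me",
    "method": "GET",
    "headers": {
        "x-api-key": "{{parameters.apiKey}}"
    },
    "response": {
        "metadata": {
            "type": "email",
            "value": "{{body.email}}"
        }
    }
}
```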
To allow users to edit a connection:
Go to the Parameters tab of the connection.
Check that each parameter whose original value should be kept secret has the `password` type.
Parameters with the `password` type don't show the original connection's parameter value, while parameters with the `text` type show the value used by the current connection.
Add the `editable: true` property for each parameter in the connection.
For example:
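A sketch of such a parameter definition, with an assumed `apiKey` parameter:

```json
[
    {
        "name": "apiKey",
        "type": "password",
        "label": "API Key",
        "required": true,
        "editable": true
    }
]
```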
Exception: If the service provides each user with a unique URL or domain, the corresponding URL or domain parameter must be non-editable to prevent any potential credential leaks.
Error handling in connections can be reused from the Base tab and follows the same rules.
The only difference is where it is used, depending on the connection type. For example, in the connection types OAuth 1.0 and OAuth 2.0, the error handling should be in the `info` part.
For the next example, suppose that when you call `/users` on your service, you get a list of users in `body.data`.
This example will correctly output each user that was returned:
The `pagination` section should only contain parameters directly influencing the actual pagination. These will be merged with the rest of the parameters defined in the `qs` section, so there is no need to define them all again.
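A sketch of a search module Communication along these lines; the `page` parameter name and the page size are assumptions that depend on the API:

```json
{
    "url": "/users",
    "method": "GET",
    "qs": {
        "limit": 100
    },
    "response": {
        "iterate": "{{body.data}}",
        "output": "{{item}}"
    },
    "pagination": {
        "qs": {
            "page": "{{pagination.page}}"
        }
    }
}
```

Note how `pagination.qs` only adds the `page` parameter; the `limit` defined in `qs` is merged in automatically for every paginated request.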
Modules of type Search and Trigger (polling) should return everything, including records retrieved via pagination. However, these modules should also allow users to limit their output, that is, how many bundles they return.
Search module limit example:
The `limit` parameter should not be set to `"required": true` (except for polling trigger modules).
The `limit` parameter should never be set to `"advanced": true`.
Certain APIs provide support for custom queries or filters when searching records, such as invoices. In Make, our goal is to offer query and filter capabilities to both regular and advanced users.
Therefore, we have implemented two methods of achieving this functionality, and ideally, users should be able to choose between the two options.
We have utilized the familiar filter setup format found in Scenario Builder. With this approach, users are not required to learn the query format. Instead, they can simply set up the query in a manner similar to configuring filters.
When the module is executed, a custom IML function constructs the query, adhering to the specified format.
Users have the option to manually compose their own queries. This feature is particularly valuable when the API supports new operators that are not yet available within the module.
To assist users in leveraging the query field effectively, the following helpful information should be provided:
Query format: The guidelines for structuring the query.
Example of a functional query: An illustrative sample query as reference.
API documentation URL: Direct access to the API documentation with query specification.
Bear in mind that there are two approaches to responsiveness in a service.
Synchronous - The service upon an action request returns a result, which can then be processed in the following modules in a scenario.
Asynchronous - The service doesn't return anything at all, or doesn't return a useful output, e.g. a processed file.
When a web service doesn't support the synchronous approach and the common use case of the module requires it, the support should be added on the app's side. Basically, two (or more) calls should be executed instead of one:
create a job - a call that requests the job, e.g. conversion of a file, or a file upload,
periodically check the status of the job - execute repeated calls to obtain the job's status,
ask for the result of the job - once the status call returns the awaited status, request the result, e.g. the result file.
After a JSON file is imported to a web service, the service requires a certain period of time to process the file. In this case, we have to keep checking whether the status of the entity has changed from `processing` to `completed`. When the status is `completed`, the result is already part of the response.
The trigger module has an Epoch section that defines what "Choose where to start" looks like. This section is an RPC that uses everything in COMMUNICATION, including the pagination. This means there can be an issue if the user has too many objects that could be returned by this RPC, which is why a `limit` parameter should be specified here.
The `limit` should be a static number, at most 300 or 3 * the number of objects per page.
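An Epoch section following this rule might look like the sketch below; the `body.data` path, the item fields, and the endpoint's response shape are assumptions for illustration:

```json
{
    "response": {
        "iterate": "{{body.data}}",
        "output": {
            "label": "{{item.name}}",
            "date": "{{item.created_at}}"
        },
        "limit": 300
    }
}
```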
from and/or to date parameters
Some API services require date parameters that define the interval of records to be retrieved, e.g. `from` and `to`, `fromDate` and `toDate`, etc.
In this case, it is important to handle the date parameters correctly.
The behavior of the supported options:
From now on - The current date will be sent.
Since specific date - The date provided will be sent.
Choose manually - The date of the chosen item will be sent.
All - The default date `1970-01-01` will be sent.
Sometimes, you don't want to map all parameters into the `body`, for one of these reasons:
the parameter shouldn't be sent at all (technical parameters such as selects, etc.)
the parameter should be sent somewhere else than in the `body`, e.g. in the `url`
the parameter has to be wrapped in an IML or custom IML function
Make and other third-party services transport values in many different formats. It is important to know how to handle the value types of arrays, nulls, and empty strings.
It isn't possible to send a `null` value to a service.
It isn't possible to send a `null`, an empty string, or a 0 (zero) value to a service.
It isn't possible to send an empty array to a service, e.g. when a user wants to remove all tags from a task.
Let users decide what parameters they want to send to the service. Make has a feature that lets users define exactly how Make should behave when a value is "empty". For example, if a user defines that they want to send a specific field to a service even if the value is `null`, an empty string, or an empty array, it will be sent to the service. In the module communication, the config passes such parameters without any modification.
When a field has the type "date", it should be possible to use our keyword "now" as a value. This means the field should accept the ISO-8601 date format, and if the service requires only the date (no time) or a different format such as a timestamp, this formatting should happen inside the module.
Users of the app should never be prompted to format date inputs the way the API requires! Such apps will not be approved by Make.
Communication:
The parameter `birthday` is required by the service to have the format YYYY-MM-DD and the parameter `due_date` is required to be a timestamp, so the formatting happens inside the Communication part of the module.
Parameters:
The parameters birthday and due_date are correctly typed as dates and don't need to be formatted by the user, who can freely work with the now keyword.
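A minimal sketch of this pattern, with the Communication and Parameters sections shown together for brevity (the endpoint /contacts is hypothetical; formatDate is assumed to accept moment-style tokens, with X producing a Unix timestamp):

```json
{
    "communication": {
        "url": "/contacts",
        "method": "POST",
        "body": {
            "birthday": "{{formatDate(parameters.birthday, 'YYYY-MM-DD')}}",
            "due_date": "{{formatDate(parameters.due_date, 'X')}}"
        }
    },
    "parameters": [
        { "name": "birthday", "type": "date", "label": "Birthday" },
        { "name": "due_date", "type": "date", "label": "Due Date" }
    ]
}
```

The user maps a plain date (or now) into both fields, and each request reformats the value the way the service expects.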
Communication:
The parameter birthday is required to be in the format YYYY-MM-DD and the parameter due_date is required to be a timestamp, but nothing is done with the values.
Parameters:
The parameter due_date has an incorrect type and birthday is required to be formatted by the user.
Query string parameters should be defined using the qs object so that they are not embedded directly in the URL. Embedded parameters are the parameters that appear after a question mark in the URL. Using the qs object enforces the correct encoding of both static and dynamic parameters.
If you need to specify a query string parameter, you can do:
But a better way is to use the special qs collection.
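As a sketch, instead of embedding a parameter in the URL (e.g. "/users?teamId={{parameters.teamId}}"), the same request can use the qs collection; the endpoint /users and the parameter teamId are hypothetical:

```json
{
    "url": "/users",
    "method": "GET",
    "qs": {
        "teamId": "{{parameters.teamId}}"
    }
}
```

Values placed in qs are URL-encoded automatically, so special characters in the mapped value can't break the request.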
The headers, qs, and body collections represent request headers, query string parameters, and the body payload. The key is the variable/header name and the value is the variable/header value. You don't need to escape values inside those collections.
The above request can be rewritten as:
The example above will not work. But if you want dot notation for some reason, you can use it directly in the parameter name:
This will create a query string that looks like this: ?someProp.anotherOne.and-one-more=THIS%20WILL%20WORK
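The dot-notation key described above can be sketched like this (the endpoint /items is hypothetical):

```json
{
    "url": "/items",
    "method": "GET",
    "qs": {
        "someProp.anotherOne.and-one-more": "THIS WILL WORK"
    }
}
```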
The mappable parameters should follow this priority order:
required parameters
optional parameters
advanced parameters
Advanced parameters should never be required.
When defining the input parameters, try to use the same order as in the integrated service. Place required parameters at the top if it is possible. Position parameters in logical groups.
Using the help directive, you can provide a hint about what is expected for the parameter when it is not obvious from the label, or when the expected value is more complicated. The text should start with a capital letter and end with a period. It supports Markdown, such as _italic_, **bold**, `monospace`, or a URL [example](http://example.com).
Example:
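A sketch of a parameter with the help directive (the parameter name and help text are hypothetical):

```json
{
    "name": "dueDate",
    "type": "date",
    "label": "Due Date",
    "help": "The date the task is due. Supports the `now` keyword."
}
```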
Services like CRMs use large numbers of input parameters. In these cases, you can mark less important parameters as advanced. When a parameter is marked as advanced, it isn't shown in the GUI by default. Instead, it can be found among the advanced parameters after the user clicks Show advanced settings.
Example:
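A sketch of a parameter marked as advanced (the parameter name is hypothetical):

```json
{
    "name": "customField",
    "type": "text",
    "label": "Custom Field",
    "advanced": true
}
```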
If the app has over 10 modules, the modules should be put into groups. These groups should be named after the entity the modules work with, or after the type of job the modules execute.
The modules inside these groups should follow this order:
Trigger module
Create module
Update module
Get module
Download/Upload Module
List/Search module
Delete module
Use RPCs for every field that accepts parameters such as IDs and other values that are hard for users to guess or obtain.
RPCs also come in handy when a user only needs to understand the functionality of the module, mostly the list of available parameters in the interface. In this case, the aim is not to offer a complete list of all items, but a sample of the last 100 or so items the user could select from when testing the module or setting up the scenario for the first time, before replacing the selection with mapped data used in automation.
When adding an RPC to a Get, Update, or Delete module, it is required to add:
to the code so that when a user first opens the module, they will be able to immediately map the value. However, if needed, they can also switch to getting the value from an RPC.
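A sketch of a select parameter backed by an RPC with the edit mode enabled; the parameter name taskId and the RPC name rpc://listTasks are hypothetical, and the "mode": "edit" option is assumed to be what the setting above refers to:

```json
{
    "name": "taskId",
    "type": "select",
    "label": "Task ID",
    "options": {
        "store": "rpc://listTasks",
        "mode": "edit"
    }
}
```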
When it comes to Search/List modules, it depends on the hierarchy level of the entity. If the entity is up in the hierarchy, e.g. a “customer” or a “deal”, and there are not many RPCs in the input, they should not have the mode set to “edit”. However, if the entity is low in the hierarchy, e.g. "E-mail attachment", it should have the mode set to “edit” since the user will probably not want to list attachments of a single e-mail repeatedly.
Finally, a Create module should not have the mode set to "edit", unless it contains a very large number of RPCs in its input. This is because pre-loading a large number of RPCs would significantly increase the waiting time for the user.
Just like modules, RPCs can use pagination to show more records than just the ones on the first page. However, a limit should also be set on how many objects the RPC shows so it doesn't load forever.
The limit should be between 300 and 500 if the API returns 100 or more objects per page. If the API returns fewer than 100 objects per page, the limit should be 3 times the number of objects per page.
For example, if the API returns 25 records per page, the limit should be 75. You can also set a condition for the pagination to stop further requests when no more data is needed.
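A sketch of an RPC with pagination, a limit of 75 (matching the 25-records-per-page example), and a stop condition; the endpoint /tasks and the response fields tasks and hasMore are hypothetical:

```json
{
    "url": "/tasks",
    "method": "GET",
    "response": {
        "limit": 75,
        "iterate": "{{body.tasks}}",
        "output": {
            "label": "{{item.name}}",
            "value": "{{item.id}}"
        }
    },
    "pagination": {
        "qs": {
            "page": "{{pagination.page}}"
        },
        "condition": "{{body.hasMore}}"
    }
}
```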
Training with 37 lessons and 4.5 hours of video content where you can learn all the aspects of custom apps development on Make.
If you are new to custom apps on Make or want to freshen up your knowledge about app development on Make, enroll in the training!
The Custom Apps category in the community is dedicated to all Makers, especially those who develop custom apps. If you need help, ask the community! You can participate there as well!
100+ questions regarding the custom apps development in Integromat/Make.
You can check if someone already asked the same question and got a response. If not, feel free to post a new question there!
150+ useful videos about Scenario Builder in Make and many more.
Documentation about Scenario Builder, apps, and many more.
Recommended tool for experimenting with regular expressions. Just make sure to tick the ECMAScript (JavaScript) FLAVOR in the left panel.
Regular expressions generator for those who struggle with regex.
A free editor that you can use to edit your file with a logo to be used in an app.
Contact us in case you need help or you didn't find any information you need.
Make will introduce a different way of sharing open-source apps.
Therefore, the Examples tab has been removed and is not available anymore.
Bear in mind that there are two approaches to updating entries in a service.
Partial Update - The service updates only specified parameters sent in the API request and other empty parameters will be unchanged. This is the most common approach for APIs.
Full Update - The service requires all parameters to be updated in an update request. If some parameters are omitted, then they will be cleared or overridden to default values in the service. This is extremely user-unfriendly and should be avoided.
If the API doesn't support a partial update approach, it is needed to add the support on the app's side. Basically, there should be two calls executed instead of only one:
GET call - a call that retrieves the current record and saves it in temp,
UPDATE call - an update request which contains the user's input merged with the missing parameters from temp.
If there are only a few parameters available, you can use the simple OR (||) directive, which ensures that if no value is available in a particular parameter, the value from temp is mapped instead.
If there are a lot of parameters available, it is worth writing an IML function which merges the parameters with the output from temp.
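A sketch of the two-call approach with the OR directive; the endpoint /tasks, the parameter names, and the temp key task are hypothetical:

```json
[
    {
        "url": "/tasks/{{parameters.taskId}}",
        "method": "GET",
        "response": {
            "temp": {
                "task": "{{body}}"
            }
        }
    },
    {
        "url": "/tasks/{{parameters.taskId}}",
        "method": "PUT",
        "body": {
            "name": "{{parameters.name || temp.task.name}}",
            "description": "{{parameters.description || temp.task.description}}"
        }
    }
]
```

The first request stores the current record in temp; the second sends the user's input, falling back to the stored value wherever the user left a field empty.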
There might be read-only parameters which can't be updated, e.g. the CreatedAt and UpdatedAt parameters. In this case, it is needed to ensure these parameters are omitted.
Modules should be associated with the correct type, depending on their functionality. Detailed descriptions of different types of modules can be found in our docs as well as in the guide.
More about this module can be found .
A name of a module, an RPC, or a custom IML function should not match with any reserved word in JavaScript. See the list of reserved words in JavaScript .
Names of modules are capitalized using Title Case - refer to . Basically, every word is capitalized except for articles, prepositions, and conjunctions.
The API key is checked against the /whoami endpoint, which returns an error in the case of a wrong API key, and the connection won't be created.
An example of the "info" part of the Dropbox OAuth2 connection.
If there is no endpoint to check whether the entered credentials (API key, username & password, etc.) are correct, the connection will be created with anything, including wrong credentials.
We recommend allowing users to edit their connections after they create them. Updating a connection simplifies scenario and user credential maintenance when there's a change in the user's organization. You can read more about editable connections .
Suppose you want to retrieve all users that are registered in your service. You can't use , because it returns only a single result. You will have to create a module for this.
The communication for is the same as for Action, except Search has an directive, which specifies where are the items located inside the body.
An module should never contain the pagination or iterate directive. If you need to return multiple objects, create a module instead.
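A sketch of a Search module's communication using the iterate directive; the endpoint /users and the response field users are hypothetical:

```json
{
    "url": "/users",
    "method": "GET",
    "response": {
        "iterate": "{{body.users}}",
        "output": {
            "id": "{{item.id}}",
            "name": "{{item.name}}"
        }
    }
}
```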
This can be achieved by setting up the parameter in the response. By default, this parameter is added to the trigger (polling) modules and should be required. In Search modules, this parameter should NOT be required, so if a user leaves it empty, the Search module returns everything. Its default value should be set to 10.
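A sketch of a limit input parameter as it might appear in a Search module (not required, default 10); the help text is illustrative:

```json
{
    "name": "limit",
    "type": "uinteger",
    "label": "Limit",
    "help": "The maximum number of results to be returned.",
    "default": 10
}
```

In a polling trigger, the same parameter would additionally carry "required": true.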
When the repeat directive is used, the condition and limit should always be provided to prevent infinite loops. Learn more about the directive.
Notice the mapped data.lastDate
that is available in polling triggers.
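A sketch of a polling trigger communication reading data.lastDate; the endpoint /tasks, the query parameter updatedAfter, and the response fields are hypothetical, and the trigger directive fields (id, date, type, order) are assumed to follow the standard polling-trigger shape:

```json
{
    "url": "/tasks",
    "method": "GET",
    "qs": {
        "updatedAfter": "{{data.lastDate}}"
    },
    "response": {
        "limit": "{{parameters.limit}}",
        "iterate": "{{body.tasks}}",
        "output": "{{item}}",
        "trigger": {
            "id": "{{item.id}}",
            "date": "{{item.updatedAt}}",
            "type": "date",
            "order": "asc"
        }
    }
}
```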
The universal module should be left inside the "Other" group. More about groups can be found .
Remote procedures are used to get live data from a service. More information about remote procedures can be found .
You can do it by mapping only the parameters which are allowed, see the . Or you can implement an IML function, see below.
Advantage
The result is returned right away and can be passed on to the following modules.
Comes in handy when we need to process a large amount of data, e.g. file conversion.
Disadvantage
Depending on the type of job and its severity, the job can take too long. This might cause a timeout (40 sec by default), e.g. in file conversion. The default timeout can be prolonged in valid cases.
The scenario is not fluent. It is necessary to create at least 2 scenarios - one for triggering the job, and another one for processing the result from the first scenario. The second scenario, if possible, should start with an instant trigger that fires once the job finishes.
Module Example
Example Scenarios
The best approach is to return the API response as it is. In many cases, the response varies based on the user who is using the app, as responses can contain different custom fields, etc. If the response is returned unchanged and still not all parameters are described in the output parameters, Make will automatically learn the additional parameters from actual incoming data and propose them for mapping.
A module's response output should be defined for the case when a request is fulfilled successfully. The output definition should under no conditions contain error messages (this belongs in the Base section's error handling) nor the additional metadata which may arrive bundled with the actual response information.
Requires IML functions enabled.
There are multiple ways a date can be returned from a service, either as a timestamp or in any format the service finds fit. It is important to parse this date into the ISO 8601 format so it is shown in the output of the module as a date using the user's localization and timezone.
Make is using ISO 8601 format: YYYY-MM-DDTHH:mm:ss.sssZ
Any other format, even one merely lacking milliseconds, won't be shown correctly in the output and needs to be parsed using an IML function.
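As a sketch, a custom IML function (custom IML functions are plain JavaScript) that converts a Unix timestamp returned by a hypothetical service into ISO 8601; the function name parseTimestamp is illustrative:

```javascript
// Hypothetical custom IML function for a custom app.
// Converts a Unix timestamp (in seconds) returned by the service
// into the ISO 8601 format (YYYY-MM-DDTHH:mm:ss.sssZ) that Make expects.
function parseTimestamp(timestamp) {
    // Pass missing values through unchanged so empty fields stay empty.
    if (timestamp == null) return null;
    return new Date(timestamp * 1000).toISOString();
}
```

In the module's response output you would then map the field through the function, e.g. `{{parseTimestamp(item.created)}}`.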
The interface describes the structure of output bundles and specifies the parameters which are seen in the next modules. It should contain the full response from the API, including nested parameters.
You can check the stage of your custom app review in the App flow tab. The App flow tab contains a list of app review statuses and their description. When you click the App flow tab, your browser automatically scrolls and highlights the current review status of your app.
If you have published your app or requested a review for a new app or an app update, you have available the Review tab. You can check the Review Status section at the bottom of the Review tab to see all activity that happened during the review.
Every entry contains 4 columns:
Date and Time - When the activity happened.
Action - Name of the activity.
Status - Available only with action "Review status updated.". The status of the app's review.
Comment (manual) - A comment with additional information or context.
Possible actions are:
App has been published.
You have published the app with the Publish button.
Approval requested.
You have submitted the review form. This action marks the start of the app review process.
Review form updated.
You have updated the review form.
Review status updated.
Make updated the review status.
App has been approved.
Make approved your app. Your app is now available to every user in Make.
A private app can only be used by the author of the app until the app is installed in an organization to which the author and other users have access.
A private tag is displayed after your app's name on the app's page.
If your private app is not used in any of your scenarios, you can delete the private app.
You can delete a module only if it is not used in any scenario. If you try to delete a module that is in use, a warning dialog with a request to remove the module from the scenario first will pop up.
You can delete a component only if it is not used in any module. If you try to delete a component that is in use, a warning dialog will pop up in the bottom-right corner.
If you want to share your app outside of your organization, you will need to publish your app, so the invite link is generated.
Go to your app and open your app's page. Click Publish in the right upper corner.
Once you publish the app, the invite link is generated. You can share it with any other Maker, who can then install it in their organization. It doesn't matter where the organization is located (EU1 or US1).
Once the app is published, it is not possible to delete the app or make it private again.
If you wish to have your public app deleted, contact our helpdesk and we will remove the app from your account.
Once the app is published, it is not possible to delete any module or component.
We recommend changing the component's label so that it is obvious the component should not be used, e.g. [DO NOT USE] My unwanted module. Make sure the module is hidden.
If you actively share your public app via an invite link, you can control the distribution of modules that you have in your public app by making them hidden/visible.
If you hide an already visible module that is currently in use in a scenario, the module will continue working in that scenario. However, the hidden module will not be available in the app selector in a new scenario.
If you want to have your app available to all Makers, you will need to first make sure your app follows our conditions and best practices.
Once you make sure your app follows our conditions and best practices, click Request review button.
Once the app passes our review process, it becomes available for everyone to use in our App Builder. That means any user on Make can add your app to their scenario and use it.
When you want to make your custom app public and share it with all Make users, your app has to conform to Make standards. Following Make app development standards is a prerequisite to get an app review. Make app standards encompass:
Custom app functionality.
Custom app code best practices.
Testing your custom app with test scenarios.
Check out the following sections to learn more about each prerequisite.
In Make, we develop apps to provide value to our users. Your custom app should use a service that Make doesn't integrate yet. All apps in Make require from the user only credentials that are necessary to create a connection to the service API. If you want to make your custom app public, your app functionality has to follow the same principles:
Your custom app uses a web service that is not already available in Make.
Your custom app and its modules have to connect to a web service API. Avoid duplicating the same functionality as iterators, aggregators, or other tools in Make.
Avoid using APIs that have strong dependencies on other APIs, or APIs that function as redirects to other APIs.
Avoid using APIs that don't have their own domain, or have their domain associated with service platforms like Heroku or AWS.
Your custom app has to use only credentials that the service requires to create a connection. Don't request any additional credentials from the user.
Before sending an app for review, check that:
The items above are mandatory for each app.
After you check the code of your custom app, you have to create test scenarios to show that the custom app works. Use each module of the custom app in at least one testing scenario.
Make sure that the testing scenarios and their execution logs don't contain personal or sensitive data.
In order to help us with the review of your app, please follow the best practices.
Do not use scenarios with another app's webhook.
Try to put all of the app's modules in one scenario and run the scenario without errors. Connect the app modules based on the object the modules work with or the action the modules do. For example, Create a Task > Create a Subtask (in order to create a subtask, the task has to be created first).
Put the search and list modules at the end of separate scenario routes to avoid multiple runs of the subsequent modules.
Create a separate scenario that produces an error message. Run the scenario to get an error:
Run the test scenarios right before you request the custom app review and every time you fix issues. The scenario execution logs have a data retention limit and the reviewer won't be able to access old logs.
Every app on Make has its own logo and color. The combination of logo and color represents the app. With a distinct logo and color, the users quickly see which module belongs to which app.
The app theme is the module color in hexadecimal format. For example, the Make module's color is #6e21cc.
To add a logo to your custom app, you need to make sure that your file with the logo meets these requirements:
an image file in .png
format
square dimensions: minimum size 512 x 512 px and maximum size 2048 x 2048 px
a maximum size of 512 kB
Make processes the icon file so that:
Areas in the logo that are white or transparent will be displayed as the color specified in the Theme field.
Areas in the logo that are black will be converted to full opacity and will be displayed as white.
Areas in the logo that are in color or are semi-transparent will be displayed as the color between white and the color specified in the Theme field.
An app can have only one theme color. You can use multiple transparency levels to give your logo multiple shades of the theme color. You can create a 3D effect too.
You can update the custom app's logo anytime. Navigate to your custom app settings and click Options > Edit.
If your app has been approved, any change of the theme/logo is subject to approval by Make.
Navigate to your custom app's code and click the Publish button:
Navigate to the Modules tab. Click the switch to set the visible status of each module of the app that you want to publish.
After the first step, a new Review tab appears in the top panel. Navigate to the Review tab:
Fill in the form in the Review tab.
Add a link to the API documentation of the service for which you are creating a custom app.
Click the Request review button at the bottom of the page.
The data you filled in the form is sent to Make QA developers for review. At the top of the page, the Request review button changes to Review process started and becomes inactive. Also, the app status changes to pending approval.
Once you request the app review, Make sends you an e-mail with the subject: "App review: YourApp'sName". If you have any questions or additional information to share, reply to the e-mail.
The review process contains 3 phases - form submission, automatic review and manual review.
Once you hit the Request Review button, you will receive an e-mail containing a link to our form, where you will be prompted to share the following information about you and your app, which will help us to provide better service to you:
Developer information - What is your relationship with the vendor of the API service.
Partnership contact - Applicable for ISV. The details of a contact person or team on your end responsible for partnership-related discussions.
Support contact - Details of the contact person or team that should be reached out to in case of any app issues reported by users.
Logo of your company - Applicable for ISV. The logo should appear on the app's landing page on make.com.
URL to the service - URL leading to your website that should appear on the app's landing page on make.com.
Your app is first reviewed by our application. The automatic review checks the common issues and generates a PDF file with the issues found.
The PDF file with feedback from the automatic review is sent to the existing App review email thread.
Once your app passes the automatic review, it is reviewed by our QA Engineer who makes sure, that your app follows our best practices and is user-friendly.
QA Engineer shares the feedback in the existing App review email thread.
Once your app successfully passes the manual review, your app is listed in our planned release. You will be informed about the app approval as well as the app release.
When Make approves your app, your app becomes available to all Make users in their scenarios. To the users, an app developed by Make looks the same as an app developed by you. You have to manage your custom app responsibly, since any Make user might be using it in their scenario.
After a custom app's publishing, Make provides you with a development version of your custom app. You can find the development version of your app by searching for your app in the scenario editor:
The search now has two results:
The app with the tag custom app: This is a development version of your app that only you can see and work with. When you update your custom app, the development version of your app will have your changes.
The app without a tag: This is a public version of your app. Anyone can use this version of the app. Changes to this version have to be approved by Make.
If you need to make changes to the app's public version, you have to test your changes in the version labeled custom app first.
When you are done with the updates and testing, check the link below for the next steps.
It is important to keep your app up-to-date and react to API changes on time, e.g. when the API is going to shut down. You also need to fix issues reported by users.
Read more about maintenance of approved apps below.
When you create a custom app you can use it anytime. When you finish the development and testing of your app, you might share the app with selected users with an invite link. But if you want to provide your custom app to all Make users, we at Make want to make sure that your app is up to Make apps standard. To check that, Make has the app review process.
After you check the app review prerequisites, you can request an app review. Click the link below for instructions on how to request an app review.
The review process consists of multiple stages. You can check the current stage of the review in the Flow tab.
When Make approves your app, the app becomes available to all Make users. Because of that, your custom app management changes. Read the linked section to find out more.
It is important to keep your app up-to-date and provide support to your app's users. More information can be found below.
The scenario contains a module which has the synchronous logic implemented on the app's side.
The first scenario contains a module which uses the asynchronous approach by default. The only result is the job's ID.
No matter the module type, the output shouldn't be defined like this:
You can generate an interface using our Interface Generator, learn how in section.
When developing a custom app in Make, you should first go through our guide and apply the best practices to your code. By following our best practices, you will ensure that the review and publication process of your app is as smooth as possible.
The base and connection have of sensitive data, e. g. API key or token.
The base and connection have .
The and connection use an endpoint in the app's API.
Connection is . If the user uses incorrect credentials, they get an error.
Modules have correct and .
The app has a .
All modules have the depending on the output from the module.
All dates are formatted or parsed in the and .
Search modules, trigger modules and have the limit.
Search modules, trigger modules and RPCs have the , if it's supported in the service API.
Run your search and list modules to have logs with pagination. If you don't know how to test pagination, follow the steps .
If you don't have any tool to edit your file with the logo, you can use the free editor.
For creating a transparent layer, you can use the tool that is available in Lunapic.
Once you make sure that your app meets Make's and follows Make's , you can request an app review:
Once you publish your app in the second step of the following procedure, you can't "unpublish" it. Check the section for more details.
Make sure that you removed testing modules or testing connections before you publish an app. When your app is , you can't .
Add a link to scenarios with the custom app modules. Check the testing scenarios section in the .
Categories and subcategories of your app - The categories and subcategories in which you wish to have the app listed on the integrations page.
If you want to provide your app to other Make users, you have to request a review of your app first. Check the for rules and requirements for the App's code.
When you request an app review, the Make QA team will check your custom app's code. If your app meets Make's requirements and follows the app development , Make will publish your app and make the app available to all Make users. Otherwise, the Make QA team will contact you with instructions on how to improve your custom app.
When Make approves your custom app any Make user can use your custom app. You have to maintain the app to keep the app working. That involves:
updating the custom app when a relevant part of the service API changes
fixing bugs the users report
submitting API checkups
checking feature requests on the Make Idea Exchange regularly
When you maintain the custom app actively, the users can use the app with confidence. The users don't have to worry that their scenarios stop working with new service updates.
When a user reports an issue, Make validates the issue. If the issue is valid, we contact you via e-mail with a request to fix the reported issue.
At Make, we make sure that everything works for our customers. If you do not fix the validated issue or stop communicating with Make, we will access the app's code and fix it for the app's users.
Make sends you a request to update the custom app's API checkup every six months. The Make API checkup helps you and Make to:
watch changes in API and API's docs
make sure the app is up-to-date
manage deprecations of endpoints or API
manage new versions of the API
inform users about shutdowns and other cases in advance
The API checkup form contains the following data about the service API:
Currently used API version - The version of the API the app is currently using. Example: v1, 2019-01
Currently used API's documentation - URL of the API's online documentation. Example: https://service.com/api/v1
Currently used API's changelog - URL of the API's changelog in the API's docs. Example: https://service.com/api/changelog
Currently used API deprecation date - The date when the API was or will be deprecated. Example: 01. 02. 2023
Currently used API shutdown date - The date when the API was or will be shut down. Example: 01. 06. 2023
Breaking change in currently used API - Information about the breaking changes in the API. Example: true, false
Breaking change note - A note about the particular breaking changes in the API. Example: "Endpoint X has removed parameters. More info in https://...."
Latest API available - Whether there is a newer API available. Example: true, false
Latest API version - The version of the latest available API. Example: v2, 2023-01
Latest API's documentation - URL of the latest online documentation of the API. Example: https://service.com/api/v2
Note - Any note that can help with the evaluation of the app's status or the next API checkup. Example: "The latest v2 version is currently in beta."
At Make, we make sure that everything works for our customers. If you do not submit the API checkup form or stop communicating with Make, we will take over the custom app maintainership to ensure that Make users can continue using it.
Every change that has been made in an approved app has to be approved by Make.
Until then, the changes will not be effective in the public version of the app.
Once you decide that the changes you have made so far should also be applied to the public version of your app, you will need to pass an update review.
To do so, open the Review tab in your app and update the form. Enter links to your updated scenarios with new logs, where we can see the functionality of the updated modules, and/or links to new scenarios with logs, where we can see the functionality of your new modules. Then, submit the form by clicking Update review.
Our senior developers will check your changes. The changes can be either approved or rejected.
In case of approval, you will be informed by e-mail that your changes have been approved.
In case of rejection, our senior developer will contact you with further details and/or recommendations for the correct solution.
changes in current connection, modules, RPCs, custom IML functions, webhooks
a new connection, modules, RPCs, custom IML functions, webhooks
new theme and/or logo of the app
In custom apps which haven't been approved by Make, it is not possible to keep track of changes, as there is no Diff tool available.
All changes take effect immediately. Before saving a change, make sure it will not negatively affect existing scenarios.
You must ensure that there are no JavaScript syntax warnings or errors in your app's custom IML functions.
Otherwise, all scenarios that use the app will throw an error message about JavaScript syntax and will be stopped immediately.
This affects all scenarios, even those that run the app's modules which do not contain the faulty custom IML functions.
If you find out that a new API version has been implemented, or that major changes have been made in the current API, you should create a new app.
This way, you will make sure that everything is consistent and that there are no breaking changes made in the current app.
Once a change is saved in an approved app, the app is marked with * next to its name. Also, every module and its blocks where changes are available are marked with the *.
To view what has changed, right-click the item and pick the Show changes option.
A compare view will appear and you'll see the changes that have been made.
Once there is a change saved in an approved app, a new tab Changes will appear. There is a list of all changes available in the particular app.
After clicking a diff log, a page with the particular app block will open, with a dialog informing you about available changes. For a detailed diff file, click the Show diff button.
A compare view will appear and you'll see the changes that have been made.
Since any API endpoint can return an error during the execution of a scenario, Make provides a way to handle such errors. Therefore, all apps in Make, custom ones included, should have error handling implemented. This makes the modules more transparent, as their users know exactly what caused an error. Error handling in a scenario can also be used.
When the service returns an HTTP error, it is not possible to evaluate it as a success.
Example:
You can also customize the error message shown to the user based on the status code. To do that, add the status code to the error
directive and fill it in as follows:
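As a hedged sketch, such a customized error directive could look like the following; the body.error field is illustrative and depends on what the service actually returns:

```json
{
    "response": {
        "error": {
            "message": "[{{statusCode}}] {{body.error}}",
            "404": {
                "message": "Resource not found: {{body.error}}"
            },
            "429": {
                "type": "RateLimitError",
                "message": "Rate limit reached. Please try again later."
            }
        }
    }
}
```

The status-code keys override the default message, while the top-level message acts as the fallback for all other error codes.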
When handling an error, you can specify the type of the error:
UnknownError
RuntimeError
(default)
Primary error type. Execution is interrupted and rolled back.
InconsistencyError
DataError
Incoming data is invalid. If incomplete executions are enabled, execution is interrupted and the state is stored. The user is able to repair the data and resume execution.
RateLimitError
Service responded with rate-limit related error. Applies delay to the next execution of a scenario.
OutOfSpaceError
The user is out of space.
ConnectionError
Connection-related problem. Applies delay to the next execution of a scenario.
InvalidConfigurationError
Configuration-related problem. Deactivates the scenario and notifies the user.
InvalidAccessTokenError
Access token-related problem. Deactivates the scenario and notifies the user.
UnexpectedError
MaxResultsExceededError
IncompleteDataError
Incoming data is incomplete.
DuplicateDataError
Reports the error as a warning and does not interrupt execution. If incomplete executions are enabled, execution is interrupted and the state is stored. The user is able to repair the data and resume execution.
When you make changes in an app, you need to make sure they will not break existing scenarios. Please, check the below list of common breaking changes.
It's important to avoid removing mappable parameters from a module without a clear indication or notification to the user. Even if it doesn't immediately cause a scenario to fail, it could still impact its functionality or disrupt the underlying process. Therefore, it's best to communicate any changes to the user. Also, the user should be able to see and work with the original input in the deprecated parameter/s.
In situations where a mappable parameter needs to be removed, there are several ways to handle the deprecation. The appropriate approach depends on the specific circumstances and how the API manages parameter deprecation in its endpoint. Below, the methods are ranked in order of least to greatest impact.
Adding the [Deprecated]
string into the module's label. The parameter should be put into advanced parameters and the [Deprecated]
string should be attached to the label. Additionally, you can add instructions to the help.
If you need to make sure that the user notices the deprecated parameters, you can use the HTML banner.
If your code with the HTML banner in the label
parameter contains more than 256 characters, implement an RPC that returns the HTML banner instead.
If the called API service is too strict about the use of deprecated parameters, you can throw the error on the app's side.
If you need to make sure that the module is not used anymore and the user acknowledges it at least from the error log, you can throw an error whenever the module is executed.
If you need to deprecate a connection, create a new connection to be used as the functional alternative, and rename the now-deprecated connection so that it contains the (deprecated)
string. Then, do the following:
Remove the current primary connection.
Map the new connection as the primary connection.
Map the deprecated connection as the alternative connection.
Base serves as the repository for all components that are common across all modules and remote procedures. Any elements placed in Base will be inherited by each module and remote procedure.
These components are:
Once the app becomes Approved, the Common Data gets locked and it cannot be changed anymore due to security reasons.
Common Data can be accessed by common.variable
IML expression.
Common data are stored in encrypted form in Make.
As you can see, the secret is defined in the common
data. It can then be used in base
and in all other communication objects inside modules and RPCs. Once the app becomes approved, it will not be possible to change the secret.
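A minimal sketch of this pattern; the domain, header name, and secret value are illustrative, and the two sections are shown side by side here although they live in separate configuration blocks:

```json
{
    "common": {
        "clientSecret": "--my-secret-value--"
    },
    "base": {
        "baseUrl": "https://api.example.com/v1",
        "headers": {
            "x-client-secret": "{{common.clientSecret}}"
        }
    }
}
```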
The Base
section is used for setting up the authorization. Most services require the authorization key to be sent either in headers
or in the qs
(query string). If you set the authorization in the Base
, all modules and RPCs will inherit it.
The most common ways of authorizing are:
Sanitization helps you protect sensitive data (passwords, secret keys, etc.) from leaking.
The changes you make become available only after triggering your scenario in your web browser. This can be done by clicking the Run once button, or by right-clicking a module and selecting the Run this module only option.
It is recommended to create a new version of the app when there are major changes in the current API and/or a new API version is available, and it is not possible to update the current app without breaking changes.
This way, you make sure that everything stays consistent and no breaking changes are introduced into the current app, and at the same time, users will be able to upgrade the modules in their scenarios using our upgrade module tool. Read more about the upgrade module tool in our Make Help Docs:
Base URL is the main URL of a web service, and it should be used by every module and remote procedure in an app, e.g. https://mywebservice.com/api/v1.
There might be situations when you need a variable base URL, e.g. if the web service you are integrating uses multiple domains and you want to let your users access the one they use.
Here is an example of how to handle 2 types of accounts - sandbox
and production
.
First, add a checkbox in your connection parameters, which can be checked when the condition is met.
2. Then, both in the connection and the base, there should be a condition implemented:
3. All modules and remote procedures then can use hard-coded "url": "/uniqueEndpoint".
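The steps above can be sketched as follows. The domains, the sandbox parameter name, and the assumption that the checkbox value is exposed to the base as connection.sandbox are all illustrative; the two sections belong to the connection's Parameters and the app's Base respectively:

```json
{
    "parameters": [
        {
            "name": "sandbox",
            "type": "boolean",
            "label": "Use sandbox environment"
        }
    ],
    "base": {
        "baseUrl": "{{if(connection.sandbox, 'https://sandbox.example.com/api/v1', 'https://api.example.com/v1')}}"
    }
}
```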
Here is an example of how to handle 2 types of accounts - eu
and us
.
First, you need to set up select
in your connection parameters, where you let your users choose from available environments:
2. Then, both in the connection and the base, there should be the environment mapped:
3. All modules and remote procedures then can use hard-coded "url": "/uniqueEndpoint".
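The steps above can be sketched as follows. The domain pattern and parameter name are illustrative, and this assumes the selected value is exposed to the base as connection.environment; the two sections belong to the connection's Parameters and the app's Base respectively:

```json
{
    "parameters": [
        {
            "name": "environment",
            "type": "select",
            "label": "Environment",
            "required": true,
            "options": [
                { "label": "EU", "value": "eu" },
                { "label": "US", "value": "us" }
            ]
        }
    ],
    "base": {
        "baseUrl": "https://{{connection.environment}}.example.com/api/v1"
    }
}
```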
When Make users miss a feature in an app, they can submit a new request to . We recommend you check the list of feature requests for your app regularly and respond to them. You can use the search bar for listing requests which mention your app.
If you need all the changes that you have made so far rolled back, contact us via .
This section explains the work with updates in or apps.
If you need to keep track of your changes and you are not planning to have your app approved by Make, you can export the current version of your app every time you make a change, using extension, and store the files on GitHub or any similar tool.
When the response is returned with a 4** or 5** HTTP status code, it is automatically considered an error. If the error
directive is not specified, the user will see a default message for the returned status code. But you are able to customize the message shown to the user with the error
or error.message
directive.
Some APIs signal an error with a 200 status code and a flag in the body. For this scenario, there is the valid
directive, which tells whether the response is valid or not.
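A minimal sketch of such a valid directive, assuming hypothetical body.success and body.errorMessage fields in the service's response:

```json
{
    "response": {
        "valid": "{{body.success}}",
        "error": {
            "message": "{{body.errorMessage}}"
        }
    }
}
```

When valid evaluates to false, the error directive is applied even though the HTTP status code was 200.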
When you want the Common Data to be available only to the connection (for example for storing OAuth secrets), use instead.
You should always sanitize the log, so no personal tokens and/or keys can leak.
If you don't use sanitization at all, the request and response logs will not be available in the .
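A hedged sketch of a log sanitization block; the sanitized paths are illustrative and depend on where your service actually places sensitive values:

```json
{
    "log": {
        "sanitize": [
            "request.headers.authorization",
            "response.body.access_token"
        ]
    }
}
```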
The process is different between / apps and apps.
Since your app has been approved, every change made to the app is visible only to you unless we commit it. You can safely add and test new functions, and when they are stable, you must request to have them checked and released for users.
This section explains the work with updates in an app.
Once an app is approved by Make, its code is locked and starts to be versioned. When a new change is made in the app's code, Diff files are automatically created, containing detailed information about the changes. Every change made to the app is visible only to you unless we commit it. You can safely add and test new functions, and when they are stable, you must follow our guidelines to have the changes checked and released to users. The Diff files are available in both environments, the web interface, and . Also, you should always make sure the changes will not break existing scenarios.
To make the changes available to all users in Make, you must request . Once approved, the changes will be available to all users. Additionally, you will be able to schedule scenarios with the updated modules that were previously run in "run-once" mode.
If you need to create a new version of your app, create a new app and develop its content. Once your app is finished, request for . During the app's review, do not forget to mention that the app should be compiled as a new version of the current app.
Base
Any changes might break scenarios.
Connection
Changing refresh
call for OAuth connection.
Module's Communication
Changing response.output
.
Changing response.type
.
Adding response.valid
for 2xx
response.
Changing response.trigger
.
Adding condition
.
Adding additional call (chaining request).
Changing linked connection.
Module's Parameters
Changing required
from false
to true
.
Removing a parameter.
Adding validate
.
Setting select parameter mappable
to false
.
Setting select parameter dynamic
to false
.
Webhook's Communication
Any changes might break scenarios.
RPC
Changing parameter required
from false
to true
.
Changing RPCs building dynamic parameters.
Custom Functions
Any changes might break scenarios.
Key
Type
Description
baseUrl
String
If you want to use this base URL in a request, you need to start the URL of an endpoint with /
character.
headers
Object
Default headers that every module will use.
qs
Object
Default query string parameters that every module will use.
body
Object
Default request body that every module will use when issuing a POST or PUT request.
response
Object
Default directives for handling response, such as error handling.
log
Object
Default directive for handling logs, such as sanitization of sensitive data.
OAuth 1 Parameters Specification
Collection of directives containing parameters for the OAuth 1 protocol.
Connection is a link between Make and 3rd party service/app. OAuth 1.0 connection handles the token exchange automatically.
aws
directive is not available
Communication is extended with oauth
pagination
directive is not available
response.limit
is not available
response.iterate
directive is not available
response.output
is not available
response
is extended with data
Generating an OAuth 1.0 Authorization header manually can be tedious and error-prone, so we provide a helper directive that simplifies this task. Below are all the properties you can use to customize the header generation.
Key
Type
Description
consumer_key
IML String
Your consumer key
consumer_secret
IML String
Your consumer secret
private_key
IML String
Instead of consumer_secret
you can specify a private_key
string in PEM format
token
IML String
An expression that parses the oauth_token
string.
token_secret
IML String
An expression that parses the oauth_token_secret
string.
verifier
IML String
An expression that parses the oauth_verifier
string.
signature_method
String
Specifies the desired method to use when calculating the signature. Can be either HMAC-SHA1
, RSA-SHA1
, PLAINTEXT
. Default is HMAC-SHA1
.
transport_method
String
Specifies how OAuth parameters are sent: via query params, header or in a POST body. Can be either query
, body
or header
. Default is header.
body_hash
IML String
To use Request Body Hash, you can either manually generate it, or you can set this directive to true
and the body hash will be generated automatically
The data
directive saves data to the connection so that it can be later accessed from a module through the connection
variable. It functions similarly to the temp
directive, except that data
is persisted to the connection.
Example:
This accessToken
can be later accessed in any module that uses this connection like so:
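As a hedged sketch of both pieces (the oauth_token field and the header name are illustrative), the access token could be stored in the accessToken response of the connection:

```json
{
    "response": {
        "data": {
            "accessToken": "{{body.oauth_token}}"
        }
    }
}
```

and then referenced in any module using this connection:

```json
{
    "headers": {
        "x-access-token": "{{connection.accessToken}}"
    }
}
```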
Parameters the user needs to provide when setting up a new connection.
Default scope for every new connection.
Collection of available scopes.
Non-user-specific sensitive values like salts or secrets.
OAuth 1.0 authentication process consists of multiple steps. You are able to select the steps you need and ignore the steps that you don’t - just fill in the needed sections and delete unneeded ones.
Key
Type
Description
OAuth 1 Parameters Specification
Allows you to specify special OAuth 1.0 properties to simplify OAuth 1.0 header generation.
requestToken
Request Specification
Describes a request that retrieves the request token
authorize
Request Specification
Describes authorization process.
accessToken
Request Specification
Describes a request that exchanges credentials and the request token for the access token.
info
Request Specification
Describes a request that validates a connection. The most common way to validate the connection is to call a method to get user’s information. Most of the APIs have such a method.
When using an OAuth 1.0 connection there is a special object available globally: the oauth
object. You can use it in connection specification as well as in module specification to avoid generating the OAuth 1.0 header yourself. This object is available at the root of the connection specification, in the Base and in Request Specification
If the oauth
object is present in the root of the connection specification, it will be merged with each of the directives described above. If you wish to override some properties of the root object, you can do so in the respective directive by specifying the oauth
object and overriding the properties.\
These IML variables are available for you to use everywhere in this module:
now
- Current date and time
environment
- TBD
temp
- Contains custom variables created via temp
directive.
parameters
- Contains connection’s input parameters.
common
- Contains connection’s common data collection.
data
- Contains connection's data collection.
oauth.scope
- Contains an array of scope required to be passed to OAuth 1.0 authorization process.
oauth.redirectUri
- Contains redirect URL for OAuth 1.0 authorization process.
The Attach remote procedure is used to automatically attach a webhook to a remote service. Please note that you will need to detach this webhook later, and for that you will need the remote webhook's id.
To save the remote webhook id (in order for detach to work), you must use the response.data
collection. This collection will be available in the detach webhook remote procedure as webhook
IML variable for you to use.
The webhook
collection with webhook’s data is also accessible in regular remote procedure calls if the call is processed in the context of an Instant Trigger. For example, if you create a dynamic interface for an Instant Trigger based on parameters entered when the webhook was created.
The Detach remote procedure is used to automatically detach (delete) a webhook from a remote service when it is no longer needed. The only thing you have to do is correctly specify the URL to detach the webhook. No response processing is needed.
The newly created URL address then has to be registered manually by the user. Basically, the user copies the URL address and pastes it into the webhook settings of the web service.
Unless there is a reason (read below), the connection should not be connected to the webhook.
Connection is a link between Make and 3rd party service/app. OAuth 2.0 connection handles the token exchange automatically.
aws
directive is not available
pagination
directive is not available
response.limit
is not available
response.iterate
directive is not available
response.output
is not available
response
is extended with data
response
is extended with expires
The data
directive saves data to the connection so that it can be later accessed from a module through the connection
variable. It functions similarly to the temp
directive, except that data
is persisted to the connection.
Example:
This accessToken
can be later accessed in any module that uses this connection like so:
The expires
directive says when the refresh token (or the whole connection, when there is no refresh token) will expire. Don't confuse this with response.data.expires
, which tells you when the current access token will need to be refreshed. When the expires
period is overdue, the connection needs to be reauthorized manually. This can be done either from a scenario or from the "Connections" tab.
Example:
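A hedged sketch showing the two expires directives together, assuming the token endpoint returns hypothetical access_token and expires_in fields and using an illustrative 90-day refresh token lifetime:

```json
{
    "response": {
        "expires": "{{addDays(now, 90)}}",
        "data": {
            "accessToken": "{{body.access_token}}",
            "expires": "{{addSeconds(now, body.expires_in)}}"
        }
    }
}
```

Here response.expires governs when the whole connection must be reauthorized, while response.data.expires governs when the access token is refreshed.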
Parameters that the user should fill while creating a new connection.
Default scope for every new connection.
Collection of available scopes.
Non-user-specific sensitive values like salts or secrets.
OAuth 2.0 authentication process consists of multiple steps. You are able to select the steps you need and ignore the steps that you don’t - just fill in the needed sections and delete unneeded ones.
Each section is responsible for executing its part in the OAuth 2.0 flow.
In short, you can describe the initial OAuth 2.0 flow as follows:
with preauthorize
and info
sections being optional, and refresh
and invalidate
not being a part of the initial OAuth 2.0 flow.
If the authorize
directive isn't used, the condition
in the token
directive has to be set to true
. Otherwise, the token directive will not be successfully triggered.
These IML variables are available for you to use everywhere in this module:
Shared webhooks are used when the service sends all the notifications for all the users to only one registered URL.
When the service sends all the notifications to only one webhook URL but the webhook has to be registered under a user account, that's not a shared webhook.
A shared webhook should be registered by the developer of the app. All notifications from the service for all users will be sent to Make by calling this URL, which is generated when the shared webhook is created. On Make's end, the corresponding user account will be matched.
Since the webhook is shared among multiple users, incoming events need to be matched with their owners and delivered correctly. To do this, you need to work with the uid parameter, which can be obtained in the connection.
Consider this to be the base:
In a module, you need to add a custom header programmatically.
That results in the base being overwritten by the result from the IML function. To merge both collections, use this special IML syntax inside the module:
Changing type
(if the original type can be coerced to the new type, e.g. number -> text, it's fine). See .
The new URL address, which has been created, is automatically registered to the service using procedure, and can be unregistered using procedure.
Please refer to to get more information about writing RPCs.
The added value of a not attached webhook is the existence of an and the fact that the user is notified there is an instant trigger available.
https://www.make.com/oauth/cb/app
as a callback URL together with oauth.makeRedirectUri
, or:
https://www.make.com/oauth/cb/app
as a callback URL together with oauth.localRedirectUri
, if you are going to request approval of your app, or:
https://www.integromat.com/oauth/cb/app
as an old callback URL together with oauth.redirectUri
.
In order to have the shared webhook listening to incoming traffic, you must publish your app. Before doing so, please, read this .
If you still need to implement attach and detach directives, check .
Key
Type
Description
preauthorize
Request Specification
Describes a request that should be executed prior to authorize
directive.
authorize
Request Specification
Describes authorization process.
token
Request Specification
Describes a request that exchanges credentials for tokens.
info
Request Specification
Describes a request that validates a connection. The most common way to validate the connection is to call an API’s method to get user’s information. Most of the APIs have such a method.
info
directive can be used to store account's metadata.
refresh
Request Specification
Describes a request that refreshes an access token.
invalidate
Request Specification
Describes a request that invalidates acquired access token.
now
Current date and time.
environment
TBD
temp
Contains custom variables created via temp
directive.
parameters
Contains connection’s input parameters.
common
Contains connection’s common data collection.
data
Contains connection’s data collection.
oauth.scope
Contains an array of scope required to be passed to OAuth 2.0 authorization process.
oauth.redirectUri
Contains redirect URL for OAuth 2.0 authorization process in this format: https://www.integromat.com/oauth/cb/app
oauth.localRedirectUri
Contains redirect URL for OAuth 2.0 authorization process in this format: https://www.make.com/oauth/cb/app
or this format:
https://www.private-instance.com/oauth/cb/app
oauth.makeRedirectUri
Contains redirect URL for OAuth 2.0 authorization process in this format: https://www.make.com/oauth/cb/app
Webhooks power up Instant Triggers, which execute the flow immediately after the remote server sends data.
To use webhooks effectively, you must always create an Instant Trigger and link it to a webhook.
Specifies how to get data from the payload and how to reply to a remote server.
Note: If the webhook returns multiple items in one batch, you might need to use the iterate
directive to specify which items to output. Then you might want to specify the output
directive to map items to output. If you do not specify the output
directive, items will be returned as-is.
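A minimal sketch of these two directives, assuming a hypothetical payload where the items arrive as an array in body.events:

```json
{
    "iterate": "{{body.events}}",
    "output": {
        "id": "{{item.id}}",
        "type": "{{item.type}}",
        "payload": "{{item.data}}"
    }
}
```

Each element of body.events becomes one output bundle, shaped by the output mapping.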
Key
Type
Description
Response Specification
Specifies how to respond to the remote server
Verification Specification
Specifies how to reply to the remote server, if it needs a confirmation
IML String or Iterate Specification
Specifies how response items (in case of multiple) are retrieved and processed.
Any IML Type
Describes structure of the output bundle.
IML String or Boolean
Determines whether the current request is executed or skipped.
IML String
Specifies how to get the user ID from the request body.
Required: no
This directive lets you customize Make's response to the webhook or a verification request.
type
IML String
no
Specifies how to encode data into body.
Default: json
.
Available values: json
, urlencoded
, text
.
status
IML String
no
Specifies the HTTP status code that will be returned with the response.
headers
IML Flat Object
no
Specifies custom headers that are to be sent with the response.
body
Any IML Type
no
Specifies the response body.
Required: no
This directive allows you to reply to webhook verification requests. Some systems will not allow you to create webhooks prior to verifying that the remote side (in this case, Make) is prepared to handle them. Such systems may send a code and request that Make return it, possibly together with some other value. In such cases, this directive will help you.
Key
Type
Description
IML String
Specifies how data are serialized into body.
IML String
Specifies the response status code.
Example:
Required: no
Default: true
This directive distinguishes normal webhook requests from verification requests. Usually, the remote service will send some kind of code to verify that Make is capable of receiving data. In that case, you may want to check for the existence of this code variable in the request body. If it exists, the request is a verification request. Otherwise, it may be a normal webhook request with data.
Required: no
Required: only in shared webhooks
Specifies how to get the user ID from the request body. This value is then used to search for the recipient of the message in the database of connections. Don't forget to specify the uid
parameter in the connection definition.
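A minimal sketch of such a uid directive, assuming a hypothetical body.userId field in the incoming payload:

```json
{
    "uid": "{{body.userId}}"
}
```

The same identifier must be stored as uid in the connection's response so that Make can match incoming events to the right user.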
These IML variables are available for you to use everywhere in a webhook:
now
- Current date and time
environment
- TBD
parameters
- Contains webhook’s input parameters.
data
- Alias for parameters.
body
- Contains the body of an incoming webhook.
query
- Contains query string parameters of an incoming webhook.
method
- Contains HTTP method of an incoming webhook.
headers
- Contains headers of an incoming webhook.
Shared webhooks are used when the service sends all the notifications for all the users to only one registered URL.
Unlike a shared webhook, a dedicated webhook is directly linked to the user account. Only notifications for the specific user are received.
There are six basic types of modules:
APIs usually employ some sort of authentication/authorization to limit access to their endpoints.
The Make platform provides you with a list of the most used connection types, each with prefilled code. You only need to edit it according to your and/or the API's needs.
Mostly, you will need to change the URLs and the names of the parameters.
When you're using an OAuth connection, an ID and a secret are generated for your client. You should store them in the common data inside the connection.
Make sure nobody else knows the client secret; otherwise, your app can become vulnerable.
Once the app becomes Approved, the Common Data gets locked and it can't be changed anymore due to security reasons.
Inside the connection, common data can be accessed by common.variable
IML expression.
Common data are stored in encrypted form in Make.
Reserved words are variables used internally by the Make platform. Using a reserved word for the parameter name
key can lead to unexpected results. Avoid using a reserved word if you don't have a clear intention of why you want to use it.
Make reserved words are:
accountName
: name of the connection used by the app module,
teamID
: ID of the team to which the active user is currently assigned.
If you use a Make reserved word for the name
key of a parameter, the value stored in the internal Make parameter will be used by your parameter too.
Consider the following configuration of a connection. The parameter labeled Account Name
has its name
key set to the reserved word accountName
.
The setting above leads to mirroring the value from the default Connection name
parameter into a parameter labeled Account name
. The value accountName
is set by Make to the name of the created connection.
Connection is a link between Make and 3rd party service/app.
aws
directive is not available
Only a single request can be performed
pagination
directive is not available
response.limit
is not available
response.iterate
directive is not available
response.output
is not available
response
is extended with data
, uid
and metadata
The data
directive saves data to the connection so that it can be later accessed from a module through the connection
variable. It functions similarly to the temp
directive, except that data
is persisted in the connection.
Example:
This accessToken
can be later accessed in any module that uses this connection like so:
The metadata
directive allows you to save the user’s name or username (or any other text field) so that multiple connections of the same type could be easily recognized. A common practice is to save either username or email or full name to metadata.
The metadata object has 2 properties: value
and type
. value
is used to store the value and type
is used to specify what the value is. Currently, there are only 2 types: email
and text
.
Example:
Example:
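A hedged sketch of a metadata directive, assuming the validation endpoint returns the user's e-mail in a hypothetical body.email field:

```json
{
    "response": {
        "metadata": {
            "value": "{{body.email}}",
            "type": "email"
        }
    }
}
```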
Parameters that the user should fill while creating a new connection.
Non-user-specific sensitive values like salts or secrets.
These IML variables are available for you to use everywhere in this module:
now
- Current date and time.
environment
- TBD
temp
- Contains custom variables created via temp
directive.
parameters
- Contains the connection’s input parameters.
common
- Contains connection’s common data collection.
data
- Contains connection's data collection.
oauth.scope
- Contains an array of scope required to be passed to OAuth 2.0 authorization process.
oauth.redirectUri
- Contains redirect URL for OAuth 2.0 authorization process.
Here you can see an example of an API key-based connection.
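As a hedged sketch of such a connection's validation request (example.com, the x-api-key header, and the apiKey parameter name are all illustrative):

```json
{
    "url": "https://api.example.com/v1/users/me",
    "headers": {
        "x-api-key": "{{parameters.apiKey}}"
    },
    "response": {
        "error": {
            "message": "[{{statusCode}}] {{body.message}}"
        }
    },
    "log": {
        "sanitize": ["request.headers.x-api-key"]
    }
}
```

The request succeeds only when the supplied key is valid, which is what validates the connection; the sanitize directive keeps the key out of the logs.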
The Search module is a module that makes a request (or several) and returns multiple results. It doesn't have state or any complex internal logic.
Use this module when you need to allow the user to search for items or simply return multiple items.
If API supports pagination, you can implement it by using pagination
directive.
You can use static parameters inside the Search module without any restrictions.
You can use mappable parameters inside the Search module without any restrictions.
Unlike the Action module, the Search module can return multiple bundles at once.
To help the users with setting up your module, you can provide samples to it.
When using an OAuth type of connection, use the Scope to define scopes required by this module.
The IML variables are variables that you are able to use in IML expressions.
These IML variables are available for you to use everywhere in this module:
now
- Current date and time.
environment
- TBD
temp
- Contains custom variables created via temp
directive
parameters
- Contains module’s input parameters.
connection
- Contains connection’s data collection.
common
- Contains app’s common data collection.
data
- Contains module’s data collection.
scenario
- TBD
metadata.expect
- Contains module’s raw parameters array the way you have specified it in the configuration.
metadata.interface
- Contains module’s raw interface array the way you have specified it in the configuration.
Additional variables available to Response Object:
output
- When using the wrapper
directive, the output
variable represents the result of the output
directive
limit - When you use a limit, the process of retrieving items stops once the requested number of items has been obtained or a page doesn't contain any items. The module returns at most the specified number of items.
iterate - Iterates the array in the response into items.
Additional variables available after using the iterate
directive, i.e. in wrapper
or pagination
directives:
iterate.container.first
- Represents the first item of the array you iterated
iterate.container.last
- Represents the last item of the array you iterated
Additional variables available after using the iterate
directive, i.e. in wrapper
or pagination
directives:
body
- Contains the body that was retrieved from the last request.
headers
- Contains the response headers that were retrieved from the last request.
item
- When iterating this variable represents the current item that is being iterated.
This section describes how to generate a JWT in Make.
To generate the token, you can use the following example.
As you can see in the example, we first build the JWT payload inside the temp
variable called jwt
(you can use any other name for that variable).
Then, inside the Authorization header, we call the IML function named jwt
. The jwt
function accepts four parameters:
The payload to be signed.
The secret used to sign the payload.
The algorithm. The supported algorithms are HS256
, HS384
, HS512
, and RS256
. The default value is HS256
. This parameter is optional.
A custom header to customize the JWT authorization header. This parameter is optional.
This function outputs a JWT, which you can use in the Authorization header.
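A minimal sketch of such a request, assuming a hypothetical endpoint and illustrative claim and connection field names (clientId, userId, secret):

```json
{
    "url": "/protected/resource",
    "method": "GET",
    "temp": {
        "jwt": {
            "iss": "{{connection.clientId}}",
            "sub": "{{connection.userId}}"
        }
    },
    "headers": {
        "authorization": "Bearer {{jwt(temp.jwt, connection.secret, 'HS256')}}"
    }
}
```

Here the third argument explicitly selects HS256; since that is also the default, it could be omitted.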
Unlike a shared webhook, a dedicated webhook is directly linked to the user account; only notifications for that specific user are received.
Even when the service sends all data to only one URL registered for the user, it is still a dedicated webhook; it is up to you how the app handles the incoming data. Over 90% of services use dedicated webhooks, so think twice before using a shared one.
The newly created URL address is automatically registered to the service using the attach procedure and can be unregistered using the detach procedure.
The newly created URL address has to be registered manually by the user: the user copies the URL address and pastes it into the webhook settings of the web service.
The Trigger module is a special module that saves information about the last processed item and continues the execution from that item, if there is one.
You can configure the trigger module to:
Process all available items and wait for new ones, without repeated processing of the old item.
Process items starting from a specific date and time.
Process items starting with a specific item.
Use this module when you need to process items sequentially in the order they were created/updated.
The Communication response is extended with the trigger object.
The trigger collection specifies directives that control how the trigger works and how your data will be processed.
Required: yes
Values: id or date
The trigger.type directive specifies how the trigger will sort and iterate through items.
If the processed item has a create/update date, then date should be used as the value, and a correct getter should be specified in the trigger.date directive. The trigger will then sort all items by their date and id fields and return only unprocessed items.
If the processed item does not have a create/update date, but only an id, then id should be used as the value, and a correct getter should be specified in the trigger.id directive.
Required: yes
Values: asc, desc, or unordered
The trigger.order directive specifies in what order the remote API returns items - descending, ascending, or unordered. This information is needed to correctly determine whether there are more pages to fetch, and to sort the incoming items so they are displayed to the user in ascending order.
If the API returns items in ascending order (low to high), use asc. If it returns items in descending order (high to low), use desc. If it returns items in no apparent order, use unordered.
Required: yes
This directive specifies the item’s id. It must always be present. For example, if your item looks like this:
then you should specify your trigger.id directive like this: {{item.id}}
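For illustration, assuming the API returns items shaped like this hypothetical record:

```json
{
    "id": 123,
    "name": "Some item",
    "created_date": "2023-01-01T00:00:00Z"
}
```

the {{item.id}} getter extracts 123 from each iterated item.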
Required: yes, if the trigger type is date
This directive specifies the item's date. It must be specified when trigger.type is set to date. Note that trigger.id must always be specified as well.
For example, if your item looks like this:
Then you should specify your trigger.date directive like this: {{item.created_date}}, and your trigger collection might look something like this:
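For example, combining the directives above, a trigger collection for an API that returns items newest-first might be sketched as (the order value depends on your API):

```json
{
    "trigger": {
        "type": "date",
        "order": "desc",
        "id": "{{item.id}}",
        "date": "{{item.created_date}}"
    }
}
```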
The Epoch panel is a specific component of the trigger allowing a user to choose the starting item.
The Trigger module can only have static parameters. There's no reason to have anything mappable in the Trigger as this module is always the first module in the scenario.
The Trigger module can return multiple bundles at once.
To help users set up your module, you can provide samples for it.
When using an OAuth type of connection, use the Scope to define scopes required by this trigger.
These IML variables are available for you to use everywhere in this module:
now - Current date and time.
environment - TBD
temp - Contains custom variables created via the temp directive.
parameters - Contains the module's input parameters.
connection - Contains the connection's data collection.
common - Contains the app's common data collection.
data - Contains the module's data collection.
data.lastDate - Returns the date from the last retrieved item in a previous execution.
data.lastId - Returns the ID of the last retrieved item in a previous execution.
scenario - TBD
metadata.expect - Contains the module's raw parameters array, the way you specified it in the configuration.
metadata.interface - Contains the module's raw interface array, the way you specified it in the configuration.
Additional variables available to Response Object:
output - When using the wrapper directive, the output variable represents the result of the output directive.
Additional variables available after using the iterate directive, i.e. in the wrapper or pagination directives:
iterate.container.first - Represents the first item of the array you iterated.
iterate.container.last - Represents the last item of the array you iterated.
Additional variables available to pagination and response objects:
body - Contains the body that was retrieved from the last request.
headers - Contains the response headers that were retrieved from the last request.
item - When iterating, this variable represents the current item being iterated.
Optionally, you can define the module's action to take advantage of additional features; read more below.
Used for modules that create an object. Most of the time, these modules use a POST request.
Used for modules that retrieve an object. Most of the time, these modules use a GET request.
Module: Get a Contact
Used for modules that update an object. Most of the time, these modules use a PATCH or PUT request.
Module: Update a Contact
Used for modules that delete an object. Most of the time, these modules use a DELETE request.
Module: Delete a Contact
Communication
Mappable parameters
Communication
Mappable parameters
The "url" in the Communication contains the API version.
The "help" shows a misleading example: the base URL should end without a slash and version, while the example should start with a slash and version.
When your app requires specifying scopes to access different groups of endpoints, you need to tweak the connection code a bit to make it work correctly with the Universal Module. Here's how:
Add a new advanced parameter called scopes
to the connection parameters.
In the authorize
part of the connection, merge the original scopes with additional scopes added by the parameter from the previous step.
Now when you want to use Universal Module with scopes that have not been granted to the connection previously, you can create a new connection and request those additional scopes manually.
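A sketch of both steps, with illustrative names; ifempty, split, merge, and join are IML functions, and oauth.scope is assumed to hold the scopes already defined for the connection:

```json
[
    {
        "name": "scopes",
        "type": "text",
        "label": "Scopes",
        "help": "Comma-separated list of additional scopes.",
        "advanced": true
    }
]
```

And in the authorize part of the connection:

```json
{
    "authorize": {
        "qs": {
            "scope": "{{join(merge(oauth.scope, split(ifempty(parameters.scopes, ''), ',')), ' ')}}"
        }
    }
}
```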
This directive is exactly the same as the respond directive, except that it is nested in verification. The behavior of verification.respond is the same as that of a normal respond.
Properties of the iterate directive are described in the Communication docs.
Properties of the output directive are described in the Communication docs.
Properties of the condition directive are described in the Communication docs.
Use if the API endpoint returns a single response. Examples are Create a Book, Delete a Book, or Get a Book.
Use if the API endpoint returns multiple items. An example is List Books, which finds specific books according to search criteria.
Use if you wish to watch for any changes in your app/service. An example is Watch a New Book, which is triggered whenever a new book is added to the library.
Use if the API endpoint has a webhook available (dedicated or shared). An example is Watch a New Event.
Use if you want to enable users to perform an arbitrary API call to the service. Examples are Make an API Call and Execute a GraphQL Query.
Use if you need to send processed data back to a webhook.
When you need the common data to be available to all modules, use instead.
This directive allows you to save the user’s remote service Id. This is required when using .
There's no dedicated JWT connection type because the JWT itself is only a "special format" of the Authorization header. Everything works the same way as described in chapter.
The options argument refers to the same options object as in the .
Module: Create a Contact
Module: Create a Contact
There are two types of responsiveness - synchronous and asynchronous. Read more about it in .
If you happen to receive the error "Invalid module output. Expected Object, but found Array.", it means that your module should be of type Search. The Search type expects an output of type array and, unlike the Action type, supports the pagination directive.
When a module is type Update, a new keyword appears inside Make - .
There are two types of update approaches - partial and full. Read more about it in .
Key | Type | Description
type | date or id | Specifies how the trigger will behave and sort items
order | asc or desc | Specifies in what order the remote API returns items
id | IML String | Must return the current item's id
date | IML String | When used, must return the current item's date
An Instant Trigger is a trigger that is executed immediately when the data arrives in Make.
Communication is optional in the Instant Trigger. It can be used for retrieving additional data.
The iterate directive is not available.
The pagination directive is not available.
Only a single request can be performed.
If you need to retrieve additional data for each bundle, you can describe a request to execute for each bundle of the webhook:
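For instance, assuming a hypothetical API where the webhook payload carries only an id, the communication might be sketched as:

```json
{
    "url": "/items/{{payload.id}}",
    "method": "GET",
    "response": {
        "output": "{{body}}"
    }
}
```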
The Instant Trigger module can only have static parameters. There's no reason to have anything mappable in the Instant Trigger as this module is always the first module in the scenario.
Exactly one bundle is generated with each incoming webhook.
To help users set up your module, you can provide samples for it.
These IML variables are available for you to use everywhere in this module:
now - Current date and time.
environment - TBD
temp - Contains custom variables created via the temp directive.
parameters - Contains the module's input parameters.
connection - Contains the connection's data collection.
common - Contains the app's common data collection.
data - Contains the module's data collection.
scenario - TBD
metadata.expect - Contains the module's raw parameters array, the way you specified it in the configuration.
metadata.interface - Contains the module's raw interface array, the way you specified it in the configuration.
Additional variables available to Response Object:
output - When using the wrapper directive, the output variable represents the result of the output directive.
Additional variables available after using the iterate directive, i.e. in the wrapper or pagination directives:
iterate.container.first - Represents the first item of the array you iterated.
iterate.container.last - Represents the last item of the array you iterated.
Additional variables available to Pagination and Response Objects:
body - Contains the body that was retrieved from the last request.
headers - Contains the response headers that were retrieved from the last request.
item - When iterating, this variable represents the current item being iterated.
Additional variables available in the Instant Trigger:
payload - This variable represents the current webhook item that is being processed.
The Universal Module can be used to perform an arbitrary API call to the service's API. It allows the user to specify all parameters of the request while using the app's connection.
Every app that uses an API should have a Universal Module. Each app can have at most one universal module.
Security notice
As the Universal Module allows the user to specify the target URL, it is critical that the Universal Module uses a relative path. Otherwise, a user could point the request at their own server and gain access to the access tokens. Therefore, always use a fixed base URL in this kind of module.
A Universal Module which doesn't meet this condition won't be approved by Make for use in scenarios.
There are two types of universal module available. Choose one depending on the API you use:
When a universal module is used in a scenario, it is recommended to use it together with the JSON > Create JSON module. Not only is it much easier to create the JSON structure for the universal module, but all characters which are part of the JSON definition and should be treated as letters are escaped.
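A minimal sketch of a Universal Module communication with a fixed base URL (api.example.com is a placeholder):

```json
{
    "url": "https://api.example.com{{parameters.url}}",
    "method": "{{parameters.method}}",
    "headers": "{{parameters.headers}}",
    "qs": "{{parameters.qs}}",
    "body": "{{parameters.body}}",
    "response": {
        "output": {
            "statusCode": "{{statusCode}}",
            "headers": "{{headers}}",
            "body": "{{body}}"
        }
    }
}
```

Because the base URL is hard-coded, {{parameters.url}} can only select a relative path within the service's API.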
RPCs can be used in multiple ways; the number is limited only by your imagination. In this chapter, we describe the most common uses of RPCs.
The most common use is replacing a select parameter with static options by a select parameter with dynamic options. Thanks to RPCs, we are able to retrieve a list of all available options right inside the parameter, tailored to the current user.
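As a sketch, a select parameter can point its options at an RPC (all names are illustrative):

```json
{
    "name": "contactId",
    "type": "select",
    "label": "Contact",
    "options": "rpc://listContacts"
}
```

The referenced RPC then maps each record to a label/value pair:

```json
{
    "url": "/contacts",
    "method": "GET",
    "response": {
        "iterate": "{{body.contacts}}",
        "output": {
            "label": "{{item.name}}",
            "value": "{{item.id}}"
        }
    }
}
```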
Some web services allow users to define their own data structure. In that case, the mappable parameters need to be semi-dynamic, e.g. to support custom or dynamic fields.
The purpose of this RPC is to retrieve sample data for a module dynamically. It replaces hard-coded samples, which might quickly become outdated.
The Remote Procedure Call, RPC for short, is a function call which fetches additional data inside a module. A user cannot select it or invoke it from other modules.
As we can't wait indefinitely for the RPC's output, the following limits apply:

Limit | Unit | Value
Max Execution Timeout | seconds | 40
Request Count | calls performed by the RPC | 3
Record Count | paginated records | 3 * number of objects per page

Since we can't wait forever for the RPC's response while the parameters of the module are loading, there are some best practices you should know.
The Responder module is used for sending a response to the sender of a webhook.
The Responder should be used when you need to send processed data back to the service. The scenario is initiated by an Instant Trigger, processes the received data, and then sends the results back to the sender. The Responder module has no interface; you just pass parameters in.
Only a response directive is available inside the communication.
You can use static parameters inside the Responder module without any restrictions.
You can use mappable parameters inside the Responder module without any restrictions.
The Communication response is extended with the wrapper object.
limit is not available in the response, as the result of an action should always be only one bundle.
Required: no
Default: output
This directive lets you post-process the module output before returning it to the user. The output of the module is available to you as the output context variable - the result of processing the output directive. When used, the value of the wrapper directive becomes the final output of the module. This directive is executed only once, at the end of the processing chain; there are no more directives or transformations after it.
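A sketch of a response using wrapper to wrap the processed output in an envelope (field names are illustrative):

```json
{
    "response": {
        "output": "{{body.data}}",
        "wrapper": {
            "result": "{{output}}"
        }
    }
}
```

The module's final output is the wrapper value, with {{output}} holding the result of the output directive.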
You can use static parameters inside the Action module without any restrictions.
You can use mappable parameters inside the Action module without any restrictions.
Remember that the Action module should always output only one bundle.
To help users set up your module, you can provide samples for it.
When using an OAuth type of connection, use the Scope to define scopes required by this action.
These IML variables are available for you to use everywhere in this module:
now - Current date and time.
environment - TBD
temp - Contains custom variables created via the temp directive.
parameters - Contains the module's input parameters.
connection - Contains the connection's data collection.
common - Contains the app's common data collection.
data - Contains the module's data collection.
scenario - TBD
metadata.expect - Contains the module's raw parameters array, the way you specified it in the configuration.
metadata.interface - Contains the module's raw interface array, the way you specified it in the configuration.
Additional variables available to Response Object:
output - When using the wrapper directive, the output variable represents the result of the output directive.
Additional variables available after using the iterate directive, i.e. in the wrapper or pagination directives:
iterate.container.first - Represents the first item of the array you iterated.
iterate.container.last - Represents the last item of the array you iterated.
Additional variables available to Pagination and Response Objects:
body - Contains the body that was retrieved from the last request.
headers - Contains the response headers that were retrieved from the last request.
item - When iterating, this variable represents the current item being iterated.
There is nothing to configure in this module except the interface. The data processing is handled by a selected .
Communication can be .
Same as the modules, you can use in RPCs to iterate over the records.
Components of the Universal Module are the same as for the .
You can use all of the IML variables available in in the universal module, except for the iterate
directive.
RPCs have specific output rules, so take a look at before the implementation.
Communication can be .