In this tutorial you will learn how to ingest data from a Mendix application. In the following steps you will configure the Mendix Data Loader, a Snowflake application deployed in your Snowflake environment that ingests your Mendix data: it retrieves the data structure of your Mendix application and dynamically creates the transient target tables for the ingestion.
Mendix stands as a leading low-code platform for developing enterprise-grade applications, offering unmatched speed, flexibility, and scalability. Mendix's seamless integration with Snowflake's enterprise data environment makes it an essential tool for building robust, data-driven applications. The platform's intuitive visual development environment accelerates the creation of complex applications, significantly reducing development time while maintaining high standards of quality and performance.
Furthermore, Mendix offers extensive customization through its rich ecosystem of components, including marketplace offerings that facilitate direct integration with Snowflake. This allows Mendix applications to easily connect, query, and visualize Snowflake data, unlocking deeper insights and driving informed decision-making across the organization.
With Mendix, data engineers can focus on what truly matters—maximizing the power of their data within Snowflake—while relying on a platform that ensures enterprise-level security, compliance, and scalability.
1. Open the downloaded SFShowcase.mpk app package in Mendix Studio Pro; a window prompt should appear.
2. Navigate to Version Control, then click Upload to Version Control Server... and confirm by clicking OK. A window titled Upload App to Team Server should appear.
3. Click Publish. Your application is published.
4. Click View App to see the login screen for your Mendix application.
5. Log in as demo_user. To retrieve its password, navigate in Mendix Studio Pro to App 'SFShowcase' -> Security -> Demo users -> demo_user and then click the link that reads Copy password to clipboard.
6. Save the base URL of your application, e.g. https://sfshowcase101-sandbox.mxapps.io/ if your endpoint is https://sfshowcase101-sandbox.mxapps.io/login.html?profile=Responsive; it is referred to later as {{YOUR_SAVED_ENDPOINT}}.
The application you just downloaded, uploaded to Mendix's version control server, and deployed on a free cloud sandbox environment is a free application available on the Mendix Marketplace. Its purpose is to help and inspire users in tackling integration with Snowflake from the Mendix side. The application comes with some pre-installed operational data, movies and their corresponding reviews, to showcase the Mendix Data Loader.
Click Close, then in Snowsight navigate to Data Products -> Apps -> Mendix Data Loader; a documentation page titled Mendix Data Loader should appear.

The Mendix application has a published OData service that exposes the application data for the entities (class definitions) captioned Movie and Review, which are linked to one another through an association. The OData resource for this application can be found along the following path: Showcase_DataLoader -> Resources -> Published OData -> POS_Movies.
In the OData resource, the General tab contains information about the exposed service and the entities exposed in it. Each entity has an endpoint from which its values can be retrieved after authentication. In the Settings tab, the metadata endpoint contains information about the exposed data structure of the OData resource (typically the service root followed by $metadata). Additional endpoints are exposed for each exposed set configured in the General tab.
The Mendix Data Loader retrieves the exposed data structure from the metadata endpoint, then provisions the target schema with transient tables. Next, it retrieves the data from the service feed of each exposed entity found in the metadata. The Mendix Data Loader is built for the extraction and loading of Mendix data; any data transformation and integration should be performed outside the scope of the Mendix Data Loader's objects.
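As a rough sketch of what this provisioning amounts to: transient tables are regular Snowflake tables without a Fail-safe period, which suits data that can simply be re-ingested. The table and column names below are hypothetical; the real definitions are derived from the OData metadata, and MOVIE_DB/MOVIE_APP are the target database and schema you configure later in this tutorial.

```sql
-- Hypothetical sketch only: the loader derives the actual table and column
-- definitions from the OData metadata document.
CREATE TRANSIENT TABLE IF NOT EXISTS MOVIE_DB.MOVIE_APP.MOVIE (
    MOVIE_ID     NUMBER,
    NAME         VARCHAR,
    RELEASE_YEAR NUMBER
);
```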
All the exposed data will be ingested into Snowflake. To retrieve only a subset of the exposed data, you can use the filter query option (the $filter query parameter); for more information, refer to OData's Basic Tutorial.
Should any data reside in the specified database and schema from prior ingestion jobs, that data will be lost. When ingesting from multiple sources, we recommend using the same database with a different schema for each source.
To start the application, click the MENDIX_DATA_LOADER hyperlink in the header. Upon starting, a documentation page that includes usage instructions is displayed.
The Mendix Data Loader requires the CREATE DATABASE and EXECUTE TASK privileges to create the target database where the ingested data will be stored and to execute the tasks used for scheduling ingestion jobs. To that end, a modal window will request you to grant the application these privileges; click Grant Privileges to accept the request.
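Clicking Grant Privileges is effectively equivalent to granting these two account-level privileges to the application yourself. A minimal sketch, assuming the application is installed under the name MENDIX_DATA_LOADER:

```sql
-- Sketch of the equivalent manual grants; assumes the app name is MENDIX_DATA_LOADER.
-- Run with a role that can grant account-level privileges, e.g. ACCOUNTADMIN.
GRANT CREATE DATABASE ON ACCOUNT TO APPLICATION MENDIX_DATA_LOADER;
GRANT EXECUTE TASK    ON ACCOUNT TO APPLICATION MENDIX_DATA_LOADER;
```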
Next, the application requires NETWORK RULE, SECRET, and EXTERNAL ACCESS INTEGRATION objects to make the external call to your deployed Mendix application instance. To create these objects, choose the basic credentials authentication method and fill in the application's ingestion configuration form as follows:
- API endpoint: the location of the OData resource, {{YOUR_SAVED_ENDPOINT}}/odata/MoviesBasic/v1/, e.g. https://sfshowcase101-sandbox.mxapps.io/odata/MoviesBasic/v1/
- Target database name: MOVIE_DB
- Target schema name: MOVIE_APP
- Username: SFDataLoaderUser
- Password: MendixSnowflake123

Note that the Mendix Data Loader also has the option to authorize the OData call using OAuth.
Then click the Submit button and navigate to the Main tab. On the Main tab, click the Generate Access Script button and copy the generated script from the output field. The button uses the values from the form fields to create a SQL script that a user with the ACCOUNTADMIN role needs to execute. The script creates the required objects in a database titled mx_data_loader_secrets and grants the application the privileges to access those objects.
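Use the script exactly as the application generates it. Purely to illustrate the kinds of objects it sets up (object and schema names other than mx_data_loader_secrets are hypothetical), such a script contains statements roughly along these lines:

```sql
-- Illustrative sketch only; run the script generated by the application instead.
CREATE DATABASE IF NOT EXISTS mx_data_loader_secrets;
CREATE SCHEMA IF NOT EXISTS mx_data_loader_secrets.networking;

-- Network rule allowing egress to the Mendix application host
CREATE OR REPLACE NETWORK RULE mx_data_loader_secrets.networking.mendix_network_rule
  MODE = EGRESS
  TYPE = HOST_PORT
  VALUE_LIST = ('sfshowcase101-sandbox.mxapps.io');

-- Secret holding the basic credentials entered in the form
CREATE OR REPLACE SECRET mx_data_loader_secrets.networking.mendix_basic_auth
  TYPE = PASSWORD
  USERNAME = 'SFDataLoaderUser'
  PASSWORD = 'MendixSnowflake123';

-- External access integration tying the network rule and secret together
CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION mendix_access_integration
  ALLOWED_NETWORK_RULES = (mx_data_loader_secrets.networking.mendix_network_rule)
  ALLOWED_AUTHENTICATION_SECRETS = (mx_data_loader_secrets.networking.mendix_basic_auth)
  ENABLED = TRUE;

-- Allow the application to use these objects
GRANT USAGE ON DATABASE mx_data_loader_secrets TO APPLICATION MENDIX_DATA_LOADER;
GRANT USAGE ON SCHEMA mx_data_loader_secrets.networking TO APPLICATION MENDIX_DATA_LOADER;
GRANT READ ON SECRET mx_data_loader_secrets.networking.mendix_basic_auth TO APPLICATION MENDIX_DATA_LOADER;
GRANT USAGE ON INTEGRATION mendix_access_integration TO APPLICATION MENDIX_DATA_LOADER;
```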
Open a new browser tab, log into the same Snowflake environment, and create a new SQL worksheet. Paste the copied SQL script into the worksheet and press CTRL + SHIFT + ENTER simultaneously. Execution of the commands may take a few moments and should result in a table with a single column captioned status and one row with the status value "Statement executed successfully."
You have now granted the Mendix Data Loader the privileges and objects it needs to ingest data from your specified endpoint.
Move back to the initial browser tab in which you opened the Mendix Data Loader. If the input fields still contain the same values you specified for the access script generation, you can now click the Ingest Data button. If not, go back one step, fill in the application's form again, and then click the Ingest Data button.
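Once the ingestion job completes, you can verify the result from a worksheet. A quick sanity check might look like the following; the table name is hypothetical, as the actual tables are named after the exposed entities:

```sql
-- List the tables the loader created in the configured target schema
SHOW TABLES IN SCHEMA MOVIE_DB.MOVIE_APP;

-- Preview a few ingested rows (hypothetical table name; use one listed above)
SELECT * FROM MOVIE_DB.MOVIE_APP.MOVIE LIMIT 10;
```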
Navigate to the Schedule Task tab and fill in the task configuration. The following field is mandatory:
- When should the ingestion task run?: choose one of the following options (a CRON sketch follows this list):
  - Custom CRON expression (if you choose this option, the Custom CRON expression field is also mandatory)
  - Every day at 00:00 AM UTC
  - Every Monday on 00:00 AM UTC
  - Every first day of the month at 00:00 AM UTC
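If you select Custom CRON expression, the expression presumably follows the standard CRON field layout (minute, hour, day of month, month, day of week). For reference, the sketch below shows how such a schedule looks on a native Snowflake task; the task and warehouse names are hypothetical, and this is not the loader's internal task:

```sql
-- Illustrative only: Snowflake CRON syntax.
--   '0 0 * * 1 UTC' -> every Monday at 00:00 UTC
--   '0 0 1 * * UTC' -> the first day of every month at 00:00 UTC
CREATE OR REPLACE TASK my_demo_task        -- hypothetical task name
  WAREHOUSE = my_wh                        -- hypothetical warehouse
  SCHEDULE = 'USING CRON 0 0 * * 1 UTC'
AS
  SELECT CURRENT_TIMESTAMP();
```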
The other task configuration fields are optional:
- Time out: changes after how much time (in ms) a timeout exception should occur.
- Number of retry attempts: sets how many retries should be performed if an ingestion job fails.
- Suspend task after number of failures: sets the number of times a task is allowed to fail consecutively before the task is suspended.

Now press the Schedule Ingestion Task button and grant USAGE on the warehouse you want to use for the ingestion to create the task. You can view details of the created task on the Task Management tab, where you can also view its performed ingestion jobs, suspend or enable the task, and drop the task. At present, only one task can exist at a time.

Congratulations! You've successfully installed the Mendix Data Loader app and moved data from a Mendix application into Snowflake.