In this article, we will look at how to parse JSON keys and insert the records into a PostgreSQL database using Mule Studio.
Introduction
Suppose we have a JSON file whose structure looks like this:
[
{ "empid": "001",
"empname": "S. Kumar",
"empage": "30",
"empdesignation": "Software Engineer"
},
{ "empid": "002",
"empname": "Viki Singh Yadav",
"empage": "25",
"empdesignation": "Software Engineer"
},
{ "empid": "003",
"empname": "Shakalu Ambalika",
"empage": "40",
"empdesignation": "DBA"
},
{ "empid": "004",
"empname": "Sinchandra Sukumar",
"empage": "29",
"empdesignation": "Business Analyst"
}
]
The intention is to parse these keys and store the values into the columns of a PostgreSQL table. The final output should look like this:

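Once the flow has run, the result can be verified directly in psql. The expected rows below follow from the sample JSON shown above:

```sql
-- Verify the inserted rows (run in psql after the flow completes)
SELECT * FROM "tblEmployee";

-- Expected result, based on the JSON above:
--  empid |      empname       | empage |  empdesignation
-- -------+--------------------+--------+-------------------
--  001   | S. Kumar           | 30     | Software Engineer
--  002   | Viki Singh Yadav   | 25     | Software Engineer
--  003   | Shakalu Ambalika   | 40     | DBA
--  004   | Sinchandra Sukumar | 29     | Business Analyst
```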
What do we need for the experiment?
We need:
- PostgreSQL
- The PostgreSQL JDBC driver
- Mule ESB (downloaded and installed)
- A database created in PostgreSQL, say "TestDB", with at least one table, say "tblEmployee", created as under:
-- All columns are text, matching the string values in the JSON
CREATE TABLE "tblEmployee"
(
    empid text,
    empname text,
    empage text,
    empdesignation text
);
- A JSON file, say "Employee.json", with the content shown above, stored somewhere on disk
- Everything up and running properly
Let us start the experiment
First create a Mule project. Then import the PostgreSQL JDBC driver. Next, create a data source. After that, create the connector.
All these steps are covered in my previous Mule articles and hence are not repeated here.
The flow
File inbound endpoint → JSON-to-List transformer → Collection Splitter → JDBC outbound endpoint

The Configuration File
<jdbc:postgresql-data-source name="PostgreSQL_Data_Source" user="postgres" password="niladri_123" url="jdbc:postgresql://localhost:5432/TestDB" transactionIsolation="UNSPECIFIED" doc:name="PostgreSQL Data Source"/>
<jdbc:connector name="PostgreSQL_JDBC_Connector" dataSource-ref="PostgreSQL_Data_Source" validateConnections="true" queryTimeout="-1" pollingFrequency="0" doc:name="Database">
<jdbc:query key="InsertRecord" value="INSERT INTO &quot;tblEmployee&quot;(empid,empname,empage,empdesignation) VALUES (#[message.payload.empid],#[message.payload.empname],#[message.payload.empage],#[message.payload.empdesignation])"/>
</jdbc:connector>
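To make the parameterized query concrete: for the first record in the sample JSON, the MEL placeholders resolve so that the statement executed against PostgreSQL is equivalent to the following (values taken from the sample data above):

```sql
INSERT INTO "tblEmployee" (empid, empname, empage, empdesignation)
VALUES ('001', 'S. Kumar', '30', 'Software Engineer');
```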
<flow name="ParseJsonKeysToStoreValuesFlow1" doc:name="ParseJsonKeysToStoreValuesFlow1">
<!-- Input file source -->
<file:inbound-endpoint path="C:\Users\niladri.biswas\Desktop\input" responseTimeout="10000" doc:name="File"/>
<!-- Transform the JSON data into a java.util.List -->
<json:json-to-object-transformer returnClass="java.util.List" doc:name="JSON to List" />
<!-- split it into several java.util.Maps -->
<collection-splitter doc:name="Collection Splitter"/>
<!-- Write the values to database -->
<jdbc:outbound-endpoint exchange-pattern="one-way" queryKey="InsertRecord" queryTimeout="-1" connector-ref="PostgreSQL_JDBC_Connector" doc:name="Destination"/>
</flow>
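While debugging, it can be handy to see what each split payload looks like before it reaches the database. A sketch of an optional `<logger>` element, which could be placed between the Collection Splitter and the JDBC outbound endpoint (not part of the flow above; the exact payload rendering depends on the Map's toString):

```xml
<!-- Optional: log each split record (a java.util.Map) before the insert -->
<logger message="About to insert record: #[message.payload]" level="INFO" doc:name="Log Record"/>
```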
After reading the source file, we transform the JSON data into a java.util.List, then split it into several java.util.Map items, one per employee record. Finally, each map is written to the database.
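For readers who want to see the parse-split-insert pipeline outside Mule, here is a minimal sketch in Python (purely illustrative; the actual flow runs inside Mule, and the psycopg2 call mentioned in the comment is an assumption, not part of this project):

```python
import json

# Sample content of Employee.json (from the article), abbreviated formatting
EMPLOYEE_JSON = """
[
 { "empid": "001", "empname": "S. Kumar", "empage": "30", "empdesignation": "Software Engineer" },
 { "empid": "002", "empname": "Viki Singh Yadav", "empage": "25", "empdesignation": "Software Engineer" },
 { "empid": "003", "empname": "Shakalu Ambalika", "empage": "40", "empdesignation": "DBA" },
 { "empid": "004", "empname": "Sinchandra Sukumar", "empage": "29", "empdesignation": "Business Analyst" }
]
"""

# Step 1: JSON -> list (mirrors the json-to-object-transformer)
records = json.loads(EMPLOYEE_JSON)

# Step 2: the collection splitter hands each dict (Map) downstream
INSERT_SQL = ('INSERT INTO "tblEmployee" (empid, empname, empage, empdesignation) '
              'VALUES (%(empid)s, %(empname)s, %(empage)s, %(empdesignation)s)')

for record in records:
    # Step 3: the JDBC endpoint binds the map values into the query.
    # With a real connection (e.g. via psycopg2, not shown here) this would be:
    #   cursor.execute(INSERT_SQL, record)
    print(record["empid"], record["empname"])
```

The key point is that each element produced by the splitter is a plain map of the four JSON keys, which is exactly what the parameterized query expects.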
Now, let us run the application, and we will find that the records have been inserted properly into the database.
Conclusion
I hope you have enjoyed the tutorial. More to come soon. The zipped file for the experiment is attached herewith.