In this article we will look at how to use Mule Studio to read CSV (Comma Separated Value) data from a File (Inbound) and write it to PostgreSQL (Outbound).
Introduction
In the last article we saw how to use Mule Studio to read CSV (Comma Separated Value) data from PostgreSQL (Inbound) and write it to a File (Outbound). In this article we will look at the reverse process: using Mule Studio to read data from a File (Inbound) and write it to PostgreSQL (Outbound).
What do we need for this experiment?
We need:
- PostgreSQL
- The PostgreSQL JDBC driver
- Mule ESB (downloaded)
- A database created in PostgreSQL, say "TestDB", with at least one table, say "tblFamily":
-- Table Creation Script
CREATE TABLE "tblFamily"
(
"Title" text,
"FirstName" text,
"LastName" text
)
- A CSV file from which the data needs to be read. The CSV file structure is shown below.

- Everything should be up and running properly
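The screenshot of the CSV structure is not reproduced here; based on the columns of "tblFamily" used later in the insert query, a hypothetical input file (illustrative values only) might look like:

```csv
Mr.,John,Doe
Ms.,Jane,Roe
Dr.,Richard,Roe
```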
Let us start the experiment.
Step 1: Open Mule Studio
So now we are all set to do our first example using Mule Studio. Execute "MuleStudio.exe" and enter a location for the workspace. The workspace is the location where we will work on the project.

Click the "OK" button to bring up the Mule Studio window, then click "Go to Mule Studio". We will receive the Mule Studio editor.
Click File > New > Mule Project. From the "New Mule Project" window that appears, populate the "Name" field with "JDBC-PostgreSQL-Mule-Experiment".
Click the "Next" button and the "New Mule Flow" window appears. Leave it as it is. Click the "Next" button and the "Create New Java Project" window appears. Choose the JRE version that you are using.
Click the "Next" button and the "Java Settings" window appears. Leave it as it is. Click the "Finish" button. We will be presented with the following:

Step 2: Import the PostgreSQL JDBC Driver
(a) In the Package Explorer, right-click the project folder (in this case "jdbc-postgresql-mule-experiment") -> Build Path -> Add External Archives…

(b) Browse to the location on the hard drive where the "postgresql-9.2-1002.jdbc4.jar" file is located and click the "Open" button.

(c) The jar file now appears under "Referenced Libraries" in the Package Explorer.

This will allow the application to create an instance of the driver object.
Step 3: Create the Data Source
(a) Switch to the "Global Elements" tab.

(b) Click the "Create" button. Look for Data Sources > PostgreSQL Data Source and click the "OK" button.

(c) The PostgreSQL Data Source window appears.

(d) Specify the database name in the URL field, the PostgreSQL user name in the "User" field and the password in the "Password" field, then click "OK".

Step 4: Create the Connector
So we have successfully created a data source with its proper driver. Now we need a connector.
(a) Click the "Create" button. Look for Connector > Database, then click "OK".

(b) From the Database window that appears, click the "Database Specific" drop-down and choose "PostgreSQL_Data_Source". It is the data source that we created in the previous step.

(c) Switch to the "Queries" tab and click the "Add Query" button. First we need to specify the "Query Key" (which is "InsertRecord" here) and then the query as shown below. Click the "OK" button.

INSERT INTO "tblFamily"("Title", "FirstName", "LastName") VALUES (#[message.payload[0]],#[message.payload[1]],#[message.payload[2]])
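The three "#[message.payload[n]]" expressions are MEL placeholders that index into the field array produced by the "Expression Transformer" later in the flow. In plain terms (Python used purely for illustration; the row values are hypothetical):

```python
# message.payload, after the Expression Transformer, is an array of the
# comma-separated fields of one CSV row (hypothetical sample values).
payload = ["Mr.", "John", "Doe"]

# The MEL expressions #[message.payload[0]], [1] and [2] resolve to the
# positional values bound to ("Title", "FirstName", "LastName"):
values = (payload[0], payload[1], payload[2])
print(values)  # ('Mr.', 'John', 'Doe')
```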
Finally, we will be presented with the screen below if everything has gone right so far:

If we view the "Configuration XML" tab at this stage, it will look like this:
<jdbc:postgresql-data-source name="PostgreSQL_Data_Source" user="postgres" password="niladri_123" url="jdbc:postgresql://localhost:5432/TestDB" transactionIsolation="UNSPECIFIED" doc:name="PostgreSQL Data Source"/>
<jdbc:connector name="PostgreSQL_JDBC_Connector" dataSource-ref="PostgreSQL_Data_Source" validateConnections="true" queryTimeout="-1" pollingFrequency="0" doc:name="Database">
<jdbc:query key="InsertRecord" value="INSERT INTO &quot;tblFamily&quot;(&quot;Title&quot;, &quot;FirstName&quot;, &quot;LastName&quot;) VALUES (#[message.payload[0]],#[message.payload[1]],#[message.payload[2]])"/>
</jdbc:connector>
Step 5: Create the Flow
Switch to the "Configuration XML" tab first and add the snippet below just above the "<jdbc:postgresql-data-source....>" element:
<configuration doc:name="Configuration">
<expression-language autoResolveVariables="true">
<import class="org.mule.util.StringUtils" />
</expression-language>
</configuration>

We need to import the "StringUtils" class for splitting the comma-separated values. We will see its use in the "Splitter" component and in the "Expression Transformer" component.
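Note that "StringUtils.split(text, separatorChars)" (Mule's StringUtils mirrors Apache Commons Lang behaviour here) treats every character of the separator string as a delimiter and drops empty tokens, which is why '\n\r' copes with Windows-style "\r\n" line endings even though the characters appear in the opposite order. A rough Python stand-in, for illustration only:

```python
import re

def split_like_stringutils(text, separator_chars):
    """Rough Python equivalent of Java's StringUtils.split(text, separatorChars):
    each character in separator_chars acts as a delimiter, and empty tokens
    (e.g. between consecutive '\r' and '\n') are discarded."""
    pattern = "[" + re.escape(separator_chars) + "]"
    return [token for token in re.split(pattern, text) if token]

rows = split_like_stringutils("Mr.,John,Doe\r\nMs.,Jane,Roe\r\n", "\n\r")
print(rows)  # ['Mr.,John,Doe', 'Ms.,Jane,Roe']
```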
Next, place the code below just after the closing "</jdbc:connector>" tag:
<flow name="ImportCSVFileToDatabase" doc:name="ImportCSVFileToDatabase">
<!-- Step 1: CSV file source -->
<file:inbound-endpoint path="C:\Users\nbiswas\Documents\InputFolder" pollingFrequency="5000" doc:name="Source"/>
<!-- Step 2: Convert between object arrays and strings -->
<object-to-string-transformer doc:name="Object to String"/>
<!-- Step 3: Split each row -->
<splitter expression="#[StringUtils.split(message.payload, '\n\r')]" doc:name="Splitter"/>
<!-- Step 4: Transform CSV row in array -->
<expression-transformer expression="#[StringUtils.split(message.payload, ',')]" doc:name="Expression"/>
<!-- Step 5: Dump into the destination Database -->
<jdbc:outbound-endpoint exchange-pattern="one-way" queryTimeout="-1" doc:name="Destination" connector-ref="PostgreSQL_JDBC_Connector" queryKey="InsertRecord"/>
</flow>
It is quite simple. In the first step, we use the "File" component (Inbound) to locate the CSV file.
In the second step, we convert the incoming file payload into a string using the "object-to-string-transformer".
In the third step, we split the CSV content into rows using the "Splitter" component. The "split" method of the "StringUtils" class is used for this purpose.
In the fourth step, the "Expression Transformer" splits each row on commas and returns the fields as an object array.
In the last step, we push the records to the PostgreSQL database through the "Database" component (Outbound).
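Taken together, the transformation logic of the flow amounts to the sketch below (plain Python; the JDBC insert is replaced by collecting the parameter tuples, and the sample data is hypothetical):

```python
def csv_to_insert_params(csv_text):
    """Mimic the flow: split the file content into rows (Splitter), split
    each row on commas (Expression Transformer), and return the parameter
    tuples the JDBC outbound endpoint would bind to
    INSERT INTO "tblFamily"("Title", "FirstName", "LastName")."""
    rows = [r for r in csv_text.replace("\r", "\n").split("\n") if r]
    return [tuple(row.split(",")) for row in rows]

sample = "Mr.,John,Doe\r\nMs.,Jane,Roe"
print(csv_to_insert_params(sample))
# [('Mr.', 'John', 'Doe'), ('Ms.', 'Jane', 'Roe')]
```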
The final flow is

The full configuration XML should look like
<?xml version="1.0" encoding="UTF-8"?>
<mule xmlns:file="http://www.mulesoft.org/schema/mule/file"
xmlns:jdbc="http://www.mulesoft.org/schema/mule/jdbc" xmlns="http://www.mulesoft.org/schema/mule/core" xmlns:doc="http://www.mulesoft.org/schema/mule/documentation" xmlns:spring="http://www.springframework.org/schema/beans" version="CE-3.3.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="
http://www.mulesoft.org/schema/mule/file http://www.mulesoft.org/schema/mule/file/current/mule-file.xsd
http://www.mulesoft.org/schema/mule/jdbc http://www.mulesoft.org/schema/mule/jdbc/current/mule-jdbc.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-current.xsd
http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd ">
<configuration doc:name="Configuration">
<expression-language autoResolveVariables="true">
<import class="org.mule.util.StringUtils" />
</expression-language>
</configuration>
<jdbc:postgresql-data-source name="PostgreSQL_Data_Source" user="postgres" password="niladri_123" url="jdbc:postgresql://localhost:5432/TestDB" transactionIsolation="UNSPECIFIED" doc:name="PostgreSQL Data Source"/>
<jdbc:connector name="PostgreSQL_JDBC_Connector" dataSource-ref="PostgreSQL_Data_Source" validateConnections="true" queryTimeout="-1" pollingFrequency="0" doc:name="Database">
<jdbc:query key="InsertRecord" value="INSERT INTO &quot;tblFamily&quot;(&quot;Title&quot;, &quot;FirstName&quot;, &quot;LastName&quot;) VALUES (#[message.payload[0]],#[message.payload[1]],#[message.payload[2]])"/>
</jdbc:connector>
<flow name="ImportCSVFileToDatabase" doc:name="ImportCSVFileToDatabase">
<!-- Step 1: CSV file source -->
<file:inbound-endpoint path="C:\Users\nbiswas\Documents\InputFolder" pollingFrequency="5000" doc:name="Source"/>
<!-- Step 2: Convert between object arrays and strings -->
<object-to-string-transformer doc:name="Object to String"/>
<!-- Step 3: Split each row -->
<splitter expression="#[StringUtils.split(message.payload, '\n\r')]" doc:name="Splitter"/>
<!-- Step 4: Transform CSV row in array -->
<expression-transformer expression="#[StringUtils.split(message.payload, ',')]" doc:name="Expression"/>
<!-- Step 5: Dump into the destination Database -->
<jdbc:outbound-endpoint exchange-pattern="one-way" queryTimeout="-1" doc:name="Destination" connector-ref="PostgreSQL_JDBC_Connector" queryKey="InsertRecord"/>
</flow>
</mule>
We are now almost done. Next we need to run the application. Right-click "JDBC-PostgreSQL-Mule-Experiment.mflow" and from the context menu choose Run As -> Mule Application.

We will encounter the screen below:

If we now look into "tblFamily", we will find that 5 records have been inserted:

Conclusion
Hope you enjoyed this step-by-step tutorial on using Mule Studio to read CSV (Comma Separated Value) data from a File (Inbound) and write it to PostgreSQL (Outbound). I will write more on this topic as I perform further experiments. Thanks for reading. The configuration XML file and the CSV file are attached in zipped format.