
# Use case Guide - Scenario 3 - Low code data analytics

This document guides the user through Scenario 3 - Low code data analytics. As discussed in the use case description, the goal is to provide a tool that performs data analytics in a low code fashion. This guide is a step by step tutorial towards that objective. In more detail, each subsection covers a step of the approach, namely:

  1. Step 1: Initialize the resources
  2. Step 2: Connect pgAdmin to PostgreSQL
  3. Step 3: Load the data
  4. Step 4: Perform low code data analytics

## Step 1: Initialize the resources

As a first step, the user should initialize the required resources. In particular, three instances should be launched:

  • PostgreSQL instance
  • Knime instance
  • PgAdmin instance

### Initialize the PostgreSQL instance

  1. Go to the Service Catalog section of the Portal.
  2. Click on the button Launch on the PostgreSQL badge.
  3. Assign a name to the instance and select a group from the ones available in the list.
  4. Select the micro configuration.
  5. Copy the auto-generated password. This will be needed to access the instance at a later stage. (NB: instance credentials are automatically saved and accessible in the My Data section of the portal.)
  6. Once the instance is deployed, it will be possible to copy its host address by clicking on the related button Copy in the My Services section of the portal.

### Initialize the pgAdmin instance

  1. Go to the Service Catalog section of the Portal.
  2. Click on the button Launch on the PgAdmin4 badge.
  3. Assign a name to the instance and select a group from the ones available in the list.
  4. Select the default configuration.
  5. Set your pgAdmin email and copy the auto-generated password. This will be needed to access the instance at a later stage. (NB: instance credentials are automatically saved and accessible in the My Data section of the portal.)
  6. Launch the instance by clicking on the launch button.

### Initialize the Knime instance

  1. Go to the Service Catalog section of the Portal.
  2. Click on the button Launch on the Knime badge.
  3. Assign a name to the instance and select a group from the ones available in the list.
  4. Select the default configuration.
  5. Copy the auto-generated password. This will be needed to access the instance at a later stage. (NB: instance credentials are automatically saved and accessible in the My Data section of the portal.)
  6. Select the NFS PVC name corresponding to the DSL group selected at point 3.
  7. Launch the instance by clicking on the launch button.

After launching the instances, go to the My Services section of the portal to verify that all the deployed services are up and running (all three instances should be in status ACTIVE).


## Step 2: Configure pgAdmin and connect the PostgreSQL instance

  1. From the My Services section of the portal, click on the Copy button of the PostgreSQL instance to copy its host address to the clipboard.
  2. The PostgreSQL host address is in the format:

        <postgreSQLHost>:5432
    
  3. Go to the My Services section of the portal and open the pgAdmin instance.

  4. Log into your pgAdmin account with the credentials defined in the configuration.
  5. In the main page, click on the Browser tab: this will allow you to browse all the resources available in your pgAdmin account.
  6. In the Browser tab, right-click on the Servers item > Register > Server... . This will open the configuration modal for registering a new server in your pgAdmin account: here the PostgreSQL database will be added.
  7. The configuration modal contains five fields that need to be filled in:

    • Hostname/address: this is the hostname/address of the PostgreSQL instance. Input here the <postgreSQLHost> (see Step 2.2)
    • Port: 5432
    • Maintenance Database: postgres
    • Username: postgres
    • Password: input the PostgreSQL password defined in the configuration.
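
As a side note, these same five fields can be combined into a standard libpq-style connection URI, which is handy if you later want to reach the database from other clients. A minimal Python sketch, where the host and password values are placeholders, not real instance details:

```python
from urllib.parse import quote


def build_postgres_uri(host, port, database, username, password):
    """Combine the connection fields into a libpq-style URI.

    The password is percent-encoded so special characters in the
    auto-generated password do not break the URI.
    """
    return (
        f"postgresql://{quote(username)}:{quote(password, safe='')}"
        f"@{host}:{port}/{database}"
    )


# Placeholder values standing in for the real instance details
uri = build_postgres_uri("<postgreSQLHost>", 5432, "postgres",
                         "postgres", "s3cr#t")
print(uri)
```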

## Step 3: Load the data

In order to load the data into the PostgreSQL database, we need to perform some operations in pgAdmin. Please note that pgAdmin is the management interface for PostgreSQL, so we use this service to interact with our database. Through pgAdmin we will:

  • create the data table
  • define the data schema for the table
  • import the data from the source file

### Create the data table

  1. In the Browser section, navigate to and expand the public schema (Databases > postgres > Schemas > public).
  2. In the public schema, right-click on Tables > Create > Table... . This will open the configuration modal for creating a new table in the public schema: specifically, the data table will be created.
  3. In the configuration modal, assign a name to the table (e.g. emhires_country) and click on Save.

### Define the schema for the data table

Please note that to import data from a CSV file, the related data schema must first be defined manually (namely, the columns of the table must be added). This is achieved by following these steps:

  1. Right-click on the newly created table and click on Query Tool. This will open an SQL query tool (Query Editor) that allows you to run queries on the database. The query tool will be used to alter the current data schema and add the columns present in our data source.
  2. Paste the following SQL query in the Query Editor:
```sql
ALTER TABLE public."name_of_your_table"
ADD COLUMN AT float,
ADD COLUMN BE float,
ADD COLUMN BG float,
ADD COLUMN CH float,
ADD COLUMN CY float,
ADD COLUMN CZ float,
ADD COLUMN DE float,
ADD COLUMN DK float,
ADD COLUMN EE float,
ADD COLUMN ES float,
ADD COLUMN FI float,
ADD COLUMN FR float,
ADD COLUMN EL float,
ADD COLUMN HR float,
ADD COLUMN HU float,
ADD COLUMN IE float,
ADD COLUMN IT float,
ADD COLUMN LT float,
ADD COLUMN LU float,
ADD COLUMN LV float,
ADD COLUMN NL float,
ADD COLUMN NO float,
ADD COLUMN PL float,
ADD COLUMN PT float,
ADD COLUMN RO float,
ADD COLUMN SI float,
ADD COLUMN SK float,
ADD COLUMN SE float,
ADD COLUMN UK float;
```

NB: Make sure to substitute name_of_your_table with the table name defined in the previous step.

  3. Execute the SQL query by clicking on the Play button (or by pressing F5).
  4. Verify that the columns have been created by checking the table properties (right-click on the table name > Properties > Columns).
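
Typing 29 ADD COLUMN clauses by hand is error-prone, so as an optional aid the statement can be generated programmatically. A minimal Python sketch; the table name emhires_country is the example name suggested in the table-creation step:

```python
# EU country-code columns from the EMHIRES dataset
COUNTRY_CODES = [
    "AT", "BE", "BG", "CH", "CY", "CZ", "DE", "DK", "EE", "ES",
    "FI", "FR", "EL", "HR", "HU", "IE", "IT", "LT", "LU", "LV",
    "NL", "NO", "PL", "PT", "RO", "SI", "SK", "SE", "UK",
]


def build_alter_table(table_name, columns, col_type="float"):
    """Generate an ALTER TABLE statement adding one column per code."""
    clauses = ",\n".join(f"ADD COLUMN {c} {col_type}" for c in columns)
    return f'ALTER TABLE public."{table_name}"\n{clauses};'


sql = build_alter_table("emhires_country", COUNTRY_CODES)
print(sql)
```

The generated statement is identical to the one shown above and can be pasted straight into the Query Editor.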

### Import the data from the source file

  1. Download the data source CSV file from here
  2. In the navbar, go to File > Preferences > Storage > Options and set the Maximum file upload size (MB) to 10000. Click on Save.
  3. Right-click on the newly created table and click on Import/Export Data... . This will open the Import/Export Data tool that will allow us to import the data from the source file.
  4. Select the Import option.
  5. In the File Info section, click on the folder icon. This will open the Select file modal.
  6. In the navbar of the Select file modal, click on the Upload file icon.
  7. Drop the EMHIRESPV_TSh_CF_Country_19862015.csv file and wait for its uploading to be concluded.
  8. The uploaded file should be now visible in the list of available files. Click on it and press Select.
  9. In the File Info section, select csv as Format.
  10. In the Miscellaneous section, tick the Header option. This tells pgAdmin that the first row of the file contains the table headers.
  11. In the Miscellaneous section, select the comma (,) as Delimiter.
  12. Click on Ok and wait for the import process to be done.
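
The Header and Delimiter options tell pgAdmin to treat the first row as column names and to split fields on commas. The same parsing can be reproduced in Python with `csv.DictReader`, which is a quick way to sanity-check a CSV before importing it. A sketch using a small made-up inline sample (not real EMHIRES data):

```python
import csv
import io

# A tiny inline sample standing in for the first lines of the CSV file
sample = io.StringIO(
    "AT,BE,BG\n"      # header row: country-code column names
    "0.0,0.1,0.2\n"
    "0.3,0.4,0.5\n"
)

# DictReader takes field names from the first row (the "Header" option)
# and splits fields on commas (the "," Delimiter option).
reader = csv.DictReader(sample, delimiter=",")
rows = [{k: float(v) for k, v in row.items()} for row in reader]
print(rows)
```

If every value converts cleanly to float, the file matches the all-float schema defined in the previous step.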

## Step 4: Perform low code data analytics

### Launch Knime and install the JFree extension

  1. Go to the My Services section of the portal and launch the deployed Knime instance.
  2. Access the instance by entering the password defined in the configuration phase. This will let you access the noVNC instance underlying the Knime service.
  3. When the noVNC environment is loaded, double-click the Knime icon to execute Knime. Then, click on Execute in the dialog box that pops up.
  4. Select /root/knime-workspace as workspace and click on Launch.


  5. In the KNIME navbar, click on File > Install KNIME Extensions...
  6. Search for the KNIME JFreeChart extension. Select it and click on Next.
  7. Wait for the extension's installation process to finish.

### Create Knime workflows

  1. In the Knime Explorer tab (top-left of the screen), select and right-click on the efs-storage. Then, click on New KNIME Workflow...
  2. Assign a name to the workflow (e.g. Average capacity factor per country) and make sure that the destination of the new workflow is LOCAL:/efs-storage. Saving workflows on the efs volume will allow you to access them even if you terminate the instance and launch a new one at a later time (provided you assign the same efs volume in the configuration phase).


Then, click on Finish to complete the creation of the new Knime workflow.

### Build the Knime Workflow

#### 1. PostgreSQL Connector

  1. In the Knime Explorer section, open the Knime workflow just created.
  2. In the Node Repository section, look for "postgreSQL" and then drag-and-drop the PostgreSQL Connector element in the workflow:


  3. Right-click on the PostgreSQL Connector node and click on Configure...
  4. In the configuration dialog box, fill in the needed information in the Connection Settings tab:


    • Hostname/address: this is the hostname/address of the PostgreSQL instance. Input here the <postgreSQLHost> (see Step 2.2)
    • Port: 5432
    • Username: postgres
    • Password: input the PostgreSQL password defined in the configuration.
    • Authentication: select the option Username & password and fill in the Username and Password (which correspond to the Admin username and Admin Password of your PostgreSQL instance).

  5. Click on Apply and then Ok. Then, right-click on the PostgreSQL Connector node and click on Execute. If the execution of the node is successful, the stoplight indicator of the node will turn green.

#### 2. DB Table Selector

  1. Just as it is done for the PostgreSQL Connector node, in the Node Repository section look for "DB Table Selector" and then drag-and-drop the DB Table Selector element in the workflow.
  2. Connect the two nodes: select them (by pressing Ctrl and left-click on both), right click and select Connect selected nodes:


Please note that you could alternatively connect two nodes by selecting them and pressing Ctrl + L or by simply drawing a line between the two nodes' ends.

  3. When the PostgreSQL Connector and DB Table Selector nodes are connected, right-click on the latter and click on Configure...
  4. Click on the Select a table button and, in the dialog window that pops up, select the PostgreSQL table that contains the data from this dataset (EMHIRESPV grouped by country). Execute the node.

#### 3. DB Table Reader

  1. In the Node Repository section look for "DB Reader" and then drag-and-drop the DB Reader element in the workflow. Then, connect it with the DB Table Selector node.
  2. Right-click on the newly added DB Reader node and Execute the node.

#### 4. GroupBy

  1. In the Node Repository section look for "GroupBy" and then drag-and-drop the GroupBy element in the workflow. Then, connect it with the DB Reader node.
  2. Right-click on the newly added GroupBy node and click on Configure....
  3. In the Configuration Dialog, click on add all>> to process all columns. Make sure that the Aggregation field is set to Mean. This transformation will calculate the average capacity factor value for each country.


  4. Click on Apply and then on Ok. When the configuration is done, execute the node.
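
Conceptually, the GroupBy node with Mean aggregation computes one average per selected column. The same reduction in plain Python, on toy rows standing in for the imported capacity-factor table (the values below are illustrative, not real data):

```python
from statistics import mean

# Toy capacity-factor rows standing in for the imported table
rows = [
    {"AT": 0.20, "BE": 0.10},
    {"AT": 0.40, "BE": 0.30},
]

# One mean per column, mirroring the GroupBy node's Mean aggregation
averages = {col: mean(r[col] for r in rows) for col in rows[0]}
print(averages)
```

Each entry of `averages` corresponds to one bar in the chart produced in the next step.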

#### 5. GroupBy Bar Chart (JFreeChart)

  1. In the Node Repository section look for GroupBy Bar Chart (JFreeChart) and then drag-and-drop it in the workflow. Then, connect it with the GroupBy node.
  2. Right-click on the newly added GroupBy Bar Chart (JFreeChart) node and Execute the node.
  3. Right-click on the GroupBy Bar Chart (JFreeChart) node and click on Image. This will visualize the results on the screen:


Here is the final workflow as described in the steps above:

(Screenshot: the complete Knime workflow.)