WC BI uses a standardized set of tables from the ERPs we support, known as the DataLake, which is used to publish data to the cubes during the nightly process.
New options on the Admin Menu, which require Admin-level access to WC BI, allow users to add or modify the contents of the DataLake and the Cubes.
- Data Pump Configuration – Used to add/edit tables in the DataLake
Data Source Creation in WC BI
This requires the user to prepare a list of tables that contain the facts (the information being measured) and the dimensions (the information that describes the facts, used for sorting and aggregating).
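To illustrate the distinction, a typical cube query aggregates a fact grouped by dimensions. A minimal sketch, assuming hypothetical ORDERS and CUSTOMERS tables (not actual WC BI table names):

```sql
-- Hypothetical example: ORDER_AMOUNT is the fact (the value measured);
-- CUSTOMER_NAME and ORDER_DATE are dimensions (they describe the fact).
SELECT c.CUSTOMER_NAME,       -- dimension: who
       o.ORDER_DATE,          -- dimension: when
       SUM(o.ORDER_AMOUNT)    -- fact: the measure being aggregated
FROM ORDERS o
JOIN CUSTOMERS c ON c.CUSTOMER_ID = o.CUSTOMER_ID
GROUP BY c.CUSTOMER_NAME, o.ORDER_DATE;
```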
The first step in creating a new Cube is determining if the required tables are in the DataLake.
- Select Data Pump Configuration from the Admin menu
If your tables are part of the ERP database, you can use the standard Connection. If they come from a different data source, you can add a new Connection.
- Select Connections if you need to create a new Connection to a data source
- Select Upload Config to import a connection file
- Select New Connection to create a Connection manually
- Input a name for your new Connection (e.g., ODBC, Multivalue)
- Select the Source Type using the radio button
JDBC – a JDBC connection to a database (SQL Server, Progress, or another database type) using a driver installed on the Data Pump server that populates your DataLake
FILESYSTEM – a .CSV file with comma-separated columns, a header row providing column names, and detail rows containing your data. Select the "Use Data Pump Service" check box.
CSD – a connection to an INFOR CSD datalake via an IOAPI file and a database driver
MULTIVALUE – a connection to a multi-value file system using source type, UNIVERSE or UNIDATA, and server information
- Select Save to save your new connection. If you made an error, you can Delete your new connection, but do not delete any other connection information that currently exists. NOTE: If selecting JDBC, CSD, or MULTIVALUE, please contact your White Cup support representative to finish Data Pump setup.
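For a FILESYSTEM source, the .CSV file described above might look like the following (hypothetical column names and data):

```
CUSTOMER_ID,CUSTOMER_NAME,REGION
1001,Acme Supply,East
1002,Brighton Tools,West
```

The first row supplies the column names; each following row is one detail record.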
The next step is to determine whether your tables are already in the DataLake (if you used an existing Connection) or whether you need to add them (for a new Connection).
- Select Tables from the menu bar
- Type the table name into the Search box or scroll through the list of tables
- If you don’t see the tables you need, you will need to add them
- Select New Table
- Input the Table Name. If possible, use the actual table name
- Select a connection from the Connection dropdown
- Use Infer SQL to select all records from the table; you can edit the SQL if you want to apply any filtering or formatting
- Select Save
- Repeat as needed for additional tables
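As an example of editing the inferred SQL, you might restrict a table load to recent records. A minimal sketch, assuming a hypothetical ORDERS table and ORDER_DATE column (your table and column names will differ):

```sql
-- Infer SQL typically selects everything:
SELECT * FROM ORDERS;

-- Edited to apply filtering and formatting before loading into the DataLake:
SELECT ORDER_ID,
       CUSTOMER_ID,
       CAST(ORDER_DATE AS DATE) AS ORDER_DATE,  -- formatting example
       ORDER_AMOUNT
FROM ORDERS
WHERE ORDER_DATE >= '2023-01-01';               -- filtering example
```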
Scheduling Newly Created Tables
The next step is to schedule any new tables that you have added.
- Select the Schedules tab
- Select New Schedule to add a Schedule
- Input a name for the Schedule
- Select the days of the week and the hours and minutes at which you want the Schedule to run. Note: All times are in UTC (for example, 02:00 EST is 07:00 UTC).
- Input the names of the tables you added into the "DataLake tables" input box
- Input the names of the tables you added into the "DataLake tables to clear" input box
- Select Save to save your schedule
- You can delete a Schedule using the red X icon, edit it using the blue pencil icon, or run it using the green arrow icon
- Do not delete any schedules that you did not create
You can check on the progress of your new Schedule using the Status menu option and the green Refresh button.
- Select the Status tab
- The Finished Type column will tell you if your Schedule was successful
- The Error Message column will tell you if your Schedule has any errors
- It can take time for the Schedule to run if you started it manually, so you may need to use the Refresh button several times before the results are displayed
After you have added any required tables to your DataLake, the next step is to add new Tasks to extract data from the tables you have added, or to edit existing Tasks for the current set of Cubes.