Variables
Add flexibility and portability to your jobs and transformations with variables.




Internal.Job.Filename.Directory
The directory where the job file is located.
Internal.Job.Filename.Name
The name of the job file.
Internal.Entry.Current.Directory
The directory where the current entry is located.
Internal.Transformation.Repository.Directory
If you're running a transformation from the repository, this variable contains the repository directory path.
Internal.Cluster.Size
The number of slave servers in the cluster.
Internal.Step.Name
The name of the step being executed.
KETTLE_HOME
The location of the kettle.properties file.
java.version
The JRE runtime version.
os.name
The name of the operating system.
os.version
The operating system version.
user.name
The name of the user account.
user.home
The user's home directory.
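
Any of the variables above can be referenced in step fields that support variable substitution, using the ${...} syntax. As a minimal, hypothetical sketch (the column aliases are invented for illustration), a Table Input SQL statement could echo a few of them, provided the step's "Replace variables in script?" option is checked:

```sql
-- Hypothetical illustration: each ${...} token is substituted by Kettle
-- before the statement is sent to the database.
SELECT '${Internal.Job.Filename.Directory}' AS job_dir
     , '${user.name}'                       AS run_by
     , '${java.version}'                    AS jre_version
```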

Open tr_status_variable.ktr.
Double-click on the canvas to open the Transformation Properties dialog.
Click on the Parameters tab and configure as illustrated below:

Ensure the default value uses exactly the same case as the value stored in the table (capital "S").
Double-click on the Table Input step.
Modify the SQL statement as illustrated below:

Add the following clause.
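
As a sketch only .. assuming the parameter defined on the Parameters tab is named STATUS and the source table has a status column (both names are assumptions, not taken from the exercise), the clause could look like this, with "Replace variables in script?" checked in the Table Input step:

```sql
-- ${STATUS} is replaced with the parameter value before the query runs;
-- the value must match the case stored in the table (capital "S").
WHERE status = '${STATUS}'
```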
Double-click on the Table Input step.
Take a look at how the new field, diff_days, is calculated:

Double-click on the Number range step.
Take a look at how the new field, delivery, is defined:

Double-click on the Sort rows step.
Take a look at how the step is defined:

Double-click on the Select values step.
Take a look at how the step is defined:

Click on the Meta-data tab.

Click the Run button in the Canvas Toolbar.
Click on the Preview tab:

You can’t set and use a variable in the same transformation, because all of its steps run in parallel.

Open kb_set_get_variables.kjb.
Double-click on the Set Variables transformation job entry.

Open tr_set_variables.ktr.

Open the Data grid step.

Open the Set variables step.

Open tr_get_variables.ktr.

Open the Get variables step.

Be careful when clicking Get variables .. it will return all the defined variables!
Run the Job.

Open kb_setting_variables.kjb.

Double-click on the Set Variables job entry.
