Using ODI Variables in Topology: Pushing Contexts to the Extreme

The posts in this series assume that you have some level of familiarity with ODI. The concepts of Context, Topology, Logical Architecture, Physical Architecture and Variables are used here assuming that you understand them in the context of ODI. If you need more details on these elements, please refer to the ODI Tutorial for a quick introduction, or to the complete ODI documentation for more details.

In a previous post, we discussed the notion of Context and how powerful it can be. There is a limit, however. If you remember our discussion, each server, and each schema on each server, has to be defined in the ODI Topology. Now imagine that you design processes that have to run on a large number of systems. The exact same code will be executed; only the physical location will be different. What if this code has to run on thousands of systems? Who wants to maintain the connection information manually through a graphical interface? And maintain thousands of contexts while doing so?

Realistically, I think that contexts are fine for up to about a dozen environments. Beyond that, you need a setup that is more dynamic. But we want to keep the benefits of having the exact same code on all systems, and the flexibility of a complete separation between the generated code and the execution location. The solution? Use variables in Topology!


Before diving head first into the usage of variables in Topology, I strongly recommend the creation of 2 contexts:
- A Development context where all URLs in Topology point to an actual server, without using the variables. This will ensure that data can be viewed, and interfaces can be tested, without any concerns regarding variable resolution;
- A Dynamic context (similar to what will be used in QA and Production) that will use the variables in Topology to name the servers, port numbers, user names for the connections, etc. The package will assign the appropriate values to the variables and run the interfaces on the appropriate servers. This context will only be used to validate that the processes defined in the Development context work properly when we use the variables.

Independently from ODI, we will need a table to store the values that will be used for the Topology variables. For this example, we will simply use different server names. Keep in mind that other Topology parameters can be set using this same technique.

The table for our structure will contain the server names, as well as the name of an ODI agent: with many processes running concurrently, it is better to assign pools of servers to different agents. We will use the following structure:

create table ODI_SERVERS (
    SERVER_NAME varchar(50),
    AGENT_NAME varchar(50)
);
Loading this table is not part of the description we have here. If you have a list of servers available somewhere, ODI would be the perfect tool to load your table though…
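Outside of ODI, a minimal sketch of creating and populating this table might look like the following. This uses SQLite purely for illustration, and the server and agent names are made up; in a real setup the table would live in one of the databases declared in your Topology.

```python
import sqlite3

# In-memory database purely for illustration; in a real setup this table
# would live in one of the databases declared in your ODI Topology.
conn = sqlite3.connect(":memory:")
conn.execute("""
    create table ODI_SERVERS (
        SERVER_NAME varchar(50),
        AGENT_NAME  varchar(50)
    )
""")

# Hypothetical server inventory: each agent owns a pool of servers.
conn.executemany(
    "insert into ODI_SERVERS values (?, ?)",
    [
        ("server01.example.com", "AGENT_EAST"),
        ("server02.example.com", "AGENT_EAST"),
        ("server03.example.com", "AGENT_WEST"),
    ],
)

# The refresh query for the ServerName variable can then filter by agent,
# so that each agent only works on its own pool of servers.
pool = [row[0] for row in conn.execute(
    "select SERVER_NAME from ODI_SERVERS"
    " where AGENT_NAME = ? order by SERVER_NAME",
    ("AGENT_EAST",),
)]
print(pool)  # ['server01.example.com', 'server02.example.com']
```

Filtering on AGENT_NAME is what lets you assign pools of servers to different agents, as mentioned above.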


2.1 Creation of the Variable
Details on the creation of a variable can be found in this post. We will simply review the main steps here.

Create a new variable called ServerName.
The data type is Alphanumeric.
You can set the default value to localhost.
You can use the Description field to describe the variable.
In the Refreshing tab, select the logical schema that hosts your ODI_SERVERS table and enter the following query:

select SERVER_NAME from ODI_SERVERS

Note: for more flexibility in the execution of your code, you should never hard-code a schema name – nor assume that the login used to connect to a server defaults to your schema name. Best practice in the example above would be to let ODI complete the table name with the schema name by using the following syntax:

select SERVER_NAME from <%=odiRef.getObjectName("L", "ODI_SERVERS", "D") %>
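To see why this matters, here is a hypothetical sketch (in Python, not ODI code) of what getObjectName does conceptually: it resolves a logical schema to the physical schema mapped for the current context, so the same refresh query runs against the right schema everywhere. The schema and context names below are made up for the example.

```python
# (logical schema, context) -> physical schema; all names are made up.
TOPOLOGY = {
    ("ORA_ADMIN", "DEVELOPMENT"): "ODI_DEMO",
    ("ORA_ADMIN", "DYNAMIC"): "ODI_PROD",
}

def get_object_name(logical_schema: str, table: str, context: str) -> str:
    """Qualify a table name with the physical schema mapped for the context."""
    physical_schema = TOPOLOGY[(logical_schema, context)]
    return f"{physical_schema}.{table}"

# The same expression resolves differently depending on the context:
print(get_object_name("ORA_ADMIN", "ODI_SERVERS", "DEVELOPMENT"))  # ODI_DEMO.ODI_SERVERS
print(get_object_name("ORA_ADMIN", "ODI_SERVERS", "DYNAMIC"))      # ODI_PROD.ODI_SERVERS
```

This is why hard-coding the schema name would break as soon as the code runs in a different context.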


2.2 Definition of the Topology

2.2.1 Contexts

To get more details on the creation of contexts, please check out this example. If you already know your way around Topology, simply create the following two contexts: Development and Dynamic.

Servers Context

2.2.2 Development Server: Hardcoded Values

In Topology, define one of the servers that you want to access. We will use this server for our development work, and ignore the variables for now: enter the parameters as usual for the definition of this server:

Development Server

Make sure to test the connectivity to validate your parameters. Define a physical schema as usual under this data server. Map it to the logical schema ORA_DYNAMIC_SERVER for the Development context.

Note: you will have to create the logical schema to perform this mapping.

Servers: Development Schema

2.2.3 Dynamic Server: Using the Variables

For the dynamic contexts, we will use the variable as part of the connection string (aka the JDBC URL). Instead of typing the actual hostname, type the variable name, including the CODE of the project where the variable was created (or GLOBAL for a global variable).

URL example for Oracle:

Servers: Dynamic

This physical server will be considered as the dynamic server. Map its physical schema to the logical schema defined in the previous step, in the Dynamic context this time.

Dynamic Server Schema
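Conceptually, what the agent does with such a URL can be sketched as a simple token substitution. The following Python sketch is illustrative only (it is not how ODI is implemented internally), and MYPROJ is a made-up project code:

```python
import re

def resolve_url(url_template: str, variables: dict) -> str:
    """Replace #<PROJECT_CODE>.<VARIABLE> tokens with their session values."""
    return re.sub(
        r"#([A-Za-z0-9_]+\.[A-Za-z0-9_]+)",
        lambda m: variables[m.group(1)],
        url_template,
    )

# MYPROJ is a made-up project code; ServerName is the variable created above.
url = "jdbc:oracle:thin:@#MYPROJ.ServerName:1521:ORCL"
print(resolve_url(url, {"MYPROJ.ServerName": "server01.example.com"}))
# jdbc:oracle:thin:@server01.example.com:1521:ORCL
```

The key point is that the substitution happens before the connection is opened, which is exactly why the variable values must be known to the agent up front.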


Your code will work as usual in the Development environment: you can validate your transformations, your processing logic and your processing time. When you want to use the Dynamic context though, a few more steps are required.

3.1 Connection to the Databases

The ODI agents are in charge of the connection to the databases, sources and targets. At the beginning of the execution of a process, the agent will connect to the Master Repository to retrieve the Topology information. Then it will connect to the databases, generate the code, and send that code to the databases. This represents a challenge in our case, since we need to set a value for our variables before the agent connects to the databases: we cannot set the value of our variables in the package that will be executed...

3.2 Setting the Values for Topology Variables

The solution will be to pass these values as parameters to the process, so that the values are known to the agent before it establishes the connection. You can set the values no matter how you start the scenario: from a command line interface, from another scenario or from an ODI procedure, from a web service or from a Java API. Remember to declare your variables at the beginning of your package to make sure that the value of the parameters gets properly stored though!
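For instance, if you start the scenario from the command line, the invocation can be sketched as follows. The scenario name, version, context and project code are made up, and the exact startscen syntax may vary with your ODI version, so check the documentation for your release:

```python
def build_startscen(scenario: str, version: str, context: str, variables: dict) -> list:
    """Assemble a startscen command line, passing topology variables as parameters."""
    cmd = ["startscen.sh", scenario, version, context]
    # Each variable is passed as -<PROJECT_CODE>.<VARIABLE>=<value>.
    cmd += [f"-{name}={value}" for name, value in variables.items()]
    return cmd

cmd = build_startscen("LOAD_SALES", "001", "DYNAMIC",
                      {"MYPROJ.ServerName": "server01.example.com"})
print(" ".join(cmd))
# startscen.sh LOAD_SALES 001 DYNAMIC -MYPROJ.ServerName=server01.example.com
```

Because the value arrives as a parameter, the agent knows it before resolving the Topology and opening the connections.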

For more details on how to pass parameters to an ODI process, you can check out this other post.


- Make sure that the variable name is properly spelled in Topology (variable names are case sensitive)
- Make sure that the variable is declared at the beginning of your package
- Make sure that the variable is referenced with its project CODE (not the project name) in Topology

For more information on ODI contexts, please refer to the ODI Users Guide (part of the Documentation Library that comes with the ODI installation), in particular the entry "What is the Topology?"

All screenshots were taken using one specific version of ODI. Actual icons and graphical representations may vary with other versions of ODI.


Christophe, thanks for the post. I have a question: when should I use this? Context is dynamic enough to point the same logical schema to multiple physical schemas. If I use the same physical schema with different contexts and a variable, I may be forced to use one single physical server with one logical schema. In an integration environment I would like all of my connection information explicitly stored inside Topology. Keep up the good work, I need posts like these.

Posted by Sid on May 11, 2009 at 12:13 AM PDT #

Sid, you are absolutely correct: Contexts will be the perfect solution in a large majority of cases. I would only use variables if you have to create too many contexts all of a sudden. Then, the management of all physical servers for each context may become a challenge. We have customers who would have hundreds (even thousands for one of them!) of contexts: at that level, you need an alternative solution, and the variables become very handy. Thanks for the encouragement :) -Christophe

Posted by Christophe on May 11, 2009 at 12:45 AM PDT #

Hi Christophe, can I apply CDC in this scenario, while connecting to various instances of the source system? Thanks Baji

Posted by Baji on May 26, 2009 at 02:47 PM PDT #

Baji, Absolutely. CDC would not change the way this works. Keep in mind that you will always have to pass variables to your scenarios to set the connectivity information. From a CDC perspective, this means that starting the journal, stopping the journal, adding and removing subscribers will have to be done from within a scenario (and not from the GUI anymore). Simply drag and drop your data stores and/or models in a package to automate these operations. My best -Christophe

Posted by Christophe Dupupet on May 26, 2009 at 10:12 PM PDT #

Has anyone tried these dynamic physical connections in parallel execution contexts?

Posted by pedro on March 25, 2010 at 01:50 AM PDT #

Pedro, Absolutely. We have quite a few customers using this setup to run their processes in parallel against multiple systems (either different schemas in the database or separate databases altogether). When running against multiple schemas of the same database, it's only a matter of making sure that the database itself can sustain the activity... We have a customer with over 40,000 schemas on the same database, and they use this approach to run their integration processes. They do run hundreds (thousands? I have to check with them on the actual count) of the same ODI process in parallel against that very database...

Posted by Christophe Dupupet on April 02, 2010 at 03:47 AM PDT #

If one of the parallel scenarios with variable topology needs to be restarted due to failure midstream will the variables retain the passed in value?

Posted by Dan on July 20, 2010 at 03:52 AM PDT #

Yes indeed (full disclosure: we went back and forth Dan and I on this one. I thought it would... but Dan proved it! Thanks Dan)

Posted by Christophe Dupupet on August 31, 2010 at 01:17 AM PDT #

I'm trying to use this technique and it seems to work just fine for most fields except for 'DEST_PASS' . I was expecting it will be picked up on my LKM with the following code .. 'LKM File to Oracle(SQLLDR)' has the following piece of code... loadcmd = r"""sqlldr <%=odiRef.getInfo("DEST_USER_NAME")%>/<%=odiRef.getInfo("DEST_PASS")%>%s control="%s" log="%s" > "%s" """ % (oracle_sid,ctlfile, logfile, outfile) Please advise.

Posted by CMB on December 01, 2010 at 04:43 AM PST #

You are indeed correct with the passwords. Because the passwords are encrypted in Topology, the variable name gets encrypted… and as such does not get properly substituted. However… the 11g release of ODI comes with a very extensive SDK that allows you to code anything you want around the default behavior of the GUI. Given enough time (hum), you could literally rebuild the entire ODI GUI based on the SDK (actually this is what is done today!). So… by leveraging the SDK, you should be able to perform any required customization based on the parameters you want to put together. Sample code for the ODI SDK is available for download. Hope that helps!

Posted by Christophe Dupupet on December 01, 2010 at 12:44 PM PST #

I came across a problem while moving to the execution repository. The scenarios (inside which the variable is refreshed) do not work in the execution repository, complaining of a bad connection. Is there any possibility to get it working in the execution repository, to which the variables cannot be imported? Thanks for any advice.

Posted by ibich on June 27, 2011 at 08:03 AM PDT #


In the SQL expression for your variable refresh, make sure that you use <%=odiRef.getObjectName("YourTable")%> in place of your table name. This will dynamically resolve to the appropriate schema name / table name which I suspect to be your problem. Also make sure that all logical schemas are properly mapped in your production context.

Posted by guest on June 28, 2011 at 08:51 AM PDT #

Hi Christophe,
I´m working with Hyperion Applications being loaded using views.
And the problem is that the names of the apps change from year to year.
Any idea to avoid editing physical/logical schemas, and reverse engineering the models?
Thanks in advance,


Posted by guest on November 23, 2011 at 05:42 AM PST #


Where does that apps name appear? Sorry but I am not familiar with Hyperion Applications.

Generally speaking, the logical name should not change - after all, it's only an alias for internal use in ODI. Variables can be used in Topology and in the models. You could set the value by using it as a parameter when you start your process.

Hope this helps... and Happy Thanksgiving!

Posted by Christophe Dupupet on November 23, 2011 at 08:05 AM PST #


Are there any system variables or functions that I could use to get the date of the last execution of a package or a procedure?



Posted by kamila on April 02, 2012 at 03:10 AM PDT #


What you would normally do is retrieve the current time with a SQL query and assign that value to a variable so that you can remember the value later.

If you are using an Oracle database, you would define a "refresh query" for your variable, something like "select sysdate from dual".

Then add this variable to your package and select "refresh variable".

As long as the variable is defined with "historize values" or "keep last value" this value will be saved, and you can re-use the value in any subsequent run.

An example of this behavior is available here:

My best

Posted by Christophe on April 02, 2012 at 03:21 AM PDT #


Among the substitution methods, which ones could I use to determine the final status of execution of a procedure or package in ODI?


Posted by kamila on April 02, 2012 at 08:32 AM PDT #


There are two approaches that can be considered here. The first one is with substitution APIs. The second one would be with the SDK. Depending on what type of information you are trying to retrieve (or when you want to retrieve that information), you may want to consider one or the other.

1. Substitution API
All substitution APIs are documented in the "KM Developer's Guide" available here:
To retrieve the status of a previous step, you can use getPrevStepLog(). This will return the status of the most recently executed step in the same package. So if you wrap your package or procedure in another package, that "master package" can be used to retrieve the status with the API.

2. SDK
The SDK will give you more flexibility in what you want to retrieve, but goes beyond just one API call.
I have built a quick example here:

My best

Posted by Christophe on April 02, 2012 at 10:34 AM PDT #


This is why you want to use the substitution API or the SDK - these will provide the proper values no matter which repository you are using, because they rely on the proper connection.

I hope this helps


Posted by Christophe on April 27, 2012 at 01:29 PM PDT #

Hi Christophe,

First of all congratulations for the good work you're doing here. I'm using ODI to load data from an oracle DB to Essbase which is supporting HPCM applications. The problem that I have is related with the names of these applications that often suffer changes due to Essbase cubes that get corrupted and the application needs to be copied to a similar one with a different name (new applications have the same physical architecture that the old ones).

I already tried to use a variable in the physical architecture (in the 'schema' field) but it didn't work.

Any thoughts about it?

B. Soares

Posted by Soares on May 18, 2012 at 06:51 AM PDT #

You have to make sure that the schema name is passed as a parameter:
- in your package, one of the very first steps must be a "declare variable"
- when you invoke your scenario, add the variable name and its value for this run as a parameter.

If the value of the parameter is not set before you start the scenario (i.e. as a parameter) then the scenario cannot properly connect to your system.

I hope this will help!


Posted by Christophe Dupupet on May 25, 2012 at 11:58 AM PDT #

Hi christophe

Very useful for multiple execution contexts, or for an external scenario configuration tool and scheduler.

We use variables in Topology, model definitions (files, table names...), ODI tools... It works.



Posted by Ouail on July 04, 2012 at 11:33 AM PDT #

Hello Christophe,

thank you for the useful post.

While declaring an Oracle data server, we are trying to store the password's value in a global variable, passing it in the 'properties' tab of the data server.

We created a package with two steps:
1) declare the global variable
2) execute a scenario

The first step of the scenario ends with the following error:
ORA-01017: invalid username/password; logon denied

Posted by stefania on November 07, 2012 at 02:14 AM PST #

Sorry, I pressed 'Post' before finishing the comment.

We tried storing both the unencrypted and the encrypted value in the password, but it doesn't work either way.

Do you have any tips?

Thank you


Posted by stefania on November 07, 2012 at 02:17 AM PST #

Topology variables must be known before scenario compilation.

So when you use variables to store Topology information, those variables should be passed to the scenario in the OdiStartScen command.


Posted by guest on November 07, 2012 at 09:16 AM PST #


While using global variables in the password field of a data server connection in ODI 12c, I am facing "ORA-01017: invalid username/password; logon denied".

If I hardcode the password, the connection works fine. If I use a global variable in the User field, even then it works fine. But with #GLOBAL.<var_name> in the Password field, the connection fails and works only with the hardcoded password value.

All was working fine in ODI 11g. After importing code from ODI 11g to ODI 12c, I am not able to execute with global variables in password fields.

Please kindly help with this issue.

Posted by guest on March 03, 2015 at 03:18 AM PST #

Which version of 12c are you using? It's best to ensure you have the latest patchset, and then, if it is still an issue, report a bug.


Posted by David on March 16, 2015 at 01:08 PM PDT #
