Tuesday Jul 09, 2013

ODI - Loading MongoDB (API as Target)

In this post I will show how to load documents into a MongoDB collection. The interface design looks just like any other ODI interface, but behind the scenes the KM configured in the physical design uses the MongoDB SDK (see MongoDB SDK here) to insert the documents. The target datastore below represents a MongoDB document, and its columns are the keys in the document. Each row is inserted as a document: each column is a key, and the column value is the value. The ENAME value below is shown as a complex JSON value.

The IKM I have used is a multi-connect IKM: the source is a SQL data source and the target uses a MongoDB Groovy command. The heart of the IKM, which inserts the documents into the collection, has a SQL select as the source command and the following Groovy code as the target command;

  import com.mongodb.*

  // connect to the MongoDB server/port defined as KM options
  MongoClient mongoClient = new MongoClient("<%=odiRef.getOption("MONGO_SERVER")%>", <%=odiRef.getOption("MONGO_PORT")%>);
  DB db = mongoClient.getDB("<%=odiRef.getOption("MONGODB")%>");
  DBCollection coll = db.getCollection("<%=odiRef.getOption("MONGOCOLL")%>");
  BasicDBObject doc = new BasicDBObject();

  // generates a doc.put call for each mapped target column
  <%=odiRef.getColList(" ", "doc.put(\u0022[COL_NAME]\u0022, \u0022#[CX_COL_NAME]\u0022);", " \n ", "", "((INS and !TRG) and REW)")%>

  coll.insert(doc);

The odiRef.getColList method call above generates code for every target column; the generated code performs a doc.put invocation to add the key-value pairs to the document. For example, this is the code generated and executed for the interface design discussed above;

  import com.mongodb.*

  MongoClient mongoClient = new MongoClient("DALLAN-SVR", 27017);
  DB db = mongoClient.getDB("test");
  DBCollection coll = db.getCollection("testCollection");
  BasicDBObject doc = new BasicDBObject();

  doc.put("EMPNO", "#EMPNO");
  doc.put("ENAME", "#ENAME");
  doc.put("JOB", "#JOB");
  doc.put("MGR", "#MGR");
  doc.put("HIREDATE", "#HIREDATE");
  doc.put("SAL", "#SAL");
  doc.put("COMM", "#COMM");
  doc.put("DEPTNO", "#DEPTNO");

  coll.insert(doc);

This is a simple illustration of how to load documents into MongoDB. We can go to the MongoDB command line and query all documents in the collection; below you can see a preview of executing db.testCollection.find();

  • { "_id" : ObjectId("51dc3ded6c4b9a5bd07d68a6"), "EMPNO" : "7369", "ENAME" : "{ NAME : SMITH, DESCR : 22 }", "JOB" : "CLERK", "MGR" : "7902", "HIREDATE" : "1980-12-17 00:00:00.0", "SAL" : "801", "COMM" : "", "DEPTNO" : "20" }
  • { "_id" : ObjectId("51dc3ded6c4b9a5bd07d68a7"), "EMPNO" : "7499", "ENAME" : "{ NAME : ALLEN, DESCR : 22 }", "JOB" : "SALESMAN", "MGR" : "7698", "HIREDATE" : "1981-02-20 00:00:00.0", "SAL" : "1601", "COMM" : "300", "DEPTNO" : "30" }

You can see the key-value pairs in our document. MongoDB gurus will notice that in the 'complex' data illustration the ENAME value is really a string and not a MongoDB complex object - that discussion is for another day, though the sketch below shows the difference.
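To store ENAME as a true sub-document rather than a string, the KM's target command would need to build a nested BasicDBObject instead of putting a formatted string. A minimal sketch with the MongoDB Java driver (the class name is hypothetical; the server and collection names are borrowed from the example above);

  import com.mongodb.BasicDBObject;
  import com.mongodb.DBCollection;
  import com.mongodb.MongoClient;

  public class NestedDocInsert {
      public static void main(String[] args) throws Exception {
          // hypothetical standalone example; connection details from the post above
          MongoClient mongoClient = new MongoClient("DALLAN-SVR", 27017);
          DBCollection coll = mongoClient.getDB("test").getCollection("testCollection");

          BasicDBObject doc = new BasicDBObject();
          doc.put("EMPNO", "7369");
          // a nested BasicDBObject is stored as a real sub-document,
          // not as the JSON-like string shown in the find() output above
          doc.put("ENAME", new BasicDBObject("NAME", "SMITH").append("DESCR", 22));
          coll.insert(doc);
          mongoClient.close();
      }
  }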

This post is not just about MongoDB; it is also a useful illustration of how to integrate an API as a target in a data flow.

Tuesday Jan 15, 2013

ODI - Hive and MongoDB

I've been experimenting with another Hive storage handler, this time for MongoDB; there are a few out there, including this one from MongoDB. The one I have been using supports basic primitive types and supports both read and write, using the standard approach of a storage handler class and custom properties to describe the data mask. This lets you access MongoDB via a Hive external table very easily and abstracts away a lot of integration complexity, which also makes it ideal for use in ODI. I have been using it on my Linux VM, where I have Hive running, to access MongoDB running on another machine. The storage handler is found here; I used it to access the same example I blogged about here. Below is the external table definition;

  ADD JAR /home/oracle/mongo/hive-mongo.jar;

  create external table mongo_emps(EMPNO string, ENAME string, SAL int)
  stored by "org.yong3.hive.mongo.MongoStorageHandler"
  with serdeproperties ( "mongo.column.mapping" = "EMPNO,ENAME,SAL" )
  tblproperties ( "mongo.host" = "<my_mongo_ipaddress>", "mongo.port" = "27017",
    "mongo.db" = "test", "mongo.collection" = "myColl" );

Very simple. A nice aspect of the Hive external table is the SerDe properties that can be specified; simple, but a nice, flexible approach. I can then reverse engineer this into ODI (see the reverse engineering posting here) and use it in my Hive integration mappings to read from and potentially write to MongoDB; a quick sanity check of the table outside ODI is sketched below.
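Outside ODI, you can query the external table like any other Hive table, for example over JDBC. A minimal sketch, assuming a HiveServer2 instance on the default port (the class name and connection details are illustrative; older Hive releases use the org.apache.hadoop.hive.jdbc.HiveDriver class and a jdbc:hive:// URL instead);

  import java.sql.Connection;
  import java.sql.DriverManager;
  import java.sql.ResultSet;
  import java.sql.Statement;

  public class MongoViaHive {
      public static void main(String[] args) throws Exception {
          Class.forName("org.apache.hive.jdbc.HiveDriver");
          Connection con = DriverManager.getConnection("jdbc:hive2://localhost:10000/default", "", "");
          Statement stmt = con.createStatement();

          // mongo_emps is backed by the MongoDB collection via the storage handler,
          // so this reads documents from MongoDB through Hive
          ResultSet rs = stmt.executeQuery("select EMPNO, ENAME, SAL from mongo_emps");
          while (rs.next()) {
              System.out.println(rs.getString("EMPNO") + "\t" + rs.getString("ENAME") + "\t" + rs.getInt("SAL"));
          }
          rs.close();
          stmt.close();
          con.close();
      }
  }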

The supported primitive types can also project nested document types. For example, the document below (taken from here), in which name, contribs and awards have JSON structure;

  {
    "_id" : 1,
    "name" : {
      "first" : "John",
      "last" : "Backus"
    },
    "birth" : ISODate("1924-12-03T05:00:00Z"),
    "death" : ISODate("2007-03-17T04:00:00Z"),
    "contribs" : [ "Fortran", "ALGOL", "Backus-Naur Form", "FP" ],
    "awards" : [
      {
        "award" : "W.W. McDowell Award",
        "year" : 1967,
        "by" : "IEEE Computer Society"
      },
      {
        "award" : "National Medal of Science",
        "year" : 1975,
        "by" : "National Science Foundation"
      },
      {
        "award" : "Turing Award",
        "year" : 1977,
        "by" : "ACM"
      },
      {
        "award" : "Draper Prize",
        "year" : 1993,
        "by" : "National Academy of Engineering"
      }
    ]
  }

can be processed with the following external table definition (the nested values are projected as strings), which can then be used in ODI;

  create external table mongo_bios(name string, birth string, death string, contribs string, awards string)
  stored by "org.yong3.hive.mongo.MongoStorageHandler"
  with serdeproperties ( "mongo.column.mapping" = "name,birth,death,contribs,awards" )
  tblproperties ( "mongo.host" = "<my_ip_address>", "mongo.port" = "27017",
    "mongo.db" = "test", "mongo.collection" = "bios" );

All very simple, and that's what makes it so appealing. Anyway, that's a quick follow-up on using external tables with MongoDB and Hive, complementing the SQL-oriented approach I described here that used Java table functions.

Friday Jan 11, 2013

ODI - Java Table Function for MongoDB

Behind the scenes of the MongoDB posting was a very simple JavaDB/Derby table function. The function implements a couple of methods: the table function readCollection, and makeRow, which creates a row array from a Java object. It can't get much simpler. The iteration through the collection is handled by the EnumeratorTableFunction class I extended, which came from the posting by Rick Hillegas, and it fits nicely into ODI's source/target generic integration task in the KM framework. Here is a viewlet I have created showing everything, very briefly but end to end.

The makeRow function uses the MongoDB Java SDK and produces a row for each BasicDBObject; each value in the document is output as a column in the table. Nested/complex values are serialized as Java Strings, so you will get a JSON string for anything complex.

  public String[] makeRow(Object obj) throws SQLException
  {
    int idx = 0;
    BasicDBObject dbo = (BasicDBObject) obj;
    Iterator it = dbo.entrySet().iterator();
    String[] row = new String[ getColumnCount() ];
    it.next(); // skip the '_id' column
    while (it.hasNext()) {
      Map.Entry pairs = (Map.Entry) it.next();
      row[ idx++ ] = pairs.getValue().toString();
    }
    return row;
  }

The readCollection table function is a static method and has a couple of parameters (for demonstration): one is the MongoDB database name, the other the collection name. The function initializes the object instance with the column names, which are defined to be the key names of the objects in the collection (the first object is taken and its keys used as the column names);

  public static ResultSet readCollection(String dbName, String collectionName)
    throws SQLException, UnknownHostException
  {
    int idx = 0;
    MongoClient mongoClient = new MongoClient();
    DB db = mongoClient.getDB(dbName);
    DBCollection coll = db.getCollection(collectionName);
    DBCursor cursor = coll.find();
    // take the first document and use its keys as the column names
    BasicDBObject dbo = (BasicDBObject) coll.findOne();
    Set<String> keys = dbo.keySet();
    String[] skeys = new String[keys.size()];
    Iterator it = keys.iterator();
    it.next(); // skip the '_id' key
    while (it.hasNext()) {
      skeys[idx++] = it.next().toString();
    }
    return new mongo_table( skeys, cursor );
  }

The mongo_table constructor just initializes itself and sets the enumeration to iterate over. The class I extend from is very useful; it can iterate over Java Enumeration, Iterator, Iterable, or array objects. The superclass constructor initializes the column names, and setEnumeration defines the collection/iterator, which in this case is a MongoDB DBCursor - which happens to be a Java Iterator<DBObject>. A small JDBC harness exercising the function end to end is sketched after the constructor.

  public mongo_table(String[] column_names, DBCursor cursor)
    throws SQLException
  {
    super( column_names );
    setEnumeration( cursor );
  }
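Outside ODI you can exercise the whole thing with a few lines of JDBC against an in-memory Derby instance. A hypothetical harness, assuming mongo_table.jar, the MongoDB Java driver and derby.jar are all on the classpath (the function DDL mirrors what the LKM generates, shown next);

  import java.sql.Connection;
  import java.sql.DriverManager;
  import java.sql.ResultSet;
  import java.sql.Statement;

  public class MongoTableFunctionDemo {
      public static void main(String[] args) throws Exception {
          // in-memory JavaDB/Derby instance - no setup, no persistence
          Connection con = DriverManager.getConnection("jdbc:derby:memory:demo;create=true");
          Statement stmt = con.createStatement();

          // register the table function, as the LKM does
          stmt.execute(
              "create function MONGO_EMPS( dbName varchar(30), collName varchar(30)) " +
              "returns table (EMPNO VARCHAR(20), ENAME VARCHAR(30), SAL NUMERIC(10)) " +
              "language java parameter style DERBY_JDBC_RESULT_SET no sql " +
              "external name 'mongo_table.readCollection'");

          // query MongoDB through the table function
          ResultSet rs = stmt.executeQuery(
              "select MON.EMPNO, MON.ENAME, MON.SAL from table(MONGO_EMPS('test', 'myStuff')) MON");
          while (rs.next()) {
              System.out.println(rs.getString(1) + "\t" + rs.getString(2) + "\t" + rs.getString(3));
          }
          rs.close();
          stmt.close();
          con.close();
      }
  }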

This approach can be used for sourcing pretty much anything, which is great for integration needs. The ODI Knowledge Module is an LKM; it stages the result of the table function into a work table, and everything else proceeds as normal. The KM creates the work table and also registers the table function with JavaDB/Derby. My code for the function registration is as follows;

  create function <%=odiRef.getSrcTablesList("","[TABLE_NAME]", "","")%>( dbName varchar( 330), collName varchar( 30))
  returns table
  (
  <%=odiRef.getSrcColList("","[COL_NAME] [SOURCE_CRE_DT]","[COL_NAME] [SOURCE_CRE_DT]",",\n","")%> )
  language java
  parameter style DERBY_JDBC_RESULT_SET
  no sql
  external name '<%=odiRef.getOption("TABLE_FUNCTION_NAME")%>'

This creates the table function with the same name as the datastore in the interface, and the resultant table of the function has the columns (and types) from that datastore. The external JavaDB function name is taken from the KM option TABLE_FUNCTION_NAME. As I mentioned, I have hard-wired 2 parameters just now. The Java code implementing the function should be compiled and put in a JAR in the normal userlib directory used for adding custom code, including JDBC drivers. The other JARs needed are the MongoDB Java SDK JAR, derby.jar and vtis-example.jar (from the zip here). You can get the Java source for mongo_table.java here; it is compiled with the MongoDB Java SDK on the classpath as follows (on Windows);

  javac -classpath mongo-2.10.1.jar;vtis-example.jar mongo_table.java
  jar cvf mongo_table.jar mongo_table.class

The LKM is here; it needs to be imported into your project.

Anyway... this wasn't all about MongoDB per se; it was also about the JavaDB table function capability. Do any other examples of integration capabilities using this route spring to mind? I'm going to post about loading into MongoDB and how an IKM is built for that. Interested to hear any ideas/feedback from you on this... so don't be shy!

Thursday Jan 10, 2013

ODI - MongoDB and Java Table Functions Anyone?

Let's see how we could integrate MongoDB using ODI, but first let's take a step back. Derby/JavaDB introduced table functions a few years ago. Table functions are really useful; they are in the Oracle database and just as much fun in JavaDB! ODI is a great platform for integration, and JavaDB table functions provide a really nice way to integrate arbitrary Java APIs into your designs. What I have done here is;

  • built a very simple Java table function that projects the documents in a MongoDB collection as a table. The collection is passed as a parameter to the KM, and the column names are the keys of the MongoDB documents; the data comes from the values.
  • built a very simple LKM from a Java table function to SQL

All of this uses JavaDB in-memory, so there is no admin; it is simple to use. Rick Hillegas wrote a nice article with some handy classes that I have used. The mongo_table class I have written uses the EnumeratorTableFunction class included in Rick's examples. The MongoDB DBCursor class is a Java Iterator, which makes it really easy to pass to the EnumeratorTableFunction class and let it do all of the work.

The LKM I constructed declares the table function to JavaDB/Derby. In the example below, the function is declared based on the source datastore name (MONGO_EMPS) and columns (my source datastore has EMPNO, ENAME and SAL; note the table function will actually project the types defined in ODI's datastore), and the function takes the MongoDB database name and collection name as parameters.

  create function MONGO_EMPS( dbName varchar( 330), collectionName varchar( 30))
  returns table
  (
    EMPNO VARCHAR(20),
    ENAME VARCHAR(30),
    SAL NUMERIC(10)
  )
  language java
  parameter style DERBY_JDBC_RESULT_SET
  no sql
  external name 'mongo_table.readCollection'

Then the actual code to use the function as a source is executed from a source task (the target is SQL, as I mentioned earlier for the LKM). Below you can see my execution using the test MongoDB database and the myStuff collection;

  select
    MON.EMPNO   C1_EMPNO,
    MON.ENAME   C2_ENAME,
    MON.SAL     C6_SAL
  from table(MONGO_EMPS('test', 'myStuff' )) MON
  where (1=1)
  and (MON.SAL > 4000)

Note I can also perform some filtering as an example; here it is being done in JavaDB, and in my case in-memory. No setup, no persistence, just on-the-fly Java. Ideally I would push the filter down to MongoDB rather than reading and filtering in the driver - more on that later, but the sketch below shows the gist.
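The readCollection function above always issues a bare coll.find(), so the SAL predicate is evaluated in Derby. To push it down, the filter would be expressed as a MongoDB query document instead; a sketch of the idea (the class name is hypothetical; the test database and myStuff collection are from the example below);

  import com.mongodb.BasicDBObject;
  import com.mongodb.DBCollection;
  import com.mongodb.DBCursor;
  import com.mongodb.MongoClient;

  public class PushdownSketch {
      public static void main(String[] args) throws Exception {
          MongoClient mongoClient = new MongoClient(); // defaults to localhost:27017
          DBCollection coll = mongoClient.getDB("test").getCollection("myStuff");

          // build the WHERE clause as a MongoDB query document so the server
          // filters the documents instead of the JavaDB driver
          BasicDBObject query = new BasicDBObject("SAL", new BasicDBObject("$gt", 4000));
          DBCursor cursor = coll.find(query); // equivalent of MON.SAL > 4000

          while (cursor.hasNext()) {
              System.out.println(cursor.next());
          }
          cursor.close();
          mongoClient.close();
      }
  }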

I had defined my documents in MongoDB using the following basic commands in the mongo shell;

  use test
  a1 = { EMPNO: "1", ENAME : "Fred", SAL : 10000 }
  a2 = { EMPNO: "2", ENAME : "John", SAL : 2000 }
  db.myStuff.insert( a1 )
  db.myStuff.insert( a2 )

In ODI, I can then simply use the datastore representing the MongoDB collection of documents in an interface and map it to my target;

The physical design uses the LKM JavaTableFunction to SQL KM and sets the MongoDB database, collection and Java table function name.

That's it. Pretty straightforward, and we are reading and consuming MongoDB documents. So what about complex document types like this? These are transported too (as JSON strings for now); more to come. The other point here is that this is a generic LKM into which you can plug other arbitrary table functions, so you can consume from any API; it's very simple. For the LKM I created, I just defined 2 parameters (because my table function only had two), but really we need a better way to handle this and ensure parameters are ignored if not defined, etc. That's all polishing though. Anyway, fun stuff you can see all with a small piece of code leveraging JavaDB!
