Just for completeness, and to bring people up to date on a minor detail, here is a quick example of a data-pump-driven external table.
SQL> CREATE TABLE EMP_50
  2  ORGANIZATION EXTERNAL
  3  ( TYPE oracle_datapump
  4    DEFAULT DIRECTORY dmp_dir
  5    LOCATION ('emp_50.dmp'))
  6  AS SELECT * FROM EMPLOYEES WHERE DEPARTMENT_ID = 50;

SQL> SELECT COUNT(1) FROM EMP_50;
There are a couple of things here. First of all, you can happily use a wildcard in the SELECT these days. I noticed some blog posts declaring you cannot do this, but on my install it worked just fine.
Secondly, and this is why I think this is quite a nice feature, you can filter the data using regular SQL. In my example I used a simple WHERE clause, but there is nothing stopping you from writing real extract queries for ETL jobs.
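To illustrate that point, here is a sketch of a slightly more realistic extract, with a join and a transformation in the query. It assumes the standard HR sample schema (EMPLOYEES, DEPARTMENTS) and a hypothetical dump file name:

```sql
-- Sketch only: table and column names come from the HR sample schema;
-- emp_extract / emp_extract.dmp are hypothetical names for illustration.
CREATE TABLE emp_extract
ORGANIZATION EXTERNAL
( TYPE oracle_datapump
  DEFAULT DIRECTORY dmp_dir
  LOCATION ('emp_extract.dmp'))
AS
  SELECT e.employee_id,
         UPPER(e.last_name)  AS last_name,
         d.department_name
    FROM employees e
    JOIN departments d
      ON d.department_id = e.department_id
   WHERE e.hire_date >= DATE '2005-01-01';
```

Anything you can express as a query can be unloaded this way, which is what makes it useful as an extract step.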
Data pump gives you a nice way to unload data and avoid using database links. I found the following on another blog on how to transfer the file to a remote box (original post here - look in the comments):
Additional info, also 10g and later: transferring a file to a different database (10g, ASM aware) using DBMS_FILE_TRANSFER.PUT_FILE:

BEGIN
  DBMS_FILE_TRANSFER.PUT_FILE(
    SOURCE_DIRECTORY_OBJECT      => 'SOURCE_DIR',
    SOURCE_FILE_NAME             => 'exm_old.txt',
    DESTINATION_DIRECTORY_OBJECT => 'DEST_DIR',
    DESTINATION_FILE_NAME        => 'exm_new.txt',
    DESTINATION_DATABASE         => 'US.ACME.COM');
END;
/
To transfer a file in the other direction, replace the PUT_FILE procedure with the GET_FILE procedure. You can monitor copy progress using the V$SESSION_LONGOPS view.
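A minimal monitoring query might look like the following; the OPNAME filter is an assumption about how the transfer is labelled, so you may want to drop it and eyeball the rows first:

```sql
-- Watch an in-flight file copy; SOFAR/TOTALWORK are reported in blocks.
SELECT opname,
       sofar,
       totalwork,
       ROUND(sofar / totalwork * 100, 1) AS pct_done
  FROM v$session_longops
 WHERE totalwork > 0
   AND sofar <> totalwork;
```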
This makes it all a very nice way of running ETL from Oracle to Oracle, avoiding those dreaded database links...
Obviously, things like ftp and scp also work for moving files around, but I kind of like the DBMS_FILE_TRANSFER approach.