In the latest Application Express release 19.1, the Data Loading component in SQL Workshop has been dramatically improved. Not only CSV files, but also JSON, XML and XLSX files can be loaded into tables in the APEX workspace.
However, some readers might already have noticed that the new Data Loading wizard can only load into new tables. To accommodate SQL Workshop users who need to load data into existing tables, the functionality from previous APEX versions is still available: in the lower right corner there is a task named Load into Existing Table.
Clicking that link takes you to the familiar pages from previous APEX versions. Note that loading into existing tables is limited to CSV data.
The ability to load XML files into existing tables is missing in APEX 19.1. This blog posting provides some approaches for using the new Data Loading functionality (and its ability to upload XLSX or JSON files) to load into existing tables. Enabling the new Data Loading functionality to load into existing tables out of the box is one of the top items on the list for the next release, 19.2.
The Load XML to existing tables functionality in previous APEX releases was basically a very thin wrapper on top of the DBMS_XMLSTORE package - that package actually did all the work. With this in mind, it is very easy and straightforward to build an APEX application which does exactly the same thing. First, we'll create a PL/SQL procedure which accepts a BLOB and a table name and which invokes DBMS_XMLSTORE - just as APEX did.
create or replace procedure load_xml_to_existing_table(
    p_xml_content blob,
    p_table_name  varchar2,
    p_table_owner varchar2 default sys_context( 'userenv', 'current_user' ) )
is
    l_clob                    clob;
    l_ins_ctx                 dbms_xmlstore.ctxType;
    l_rows                    number;

    l_nls_date_format         varchar2(255);
    l_nls_timestamp_format    varchar2(255);
    l_nls_timestamp_tz_format varchar2(255);

    c_nls_date_format         varchar2(255) := 'rrrr-MM-dd"T"HH24:mi:ss."000"';
    c_nls_timestamp_format    varchar2(255) := 'rrrr-MM-dd"T"HH24:mi:ss.FF3';
    c_nls_timestamp_tz_format varchar2(255) := 'rrrr-MM-dd"T"HH24:mi:ss.FF3 TZR';

    procedure convert_to_clob is
        l_dest_offset  number := 1;
        l_src_offset   number := 1;
        l_lang_context number := dbms_lob.default_lang_ctx;
        l_warning      number;
    begin
        dbms_lob.createtemporary( l_clob, true );
        dbms_lob.converttoclob(
            dest_lob     => l_clob,
            src_blob     => p_xml_content,
            amount       => dbms_lob.lobmaxsize,
            dest_offset  => l_dest_offset,
            src_offset   => l_src_offset,
            blob_csid    => nls_charset_id( 'AL32UTF8' ),
            lang_context => l_lang_context,
            warning      => l_warning );
    end convert_to_clob;
begin
    convert_to_clob;

    -- remember the current NLS session settings so they can be restored later
    for c1 in ( select parameter, value
                  from nls_session_parameters
                 where parameter in ( 'NLS_DATE_FORMAT', 'NLS_TIMESTAMP_FORMAT', 'NLS_TIMESTAMP_TZ_FORMAT' ) )
    loop
        if c1.parameter = 'NLS_DATE_FORMAT' then
            l_nls_date_format := c1.value;
        elsif c1.parameter = 'NLS_TIMESTAMP_FORMAT' then
            l_nls_timestamp_format := c1.value;
        elsif c1.parameter = 'NLS_TIMESTAMP_TZ_FORMAT' then
            l_nls_timestamp_tz_format := c1.value;
        end if;
    end loop;

    -- switch the session to the date formats used in the XML document
    dbms_session.set_nls( 'NLS_DATE_FORMAT',         '''' || c_nls_date_format         || '''' );
    dbms_session.set_nls( 'NLS_TIMESTAMP_FORMAT',    '''' || c_nls_timestamp_format    || '''' );
    dbms_session.set_nls( 'NLS_TIMESTAMP_TZ_FORMAT', '''' || c_nls_timestamp_tz_format || '''' );

    l_ins_ctx := dbms_xmlstore.newContext(
                     dbms_assert.enquote_name( p_table_owner ) || '.'
                     || dbms_assert.enquote_name( p_table_name ) );
    dbms_xmlstore.setrowtag( l_ins_ctx, 'ROW' );
    l_rows := dbms_xmlstore.insertxml( l_ins_ctx, l_clob );
    dbms_xmlstore.closecontext( l_ins_ctx );

    -- restore the original NLS session settings
    dbms_session.set_nls( 'NLS_DATE_FORMAT',         '''' || l_nls_date_format         || '''' );
    dbms_session.set_nls( 'NLS_TIMESTAMP_FORMAT',    '''' || l_nls_timestamp_format    || '''' );
    dbms_session.set_nls( 'NLS_TIMESTAMP_TZ_FORMAT', '''' || l_nls_timestamp_tz_format || '''' );

    if dbms_lob.istemporary( l_clob ) = 1 then
        dbms_lob.freetemporary( l_clob );
    end if;
end load_xml_to_existing_table;
With this PL/SQL procedure, we can create an APEX application as follows:
declare
    l_blob blob;
begin
    select blob_content
      into l_blob
      from apex_application_temp_files
     where name = :PX_XML_FILE;

    load_xml_to_existing_table(
        p_table_name  => :PX_TABLE,
        p_xml_content => l_blob );
end;
And you are done. The page should look as follows ...
The goal for this blog posting is to load CSV data into an existing table as follows:
Name       Null?      Type
---------- ---------- -----------------------
EMPNO      NOT NULL   NUMBER(4)
ENAME                 VARCHAR2(10)
JOB                   VARCHAR2(9)
MGR                   NUMBER(4)
HIREDATE              DATE
SAL                   NUMBER(7,2)
COMM                  NUMBER(7,2)
DEPTNO                NUMBER(2)
We'll use the following CSV data:
"EMPNO","ENAME","JOB","MGR","HIREDATE","SAL","COMM","DEPTNO"
7369,"SMITH","CLERK",7902,17.12.1980 00:00:00,800,,20
7499,"ALLEN","SALESMAN",7698,20.02.1981 00:00:00,1600,300,30
7521,"WARD","SALESMAN",7698,22.02.1981 00:00:00,1250,500,30
7566,"JONES","MANAGER",7839,02.04.1981 00:00:00,2975,,20
7654,"MARTIN","SALESMAN",7698,28.09.1981 00:00:00,1250,1400,30
7698,"BLAKE","MANAGER",7839,01.05.1981 00:00:00,2850,,30
7782,"CLARK","MANAGER",7839,09.06.1981 00:00:00,2450,,10
7788,"SCOTT","ANALYST",7566,09.12.1982 00:00:00,3000,,20
7839,"KING","PRESIDENT",,17.11.1981 00:00:00,5000,,10
7844,"TURNER","SALESMAN",7698,08.09.1981 00:00:00,1500,0,30
7876,"ADAMS","CLERK",7788,12.01.1983 00:00:00,1100,,20
7900,"JAMES","CLERK",7698,03.12.1981 00:00:00,950,,30
7902,"FORD","ANALYST",7566,03.12.1981 00:00:00,3000,,20
7934,"MILLER","CLERK",7782,23.01.1982 00:00:00,1300,,10
SQL Workshop is targeted at developers, not end users, so developers can always execute additional SQL statements to process uploaded data. First, upload the CSV data to a new table as follows:
After uploading, you will see a message indicating how many rows were loaded successfully - and how many failed.
Now, we have a table EMP_NEW, containing the uploaded data ...
Note that the HIREDATE column has been loaded as VARCHAR2, not as DATE, since this particular date format was not detected by Application Express. Now, head over to SQL Commands and copy the data over using the following SQL statement:
insert into emp ( empno, ename, job, mgr, hiredate, sal, comm, deptno )
( select empno,
         ename,
         job,
         mgr,
         to_date( hiredate, 'DD.MM.YYYY HH24:MI:SS' ),
         sal,
         comm,
         deptno
    from emp_new )
Of course, one can also craft a stored procedure to do this ...
create or replace procedure copy_to_existing_table_emp( p_stage_table in varchar2 )
is
begin
    -- copy data to the existing table
    execute immediate
        'insert into emp ( empno, ename, job, mgr, hiredate, sal, comm, deptno ) '
     || '( select empno, ename, job, mgr, '
     || '         to_date( hiredate, ''DD.MM.YYYY HH24:MI:SS'' ), '
     || '         sal, comm, deptno '
     || '    from ' || sys.dbms_assert.enquote_name( p_stage_table ) || ' )';

    -- drop the stage table
    execute immediate 'drop table ' || sys.dbms_assert.enquote_name( p_stage_table );
end copy_to_existing_table_emp;
Now, we just have to invoke the procedure named copy_to_existing_table_emp. This has the great advantage that we can do all sorts of additional data processing while inserting into the target table.
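Invoking the procedure is then a one-liner. A minimal sketch, assuming the Data Loading wizard created a stage table named EMP_NEW (the actual name depends on what you entered during the upload):

```sql
begin
    -- EMP_NEW is the stage table created by the Data Loading wizard;
    -- adjust the name to whatever table your upload produced
    copy_to_existing_table_emp( p_stage_table => 'EMP_NEW' );
end;
```

Since DBMS_ASSERT.ENQUOTE_NAME is used inside the procedure, pass the table name in the exact (typically upper) case in which it exists in the data dictionary.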
Data Loading in APEX 19.1 contains more than the new wizard in SQL Workshop - the new APEX_DATA_PARSER PL/SQL package provides a nice and easy developer interface to parse files.
The APEX_DATA_PARSER package supports all file types which are supported by SQL Workshop data loading: delimited files (CSV, tab-delimited), JSON, XML and XLSX. The parser is implemented as a table function, so the developer accesses parser results like a table. The parser can thus be used like a table: with an INSERT ... SELECT statement, rows can be directly inserted into a target table; existing tables can be joined in order to look up values; and SQL and PL/SQL functions can be applied in order to process and refine the raw parsed data.
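To illustrate the "parser output is just a table" idea, here is a hedged sketch of a lookup join: parsed CSV rows (department number in COL008, as in the EMP data above) are joined to a DEPT table to resolve department names on the fly. DEPT, its DEPTNO and DNAME columns, and the :PX_FILE item are assumptions for this example, not part of the package API:

```sql
-- sketch: join parsed CSV rows to an assumed DEPT lookup table while parsing
select p.col002 as ename,
       d.dname  as department
  from apex_application_temp_files f,
       table( apex_data_parser.parse(
                  p_content   => f.blob_content,
                  p_file_name => f.filename,
                  p_skip_rows => 1 ) ) p,
       dept d
 where f.name   = :PX_FILE
   and d.deptno = to_number( p.col008 )
```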
So we can create an APEX application which ...
You can find an example for this in the Sample Data Loading packaged application.
Once the application is installed, navigate to Parse Files (PL/SQL). This page shows how to upload a file and to display the file contents as a preview page.
The SQL query to display this preview is the following:
select line_number, col001, col002, col003, col004, col005,
       col006, col007, col008, col009, col010
       -- more columns (col011 to col300) can be selected here
  from apex_application_temp_files f,
       table( apex_data_parser.parse(
                  p_content         => f.blob_content,
                  p_add_headers_row => 'Y',
                  p_max_rows        => 500,
                  p_file_name       => f.filename ) ) p
 where f.name = :PX_FILE
More details on how to use APEX_DATA_PARSER.PARSE can be found in the blog posting Super-Easy CSV, XLSX, JSON or XML parsing: About the APEX_DATA_PARSER package.
The Load Data (PL/SQL) page contains a PL/SQL page process which inserts the results of the APEX_DATA_PARSER package into the target table. The PL/SQL code is rather simple - it's just the above query, wrapped with an INSERT INTO statement.
insert into emp ( empno, ename, job, mgr, hiredate, sal, comm, deptno )
( select to_number( p.col001 ),
         upper( p.col002 ),
         upper( p.col003 ),
         to_number( p.col004 ),
         to_date( p.col005, 'DD.MM.YYYY HH24:MI:SS' ),
         to_number( p.col006 ),
         to_number( p.col007 ),
         to_number( p.col008 )
    from apex_application_temp_files f,
         table( apex_data_parser.parse(
                    p_content   => f.blob_content,
                    p_skip_rows => 1,
                    p_file_type => apex_data_parser.c_file_type_csv ) ) p
   where f.name = :PX_FILE )
This makes it possible to implement "one-click" data loading flows.
This approach makes it possible to build applications that leverage the capabilities of the new Data Loading functionality. Such applications can provide a super-easy user interface without cumbersome column mapping dialogs and can also load XLSX, JSON or XML files.
The new Data Loading functionality in SQL Workshop introduces many new features - not only uploading XLSX or JSON files, but also loading in the background and auto-detection of data types. However, in 19.1 it can only load into new tables; to load into existing tables, APEX still provides the Data Loading wizard from previous versions. But developers have many options to achieve that functionality themselves. Using some custom PL/SQL code together with the APEX_DATA_PARSER PL/SQL package is a very interesting alternative, since it supports all the file types and all the flexibility of the new Data Loading functionality. Developers can delegate all file parsing to this package and concentrate on the actual loading and processing part. Implementing super-easy data loading flows with just a few mouse clicks is then only a small step.