Wednesday Jan 22, 2014

New CAM XML Editor v3.1 and CAMV validation release now available

The focus for this release is enhancements, bug fixes and performance improvements for both the CAM editor and CAMV validation engine.

The new CAM Editor V3.1 provides the following improved functionality:

  • Enhanced XSD schema importing especially for HL7, GML, OAGi and NIEM complexities
  • New UTF-8 handling to improve international support in elements and annotations
  • Editor entry of annotations improved and enhanced
  • Better XML example generation (choice items, negative numbers and repeat limits)
  • Improved xsd:annotations handling during import processing (faster + suppress duplicates)
  • Improved bi-directional data processing using Open-XDX for open data query and update
  • Feature and bug fixes for the CAMV rules engine
  • CAMV now allows mixed content for validation

Available from http://www.cameditor.org

Monday Nov 04, 2013

Creating, using and managing XML component dictionaries quick tutorials

XML Component Dictionary capabilities are provided in conjunction with the CAM Editor toolset.  These dictionaries accelerate the development of consistent XML information exchanges using standard sets of dictionary components.

The quick tutorials show the 'how to' of the basic capabilities to jump-start the use of XML dictionaries with the CAM Editor.

The collection of dictionary tutorial videos runs for a total of approximately 20 minutes.  Each video can also be reviewed individually.

Learn how to use the dictionary functions to create dictionaries by harvesting data model components from existing XSD schema, SQL database table schema, or simple Excel / Open Office spreadsheets with tables of components listed.

Also included are tips and functions relating to the use of NIEM exchange development, IEPD and EIEM techniques.

These videos should be viewed in conjunction with reviewing the overall concepts and techniques described in the companion video on the CAM Editor and Dictionaries overview.  The approach is aligned with OASIS and Core Components Technical Specification (CCTS) standards specifications for XML components and dictionaries.

Dictionary collections can be stored locally on the file system or local network, collaboratively on the web or in a cloud deployment, or shared and managed securely using the Oracle Enterprise Repository (OER) tool.

Also included are techniques relating to the use of the NIEM approach for developing XML exchange schema and IEPD packages.  This includes generating reuse scores, wantlist, and cross reference spreadsheets.

Included in the latest release of the CAM Editor is the ability to use the analyze dictionary tool to detect duplicate components, conflicting component definitions, missing component descriptions and so on.  This ensures high quality dictionary component specifications.  Using the CAM Editor you can also create MindMap models and UML physical models of your dictionary component sets.

For a complete guide to using the CAM Editor see the main YouTube video tutorials website and the CAM Editor website.


Monday Oct 21, 2013

Oracle BPM and Open Data integration development

Rapidly developing Oracle BPM application solutions with data source integration previously required significant Java and JDeveloper skills. Using open source tools for open data development now significantly reduces the coding needed.  Key tasks can be performed with visual drag-and-drop design combined with menu-driven entry and automatic form generation directly from XSD schema definitions.

The architecture used is extremely lightweight, portable, open and scalable, allowing integration with a variety of Oracle and non-Oracle data sources and systems.

Two videos available on YouTube walk through the process, first at an introductory conceptual level and then as a deep dive into the programming needed using JDeveloper, Oracle BPM Composer and Oracle WLS (WebLogic Server), along with the CAM Editor and Open-XDX open source tools.

Also available are coding samples and resources from the GitHub project page, along with working online demonstration resources on the VerifyXML site.

Combining Oracle BPM with these open source tools provides a comprehensive, simple and elegant solution set. Development times are slashed and rapid prototyping is enabled. Existing data sources can also be integrated using open data formats, with either XML or JSON, along with CRUD access via the Open-XDX Java component. The Open-XDX tool is a code-free approach where data mapping is configured as templates using visual drag and drop in the CAM Editor open source tool.  XML or JSON is then automatically generated or processed (output or input) and the appropriate SQL statements are created to support the data access.

Also included is the ability to integrate with fillable PDF forms via the XML templates and the Java PDF form filling library.  Again, minimal Java coding is needed to associate the XML source content with the PDF named fields.
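The post does not name a specific PDF library, so as an illustration only, here is a minimal sketch of the idea using Apache PDFBox; the file path and field names are hypothetical, and the values would in practice be pulled from the XML source content.

    import java.io.File;
    import org.apache.pdfbox.pdmodel.PDDocument;
    import org.apache.pdfbox.pdmodel.interactive.form.PDAcroForm;

    public class FillPdfForm {
        public static void main(String[] args) throws Exception {
            // Load the fillable PDF template (file name is illustrative only)
            try (PDDocument doc = PDDocument.load(new File("orderForm.pdf"))) {
                // Assumes the PDF actually contains an AcroForm with named fields
                PDAcroForm form = doc.getDocumentCatalog().getAcroForm();
                // Copy values from the XML source onto the named PDF fields;
                // hard-coded here for brevity, and the field names are hypothetical
                form.getField("customerName").setValue("Jane Doe");
                form.getField("orderTotal").setValue("125.00");
                doc.save(new File("orderForm-filled.pdf"));
            }
        }
    }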

The Oracle BPM forms can be automatically generated from XSD schema definitions that are built from the data mapping templates.  This dramatically simplifies development work as all the integration artifacts needed are created by the open source editor toolset.

The developer-level video is designed as a tutorial with segments, hands-on demonstrations and reviews.  This allows developers to learn the techniques and approaches used in incremental steps. The intended audience ranges from data analysts to developers and assumes only entry-level Java skills and knowledge.  Most actions are menu driven, while Java coding is limited to simply configuring values and parameters along with performing builds and deployments from JDeveloper and Oracle WLS.

Existing Oracle online training resources for Oracle BPM and WLS can be referenced to cover other standard delivery aspects such as user management and application deployment.

Monday Aug 05, 2013

New CAM v3.0 ships with JSON support and significant performance enhancements

Today we released the new and significantly improved CAM editor toolset along with 3 new companion 'How to' quick videos (see here to view).

The main focus is integrating JSON handling alongside the existing XML capabilities to provide developers with the ability to use either or both from a single set of infrastructure.

This provides JSON developers with the ability to quickly build visual data models, use robust XML content validation services and generate XSD schema and JAXB bindings without having to do all those tasks by hand or know the nuances of complex XSD schema or XML handling.
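As a minimal sketch of what generated JAXB bindings buy you, assume a hypothetical StudentDetails class produced by running the JAXB xjc compiler against the generated XSD schema (the class, file and accessor names are illustrative):

    import java.io.File;
    import javax.xml.bind.JAXBContext;
    import javax.xml.bind.Unmarshaller;

    public class LoadExchange {
        public static void main(String[] args) throws Exception {
            // StudentDetails is a hypothetical class generated by xjc
            // from the XSD schema that the CAM Editor produced
            JAXBContext ctx = JAXBContext.newInstance(StudentDetails.class);
            Unmarshaller u = ctx.createUnmarshaller();
            StudentDetails details = (StudentDetails) u.unmarshal(new File("StudentDetails.xml"));
            // Typed accessors instead of hand-written DOM walking (accessor name hypothetical)
            System.out.println(details.getStudentName());
        }
    }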

For XML developers it provides a rapid ability to use JSON as an option in their information exchanges and web service integration for supporting mobile and web-based application needs.

In addition to these new JSON capabilities, the existing functionality has significantly improved performance and capability.  The CAMV validation engine now runs up to 20 times faster for large XML validation inputs and for templates containing setChoice() rules.  For comparison, a 500+ rule validation template run against a large 15MB sample COBie CAD/CAM smart building XML export now completes in 19 seconds instead of over 9 minutes.

The drag-and-drop handling of dictionary components has similarly been significantly improved.  Large sets of components are now inserted in real time with low memory overhead, dramatically improving the user experience and the ability to quickly build information exchanges from XML dictionaries of predefined domain components.  The video shows the Education domain being used to rapidly build a StudentDetails report with grades, achievements and student data.

For the Open-XDX open data API toolset we have added bi-directional support.  This means that using the same CAM template and the SQL drag-and-drop interface you can design Update/Insert SQL database web services along with the query services.  Again the focus is on providing simple and rapid application development support.  Example code and resources can be found at our GitHub site, while online demonstrations are available from the VerifyXML.org site.

Further enhancements include a new Dictionary Evaluation report.  This tool analyzes the XML components in a dictionary and highlights design issues, omissions, duplicates and more that would be extremely tedious to detect by hand.  This allows a development team to collaboratively improve the quality of their core components and their reuse across a project implementation.

Last but not least, we have improved the XSD schema importing and exporting, resolving a range of complexity nuances not previously handled and allowing improved accuracy and compatibility with XSD schema.

In summary the new release provides:

o All new JSON capabilities and template type
o Bi-directional data processing using Open-XDX for open data query and update
o Dramatically improved Dictionary components drag and drop
o New report for Dictionary evaluation and analysis
o Significant CAMV rules engine performance improvements
o Better XSD schema importing and exporting

We look forward to seeing the enhanced solutions this helps people deliver to their customers.



Friday May 10, 2013

White House announces Open Data policy - dawn of a new age of information sharing

The White House today released an Executive Order -- Making Open and Machine Readable the New Default for Government Information.

In addition there is now a new open source tools project and resources on GitHub in support of this initiative.

The potential here is significant: to change how a whole range of services are delivered to citizens, and for new services and commercial opportunities to emerge that utilize these data services.

To see the types of potential here, see the sample Open Data API showcase work on the related VerifyXML.org site.


Friday Apr 26, 2013

Analysis of JSON use cases compared to XML

Background

Before there was either XML or JSON there was EDI. JSON is very reminiscent of EDI, both syntactically and conceptually, and so are the claims made back then as to why EDI would be sustained over XML. EDI was lightweight, human readable, fast to process, compact, worked well with existing systems exchanges and interfaces, and had a dedicated following of advocates. But EDI has significant flaws: it is brittle, difficult to extend, and has weak data typing, content validation and rendering support. Also, semantics in EDI are very limited and rely on externally referenced specifications combined with local human knowledge that is notoriously difficult to align across implementations. Code list value sets and cross-field content validation rules were especially problematic for EDI.

Moving past these limitations, standards-setting organizations have adopted XML technologies as the primary toolset for defining information exchange specifications. Furthermore, there is an extensive family of XML technologies that supports the complete ecosystem of semantics, and particularly the need for interoperability, security and common meaning and rules. The diagram here illustrates that.

Figure 1 – Information Exchange Conceptual Components

Referencing this diagram, JSON is restricted to the Structure and Content capabilities. XML on the other hand provides the ability to handle rich exchanges where all the aspects shown can be managed. In today's challenging commercial and government information sharing world you must have the complete set of robust capabilities available.

The JSON primary use case

JSON is designed for web client interfaces to web services on the internet. Essentially it is serialized JavaScript objects, which makes it a strong fit for the native client-side scripting that all the major web browsers provide.

While XML does not fit as well in that scenario, there are many equivalent solutions that use XML with different interfacing in the browser, such as Adobe Flash, Microsoft InfoPath, Oracle ADF, or open source solutions such as NetBeans forms. One advantage of these is the “write once” approach, with deployment anywhere to tablet, smartphone, or web browser.

XML and JSON Performance Analysis

The presumption that “fat” XML is slow and resource-demanding compared to JSON’s lightweight payload does not hold up to a test. An experiment with 33 different documents and almost 1200 tests on the most commonly used browsers and operating systems found the performance for the total user experience (transfer, parsing and querying of a document) to be nearly identical for both XML and JSON formats.

Clearly this shows that you should perform experiments, test your own data and code with your own users and devices to determine real results. What "seems obvious" is not always true.
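As a starting point for such an experiment, here is a rough server-side Java harness (unlike the in-browser study cited above) that times parsing of equivalent payloads in both formats; the file names are illustrative and the org.json library is assumed to be on the classpath.

    import java.io.StringReader;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.json.JSONObject;
    import org.xml.sax.InputSource;

    public class ParseTiming {
        public static void main(String[] args) throws Exception {
            // Equivalent payloads in both formats (file names are illustrative)
            String xml = new String(Files.readAllBytes(Paths.get("sample.xml")), "UTF-8");
            String json = new String(Files.readAllBytes(Paths.get("sample.json")), "UTF-8");

            // Crude timing only: a real benchmark should also account for JVM warm-up,
            // plus transfer and query time as the cited experiment does
            long t0 = System.nanoTime();
            for (int i = 0; i < 1000; i++) {
                DocumentBuilderFactory.newInstance().newDocumentBuilder()
                        .parse(new InputSource(new StringReader(xml)));
            }
            long xmlMs = (System.nanoTime() - t0) / 1_000_000;

            long t1 = System.nanoTime();
            for (int i = 0; i < 1000; i++) {
                new JSONObject(json);
            }
            long jsonMs = (System.nanoTime() - t1) / 1_000_000;

            System.out.println("XML parse x1000:  " + xmlMs + " ms");
            System.out.println("JSON parse x1000: " + jsonMs + " ms");
        }
    }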

A selection of useful links to people's opinions and thoughts

We present here a selection of “what does the internet think?” resources to show the context of JSON use and to give insights into processing and handling content in a web browser delivery context.

Broad Discussions

http://blog.technologyofcontent.com/2010/01/json-vs-xml/

http://broadcast.oreilly.com/2011/06/the-good-the-bad-the-ugly-of-rest-apis.html

http://digitalbazaar.com/2010/11/22/json-vs-xml/

Landscape and Performance Comparisons

http://java.dzone.com/articles/streaming-apis-json-vs-xml-and

"We are conducting an experiment in XML and JSON performance in browsers and would appreciate anyone with a couple minutes to spare to visit this website and push one button.
http://speedtest.xmlsh.org (the results will be analysed and published - at this coming Balisage 2013)

Pro-XML

http://metajack.im/2010/02/01/json-versus-xml-not-as-simple-as-you-think/

http://www.ajaxonomy.com/2008/xml/why-xml-is-far-superior-to-json

http://stackoverflow.com/questions/3536893/what-are-the-pros-and-cons-of-xml-and-json

Pro-JSON

http://www.advertserve.com/blog/2012/01/api-json-vs-xml/

http://myarch.com/json-pros-and-cons

http://www.scriptol.com/ajax/json-xml.php

http://www.slideshare.net/AnandRaj5/json-13725923

http://bitworking.org/news/JSON_isnt_XML

http://www.json.org/xml.html

http://blog.appfog.com/why-json-will-continue-to-push-xml-out-of-the-picture/

JSON and Security

http://stackoverflow.com/questions/395592/json-security-best-practices

Summary and Conclusions

The number one thing to notice here is that you are reading this document, and it is being delivered and rendered to your computer screen using XML, RSS and XHTML, not JSON.

Back in the day when XML was brand new, Bill Gates held a press conference to announce that Microsoft would be adopting XML wholesale for use across its products and the Windows operating system. Today XML is ubiquitous and extensible (that is in its name). There is now a huge number of XML-based standards in a family of solutions that support all aspects of the needs of information exchange. In today's challenging world you cannot just discount those as unnecessary.

When you look at information exchanges, the diagram provided above shows the complete ecosystem of components that you need for effective, consistent, trusted, predictable, reusable and extensible information flows. We can also see that JSON is missing key delivery control and semantic pieces, and thus JSON has a very limited mission profile. Within that profile, when fit to purpose, it can be effective, but as a general solution it does not meet all the extended requirements.

Clearly JSON has its niche following and will continue to serve its primary use case of web-based point-to-point client-server information exchanges. That is not necessarily a bad thing. Having lightweight alternative solutions is perfectly acceptable for a lot of content delivery circumstances.

People should not confuse business operational convenience with overall applicability - e.g. Twitter and FourSquare dropping XML and relying solely on JSON. Both of these services use simplistic formats entirely under their sole control that are unlikely to change in the future. There are also competitive reasons: with its limited semantics, JSON can actually make it harder for competing sites to harvest, analyze, reuse and republish their content.

As a technology XML continues to improve and its use is being better optimized and refined, with tooling support that is narrowing the gap in areas where JSON claims to have the technical edge today. Specifically we can point to Oracle's work on Open Data APIs using Open-XDX, which supports both XML and JSON outputs, and the accompanying CAM templates approach with NIEM, which enables content providers to rapidly build working web services and user form interfaces from SQL data stores.

In short, we can expect both XML and JSON to continue to fulfill information delivery needs going forward, but the differentiations are likely to blur. Neither is going to displace the other in its core areas of use. Providing the capability to use and support both is not a significant burden and accommodates personal preferences and local project nuances.

To get a sense of all this, as a brief real-time interactive example you can try these two live demonstration service points.

This one is using XML when you click here. And this one is doing the same thing (it's actually the same Open-XDX service component) but returns JSON instead when you click here.

And if you visit http://www.verifyxml.org you can find more Open-XDX examples and details.

Addendum

Table produced by JSON advocates to support JSON adoption

Each entry below gives the capability, the XML claim, the JSON claim, and a comment.

Simplicity
XML: XML is simpler than SGML.
JSON: JSON is much simpler than XML. JSON has a much smaller grammar and maps more directly onto the data structures.
Comment: Simplicity is deceptive. Syntactically, XML can easily be used as simply as JSON, but that simplicity comes at the price of ignoring many common, more robust information sharing needs in an extended network rather than just point-to-point. The mapping referenced here is for objects within a JavaScript environment only; outside of that context this is not so, and all major programming environments have robust XML support.

Extensibility
XML: XML is extensible because it is a mark-up language.
JSON: JSON is not extensible because it does not need to be. JSON is not a document markup language, so it is not necessary to define new tags or attributes to represent data in it.
Comment: This is a naïve view. Things change constantly with new information sharing needs, particularly as more participants are added to exchanges and standards evolve. Only in limited cases such as Twitter can we see set formats.

Interoperability
XML: XML is an interoperability standard.
JSON: JSON has the same interoperability potential as XML.
Comment: JSON clearly has significant limitations and gaps with regard to information semantics and reuse.

Openness
XML: XML is an open standard.
JSON: JSON is at least as open as XML, perhaps more so because it is not in the center of corporate/political standardization struggles.
Comment: This is a highly subjective statement. XML has proven to be universally adopted and implemented, not just in software but in firmware devices and communications systems. Note that the JSON work is no more immune from manipulation than anything else, as happened with JavaScript itself.

Human Readable
XML: XML is human readable.
JSON: JSON is much easier for humans to read than XML. It is easier to write, too. It is also easier for machines to read and write.
Comment: Again, this is an entirely subjective statement. Markup is markup; there is no “easier” here, and machines have no notion of “easier”. The notion of “easier to read”, and presumably to comprehend the meaning of, is notoriously hard to define.

Exchange Formats
XML: XML can be used as an exchange format to enable users to move their data between similar applications.
JSON: The same is true for JSON.
Comment: Agreed. However, XML also has security and other capabilities that are absent from JSON.

Structure
XML: XML provides a structure to data so that it is richer in information.
JSON: The same is true for JSON.
Comment: However, XML can provide deeper structuring than JSON supports. It can also handle more extended content types.

Processed
XML: XML is easily processed because the structure of the data is simple and standard.
JSON: JSON is processed more easily because its structure is simpler.
Comment: Again, this is entirely subjective. See the link provided in the links section on machine timing tests.

Code Re-invention
XML: There is a wide range of reusable software available to programmers to handle XML so they don't have to re-invent code.
JSON: JSON, being a simpler notation, needs much less specialized software.
Comment: JSON is mainly available in JavaScript and not in a wide range of programming environments. Further, it is not the simplicity of the syntax that matters but the drastically reduced capabilities; hence JSON only provides very limited functionality.

XML: XML separates the presentation of data from the structure of that data. XML requires translating the structure of the data into a document structure.
JSON: JSON structures are based on arrays and records.
Comment: This is only true in the context of the data within a web browser's memory, whereas XML is the native format that underpins the spreadsheets, databases and array stores that JSON content must ultimately be persisted to and from!

A common exchange format
XML: XML is a better document exchange format. Use the right tool for the right job.
JSON: JSON is a better data exchange format.
Comment: Again, this is entirely subjective and no metrics are given. What defines “better”? Clearly JSON is significantly less capable and restricted in its use cases. Therefore “your mileage may vary in actual use” would be an appropriate caution when trying to measure what is “better” where and how.

Data Views
XML: XML displays many views of one data.
JSON: JSON does not provide any display capabilities because it is not a document markup language.
Comment: XML has broader applicability; therefore you can write once, use everywhere, while JSON can expect to be changed into XML for such extended uses.

Self-Describing Data
XML: This is a key XML design objective.
JSON: XML and JSON have this in common.
Comment: However, XML has richer semantics available than JSON.

Complete integration of all traditional databases and formats
XML: (Statements about XML are sometimes given to a bit of hyperbole.) XML documents can contain any imaginable data type - from classical data like text and numbers, or multimedia objects such as sounds, to active formats like Java applets or ActiveX components.
JSON: JSON does not have a <![CDATA[ ]]> feature, so it is not well suited to act as a carrier of sounds or images or other large binary payloads. JSON is optimized for data.
Comment: Visual content is data! Ask the FBI analyzing the recent Boston attacks. One could also say that JSON is limited to only simple basic data content and lacks extended validation such as code values, date and number formatting.

Internationalization
XML: XML and JSON both use Unicode.
JSON: XML and JSON both use Unicode.
Comment: However, JSON has limitations in its use of encoding and exchanges.

Open and extensible
XML: XML’s one-of-a-kind open structure allows you to add other state-of-the-art elements when needed. You can always adapt your system to embrace industry-specific vocabulary.
JSON: Those vocabularies can be automatically converted to JSON, making migration from XML to JSON very straightforward.
Comment: Exactly: if you have XML it is trivial to generate JSON. The reverse is not the case, however.

Readability
XML: XML is easily readable by both humans and machines.
JSON: JSON is easier to read for both humans and machines.
Comment: This is an entirely subjective statement. The better answer is that well written XML and JSON are equivalent for human and machine handling.

Object-Oriented
XML: XML is document-oriented.
JSON: JSON is data-oriented. JSON can be mapped more easily to object-oriented systems.
Comment: The reverse is an issue, however: objects do not necessarily map easily to documents. Also, not all content is objects, which actually constrains the use model. XML, on the other hand, is well equipped for use as object-oriented content as well as documents.

Adoption
XML: XML is being widely adopted by the computer industry.
JSON: JSON is just beginning to become known. Its simplicity and the ease of converting XML to JSON make JSON ultimately more adoptable.
Comment: The use of JSON is limited to web client-server scenarios. Within that domain it is popular; outside of that domain XML completely dominates.




Wednesday Mar 27, 2013

New CAM Editor v2.4 release with enhanced Collaboration tools

The focus for this release is improved collaboration support, including better dictionary generation, models, reports and spreadsheets, and enhancements to the rules entry tools and rules processing. New for this release is support for Italian language localization.

The new XPath conditional rule entry wizard makes XPath rules definition significantly easier for cross-field validations and more. We have also improved the rule handling in the CAMV engine to be more consistent.

For collaboration, dictionary collections can now be located at a URL, on a file system, or stored in the Oracle Enterprise Repository (OER). Coupled with this are the consistent dictionary collection and database connection manager tools for configuration management. There is also better generation of dictionaries from spreadsheets and a new spreadsheet-to-dictionary XSLT utility. Dictionary XML component generation has also been improved, adding a new Components section to itemize the components in dictionaries along with more consistent handling of dictionary content types, rules and annotations.

The template evaluation report and NIEM NDR (Naming and Design Rule) checking is improved including better representation terms.

The XSD schema importing and exporting now supports the use of Appinfo tags for application specific detailing of exchange data relationships.

For models we have enhanced the Mindmaps to include color coding of Added and Updated annotations plus SQL DBmappings and choice items.

For reports we have added a new Export to XML option for the popular Tabular Report view. This exported XML is compatible with importing into an Excel spreadsheet or can be custom rendered using a stylesheet or XSLT transformation.
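For the custom rendering route, a minimal JAXP sketch that applies your own stylesheet to the exported report XML might look like this (the file names are illustrative only):

    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;

    public class RenderReport {
        public static void main(String[] args) throws Exception {
            // File names are illustrative: the XML exported from the Tabular Report
            // view plus your own XSLT stylesheet for the custom rendering
            Transformer t = TransformerFactory.newInstance()
                    .newTransformer(new StreamSource("tabular-report.xsl"));
            t.transform(new StreamSource("tabular-report.xml"),
                        new StreamResult("tabular-report.html"));
        }
    }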

Several enhancements have been made to the CAMV validation engine along with XSD schema generating and annotations handling. For Open-XDX SQL data integration we now have a nifty utility that can generate MySQL database tables from CSV text file data exports.

In summary the new CAM Editor V2.4 provides the following improved functionality:

  • All new XPath rules entry Wizard tool

  • Significantly enhanced Dictionary generation

  • Collaboration support including Oracle Enterprise Repository (OER) and URL locations

  • Better dictionary collection and SQL database connections management

  • Enhanced Mindmap model generating

  • XML export format for Tabular Report View

  • Italian language localization

  • CAMV rules engine improvements

  • New spreadsheet handling utilities

  • More consistent NIEM NDR evaluation

    To download the latest software please see the CAMeditor.org download site.

    Saturday Jan 12, 2013

    The non-UTF-8 encoding character invalid byte sequences error

    An ongoing issue for XML transactions processing is UTF-8 character conformance. In an ideal world your computer should simply process your information content stream, store it and move on.  XML engineers however have other ideas.

    Content created in Microsoft Excel or Word or in a Web page application on a Windows desktop uses the Windows-1252 character set by default; however, this content often ends up in XML document instances labelled as UTF-8 encoding.

    A conforming XML parser such as Xerces will then kick out invalid byte sequence errors when attempting to process the content.  It turns out the really simple answer is to change the encoding declaration in the XML prolog to say "Windows-1252", e.g.


    <?xml version="1.0" encoding="Windows-1252" standalone="yes"?>
    

    and then retry. Of course, if you know you are using a different character encoding, substitute that for the Windows-1252 value here instead.

    Now for automated batch processes you will need a simple piece of XSLT to switch or add the correct encoding.

    You can find out more tips and tricks on all this - plus links to XSLT tools to help with this from the CAM Editor wiki page.
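    If you prefer a small Java utility over the XSLT route for the batch fix, a minimal sketch is shown below. It actually re-encodes the bytes to genuine UTF-8 rather than relabelling them, and it assumes the content really is Windows-1252 and that the prolog uses a double-quoted encoding attribute.

        import java.nio.charset.Charset;
        import java.nio.charset.StandardCharsets;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.nio.file.Paths;

        public class ReencodeToUtf8 {
            public static void main(String[] args) throws Exception {
                Path in = Paths.get(args[0]);   // e.g. file.xml mislabelled as UTF-8
                // Decode using the character set the content was actually written in
                String text = new String(Files.readAllBytes(in), Charset.forName("windows-1252"));
                // Correct the prolog label to match the bytes we are about to write
                text = text.replaceFirst("encoding=\"[^\"]*\"", "encoding=\"UTF-8\"");
                Files.write(Paths.get(args[0] + ".utf8.xml"), text.getBytes(StandardCharsets.UTF_8));
            }
        }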

    Another issue is simply locating the offending characters inside an XML instance - for that you can use this handy command line grep statement:

    grep --color='auto' -P -n "[\x80-\xFF]" file.xml

    All this then allows you to diagnose potential character set conflicts and hopefully then build smoothly functioning XML interfaces.  For XML content validation you can of course use the CAMV validation engine - and you can find out more on that from this YouTube resource site showing a video on the topic (various NIEM training aspects are also included).


    Monday Nov 05, 2012

    XML Rules Engine and Validation Tutorial with NIEM


    On the technical XML side the video introduces XPath validation rules and illustrates the concepts of XML content and structure validation. CAM validation templates allow contextual, parameter-driven, dynamic validation services to be implemented, compared to using a static and brittle XSD schema approach.

    The SQL table lookup and code list validation are discussed and examples presented.
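    CAMV itself is template-driven and code-free, but to illustrate the underlying idea of an XPath content rule such as a code list check, here is a plain JAXP sketch; the instance file, element name and code list values are hypothetical.

        import java.io.File;
        import javax.xml.parsers.DocumentBuilderFactory;
        import javax.xml.xpath.XPathConstants;
        import javax.xml.xpath.XPathFactory;
        import org.w3c.dom.Document;

        public class CodeListCheck {
            public static void main(String[] args) throws Exception {
                Document doc = DocumentBuilderFactory.newInstance()
                        .newDocumentBuilder().parse(new File("VehicleSearch.xml"));
                // Hypothetical rule: every StateCode value must come from a small code list
                String rule = "count(//StateCode[not(. = 'CA' or . = 'NY' or . = 'TX')]) = 0";
                boolean ok = (Boolean) XPathFactory.newInstance().newXPath()
                        .evaluate(rule, doc, XPathConstants.BOOLEAN);
                System.out.println(ok ? "PASS" : "FAIL: StateCode outside the code list");
            }
        }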

    Features are highlighted along with a demonstration of the interactive generation of actual live XML data from a SQL data store and then validation processing, complete with error and warning detection.

    The presentation provides a primer for developing web service XML validation and integration into a SOA approach along with examples and resources. Also alignment with the NIEM IEPD process for interoperable information exchanges is discussed along with NIEM rules services.

    The CAMV engine is a high performance, scalable Java component for rapidly implementing code-free validation services and methods. CAMV is a next generation WYSIWYG approach that builds on older Schematron-based interpretive runtime tools and provides a simpler declarative metaphor for rules definition.

    See: http://www.NIEMtrainingvideos.org

    Thursday Oct 25, 2012

    SQL to XML open data and NIEM training video posted

    Learn how to build a working XML query/response system with SQL database accessing and XML components from example NIEM schema and dictionary.

    Software development practitioners, business analysts and managers will find the materials accessible and valuable in showing the decision making processes that go into constructing a working XML exchange.

    The 22 minute video available online shows how to build a fully working ULEXS-SR exchange using a Vehicle license search example.  Also included are aspects of NIEM training for assembling an IEPD schema with data models.

    Materials are focused on practical implementers; after viewing the instruction material you can use the open source tools and apply them to your own SQL to XML use cases and information exchange projects.

    All the SQL and XML code, editor tools, dictionary and instructions that accompany the tutorial video are also available for download so you can try everything yourself. 

    See http://www.youtube.com/user/TheCameditor to run the video.

    And the open source project web site (sponsored by Oracle) contains all the resources, downloads and supplemental materials.

    Enjoy.

    Tuesday Oct 09, 2012

    SQL to XML open data made simple

    The perennial question is how to easily generate XML from SQL table content.  The latest CAM Editor release really tackles this head on by providing a powerful and simple toolset.

    Firstly you can visually browse your SQL tables and then drag and drop from columns and tables into the XML structure editor.   This gives you a code-free method of describing the transformation you require.  So you do not need to know about the vagaries of XML and XSD schema syntax.

    Second, you can map directly into existing industry domain XML exchange structures in the visual XML editor; again there is no need to wrestle with XSD schema, and you have WYSIWYG visual control over what your output will look like.

    If you do not have a target XML structure and need to build one from scratch, then the CAM Editor makes this simple.  Switch the SQL viewer into designer mode, then take your existing SQL table and drag and drop it into the XML structure editor.  The XML wizard tool will automatically take your SQL column names and definitions, create the equivalent XML for you, and insert the mappings.

    Simply save the structure template, and run the Open Data generator menu option, and your XML is built for you.

    Completely code-free template driven development.
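    For contrast, this is roughly the kind of hand-written plumbing that the template-driven approach replaces; the connection details, table and columns below are made up for illustration.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.Statement;

        public class HandRolledSqlToXml {
            public static void main(String[] args) throws Exception {
                // Connection details and table are illustrative only
                try (Connection con = DriverManager.getConnection(
                             "jdbc:mysql://localhost/demo", "user", "password");
                     Statement st = con.createStatement();
                     ResultSet rs = st.executeQuery("SELECT id, name FROM student")) {
                    StringBuilder xml = new StringBuilder("<Students>\n");
                    while (rs.next()) {
                        xml.append("  <Student id=\"").append(rs.getInt("id")).append("\">")
                           .append(escape(rs.getString("name")))
                           .append("</Student>\n");
                    }
                    xml.append("</Students>");
                    System.out.println(xml);
                }
            }

            private static String escape(String s) {
                return s.replace("&", "&amp;").replace("<", "&lt;").replace(">", "&gt;");
            }
        }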

    To see this in action, see our video demonstration links and then download the tools and samples and try it yourself.

    Sunday Oct 07, 2012

    New CAM Editor v2.3 with Open-XDX for Open Data APIs

    Creating actual working XML exchanges, loading data from data stores, generating XML, testing, integrating with web services and then delivering the deployment takes a lot of coding and effort. Then there is writing the documentation, models and schema, doing naming and design rule (NDR) checks, and packaging all this together (such as for NIEM IEPD use).

    What if there was a tool that helped you do all that easily and simply?

    Welcome to the new Open-XDX and the CAM Editor!

    Open-XDX uses code-free techniques in combination with CAM templates and visual drag and drop to rapidly design your XML exchange. Then Open-XDX will automatically generate all the SQL for you, read the database data, generate and populate the valid output XML, and filter with parameters. To complete the processing solution, Open-XDX works with web services and JDBC database connections as a callable module that can be deployed plug and play with your middleware stack, all with just a few lines of Java code (about 5 actually).

    You can build either Query/Response or Publish/Subscribe services from existing data stores to XML literally in minutes. To see a demonstration of using Open-XDX, a MySQL data store and integration with Oracle WebLogic Server, please see this short video - http://youtube.com/user/TheCameditor
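    To convey the general shape of such a Query/Response deployment in a self-contained way, here is a tiny JDK-only HTTP endpoint. This is not the Open-XDX API itself: the endpoint path and the stubbed XML payload are hypothetical, and in a real service the payload would come from the Open-XDX component driven by the CAM template and a JDBC connection.

        import java.io.OutputStream;
        import java.net.InetSocketAddress;
        import java.nio.charset.StandardCharsets;
        import com.sun.net.httpserver.HttpServer;

        public class QueryResponseEndpoint {
            public static void main(String[] args) throws Exception {
                HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
                server.createContext("/vehicles", exchange -> {
                    // Stubbed payload: a real service would return the generated XML
                    byte[] body = "<Vehicles><Vehicle plate=\"ABC123\"/></Vehicles>"
                            .getBytes(StandardCharsets.UTF_8);
                    exchange.getResponseHeaders().add("Content-Type", "application/xml");
                    exchange.sendResponseHeaders(200, body.length);
                    try (OutputStream os = exchange.getResponseBody()) {
                        os.write(body);
                    }
                });
                server.start();
                System.out.println("Query/Response endpoint at http://localhost:8080/vehicles");
            }
        }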

    There is also a Quick Guide available that provides more technical insights along with a sample pack download of templates and SQL that you can try for yourself.

     To view online demonstrations of using Open-XDX see the VerifyXML.org site and GitHub resources.

    Head on over to our project resource site to learn more, download the latest CAM Editor and see links to all the resources and materials.

    We look forward to seeing how the developer community is able to jump start information sharing initiatives using this new innovative approach.

    Tuesday Oct 02, 2012

    Dr Dobbs, Mindmaps and XML Design

    Good to see that someone else has picked up on this. Of course we have had this feature in the CAM Editor (http://www.cameditor.org) for over a year now - so we are happy to see the mainstream spotting how useful this is as well.

    Plus there is a nice summary of the advantages - see the Dr. Dobb's article here.

    Saturday Jul 14, 2012

    CAMV and TEAM Engine (Test, Evaluation, And Measurement) suite development

    The CAMV engine is ideal for rapidly developing Test, Evaluation, And Measurement (TEAM) test suites for XML information exchanges.  The concise CAM rules syntax leverages XPath v2.0 capabilities. This means that rules are declarative and act directly on the XML structure, which dramatically simplifies rule assertions.

    Coupled to this is the ability of the CAM Editor visual IDE to automate many of the tasks involved in test suite and Compliance and Interoperability Testing development and implementation.  This includes automatic harvesting of rules from existing XSD schema definitions, automatic generation of XML test instances with realistic test data hints, and generation of HTML documentation of the business rules. Rule entry wizards then simplify the task of rule entry and are accessible via a "right click" popup menu tool directly from the XML instance structure viewer in the CAM Editor IDE. Compliance test rules are implemented in XPath language syntax and can be run interactively in the CAM Editor IDE against XML test instances, with the results visually diagnosed.  Completed rules templates may then be deployed via ANT scripts as automated test suites for validation of arbitrary XML information exchange samples.  The ANT scripts support the use of drop folders, so any test cases dropped into the designated folder will be inspected and validated against the applicable rules templates.

    For more information on utilizing these techniques and tools, along with sample Test Suites please see the Sourceforge.net resources site for the CAM tools - and the CAMV Test Suites link.

    Friday Jun 29, 2012

    CAM XML Editor version 2.2.1 now available.

    CAM Editor v2.2.1 release is now available. Lots of nice enhancements, CAMV performance boost and important bug fixes for DoD, NIEM and LEXS schema.

    Download is available from the CAM XML Editor Resource Site


    The CAM editor is the leading open source XML Editor/Validation/Schema designer for rapidly building and deploying complete XML information exchanges. It provides a visual WYSIWYG structure with rule entry wizards and drag-and-drop dictionary components, and will import, analyze and refactor existing XML Schema.

    Oracle is a proud sponsor of the project and its use on the NIEM.gov initiative.

    Creates XSD schema + JAXB bindings, Mindmap or UML models (XMI), XML test suite examples, HTML documentation + spreadsheets (NIEM IEPDs). XSD schema export in default, flatten, NIEM, and OASIS modes. Generates canonical component dictionaries from schema sets, ERwin models, or spreadsheets.

