An Oracle blog about Mobile Cloud Service

Recent Posts

Yet More Oracle JET Articles (and Oracle Mobile Cloud Service too)

Some time back I blogged about New Oracle JET Articles, listing 5 articles for your reading pleasure, with a promise I'd provide an update when more articles became available. Ah, yeah, about that. I kind of forgot! To help you learn the Oracle JET basics we've subsequently expanded the article set to the following:
• Installing Oracle JET for JavaScript Web Development
• Installing Oracle JET for Hybrid-Mobile Application Development
• Understanding the Development Process With Oracle JET for Web and Mobile
• Working With RequireJS in Oracle JET
• Investigating KnockoutJS in Oracle JET
• Working with Oracle JET UI Web Components
• Responsive Web Design with Oracle JET
• Single Page Applications in Oracle JET
• Working with REST in Oracle JET
This completes the Oracle JET article series for now. Of course don't forget the original and finest videos for learning Oracle JET, in the Oracle JET MOOC now available on YouTube. Beyond this we have started to explore mashing up Oracle JET with other cloud technologies, starting with Oracle Mobile Cloud Service (MCS), which we hope will interest you too:
• How to Mash Up JET and MCS Mobile Apps
In addition we published the following videos exploring the mashup potential between JET and MCS:
• Configuring the MCS JavaScript & Cordova SDK on Oracle JET
• Configuring Oracle JET iOS and Android apps for MCS Notifications
• An Introduction to Sync Express with MCS and Oracle JET
Time permitting we'll continue to explore JET + MCS use cases in new articles and videos. And maybe next time I won't forget to advertise the articles when I publish them! Happy JETing!

A Q&A with Australia's Mobile Luminary Andrew Paterson

Looking for an insight into what makes an award-winning mobile solution amongst Oracle's customers? The following Q&A excerpt captures an online interview with Andrew Paterson from Rubicon Red, sharing his experiences in building a mobile solution for Australia's National Pharmacies that won the Enterprise Mobility Award at Oracle Open World 2014. The application was built using Oracle technology, a combination of Oracle's Mobile Application Framework (MAF) for the mobile front end, and Oracle's SOA Suite of products at the back end.
Hi Andrew, I had the pleasure of meeting you at Oracle Open World, but for our audience could you please introduce yourself, your role and the organisation you work for.
Andrew: Hi, my name is Andrew Paterson and I work for Rubicon Red – a company based out of Australia that provides products and services relating to Oracle Fusion Middleware. My role is Practice Manager, which involves a mixture of line and resource management, pre-sales and consulting work for clients. Mobility is my main focus area at the moment, as we are increasingly finding that it is something customers now see as being core to their business.
What was your background before working on mobile solutions?
Andrew: I have quite a varied background, having worked previously with Ada, Lotus Notes, Java and building Web and Portal applications. The past 7-8 years have been focused on SOA and BPM, through development and consulting. My interest has always been in delivering solutions that make the users' lives easier. Typically with SOA this is abstracting away the complexities of the underlying systems and then exposing services that can be easily consumed.
As a developer, what's changed since you took up the "mobile first" mantra?
Andrew: The key phrase would be "build for change" – mobility innovation and trends move a lot faster than traditional applications or the web. For example, in the timeframe of the current application build (the last 6 months) we have seen the Samsung S5 and Apple iPhone 6 devices launched, along with a new iOS and Android Lollipop being released this Friday. Coupled with these are changes to the UI styles and expected patterns of operations. The challenge is to keep up with these trends. This was one of the main reasons we picked Oracle's Mobile Application Framework. We don't have to worry about maintaining the underlying structures to keep apace with these changes, MAF takes care of this for us. Our focus is instead on what we can do differently within the app and whether we need to evolve the UI. The next noticeable change is that mobility has changed the way customers view SOA. Previously it could be a hard sell and often it was typically only used to replace existing integrations. The benefits of SOA as an approach were often lost and it was hard for them to see the value in building re-usable services and having a service catalogue. Now, it is impossible for a customer to build mobile applications without having a services layer. A well-structured and performant API is critical for being able to build applications that work well for consumers. A minor but important change is that the presentation layer infrastructure is now something we don't need to provide and maintain. The app runs on the user's device, saving both money and time. Finally, it is crucial to now see the application as being something you are in for the long term, with multiple and frequent development cycles to add or improve features.
The nature of the interaction with the user is more personal and direct, so whatever is built needs to evolve. Users expect this change – look at any app on your phone that hasn't updated in the past year or so and it probably looks dated and clunky.
Could you give us an outline of what Oracle Mobile Application Framework applications you've built or been responsible for?
Andrew: Prior to MAF we had worked with various customers to build ADF Mobile [Editor's note: ADF Mobile was the precursor to MAF] applications, mainly for sales order entry and purchase order approval (typically moving these functions out of Oracle ERPs). These came about through the use of SOA and BPM to build services, which were then easily consumable via a mobile front-end. With our latest application, we are using MAF and working on a consumer application for a membership-based pharmacy organisation. This app will deliver a new channel of interaction for their members, to view prescriptions, membership details and to interact with the company. Providing the app will also mean members no longer need to carry a plastic card to prove membership.
Can you outline the size of your teams, your apps, the timelines involved?
Andrew: The current project consists of a solution architect/mobile developer to design and build the front-end application and a SOA developer for building back-end services in Oracle API Gateway and SOA Suite. The customer's marketing team are responsible for providing a style guide to build the application to. The breakdown of work is around 80 days, with 40-50 of these being the work required to expose services that interact with the internal systems.
That's interesting, so more than 50% of the work was with the back-end services rather than just the mobile UI front end. Was that your experience in the earlier mobile apps you built too?
Andrew: It was probably similar percentages, although we've found it easier and quicker to build apps with MAF compared to ADF Mobile. Some of the tool improvements and framework changes have reduced the development time considerably. For customers who have been on the SOA journey, it should be relatively simple to build an app that utilises their existing services.
Arguably a mobile app can just be a reflection of an existing desktop or browser-based system. Can you talk to the qualities of your app that make it "mobile", in other words what's unique to the solution and what benefits did it bring over the original systems?
Andrew: Whilst some of the features in the application are available on the website, the big change is that it provides a channel for direct interaction between the member and the organisation and adds features more related to the 'now' moment. For example, members will be able to receive notifications that their prescriptions are ready. Also, whilst it isn't being leveraged at the moment, we will be able to detect that a member is in-store and this opens up scope to provide spot/local promotions and to improve the interaction between the member and the shop staff.
What advice can you give for anyone wanting to start their first Oracle Mobile Application Framework application?
Andrew: Don't be afraid! The framework is relatively easy to understand and you can quickly build a nice looking app that exposes enterprise data from multiple sources. Consider your use cases – a big one for us was to be able to work 'offline' – i.e. the user shouldn't need to login to view some of their data.
Look for common issues faced within the organisation – for example, what are the top five reasons that users call your customer services department and how could these be alleviated by exposing a self-service application? The framework also provides a lot of functionality out of the box, so make sure you look through the examples provided to see what is possible. Also take a trip to the MAF tutorial videos as these provide a really good introduction to the overall framework. We've found that using an actual device for development and testing was easier than using the emulators. Getting this set up and working also helps you gain an understanding of the tools available and how various components work. As mentioned before – build for change. Always have in mind how you will add new features and functions and the impact these will have on the app you are designing. Finally – UI style changes are frequent so do not focus too much effort on a pixel-perfect rendering. Build something that looks nice but understand it will likely need to change within a year. The critical aspect is the functionality – users are unforgiving of errors and poor performance.
What qualities do you see an "enterprise" mobile app having over consumer apps? Do you bother to differentiate them?
Andrew: I see there being subtle differences between an app built for the enterprise and one for consumers. An enterprise app is likely to expose more interactive features (i.e. create/update/delete of data), whereas consumer-focused apps will typically show the status of data from within various systems. This means that the enterprise app will likely need more consideration of different levels of access and security roles. As you are exposing functions that change corporate data, you'll likely need to hide/show things like fields and buttons accordingly. However, outside of that there probably isn't much to differentiate. It's tempting to say that an enterprise app may not need as frequent UI changes, but I'm not sure that is true anymore!
When gathering requirements for mobile applications from your customers/users, was there anything that took you by surprise?
Andrew: Typically, when building web or BPM apps it can be difficult to get business users to explain what they require, as often they are not aware of the functionality that is available or what is possible. Mobility is a reverse of this – people know quite well what a mobile device can or can't do, the features available, and have an expectation of how things should work. Design becomes a lot easier as everyone is sharing a common terminology and understanding of what and how things should work. Another noticeable effect was there were a lot of creative ideas proposed around what the app could provide. Whilst some of these have been put to one side for the initial launch of the app, they have been useful for providing an understanding of how we should construct the app to cater for these future enhancements.
In developing a mobile solution for your customers, what advice can you give them in considering mobile applications for the first time?
Andrew: Be clear about why you are launching an app and the features and functions that it will provide. What is the intent and why would someone download and keep the app? Having decided this, break the features into bodies of work that can be completed in phases no longer than a couple of months each.
If it takes you longer than a couple of months to get a release out (for a consumer-focused app), then you're likely to fall behind the market due to the amount of innovation and change that is occurring. Related to this, don't fall at the first hurdle through 'analysis paralysis'. If a feature is too complex to decide upon now, then roll it into the next release. The key thing is to get an app out there that you can expand upon. If you're not sure what features the users want – then ask them through a survey or similar.
If we focus on Oracle's Mobile Application Framework, if you were to pitch the positives to a new customer, what would you cover?
Andrew: The key point to pitch is that it provides a fully featured framework to build an app that can then be deployed to both iOS and Android. This greatly speeds up development and reduces costs, as the developer doesn't need to focus on writing Android or Objective C page structures and navigation. The approach for data controls and pages is based around ADF and is a proven way to build applications. The ease of wiring in services means that it can be very quick to build an application that interacts with your existing systems. For customers with Oracle systems you have that reassurance it has been built to work seamlessly with these. For a developer it's a relatively simple MVC pattern that is easy to understand and use. The fact it is built on open standards and can embed alternate JavaScript libraries or use Cordova means you are not restricted in what you want to do. Finally, it's worth re-iterating that the product is constantly evolving and adding new features. The ability to get these for 'free' without needing to invest time to build them yourself cannot be overstated.
And to be fair, let's cover the negatives too?
Andrew: At times we have hit some quirky device-related issues that have been hard to resolve. Mostly MAF protects you from these issues, but this is compounded by the variety of devices that exist (particularly for Android). Often a bug is with a particular device or version of phone operating system, so this can be frustrating (especially when trying to explain how to recreate it and the person can't). Documentation can be sparse at times as well, or at least examples that cover more complex use cases. There are times where parameters or features are skipped over as though it is obvious how they work. However just like MAF, Oracle is rapidly evolving the documentation set too so this will improve over time.
Do you see any unique challenges in the Australian mobile market?
Andrew: Australia is quite unique due to the proliferation of people that have smart phones. I heard a statistic the other day that the top 2 providers have 29 million mobile contracts between them – more than the population that exists today. Another consideration for Australia is that once you get out of the cities the connectivity can drop off quite significantly. This needs to be factored in when building your app – what is needed to keep the app running in these situations.
Any predictions for enterprise mobility in 2015 and 2020?
Andrew: 2015 is probably easier – the tools will evolve to support multi-tenancy on the device, i.e. a work profile and a personal profile. Some of this exists today, but again it will get easier to manage as the tooling gets better. 2020 – who knows, better interactivity with our rocket cars :-) I have an HTC Desire that was released about 5 years ago and the difference in form and function is shocking.
It seems barely usable and horrendously slow. I think the big change will be towards ease of access and 'always on' connectivity to enterprise data.
Thanks very much for your time Andrew, and for sharing your experiences and expert insights into the mobility market. Best of luck with your next mobile application!
Rubicon Red, an Australian Oracle Gold partner, is an innovative IT professional services firm focused on enabling enterprise agility and operational excellence through the adoption of emerging technologies such as Service-Oriented Architecture (SOA), Business Process Management (BPM), Cloud Computing and Mobile solutions.

NZOUG 2014 Auckland Conference November 19th/21st

It seems the end of the year is the time for conferences, and my favourite conference of them all is coming up, the New Zealand Oracle User Group conference, Wednesday-Friday 19th-21st November in Auckland. With all due respect to our Oracle marketing team who make Open World seem the best conference in the world, the NZOUG conference wins all conference awards because New Zealanders are hands-down the most friendly people you could ever meet (as long as you don't mention the cup - shhhhh ;-) This year, like the last NZOUG conference, the focus of my presentations will be on Oracle's mobility products, exploring our widening portfolio of solutions and why they will be valuable to you:
• Oracle Mobile Application Framework 1 day workshop
• Oracle's Mobile Platform
• Developing Web and Mobile Dashboards with Oracle ADF
Just like the AUSOUG Perth conference, the NZOUG conference will also be attended by international Oracle ACE speakers including Alex Gorbachev, Tim Hall, Guy Harrison and Edward Roske, as well as the regular local Oracle ACE pack Francisco Munoz Alvarez and Arjen Visser. Oh and of course a lot of Oracle staff are presenting too ;-) I hope you'll take the opportunity to attend the NZOUG conference, learn a few things about Oracle, and enjoy their hospitality too.

Learn about Oracle's Mobile Platform at Perth's upcoming Developer Day - Feb 18th

For Oracle customers located in Perth, Australia, OTN will be holding an Oracle Developer Day on February 18th to cover contemporary Oracle development topics. Oracle Developer Days don't come to Perth too often, something to do with being at the end of the earth I suspect, so we hope customers will take the chance to attend while the opportunity presents itself. From a middleware perspective I'll be covering Oracle's Mobile Platform across two sessions, specifically looking at Oracle's ADF Mobile and Oracle Mobile Suite. This is your chance as a customer to get a handle on Oracle's mobile strategy from a development perspective, as well as a good look at what benefits Oracle ADF Mobile and Oracle Mobile Suite can deliver for both developers and your business. The day is not just about middleware though, and Oracle has been lucky enough to have well-known and respected community speakers Connor McDonald and Penny Cookson join the line-up of speakers. Between them, Penny and Connor will also cover what's new in the Oracle database for developers, how to internet-enable your database for mobile, and mobilizing your Oracle APEX applications. Overall you can see the theme for the day is "mobile", which really shouldn't be that surprising from a developer point of view as the development world hums the "mobile first" mantra. We hope you'll take the opportunity to attend the event, listen to both community leaders and Oracle staff give their vision of the mobile development landscape, and network with the local Oracle community. More information about the event and how to register can be found here.

ADF Mobile: URL Schemes

The most recent version of ADF Mobile, 11.1.2.4.0.39.64.51, adds support for URL Schemes, a capability that allows disparate iOS and Android apps to call each other and pass values. If you've ever used a mobile app that invokes email, maps, Evernote or LinkedIn, often they're using URL Schemes to do this. In order to get you on the path to calling other apps using URL Schemes, or even building your own ADF Mobile app so it can be invoked by a custom URL Scheme, the Product Management team has recorded the following ADF Insider Essentials video to get you started. You might note during this recording I stick very carefully to the EmployeesFeature. If you want to invoke multiple different features in your application via the URL Scheme, then in the EventListener.onMessage() method registered in LifeCycleListenerImpl.start(), you would typically also call AdfmfContainerUtilities.gotoFeature(), naming the feature you want to navigate to, before calling AdfmfContainerUtilities.invokeContainerJavaScriptFunction(). However in this version of ADF Mobile the error "Could not show the feature. Attempt to show 'FeatureId' failed. ERROR:- UITabBarController set SelectedViewController: only a view controller in the tab bar controller's list of view controllers can be selected" will be raised. This is due to bug 17450616 and is resolved in an upcoming release of ADF Mobile.
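To make the pattern above concrete, here is a minimal sketch of that kind of listener. It assumes the listener has been registered from LifeCycleListenerImpl.start() as shown in the video (registration omitted here); the class name, the "EmployeesFeature" feature ID, the showEmployee JavaScript function and the payload handling are illustrative assumptions only, not the video's exact code.

    import oracle.adfmf.framework.api.AdfmfContainerUtilities;
    import oracle.adfmf.framework.event.Event;
    import oracle.adfmf.framework.event.EventListener;
    import oracle.adfmf.framework.exception.AdfException;

    // Illustrative listener: receives the custom URL the app was invoked with and
    // forwards it to a JavaScript function inside a specific feature.
    public class UrlSchemeListener implements EventListener {

        public void onMessage(Event event) {
            // The event payload carries the URL used to invoke the app.
            String url = event.getPayload();
            try {
                // Navigate to the target feature first (note: in 11.1.2.4.0.39.64.51
                // this is the call that triggers bug 17450616 described above)...
                AdfmfContainerUtilities.gotoFeature("EmployeesFeature");
                // ...then hand the URL over to a JavaScript function defined in that feature.
                AdfmfContainerUtilities.invokeContainerJavaScriptFunction(
                    "EmployeesFeature", "showEmployee", new Object[] { url });
            } catch (Exception e) {
                // Illustrative error handling only.
                System.err.println("URL Scheme handling failed: " + e.getMessage());
            }
        }

        public void onError(AdfException error) { }

        public void onOpen(String openId) { }
    }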

ADF Debugging Aid: The ADF Request Analyzer

Seasoned Oracle ADF developers will be familiar with the ADF Logging framework, an instrumentation API that you can use to surface what your application is doing behind the scenes, mainly as a debugging aid. The ADF framework itself uses the ADF Logging framework to produce copious logs showing what the framework actually does for you. Admittedly reading the logs, either yours or Oracle's, can be a rather tedious process. Let's admit it, reading text files with thousands of lines of output isn't what we signed up for when we joined the exciting world of the IT industry. Arguably reading the logs can also be like drinking from a fire hose, there's just too much information to digest.
To make your life a little easier as a developer, Oracle's JDeveloper includes the Oracle Diagnostic Log Analyzer, a visual tool included in the IDE designed to allow you to search, filter and read the logs in a structured fashion. A superb addition to the Oracle Diagnostic Log Analyzer is the ADF Request Analyzer. The ADF Request Analyzer is not just designed to assist you to read the logs, but restructures the logs to represent the JSF & ADF lifecycle on each request. In other words it moves from a flat line-by-line log structure, which doesn't really represent the flow or logic of how each request is processed, to showing you visually in a tree structure the different phases of the lifecycle processing the request. To interject with one of the main benefits I see from a personal perspective, the ADF Request Analyzer takes that dry JSF lifecycle you read about in books but never really understood as it was all theoretical, and shows you the runtime representation of the lifecycle, so you really get to see what gets processed when.
For customers who don't know about this tool, the ADF product management team has released a new ADF Insider video on the ADF Insider Essentials YouTube channel entitled the Oracle Diagnostic Log Analyzer - ADF Requests and the JSF Lifecycle. To make it a little more realistic than "here's the tool and you should use it", the video attempts to show you some real running scenarios, as well as how you would use it in a production environment. We hope you find this useful. As can be seen, Oracle's ADF Product Management team continues to commit to providing customers comprehensive learning collateral for your ADF endeavours, with the ADF Insider Essentials channel, ADF Architecture TV, ADF Mobile Academy and much more. Image courtesy of artur84 / FreeDigitalPhotos.net
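As a quick illustration of the instrumentation side of this, the sketch below shows the common ADFLogger usage pattern whose output these tools then let you search, filter and relate to requests. The bean name, messages and chosen log levels are made up for the example, and what actually appears in the diagnostic log depends on your application's logging configuration.

    import oracle.adf.share.logging.ADFLogger;

    // Illustrative managed bean showing the usual ADF Logging pattern: one static
    // logger per class, with messages at different levels that the Oracle Diagnostic
    // Log Analyzer and ADF Request Analyzer can later filter and group.
    public class EmployeeBean {

        private static final ADFLogger logger =
            ADFLogger.createADFLogger(EmployeeBean.class);

        public void saveEmployee(String employeeId) {
            logger.fine("saveEmployee called for id " + employeeId);  // detailed trace output
            try {
                // ... business logic would go here ...
                logger.info("Employee " + employeeId + " saved");
            } catch (Exception e) {
                logger.severe("Failed to save employee " + employeeId + ": " + e.getMessage());
            }
        }
    }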

Tip: Keeping the ADF Mobile PDF Guide up to date

This is a little tip for customers using Oracle's ADF Mobile. If you're like me, it's possible you don't rely on the online HTML version of the Mobile Developer's Guide for ADF, but rather download a PDF version of the file to use locally (look to the "PDF" link at the top right of the guide). For me the convenience of the PDF is that it's faster, I can search the whole document easily, I can read the document split across two pages on my home monitor, if I lose my internet connection the document is still available, and it's easy to read on my iPad (especially on long-haul flights to the US across the Pacific where there is no internet connection!).
The trigger point for me to download the Oracle PDF documentation has always been a new point release of JDeveloper. However in the case of ADF Mobile, as an extension to JDeveloper it is released on a much faster and independent schedule from JDeveloper, and this includes updates to the documentation. As such the 11.1.2.4.0 ADF Mobile PDF guide you have locally might be out of date and you should take the opportunity to download the latest version. This is also particularly important for ADF Mobile as not only are many new features being added for each release and included in the new documentation, but the guide is under rapid improvement to clarify much of what has been written to date. Our documentation teams are super responsive to suggestions on how to improve the guides and this often shows per point release.
How do you tell if you have the latest guide? Look at the document part number, which right now is "E24475-03". This is a unique ID per release for the document, the first part being the document number, and the part after the dash the revision number. If the website document number has a higher revision number, it's time to download a new, up-to-date PDF. One last thing to share, you can follow the ADF Mobile guide document manager Brian Duffield on Twitter to keep abreast of updates. Image courtesy of Stuart Miles / FreeDigitalPhotos.net

Diagnosing ADF Mobile iOS deployment problems

From time to time I encounter customers who have taken possession of a brand new Apple Mac, have that excited "I've just spent more on a computer than I ever wanted to but it's okay" crazy gleam in their eye, but on pre-loading all the necessary software for Oracle's ADF Mobile to start their mobile campaign, following Oracle's setup instructions and deploying their first app to Apple's Xcode iPhone Simulator, they hit this error message in the JDeveloper Log-Deployment window:
[01:36:46 PM] Deployment cancelled.
[01:36:46 PM] ----  Deployment incomplete  ----.
[01:36:46 PM] Failed to build the iOS application bundle.
[01:36:46 PM] Deployment failed due to one or more errors returned by '/Applications/Xcode.app/Contents/Developer/usr/bin/xcodebuild'.  The following is a summary of the returned error(s):
Command-line execution failed (Return code: 69)
"Oh, return code 69, I know that well" I hear you say. Admittedly the error code is less than useful besides drawing some titters from the peanut gallery. Before explaining what's gone wrong, I think it's useful to teach customers how to diagnose these issues themselves. When ADF Mobile commences a deployment, be it to Apple's iOS or Google's Android platforms, JDeveloper and ADF Mobile do a good job in the Log window of showing you what the deployment process entails. In the case of deploying to iOS the log window will literally include the Xcode commands executed to complete the deployment cycle. As an example here's the log output that was produced before the error message was raised... take the opportunity to read this line by line and note the command-line calls highlighted in blue: (Note some of the following lines have been split over multiple lines to suit reading on this blog, each original line is preceded by a timestamp.
Ensure to check the exact commands from JDev)
[01:36:33 PM] Target platform is (iOS).
[01:36:33 PM] Beginning deployment of ADF Mobile application 'LayoutDemo' to iOS using profile 'IOS_MOBILE_NATIVE_archive1'.
[01:36:34 PM] Command-line executed: [/Applications/Xcode.app/Contents/Developer/usr/bin/xcodebuild, -version]
[01:36:34 PM] Command-line execution succeeded.
[01:36:34 PM] Running dependency analysis...
[01:36:34 PM] Building...
[01:36:34 PM] Deploying 3 profiles...
[01:36:35 PM] Wrote Archive Module to /Users/chris/fmw/jdeveloper/jdev/extensions/oracle.adf.mobile/Samples/PublicSamples/LayoutDemo/ApplicationController/deploy/ApplicationController.jar
[01:36:35 PM] WARNING: No Resource Catalog enabled ADF components found to package
[01:36:36 PM] Wrote Archive Module to /Users/chris/fmw/jdeveloper/jdev/extensions/oracle.adf.mobile/Samples/PublicSamples/LayoutDemo/ViewController/deploy/ViewController.jar
[01:36:36 PM] Verifying existence of the .adf source directory of the ADF Mobile application...
[01:36:36 PM] Verifying Application Controller project exists...
[01:36:36 PM] Verifying application dependencies...
[01:36:36 PM] The application may not function correctly because the following dependent libraries are missing: /Users/chris/jdev/jdeveloper/jdeveloper/jdev/extensions/oracle.adf.mobile/lib/adfmf.springboard.jar
[01:36:36 PM] Verifying project dependencies...
[01:36:36 PM] Validating application XML files...
[01:36:36 PM] Validating XML files in project ApplicationController...
[01:36:36 PM] Validating XML files in project ViewController...
[01:36:40 PM] Copying common javascript files...
[01:36:41 PM] Copying FARs to the ADF Mobile Framework application...
[01:36:41 PM] Extracting Feature Archive file, "ApplicationController.jar" to deployment folder, "ApplicationController".
[01:36:42 PM] Extracting Feature Archive file, "ViewController.jar" to deployment folder, "ViewController".
[01:36:42 PM] Deploying skinning files...
[01:36:43 PM] Copying the CVM SDK files built for the x86 processor...
[01:36:43 PM] Copying the CVM JDK files built for the x86 processor...
[01:36:43 PM] Command-line executed: [cp, -R, -p, /Users/chris/fmw/jdeveloper/jdev/extensions/oracle.adf.mobile/iOS/jvmti/x86/, /Users/chris/fmw/jdeveloper/jdev/extensions/oracle.adf.mobile/Samples/PublicSamples/LayoutDemo/deploy/IOS_MOBILE_NATIVE_archive1/temporary_xcode_project/lib]
[01:36:43 PM] Command-line execution succeeded.
[01:36:43 PM] Command-line executed: [cp, -R, -p, /Users/chris/fmw/jdeveloper/jdev/extensions/oracle.adf.mobile/iOS/jvmti/jar/, /Users/chris/fmw/jdeveloper/jdev/extensions/oracle.adf.mobile/Samples/PublicSamples/LayoutDemo/deploy/IOS_MOBILE_NATIVE_archive1/temporary_xcode_project/lib]
[01:36:43 PM] Command-line execution succeeded.
[01:36:43 PM] Copying security related files to the ADF Mobile Framework application...
[01:36:44 PM] Command-line executed from path: /Users/chris/fmw/jdeveloper/jdev/extensions/oracle.adf.mobile/Samples/PublicSamples/LayoutDemo/deploy/IOS_MOBILE_NATIVE_archive1/temporary_xcode_project/
[01:36:44 PM] Command-line executed: /Applications/Xcode.app/Contents/Developer/usr/bin/xcodebuild clean install -configuration Debug -sdk /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator6.1.sdk DSTROOT=/Users/chris/fmw/jdeveloper/jdev/extensions/oracle.adf.mobile/Samples/PublicSamples/LayoutDemo/deploy/IOS_MOBILE_NATIVE_archive1/Destination_Root/ IPHONEOS_DEPLOYMENT_TARGET=5.0 TARGETED_DEVICE_FAMILY=1,2 PRODUCT_NAME=LayoutDemo ADD_SETTINGS_BUNDLE=NO
As you can see, when we move on from JDeveloper undertaking its work, it then passes the code off in the last few lines for Apple's Xcode to assemble and deploy the required .ipa file. From the original error message which followed this, complaining about xcodebuild failing with return code 69, we can quickly see the exact command line used to call xcodebuild. As this is the exact command-line call with all its options, you're free to open a Terminal window in Mac OSX and execute the same command by simply copying and pasting the command line. And via this you'll then find out what return code 69 actually means. Unfortunately it's not that exciting.
For Macs that have just been installed and configured with Xcode, Xcode (and for that matter iTunes), which is required by ADF Mobile to deploy, must have been run at least once beforehand on your brand new Mac (to be clear, that's once ever, not once every restart). On doing so you will be presented with a license agreement from Apple that you must accept. Only once you've done this will the command-line calls work. They're currently failing as you haven't accepted the legal terms and conditions. (Arguably you can also accept the terms and conditions from the command line too, but ADF Mobile cannot do this on your behalf, so it's just easier to open the tools and confirm the legal requirements that way.)
Putting aside the error code and its meaning, watching the log window, watching what commands are executed and learning what they do will assist you to diagnose issues yourself and solve these sorts of issues relatively quickly. From my perspective as an Oracle Product Manager, it allows me to say "this is the stuff you don't need to worry about when you use ADF Mobile when it's configured correctly" .... as you can see my salesman qualities shine through. For anyone who is happily using ADF Mobile on a Mac and wondering why you didn't hit these issues, it's quite likely that you already accepted the license conditions before deploying via ADF Mobile. For instance, though I'm not a fan of iTunes itself, iTunes was one of the first things I loaded on my Mac to access my Justin Bieber albums. Image courtesy of winnond / FreeDigitalPhotos.net

The ADF EMG day at Oracle Open World 2013 Sunday 22nd

I'm happy to say through the kind efforts of the ADF community volunteers, under the expert banner of ODTUG, the ADF EMG will be running another day of ADF sessions at Oracle Open World's user group Sunday, 22nd September. This adds another six ADF sessions and a whole day of solid ADF content for you to enjoy at the world's largest Oracle conference for developers. The general announcement of sessions is covered on the ADF EMG forums, but a summary of the expert speakers and topics is here:
• 8:00am - Oracle ADF Task Flows Beyond the 10-Minute Demo [UGF7001] - John King
• 9:15am - Oracle on Your Browser or Phone: Design Patterns for Web and Mobile Oracle ADF Applications [UGF9898] - Floyd Teter & Lonneke Dikmans
• 10:30am - ADF Performance Tuning War Stories [UGF2737] - Stephen Johnson, Frank Houweling, Eugene Fedorenko
• 11:45am - Top 10 Web App Vulnerabilities, and Securing Them with ADF [UGF9900] - Brian Huff
• 2:15pm - Worst Practices When Developing an ADF Application [UGF9860] - Paco van der Linden & Wilfred van der Deijl
• 3:30pm - WebCenter & ADF - Responsive and Adaptive Design for Desktop, Mobile & Tablet [UGF9908] - John Sims
You can also view the sessions in the OOW content catalog, and check out all of the Oracle ADF content at Oracle OpenWorld 2013 too. We hope you'll take the opportunity to join the ever growing ADF community at our largest ADF event; come learn, share, and participate in something that started as a single session at OOW in 2008 with just 20 people and has grown to a whole day of ADF content for all attendees to enjoy in 2013.

ADF rocks the Australian AUSOUG conference series

For Oracle customers looking to learn more about ADF and participate with the larger ADF community in Australia, this year's AUSOUG Insync conference series across multiple cities has more ADF content than ever before. As an ADF product manager based in Australia, I've seen the AUSOUG conference go from just 1 or 2 ADF presentations in previous years to this huge amount in 2013; the Aussie ADF market is certainly expanding. Following is a list of ADF presentations per city, and I've taken the liberty of including some other FMW topics including web services, FMW infrastructure, Fusion Apps and Forms which relate to ADF. We hope you'll take time out to register for the conference, come learn more about ADF, and be part of the expanding ADF community in Australia.
Sydney 15-16th August
• Speed dating - Oracle JDeveloper 12c and Oracle ADF New Features - Chris Muir - Oracle Corporation
• The Future of Oracle Forms: Upgrade, Modernize, or Migrate? - Chris Muir - Oracle Corporation
Melbourne 19-20th August
• Building an award winning product with Oracle Application Development Framework - Primoz Luzar - Callista
• Case Study: Redesigning core on-line systems in ADF JDeveloper - Penny Cookson - Sage Computing Services
• Redeveloping a legacy application in JDeveloper ADF - half day workshop - Penny Cookson - Sage Computing Services
• Speed dating - Oracle JDeveloper 12c and Oracle ADF New Features - Chris Muir - Oracle Corporation
• The Future of Oracle Forms: Upgrade, Modernize, or Migrate? - Chris Muir - Oracle Corporation
• Best Practices for accelerating the roll out of Oracle Fusion infrastructure - half day workshop - Matthew Wright, Craig Barr - Rubicon Red
• Extending Fusion Applications: What, Who, When and HOW - Debra Lilley - Fujitsu UK
• The Universal Web Service - Data Driven SQL/XML Web Service - Evan Vicary, Primoz Luzar - Callista
Brisbane 2-3rd September
• Case Study: Redesigning core on-line systems in ADF JDeveloper - Penny Cookson - Sage Computing Services
• Redeveloping a legacy application in JDeveloper ADF - half day workshop - Penny Cookson - Sage Computing Services
Adelaide 5-6th September
• Case Study: Redesigning core on-line systems in ADF JDeveloper - Penny Cookson - Sage Computing Services
• Redeveloping a legacy application in JDeveloper ADF - half day workshop - Penny Cookson - Sage Computing Services
Perth 12-13th November
• Case Study: Redesigning core on-line systems in ADF JDeveloper - Penny Cookson - Sage Computing Services
• CRUX (CRUD meets UX) Case Study: Building a Modern Applications User Experience with Oracle ADF and UX Design Patterns - Chris Muir - Oracle Corporation
• Elementary my Dear Weblogic - the Case of Spying on ADF - Ray Tindall - Sage Computing Services
• Empowering your business with mobile applications - Eddie Phan - RACWA
• Oracle ADF for Newbies: Surviving your first project - Kylie Payne - Sage Computing Services
• Redeveloping a legacy application in JDeveloper ADF - half day workshop - Penny Cookson - Sage Computing Services
• The Future of Oracle Forms: Upgrade, Modernize, or Migrate? - Chris Muir - Oracle Corporation
For anyone based in Perth, you also have the opportunity to participate in the AUSOUG ADF special interest group, which runs sessions bimonthly. We hope to see you at one of these events soon.

This is ADF Architecture TV

In 2011/12 Oracle conducted a survey amongst key ADF customers asking what they'd like to see next from Oracle's ADF Product Management team beyond the ADF basics which we've already covered in-depth. Concepts such as ADF design, architecture, best practices and patterns were high in the results. We responded in late 2012/early 2013 by delivering a one week internal training course to staff and key Oracle partners around the world. Yet we wanted to do more for everyone, and with support from Oracle's Development Tools management we'd like to announce the ADF Architecture TV YouTube channel. This free TV channel will publish, week by week, episode by episode, a huge array of short, distinct topics relating to all parts of the ADF development lifecycle, from planning, to design, to development, deployment and delivery. All in all we currently have plans for nearly 100 episodes. The channel also takes a distinct departure from our traditional online video content in that key Oracle staff now stand in front of the YouTube camera; no longer are we a background narrator of the slides. (Though our budget is limited, no makeup artists for us, you get to enjoy us even on our bad hair days!) The complete set of topics to be covered is currently indexed on the ADF Architecture Square for you to check out. We've already published two episodes on the YouTube channel so you can get a taste of what's to come, which you can enjoy either from work with a small time investment each week, or even from your iPad/Android tablet at home (well maybe that's just me, I really need to get a life outside Oracle). We hope you personally will find the channel a new and engaging model for consuming our ADF collateral, and in turn this will help you and your organization to succeed on your next ADF project. Take time out to subscribe to the channel so you receive instant notification of new episodes being published.

Migrating ADF Mobile apps from 1.0 to 1.1

I thought I'd share something valuable from ADF Mobile's product manager Joe Huang. Within the context of migrating ADF Mobile v1.0 11.1.2.3.0 applications to v1.1 11.1.2.4.0, there are some important migration notes in the release notes that developers should read. Of particular interest for myself is the change to how JavaScript libraries need to be imported, for the series of device integration videos I recently recorded. In ADF Mobile v1.0 you would have used the following JavaScript references:
<script type="text/javascript" src="../../../../www/js/jquery-1.7.1.min.js"></script>
<script type="text/javascript" src="../../../../www/js/adf.el.js"></script>
<script type="text/javascript" charset="utf-8" src="../../../../www/js/phonegap-1.0.0.js"></script>
<script type="text/javascript" src="../../../../www/js/adf.mf.device.integration.js"></script>
In ADF Mobile v1.1 you replace these with just the two following lines:
<script type="text/javascript">if (!window.adf) window.adf = {}; adf.wwwPath = "../../../../www/";</script>
<script type="text/javascript" src="../../../../www/js/base.js"></script>
That's it - otherwise there are no other changes needed. Please note that if you need to invoke device services from a localHTML feature, you would still need to include cordova-2.2.0.js as well. In the previous version, due to some internal issues, PhoneGap 1.0.0 always needed to be included even if you were only calling ADF Mobile related functions. In 11.1.2.4.0, only base.js needs to be included for ADF Mobile related functions, and cordova-2.2.0.js is needed only for device services integration like the camera. And yes, PhoneGap 1.0.0 has been replaced with Cordova 2.2 in ADF Mobile 11.1.2.4.0. Thanks very much to Joe for sharing this information. Image courtesy of Vlado / FreeDigitalPhotos.net

Not 'how' but 'why' should you upgrade to JDeveloper & ADF 11.1.1.7.0?

I recently received questions from Oracle ACE Director (ACED) Tim Hall on upgrading to JDeveloper 11.1.1.7.0, not necessarily asking about the 'how' but rather the 'why should we upgrade to the latest version beyond new features?' Oracle has a close relationship with its ACED members, and overall the questions that we receive from the ACE Directors are well thought out and indicative of what the wider customer base is interested in, so we're more often than not willing to address their concerns. As such Tim and I thought we'd collaborate to write and publish the questions and answers for everyone to benefit from.
Tim Q1) Assuming we don't need the extra functionality in ADF 11.1.1.7, what is the advantage of moving to it? Are the bug fixes and maybe browser compatibility changes enough to warrant the upgrade?
Chris A1) Of course Oracle would love you to look at the new features and we hope they are compelling enough reasons on their own to get you to upgrade. Probably the most exciting parts are the 'eye candy' components (cough, DVT controls) such as the Sunburst, TreeMap, TimeLine and Code Editor. However for programmers the PanelGrid, ListView and new (returned?) pagination support in Tables will also make ADF UI development easier, and who doesn't want an easier job?
Possibly a more compelling reason is the optimisations introduced under the covers, which might not be overly obvious, but will make your users happy with the responsiveness of the app migrated to 11.1.1.7.0. In this release Oracle has introduced a new skin 'Skyros' which to some people will be 'just another skin'. Yet this skin introduces CSS3, with its graphics rendering support, to replace the 10s, if not 100s of images we used to include with each skin for the overall application chrome. This means your users' browsers now download far fewer bytes, and the browser and CSS3 can use the graphics processing of the users' PCs for faster interactions; the end outcome, faster apps, happier users.
Not enough on the optimization front? To prove my point about optimizations under the covers, check out the bug fix list and note the DOM optimizations (e.g. bug 14015969) to make components more 'HTML-light-weight'. The ADF programmer won't see this in their day to day coding, but it will make a performance difference to the applications as the browser has less HTML to review and DOM to process.
Yet I agree, this and the other new features might not be enough. So let's address your other points. Between releases Oracle does put in significant effort addressing JDeveloper and ADF bugs, as well as behind the scenes optimizations that you get for free and we don't necessarily publicize. Admittedly we also introduce a few new bugs, but let's be realistic, that's the nature of modern complex software, there's always going to be bugs. The goal is of course to reduce the amount and this is something Oracle pursues at length. Since starting at Oracle 1 year ago I've lodged just over 100 bugs and ERs, and have worked at length with our developers who are committed to addressing these issues, something all our team members do daily. With all this in mind, from experience in having talked to external development teams who have upgraded in the past, generally speaking when quizzed about the quality of the IDE and ADF, I've not yet heard a customer say they wanted to go back to an earlier version. So overall my opinion is yes, you should diligently plan to upgrade when new releases come out.
In turn you mentioned browser compatibility.
One goal of ADF Faces is to remove the mess of programming and testing for different browsers and versions. This problem for everyday HTML, CSS and JS programmers has got worse in my opinion with the acceleration of browser releases and even mobile browsers thrown into the mix (have you read Webkit might be the next IE6?). So one thing we're doing behind the scenes on every customer's behalf is testing and fixing issues of all our components on each browser release as they become available. As an exception that hopefully proves the rule, have a look at the 11.1.1.7.0 release notes and read all the different browser issues Oracle has detected and has bugs listed for; Oracle continues to make these sorts of issues a priority. Why? We've a vested interest in fixing these too, our applications are all written in ADF, all rendered in browsers, we need this to work for our products too otherwise our customers won't use them.
Arguably some readers might say that they don't have to upgrade to pursue the latest browser fixes as their application only runs internally to their organization where the browsers are pegged to a certain vendor and version. I think this used to be a relevant argument, but it is becoming less so now with the BYOD mobile movement, where staff are accessing internal systems via their own tablets and smartphones. It's only a matter of time till your CEO sends a nasty email that she can't access the systems from her latest Android or iOS device. Oracle includes optimizations & fixes for these devices, indeed 11.1.1.7.0 introduces better support for Android Chrome to address the mobile browser market issues too.
At this point in re-reading my answer, I feel like I've been drinking the Oracle Kool-Aid a little too much. So let's address reasons why you shouldn't upgrade at this stage. Depending on the scale of your ADF infrastructure (code, servers, developer PCs, CI engines), the act of upgrading can be substantial across the organization. Undertaking an upgrade while your part of the organization is under deadlines might not be the wisest thing to do. If you're just about to go to production with your latest ADF application, and the dev teams are under pressure to get it out the door, throwing in a new JDev/ADF release at the last moment could be disastrous to your milestones, and even the quality of your software if the QA team doesn't have time to do a full regression test.
As most readers will be aware, regression testing is an important process of ensuring that between upgrades a system doesn't introduce any new bugs, something you definitely should pursue when upgrading your ADF infrastructure or anything else for that matter. Yet at many sites they'll attempt to do this testing manually, which with any large system will be difficult, time consuming, and prone to missing issues. As such, in chasing the ADF release cycle with any system of size, it's well worth pursuing automated regression testing through tools like Oracle Application Test Suite or similar. Finally, recognize that Oracle is going to continue to release new versions of ADF, with new features, fixes and optimizations. As such it seems prudent to me that any organization serious about software development shouldn't be planning these upgrades on an ad hoc basis, but it should be something built into their plans.
The browser world is going to continue to churn, Oracle is going to continue to innovate and fix issues, so why are customers inadvertently drawing a line in the sand after each upgrade and not planning for the next?
Tim Q2) Is there a significance in upgrading as far as support lifecycle is concerned?
Chris A2) Good question, and yes, something I've written about before: Do you know your ADF "grace period"? In brief, while Oracle agrees to support ADF for considerable time, your grace period for bugs and fixes to the 11.1.1.6.0 release is now ticking. Disclaimer: I'm paraphrasing the Oracle support policies here and it is important Oracle customers keep up to date with these policies themselves. Please read the previous blog and research the associated support notes.
Tim Q3) Is the upgrade likely to break anything that has already been converted for 11.1.1.6?
Chris A3) Well, we'd hope not ;-) Oracle doesn't deliberately release software that is designed to break our customers' software, but as we admitted together earlier on in this discussion, modern software is a complex business and Oracle will have bugs in its software; to say otherwise would be a lie. But that's why I put the emphasis on regression testing and automated regression testing early on, not just for Oracle! … but also you. The grace period gives you a year to upgrade your systems in your test environments, and lodge bugs and get fixes from us, before you upgrade.
As a final note, some customers will be asking why Oracle has two current JDeveloper versions, 11.1.1.7.0 and 11.1.2.3.0. This older blog post, though talking about 11.1.1.6.0 and 11.1.1.2.0, is still relevant and worth a read for anybody confused about the two releases. Image courtesy of pat138241 / FreeDigitalPhotos.net

Book review - SOA Made Simple

A few weeks ago Packt Publishing asked if I would review SOA Made Simple, written by Lonneke Dikmans & Ronald van Luttikhuizen. Being a so-called FMW expert, but (self-induced) pigeon-holed in the ADF space, I always take these offers; a good book provides an opportunity to widen my understanding of the topics & keep up with trends. Besides, I know Lonneke & Ronald personally from my time as part of the Oracle ACE Directorship before I joined Oracle & they've always impressed me as knowledgeable and down to earth people (and this also declares my bias in reviewing their new book). Now I must admit I started reading this book & by the end of chapter one I was concerned the book was going to be an overall dry read; the opening chapter discusses architecture "ontologies", a word that required a careful check of the dictionary on my part. But with some perseverance I completed chapter two, and in fact I completed the rest of the book in one sitting. Somehow the ontologies grabbed me ;-) So where this book succeeds is, to carry the joke forward, in its focus on SOA ontologies, with a careful if not methodical consideration of what makes a SOA project and services and how to proceed. In less than 300 pages it covers many of the basic SOA concepts you need to know to understand SOA, and also what you'll need to do as part of a SOA project, without getting bogged down in detail. Thus the title SOA Made Simple, I guess. Frank's already given a good run down of the chapters, but to summarize, the book first introduces you to different IT architectures and where SOA fits in, then what is a service, the cornerstone of SOA projects. Next how to identify & classify services, & only then are you introduced to the SOA platform, that is the different vendor solutions and their capabilities. What's a winner at the end of this book are the two chapters entitled "How to Spend Your Money and When" and "Pick your Battles", which take a delightfully down-to-earth and pragmatic approach to SOA projects. Definitely this isn't a book that discusses how to use the SOA tooling from different vendors including Oracle, so don't pick this book up expecting to learn Oracle SOA Suite. But conversely, if you're looking for a concise text to articulate defining and progressing with a SOA project, this is for you. One other comment I wasn't quite sure where to put in my brief review is that the book is littered with references to other key SOA and computing texts and papers. Let's just say for a SOA text this was a brilliant idea, because it keeps the overall book digestibly small by not trying to cover all the same information again, but you're free to go read more on the referenced topics if it takes your fancy. Overall a recommended read for anyone starting out with a SOA project, and for me, a good refresher on the SOA concepts and a concise list of what needs to be done over a SOA project's lifecycle.

ADF Mobile: Avoiding the Android Emulator with AndroVM

(Post edit 2nd July 2014: For users of Oracle's next generation ADF Mobile, known as Oracle Mobile Application Framework (MAF), the Android Emulator now supports Intel's HAXM drivers which significantly speed up the execution of the emulator. We highly recommend this as a way of speeding up your development experience.)
If you've had a chance to use ADF Mobile since its release and you're developing for Android, you will have been unlucky enough to encounter a small issue with Google's Android Emulator: it's incredibly, painfully, excruciatingly slow. It really is an annoyance as the iOS Simulator on Macs runs lightning quick. Oh the woes of developing for the Android platform, who would have thought this was the developer friendly platform?! ;-) Mind you the problem isn't particular to ADF Mobile, just a generic issue with Google's software which all Android programmers have to deal with. This means of course you can search for solutions on the internet, and between the hundreds if not thousands of posts detailing requests on "how do I make the Android Emulator faster" you will find the occasional very useful answer. The most common one is "test on an actual device". However there are some alternatives, one of which I've recently been exploring and would like to mention here in order to make ADF Mobile Android testing slightly less painful. Before I do, I must clearly note this option is not supported by Oracle at all, and, via my testing, it doesn't 100% work with all of ADF Mobile's features. But given the woeful state of the Android Emulator it at least gives you an option to work with.
AndroVM is a project dedicated to providing a working Android VM client for VirtualBox. Of significance for ADF Mobile, unlike other efforts such as the Android-x86 project, AndroVM attempts to mimic the ARM architecture under the covers with an ARM instruction set translator called Houdini. Why is this important? If you can imagine, Oracle has written a mini JVM for the iOS and Android platforms which sits above the hardware layer. If the underlying chip architectures are wrong, this is going to cause the JVM grief.
Installing AndroVM is a fairly easy affair, you download and import the VirtualBox OVA file. Ensure to download an OVA with Houdini included. Beyond here it takes 14 seconds to boot to the Android home screen on my Mac. Compare that with the 38 seconds it takes to boot the Android Emulator on the same machine even while hot restarting, that's a big difference. The AndroVM documentation pages have plenty of useful installation notes which are worth checking out. One important point: the Escape key on your keyboard is mapped to the Android back button.
From here the most important bit to configure correctly is the VM network settings. At the time of writing the androVM-4.1.1_r6.1-20130222 release comes with two network settings, the first (eth0) is a null (disabled) connection, and the second (eth1) is a NAT connection to emulate a wifi connection. The documentation states you need to set the first connection as a "host only network" and attach it to a "DHCP-enabled network". For whatever reason I couldn't get this to work, so instead I configured a "bridged network". Note this connects to my Mac "en1: Wi-Fi (AirPort)" connection, which in turn connects to my local router which serves an IP for the Android VM. If you decide to go this same path, you'll need to pick an appropriate connection on your local machine to bridge the network connection.
To use AndroVM from here, what we're going to do is use the Android SDK adb command line tool to manually deploy an Android .apk file generated by ADF Mobile to the VM.  Before we can do this we need to know the IP address the VM has started with.  You can obtain this by logging in, opening Apps then androVM Configuration which will show you the IP address top left.  If you can't see this it means you haven't configured the network settings on the VM correctly.  You won't be able to proceed without this so take a step back and get that working correctly.  There's an excellent explanation of the different VirtualBox network types here.

Once you know the IP of the VM, open a terminal window, cd to your Android SDK install directory then cd to platform-tools:

cd <sdk dir>/platform-tools

From here we'll use adb to connect to the VM by issuing the following command against the IP address we revealed a few steps ago:

adb connect <ip address>

...and if you've set up the network connections properly you should hopefully see a successful connection.  At this point you need to return to JDeveloper and generate the .apk package.  As such we cannot deploy directly from JDeveloper at this stage (though you could easily set up an extension to do this), rather we'll manually deploy the .apk via adb into the VM.  Upon generating the .apk make sure you understand where the file was generated as it will be used in the next command.  Returning to the terminal once the .apk has been successfully generated, you need to issue the following command:

adb install <path to apk>/<apk file name>.apk

(Note it's worth reading the documentation on using adb, it's a very handy little utility.)
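Pulled together, the whole deploy cycle looks roughly like the following sketch.  The IP address and .apk path are examples only, so substitute whatever the androVM Configuration app and your JDeveloper deployment actually gave you.

cd <sdk dir>/platform-tools
adb connect 192.168.1.123                 # example IP taken from the androVM Configuration app
adb devices                               # the VM should appear in the list, typically as 192.168.1.123:5555
adb install ~/deploy/DeviceDemo.apk       # example path to the .apk generated by JDeveloper
adb install -r ~/deploy/DeviceDemo.apk    # -r reinstalls over an existing deployment on later runs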
In the example below you can see I've successfully installed the DeviceDemo sample application supplied with ADF Mobile.  Note that blindingly fast <2 second deployment time!  From here you're ready to test the application, and just like the regular emulator the application will be available from the Apps window.  And from there you can start the application; here's the Device Demo running, and it only took 3 seconds to start.

Now I've deliberately led us down the garden path here with the Device Demo, because it reveals limitations in the solution.  The Android Emulator allows us to connect up PC services like the webcam so we can take pictures as if we were using a real device.  Via AndroVM we're not going to have the same facilities.  So while it's great that we've massively cut the deployment times, it's not going to be suitable for all our needs.

Beyond this, from my testing it seems to run most (not all) applications fine.  I took all of the ADF Mobile sample applications (CompGallery, DeviceDemo, GestureDemo, HelloWorld, JavaDemo, LayoutDemo, LifecycleEvents, Navigation, PrefDemo, SkinningDemo, StockTracker) and they all appeared to work with some rudimentary testing.  However the HR application did crash at the beginning.  I suspect this might be because it's the only application that uses the SQLite database, though when I have time I'll need to investigate this further.

Be that as it may, this is not a supported platform for ADF Mobile so you'll need to live with its issues.  However the speed benefits are, I'm willing to bet, enticing.

I'm going to leave the Comments for this blog post open.  Please do not post any "it doesn't work" or "how do I get this to work" type comments as I will delete them; as I said, it's not supported.  However if you do happen to find a problem and solve it, it would be good if you could share it here.

One final note: remember that your company, just like Oracle, may have a policy restricting you from installing third party software which is not officially approved. You should check before proceeding with this installation.


ADF Essentials meets the Raspberry Pi

Summer holidays in Australia always have me searching for a new project to keep me from annoying the kids by stealing their Lego. This year I was introduced to the Raspberry Pi and set myself a goal of installing and running ADF Essentials on this miniature computer.

So what's a Raspberry Pi?

The Raspberry Pi, or Raspi or Pi for short, is a credit card sized computer that was designed to teach the "real" basics of computers to kids.  The story goes a "bunch of old computer hacks" recognized that kids today are far removed from the electronics side of computing and as a result we're losing kids with an interest in the fundamentals of what makes computing tick.  As such the team had the brilliant idea of creating a small and cheap Linux driven computer with the usual standard connectors (HDMI, USBs etc), but also, importantly, a set of GPIO pins which allow the Pi to connect to a traditional old electronics kit to control LEDs, switches, sensors and more.  The specs of the Model B Raspberry Pi (there are two models, A and B) are deliberately minimal, running on a 700MHz ARM processor with 512MB RAM (shared with the GPU), and it doesn't even have a hard drive, but rather uses an SD card for storage.  The neat part is the base price for the Model B is only US$35!

If you're exceptionally underwhelmed by the specs of the Pi, you might not be understanding the point.  This isn't meant to be a high-powered Intel quad-duo-core thing, but a small lightweight computer to learn from, or to run exactly one task.  Many people have "got it", and as a result the Pi has reportedly seen huge success in just the last year, selling over a million units.  The Pi has found itself not only supporting the important role of teaching kids electronics and computer science, but it has also established itself in the hacker space with people declaring successful projects in robotics, home media centers, and lighting a humble LED when you receive an email.

What's this got to do with ADF Essentials?

The Pi is capable of running different Linux distributions, including a Debian distribution known as Raspbian "wheezy".  Instantly this piqued my interest, as where Linux goes, a JVM often follows, followed by Java EE servers such as Tomcat, and ha! Doesn't ADF Essentials run on Tomcat?!  So one 38C (100F) summer Perth day while hiding in the air conditioning I set myself a goal of seeing if I could get ADF to run on the Pi.  Why?  Why not?  I knew it wasn't going to run well, a full blown server solution on only 512MB of memory is pushing the limits, but would it actually run at all?  I'm happy to declare that with a few hours to spare and a great amount of patience, a world record was declared on 27th January 2013.

From here on in I'd like to describe the steps to build the ADF Essentials Pi from scratch, for all those ADF hackers out there ;-)

Base Pi Specs and Basic Setup

For this setup I worked with a Model B Raspberry Pi with 512MB RAM and a 4GB SD card.  By chance the Model A was released today with just 256MB RAM, possibly a challenge too far for ADF Essentials!  The SD card needs to have a preinstalled Linux, of which Raspbian "wheezy" is a Debian distro built specifically for the Pi.  For the purposes of my tests I used the 2012-12-16 wheezy img/zip available via the Raspberry Pi main website.  You can purchase the SD card preinstalled or burn your own installation via your PC/Mac onto the card.  The Raspberry Pi wiki has a good set of instructions on how to do this and I used the RPi-sd v1.2 card builder for the Mac to assist the process.
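If you'd rather burn the card by hand from the Mac terminal instead of using a GUI card builder, a rough sketch follows.  Treat it with care: the disk device and image file name are assumptions for illustration, and dd will cheerfully wipe the wrong disk if you point it at one, so double check the device with diskutil list first.

# sketch only - /dev/disk2 and the image name are assumptions; verify your SD card's device first
diskutil list
diskutil unmountDisk /dev/disk2
sudo dd if=2012-12-16-wheezy-raspbian.img of=/dev/rdisk2 bs=1m
sudo diskutil eject /dev/disk2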
If you happen to take the RPi-sd route, ensure that both the .app and .img files are located in a directory without spaces otherwise you'll hit errors.

After the installation of the base SD I did hit a problem in getting the HDMI output to work.  The fix was, via my Mac, to edit the config.txt file on the SD card to include the following line:

hdmi_drive=2

Upon the first boot of the Pi you're presented with the Raspi-config screen, of which I changed the following options:

expand_rootfs - invoked
change_locale - selected an Aussie locale en_AU.UTF-8
change_timezone - Australia/Perth
ssh - enabled
boot_behaviour - Start desktop

There is also an update command.  As I live behind the Oracle proxy this didn't work for me, but it's worthwhile to run if you've a normal internet connection to update your Pi with the latest software.  Once rebooted you're greeted with the Raspberry Pi desktop.

Due to the proxy issue above, I opened a terminal and configured apt-get to work with the Oracle proxy:

sudo nano /etc/apt/apt.conf.d/01proxy

..with the following line:

Acquire::http::Proxy "http://supersecretoracleproxy:80";

..and on returning to the terminal executed the following 2 lines to update my Pi installation with the latest software:

sudo apt-get update
sudo apt-get upgrade

This will take some time to complete, time for a nice cup of raspberry tea while you wait.

As connecting the Pi to a dedicated monitor becomes a bit tedious, you can configure SSH and a VNC server so you can maintain the Pi remotely.  Firstly you'll need to know the IP address of the Pi by executing ifconfig in the terminal so you can connect via SSH/VNC.  Remember that by default the Pi uses a dynamic IP (DHCP) so this may change on each boot.  I'll assume 10.1.1.2 from here on in.  To set up and use SSH and a VNC server, Adafruit provides good instructions on how to do this.  Ensure to remember the password you define for the VNC server.

From a Mac, accessing the Pi via SSH is easy as it's supported through the terminal in Mac OS X.  On Windows you'll need to install a dedicated SSH client.  Also from a Mac, accessing the Pi via a VNC client is easy as it comes with a ready made VNC client "Screen Sharing.app" located under /System/Library/CoreServices in OS X Mountain Lion.  Remember the user id/password combo is the one you specified when setting up the VNC server, and the password is a maximum of 8 characters long.  On accessing the Pi you'll need to enter the IP address you discovered before and the port number.  Note if you started the VNC server on the Pi using something like "vncserver :1", the port number you must access is 5900 plus the display number you started the vncserver with (1 in this example), along the lines of 10.1.1.2:5901.

On accessing the Pi from VNC for the first time I did hit the following error:

"GDBus.Error:org.freedesktop.PolicyKit1.Error.Failed: An authentication agent already exists for the given subject"

..and a fix is suggested here.
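For reference, the remote access setup boils down to something like the following sketch; tightvncserver is an assumption on my part as the VNC server package, so substitute whichever server the Adafruit instructions have you install.

# on the Pi (the default user on wheezy is "pi")
sudo apt-get install tightvncserver        # assumption: a commonly used VNC server package
vncserver :1 -geometry 1024x768 -depth 16  # display :1 listens on port 5901
# then from the Mac, open Screen Sharing.app and connect to vnc://10.1.1.2:5901
# or for a plain shell session: ssh pi@10.1.1.2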
It's also worthwhile installing the Chromium browser and an FTP server such as proftpd.

Setting up the ADF Environment

Once we have the basic Pi up and running, we're now ready to install the infrastructure and software required for ADF Essentials.  In brief this includes:

JDK 1.6 SE
Apache Tomcat 7
jsf-api.jar + jsf-impl.jar + glassfish.jstl_1.2.0.1.jar
Oracle ADF Essentials
Oracle ADF Component Demo

The following instructions are mostly derived from Raphael Rodrigues's blog with a couple of twists.

At the time of writing this article there was some "hoohaw" over Java security and a rapid release of JDKs from Oracle to patch security holes.  As such an ARM distribution of the JDK from Oracle wasn't yet available.  To give myself the chance of finishing this blog post I went with installing the OpenJDK.  Warning: I've no idea if the current distribution includes all the security patches necessary.  However somehow I don't think security is a primary concern for most Pi uses:

sudo apt-get install openjdk-6-jdk

Installing Tomcat is also a fairly painless task.  First visit the Tomcat 7 download page and determine which is the latest version.  At the time of writing this was 7.0.35 for me.  In addition locate the tar.gz link and note the URL.  This will automatically be pointing at a mirror close to your location.  Return to the Pi terminal, ensure you're in the home directory and execute the following command, but change the URL to suit the one presented in the previous step:

wget http://mirror.mel.bkb.net.au/pub/apache/tomcat/tomcat-7/v7.0.35/bin/apache-tomcat-7.0.35.tar.gz

Once downloaded, execute the following command to explode the tar to your home directory:

tar zxf apache-tomcat-7.0.35.tar.gz

You can then start and stop the Tomcat server by executing one of the following 2 commands after issuing the cd command:

cd ~/apache-tomcat-7.0.35/bin
sudo sh startup.sh
sudo sh shutdown.sh

Via the Pi desktop, if you open the Chromium browser you should be able to access the Tomcat console by going to http://localhost:8080, or remotely via the IP address such as http://10.1.1.2:8080.  Be patient for the server to start.  At this point shut down the Tomcat server.

Next we need to install the jsf-api.jar, jsf-impl.jar and glassfish.jstl_1.2.0.1.jar files into the TOMCAT_HOME/lib directory.  These are best borrowed from an existing JDev 11.1.2.3.0 installation on your local PC/Mac and ftp'ed across from JDEV_HOME/oracle_common/modules/oracle.jsf_2.0

Finally we're ready for some ADF setup!  We need to install the ADF Essentials runtime libraries into Tomcat.  Officially Oracle doesn't support Tomcat (at least currently) against ADF Essentials, but this doesn't mean ADF doesn't run on Tomcat.  As ADF Essentials requires you to accept the license conditions on Oracle's website before you download it, you cannot use wget to do this.  In addition attempting to download this via the Chromium browser isn't that easy as it chews so much CPU that often it times out before it has a chance to download any file.  As such the easiest solution seems to be to download the 11.1.2.3.0 ADF Essentials file on your PC/Mac and then ftp it across to your Pi.  While you're at the same site, also download the Oracle ADF Faces Components Demo 11.1.2.3.

Move the ADF Essentials zip file to TOMCAT_HOME/lib, and extract it using the following command:

unzip -j adf-essentials.zip

The -j option here flattens the directory structure within the zip such that all JARs included are placed in the Tomcat lib directory directly, not a subdirectory thereof.
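Pulling the server-side steps together, the Tomcat portion of the install looks roughly like the sketch below on the Pi.  It assumes the three JSF/JSTL jars and adf-essentials.zip have already been ftp'ed into the home directory, and the mirror URL and version are whatever the Tomcat download page gave you.

cd ~
wget http://mirror.mel.bkb.net.au/pub/apache/tomcat/tomcat-7/v7.0.35/bin/apache-tomcat-7.0.35.tar.gz
tar zxf apache-tomcat-7.0.35.tar.gz
# assumes the jars and adf-essentials.zip were already ftp'ed to the home directory as described above
cp ~/jsf-api.jar ~/jsf-impl.jar ~/glassfish.jstl_1.2.0.1.jar ~/apache-tomcat-7.0.35/lib/
mv ~/adf-essentials.zip ~/apache-tomcat-7.0.35/lib/
cd ~/apache-tomcat-7.0.35/lib && unzip -j adf-essentials.zip
cd ~/apache-tomcat-7.0.35/bin && sudo sh startup.sh   # then browse to http://10.1.1.2:8080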
The final part of the install is to deploy the ADF Component Demo.  There are a number of different deployment options to take, but the easiest is to place the associated .war file in the TOMCAT_HOME/webapps directory.  However the file is a rather large and chunky application of over 100MB of source code.  Tomcat's default behavior at runtime is to explode the war into a directory of the same name at startup, and given the resources of the Pi this is going to take a long time.  The best solution is to pre-explode the file for Tomcat, by making a directory, say TOMCAT_HOME/webapps/rcf-dvt-demo, copying the war into this new directory and executing unzip on it.  Remember to delete the war file once done.

One last fix.  From experimentation, Tomcat 7 on starting will attempt to scan all the source code for annotations to execute.  To stop it doing this we can modify the WEB-INF/web.xml file of the demo we just exploded, and add the attribute metadata-complete="true" to the <web-app> node.  As you're about to see, starting the Tomcat server with all the preconfigured ADF software is a *really* long process.

Drum Roll: Voila! ADF Essentials on the Raspberry Pi

So the big drum roll moment is to again start Tomcat and then access the application via a browser.  Just starting the Tomcat server is now going to consume a vast amount of CPU.  In fact 2984981 ms at 100% CPU on my Pi!  Phew.  What you can do while the server starts up is execute "tail -f" on the catalina.log file found under TOMCAT_HOME/logs.  Very very slowly you'll see the server start up.  You might note the server seems to start one or two apps that possibly aren't needed, and this could be something we could eliminate to speed the startup process later.

Finally, once up and running, open the browser and access http://10.1.1.2:8080/rcf-dvt-demo/faces/index.jspx.  Once again it will take some time to start and load, but eventually you'll see the component demo running.  If you attempt to open another session to the same page (say from your PC/Mac) you will see it's an order of magnitude faster.  But it does run!

"But Chris, it runs so slowly it's near useless!"  Ya, I did set expectations early on that it would run slowly, and you weren't seriously going to run an enterprise framework on a Raspberry Pi were you?  The goal here has been to see if it will just run, and it does.

"But Chris, what about ADF BC/EJB/my favorite part of ADF?"  Indeed, I didn't make any claims of exercising the entire ADF stack.  Though it would be fun to try.  The component demo is a good exercise in that it's a very large JSF application.  I'd love to see somebody try to get ADF BC running by seriously throttling back the AM and connection pool.  Could you even install MySQL on the same Pi to be fully contained?  Hmmm, sounds like a cool hacking opportunity!

"But Chris, did you try X/Y/Z to speed up deployment?"  Awesome, you've got some hacking ideas!  Let's start an ADF Pi community and take this further for the fun of it  :-)

In summary, the Raspberry Pi provides an awesome learning environment for kids to learn about computer science, but also a great playground for old fellas like me to see if we can get our favourite technologies to work on this mini computer.  I look forward to reading your Raspberry Pi ADF stories soon.


Australia AUSOUG Roadshow - ADF Mobile - April/May

It's a tad early to advertise this, but as I'm receiving a few enquiries about whether we're looking to run any ADF Mobile workshops in Australia anytime soon, I'd like to announce that yes indeed, we've a series of workshops planned with the Australian Oracle User Group (AUSOUG) in late April/May.

This 1 day ADF Mobile workshop will give you the opportunity to learn how to create on-device mobile applications that run on both iOS and Android.  The workshop will explain the unique challenges and common use cases of mobile applications and then dive into possible mobile architectures.  It will walk you through developing mobile applications that install and run on the device and are able to leverage device-specific features - all with the new Oracle ADF Mobile solution.  No previous knowledge of Oracle ADF is needed!

The schedule for the Aussie cities is as follows.  Follow the links to register separately for each city:

Brisbane - Monday 29th April 9am-5pm
Sydney - Tuesday 30th April 9am-5pm
Melbourne - Monday 13th May 9am-5pm
Adelaide - Tuesday 14th May 9am-5pm
Perth - Monday 20th May 9am-5pm

If you're in New Zealand we'll also be running a workshop in Wellington as part of the NZOUG conference.  The conference is on 18th/19th March, and the workshop follows on the 20th March.  We look forward to seeing you there.


ADF Naming and Project Layout Guidelines

In 2012 the Oracle ADF enablement product management team announced a new OTN website, the ADF Architecture Square, which includes the ADF Code Guidelines paper.  Before it was originally published this paper considered not only ADF best practices, but also suggestions for ADF naming conventions and guidelines for project layouts.  However it was quickly realized the document was going to become too large and unwieldy, so we split it.

Today the ADF PM team is happy to announce the publication of the resulting second document, the ADF Naming and Project Layout Guidelines.  Just like its sister publication, besides giving you guidelines on ADF naming conventions and project layouts, the document has been produced to allow new ADF development teams a shortcut in writing their own such document.  If you've already got your own, good for you, don't change.

Having been embroiled in polarized arguments with developers in the past about their preference for naming standards (I might have been one of the developers ;-), of the two documents I do expect this one to be the more contentious.  And indeed there were a few debates inside Oracle in producing this document about the use (or not!) of Hungarian notation in ADF.  Yet the thing to realize is that in many cases naming conventions are a personal preference.  And as the document clearly tries to state, if you don't like the guidelines, one size doesn't fit all, change them!

Like the previous document this paper has been designed to be a living document that will change and evolve over time.  We encourage customers to discuss these guidelines on the ADF EMG, and log any issues on the ADF EMG Issue Tracker so we can continue to improve them.


Making use of multiple independent ADF BC data sources

An uncommon but valid customer use case for ADF BC is to have two root AMs that point at two entirely different connections/data sources/JNDIs within the same Model project.  For example maybe you need to connect to both an Oracle and a SQL Server database to show data from two different database systems.  Or maybe you even need to connect to the same Oracle database but two different schemas.

Now one solution to this problem is to solve it at the database level.  This really is the preferred route as the database has some awesome tools for solving these sorts of issues such as database links, gateways and more.  Remember that the database and these tools have all sorts of support for tricky database issues such as distribution and two-phase commits, all the sorts of tricky issues that years of effort have gone into solving and which, to be blunt, a normal Java programming team won't be able to replicate.  However sometimes we're simply not going to be able to use those tools or we just want a simple ADF solution for getting data from two databases.

So how do we do this in ADF?  The solution is fairly simple, we need:

a) 2 separate database connections
b) 2 separate root AMs and associated VOs/EOs
c) For any page/fragment where we drop the VOs/EOs, the relating task flow *must* use the <No Controller Transaction> option

In considering each of these parts of our solution:

a) It's not immediately obvious, but we are free to use more than one database connection in our application as the following picture demonstrates.  When you have more than 1 database connection you must be mindful that via the Model project properties, the ADF Business Components options have a Connection poplist that defines the default database connection for the project to use at design time, as can be seen here.  It's up to you, when you run the various ADF BC wizards, to ensure you have the right database connection selected.

b) Once you've defined your two connections, you're now in a position to use the associated ADF BC wizards to create your various ADF BC components.  You will need to return and manage the default database connection at the Model -> ADF Business Components -> Connection project property to ensure you are connecting to the right database each time you run the wizards.  JDev won't give you any help here, so be careful!  The following picture shows the end results of creating a Model project, with a Regions EO/VO exposed via the HrAppModule connecting to the HR Oracle database schema, and another EO/VO Customers exposed via the OeAppModule connecting to the OE Oracle database schema.

Note there's an IDE bug 16032880 we need to be wary of here: double check the IDE hasn't introduced an error with the associated JNDI data sources for the AMs.  On each change to the Connection option at the Model ADF BC project properties level, it will override all the data sources of the root AMs with the one you just picked.  If you locate the bc4j.xcfg file that holds the JNDIs, check that the data sources for the associated AM configurations are correct.
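A quick way to eyeball this from the command line is to grep the file for its data source entries; the path below is only an example of where bc4j.xcfg typically lives under the Model project, so adjust it to your own package structure.

# example path only - bc4j.xcfg normally sits under the AM's package in a "common" subdirectory
grep "JDBCDataSource" Model/src/model/common/bc4j.xcfg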
For example for the two AMs, I expect to see the following <Custom JDBCDataSource> entries for each configuration:

HrAppModuleLocal <Custom JDBCDataSource="java:comp/env/jdbc/HrConnDS"/>
HrAppModuleShared <Custom JDBCDataSource="java:comp/env/jdbc/HrConnDS"/>
OeAppModuleLocal <Custom JDBCDataSource="java:comp/env/jdbc/OeConnDS"/>
OeAppModuleShared <Custom JDBCDataSource="java:comp/env/jdbc/OeConnDS"/>

If you do discover an error simply repair it in the file, save all, and the issue will go away.  Ensure to test in the Business Component Browser to make sure the AMs are configured correctly and you can see data from both VOs.

c) Having sorted out the Model layer you can now move to the ViewController layer.  Typically you'll start creating pages and fragments in your unbounded task flow (UTF) and bounded task flows (BTF) making use of the VOs you've just exposed through the separate AMs.  In doing so you must be aware that the transaction options you pick for your BTFs can change the connection behaviour of the AMs you have just set up.  Here are the 4 transaction options supported by bounded task flows.  Strictly speaking it's 3 options that are related, those being Always Begin New Transaction, Always Use Existing Transaction and Use Existing Transaction if Possible, plus the ability to turn this feature off, known as <No Controller Transaction>.  You can read more about these options here (see the Task Flow Transaction Fundamentals paper).

There's a back end feature built into the task flow transaction management where, if your application makes use of multiple ADF BC data controls mapping to separate root AMs, ADF BC will try to share connections amongst the root AMs at runtime.  While this may seem disastrous for what we're attempting to achieve here, this feature is very important for creating scalable applications where our architecture has forced us to create separate root AMs; typically we don't want one user to take out several database connections.

Luckily task flows still provide a solution for our use case that is counter to this described functionality.  Rather than picking one of the BTF transaction options, those being Always Begin New Transaction, Always Use Existing Transaction and Use Existing Transaction if Possible, instead pick the <No Controller Transaction> option.  The <No Controller Transaction> option doesn't override the connections of the root AMs and allows them to connect to any JNDI they define at runtime.  And that's all there really is to the solution.  You just need to make sure you have the right configurations for your root AMs, the Model project ADF BC connection, and the task flow transaction options.

What, you might ask, about the unbounded task flow?  It doesn't allow us to define any transaction options, so what do we do if we use our VOs in pages/fragments of the UTF?  For all intents and purposes you can treat the UTF as using the <No Controller Transaction> option.

What about the task flow Share Data Controls with Calling Task Flow option?  What do we set that to?  The previous Task Flow Transaction Fundamentals paper details that in full, but suffice to say, if you untick that option (known as an isolated data control scope) for the current BTF, a brand new instance of the current root AM and relating VOs/EOs will be instantiated for the current user for the life of the BTF.
If you select the check box (known as a shared data control scope), and an instance of the ADF BC AM is already being used in a previous BTF that called this BTF, then simply the ADF BC AM data control will be shared.  If it doesn't yet exist it will be created for the first time.

Sample App

I've provided a sample application for you to see this behaviour based on the scenario described above.  It requires access to both the HR and OE sample Oracle database schemas.  When you run the app via the Splash.jsf page and navigate to the CombinedTaskFlow, back in JDev open the database navigator and locate both tables in the separate schemas, and the associated records that are currently showing in the app.  Now in the app you can change and commit each VO and watch the changes independently saved to the corresponding records in the separate schemas in the database.  Note you can change and commit the records independently; committing one will not commit the other.

Troubleshooting

It's worthwhile having a look at how things break too.  If you open the CombinedTaskFlow.xml in JDeveloper, change the BTF transaction options to Always Begin New Transaction and rerun the app, you'll notice on arriving at the CombinedTaskFlow you'll receive a database error "ORA-00942: table or view does not exist".  As described previously, the other BTF transaction options combine/share the connection of the first root AM at runtime, in this case the HrConn.  As such, as soon as the CustomersView attempts to query the CUSTOMERS table in the database, because its root AM is now sharing the connection of the HR root AM it can't see the required tables.

Alternatively if you fix this change, return to the ADF Business Components Connection option under the Model project properties and switch the connection to OeConn, this will modify all the JNDI connections in the relating bc4j.xcfg file to point at the OE schema.  On rerunning your app you'll again see "ORA-00942: table or view does not exist".  Again in this case be cognizant of bug 16032880 described earlier and how to fix the problem.


Working with EO composition associations via ADF BC SDO web services

ADF Business Components support the ability to publish the underlying Application Modules (AMs) and View Objects (VOs) as web services through Service Data Objects (SDOs).  This blog post looks at a minor challenge to overcome when using SDOs and Entity Objects (EOs) that use a composition association.

Using the default ADF BC EO association behaviour

ADF BC components allow you to work with VOs that are based on EOs that are part of a parent-child composition association.  A composition association enforces that you cannot create records for the child outside the context of the parent.  As an example, when creating invoice-lines you want to enforce that the individual lines have a relating parent invoice record; it simply doesn't make sense to save invoice-lines without their parent invoice record.  The following screenshot using the ADF BC Tester demonstrates the correct way to create a child Employees record as part of a composition association with Departments.  And the next screenshot shows you the wrong way to create an Employee record.  Note the error which is enforced by the composition association: (oracle.jbo.InvalidOwnerException) JBO-25030: Detail entity Employees with row key null cannot find or invalidate its owning entity.

Working with composition associations via the SDO web services

Shay Shmeltzer recently recorded a good video which demonstrates how to expose your ADF Business Components through the SDO interface.  On exposing the VOs you get a choice of operations to publish including create, update, delete and more.  For example, through the SDO test interface we can see that the create operation will request the attributes for the VO exposed, in this case EmployeesView1.  In this specific case though, just like the ADF BC Tester, an attempt to create this record will fail with JBO-25030; the composition association is still enforced.  The correct way to do this is through the create operation on the DepartmentsView1, which also lets you create Employees records in the context of the parent, thus satisfying the composition association rule.

Yet at issue here is that the create operation will always create both the parent Departments and Employees records.  What do we do if we've already previously created the parent Departments record, and we just want to create additional Employees records for that Department?  The create operation of the EmployeesView1, as we saw previously, doesn't allow us to do that; the JBO-25030 error will be raised.  The solution is the "merge" operation on the parent Departments record.  In this case for the Departments record you just need to supply the DepartmentId of the Department you want the Employees record to be associated with, as well as the new Employees record.  When invoked, only the Employees record is created, and the supply of the DepartmentId of the Departments record satisfies the composition association without actually creating or updating the associated Departments record that already exists in the database.  Be warned however, if you supply any more attributes for the Departments record, it will result in a merge (update) of the associated Departments record too.


Yet another ADF book - Oracle ADF Real World Developer’s Guide

I'm happy to report that the number of published ADF books is expanding yet again, this time with Oracle's own Jobinesh Purushothaman publishing the Oracle ADF Real World Developer’s Guide.  I can remember the dim dark days when there was but one Oracle book besides the documentation, so today it's great to have what I think might be the 7th or 8th ADF book publicly available, and not to forget all our other technical docs too.  Jobinesh has even published some extra chapters online that will give you a good taste of what to expect.  If you're interested in positive reviews, the ADF EMG already has its first happy customer.  Now to see if I can get Oracle to expense me a copy.

-- Post edit: Kindly Packt Publishing supplied a copy giving me a chance to review the book.  My review comments follow, also published on Amazon: --

As part of my regular job I've read *every* book on ADF.  Given there are now several ADF beginners' books I thought the introductory ADF area was already amply covered, and I didn't know what Jobinesh's latest ADF text could add.  However I must admit I was pleasantly surprised, as even though this book obviously covers many beginner topics, it also covers topics the others haven't, making it another valuable addition to the ADF textbooks currently available.

Of particular interest to me was the introduction of:

1) Framework class diagrams - explaining for example how the ADF BC and binding layer classes relate to each other, therefore giving insight into what objects you actually work with programmatically when you drop to code in ADF

2) JSF Servlet, ADF Servlet & Filter lifecycle - a soup-to-nuts discussion starting at the configuration of the web.xml, the importance of the order of the filters, and the overall lifecycle of the framework

3) Application Module scenario diagrams - what objects and methods get called when an Application Module is instantiated, so not only do you get an abstract discussion of the AM lifecycle, but how it relates to actual code.

Don't get me wrong, Jobinesh covers all the usual beginner topics, ADF Business Components, task flows, ADF Faces and more, but the fact that he's focused on the Java and Java EE side of ADF brings two very useful discussions of the framework together, which I believe creates a good learning opportunity.  A recommended read for beginners.


ADF Code Guidelines

During Oracle Open World 2012 the ADF Product Management team announced a new OTN website, the ADF Architecture Square.  While OOW represents a great opportunity to let customers know about new and exciting developments, the problem with making announcements during OOW is that customers are bombarded with so many messages it's easy to miss something important.  So in this blog post I'd like to highlight that as part of the ADF Architecture Square website, one of the initial core offerings is a new document entitled ADF Code Guidelines.

Now the title of this document should hopefully make it obvious what the document contains, but what's the purpose of the document, why did Oracle create it?

Personally, having worked as an ADF consultant before joining Oracle, one thing I noted amongst ADF customers who had successfully deployed production systems was that they all approached software development in a professional and engineered way, and all of these customers had their own guideline documents on ADF best practices, conventions and recommendations.  These documents, designed to be consumed by their own staff to ensure ADF applications were "built right", typically sourced their guidelines from their team's own expert learnings and the huge amount of ADF technical collateral that is publicly available.  Maybe from manuals and whitepapers, presentations and blog posts, some written by Oracle and some written by independent sources.

Now this is all good and well for the teams that have gone through this effort, gathering all the information and putting it into structured documents, kudos to them.  But for new customers who want to break into the ADF space, who have project pressures to deliver ADF solutions without necessarily working on assembling best practices, creating such a document is understandably (regrettably?) a low priority.  So in recognising this hurdle, at Oracle we've devised the ADF Code Guidelines.  This document sets out ADF code guidelines, practices and conventions for applications built using ADF Business Components and ADF Faces Rich Client (release 11g and greater).  The guidelines are summarized from a number of Oracle documents and other 3rd party collateral, with the goal of giving developers and development teams a short circuit on producing their own best practices collateral.

The document is not a final product, but a living document that will be extended to cover new information as discovered or as the ADF framework changes.  Readers are encouraged to discuss the guidelines on the ADF EMG and provide constructive feedback to me (Chris Muir) via the ADF EMG Issue Tracker.  We hope you'll find the ADF Code Guidelines useful and look forward to providing updates in the near future.

Image courtesy of paytai / FreeDigitalPhotos.net


Another big year for the ADF EMG at OOW12

Oracle Open World 2012 has only just started, but in one way it's just finished!  All the ADF EMG's OOW content is over for another year!

The unique highlight this year for me was the first ever ADF EMG social night held on Saturday, where I finally had the chance to meet so many ADF community members who I've known over the internet, but never met in person.  What?  You didn't get an invite?  Oh well, better luck next year ;-)  Seriously, our budget was limited, so in the happy-dictatorship sort of way I had to limit RSVPs to just 40 people.  Hopefully next year we can do something bigger and better for the wider community.

Following directly on from the Saturday social night, the ADF EMG ran a full day of sessions at the user group Sunday.  I won't go over the content again, but to say thank you very much to all our presenters and helpers, including Gert Poel, Pieter Gillis, Aino Andriessen, Simon Haslam, Ken Mizuta, Lucas Jellema and the FMW roadshow team, Ronald van Luttikhuizen, Guido Schmutz, Luc Bors and Lonneke Dikmans.

Also special thanks must go to Doug Cockroft and Bambi Price for their time and effort in organizing the ADF EMG room behind the scenes via the APOUC.  To be blunt, Doug and Bambi really do deserve serious thanks because they had to wear a lot of Oracle politics behind the scenes to get the rooms organized (oh, and deal with me fretting too! ;-).

Finally thanks to all the members and OOW delegates for turning up and supporting the group on the day.  In the end the ADF EMG exists for you, and I hope you found it worthwhile.  Onto 2013 (oh, and the rest of OOW12 ;-)


The Year After the Year of the ADF Developer - the ADF EMG at OOW 2012

I'm happy to announce that the ADF EMG will be continuing on from its success in 2011, and will be running "The Year After the Year of the ADF Developer" at Oracle Open World 2012.  On the user group Sunday, 30th of September, the ADF EMG has a full day of sessions for anybody interested in ADF.  The collective sessions are designed to have something for everyone: ADF beginners, ADF experts, all.  All sessions will be held in Moscone South room 305.

To start out with, for OOW attendees coming from a Forms background, Gert Poel and Pieter Gillis from iAdvise will give us the lowdown on ADF for Forms programmers.  This is a very important presentation for the beginners in the ADF community who are coming from a Forms background:

1) UGF3783 - Oracle ADF Immersion: How an Oracle Forms Developer Immersed Himself in the Oracle ADF World - 9am-10am Moscone South room 305

At the other end of the spectrum, for EMG members who are looking to expand their ADF skills beyond the basics, Aino Andriessen from AMIS will be looking at using Hudson for building ADF applications.  Surprisingly, via the EMG new member's survey around 25% of new members have no idea about CI tools, so I think Aino's presentation is a great addition to the ADF EMG line up:

2) UGF4945 - Deploy with Joy: Using Hudson to Build and Deploy Your Oracle ADF Applications - 10:15am-11:15am Moscone South room 305

The 3rd presentation is one ADF EMG members have been asking for for such a long time:

3) UGF10463 - A Peek into the Oracle ADF Architecture of Oracle Fusion Applications - 11:30am-12:30pm Moscone South room 305

In this presentation Simon Haslam will be discussing the actual Fusion Apps "ADF" architecture.  In other words, forget the high level "yes ADF was used to build Fusion Apps" bullet points, Simon is going for a deep dive into the nitty gritty details of how ADF was used to build Fusion Apps.  For ADF EMG members, remember all those times you posted to the EMG wishing to know more details about how ADF was used in Fusion Apps?  This is the session for you to learn and bring your own questions.

But the fun doesn't stop here.  The final presentation is a multi-slot presentation, where a team of FMW programmers, including ADF programmers, SOA programmers and more, will build an end-to-end application, live in front of your very eyes:

4) UGF10464 - Oracle Fusion Middleware Live Application Development Demo - 12:45-3:45pm Moscone South room 305

Why this presentation rocks is that rather than a single presentation on ADF here, then a separate presentation on SOA there, the goal of this presentation is to bring it all together so you can see an end-to-end Fusion Middleware application being built at once.  I've seen this before, this is a great session, and I highly recommend it.

I hope you'll take the opportunity to attend and support the ADF EMG this year; I'm especially keen to see new faces and meet old friends and to continue supporting its members.  Of course note rooms and times may change, so ensure to check the schedule builder closer to the event.  See you at Oracle Open World!


Meet ojdeploy's big brother ojserver

Just over a year ago Oracle released the 11gR2 branch of JDeveloper starting at version 11.1.2.0.0.  The primary reason for that release was giving customers JSF 2.0 support in ADF, though our 11gR1 branch remains for various reasons.  While JSF 2.0 is the primary reason to check out 11gR2, there are some other minor benefits including that of ojserver, which is ojdeploy's big brother.

I first became aware of ojserver when Oracle ACE John Stegeman mentioned it when he spotted it in 11.1.2.0.0; it's come up on the ADF EMG a number of times, and I've been curious about it ever since.  Part of that curiosity is piqued by the fact that ojdeploy is one of Oracle's, let's say, least loved products.  I'm not utterly convinced by the naysayers' arguments about ojdeploy, but putting that aside, it does leave us wondering what ojserver does and what problem it attempts to rectify for ojdeploy.

Why ojserver was invented was to address an issue with the Fusion Applications build.  To date the statistics for the size and number of libraries in Fusion Apps are rather impressive.  And it is ojdeploy's task to build all those libraries and the resulting application.  At one stage it was observed the Fusion Apps build times were becoming rather long, and with a bit of analysis it was determined that the starting and stopping of ojdeploy for each build component was a large time sink.

Why was this?  ojdeploy is ultimately a "headless" JDeveloper which requires its own JVM to start, run and stop.  As you can appreciate, that lifecycle takes time.  If you call it seven hundred times, that's seven hundred times ojdeploy needs to start, run and stop.  There's not much we can do about the run bit, that's when ojdeploy is actually doing its real job, but is there anything Oracle could do to fix the start and stop cycle?

Enter ojdeploy's big brother ojserver.

ojserver is essentially a server version of ojdeploy, in the sense that once started it stays alive and can be asked to, well, serve stuff ;-)  As such what you can ask ojdeploy to do is, rather than build each library itself, just hand the request off to ojserver instead, which has already been started.  Brilliant.  We no longer have to start and stop the ojdeploy process for each build item, we just start it once at the beginning, and stop it once at the end.

To start ojserver you simply execute the following from the jdeveloper/jdev/bin directory:

ojserver -start

This will start the service on localhost port 2010 by default.  You can override this by specifying the address after the -start flag.  On starting the server you will eventually see:

INFO: Server ready.

We're now ready to rock n roll with ojdeploy.  The following shows you an example of how to call ojserver from ojdeploy from a separate command line, note the additional -ojserver and -address flags:

ojdeploy -workspace /Users/chris/jdev/mywork/Demo/Demo.jws -project ViewController -profile DemoVCProfile -ojserver -address localhost:2010

In the context of ojdeploy you will not see much activity in the logs.  Rather all activity including build output, errors and more will come from the ojserver logs.  One thing to keep in mind is that if you're calling ojserver from ojdeploy remotely, then for the given paths such as the ojdeploy -workspace flag and more, ojserver must have access to those paths and the source code.
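As a quick recap of the moving parts, driving the two commands from a shell might look roughly like the sketch below; the workspace, project and profile names are just the examples from above, so substitute your own.

# terminal 1: start the build server once (defaults to localhost:2010)
cd <jdeveloper home>/jdev/bin
./ojserver -start

# terminal 2: fire off as many builds as you like against the running server
./ojdeploy -workspace /Users/chris/jdev/mywork/Demo/Demo.jws \
           -project ViewController -profile DemoVCProfile \
           -ojserver -address localhost:2010
# the server keeps running between calls, so each ojdeploy invocation skips the JVM start/stop cost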
Remember ojserver is just a server version of ojdeploy; there's no magic copying of files between ojserver and ojdeploy, so in terms of building applications and the files it needs, it's the same as ojdeploy.

Note there is currently one known limitation with using ojserver and ojdeploy via Ant (as separate from the command line call above).  At the moment the OJDeployAnt taskDef that you define in Ant to call ojdeploy does not support parameters for calling ojserver.  ER 14464838 has been raised to address this limitation.

Reference: ojserver documentation

--Errata--

In addressing the last comment about OJDeployAnt, there is a known workaround to add <arg> elements to the taskDef as follows:

<ora:ojdeploy xmlns:ora="oraclelib:OJDeployAntTask"
     executable="${oracle.jdev.ojdeploy.path}"
     ora:buildscript="${oracle.jdev.deploy.dir}/ojdeploy-build.xml"
     ora:statuslog="${oracle.jdev.deploy.dir}/ojdeploy-statuslog.xml">
  <arg value="-ojserver"/>
  <arg value="-address "/>
  <arg value="localhost:2010"/>
  <ora:deploy>
    <ora:parameter name="workspace" value="${oracle.jdev.workspace.path}"/>
    <ora:parameter name="profile" value="${oracle.jdev.deploy.profile.name}"/>
    <ora:parameter name="nocompile" value="false"/>
    <ora:parameter name="outputfile" value="${oracle.jdev.deploy.outputfile}"/>
  </ora:deploy>
</ora:ojdeploy>

In the above example I've truncated some of the Ant property names using "jdev" rather than "jdeveloper" so the code will fit in the width of the blog.  Ensure to double check these against your own Ant property names.


New recording on using JMeter to test ADF applications

Before joining Oracle I maintained an older ADF blog where I covered using Apache JMeter to load test ADF.  That post has been picked up by a number of people over the years and it's nice to see it was useful.

Unfortunately one of the problems in using JMeter to test ADF is there's an extreme amount of fussy configuration to get right.  As a result, to this day I continue to get hit with questions - why don't our tests work?  From my own investigation, 99% of the time it's a configuration error on the developer's part.  Like I said, there's lots of fussy configuration you must get exactly right, otherwise ADF gets confused by the messed up HTTP requests it receives from JMeter (more rightly ADF says the user session has expired, which is just ADF's way of saying it doesn't know who the current session is because the ADF HTTP state parameters JMeter is sending to the ADF server are not what it expected).

While the original blog post was useful in teaching people the technique of using JMeter, it really could do with a recorded demonstration to show all the steps involved in a live test.  Lucky for you, as I'm now an ADF product manager with far too much time on my hands, I've taken time out to record such a demo as part of our ever expanding ADF Insider series.

At the conclusion of the demo you may decide it all sounds like too much effort.  Without a doubt this is why you should look at Oracle's Application Testing Suite (OATS).  OATS has ADF intelligence built in, there's far less fussy configuration required, so you can focus on the job of testing rather than configuring the test tool.  I hope to publish some demos on using OATS soon.

One final caveat: I don't expect the existing JMeter configurations to survive for every future version of ADF.  So if you do find your old JMeter tests stop working on adopting a future ADF version, time to look under the covers, discover how we need to change the JMeter tests, and most importantly please share your knowledge by blogging about it or posting it on the ADF EMG and leaving a comment here for people to find.

Post edit 12th March 2013: Jan Vervecken has provided a very useful update for JMeter, check out the following OTN forums post.

Post edit 2nd September 2013: Ray Tindall has provided the following updates for using this under 11.1.1.7.0.  The following changes need to be made to the JMeter solution.  Previously under 11.1.1.6.0 afrLoop was extracted from:

query = query.replace(/_afrLoop=[^&]*/,"_afrLoop=21441675777790");

or

query = query += "_afrLoop=21441675777790";

Under 11.1.1.7.0 it should be extracted from:

query = _addParam(query, "_afrLoop", "21137373554065");

As such afrLoop should now be extracted using:

_afrLoop", "([-_0-9A-Za-z]{13,16})

Thanks to both Jan and Ray for these updates.


Perth reveals itself as an ADF hotspot - ADF Community Event

I don't know if you've ever visited Perth, but it's a loooooong way from anywhere.  As a result sandgropers often feel like we're left out of the rest of the world's excitement (which isn't a bad thing sometimes either).  In the IT industry this is as true as anywhere else.  We read all about those exciting Silicon Valley conferences and huge European technical user groups, and look locally to find, well, with a population of only 1.2 million and the next major city 2500kms away, IT events are on a much smaller scale.

So I'm happy to announce that regardless of the tyranny of distance, Perth proved itself a little ADF hotspot yesterday.  Yesterday marked the first ADF Community Event trial in Perth, opened to only 4 of our local customers to test whether this would work locally, and I must say the event was a success!

The ADF Community Events are designed not to be Oracle sales events, but rather gatherings of parties who are interested in ADF to discuss, collaborate and network, while learning more about what ADF and other FMW products have to offer (think: SIG).  All in all this is an idea we "borrowed" from Frank Nimphius and Germany, who are running their own successful event series.  Over 3 hours we covered discussing how to build large scale ADF applications, as well as what I thought was an excellent hands-on session by Tim Middleton on integrating Coherence with ADF (Tim, I finally get Coherence, thanks!).

So how do I know the event was a success?  Well firstly Oracle staff were trying to push their way in too, so I had an overly full room.  Secondly I've already lined up our customer speakers for the next event and they volunteered themselves without (much) prompting! ;-)

The next event is tentatively scheduled for Wednesday 12th September.  I'm deliberately controlling the invites, but if you're desperate to attend please email me at chris DOT muir AT oracle DOT com.  Thanks to everyone who attended yesterday and I look forward to seeing everyone at the next event.


An invitation to join a JDeveloper and ADF productivity clinic (and more!) at KScope

Would you like a chance to influence Oracle's decisions on tool usability and productivity?  If you're attending ODTUG's Kaleidoscope conference this year in San Antonio, Oracle would like to invite you to participate in our Usability Activity Research and, separately, our JDeveloper and ADF Productivity Clinics with our experienced user experience teams.  The teams are keen to hear what you have to say about your experiences with our tools in general and specifically JDeveloper and ADF.  The details of each event are described below.

Invitation to Usability Activity - Sunday June 24th to Wednesday June 27th

Oracle is constantly working on new tools and new features for developers, and invites YOU to become a key part of the process!  As a special addition to Kscope 12, Oracle will be conducting onsite usability research in the Alyssum room, from Sunday June 24 to Wednesday June 27.  Usability activities are scheduled ahead of time for participants' convenience.  If you would like to take part, please fill out this form to let us know of the session(s) that you would like to attend and your development experience.  You will be emailed with your scheduled session before the start of the conference.

JDeveloper and ADF Productivity Clinic - Thursday June 28th

Are you concerned that Java, Oracle ADF or JDeveloper is difficult?  Is JDeveloper making you jump through hoops?  Do you hate a particular dialog or feature of JDeveloper?  Well, come and get things off your chest!  Oracle is hosting a product management and user experience clinic where we want to hear about your issues and concerns.  What's difficult to use?  What doesn't work the way you want, and how would you want it to work?  What isn't behaving like your current favorite tool?  If we can't help you on the spot, we'll take your feedback and use it to improve the product experience.  A great opportunity to get answers, or get improvements.  Drop by the Alyssum room, anytime from 8:30 to 10:30 on Thursday, June 28.

We look forward to seeing you at KScope soon!


Page based Prematurely Terminating Task Flow scenario

In a previous blog post I highlighted the issue of ADF Prematurely Terminating Task Flows, essentially where ADF page fragment based task flows embedded in regions can be terminated early by their enclosing page, causing some interesting side effects.  In that post I concluded the behavior was restricted to task flows embedded in regions, and to be honest, besides a log out/timeout scenario, I thought this issue could only occur in regions. While reading our documentation on CLIENT_STATE_MAX_TOKENS and browser back button support I realized there is indeed another prematurely terminating task flow scenario, this time for page-based task flows rather than fragment-based task flows, which we'll describe here.  For anyone who hasn't read the previous blog post, I suggest you read it before reading this post as it won't make much sense otherwise. Let's describe the application we'll use to demonstrate the scenario: 1) First it contains an unbounded task flow which includes a ViewCountries.jspx page to show data from the Countries table, followed by a call to a countries-task-flow.xml. 2) The ViewCountries.jspx page contains a read-only af:table showing countries data and the ability to select a record, an edit button to navigate to the countries-task-flow, and finally a plain old submit button. 3) The countries-task-flow includes an EditCountries.jspx and an exit Task Flow Return Commit activity. Note the countries-task-flow transaction options are set to Always Begin New Transaction and a shared Data Control Scope: 4) Finally the EditCountries.jspx page includes an editable af:form for the countries data, and a button to exit the task flow via the Task Flow Return Commit activity. Similar to the last blog post we'll use ADFLoggers on the underlying Application Module to show what's happening under the hood. On running the application and accessing the ViewCountries.jspx page we see the Application Module initialized in the logs: <AppModuleImpl> <create> AppModuleImpl created as ROOT AM <AppModuleImpl> <prepareSession> AppModuleImpl prepareSession() called as ROOT AM We'll pick the Brazil record.... ....then the edit button which navigates us to the EditCountries.jspx page within the countries-task-flow.  Note the Brazil record is showing as the current row as the countries-task-flow is using a shared data control scope: Now if we use the browser back button to return to the previous page, we see something interesting in the logs as soon as we click the button..... <AppModuleImpl> <beforeRollback> AppModuleImpl beforeRollback() called as ROOT AM ....and because of the rollback note that the current row has reset to the first row: As promised this is another prematurely terminating task flow scenario, this time with pages rather than fragments.  As we can see, the framework on detecting the back button press terminates the task flow's transaction by automatically issuing the rollback. You can download the sample application from here.
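For readers who haven't seen where these settings live, the following is a minimal sketch of what the countries-task-flow.xml definition might contain once Always Begin New Transaction and a shared data control scope are selected. The activity names are illustrative only, and the exact metadata JDeveloper generates (including id attributes and the commit option on the return activity) may differ slightly, so treat this as indicative rather than definitive:

<task-flow-definition id="countries-task-flow">
  <default-activity>EditCountries</default-activity>
  <!-- Always Begin New Transaction -->
  <transaction>
    <new-transaction/>
  </transaction>
  <!-- Shared data control scope: the task flow joins the caller's data control frame -->
  <data-control-scope>
    <shared/>
  </data-control-scope>
  <view id="EditCountries">
    <page>/EditCountries.jspx</page>
  </view>
  <!-- Task Flow Return activity; its End Transaction option is set to commit
       via the Property Inspector -->
  <task-flow-return id="commitAndExit">
    <outcome>
      <name>commitAndExit</name>
    </outcome>
  </task-flow-return>
</task-flow-definition>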


Which JDeveloper is right for me?

Developers downloading JDeveloper will notice that there are two "current" releases to download, 11g Release 1 and 11g Release 2 (abbreviated to 11gR1 and 11gR2 respectively).  11gR1 encompasses the 11.1.1.X.0 JDeveloper versions including the latest 11.1.1.6.0 release.  11gR2 encompasses the 11.1.2.X JDeveloper versions including the latest 11.1.2.2.0 release. What's the difference between the two releases and when would you want to use them? JDeveloper 11g Release 2 includes support for JavaServer Faces 2.0 and was released for customers who are specifically interested in using this contemporary Java EE technology.  Oracle plans to bring full Java EE 6 support, of which JSF 2.0 is a part, to JDeveloper 12c, but in listening to customers there was interest in obtaining the JSF 2.0 support earlier.  Thus the 11gR2 release. The question then becomes: why would you want 11gR1 if 11gR2 includes the latest Java EE JSF standards?  Surely 11gR1 only supports the older JSF 1.2?  The answer revolves around JDeveloper's Fusion Middleware (FMW) support.  Only 11gR1 and the yet-to-be-released 12c versions of JDeveloper will support the full FMW tools including WebCenter, SOA Suite and so on. So if you want the latest JSF 2.0 support go 11gR2, but if you're happy with 11gR1 or need the rest of the FMW stack, stay on the 11gR1 platform for now as Oracle is continuing to actively improve it.  Eventually JDeveloper 12c will arrive, where the 11gR1 and 11gR2 releases will converge, and your choice will again be a simple one.


ADF UI Shell update

Developers who use the ADF UI Shell (aka Dynamic Tab Shell) will be interested to know it now has support for multi browser tabs.  What does multi browser tab support mean? As separate to the dynamic tab feature provided by the ADF UI Shell, contemporary browsers give the user the ability to open multiple tabs within the browser. Each browser tab can view different URLs, allowing the user to browse different websites simultaneously, or even the same website multiple times.  There are effectively two ways you can currently be using the ADF UI Shell: either you're using the version coupled with JDeveloper, selected through the New Page dialog's Dynamic Tab Shell template, or you've downloaded the source code via the ADF UI Shell patterns page. If you're using the former option, note that the multi browser tab support within the Shell became available in JDeveloper 11.1.1.6.0 (patchset 5).  If you want to make use of this support you will need to consider adding the context parameter USE_PAGEFLOW_TAB_TRACKING to your web.xml to turn on the multi browser tab support in the shell.  By default the Shell will not turn this on, for backwards compatibility reasons. Alternatively if you're using the ADF UI Shell source code as downloaded via the original pattern web page, you will not only need to configure this new parameter, but also download the source code (via the zip in the link above) and modify your local copy of the template too.  For reference the only code change has been to the TabContext.java class. Note while this will make the ADF UI Shell ready for multi browser tab support, it does not mean your entire application can suddenly support multi browser tabs. You need to have taken special care in determining your application's bean scopes as detailed in one of my old blogs.
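As a rough guide, turning the tracking on amounts to a context parameter in web.xml. The parameter name is taken from the post above; the value of true is my assumption of how you would enable it, so double check it against the UI Shell documentation shipped with your JDeveloper version:

<!-- Hypothetical web.xml fragment: opt in to the UI Shell's multi browser tab tracking -->
<context-param>
  <param-name>USE_PAGEFLOW_TAB_TRACKING</param-name>
  <param-value>true</param-value>
</context-param>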


ADF Prematurely Terminated Task Flows

In this post I'll describe some interesting side effects on task flow transactions if a task flow terminates/finalizes earlier than expected.  To demonstrate this we'll use the following application PrematurelyTerminatingTaskFlows.zip built in JDev 11.1.1.6.0. The app based on the Oracle HR schema renders a single page: Before describing the prematurely terminating task flow behavior let's describe the characteristics of the application first: 1) The app makes use of 2 independent view objects DepartmentsView and EmployeesView. 2) The overall page is Main.jspx which has an embedded region calling the departments-task-flow, which itself has another embedded region calling the employees-task-flow.  The departments task flow has the editable departments form and navigation buttons to walk the departments, the employees task flow the table of related employees for the department. 3) As the user navigates between departments records, the department ID is passed to the employees task flow which calls an ExecuteWithParams activity then displays the resulting employees.  The employees task flow binding has its refresh = ifNeeded and the associated region has partialTriggers on the navigation buttons, ensuring the employees task flow is updated as the user navigates the departments using the supplied buttons. 4) Of particular interest, the departments-task-flow is using the Always Begin New Transaction task flow transaction behaviour and has an isolated data control scope: And the employees-task-flow is using Use Existing Transaction if Possible and a shared data control scope: If you run this application and navigate amongst the departments using the navigation buttons, the application works as expected.  Both the departments and employees records move onto the next departments ID after each button click. In the application I've also added some ADFLoggers which help capture the current behaviour:
<TaskFlowBean> <taskFlowInit> Task flow /WEB-INF/departments-task-flow.xml#departments-task-flow initialized
<AppModuleImpl> <create> AppModuleImpl created as ROOT AM
<AppModuleImpl> <prepareSession> AppModuleImpl prepareSession() called as ROOT AM
<AppModuleImpl> <create> AppModuleImpl created as NESTED AM under AppModule
<TaskFlowBean> <taskFlowInit> Task flow /WEB-INF/employees-task-flow.xml#employees-task-flow initialized
<TaskFlowBean> <taskFlowFinalizer> Task flow /WEB-INF/employees-task-flow.xml#employees-task-flow finalized
<TaskFlowBean> <taskFlowInit> Task flow /WEB-INF/employees-task-flow.xml#employees-task-flow initialized
As the page first renders we can see the departments task flow initialized, then the associated application module created and prepared to serve the data for the DepartmentsView.  Subsequently we can see the employees task flow initialized.  We don't see a new application module as the employees task flow is sharing the data control. From here each time we step onto another departments record, we'll see the employees task flow finalizer called, then the employees task flow initializer called.  This occurs because the ifNeeded refresh property on the employees task flow is restarting the task flow each time the department ID is changed. This restarting of the task flow is what I coin the "premature termination" of the task flow.  Essentially the calling parent has forced the framework to terminate the employees task flow, rather than the task flow gracefully exiting via a task flow return activity. At the moment though, this is still a "So what?" scenario.  Why do we care?
Everything appears to work? Let's change the setup slightly to demonstrate something unexpected.  Return to the application and set the departments task flow transaction option to <No Controller Transaction> (and leave the data control scope option = isolated/unselected): Rerun the application.  Note now when it runs and we press one of the navigation buttons, besides a screen refresh nothing happens.  We don't walk onto a new departments record in the departments task flow, and we don't see the associated employees for the expected new department.  The application seems stuck on the first department. A clue to what's going on occurs in the logs:
<TaskFlowBean> <taskFlowInit> Task flow /WEB-INF/departments-task-flow.xml#departments-task-flow initialized
<AppModuleImpl> <create> AppModuleImpl created as ROOT AM
<AppModuleImpl> <prepareSession> AppModuleImpl prepareSession() called as ROOT AM
<TaskFlowBean> <taskFlowInit> Task flow /WEB-INF/employees-task-flow.xml#employees-task-flow initialized
<TaskFlowBean> <taskFlowFinalizer> Task flow /WEB-INF/employees-task-flow.xml#employees-task-flow finalized
<AppModuleImpl> <beforeRollback> AppModuleImpl beforeRollback() called as ROOT AM
<TaskFlowBean> <taskFlowInit> Task flow /WEB-INF/employees-task-flow.xml#employees-task-flow initialized
Notice in between the last employees task flow finalizer/initializer pair we see the application module has performed a rollback.  This partially explains the behaviour we're seeing.  When a rollback is issued, it resets the current row indicators for all view objects attached to the application module.  This is why we can't move onto another record. But why is the rollback called in this scenario? The answer is wrapped around the concept of task flow transactions and the associated data control frame. In the first scenario the departments task flow initiated the task flow transaction and associated data control frame.  In turn the employees task flow joined the departments transaction and data control frame.  Only the initiator of a task flow transaction can commit/rollback the overall transaction associated with the data control frame.  In the case where the employees task flow is prematurely terminated, as it is a secondary citizen in the overall task flow transaction, the framework leaves it to the initiator of the task flow transaction to tidy up the transaction. No automatic rollback occurs on the work done by the secondary task flow. In the second scenario the departments task flow is not initiating the task flow transaction as it's chosen the <No Controller Transaction> option.  Instead the employees task flow initiates the transaction, because when the Use Existing Transaction if Possible option finds no transaction open it defaults to the equivalent of Always Begin New Transaction behaviour. Remembering that the initiator of a task flow transaction can commit/rollback the overall transaction, the framework automatically rolls back the employees task flow's transaction on its premature termination, and this is the cause of the behaviour we're seeing.  Even though the departments task flow is using <No Controller Transaction>, this doesn't mean the underlying view object doesn't participate in a transaction, it just doesn't participate in a task flow transaction (which is an abstraction sitting above the data control transactions).  As the two task flows share data controls there is only a single application module shared by both task flows, so a rollback from one task flow results in a rollback for both. The solution?
Either revert to the original settings where the departments task flow uses Always Begin New Transaction, or alternatively use an isolated data control scope for the employees task flow.
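To make the two alternative fixes concrete, here's a rough sketch of the relevant task flow metadata. Treat it as indicative only: the element names follow the ADF controller schema, but the id attributes and surrounding activities JDeveloper generates are omitted.

<!-- Fix 1: in departments-task-flow.xml restore Always Begin New Transaction -->
<transaction>
  <new-transaction/>
</transaction>

<!-- Fix 2 (alternative): in employees-task-flow.xml isolate the data control scope,
     so the employees task flow no longer shares the departments data control frame -->
<data-control-scope>
  <isolated/>
</data-control-scope>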


Oracle JDeveloper 11gR2 Cookbook book review

I recently received a free copy of Oracle JDeveloper 11gR2 Cookbook published by Packt Publishing for review. Readers of technical cookbooks would know this genre of text includes problems that developers will hit and the prescribed solutions, in this case for Oracle's Application Development Framework (ADF).  Books like this excel through comprehensive coverage, a logical progression of solutions throughout the book, and a readable narrative around the numerous steps and code. This book progresses well through ADF application assembly, ADF Business Components, the view layer, security, deployment and tuning.  Each recipe had a clear introduction and I especially enjoyed the "There's more" follow up sections for some recipes that lead the reader onto related ideas and issues they really need to be aware of. Also worthy of comment, having worked with ADF for over 5 years there certainly were recipes and solutions I hadn't encountered before; this book gets bonus points for that. As a reviewer what negatives can I give this text? The book has cast its net too wide by trying to cover "everything from design and construction, to deployment, testing, debugging and optimization."  ADF is such a large and sophisticated technology that this book with 100 recipes barely scrapes the surface.  Don't expect all your ADF problems to be solved here. In turn there is inconsistency in the level of problems and solutions.  I felt at the beginning the book was pitching itself at advanced problems to solve (that's great for me), but then it introduces topics like building a static View Object or train.  These topics in my opinion are fairly simple and are covered by the Oracle documentation just as well; they shouldn't have been included here.  In conclusion, ADF beginners will find this book worthwhile as it will open their eyes to the wider problems and solutions required for ADF, and experts will value the fact they can point junior programmers at the book for certain problems and say "get on with it". Is there scope for more ADF tomes like this?  Yes!  I'd love to see a cookbook specializing in ADF Business Components (hint hint to budding authors).


Solution for developers wanting to run a standalone WLS 10.3.6 server against JDev 11.1.1.6.0

In my previous post I discussed how to install the 11.1.1.6.0 ADF Runtimes into a standalone WLS 10.3.6 server by using the ADF Runtime installer, not the JDeveloper installer.  Yet there's still a problem for developers here because JDeveloper 11.1.1.6.0 comes coupled with a WLS 10.3.5 server.  What if you want to develop, deploy and test with a 10.3.6 server?  Have we lost the ability to integrate the IDE and the WLS server where we can run and stop the server, deploy our apps automatically to the server and more? JDeveloper actually solved this issue sometime back, but not many people will have recognized the feature for what it does as it wasn't needed until now. Via the Application Server Navigator you can create 2 types of connections, one to a remote "standalone WLS" and another to an "integrated WLS".  It's this second option that is useful, because what we can do is install a local standalone WLS 10.3.6 server on our developer PC, then create a separate "integrated WLS" connection to the standalone server.  Then by accessing your Application's properties through the Application menu -> Application Properties -> Run -> Bind to Integration Application Server option we can choose the newly created WLS server connection to work with our application. In this way JDeveloper will now treat the new server as if it was the integrated WLS.  It will start when we run and deploy our applications, terminate on request and so on.  Of course don't forget you still need to install the ADF Runtimes for the server to be able to work with ADF applications. Note there is bug 13917844 lurking in the Application Server Navigator for at least JDev 11.1.1.6.0 and earlier.  If you right click the new connection and select "Start Server Instance" it will often start one of the other existing connections instead (typically the original IntegratedWebLogicServer connection).  If you want to manually start the server you can bypass this by using the Run menu -> Start Server Instance option which works correctly.


The case of the phantom ADF developer (and other yarns)

A few years of ADF experience means I see common mistakes made by different developers, some I regularly make myself.  This post is designed to assist beginners to Oracle JDeveloper Application Development Framework (ADF) avoid a common ADF pitfall, the case of the phantom ADF developer [add Scooby-Doo music here]. ADF Business Components - triggers, default table values and instead of views. Oracle's JDeveloper tutorials help with the A-B-Cs of ADF development, typically built on the nice 'n safe demo schema provided with the Oracle database, such as the HR demo schema. However it's not too long until ADF beginners, having built up some confidence from learning with the tutorials and vanilla demo schemas, start building ADF Business Components based upon their own existing database schema objects.  This is where unexpected problems can sneak in. The crime Developers may encounter a surprising error at runtime when editing a record they just created or updated and committed to the database, based on their own existing tables, namely the error: JBO-25014: Another user has changed the row with primary key oracle.jbo.Key[x] ...where X is the primary key value of the row at hand.  In a production environment with multiple users this error may be legit, one of the other users has updated the row since you queried it.  Yet in a development environment this error is just plain confusing.  If developers are isolated in their own database, creating and editing records they know other users can't possibly be working with, or all the other developers have gone home for the day, how is this error possible? There are no other users?  It must be the phantom ADF developer! [insert dramatic music here] The following picture is what you'll see in the Business Component Browser, and you'll receive a similar error message via an ADF Faces page: A false conclusion What can possibly cause this issue if it isn't our phantom ADF developer?  Doesn't ADF BC implement record locking, locking database records when the row is modified in the ADF middle-tier by a user?  How can our phantom ADF developer even take out a lock if this is the case?  Maybe ADF has a bug, maybe ADF isn't implementing record locking at all?  Shouldn't we see the error "JBO-26030: Failed to lock the record, another user holds the lock" as we attempt to modify the record? Why do we see JBO-25014 instead? Let's verify that ADF is in fact issuing the correct SQL LOCK-FOR-UPDATE statement to the database. First we need to verify ADF's locking strategy.  It is determined by the Application Module's jbo.locking.mode property.  The default (as of JDev 11.1.1.4.0 if memory serves me correctly) and recommended value is optimistic, and the other valid value is pessimistic. Next we need a mechanism to check that ADF is issuing the LOCK statements to the database.  We could ask DBAs to monitor locks with OEM, but optimally we'd rather not involve overworked DBAs in this process, so instead we can use the ADF runtime setting -Djbo.debugoutput=console.  At runtime this option turns on instrumentation within the ADF BC layer, which among a lot of extra detail displayed in the log window, will show the actual SQL statements issued to the database, including the LOCK statement we're looking to confirm.
Setting our locking mode to pessimistic, and opening the Business Components Browser or a JSF page allowing us to edit a record, say the CHARGEABLE field within a BOOKINGS record where BOOKING_NO = 1206, upon editing the record we see among others the following log entries:
[421] Built select: 'SELECT BOOKING_NO, EVENT_NO, RESOURCE_CODE, CHARGEABLE, MADE_BY, QUANTITY, COST, STATUS, COMMENTS FROM BOOKINGS Bookings'
[422] Executing LOCK...SELECT BOOKING_NO, EVENT_NO, RESOURCE_CODE, CHARGEABLE, MADE_BY, QUANTITY, COST, STATUS, COMMENTS FROM BOOKINGS Bookings WHERE BOOKING_NO=:1 FOR UPDATE NOWAIT
[423] Where binding param 1: 1206
As can be seen on line 422, a LOCK-FOR-UPDATE is indeed issued to the database.  Later when we commit the record we see:
[441] OracleSQLBuilder: SAVEPOINT 'BO_SP'
[442] OracleSQLBuilder Executing, Lock 1 DML on: BOOKINGS (Update)
[443] UPDATE buf Bookings>#u SQLStmtBufLen: 210, actual=62
[444] UPDATE BOOKINGS Bookings SET CHARGEABLE=:1 WHERE BOOKING_NO=:2
[445] Update binding param 1: N
[446] Where binding param 2: 1206
[447] BookingsView1 notify COMMIT ...
[448] _LOCAL_VIEW_USAGE_model_Bookings_ResourceTypesView1 notify COMMIT ...
[449] EntityCache close prepared statement
....and as a result the changes are saved to the database, and the lock is released. Let's see what happens when we use the optimistic locking mode, this time to change the same BOOKINGS record CHARGEABLE column again.  As soon as we edit the record we see little activity in the logs, nothing to indicate any SQL statement, let alone a LOCK, has been taken out on the row. However when we save our records by issuing a commit, the following is recorded in the logs:
[509] OracleSQLBuilder: SAVEPOINT 'BO_SP'
[510] OracleSQLBuilder Executing doEntitySelect on: BOOKINGS (true)
[511] Built select: 'SELECT BOOKING_NO, EVENT_NO, RESOURCE_CODE, CHARGEABLE, MADE_BY, QUANTITY, COST, STATUS, COMMENTS FROM BOOKINGS Bookings'
[512] Executing LOCK...SELECT BOOKING_NO, EVENT_NO, RESOURCE_CODE, CHARGEABLE, MADE_BY, QUANTITY, COST, STATUS, COMMENTS FROM BOOKINGS Bookings WHERE BOOKING_NO=:1 FOR UPDATE NOWAIT
[513] Where binding param 1: 1205
[514] OracleSQLBuilder Executing, Lock 2 DML on: BOOKINGS (Update)
[515] UPDATE buf Bookings>#u SQLStmtBufLen: 210, actual=62
[516] UPDATE BOOKINGS Bookings SET CHARGEABLE=:1 WHERE BOOKING_NO=:2
[517] Update binding param 1: Y
[518] Where binding param 2: 1205
[519] BookingsView1 notify COMMIT ...
[520] _LOCAL_VIEW_USAGE_model_Bookings_ResourceTypesView1 notify COMMIT ...
[521] EntityCache close prepared statement
Again even though we're seeing the mid-tier delay the LOCK statement until commit time, it is in fact occurring on line 512, and released as part of the commit issued on line 519.  Therefore with either optimistic or pessimistic locking a lock is indeed issued. Our conclusion at this point must be that, unless there's the unlikely cause that the LOCK statement is never really hitting the database, or the even less likely cause that the database has a bug, ADF does in fact take out a lock on the record before allowing the current user to update it.  So there's no way our phantom ADF developer could even modify the record if he tried without at least someone receiving a lock error. Hmm, we can only conclude the locking mode is a red herring and not the true cause of our problem.  Who is the phantom? At this point we'll need to conclude that the error message "JBO-25014: Another user has changed" is somehow legit, even though we don't understand yet what's causing it.
This leads onto two further questions: how does ADF know another user has changed the row, and what's been changed anyway? To answer the first question, how does ADF know another user has changed the row, the Fusion Guide's section 4.10.11 How to Protect Against Losing Simultaneously Updated Data, which details the Entity Object Change-Indicator property, gives us the answer: At runtime the framework provides automatic "lost update" detection for entity objects to ensure that a user cannot unknowingly modify data that another user has updated and committed in the meantime. Typically, this check is performed by comparing the original values of each persistent entity attribute against the corresponding current column values in the database at the time the underlying row is locked. Before updating a row, the entity object verifies that the row to be updated is still consistent with the current state of the database.  The guide further suggests how to make this solution more efficient: You can make the lost update detection more efficient by identifying any attributes of your entity whose values you know will be updated whenever the entity is modified. Typical candidates include a version number column or an updated date column in the row.....To detect whether the row has been modified since the user queried it in the most efficient way, select the Change Indicator option to compare only the change-indicator attribute values. We now know that ADF BC doesn't use the locking mechanism at all to protect the current user against updates, but rather it keeps a copy of the original record fetched, separate from the user-changed version of the record, and it compares the original record against the one in the database when the lock is taken out.  If the values don't match, be it via the default compare-all-columns behaviour, or the more efficient Change Indicator mechanism, ADF BC will throw the JBO-25014 error. This leaves one last question.  Now we know the mechanism under which ADF identifies a changed row, what we don't know is what changed and who changed it. The real culprit What's changed?  We know the record in the mid-tier has been changed by the user, however ADF doesn't use the changed record in the mid-tier to compare to the database record, but rather a copy of the original record before it was changed.  This leaves us to conclude the database record has changed, but how and by whom? There are three potential causes: Database triggers A database trigger, among other uses, can be configured to fire PL/SQL code on a database table insert, update or delete.  In particular, in an insert or update the trigger can override the value assigned to a particular column.  The trigger execution is actioned by the database on behalf of the user initiating the insert or update action. The reason this causes an issue specific to our ADF use is that when we insert or update a record in the database via ADF, ADF keeps a copy of the record written to the database.  However the cached record is instantly out of date as the database triggers have modified the record that was actually written to the database.  Thus when we update the record we just inserted or updated for a second time to the database, ADF compares its original copy of the record to that in the database, and it detects the record has been changed – giving us JBO-25014. This is probably the most common cause of this problem. Default values A second reason this issue can occur is another database feature, default column values.
When creating a database table the schema designer can define default values for specific columns.  For example a CREATED_BY column could be set to SYSDATE, or a flag column to Y or N.  Default values are only used by the database when a user inserts a new record and the specific column is assigned NULL.  The database in this case will overwrite the column with the default value. As per the database trigger section, it then becomes apparent why ADF chokes on this feature, though it can only specifically occur in an insert-commit-update-commit scenario, not the update-commit-update-commit scenario. Instead of trigger views I must admit I haven't double checked this scenario but it seems plausible: that of the Oracle database's instead-of trigger view (sometimes referred to as instead-of views).  A view in the database is based on a query, and dependent on the query's complexity, may support insert, update and delete functionality to a limited degree.  In order to support fully insertable, updateable and deletable views, Oracle introduced the instead-of view, which gives the view designer the ability to not only define the view query, but also a set of programmatic PL/SQL triggers where the developer can define their own logic for inserts, updates and deletes. While this provides the database programmer a very powerful feature, it can cause issues for our ADF application.  On inserting or updating a record in the instead-of view, the record and its data that goes in is not necessarily the data that comes out when ADF compares the records, as the view developer has the option to practically do anything with the incoming data, including throwing it away or pushing it to tables which aren't used by the view's underlying query for fetching the data. Readers are at this point reminded that this article is specifically about how the JBO-25014 error occurs in the context of 1 developer on an isolated database.  The article is not considering how the error occurs in a production environment where there are multiple users who can cause this error in a legitimate fashion.  Assuming none of the above features are the cause of the problem, and optimistic locking is turned on (this error is not possible if pessimistic locking is the default mode *and* none of the previous causes are possible), JBO-25014 is quite feasible in a production ADF application if 2 users modify the same record. At this point, under project timeline pressure, the obvious fix for developers is to drop both the database triggers and default values from the underlying tables.  However we must be careful that these legacy constructs aren't used and assumed to be in place by other legacy systems.  Dropping the database triggers or default values that existing Oracle Forms applications assume and require to be in place could cause unexpected behaviour and bugs in the Forms applications.  Proficient software engineers would recognize such a change may require a partial or full regression test of the existing legacy system, a potentially costly and time-consuming exercise, not ideal. Solving the mystery once and for all Luckily ADF has built-in functionality to deal with this issue, though it's not a surprise, as Oracle as the author of ADF also built the database, and is fully aware of the Oracle database's feature set.  At the Entity Object attribute level, it provides the Refresh After Insert and Refresh After Update properties.
Simply selecting these instructs ADF BC, after inserting or updating a record to the database, to expect the database to modify the said attributes, and to read a copy of the changed attributes back into its cached mid-tier record.  Thus the next time the developer modifies the current record, the comparison between the mid-tier record and the database record matches, and "JBO-25014: Another user has changed" is no longer an issue. [Post edit - as per the comment from Oracle's Steven Davelaar below, as he correctly points out the above solution will not work for instead-of-trigger views as it relies on the SQL RETURNING clause which is incompatible with this type of view] Alternatively you can set the Change Indicator on one of the attributes.  This will work as long as the corresponding column for the attribute in the database itself isn't inadvertently updated.  In turn you're possibly just masking the issue rather than solving it, because if another developer turns the Change Indicator back off the original issue will return.
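For the record, these switches end up as flags on the attribute in the Entity Object's XML metadata. The fragment below is a hand-written approximation only: the flag names RetrievedOnInsert, RetrievedOnUpdate and ChangeIndicator are my assumption of what JDeveloper generates, and the attribute and column shown are purely illustrative (in practice you'd normally mark a version number or audit timestamp column as the Change Indicator rather than the column being edited):

<!-- Hypothetical excerpt of an Entity Object XML definition -->
<Attribute
  Name="Chargeable"
  ColumnName="CHARGEABLE"
  RetrievedOnInsert="true"
  RetrievedOnUpdate="true"
  ChangeIndicator="true"/>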


How competent in Java do I need to be for ADF?

I recently received the following question via email: "Chris - what competency level in Java does a developer need to have in order to develop medium to complex ADF applications?  Looking forward to your future postings." This is a common question asked of ADF and I think a realistic one too as it puts emphasis on medium to complex developments rather than simple applications. In my experience a reasonable answer for this comes from Sten Vesterli's Oracle ADF Enterprise Application Development - Made Simple: Getting Organized - Skills required - Java programming "Not everybody who writes needs the skills of Shakespeare. But everybody who writes need to follow rules of spelling and grammar in order to make themselves understood. All serious frameworks provide someway for a programmer to add logic and functionality beyond what the framework offers. In the case of the ADF framework, this is done by writing Java code. Therefore, every programmer on the project needs to know Java as a programming language and to be able to write syntactically correct Java code. But this is a simple skill for everyone familiar with a programming language. You need to know that Java uses { curly brackets } for code blocks instead of BEGIN-END, you need to know the syntax for if-then-else, constructs and how to build a loop and work with an array. But not everyone who writes Java code needs to be a virtuoso with full command of inheritance, interfaces and inner classes." Sten's book is a recommended read for teams looking to commence large ADF projects. From my own experience it's hard to comment on the specifics of every project; what constitutes medium to complex requirements for one ADF team may be complex to "yeeks!" for another. But from my own experience as an independent ADF developer for several years, I'm willing to share the level of Java skills I think is required. In addressing the question I think a good way is to look at the Java SE and Java EE certification exams, what topics they cover, and note which topics I think are valuable. Before doing this readers need to note that JDeveloper at the time this blog was written still runs on Java SE 1.6 and Java EE 1.5. However I'm going to link to the later Java SE 1.7 exams, as that'll increase the lifetime relevance of this post. Note those exams are currently beta so subject to change, and the list of topics I've got below might not be in the final exams. As such from the Oracle Certified Associate, Java SE 7 Programmer I certification exam topics, in my honest opinion ADF developers need to know *all* of the following topics:
Java Basics
Working with Java Data Types
Using Operators and Decision Constructs
Creating and Using Arrays
Using Loop Constructs
Working with Methods and Encapsulation
Working with Inheritance
Handling Exceptions
I might have a few people argue with me on the list above, particularly inheritance and exceptions. But in my experience ADF developers who don't know about inheritance, and in particular type casting, as well as exception handling in general will struggle.  In reality all of the topics above are Java basics taught to first year IT undergraduates, so nobody should be surprised by the list. When we move to the Java SE 7 Programmer II exam topics, the list is as follows.  You'll note the numbers next to each topic: 1 being mandatory, 2 not mandatory but knowledge in this area will certainly help most projects, and 3 not required.
1 - Java Class Design
1 - Java Advanced Class Design
1 - Object-Oriented Principles
2 - String Processing
1 - Exceptions
3 - Assertions
2 - Java I/O Fundamentals
2 - Java File I/O
1 - Building Database Applications with JDBC
* - Threads
* - Concurrency
* - Localization
In the #1 list there are no surprises except maybe JDBC. From my own personal experience, even though ADF BC and EJB/JPA abstract away the need to know the language of the database, at customer sites I've frequently had to build solutions that interface with legacy database PL/SQL using JDBC. Your site might not have this requirement, but the next site you work at probably will. The #2 list is more interesting. String processing is useful because without some internal knowledge of the standard Java APIs you can write some poorly performing code. Java I/O is not an uncommon requirement, being able to read/write uploaded/downloaded files to WLS. As for the #3 list, assertions simply don't work in the Java EE world that ADF runs in. Finally the topics marked with stars require special explanation. First, localization, often called internationalization, really depends on the requirements of your project. For me sitting down in Australia, I've never worked on a system that requires any type of localization support besides some daylight saving calculations. For you, this requirement might be totally the opposite if you sit in Europe, so as a requirement it depends. Then the topics of threading and concurrency. Threading and concurrency are useful topics only because there "be demons in thar" (best said in a pirate voice) for future Java projects. ADF actually isolates programmers from the issues of threading and concurrency. This isolation is risky as it may give ADF programmers a false belief they can code anything in Java. You'll quickly find issues of thread safety and collection classes that support concurrency are a prime concern for non-ADF Java solutions. So do you need to be an expert Java programmer for ADF? The answer is no. But a reasonable level of Java is required. And to cap it off, the more Java you know the better, and not just for your ADF project! Java remains in my opinion a popular language and something to have on your resume (or is that LinkedIn profile these days?).


ADF EMG at Collaborate 2012

I'm happy to announce the ADF EMG will have sessions at this year's Collaborate conference in Las Vegas April 22-26th 2012.  This is the first time the ADF EMG has presented at Collaborate. Chad Thompson, Chris Ostrowski and Penny Cookson will be leading the charge presenting the following topics on the Wednesday: 1) ADF: A Path to the Future for Dinosaur Nerds - Penny Cookson - Session 173 - Wednesday 11:00am-12:00pm 2) Getting Started with ADF - Chad Thompson - Session 655 - Wednesday 1:00pm-2:00pm 3) JDeveloper ADF and the Oracle Database - Friends Not Foes - Session 172 - Wednesday 3:00pm-4:00pm 4) ADF + Faces: Do I Have to Write ANY Java Code - Session 164 - Wednesday 4:15pm-5:15pm Penny Cookson won best paper for presentation 3 at the Aussie AUSOUG Perth conference in 2011, so the calibre of speakers here is high and well worth attending.  Even if you can't make the sessions it would be great if you could just pop your head in and say hi & thanks to these speakers for presenting at Collaborate. Note the above session times are subject to change, you can find more information here. If anybody is interested in ADF EMG speakers presenting at their conference, please let an EMG representative know so we can see what we can arrange.


ADF Runtimes vs WLS versions as of JDeveloper 11.1.1.6.0

The following blog post attempts to give Oracle WebLogic Server (WLS) administrators and Oracle Application Development Framework (ADF) customers some guidance on the pairing of ADF Runtime versions to WLS, in order to assist future planning and project management. The blog post discusses two different branches of Oracle's JDeveloper, namely the 11.1.1.X.0 branch including versions 11.1.1.1.0 through the current 11.1.1.6.0, and separately the 11.1.2.X.0 branch including 11.1.2.0.0 through the current 11.1.2.1.0.  In reading this post readers must be clear on the two different branches. The recent Oracle JDeveloper 11.1.1.6.0 release shows a small change in Oracle's pairing of ADF Runtime versions to WebLogic Server which WLS administrators should be aware of. Since the inception of JDeveloper 11g each new release has required a new version of WLS too.  For example:
ADF Runtimes 11.1.1.1.0 required WLS 10.3.1
ADF Runtimes 11.1.1.2.0 required WLS 10.3.2
ADF Runtimes 11.1.1.3.0 required WLS 10.3.3
ADF Runtimes 11.1.1.4.0 required WLS 10.3.4
ADF Runtimes 11.1.1.5.0 required WLS 10.3.5
This "history" is articulated in summary form in the 11.1.1.6.0 Certification and Support Matrix under the Application Server heading. Note with the release of JDeveloper 11.1.1.6.0 there is a subtle change in the ADF Runtime to WLS version pairing.  The latest ADF Runtimes 11.1.1.6.0 can run against WLS 10.3.6 and 10.3.5.  This is the first time in the 11.1.1.X branch we've seen a version run on two versions of WLS.  As such if you have a 10.3.5 WLS server or have just installed WLS 10.3.6 you can happily install the 11.1.1.6.0 ADF Runtimes on either. Customers need to be careful though as this does not imply the opposite.  If you install WLS 10.3.6, only the ADF Runtimes 11.1.1.6.0 are certified; the 11.1.1.5.0 ADF Runtimes are not (though the 11.1.1.5.0 ADF Runtimes are still of course certified against WLS 10.3.5). While I'm not in a position to comment publicly in detail on future JDeveloper versions beyond those revealed in the roadmaps at OOW, in terms of future releases in the 11.1.1.X.0 branch you should see this trend continue (note the emphasis on "should", there are no guarantees), namely the 11.1.1.6.0+ runtimes running on both WLS 10.3.5 and WLS 10.3.6.  Obviously to customers having some indication of the trend here is useful, as in previous releases customers had to build a new set of WLS servers for each JDeveloper 11.1.1.X.0 release, which was a considerable effort. On considering the other 11.1.2.X.0 branch of JDeveloper, as per its Certification and Support Matrix, the current requirement is a WLS 10.3.5 server with the ADF Runtimes 11.1.1.5.0 installed and an ADF Runtime 11.1.2.X.0 patch applied over the top. Observant readers referring to Oracle's roadmap from OOW will note the upcoming 12c JDeveloper release.  There are no specifics I can give on versions and release dates at all, but it is reasonable to say the ADF 12c runtimes will only run on WLS 12c, not WLS 10.3.X.  There is no information available beyond the general release numbers, so readers should not assume any of the existing or future WLS 12c versions will be satisfactory at this time for ADF 12c - essentially this is to-be-advised at the official release.  The only thing to take from this last paragraph is that the 12c release of JDeveloper will require a new stripe of 12c WLS servers, which should assist your future planning efforts if you wish to move to that platform when available.
For customers interested in Fusion Middleware (FMW) including SOA Suite and so on beyond ADF, note the same rules apply across the board. However I recognize my reader base is mostly ADF developers, thus my focus on the ADF Runtimes. If there's anything unclear in the explanation or in the Certification and Support Matrices please leave a comment and we'll endeavour to rectify it. Thanks to Brian Fry for his assistance on this blog post.


Running Oracle's ADF Faces Skin Editor under Mac OS

Last year I bought my first Mac and have been slowly learning the ins and outs of Mac OS. My failsafe when I can't get something to work has been to drop to Windows running under an Oracle VirtualBox guest VM. But over time I've succeeded in getting most things running under Mac OS. Today's challenge was running Oracle's ADF Faces Skin Editor 11.1.2.1.0 natively under Mac OS 10.7 Lion. As a result I've documented a couple of minor issues I overcame here for my own notes, and hopefully they're also useful to you too. The generic instructions for installing the 11.1.2.1.0 Skin Editor can be found here; ensure you follow the Mac installation section. Yet I hit three snags during the installation: 1) The default process prompts you for the location of the 1.6.0 JDK required for 11.1.2.1.0. Under later Mac OS versions finding the location has become a little difficult because by default Mac OS now attempts to hide the Library directory from you. The following StackOverFlow post gave me the location: /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home 2) On entering this location the Skin Editor still failed to start, stating "Running Skin Editor under a JRE is unsupported".  This error is incorrect as we're correctly pointing at a JDK. Luckily the resulting error tells you the solution, by placing the following flag in the <install-dir>/skineditor/bin/skineditor.conf file: SetSkipJ2SDKCheck true 3) Finally when the Skin Editor started natively, virtually no toolbar buttons, menus or windows were displayed (making it a little hard to use): The solution via Kevin Angus in the OTN Forums was to include the additional line in the skineditor.conf: AddVMOption -Dapple.laf.useScreenMenuBar=true Voila!


Minimising the Impact of Data Model Changes in ADF Application Deployment

In the complete lifecycle of an ADF application backed with a database, it's not uncommon for the data model to change. New columns are added to tables, datatypes are expanded, there are many changes that can take place in the database. Yet as the database is core to the overall application, such small changes ripple up the three tier stack having a wider impact. This is as true for ADF applications as any other database centric technology, as the change causes disruption to the model layer (e.g. ADF Business Components) and the view-controller layers (e.g. ADF Faces RC). Depending on your ADF application deployment setup, building and deploying your application can already take a considerable time. For data model changes as small as an additional column included in an ADF BC Entity Object (EO), it certainly will be undesirable to have to go through another large build and deploy exercise for what amounts to a single new field on the screen. This raises the obvious question: can we architect our ADF applications in such a manner as to minimize the impact of data model changes on the build and deployment of our application? This challenge was put to me in my first few days at Oracle.  The following post describes one such solution I came up with using ADF libraries and WebLogic Server shared libraries.  Hopefully I passed the "give the new employee something difficult to do" test, but I'm sure readers will set me straight regardless ;-) Why can't ADF automatically detect this change? One argument that comes up from time to time is that ADF should be able to automatically detect such schema changes and run with them. Surely something as simple as an additional table column for example could be added to the ADF Business Components and JSF pages dynamically at runtime? The problem with this is unless we're writing some sort of database to web query tool like Oracle's SQL Developer, where you want to see all the columns in any table regardless, dynamically changing to take into account any database change is a dangerous proposition for an application. Imagine if the table EMPLOYEES added a Blob column allowing up to 4GB images to be stored against each employee with their latest favourite pic? Should all ADF applications showing employee data automatically make use of this Blob column even if our application doesn't want to show the employee's portrait? Can our servers handle loading 4GBs worth of data for each employee? The answer is obviously no, we could easily break our application's ability to scale, and in many cases we don't even want to show the employee's picture anyhow. As such it's prudent at design time to accommodate database changes into our application on a case by case basis, rather than allowing our application to dynamically evolve. Angels in the Architecture Last year for my previous employer I had the fortune to present at Open World on ADF architectural blueprints that I had observed at different sites (See: Angels in the Architecture).
The presentation explored 6 architectural patterns, of which the 3rd, known as the "Master Application-Multi Bounded Task Flow Application" (abbreviated to: Master-App-Multi-BTF-App), presented the following application composition: From the diagram we can see the overall application is broken into several JDeveloper workspaces: One Common ADF BC Application Workspace - containing the majority of reusable ADF Business Components. One to many BTF Application Workspaces - each containing BTFs that mimic the user tasks of the system, dependent on the ADF BC Workspace common components through an ADF Library. One Master Application - essentially the composite application that brings the BTF and ADF BC workspaces together into a presentable whole, again dependent on the individual ADF Libraries. ADF Libraries and the Resource Palette are key to this architectural pattern. While this pattern splits the application into separate workspaces, it doesn't dictate a deployment model. By default when you add ADF Libraries to another application's projects, the destination application's WAR profile is updated as follows: In the example above the three ADF Library JARs have been included for deployment with the main application's WAR, and as a result will be deployed in the overall EAR file for the application. This is ideal from a simplistic deployment point of view, a build-and-deploy-everything approach. But it doesn't satisfy our requirement to not build and redeploy the whole application if a simple database change occurs. Using WLS Shared Libraries with ADF A potential solution which has been documented before (See: Andrejus Baranovskis's blog Deploying ADF Applications as Shared Libraries on WLS) makes use of deploying ADF Libraries separately as Shared Libraries to WLS. Without unnecessarily reiterating the current documentation, the basic steps are: - For the application workspace to be shared -
1) In the application workspace create a separate custom project
2) Add the ADF Library for the workspace to the new project via the Resource Palette
3) Add a WEB deployment profile to the project
4) Set the context-root to empty
5) Add a MANIFEST.MF file with the following options:
Manifest-Version: 1.0
Created-By: <author>
Implementation-Title: <module title>
Extension-Name: <module package name>
Specification-Version: <version>
Implementation-Version: <version>
Implementation-Vendor: <author>
6) On deployment via JDev or the WLS console ensure you select the Deploy as Shared Library option
- For the application workspace that's consuming the ADF Library - If the consumer workspace is created as an ADF Library itself (to be further consumed by another module), you need to:
1) Follow the previous steps for a workspace to be shared
2) Add a weblogic.xml file under WEB-INF
3) Add a library-ref entry referencing the shared library's Extension-Name
If the consuming workspace is the final application, you need only do the previous steps 2 and 3 plus the following step:
4) In the WAR profile uncheck the attached ADF libraries
Example Application The following zip file provides a demonstration application built in JDeveloper 11.1.2.1.0, based on 3 shared libraries, using the Oracle HR database schema. To test this setup you must have the Oracle HR database schema available to you, a JDeveloper Resource Palette file connection to the "libs" directory as extracted from the zip file, and a preconfigured connection to your WLS server of choice.
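To illustrate steps 2 and 3 for a consuming workspace, a weblogic.xml along the following lines is roughly what I'd expect, where the library-name must match the Extension-Name declared in the shared library's MANIFEST.MF. The library name common.model and the version shown are purely illustrative, so verify the descriptor elements against the WLS documentation for your server version:

<!-- Hypothetical WEB-INF/weblogic.xml for a consuming web project -->
<weblogic-web-app xmlns="http://xmlns.oracle.com/weblogic/weblogic-web-app">
  <library-ref>
    <!-- Must match the Extension-Name in the shared library's MANIFEST.MF -->
    <library-name>common.model</library-name>
    <!-- Optional: pin the reference to a specific shared library version -->
    <specification-version>1.0</specification-version>
  </library-ref>
</weblogic-web-app>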
In order to show the ADF Libraries working as Shared Libraries, follow these steps:
1) Start your WLS server
2) Ensure a data source is configured for that used by CommonModel
3) In JDeveloper open all 4 workspaces
4) In the CommonModel workspace:
4.1) Deploy the ADF Library for the Model project ... this will write the ADF Library to the libs directory above
4.2) Deploy the SharedLibs project to your WLS server as a shared library
5) Repeat the previous steps 4.1 and 4.2 for the DeptTaskFlows and EmpTaskFlows workspaces
6) Deploy the MasterApp EAR to the server
7) Access the application via http://<wls-host>:<port>/MasterApp/faces/Splash
8) Within the application press each button to see each BTF in action
Now that we've deployed and tested the existing application, we'll investigate a scenario with a data model change:
9) In the database add a new VARCHAR2 column TEST to the employees table
10) In the associated CommonModel ADF BC Employees Entity Object and Employees View Object add the new database column as an attribute
11) Deploy the ADF Library for the Model project
12) Open the EmpTaskFlows workspace
13) Refresh the Data Control palette
14) Locate and open the EditEmp.jsf in the ViewController project
15) Add the new VO attribute Test to the page via the Data Control Palette
16) Deploy the ADF Library for the ViewController project
At this point we want to upload the new CommonModel and EmpTaskFlows to the server, so let's try the following:
17) Deploy the CommonModel and EmpTaskFlows SharedLibs projects to the server
During this operation the 2nd one will fail with the following error message:
[03:52:21 PM] Weblogic Server Exception: weblogic.deploy.event.DeploymentVetoException: Cannot undeploy library Extension-Name: emp.taskflows, Specification-Version: 1, Implementation-Version: 1.0.0 from server DefaultServer, because the following deployed applications reference it: MasterApp.war
[03:52:21 PM] See server logs or server console for more details.
[03:52:21 PM] weblogic.deploy.event.DeploymentVetoException: Cannot undeploy library Extension-Name: emp.taskflows, Specification-Version: 1, Implementation-Version: 1.0.0 from server DefaultServer, because the following deployed applications reference it: MasterApp.war
[03:52:21 PM] Deployment cancelled.
While WLS wasn't smart enough to enforce the indirect dependency on CommonModel, it did so on the EmpTaskFlows as the MasterApp is still live. The solution is to temporarily stop the MasterApp, then attempt the deployment again. Once finished restart the MasterApp and all should be fine. Now when we access the application and navigate to the EmpTaskFlow we can see the change come through. A copy of the final application can be downloaded here. Conclusion and Final Thoughts The key point to realize from the example was even though we changed the base CommonModel that is directly and indirectly related to all the modules, it was not necessary to redeploy all the modules to get the change. Instead we only deployed the CommonModel and the EmpTaskFlow where the changes occurred. Our goal has been met. There is one potentially undesirable issue with the above solution, namely that we need to stop the MasterApp to achieve the redeployment. For a high availability site this isn't ideal (read: understatement). Questionably, can we use the WebLogic Server Production Redeployment feature to stop the application from having to be redeployed?
According to the section Restrictions for Updating J2EE Modules in an EAR: "If redeploying a single J2EE module in an Enterprise application would affect other J2EE modules loaded in the same classloader, weblogic.Deployer requires that you explicitly redeploy all of the affected modules." ... it would appear the only solution here is to redeploy all the parts of the updated application to the server, which defeats the point of the whole exercise.

With this limitation in mind I'll look to further research a solution for customers in the future and post it here. Of course, if you don't have such HA requirements then the current solution is satisfactory.


Classifying ADF Task Flow Navigation Choices

Writing the Angels in the Architecture: An ADF Application Architectural Blueprint presentation in 2011 spawned a number of side projects which I had scribbled down but taken no further. Starting at Oracle has given me a little more time to rummage through my notebooks and turn these ideas into blog posts, hopefully to help others.

In the Angels in the Architecture presentation there was an in-depth look at how Bounded Task Flows (BTFs) in JDeveloper 11g+ could be placed in their own workspaces and published as ADF Libraries for reuse in a master composite ADF application. In consuming the BTFs in the master application, it isn't uncommon to make use of the consumed BTFs in a parent composite BTF that brings the moving parts together. This is truly one of the delights of BTFs, the ability to shuffle the bits around like Lego to build any application you want.

It was in this composition that I discovered another interesting area of BTFs yet to be documented, that of the different navigation models used, beyond just the concepts of Unbounded Task Flows (UTFs) vs Bounded Task Flows (BTFs). This blog post takes a stab at describing the different models. It shouldn't be considered complete, just a starting point to help you understand the options, and a chance for me to turn my scribbled notes into something more substantial.

Unbounded Task Flows vs Bounded Task Flows

Of course for ADF beginners it's worth going over the basics and describing the characteristics of Unbounded Task Flows (UTFs) and Bounded Task Flows (BTFs).

Unbounded Task Flows, of which every application has at least one, comprise the main page flow of your application. Whether you're building an application with many separate pages each with their own URL, or a single-page desktop-like application with portals/regions, you'll have a UTF. In terms of navigation an example UTF looks as follows:

The navigation characteristics of a UTF, many of which have been documented before, include:

- There is no set start or end to the UTF (thus the name "unbounded"); the user can enter the application at any activity.
- Navigation is a combination of user free-form and design time structured (explained further next).
- Free-form allows the user to access any view activity via a URL.
- Because of the free-form navigation model, the minimum number of steps to get to any view activity is 1.
- Isolated activities are still accessible thanks to their URLs.
- Structured allows developers to optionally define uni- or bi-directional navigation between nodes.
- Wildcards provide a uni-directional leap from a source activity to a defined destination activity.
- The UTF has no defined exit points for the user. In fact every activity is an exit point; the user can leave the application at any point.

Bounded Task Flow navigation takes a more constrained approach to navigation:

The characteristics of navigation within BTFs include (a minimal task flow definition sketch follows this list):

- As the name suggests, they're bounded, with one entry point and one or more exit points for the user.
- There is no free-form navigation; all navigation (both uni- and bi-directional) must be through predefined navigation rules or wildcards.
- You cannot access any activity inside the BTF by an addressable URL.
- Because of the structured navigation model, the minimum number of steps to get to any activity within the BTF is dictated by the developer (unlike the free-form nature of UTFs).
- Isolated nodes are inaccessible.
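To make the bounded characteristics above concrete, here is a minimal sketch of a fragment-based bounded task flow definition. The activity ids, fragment paths and outcome names are hypothetical, and the snippet is indicative of the ADF Controller metadata rather than copied from a real workspace:

    <?xml version="1.0" encoding="UTF-8" ?>
    <adfc-config xmlns="http://xmlns.oracle.com/adf/controller" version="1.2">
      <task-flow-definition id="emp-task-flow-definition">
        <!-- The single entry point of the BTF -->
        <default-activity>browseEmployees</default-activity>
        <view id="browseEmployees">
          <page>/empBrowse.jsff</page>
        </view>
        <view id="editEmployee">
          <page>/empEdit.jsff</page>
        </view>
        <!-- All navigation goes through predefined control flow rules; no free-form URL access -->
        <control-flow-rule id="__1">
          <from-activity-id>browseEmployees</from-activity-id>
          <control-flow-case id="__2">
            <from-outcome>edit</from-outcome>
            <to-activity-id>editEmployee</to-activity-id>
          </control-flow-case>
        </control-flow-rule>
        <!-- An explicit exit point handing control back to the caller -->
        <task-flow-return id="done">
          <outcome>
            <name>done</name>
          </outcome>
        </task-flow-return>
      </task-flow-definition>
    </adfc-config>

The default-activity gives the single entry point and the task-flow-return gives a defined exit, which is exactly the bounded behaviour listed above.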
Inter-Task Flow navigation - task flow calls and regions

Before we investigate task flow navigation further, readers need to be familiar with the two mechanisms for task flows to call each other:

1) To call a task flow based on pages we must use a task flow call
2) To call a task flow based on page fragments, we must embed the page fragment task flow as a region in a page or another page fragment

Note how I use the term task flow here rather than Unbounded or Bounded Task Flows. The mechanism for the different types of task flows to call each other is the same across both. Point 2 above is an interesting one, as the idea of embedding brings us to the idea of the "stack".

Stack navigation

At its simplest, "stack navigation" is when one task flow calls another without terminating the first:

To be precise, stack navigation occurs when:

- A source task flow calls a destination task flow
- Control is passed to the destination task flow until it terminates
- Upon which control is passed back to the source/caller
- During the stack the state of the source task flow is persisted
- The state of the destination task flow only exists for its life

The easy analogy here for developers is the 3GL equivalent of functions calling functions (a minimal task flow call sketch appears at the end of this section). Of course the "stack" model can be extended and we can have a set of task flows calling each other in a deep stack:

Some points on the stack:

- It's suitable for both page and page fragment task flows.
- Task flow calls and returns are what allow the stack to grow and shrink.
- As we progress deeper into the stack, because the previous task flows are still live and their state stored in memory, we consume more memory.
- On returning to a previous item in the stack, its state is restored intact without modifications needed.
- It is well suited to logical drill up/down solutions.
- There are no shortcuts from the stack.
- It's messy at design time to reorganize the stack if we get the stack order wrong.
- Task flow parent actions or contextual events to manipulate the calling task flow are not possible.
- Task flow calls allow a terminating task flow to pass parameters back to the caller.

Network navigation

"Network navigation" is where we chain a number of task flows together in one composite master.

Relevant points of the network navigation model:

- It is suitable for both page and page fragment task flows.
- Navigation between flows is controlled by a master composite task flow.
- It's very easy to reorganize the calling order in the composite task flow.
- At most there are only two task flows on the stack, the composite and the called task flow, thus reducing concerns about the memory consumed.
- If we do return to a previously visited called task flow in the composite, to provide a seamless experience for the user where it appears we never left the task flow, we need to re-establish a state similar to the one it had when we left. This will optionally require more task flow parameters and more logic internally to re-execute previous processing.
- Task flow parent actions or contextual events to manipulate the calling task flow are not possible.
- Task flow calls still allow a terminating task flow to pass parameters back to the caller.
- Better suited to logical path or wizard style interfaces (noting the similarity to trains).

At this point we can start to see that one of the key technical differentiators between stack and network navigation is that the stack model takes more memory (depending on its depth), while the network model takes less memory but may require more processing.
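As a concrete illustration of the task flow call mechanism that drives stack navigation, here is a minimal sketch of a task-flow-call activity inside a calling task flow. The activity id, document path and parameter names are hypothetical, and the snippet is indicative of the ADF Controller metadata, not taken from the sample application:

    <!-- Hypothetical task flow call activity in the calling task flow's definition -->
    <task-flow-call id="callEditEmployee">
      <!-- The destination (called) bounded task flow -->
      <task-flow-reference>
        <document>/WEB-INF/emp-task-flow-definition.xml</document>
        <id>emp-task-flow-definition</id>
      </task-flow-reference>
      <!-- Parameters passed down the stack to the called task flow -->
      <input-parameter>
        <name>employeeId</name>
        <value>#{pageFlowScope.selectedEmployeeId}</value>
      </input-parameter>
    </task-flow-call>

A control flow rule navigating from a view activity to callEditEmployee grows the stack; a task-flow-return in the called flow shrinks it again and hands control (and optionally return values) back to the caller.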
Readers should be careful not to make an ill-formed decision here, as I've not given you any empirical evidence on which model is better or worse from an overhead point of view. As an example, if stack navigation only takes up 1k per user, who cares? But if it takes up megabytes, there's something to worry about. The actual numbers will depend on your custom solution and you need to take your own measurements to make this judgement.

Hybrid navigation

Of course it's possible to have a combination of both stack and network navigation:

I won't go into the pros and cons here as they are just a combination of the stack and network navigation characteristics.

Nested region navigation

"Nested region navigation" is my name for when a page or page fragment embeds one or more separate regions for one or more separate Bounded Task Flows based on fragments. Unfortunately there isn't an easy JDeveloper screenshot to describe this, so we'll use a diagram instead:

The characteristics of this model (a minimal region sketch appears at the end of this post):

- The call from a region to a BTF can be thought of as a 2-level stack, but where the state of the caller and the nested region BTF run in parallel.
- Navigation within each BTF is independent of the parent task flow and as such can be any combination of the navigation models: stack, network or hybrid.
- The nested BTF can communicate with the parent and other nested BTFs through parent actions or contextual events.
- On termination of a nested BTF there is no way for the BTF to return parameters.
- This includes the notion of inline popups containing regions within the parent page or fragment.
- The more regions you have, the more memory and processing required for the page.

Parallel navigation

Finally, returning to the model where one task flow calls another through a task flow call, task flow calls allow BTFs based on pages to be called either as an inline popup or an external window. The inline popup navigation is akin to the "nested region navigation" previously described. The external window navigation is more complicated as it occurs in a new browser window separate from the current browser window, thus the title "parallel navigation". While it doesn't have a separate HTTP session, it does have its own pageFlowScope and its operation is separate from that of the main window.

Conclusion

What can be seen from the different navigation models is that they support different user experiences, different technical challenges and different features that can be utilised in each. It's not simply an understanding of task flows and their features that ADF architects need; rather, an understanding of the different navigation models will help architects design new ADF applications. If any readers come up with different navigation models I'd be glad to hear about them.
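For readers who haven't wired up a region before, here is a minimal sketch of how a fragment-based BTF is typically embedded as a region: a taskFlow executable in the consuming page's page definition plus an af:region tag in the page or fragment. The ids, paths and binding names are hypothetical and the snippet illustrates the pattern rather than any specific application:

    <!-- In the consuming page definition (...PageDef.xml): a taskFlow executable for the nested BTF -->
    <executables>
      <taskFlow id="empFlow1"
                taskFlowId="/WEB-INF/emp-task-flow-definition.xml#emp-task-flow-definition"
                xmlns="http://xmlns.oracle.com/adf/controller/binding"/>
    </executables>

    <!-- In the page or fragment itself: the region renders the nested BTF -->
    <af:region value="#{bindings.empFlow1.regionModel}" id="r1"/>

Each such region carries its own binding container and task flow state, which is why more regions on a page mean more memory and processing, as noted above.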
