Saturday Feb 28, 2009

Ant and Importing

Just how many copies of junit.jar have been added to source repositories on the planet? Quite a few, I imagine; it seems like a waste of repository data space and, well, just wrong. Not junit, which is a fantastic product, just the fact that we have so many copies. Granted, you have gained a pretty stable tool by freezing the version you have, and you have guaranteed having a copy at all times, but is it a good idea to add all these binary files to your repository data? As the list of tools like this grows and grows, does the "just add it to the repository" solution continue to scale? And each time you need a new version, you end up adding even more binary data to your repository.

Some projects have taken to doing a kind of "tools bootstrap" by downloading all the open source tools the first time you set up a repository, making the files immune from normal 'ant clean' actions. Ant has a task called <get> which lets you download tool bundles, and it works quite well, but there are some catches to doing it this way. Expecting all the download sites to be up and available 24/7 is not realistic. And predictability is really important, so you want to make sure you always download the same version of each tool, keeping a record of which versions you use.

So what we did in the openjfx-compiler setup repository was to create an import/ directory to hold the downloaded tools, automate the population of that area with the <get> task, and also allow for quick population of import/ from a large import zip bundle. The initial version of the repository had a very similar mechanism, so this idea should be credited to the original authors on the OpenJFX Compiler Project.

This logic is contained in the file build-importer.xml of the setup repository; for each tool NAME downloaded, a set of properties is defined (import.NAME.*), along with two Ant targets, import-get-NAME and import-NAME. Probably best to look at the bottom of this file first. As before, quite a few macrodefs were used to make this all work.

The ant build script then just uses ${import.junit.jar} to get a junit.jar file.
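
As a rough illustration of the shape of this pattern (this is not the actual build-importer.xml; the download URL, version number, and target layout below are invented, and the real file leans on macrodefs so none of this has to be repeated per tool):

<!-- Sketch only: import.NAME.* properties plus import-get-NAME and
     import-NAME targets, following the description above. -->
<project name="import-sketch" default="import-junit" basedir=".">

  <property name="import.dir" location="import"/>

  <!-- Per-tool properties (import.junit.*); the version and URL are made up. -->
  <property name="import.junit.version" value="4.5"/>
  <property name="import.junit.jar"
            location="${import.dir}/junit-${import.junit.version}.jar"/>
  <property name="import.junit.url"
            value="http://downloads.example.com/junit/junit-${import.junit.version}.jar"/>

  <!-- Skip the download when the file is already there, e.g. when
       import/ was pre-populated from a zip bundle. -->
  <target name="-import-check-junit">
    <available property="import.junit.present" file="${import.junit.jar}"/>
  </target>

  <target name="import-get-junit" depends="-import-check-junit"
          unless="import.junit.present">
    <mkdir dir="${import.dir}"/>
    <get src="${import.junit.url}" dest="${import.junit.jar}"
         usetimestamp="true" verbose="true"/>
  </target>

  <target name="import-junit" depends="import-get-junit"/>

</project>

A build script that imports something like this can then depend on import-junit and put ${import.junit.jar} on a <javac> classpath.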

You can actually try this out yourself pretty easily if you have Mercurial (hg) and ant by doing this:

hg clone https://hg.kenai.com/hg/openjfx-compiler~soma-setup setup
cd setup
ant import

Of course, I'll predict that it fails the first time for 50% or more of you; this kind of downloading is just not that reliable when it depends on all these sites. So you may have to run ant import a few times.

-kto

Monday Feb 23, 2009

Ant and Platform Specific macrodefs

Ant works great for any pure Java project: very simple to deal with, maybe a little tricky when dealing with jar manifests, but not bad, and very efficient in terms of limiting the Java VM startup overhead. But what about platform specific tasks? I myself find the "<exec>" ant task so painful to use that I avoid it at all costs, or at least isolate each use in a "<macrodef>". And this macrodef isolation actually works pretty well when you are dealing with many different platforms that you need to build on.
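
For instance, a minimal sketch of that isolation (the run-make name and the make example are mine, not from the JavaFX scripts) might look like this:

<!-- Sketch only: wrap one native-tool invocation in a <macrodef> so the
     fussy <exec> details live in exactly one place. -->
<project name="exec-isolation-sketch" default="build-native" basedir=".">

  <macrodef name="run-make">
    <attribute name="dir"/>
    <attribute name="target" default="all"/>
    <sequential>
      <exec executable="make" dir="@{dir}" failonerror="true">
        <arg value="@{target}"/>
      </exec>
    </sequential>
  </macrodef>

  <target name="build-native">
    <run-make dir="native" target="all"/>
  </target>

</project>

Callers only ever see <run-make dir="..." target="..."/>, and if the tool name, arguments, or error handling ever need to change, there is exactly one macrodef to edit instead of many scattered <exec> calls.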

There are many solutions to the issue of platform specific builds, including the ant cpptasks, and I am sure many more. So what I am saying here is not new, and not the be-all and end-all on this issue; just some ideas for people to consider when up against this problem. Please add your comments if you have some good references and ideas. It's very obvious to me that I am nowhere near an ant expert, so take all this with a grain of salt. I also want to give credit to the many JavaFX teams and individuals who wrote the various ant scripts in all the repositories; most of this is a consolidation of other people's ideas and techniques.

So how did the JavaFX SDK deal with multiple platform issues in ant? This project was composed of many sub repositories, each with different system needs and often using slightly different techniques for building. The top repository (also called the setup or root repository) allows for this independence as much as possible, but at the same time tries to give some kind of structure to the build process. From the Mercurial file view of the openjfx-compiler setup repository, I will try and explain what is happening.

  • Basic OS arch detection is done in the file build-os-arch.xml:

    People unfamiliar with xml or ant might find the syntax a bit convoluted; it takes time to get used to. The key here is the property os_name (which will contain one of: solaris, windows, linux, or macosx), which is used in the build-defs.xml file to import the right platform specific file, build-${os_name}-defs.xml (a condensed sketch of this pattern appears after this list). Keep in mind this is unique to this project, but the basics should work for any multi-platform build project.

  • The platform specific macrodefs are in the build-${os_name}-defs.xml files, customized for each OS, and each defines the -init-platform-defs target. Consider the macosx file:

    Special to this file is the ability to run the xcodebuild utility; this macrodef should probably be turned into some kind of generic do-project-build macrodef someday. Note that with the JavaFX SDK project we have allowed teams to work in sparse Mercurial forests; the repository we are looking at here is the top repository, but it could have many sub repositories beneath it. Depending on which sub repositories are present, there are different needs, and we try and check for them in these build-${os_name}-defs.xml files, via the -init-platform-defs target.

  • Pull it all together with the file build-defs.xml, which we ask all sub repositories to import early in their own build.xml.

    This file is imported by each sub repository, and it in turn imports build-os-arch.xml, build-${os_name}-defs.xml, and many other files to provide lots of macrodefs and property settings for a sub repository.

  • Then we established an ant target contract between each sub repository and the top repository by requiring certain jfx-* targets to be available in the sub repository; for example, in the openjfx-compiler repository build.xml file (somewhere in the first 100 lines you should see the jfx-* targets defined):

    Note that it imports ../build-defs.xml and defines a set of jfx-* targets for use by the top repository build.xml file.

  • The forest build script then uses some macrodefs to cycle through the various sub repositories (sometimes called components) in the file build-components.xml; look for the do-all-* targets:

    Which you will see imported in the top level build.xml file:

    We have a cached area where a previous SDK build is used in the case of a partial forest, allowing a developer to concentrate on the work in a single repository and not have to build the entire forest. The OpenJDK build uses a similar concept with the Import JDK, which supplies the pieces you aren't building. Effectively, we cycle over the sub repositories present, in a particular order, and request each one to perform a certain action as defined by the jfx-* target contract. Look for the use of the do-all-components macrodef to see where we cycle over the sub repositories (the sketch after this list shows the general shape). (It's a shame that ant doesn't have some kind of applyant task, but you use the tools you are given.)

    You can read more about all this in the JavaFX SDK Build README.
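
To make the overall shape of this more concrete, here is a small, condensed sketch of two of the central pieces: the os_name detection done in build-os-arch.xml and the component cycling done in build-components.xml. This is my own illustration, not the contents of those files; the component paths and the jfx-build target name are made up, and the point where the real build-defs.xml imports build-${os_name}-defs.xml is only indicated by a comment.

<!-- Sketch only: a condensed illustration of the pattern described above,
     not the actual setup repository files. -->
<project name="forest-sketch" default="build" basedir=".">

  <!-- build-os-arch.xml style detection: os_name becomes one of
       solaris, windows, linux, or macosx. -->
  <condition property="os_name" value="windows">
    <os family="windows"/>
  </condition>
  <condition property="os_name" value="macosx">
    <os family="mac"/>
  </condition>
  <condition property="os_name" value="solaris">
    <os name="SunOS"/>
  </condition>
  <condition property="os_name" value="linux">
    <os name="Linux"/>
  </condition>
  <fail unless="os_name" message="Unrecognized operating system"/>

  <!-- The real build-defs.xml imports build-${os_name}-defs.xml at this
       point, which supplies the platform specific macrodefs and the
       -init-platform-defs target. -->

  <!-- build-components.xml style cycling: run the same contract target
       (jfx-build here is just an example of a jfx-* contract target) in
       each sub repository build.xml that is present, in a fixed order. -->
  <macrodef name="do-all-components">
    <attribute name="target"/>
    <sequential>
      <subant target="@{target}" inheritall="false" failonerror="true">
        <filelist dir="${basedir}"
                  files="openjfx-compiler/build.xml,runtime/build.xml"/>
      </subant>
    </sequential>
  </macrodef>

  <target name="build">
    <echo message="Building the forest on ${os_name}"/>
    <do-all-components target="jfx-build"/>
  </target>

</project>

Each sub repository build.xml would then import ../build-defs.xml and define its own jfx-* targets to satisfy the contract, as described above.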

The JavaFX SDK project is a bit unique, and the techniques used in its build process may not suit many projects, but I thought some of this might be of interest to anyone considering putting their hand into any large ant nest someday. :^)

Hope someone has found this helpful; as always, comments on better ideas are always welcome.

-kto

Monday Feb 16, 2009

JavaFX Compiler Setup Files

So what happened to my Java and OpenJDK/JDK blogs for the past few months?

Yup, JavaFX bit me. I've been busy working on a special build project for JavaFX, including learning all about the Hudson continuous build system and Ant build script capabilities, and dealing with another forest of Mercurial repositories. Along the way I discovered some new build tricks, which I will try and share in separate postings.

The OpenJFX Compiler Project is now sporting a new Mercurial repository (it was previously in Subversion), and a new setup repository that has taken up a great deal of my time. This OpenJFX Compiler project is the "open source" part of JavaFX right now.

The OpenJFX Compiler Setup Files repository is an open Mercurial repository that allows for building the entire JavaFX product from a forest of repositories, not all of them open of course.

Now, before you post a question asking why JavaFX isn't all open source: you would be asking the wrong person. I don't know, and I have very little influence over this; see the open source statement here. Also see the OpenJFX Data Site for more information on what is visible in the OpenJFX Compiler project.

Some of the issues I tried to tackle with this new setup may relate to many other projects:

  • Dealing with ant scripts on a very large project that contains hundreds of them.
  • Dealing with ant and native compilation, including using GNU make, Mac xcodebuild, Visual Studio devenv, and the always painful Visual Studio vcvars32.bat environment variable setup on Windows.
  • Builds of a Mercurial forest, allowing maximum independence but sharing what makes sense to share.
  • Hudson setups and benefits, and the limitations of a continuous build system with a distributed source code management system (Mercurial).
  • Ant tricks and limitations I found.
  • Source repository rules, what you should and should not put into your repository.

More on these topics later...

-kto

Wednesday Dec 31, 2008

Where has JVMPI gone?

Have you seen this error before?


FATAL ERROR: JVMPI, an experimental interface, is no longer supported.
Please use the supported interface: the JVM Tool Interface (JVM TI).
For information on temporary workarounds contact: jvmpi_eol@sun.com

For a long time now, since we released JDK 1.5, we have been warning people that the VM profiling interface JVMPI is going away. Starting with the JDK 6 update 3 release (JDK6u3), it is gone for good.

If you really need JVMPI, your best bet is to use a JDK 1.5 or older release, and also find out about transitioning to JVM TI. More often than not, you have become dependent on a tool that uses JVMPI, in which case you should try and upgrade that tool to a version that uses JVM TI instead. But if you have written your own JVMPI code, see the JVMPI transition article at http://java.sun.com/developer/technicalArticles/Programming/jvmpitransition/ for help in transitioning to JVM TI.

NOTE: Getting this message indicates that JVMPI has been requested of the JVM. A request for JVMPI must be made prior to JVM initialization, and regardless of whether JVMPI is eventually used at runtime, just the request for it will have a negative performance impact on your Java application. In most situations, JVMPI should never be requested unless some kind of performance work is being done and slower performance is considered acceptable. JVM TI does not have many of JVMPI's limitations.

-kto

Monday Dec 08, 2008

OpenJDK6 Repositories!

I see it! It actually exists! Yes Virginia, There Is a Santa Claus!

Actually that's a whale we saw in Alaska this summer. No I'm not trying to insult Santa. ;^)

Seriously, we have our first cut at generated OpenJDK6 repositories:

http://hg.openjdk.java.net/jdk6/jdk6

You can browse all 7 repositories in the web interface: the jdk6 root repository plus corba, hotspot, jaxp, jaxws, jdk, and langtools.

Or you can clone them, or do a forest clone to get the entire forest:

hg fclone http://hg.openjdk.java.net/jdk6/jdk6 yourjdk6
(See the OpenJDK Developer Guide for more information on how to set up Mercurial and the forest extension.)

A few important notes:

  • These should be treated as experimental and read-only; the official ones should arrive next week
  • They should match the contents of the OpenJDK6 source bundles, except:
    • No control directory; these files are in the top repository now
    • Previously you had to 'cd control/make && gnumake', now it is just 'cd . && gnumake'
    • README-builds.html is in the top repository; its move has created a little confusion in the changesets, but ultimately we will have one copy.
  • Contributed changes should be documented in the changeset comments; if the contribution information is missing, please let me know
  • These repositories were created from the TeamWare workspaces plus a set of patches and documentation on those patches, and we may have to re-create them. If we re-create the repositories again, the old ones will not be related to the new ones, so any changesets you create with your clones should be viewed as temporary until the final repositories are put in place.
  • The hotspot repository may be completely replaced when we upgrade to HS14, so when that happens you may need to re-clone the hotspot repository.

Please let me know if you see anything wrong with these repositories.

The target date for the official repositories is next week; once they are official we can add more changesets to correct problems, but we can't go back and change the changesets already created.

-kto

Sunday Nov 23, 2008

DVM source repository (DTrace probe VM agents)

The Mercurial repository with the source for the DVM VM agents (source bundles are available at solaris10-dtrace-vm-agents.dev.java.net) is now hosted on the kenai.com site at kenai.com/projects/dvm.

Also see the dvm wiki page.

A Mercurial repository can be obtained by using:

hg clone https://kenai.com/hg/dvm~mercurial dvm

-kto

About

Various blogs on JDK development procedures, including building, build infrastructure, testing, and source maintenance.
