Sunday Dec 08, 2013

JSR 269 Maintenance Review for Java SE 8

The annotation processing API, both the processor-specific portion in javax.annotation.processing and the language modeling portions in javax.lang.model.*, is being updated to support the new language features in Java SE 8. Procedurally, the proposed changes are covered by the second maintenance review of JSR 269: Maintenance Draft Review 2.

As summarized on the maintenance review page, there are three categories of changes from the version of the API shipped with Java SE 7:

  1. Cleaning up the existing specification without changing its semantics (adding missing javadoc tags, etc.)
  2. API changes to support the language changes being made in Project Lambda / JSR 335. These include adding javax.lang.model.type.IntersectionType as well as javax.lang.model.element.ExecutableElement.isDefault.
  3. API changes to support the language changes being made under JSR 308, Annotations on Java Types. These include javax.lang.model.AnnotatedConstruct and updating javax.annotation.processing.Processor.

The small repeating annotations language change, discussed on an OpenJDK alias, is also supported by the proposed changes.

A detailed specification difference is available. Please post any comments here or send them to me through email.

Monday May 20, 2013

javax.lang.model backed by core reflection

Back when the javax.lang.model API was being designed as part of JSR 269, the API was primarily intended for use at compile time with annotation processing, but the expert group also wanted it to be usable in other contexts, including at runtime.

JEP 119, javax.lang.model Implementation Backed by Core Reflection, proposed adding such an alternative runtime implementation of javax.lang.model to JDK 8. Such an implementation has recently been pushed as sample code in the JDK 8 langtools repository.

At a high level, the sample code is at its core repeated uses of the adapter pattern, translating between the core reflection API and the javax.lang.model API. However, there were a number of design issues in what interface the javax.lang.model implementation for core reflection should expose. First, should a core reflection specialization of javax.lang.model expose a more specific interface? There are some advantages to just implementing the base API defined in the existing interfaces; it is the simplest approach and would appear to maximize the ability to inter-operate with other implementations. However, javax.lang.model depends on hidden state to define concepts like equality, so there is intrinsically limited inter-operation between disjoint implementations. Therefore, especially in sample code, it was viewed as worthwhile to experiment with a more specific API in the core reflection case.

To make the javax.lang.model backed by core reflection more specific, for each FooElement interface in javax.lang.model.element, a ReflectionFooElement subinterface was defined, as shown in the diagram below.

The subinterfaces are specialized in a few ways:

  • If a method in the base interface was defined to return a FooElement, a covariant override in the subinterface was defined to return a ReflectionFooElement.
  • If a method in the base interface was defined to return a List<? extends FooElement>, a covariant override in the subinterface was defined to return a List<ReflectionFooElement>.
  • For getBar(FooElement arg) methods defined in the Elements helper interface, add a getBar() method to the ReflectionFooElement subinterface.
  • Add a getSource method to the root ReflectionElement interface to return the underlying core reflection object being adapted and add covariant overrides in subinterfaces as appropriate.
  • The root ReflectionElement interface implements the full AnnotatedElement interface, not just the subset of its methods found in the base javax.lang.model.element.
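As a sketch of the first two specializations, consider the simplified stand-in interfaces below. BaseElement and ReflectionBaseElement are invented names for this illustration; the real interfaces are javax.lang.model.element.Element and the sample code's ReflectionElement, each with many more methods:

```java
import java.lang.reflect.AnnotatedElement;
import java.util.List;

// Stand-in for javax.lang.model.element.Element (greatly simplified).
interface BaseElement {
    BaseElement getEnclosingElement();
    List<? extends BaseElement> getEnclosedElements();
}

// The reflection-specific subinterface narrows each Element-returning
// method with a covariant override and exposes the adapted object.
interface ReflectionBaseElement extends BaseElement {
    @Override
    ReflectionBaseElement getEnclosingElement();

    @Override
    List<ReflectionBaseElement> getEnclosedElements();

    // The underlying core reflection object being adapted.
    AnnotatedElement getSource();
}
```

Note that List<ReflectionBaseElement> is a subtype of List<? extends BaseElement>, which is what makes the second covariant override legal.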

A cost of specializing the API is the need to define a new visitor interface as well. Covariant overrides cannot be used as in the element modeling interfaces since the visitor methods take elements in the argument position rather than the return position.

Generally, working on the core reflection specialization proceeded as expected. Equality determinations were generally delegated to the core reflection source object; in other words, two ReflectionElement objects are equal if they are instances of the same interface and if their sources are .equals. Writing the sample code highlighted several shortcomings of the core reflection API which have been addressed in Java SE 8, including the addition of an Executable type to abstract over the commonalities of Method and Constructor. The sample code also benefited from other reflection changes in Java SE 8, such as the introduction of a java.lang.reflect.Parameter class and retrofitting TypeVariable to extend AnnotatedElement.
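The delegation policy can be sketched as follows; ReflectionTypeElementSketch is a hypothetical adapter over java.lang.Class, not a class from the actual sample code:

```java
// Hypothetical adapter: equality and hashing delegate to the adapted
// core reflection object, so two adapters over the same Class are equal.
final class ReflectionTypeElementSketch {
    private final Class<?> source;

    ReflectionTypeElementSketch(Class<?> source) {
        this.source = source;
    }

    public Class<?> getSource() {
        return source;
    }

    @Override
    public boolean equals(Object o) {
        return o instanceof ReflectionTypeElementSketch
            && source.equals(((ReflectionTypeElementSketch) o).source);
    }

    @Override
    public int hashCode() {
        return source.hashCode();
    }
}
```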

While significantly more testing would be needed to productize javax.lang.model backed by core reflection, even as sample code it validates a number of the technological decisions made in JSR 269 and is an interesting demonstration of how one of the platform's reflective APIs can be bridged to another.

Saturday Sep 29, 2012

Annotation Processing Virtual Mini-Track at JavaOne 2012

Putting together the list of JavaOne talks I'm interested in attending, I noticed there is a virtual mini-track on annotation processing and related technology this year, with a combination of BOFs, sessions, and a hands-on lab:

As the lead engineer on both apt (rest in peace) in JDK 5 and JSR 269 in JDK 6, I'd be heartened to see greater adoption and use of annotation processing by Java developers.

Friday Feb 10, 2012

The passing of apt

With a pair of changesets pushed recently, the time for apt to be included in the JDK has drawn to a close, nearly eight years after first being added to the platform. In the Mythical Man-Month sense, apt was always planned to be the one we threw away, we just didn't realize how slow the windup and pitch would be!

The API, but not the implementation, of apt was among the first components of the JDK to be released as open source. I learned a lot about technologies and project management while working on apt, and it was quite satisfying to carry those lessons over to the "second system" of annotation processing in JSR 269.

Friday Dec 02, 2011

An apt ending draws nigh

I brought you into this world, and I'll take you out!
—Cliff Huxtable

The end of an era draws nigh! After being deprecated in JDK 7, the apt command line tool and the entirety of its associated API are on track to be removed from JDK 8 within the next few months. While apt was fine back in JDK 5, the time has come to transition annotation processing to the superior standardized annotation processing provided by javax.annotation.processing and javax.lang.model.*. These packages were added to Java SE 6 under JSR 269.

This removal effort was discussed in JEP 117: Remove the Annotation-Processing Tool (apt).

Portions of jax-ws in the JDK use apt, but those portions are being rewritten to use the JSR 269 APIs. Once that revised version of jax-ws is being used by the JDK builds, apt will be excised in short order.

As a com.sun.* API, apt is not part of Java SE; it is just a component of the JDK and is thus easier to remove from the platform. While I was the lead in creating apt, lo these many years ago, I'm looking forward to deleting the code from the JDK to encourage use of a better replacement API and to ease maintenance of javac.

Monday Apr 25, 2011

JSR 269 Maintenance Review Concludes

The previously discussed maintenance review of JSR 269, after a slight extension, has concluded. Compared to the start of the cycle, the only change made during the review period was to rename the "disjunction"/"disjunctive" API elements to "union."

Monday Mar 14, 2011

JSR 269 Maintenance Review

As a planned part of Java SE 7, JSR 269, which standardized an API for annotation processing, is now undergoing maintenance review. In the JCP, a maintenance review is a process to take comments on small changes so that those small changes can be formally incorporated into an existing specification without running a whole new JSR. The changes being proposed in the JSR 269 maintenance review are the changes already implemented in the JSR 269 APIs in JDK 7. In summary, those proposed changes are:

  • Clarified interaction between the Filer and rounds.

  • Constructors explicitly added to the kinds of elements that can be returned by RoundEnvironment.getElementsAnnotatedWith.

  • New enum constant javax.lang.model.SourceVersion.RELEASE_7.

  • In the package description of javax.lang.model.element, requirements on when a model must be provided are loosened to remove the requirement in case of an "irrecoverable error that could not be removed by the generation of new types," a condition which includes but is not limited to syntax errors.

  • New exception type javax.lang.model.UnknownEntityException added as a common superclass for existing exception types UnknownAnnotationValueException, UnknownElementException, and UnknownTypeException.

  • New enum constant javax.lang.model.element.ElementKind.RESOURCE_VARIABLE.

  • New mixin interfaces Parameterizable and QualifiedNameable added to package javax.lang.model.element. ExecutableElement and TypeElement are retrofitted to extend Parameterizable; PackageElement and TypeElement are retrofitted to extend QualifiedNameable.

  • Behavior of getEnclosingElement method defined to return the generic element of a type parameter instead of null.

  • New interface javax.lang.model.type.DisjunctiveType to model disjunctive types.

  • New enum constant javax.lang.model.type.TypeKind.DISJUNCTIVE to mark disjunctive types.

  • New method visitDisjunctive added to visitor interface javax.lang.model.type.TypeVisitor. Utility visitor implementations updated accordingly.

  • In the package javax.lang.model.type, MirroredTypesException retrofitted to be the superclass of MirroredTypeException.

  • New utility visitors for release 7 in package javax.lang.model.util:

    • AbstractAnnotationValueVisitor7

    • AbstractElementVisitor7

    • AbstractTypeVisitor7

    • ElementKindVisitor7

    • ElementScanner7

    • SimpleAnnotationValueVisitor7

    • SimpleElementVisitor7

    • SimpleTypeVisitor7

    • TypeKindVisitor7

  • The visitors ElementKindVisitor6, ElementScanner6, and SimpleElementVisitor6, are updated to account for new element kind RESOURCE_VARIABLE.

  • The visitor AbstractTypeVisitor6 is updated to account for the possibility of visiting a DisjunctiveType.

  • Definition of documentation comment added to javadoc of javax.lang.model.util.Elements.getDocComment.

Monday Nov 15, 2010

Original apt API files

With the transition of the site to a new infrastructure, the time has come to retire the long-hosted but dormant apt mirror API project. The old source bundles in the project are just the JDK 5 apt API; the implementation was not included. This apt project predated OpenJDK; with OpenJDK, the apt API and implementation (and much, much more!) have been available under an open source license for several years. Additionally, annotation processing should transition away from the first-generation apt to the much-improved and standardized JSR 269 annotation processing available since JDK 6 and supported directly in javac. Moreover, the apt tool and API have been deprecated as of JDK 7.

Purely for historical interest, I'm placing an archive of the apt source bundles on this blog:

Tuesday Jul 06, 2010

Project Coin: Bringing it to a Close(able)

As a follow-up to the initial API changes to support automatic resource management (ARM) I wrote an annotation processor, CloseableFinder, to programmatically look for types that were candidates to be retrofitted as Closeable or AutoCloseable.

The processor issues a note for a type that has a public no-args instance method returning void whose name is "close" where the type does not already implement/extend Closeable or AutoCloseable. Based on the exceptions a close method is declared to throw, the processor outputs whether the type is a candidate to be retrofitted to just AutoCloseable or to either of Closeable and AutoCloseable. Which of Closeable and AutoCloseable is more appropriate can depend on the semantics of the close method not captured in its signature. For example, Closeable.close is defined to be idempotent: repeated calls to close have no effect. If a close method is defined to not be idempotent, without changing the specification the type can only be correctly retrofitted to AutoCloseable.
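The signature-based part of that decision can be sketched as below. RetrofitChoice and candidateFor are invented names, and the only constraint modeled here is the declared exception types (Closeable.close() is declared to throw IOException, while AutoCloseable.close() may throw Exception):

```java
import java.io.IOException;

public class RetrofitChoice {
    // If every declared exception is an IOException (or a subtype), the
    // narrower Closeable contract is satisfiable; otherwise only
    // AutoCloseable, whose close() may throw Exception, fits.
    static String candidateFor(Class<?>... thrownTypes) {
        for (Class<?> t : thrownTypes) {
            if (!IOException.class.isAssignableFrom(t)) {
                return "AutoCloseable only";
            }
        }
        return "Closeable or AutoCloseable";
    }

    public static void main(String... args) {
        System.out.println(candidateFor(IOException.class));
        System.out.println(candidateFor(InterruptedException.class));
        System.out.println(candidateFor()); // no checked exceptions declared
    }
}
```

The real processor performs this reasoning over javax.lang.model types (ExecutableElement.getThrownTypes()) rather than Class objects, and also checks whether the enclosing type already implements one of the interfaces.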

To use the processor, first compile it and then configure your compiler or IDE to run the processor. The processor can be compiled under JDK 6. Once compiled, it can be run either under JDK 6 or under a JDK 7 build that has the AutoCloseable interface; the processor will configure itself appropriately based on the JDK version it is running under. For javac, the command line to run the processor can look like:

javac -proc:only \
      -processor CloseableFinder \
      -processorpath Path_to_processor \
      SourceFiles

A thread on build-dev discusses how to run an annotation processor over the JDK sources; a larger than default heap size may be needed to process all the files in one command. When run over the JDK 7 sources, the processor finds many candidate types to be retrofitted. After consulting with the teams in question, an additional nine types were retrofitted to work with ARM, two in java.beans, two in, one in java.util, and four in javax.sound; these additional retrofittings have been pushed into JDK 7 and will appear in subsequent builds.

Besides the potential updating of JDBC at some point in the future, other significant retrofitting of JDK classes in java.* and javax.* to AutoCloseable/Closeable should not be expected. Unofficial JDK APIs in other namespaces might be examined for retrofitting in the future. The compiler changes to support the ARM language feature remain in progress.

Monday Mar 15, 2010

Beware of Covariant Overriding in Interface Hierarchies

One of the changes to the Java programming language made back in JDK 5 was the introduction of covariant returns, that is, the ability in a subtype to override a method in a supertype and return a more specific type. For example,

public class A {
  public Object method() { return null; }
}

public class B extends A {
  @Override
  public String method() { return ""; }
}
Covariant returns can be a very handy facility to more accurately convey the type of object returned by a method. However, the feature should be used judiciously, especially in interface hierarchies. In interface hierarchies, covariant returns force constraints on the implementation classes. Such a constraint was included in the apt API modeling the Java language and was subsequently removed from the analogous portion of the standardized JSR 269 API in javax.lang.model.*.

In apt, the TypeDeclaration interface defines a method
Collection<? extends MethodDeclaration> getMethods().
In the sub-interface ClassDeclaration, the method is overridden with
Collection<MethodDeclaration> getMethods()
and in another sub-interface, AnnotationTypeDeclaration, the method is overridden with
Collection<AnnotationTypeElementDeclaration> getMethods().
Consequently, it is not possible for a single class to implement both the ClassDeclaration and AnnotationTypeDeclaration interfaces since the language specification forbids having two methods with the same name and argument types but different return types (JLSv3 §8.4.2). (This restriction does not exist at the class file level and a compiler will generate synthetic bridge methods with this property when implementing covariant returns.) If a compiler chose to use a single type to model all kinds of types (classes, enums, interfaces, annotation types), it would not be able to be directly retrofitted to implement the entirety of this apt API; wrapper objects would need to be created just to allow the interfaces to be implemented at a source level.
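The class-file-level bridge methods mentioned parenthetically above can be observed directly with core reflection. In this small self-contained demo (class and method names invented for illustration), the subclass declares two method() methods, the covariant override and a compiler-generated bridge:

```java
import java.lang.reflect.Method;

public class BridgeMethodDemo {
    static class A { public Object method() { return null; } }
    static class B extends A { public String method() { return ""; } }

    public static void main(String... args) {
        // B's class file holds two method() methods: the String-returning
        // override and a synthetic bridge returning Object.
        for (Method m : B.class.getDeclaredMethods()) {
            System.out.println(m.getReturnType().getSimpleName()
                               + " method()  bridge=" + m.isBridge());
        }
    }
}
```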

In contrast, in JSR 269 the root modeling interface Element defines a List<? extends Element> getEnclosedElements() method which returns all kinds of enclosed elements, from fields, to constructors, to methods. Elements of a particular kind can then be extracted using a filter. This approach provides more flexibility in retrofitting the interfaces onto an existing implementation; a spectrum of implementations are possible, from a single type to represent all sorts of elements to a one-to-one correspondence of implementation types to interface types.
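The filtering step uses javax.lang.model.util.ElementFilter. Since no processing environment is available in a standalone program, the sketch below (FilterDemo is an invented name) just exercises the call on an empty list; inside a processor, the argument would come from a TypeElement's getEnclosedElements():

```java
import java.util.Collections;
import java.util.List;
import javax.lang.model.element.Element;
import javax.lang.model.element.ExecutableElement;
import javax.lang.model.util.ElementFilter;

public class FilterDemo {
    // Given the mixed list from Element.getEnclosedElements(), extract
    // only the methods; fieldsIn, constructorsIn, typesIn, etc. are analogous.
    static List<ExecutableElement> methodsOf(List<? extends Element> enclosed) {
        return ElementFilter.methodsIn(enclosed);
    }

    public static void main(String... args) {
        System.out.println(methodsOf(Collections.<Element>emptyList()).size());
    }
}
```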

Note that in cases where an implementation type collapses several interface types, instanceof checks against an interface type are not necessarily useful, since implementing one interface does not preclude implementing other related interfaces. The Element specification warns of this possibility:

To implement operations based on the class of an Element object, either use a visitor or use the result of the getKind() method. Using instanceof is not necessarily a reliable idiom for determining the effective class of an object in this modeling hierarchy since an implementation may choose to have a single object implement multiple Element subinterfaces.

Friday Mar 12, 2010

Last Round Compiling

As of build 85 of JDK 7, bug 6634138 "Source generated in last round not compiled" has been fixed in javac. Previously, source code generated in a round of annotation processing where RoundEnvironment.processingOver() was true was not compiled. With the fix, source generated in the last round is compiled, but, as intended, while compiled such source still does not undergo annotation processing since processing is over. The fix has also been applied to OpenJDK 6 build 19.

Annotation Processor SourceVersion

In annotation processing there are three distinct roles: the author of the annotation types, the author of the annotation processor, and the client of the annotations. The third role includes the responsibility to configure the compiler correctly, such as setting the source, target, and encoding options and setting the source and class file destination for annotation processing. The author of the annotation processor shares a related responsibility: properly returning the source version supported by the processor.

Most processors can be written against a particular source version and always return that source version, such as by including a @SupportedSourceVersion annotation on the processor class. In principle, the annotation processing infrastructure could tailor the view of newer-than-supported language constructs to be more compatible with existing processors. Conversely, processors have the flexibility to implement their own policies when encountering objects representing newer-than-supported structures. In brief, by extending version-specific abstract visitor classes, such as AbstractElementVisitor6 and AbstractTypeVisitor6, the visitUnknown method will be called on entities newer than the version in question.

Just as regression tests inside the JDK itself should by default follow a dual policy of accepting the default source and target settings rather than setting them explicitly like other programs, annotation processors used for testing with the JDK should generally support the latest source version and not be constrained to a particular version. This allows any issues or unexpected interactions of new features to be found more quickly and keeps the regression tests exercising the most recent code paths in the compiler.
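A processor following the "support the latest source version" policy can override getSupportedSourceVersion instead of using a fixed @SupportedSourceVersion annotation. LatestVersionProcessor is an invented name for this sketch:

```java
import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.TypeElement;

@SupportedAnnotationTypes("*")
public class LatestVersionProcessor extends AbstractProcessor {
    @Override
    public SourceVersion getSupportedSourceVersion() {
        // Track the platform rather than pinning a RELEASE_N constant.
        return SourceVersion.latest();
    }

    @Override
    public boolean process(Set<? extends TypeElement> annotations,
                           RoundEnvironment roundEnv) {
        return false; // claim no annotations; a real processor does work here
    }
}
```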

This dual policy is now consistently implemented in the langtools regression tests as of build 85 of JDK 7 (6926699).

Wednesday Feb 24, 2010

API Design: Identity and Equality

When designing types to be reused by others, there are reasons to favor interfaces over abstract classes. One complication of using an interface-based approach stems from defining reasonable behavior for the equals and hashCode methods, especially if different implementations are intended to play well together when used in data structures like collections, in particular if an interface type is meant to serve as the key of a map or as the element type of a set.

Some interfaces, like CharSequence, are designed to not be a usable type for a map key or an element type of a set:

[The CharSequence] interface does not refine the general contracts of the equals and hashCode methods. The result of comparing two objects that implement CharSequence is therefore, in general, undefined. Each object may be implemented by a different class, and there is no guarantee that each class will be capable of testing its instances for equality with those of the other. It is therefore inappropriate to use arbitrary CharSequence instances as elements in a set or as keys in a map.

Amongst other problems, CharSequences are not required to be immutable, so in general there are always hazards from time-of-check to time-of-use conditions.

Even if a type is not suitable as a map key, it can be fine as the type of the value to which a key gets mapped. Likewise, even if a type cannot serve as the element type of a set, it can often still be perfectly fine as the element type of a list.

Expanding on a slide from my JavaOne talk Tips and Tricks for Using Language Features in API Design and Implementation, for interface types intended to be used as map keys or set elements, equality can be defined in several ways. First, equality can be defined solely in terms of information retrievable from methods of the interface. Alternatively, equality can be defined in terms of information retrievable via the interface methods as well as additional information. Finally, object identity (the == relation) is always a valid definition for equals and often a good implementation choice.

An example of the first kind of equality definition is specified for annotation types:

Returns true if the specified object represents an annotation that is logically equivalent to this one. In other words, returns true if the specified object is an instance of the same annotation type as this instance, all of whose members are equal to the corresponding member of this annotation, as defined below: ...

Returns the hash code of this annotation, as defined below:
The hash code of an annotation is the sum of the hash codes of its members (including those with default values), as defined below:...

A consequence of defining equality in this manner is that the hashCode algorithm must also be specified. If it were not specified, the equals/hashCode contract would be violated since equal objects must have equal hashCodes. Therefore, different implementations of this style of interface must have enough information to implement the equals method and have a precise algorithm for hashCode.

An annotation type is a kind of interface. At runtime, dynamic proxies are used to create the core reflection objects implementing annotation types, such as the objects returned by the getAnnotation method. After a quick identity check, the equals algorithm used in the proxy sees if the annotation type of the two annotation objects is the same and then compares the results of the annotation type's methods. This indirection allows the annotation objects from core reflection to interact properly with other implementations of annotation objects. The annotation objects generated for annotation processing in apt and javac both use the same underlying implementation as core reflection. However, completely independent annotation implementations are fine too. For example, the code below

import javax.annotation.processing.*;
import javax.lang.model.SourceVersion;
import java.lang.annotation.*;
import java.lang.reflect.*;
import java.util.*;

/**
 * Demonstrate equality of different annotation implementations.
 */
// The annotation on the class lets core reflection supply its own
// implementation; the particular source version chosen is illustrative.
@SupportedSourceVersion(SourceVersion.RELEASE_6)
public class AnnotationEqualityDemonstration {
    static class MySupportedSourceVersion implements SupportedSourceVersion {
        private final SourceVersion sourceVersion;

        private MySupportedSourceVersion(SourceVersion sourceVersion) {
            this.sourceVersion = sourceVersion;
        }

        public Class<? extends Annotation> annotationType() {
            return SupportedSourceVersion.class;
        }

        public SourceVersion value() {
            return sourceVersion;
        }

        public boolean equals(Object o) {
            if (o instanceof SupportedSourceVersion) {
                SupportedSourceVersion ssv = (SupportedSourceVersion) o;
                return ssv.value() == sourceVersion;
            }
            return false;
        }

        public int hashCode() {
            return (127 * "value".hashCode()) ^ sourceVersion.hashCode();
        }
    }

    public static void main(String... args) {
        SupportedSourceVersion reflectSSV =
            AnnotationEqualityDemonstration.class.getAnnotation(SupportedSourceVersion.class);
        SupportedSourceVersion localSSV =
            new MySupportedSourceVersion(reflectSSV.value());

        System.out.println("reflectSSV == localSSV is " +
                           (reflectSSV == localSSV));

        System.out.println("reflectSSV.equals(localSSV) is " +
                           reflectSSV.equals(localSSV));

        System.out.println("localSSV.equals(reflectSSV) is " +
                           localSSV.equals(reflectSSV));

        System.out.println("reflectSSV.getClass().equals(localSSV.getClass()) is " +
                           reflectSSV.getClass().equals(localSSV.getClass()));

        System.out.println("\nreflectSSV.hashCode() is " +
                           reflectSSV.hashCode());

        System.out.println("localSSV.hashCode()   is " +
                           localSSV.hashCode());
    }
}

when run outputs:

reflectSSV == localSSV is false
reflectSSV.equals(localSSV) is true
localSSV.equals(reflectSSV) is true
reflectSSV.getClass().equals(localSSV.getClass()) is false

reflectSSV.hashCode() is 1867635603
localSSV.hashCode()   is 1867635603

The second kind of equality definition is specified for the language modeling interfaces in the javax.lang.model.element package:

Note that the identity of an element involves implicit state not directly accessible from the element's methods, including state about the presence of unrelated types. Element objects created by different implementations of these interfaces should not be expected to be equal even if "the same" element is being modeled; this is analogous to the inequality of Class objects for the same class file loaded through different class loaders.

Inside javac, instance control is used for the implementation classes for javax.lang.model.element.Element subtypes. This allows the default pointer equality to be used and allows the hashing algorithm to not be specified. Just as you can't step in the same river twice, the identity of an Element object is tied to the context in which it is created. Operationally, one consequence of this context sensitivity is that Element objects modeling "the same" type produced during different rounds of annotation processing will not be equal even if there are equivalent methods, fields, constructors, etc. in both types in both rounds.

When independent implementations of an interface are not required to be equal to one another, the hashCode algorithm does not need to be specified, giving the implementer more flexibility. This second style of specification allows disjoint islands of implementations to be defined.

Which style of specification is more appropriate depends on how the interface type is intended to be used. Defining interoperable implementations is more difficult and limits the ability of the interface to be retrofitted onto existing types. For example, while the Element interface and other interfaces from JSR 269 were successfully implemented by classes in both javac and Eclipse, it would be impractical to expect Element objects from those disparate implementations to compare as equal. Mixin interfaces, like CharSequence and Closeable, should be cautious in defining equals behavior if the interface is intended to be widely implemented. In some cases, a mixin interface can finesse this issue by being limited to an existing type hierarchy with already defined equals and hashCode policies. For example, the Parameterizable and QualifiedNameable interfaces added to the javax.lang.model.element package in JDK 7 (6460529) are extensions of javax.lang.model.element.Element and therefore get to reuse the existing policies quoted above.



