
JDT Core/Null Analysis/Options

Disclaimer
This page is currently subject to discussion and does not necessarily describe the actual implementation in the JDT.


The analysis of possible null pointer exceptions performed by the Eclipse Java Compiler is powerful, but its full power may result in a large number of errors and warnings. Not all projects can afford to address all these issues.

One potential solution might be to make the compiler "smarter" by adding heuristics to "guess" which problems are probably uninteresting, and which ones are the real bugs.

This page describes the opposite approach: let the compiler soberly perform its analysis without any bias towards a particular coding style, but empower the user to enable this analysis step by step, gradually achieving more and more safety.

The following sections each address one potential reason why null pointer problems might remain undetected, pointing at strategies for avoiding each particular kind of risk.

Incomplete Specification

If you start with an existing project and merely turn on annotation-based null analysis, you will not notice any difference. This is because the analysis implicitly supports three kinds of reference types:

  • @NonNull types
  • @Nullable types and
  • types with unspecified nullness (legacy types).

So before you actually add annotations to your methods and fields, all types are considered legacy types, and the unsafe semantics of plain Java apply: both assigning null and dereferencing are legal.
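
To make the three kinds concrete, here is a minimal sketch; it assumes the annotation types from org.eclipse.jdt.annotation are on the build path, and all other names are invented for illustration:

   import org.eclipse.jdt.annotation.NonNull;
   import org.eclipse.jdt.annotation.Nullable;

   public class Kinds {
       void demo(@NonNull Object p, @Nullable Object q, Object r) {
           p.toString();         // safe: p can never be null
           if (q != null)
               q.toString();     // safe only after the null check
           r.toString();         // legacy type: no complaint, but may still throw an NPE at runtime
       }
   }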

Adding individual annotations

This path is obvious: add just those annotations where you are certain about the design intent, and after each added annotation address the problems reported by the compiler.
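
For instance, after annotating a return as @Nullable, the compiler reports every caller that dereferences the result without a check, and each report can be addressed in turn. The method and class names below are invented for illustration:

   import org.eclipse.jdt.annotation.Nullable;

   public class Registry {
       // Step 1: capture the design intent - lookup may legitimately return null.
       @Nullable Object lookup(String key) {
           return "admin".equals(key) ? new Object() : null;
       }

       // Step 2: address the problems the compiler now reports at the call sites.
       void use(String key) {
           Object value = lookup(key);
           // value.toString();      // reported: potential null pointer access
           if (value != null)
               value.toString();     // OK after the check
       }
   }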

Making nonnull the default

Instead of proceeding in tiny steps you may want to take larger jumps (cf. bug 331647):

  • annotate a method as @NonNullByDefault to affect all its parameters and its return
  • annotate a type as @NonNullByDefault to affect all its methods and fields
  • annotate a package as @NonNullByDefault (using package-info.java) to affect all its types (see the sketch after this list)
  • define a global policy that all packages should be @NonNullByDefault
    Unfortunately, a global default cannot be established directly, but only via the indirection of package-level defaults (see bug 366063 for background).
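
To make the package-level and type-level defaults concrete, here is a minimal sketch; the package and class names are invented for illustration, and the annotation is assumed to come from org.eclipse.jdt.annotation:

   // file: org/example/data/package-info.java
   @org.eclipse.jdt.annotation.NonNullByDefault
   package org.example.data;

   // file: org/example/data/Store.java
   package org.example.data;

   public class Store {
       // The package default makes both the parameter and the return value
       // @NonNull without any explicit annotation on this method:
       public String normalize(String input) {
           return input.trim();
       }
   }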

A particular problem arises when a type affected by @NonNullByDefault is a subtype of a legacy type with legacy signatures. It is illegal to override a method with legacy parameters by a method with @NonNull parameters. Specifying @Nullable for all parameters in such an overriding method may be too pessimistic, forcing the method body to do more null checks than actually useful.

For this situation a parameter has been added to the default annotation: annotating such a subtype of a legacy type with @NonNullByDefault(false) cancels the otherwise applicable default.
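
A sketch of this escape hatch, using the boolean form of the annotation described above; the legacy supertype and all names are invented for illustration:

   import org.eclipse.jdt.annotation.NonNullByDefault;

   // Legacy library type, compiled without any null annotations:
   class LegacyHandler {
       void handle(Object event) { /* ... */ }
   }

   // This subclass must override a legacy signature, so the otherwise
   // applicable @NonNullByDefault is cancelled for this type:
   @NonNullByDefault(false)
   class MyHandler extends LegacyHandler {
       @Override
       void handle(Object event) {   // parameter keeps its unspecified (legacy) nullness
           if (event != null)
               System.out.println(event);
       }
   }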

Third-party code

Every project depends on third-party code which it doesn't control, so adding null annotations to that code is not directly possible.

To address this issue, the Eclipse Java Compiler should support nullity profiles, aka external annotations: separate files that capture the factual null contracts of all API methods and fields contained in a given library.

Support for this feature is planned for version 3.9, see bug 331651.

Avoiding the risk of incompleteness

In order to guarantee that no unchecked legacy types are used in an application, the following must hold:

  • @NonNullByDefault is the globally enforced default
  • @NonNullByDefault(false) is never used
  • all libraries come with API-complete nullity profiles

Each of these steps brings a project closer to the safety guarantees of complete analysis.

TODO: Issue a configurable warning when @NonNullByDefault(false) is used.

TODO: Add a note to bug 331651 for checking complete coverage of referenced libraries by available nullity profiles.

Side effects

The simplest problem with analysing the nullness of fields results from side effects in methods:

   if (this.f != null)
     System.out.println(this.toString() + this.f.toString());

In contrived situations the execution of this.toString() could assign null to f, thus invalidating the above null check.
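
A contrived sketch of such a side effect; the class and field names are invented for illustration:

   class Logger {
       Object f = new Object();

       @Override
       public String toString() {
           f = null;            // side effect: invalidates any earlier null check on f
           return "Logger";
       }

       void log() {
           if (this.f != null)
               // this.toString() is evaluated first and sets f to null,
               // so this.f.toString() throws a NullPointerException:
               System.out.println(this.toString() + this.f.toString());
       }
   }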

Avoiding risks of side effects regarding null analysis

The most common model for preventing side effects from spoiling null analysis is to discard any nullness information for fields whenever a method call is seen. In the above example this would imply that the call this.f.toString() is not considered safe.

Aliasing

When analysing the nullness of fields, the following snippet demonstrates how aliasing threatens the validity of the analysis:

class X {
    Object f;
    void foo(X other) {
        if (this.f != null) {
            other.f = null;
            System.out.println(this.f.toString()); // potential NPE
        }
    }
    void breakIt() {
        foo(this); // definitely triggers the NPE
    }
}

Without further annotations an intra-procedural analysis cannot see that the assignment to other.f affects the value of this.f.

A potential remedy would be to discard all null information for a field once an assignment to the same field is seen, regardless of the receiver. This would put the analysis in a state where at the toString() invocation we have no information about f's null status.

Another remedy is to restrict analysis of fields by a simplistic implied ownership model: consider that each non-static field is owned by the enclosing object and that only references via this have permission to update the field. Conversely, only for read access via this can we assume that our analysis knows the field's null status. This analysis would falsely consider the above example as safe. If, additionally, assignments to foreign fields like other.f were prohibited, the bug could be detected, regaining safety.

More strategies could be conceived, but without ownership annotations none of them is fully satisfactory. Their usefulness highly depends on the coding style: if objects strictly encapsulate their fields, the poor man's ownership model above would be suitable; if access to foreign fields is common, other rules are a better fit.

Avoiding the risk of aliasing regarding null analysis

Null analysis for the following kinds of fields is actually unaffected by aliasing:

  • final fields
  • @NonNull fields

For both kinds of fields only object initialization may produce unexpected results (yes, even Java's definite assignment rule can be circumvented, as the sketch below shows). Once initialized, their null status is either known or amenable to flow analysis.
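
A sketch of how object initialization can circumvent definite assignment: a virtual call from a superclass constructor observes a final field before it has been assigned. All names are invented for illustration.

   class Base {
       Base() {
           report();            // virtual call from the constructor ...
       }
       void report() { }
   }

   class Sub extends Base {
       final Object f;

       Sub() {
           super();             // ... reaches Sub.report() before f is assigned
           this.f = new Object();
       }

       @Override
       void report() {
           // Here f is still null, although it is declared final and is
           // definitely assigned from the compiler's point of view:
           System.out.println(this.f);   // prints "null"
       }
   }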

For non-final @Nullable fields the simplest strategy for avoiding nullness-related risks due to aliasing is to pessimistically assume potential null at every read. This means that for strict checking no flow analysis should be applied to @Nullable fields.

While this appears to be a very drastic restriction, the remedy is quite easy: before dereferencing a @Nullable field it has to be assigned to a local variable. Flow analysis is then safely applied to the local variable with no risk of aliasing, since local variables are truly owned.
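
A minimal sketch of this pattern, assuming the org.eclipse.jdt.annotation types are on the build path; the class and field names are invented:

   import org.eclipse.jdt.annotation.Nullable;

   class Cache {
       @Nullable Object entry;

       void dump() {
           final Object local = this.entry;   // snapshot the field into a local variable
           if (local != null)
               System.out.println(local.toString());   // flow analysis on 'local' is safe
       }
   }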

Concurrency

Concurrency, too, can invalidate the guarantees of flow analysis. In a concurrent application not even this snippet is safe:

   if (this.f != null)
       System.out.println(this.f.toString());

Thus concurrency adds to the problems of side effects and aliasing, because null information from one expression may already be invalid when evaluating the very next expression.
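
A sketch of such a race; the names are invented for illustration. One thread passes the null check while another thread clears the field before the dereference:

   class Shared {
       Object f = new Object();

       void reader() {
           if (this.f != null)
               // another thread may have set f to null right after the check:
               System.out.println(this.f.toString());
       }

       void startClearer() {
           new Thread(new Runnable() {
               public void run() {
                   f = null;
               }
           }).start();
       }
   }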

A pragmatic remedy is to let nullness information "age" quickly. However, it is unclear how quick would be quick enough, given the pessimistic observation above.

Avoiding the risks of concurrency regarding null analysis

The same drastic restriction that already helped for aliasing helps here, too: don't rely on flow analysis for fields. Local variables are not subject to concurrent access (the only exception being local variables shared with local classes, but those must be final and thus cannot change their null status).

Summary

Any flow analysis performs much better for local variables than for fields. Shifting all serious computation to local variables makes for much safer code. Whether or not a project can afford to fully enforce this strategy depends on many factors.

Two exceptions exist:

  • @NonNull fields don't require any flow analysis (except during the phase of object initialization).
  • Flow analysis for unannotated final fields is useful to find out their null status. Once determined, this status cannot change (except again for initialization issues).

This lets me conclude that these questions deserve further investigation:

  • How can gradual migration to a fully safe coding style based on local variables be supported? I.e., what intermediate levels are useful, what configuration options are needed, and what warning messages best convey the vagueness/certainty of each issue?
  • How can the above options be communicated to users? Is it OK to offer options like "perform flow analysis for fields", or should options be labeled as "consider aliasing", or "pessimistically consider concurrency for null analysis"?
  • How can initialization be made safer? How relevant are these issues in practice? Which annotation / set of annotations is the best buy? Is support for several styles required?
