Today I’m going to talk about setting up the development environment for running the augmented reality demo shown in the video from my previous post.

Here are the steps:

  • Download and install Java Media Framework (JMF)
  • Download Java3D and place the native / java libraries in the JDK installation folder
  • Download NYArToolkit for Java and unzip somewhere on your machine
  • Download Trident
  • Print out one of the PDF markers in the NYArToolkit/Data folder. For my demo, I’m using pattHiro.pdf.
  • Import all the projects under NYArToolkit into your Eclipse workspace. You can ignore the JOGL and QuickTime ones for this specific demo.
  • In these projects, tweak the build path to point to the JMF / Java3D jars on your system.
  • Plug in your web camera.
  • Run the jp.nyatla.nyartoolkit.jmf.sample.NyarToolkitLinkTest class to verify that JMF has been installed correctly and can display the captured feed.
  • Run the jp.nyatla.nyartoolkit.java3d.sample.NyARJava3D class to test the tracking capabilities of NYArToolkit. Once the camera feed is showing, point the camera at the printed marker so that it is fully visible. Once the marker is on your screen, a colored cube should appear. Try rotating, moving and tilting the marker printout – all the while keeping it in the frame – to verify that the cube stays properly oriented.

The demo from the video above is available as part of the new Marble project. Here is how to set it up:

  • Sync the latest SVN tip of Marble.
  • Create an Eclipse project for Marble. It should have dependencies on NYArToolkit, NYArToolkit.utils.jmf, NYArToolkit.utils.java3d and Trident. It should also have jmf.jar, vecmath.jar, j3dcore.jar and j3dutils.jar in the build path.
  • Run the org.pushingpixels.marble.MarbleFireworks3D class. Follow the same instructions as above to point the webcam at the printed marker.

There’s not much to the code in this Marble demo. It follows the NyARJava3D sample class, but instead of static Java3D content (a color cube) it has a dynamic scene that is animated by Trident. For the code below, note that I’m definitely not an expert in Java3D and NYArToolkit, so there may well be a simpler way to do these animations. However, they are enough to get you started in exploring animations in Java-powered augmented reality.

Each explosion volley is implemented by a collection of Explosion3D objects. Each such object models a single explosion “particle”. Here is the constructor of the Explosion3D class:

public Explosion3D(float x, float y, float z, Color color) {
   this.x = x;
   this.y = y;
   this.z = z;
   this.color = color;
   this.alpha = 1.0f;

   // transform group that positions the sphere; marked as writable so that
   // the setters below can move the particle at runtime
   this.sphere3DTransformGroup = new TransformGroup();
   this.sphere3DTransformGroup
      .setCapability(TransformGroup.ALLOW_TRANSFORM_WRITE);
   Transform3D mt = new Transform3D();
   mt.setTranslation(new Vector3d(this.x, this.y, this.z));
   this.sphere3DTransformGroup.setTransform(mt);

   // appearance with writable transparency / coloring attributes
   this.appearance3D = new Appearance();
   this.appearance3D
      .setCapability(Appearance.ALLOW_TRANSPARENCY_ATTRIBUTES_WRITE);
   this.appearance3D
      .setCapability(Appearance.ALLOW_COLORING_ATTRIBUTES_WRITE);
   this.appearance3D.setColoringAttributes(new ColoringAttributes(color
      .getRed() / 255.0f, color.getGreen() / 255.0f,
      color.getBlue() / 255.0f, ColoringAttributes.SHADE_FLAT));
   this.appearance3D.setTransparencyAttributes(new TransparencyAttributes(
      TransparencyAttributes.BLENDED, 0.0f));

   // the single leaf of this group - a small sphere
   this.sphere3D = new Sphere(0.002f, appearance3D);
   this.sphere3DTransformGroup.addChild(this.sphere3D);

   // detachable branch group so that the explosion can be removed from the
   // live scene graph once its timeline is done
   this.sphere3DBranchGroup = new BranchGroup();
   this.sphere3DBranchGroup.setCapability(BranchGroup.ALLOW_DETACH);
   this.sphere3DBranchGroup.addChild(this.sphere3DTransformGroup);
}

Here, we have a bunch of Java3D code to create a group that can be dynamically changed at runtime. This group has only one leaf – the Sphere object.

As we’ll see later, the timeline behind this object changes its coordinates and the alpha (fading it out). Here is the relevant public setter for the alpha property:

public void setAlpha(float alpha) {
   this.alpha = alpha;

   this.appearance3D.setTransparencyAttributes(new TransparencyAttributes(
      TransparencyAttributes.BLENDED, 1.0f - alpha));
}

Not much here – just updating the transparency of the underlying Java3D Sphere object. The setters for the coordinates are quite similar:

public void setX(float x) {
   this.x = x;

   Transform3D mt = new Transform3D();
   mt.setTranslation(new Vector3d(this.x, this.y, this.z));
   this.sphere3DTransformGroup.setTransform(mt);
}

public void setY(float y) {
   this.y = y;
   Transform3D mt = new Transform3D();
   mt.setTranslation(new Vector3d(this.x, this.y, this.z));
   this.sphere3DTransformGroup.setTransform(mt);
}

public void setZ(float z) {
   this.z = z;
   Transform3D mt = new Transform3D();
   mt.setTranslation(new Vector3d(this.x, this.y, this.z));
   this.sphere3DTransformGroup.setTransform(mt);
}

The main class is MarbleFireworks3D. Its constructor is rather lengthy and has a few major parts. The first part initializes the camera and marker data for the NYArToolkit core:

// marker pattern definition (16x16 resolution) and camera parameters
NyARCode ar_code = new NyARCode(16, 16);
ar_code.loadARPattFromFile(CARCODE_FILE);
ar_param = new J3dNyARParam();
ar_param.loadARParamFromFile(PARAM_FILE);
ar_param.changeScreenSize(WIDTH, HEIGHT);

Following that, there’s a bunch of Java3D code that initializes the universe, locale, platform, body and environment, and creates the main transformation group. The interesting code is the one that creates the main scene group that will hold the dynamic collection of Explosion3D groups:

mainSceneGroup = new TransformGroup();
mainSceneGroup.setCapability(TransformGroup.ALLOW_TRANSFORM_WRITE);
mainSceneGroup.setCapability(Group.ALLOW_CHILDREN_EXTEND);
mainSceneGroup.setCapability(Group.ALLOW_CHILDREN_WRITE);
root.addChild(mainSceneGroup);

nya_behavior = new NyARSingleMarkerBehaviorHolder(ar_param, 30f,
   ar_code, 0.08);
nya_behavior.setTransformGroup(mainSceneGroup);
nya_behavior.setBackGround(background);

root.addChild(nya_behavior.getBehavior());
nya_behavior.setUpdateListener(this);

locale.addBranchGraph(root);

The NyARSingleMarkerBehaviorHolder is a helper class from the NYArToolkit.utils.java3d project. It tracks the transformation matrix computed by NYArToolkit based on the current location of the marker and updates the transformation set on the main scene group. As you will see later, there is no explicit handling of the marker tracking in the demo code – only creation, update and deletion of the Explosion3D objects.

Finally, we create a looping thread that adds random firework explosions:

// start adding random explosions
new Thread() {
   @Override
   public void run() {
      while (true) {
         try {
            // wait 0.5-1.5 seconds between explosion volleys
            Thread.sleep(500 + (int) (Math.random() * 1000));
            float x = -1.0f + 2.0f * (float) Math.random();
            float y = -1.0f + 2.0f * (float) Math.random();
            float z = (float) Math.random();
            addFireworkNew(x * 5.0f, y * 5.0f, 5.0f + z * 12.0f);
         } catch (Exception exc) {
            // ignore and keep looping - this is throwaway demo code
         }
      }
   }
}.start();

The code in this method computes a roughly uniform 3D distribution of small spheres that originate at the specified location (the explosion center) and move outwards. Each Explosion3D object is animated with a matching timeline. The timeline interpolates the alpha property, as well as the coordinates. As you can see in the code below, while x and y are interpolated linearly, the interpolation of z takes gravity into account – making the explosion particles fall downwards. All the timelines are added to a parallel timeline scenario. Once a timeline starts playing, the matching branch group is added to the main scene graph. Once the timeline scenario is done, all the branch groups are removed from the main scene graph:

private void addFireworkNew(float x, final float y, final float z) {
   final TimelineScenario scenario = new TimelineScenario.Parallel();
   Set<Explosion3D> scenarioExplosions = new HashSet<Explosion3D>();

   float R = 6;
   int NUMBER = 20;
   int r = (int) (255 * Math.random());
   int g = (int) (100 + 155 * Math.random());
   int b = (int) (50 + 205 * Math.random());
   Color color = new Color(r, g, b);
   for (double alpha = -Math.PI / 2; alpha <= Math.PI / 2; alpha += 2
      * Math.PI / NUMBER) {
      final float dy = (float) (R * Math.sin(alpha));
      final float yFinal = y + dy;
      float rSection = (float) Math.abs(R * Math.cos(alpha));
      int expCount = Math.max(0, (int) (NUMBER * rSection / R));
      for (int i = 0; i < expCount; i++) {
         float xFinal = (float) (x + rSection
            * Math.cos(i * 2.0 * Math.PI / expCount));
         final float dz = (float)(rSection
            * Math.sin(i * 2.0 * Math.PI / expCount));
         float zFinal = z + dz;

         final Explosion3D explosion = new Explosion3D(x * SCALE, y * SCALE,
            z * SCALE, color);
         scenarioExplosions.add(explosion);

         final Timeline expTimeline = new Timeline(explosion);
         expTimeline.addPropertyToInterpolate("alpha", 1.0f, 0.0f);
         expTimeline.addPropertyToInterpolate("x", x * SCALE, xFinal
            * SCALE);
         expTimeline.addPropertyToInterpolate("y", y * SCALE, yFinal
            * SCALE);
         expTimeline.addCallback(new TimelineCallbackAdapter() {
            @Override
            public void onTimelinePulse(float durationFraction,
               float timelinePosition) {
               float t = expTimeline.getTimelinePosition();
               float zCurr = (z + dz * t - 10 * t * t) * SCALE;
               explosion.setZ(zCurr);
            }
         });
         expTimeline.setDuration(3000);

         expTimeline.addCallback(new TimelineCallbackAdapter() {
            @Override
            public void onTimelineStateChanged(TimelineState oldState,
               TimelineState newState, float durationFraction,
               float timelinePosition) {
               if (newState == TimelineState.PLAYING_FORWARD) {
                  mainSceneGroup.addChild(explosion
                     .getSphere3DBranchGroup());
               }
            }
         });

         scenario.addScenarioActor(expTimeline);
      }
   }

   synchronized (explosions) {
      explosions.put(scenario, scenarioExplosions);
   }
   scenario.addCallback(new TimelineScenarioCallback() {
      @Override
      public void onTimelineScenarioDone() {
         synchronized (explosions) {
            Set<Explosion3D> ended = explosions.remove(scenario);
            for (Explosion3D end : ended) {
               mainSceneGroup
                  .removeChild(end.getSphere3DBranchGroup());
            }
         }
      }
   });
   scenario.play();
}

The dependent libraries that are used here do the following:

  • JMF captures the webcam stream and provides the pixels to put on the screen
  • NYArToolkit processes the webcam stream, locates the marker and computes the matching transformation matrix
  • Java3D tracks the scene graph objects, handles the depth sorting and the painting of the scene
  • Trident animates the location and alpha of the explosion particles

Over the past few days I’ve been playing with adding animations to augmented reality applications. The code is in rough shape, since I’m juggling the relevant libraries for the first time, and I’m going to publish it over the next week. Here is the first video that shows my current progress:

What is it using?

I’m not going to take credit for this demo. The vast majority of the work is being done by JMF, NYArToolkit and Java3D (and they’re quite painful to set up, by the way). The actual visuals are amateurish at best and could look much better. But at least it shows where you can take your Java today :)

Trident is an animation library for Java applications, and this week I’ve written about the concepts behind it and the APIs available to interested applications.

What’s next? Version 1.0 (code-named Acumen) is right around the corner. The release candidate is scheduled for June 29, and the final release for July 13.

Trident is a new library, and its APIs need to be tested in real-world scenarios. While I have a few small test applications that illustrate specific API methods, as well as medium-sized demos (Onyx and Amber), there is always room for improvement.

Going forward, I intend to evolve Trident, and I already have a couple of post-1.0 features in the pipeline. Trident has evolved from the internal animation layer of the Substance look-and-feel, and the next major release of Substance will be rewritten to use Trident – further testing the published APIs in real-world scenarios. In addition, the next major release of the Flamingo ribbon will add Trident-based animations where applicable.

Your input and feedback are always highly appreciated. Download the latest daily bits, and read the documentation. Subscribe to the mailing lists and let me know what is missing, and how the existing APIs can be improved. If you find a bug, report it in the issue tracker. If you want to take a look at the code, check out the SVN repository and subscribe to the “commits” mailing list.

Swing / SWT applications do not have to be boring. Trident aims to simplify the development of rich animation effects in Java-based UI applications, addressing both simple and complex scenarios. But it can only be as good as the applications that are using it. So, read the documentation, download the sources / binaries, integrate it in your applications and let me know what you think.

Over the course of this week I’m talking about different concepts in the Trident animation library for Java applications. Part ten talks about the plugin layer, which allows interested applications to provide additional property interpolators for custom application classes and to support additional Java-based UI toolkits.

Plugin overview

The core functionality of the Trident library can be extended to address the custom needs of specific applications. Out of the box, Trident supports:

  • Interpolating float and integer fields of any Java object that provides the matching public setter methods (see the sketch after this list)
  • The Swing and SWT UI toolkits, respecting their threading rules and providing interpolators for the relevant graphics classes
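
Here is a minimal sketch of that first point; the FadingShape class and its opacity field are made up for this illustration, while the Timeline calls are the standard Trident APIs:

import org.pushingpixels.trident.Timeline;

public class FadingShape {
	private float opacity;

	// Trident looks for a public setter that matches the interpolated property name
	public void setOpacity(float opacity) {
		this.opacity = opacity;
	}

	public static void main(String[] args) throws Exception {
		FadingShape shape = new FadingShape();
		Timeline fade = new Timeline(shape);
		// calls setOpacity(float) on every timeline pulse, going from 0.0 to 1.0
		fade.addPropertyToInterpolate("opacity", 0.0f, 1.0f);
		fade.setDuration(500);
		fade.play();
		// keep the JVM alive long enough for the timeline to finish
		Thread.sleep(1000);
	}
}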

The extensibility (plugin) layer allows interested applications to:

  • Provide additional property interpolators for custom application classes
  • Support additional Java-based UI toolkits

Creating a Trident plugin

A Trident plugin is specified by a META-INF/trident-plugin.properties file that should be placed in the runtime classpath. Note that you can have multiple plugins in the same runtime environment – for example, when each one comes from a different jar on the classpath.

The format of trident-plugin.properties is simple. Each line in this file should be of the following format:

Key=FullyQualifiedClassName

There are two supported keys – UIToolkitHandler and PropertyInterpolatorSource. Both are described in detail below.

Sample plugin specification

The core Trident library contains a plugin that supports Swing and SWT UI toolkits, as well as property interpolators for a few core classes. Here are the contents of the plugin descriptor:

UIToolkitHandler=org.pushingpixels.trident.swing.SwingToolkitHandler
PropertyInterpolatorSource=org.pushingpixels.trident.swing.AWTPropertyInterpolators

UIToolkitHandler=org.pushingpixels.trident.swt.SWTToolkitHandler
PropertyInterpolatorSource=org.pushingpixels.trident.swt.SWTPropertyInterpolators

PropertyInterpolatorSource=org.pushingpixels.trident.interpolator.CorePropertyInterpolators

Property interpolator plugins

The PropertyInterpolatorSource entries in the plugin descriptor files allow application code to provide property interpolators for custom application classes. The value associated with this key must be the fully qualified class name of an application class that implements the org.pushingpixels.trident.interpolator.PropertyInterpolatorSource interface.

This interface has one method – public Set<PropertyInterpolator> getPropertyInterpolators() – which returns a set of custom property interpolators. Custom property interpolators can be used in two ways:

  • The Timeline.addPropertyToInterpolate(String, Object, Object, PropertyInterpolator) API to explicitly specify the property interpolator to be used
  • The Timeline.addPropertyToInterpolate(String, Object, Object) API that will choose the property interpolator that matches the types of the from and to values
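
As a sketch of what such a source might look like, here is a hypothetical LatLon application class together with its interpolator; the LatLon / MyAppInterpolators names and the com.myapp package are made up for this illustration, while the two interfaces are the ones described in this post:

package com.myapp;

import java.util.HashSet;
import java.util.Set;

import org.pushingpixels.trident.interpolator.PropertyInterpolator;
import org.pushingpixels.trident.interpolator.PropertyInterpolatorSource;

public class MyAppInterpolators implements PropertyInterpolatorSource {
	// hypothetical application class with two double fields
	public static class LatLon {
		public final double lat, lon;

		public LatLon(double lat, double lon) {
			this.lat = lat;
			this.lon = lon;
		}
	}

	// linear interpolation between two LatLon values
	public static class LatLonInterpolator implements PropertyInterpolator<LatLon> {
		@Override
		public Class getBasePropertyClass() {
			return LatLon.class;
		}

		@Override
		public LatLon interpolate(LatLon from, LatLon to, float timelinePosition) {
			return new LatLon(from.lat + timelinePosition * (to.lat - from.lat),
					from.lon + timelinePosition * (to.lon - from.lon));
		}
	}

	@Override
	public Set<PropertyInterpolator> getPropertyInterpolators() {
		Set<PropertyInterpolator> result = new HashSet<PropertyInterpolator>();
		result.add(new LatLonInterpolator());
		return result;
	}
}

The matching descriptor line would then be PropertyInterpolatorSource=com.myapp.MyAppInterpolators, and a timeline could either pass a LatLonInterpolator explicitly to the four-argument addPropertyToInterpolate, or rely on the automatic matching described below.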

Property interpolator specification

The org.pushingpixels.trident.interpolator.PropertyInterpolator interface has two methods.

The public Class getBasePropertyClass() is used to choose the property interpolator in the Timeline.addPropertyToInterpolate(String, Object, Object). Internally, all registered property interpolators are queried to check whether they support the specified from and to values using the Class.isAssignableFrom(Class). The first property interpolator that has a match for both values will be used.
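
Conceptually, the selection works along these lines (a sketch of the matching rule described above, not the actual Trident source):

	// sketch only - illustrates the matching rule, not the real Trident internals
	private PropertyInterpolator choosePropertyInterpolator(
			Set<PropertyInterpolator> registered, Object from, Object to) {
		for (PropertyInterpolator pi : registered) {
			Class basePropertyClass = pi.getBasePropertyClass();
			if (basePropertyClass.isAssignableFrom(from.getClass())
					&& basePropertyClass.isAssignableFrom(to.getClass())) {
				// the first interpolator that matches both values wins
				return pi;
			}
		}
		return null; // no registered interpolator can handle these values
	}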

For example, the PointInterpolator in the core AWT property interpolator source (AWTPropertyInterpolators class) has the following implementation of this method:

	@Override
	public Class getBasePropertyClass() {
		return Point.class;
	}

The public T interpolate(T from, T to, float timelinePosition) is used to compute the interpolated value during the current timeline pulse. For example, the PointInterpolator in the core AWT property interpolator source (AWTPropertyInterpolators class) has the following implementation of this method:

	public Point interpolate(Point from, Point to, float timelinePosition) {
		int x = from.x + (int) (timelinePosition * (to.x - from.x));
		int y = from.y + (int) (timelinePosition * (to.y - from.y));
		return new Point(x, y);
	}

Bringing it together

Let’s look at the following Swing snippet that has a class with a Point field and a timeline that interpolates the value of that field:

import java.awt.*;

public static class MyRectangle {
	private Point corner = new Point(0, 0);

	public void setCorner(Point corner) {
		this.corner = corner;
	}

	...
}

MyRectangle rectangle = new MyRectangle();

Timeline move = new Timeline(rectangle);
move.addPropertyToInterpolate("corner", new Point(0, 0),
	new Point(100, 80));
move.playLoop(RepeatBehavior.REVERSE);

What happens when move.addPropertyToInterpolate is called? Internally, the Trident core looks at all available property interpolators and finds that AWTPropertyInterpolators.PointInterpolator is the best match for the passed values (which are both java.awt.Point objects). Then, on every pulse of the move timeline, MyRectangle.setCorner(Point) is called with the current interpolated value.

Note that the application code did not explicitly specify which property interpolator should be used.

UI toolkit plugins

Graphical applications are a natural fit for animations, and Trident core has built-in support for Swing and SWT. This support covers threading rules, custom property interpolators and repaint timelines.

The UIToolkitHandler entries in the plugin descriptor files allow application code to support additional Java-based UI toolkits. The value associated with this key must be the fully qualified class name of an application class that implements the org.pushingpixels.trident.UIToolkitHandler interface. Most modern UI toolkits have threading rules that the applications must respect in order to prevent application freeze and visual artifacts. The threading rules for both Swing and SWT specify that the UI-related operations must be done on a special UI thread, and the methods in the UIToolkitHandler are used to determine the relevance of these threading rules.
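
For reference, here is what the interface boils down to – a sketch assembled from the three methods discussed in the rest of this post (consult the Trident sources for the authoritative definition):

public interface UIToolkitHandler {
	// is the main timeline object a component / widget of this toolkit?
	boolean isHandlerFor(Object mainTimelineObject);

	// is the object still in a state where it can be safely queried / changed?
	boolean isInReadyState(Object mainTimelineObject);

	// run the property setters and callbacks on the UI thread of this toolkit
	void runOnUIThread(Runnable runnable);
}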

Respecting the threading rules

The UIToolkitHandler.isHandlerFor(Object) is used to determine whether the main timeline object is a component / widget for the specific UI toolkit. At runtime, all fields registered with the Timeline.addPropertyToInterpolate methods will be changed on the UI thread using the UIToolkitHandler.runOnUIThread method.

In the simple Swing example that interpolates the foreground color of a button on mouse rollover, the timeline is configured as follows:

	Timeline rolloverTimeline = new Timeline(button);
	rolloverTimeline.addPropertyToInterpolate("foreground", Color.blue,
			Color.red);

If you put a breakpoint in the JComponent.setForeground(Color) – which is called on every timeline pulse – you will see that it is called on the Swing Event Dispatch Thread. Internally, this is what happens:

  • When the timeline is created, all registered UI toolkit handlers are asked whether they are handlers for the specified object
  • The org.pushingpixels.trident.swing.SwingToolkitHandler registered in the core library returns true for the button object in its isHandlerFor(Object)
  • On every timeline pulse, a Runnable object is created internally. The run() method calls the setters for all registered fields – using the PropertyInterpolator.interpolate method of the matching property interpolator
  • This Runnable is passed to the UIToolkitHandler.runOnUIThread method of the matching UI toolkit handler

And this is how SwingToolkitHandler.runOnUIThread() is implemented:

	@Override
	public void runOnUIThread(Runnable runnable) {
		if (SwingUtilities.isEventDispatchThread())
			runnable.run();
		else
			SwingUtilities.invokeLater(runnable);
	}

Running custom application code on UI thread

The flow described above works for the fields registered with the Timeline.addPropertyToInterpolate methods. What about the custom application callbacks registered with the Timeline.addCallback()? If the callback methods need to respect the UI threading rules of the matching toolkit, the TimelineCallback implementation class needs to be tagged with the org.pushingpixels.trident.callback.RunOnUIThread annotation.

Callback implementations marked with this annotation will have both onTimelineStateChanged and onTimelinePulse invoked on the UI thread, making it safe to query and change the UI. The UIThreadTimelineCallbackAdapter is a core adapter class that is marked with this annotation.
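
For example, a callback that repaints a specific component on every pulse could look like this – a sketch, with the RepaintOnPulseCallback name made up for this illustration (the imports assume that TimelineCallbackAdapter lives alongside the annotation in the callback package):

import javax.swing.JComponent;

import org.pushingpixels.trident.callback.RunOnUIThread;
import org.pushingpixels.trident.callback.TimelineCallbackAdapter;

@RunOnUIThread
public class RepaintOnPulseCallback extends TimelineCallbackAdapter {
	private final JComponent component;

	public RepaintOnPulseCallback(JComponent component) {
		this.component = component;
	}

	@Override
	public void onTimelinePulse(float durationFraction, float timelinePosition) {
		// safe to touch the component here - the annotation guarantees the EDT
		component.repaint();
	}
}

Such a callback would be registered with Timeline.addCallback(), just like the callbacks shown in the Marble code earlier in this post.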

Querying the readiness of the timeline object

The isInReadyState(Object) is the third and final method in the UIToolkitHandler interface. After a specific UI toolkit handler has declared that it will handle the main object of a specific timeline (by returning true from its isHandlerFor(Object) method), that handler is used to run the field interpolations and the registered callbacks. However, some UI toolkits may impose additional restrictions on when the UI object is ready to be queried / changed.

For example, once an SWT control is disposed, it will throw an SWTException in the setForeground method. So, if the application code is running a slow animation that changes the foreground color of a button, and the application window containing this button is disposed in the meantime, the call to setForeground should be skipped.

SWT implementation of the UI toolkit handler

The trident-plugin.properties descriptor bundled with the core Trident library has the following line:

UIToolkitHandler=org.pushingpixels.trident.swt.SWTToolkitHandler

And the SWTToolkitHandler is:

public class SWTToolkitHandler implements UIToolkitHandler {
	@Override
	public boolean isHandlerFor(Object mainTimelineObject) {
		return (mainTimelineObject instanceof Widget);
	}

	@Override
	public boolean isInReadyState(Object mainTimelineObject) {
		return !((Widget) mainTimelineObject).isDisposed();
	}

	@Override
	public void runOnUIThread(Runnable runnable) {
		Display.getDefault().asyncExec(runnable);
	}
}

This is a very simple implementation of a UI toolkit handler that respects the relevant threading rules:

  • The first method implements the logic that associates this handler with all SWT widgets
  • The second method flags disposed widgets so that property interpolation and callback invocations are skipped for them
  • The third method runs the UI related logic on the SWT thread