Project Marble – augmented reality in Java with JMF, Java3D, NYArToolkit and Trident

July 2nd, 2009

Today I’m going to talk about setting up the development environment for running the augmented reality demo shown in the video from my previous post:

Here are the steps:

  • Download and install Java Media Framework (JMF)
  • Download Java3D and place the native / java libraries in the JDK installation folder
  • Download NYArToolkit for Java and unzip somewhere on your machine
  • Download Trident
  • Print out one of the PDF markers in the NYArToolkit/Data folder. For my demo, I’m using pattHiro.pdf.
  • Import all projects under NYArToolkit into your Eclipse workspace. You can ignore the ones for JOGL and QuickTime for this specific demo.
  • In these projects, tweak the build path to point to the JMF / Java3D jars on your system.
  • Plug in your web camera.
  • Run jp.nyatla.nyartoolkit.jmf.sample.NyarToolkitLinkTest class to test that JMF has been installed correctly and can display the captured feed.
  • Run the jp.nyatla.nyartoolkit.java3d.sample.NyARJava3D class to test the tracking capabilities of the NYArToolkit. Once the camera feed is showing, point the camera at the printed marker so that it is fully visible. Once the marker is detected, a colored cube should be shown. Try rotating, moving and tilting the marker printout – all the while keeping it in the frame – to verify that the cube stays properly oriented.

The demo from the video above is available as a part of the new Marble project. Here is how to set it up:

  • Sync the latest SVN tip of Marble.
  • Create an Eclipse project for Marble. It should have dependencies on NYArToolkit, NYArToolkit.utils.jmf, NYArToolkit.utils.java3d and Trident. It should also have jmf.jar, vecmath.jar, j3dcore.jar and j3dutils.jar in the build path.
  • Run the org.pushingpixels.marble.MarbleFireworks3D class. Follow the same instructions as above to point the webcam.

There’s not much to the code in this Marble demo. It follows the NyARJava3D class, but instead of static Java3D content (a color cube) it has a dynamic scene that is animated by Trident. For the code below, note that I’m definitely not an expert in Java3D and NYArToolkit, so there may well be a simpler way to do these animations. However, they are enough to get you started exploring animations in Java-powered augmented reality.

Each explosion volley is implemented by a collection of Explosion3D objects. Each such object models a single explosion “particle”. Here is the constructor of the Explosion3D class:

public Explosion3D(float x, float y, float z, Color color) {
   this.x = x;
   this.y = y;
   this.z = z;
   this.color = color;
   this.alpha = 1.0f;

   this.sphere3DTransformGroup = new TransformGroup();
   // allow rewriting the transform while the scene graph is live
   this.sphere3DTransformGroup
      .setCapability(TransformGroup.ALLOW_TRANSFORM_WRITE);
   Transform3D mt = new Transform3D();
   mt.setTranslation(new Vector3d(this.x, this.y, this.z));
   this.sphere3DTransformGroup.setTransform(mt);

   this.appearance3D = new Appearance();
   this.appearance3D.setColoringAttributes(new ColoringAttributes(color
      .getRed() / 255.0f, color.getGreen() / 255.0f,
      color.getBlue() / 255.0f, ColoringAttributes.SHADE_FLAT));
   this.appearance3D.setTransparencyAttributes(new TransparencyAttributes(
      TransparencyAttributes.BLENDED, 0.0f));
   // allow replacing the transparency attributes at runtime
   this.appearance3D
      .setCapability(Appearance.ALLOW_TRANSPARENCY_ATTRIBUTES_WRITE);
   this.sphere3D = new Sphere(0.002f, appearance3D);
   this.sphere3DTransformGroup.addChild(this.sphere3D);

   this.sphere3DBranchGroup = new BranchGroup();
   // allow detaching this branch from the live scene graph
   this.sphere3DBranchGroup.setCapability(BranchGroup.ALLOW_DETACH);
   this.sphere3DBranchGroup.addChild(this.sphere3DTransformGroup);
}
Here, we have a bunch of Java3D code to create a group that can be dynamically changed at runtime. This group has only one leaf – the Sphere object.

As we’ll see later, the timeline behind this object changes its coordinates and the alpha (fading it out). Here is the relevant public setter for the alpha property:

public void setAlpha(float alpha) {
   this.alpha = alpha;

   this.appearance3D.setTransparencyAttributes(new TransparencyAttributes(
      TransparencyAttributes.BLENDED, 1.0f - alpha));
}

Not much here – just updating the transparency of the underlying Java3D Sphere object. The setters for the coordinates are quite similar:

public void setX(float x) {
   this.x = x;

   Transform3D mt = new Transform3D();
   mt.setTranslation(new Vector3d(this.x, this.y, this.z));
   this.sphere3DTransformGroup.setTransform(mt);
}

public void setY(float y) {
   this.y = y;

   Transform3D mt = new Transform3D();
   mt.setTranslation(new Vector3d(this.x, this.y, this.z));
   this.sphere3DTransformGroup.setTransform(mt);
}

public void setZ(float z) {
   this.z = z;

   Transform3D mt = new Transform3D();
   mt.setTranslation(new Vector3d(this.x, this.y, this.z));
   this.sphere3DTransformGroup.setTransform(mt);
}
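Trident drives each of these setters by linearly interpolating the property between its start and end values on every timeline pulse. Here is a minimal plain-Java sketch of that interpolation (independent of the Trident and Java3D APIs; the class and method names are purely illustrative):

```java
public class AlphaInterpolation {
   // Linear interpolation between start and end at timeline position t in [0, 1]
   static float interpolate(float start, float end, float t) {
      return start + (end - start) * t;
   }

   public static void main(String[] args) {
      // The alpha property goes from fully opaque (1.0) to fully
      // transparent (0.0) as the timeline position goes from 0 to 1
      for (float t = 0.0f; t <= 1.0f; t += 0.25f) {
         float alpha = interpolate(1.0f, 0.0f, t);
         // Java3D transparency is the inverse of alpha
         float transparency = 1.0f - alpha;
         System.out.println("t=" + t + " alpha=" + alpha
            + " transparency=" + transparency);
      }
   }
}
```

This is all the setters need to support: Trident computes the intermediate value and calls the public setter via reflection, and the setter pushes the new value into the Java3D scene graph.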

The main class is MarbleFireworks3D. Its constructor is rather lengthy and has a few major parts. The first part initializes the camera and marker data for the NYArToolkit core:

NyARCode ar_code = new NyARCode(16, 16);
// load the marker pattern (e.g. Data/patt.hiro) matching the printout
ar_code.loadARPattFromFile(CARCODE_FILE);
ar_param = new J3dNyARParam();
// load the camera calibration parameters (e.g. Data/camera_para.dat)
ar_param.loadARParamFromFile(PARAM_FILE);
ar_param.changeScreenSize(WIDTH, HEIGHT);

Following that, there’s a bunch of Java3D code that initializes the universe, locale, platform, body and environment, and creates the main transformation group. The interesting part is the code that creates the main scene group that will hold the dynamic collection of Explosion3D groups:

mainSceneGroup = new TransformGroup();
// the marker-tracking behavior rewrites this transform on every frame
mainSceneGroup.setCapability(TransformGroup.ALLOW_TRANSFORM_WRITE);

nya_behavior = new NyARSingleMarkerBehaviorHolder(ar_param, 30f,
   ar_code, 0.08);
The NyARSingleMarkerBehaviorHolder is a helper class from the NYArToolkit.utils.java3d project. It tracks the transformation matrix computed by NYArToolkit based on the current location of the marker and updates the transformation set on the main scene group. As you will see later, there is no explicit handling of the marker tracking in the demo code – only creation, update and deletion of the Explosion3D objects.
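To make the role of that transformation matrix concrete, here is a small plain-Java sketch (no NYArToolkit or Java3D dependencies; the class name and row-major 4x4 layout are my own illustration) of applying a marker transform to a point given in marker space:

```java
public class MarkerTransform {
   // Apply a 4x4 homogeneous transform (row-major) to a 3D point
   static double[] apply(double[] m, double[] p) {
      double[] out = new double[3];
      for (int row = 0; row < 3; row++) {
         out[row] = m[row * 4] * p[0] + m[row * 4 + 1] * p[1]
            + m[row * 4 + 2] * p[2] + m[row * 4 + 3];
      }
      return out;
   }

   public static void main(String[] args) {
      // Identity rotation plus a translation of (0.1, 0.2, 0.3) -- the
      // kind of matrix the toolkit computes when the marker faces the
      // camera head-on, offset from the optical center
      double[] m = {
         1, 0, 0, 0.1,
         0, 1, 0, 0.2,
         0, 0, 1, 0.3,
         0, 0, 0, 1
      };
      double[] p = apply(m, new double[] { 0, 0, 0 });
      System.out.println(p[0] + ", " + p[1] + ", " + p[2]);
   }
}
```

The behavior holder does exactly this kind of update each frame, but on the Transform3D of the main scene group, so every child (here, the explosion particles) follows the marker automatically.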

Finally, we create a looping thread that adds random firework explosions:

// start adding random explosions
new Thread() {
   public void run() {
      while (true) {
         try {
            Thread.sleep(500 + (int) (Math.random() * 1000));
            float x = -1.0f + 2.0f * (float) Math.random();
            float y = -1.0f + 2.0f * (float) Math.random();
            float z = (float) Math.random();
            addFireworkNew(x * 5.0f, y * 5.0f, 5.0f + z * 12.0f);
         } catch (Exception exc) {
            // ignore and keep looping
         }
      }
   }
}.start();
The code in this method computes a uniform 3D distribution of small spheres that originate at a specific location (the explosion center) and move outwards. Each Explosion3D object is animated by a matching timeline. The timeline interpolates the alpha property, as well as the coordinates. While x and y are interpolated linearly, the interpolation of z takes gravity into account, making the explosion particles fall downwards. All the timelines are added to a parallel timeline scenario. Once a timeline starts playing, the matching branch group is added to the main scene graph; once the timeline scenario is done, all the branch groups are removed from the main scene graph:

private void addFireworkNew(float x, final float y, final float z) {
   final TimelineScenario scenario = new TimelineScenario.Parallel();
   Set<Explosion3D> scenarioExplosions = new HashSet<Explosion3D>();

   float R = 6;
   int NUMBER = 20;
   int r = (int) (255 * Math.random());
   int g = (int) (100 + 155 * Math.random());
   int b = (int) (50 + 205 * Math.random());
   Color color = new Color(r, g, b);
   for (double alpha = -Math.PI / 2; alpha <= Math.PI / 2; alpha += 2
      * Math.PI / NUMBER) {
      final float dy = (float) (R * Math.sin(alpha));
      final float yFinal = y + dy;
      float rSection = (float) Math.abs(R * Math.cos(alpha));
      int expCount = Math.max(0, (int) (NUMBER * rSection / R));
      for (int i = 0; i < expCount; i++) {
         float xFinal = (float) (x + rSection
            * Math.cos(i * 2.0 * Math.PI / expCount));
         final float dz = (float) (rSection
            * Math.sin(i * 2.0 * Math.PI / expCount));

         final Explosion3D explosion = new Explosion3D(x * SCALE, y * SCALE,
            z * SCALE, color);

         final Timeline expTimeline = new Timeline(explosion);
         expTimeline.addPropertyToInterpolate("alpha", 1.0f, 0.0f);
         expTimeline.addPropertyToInterpolate("x", x * SCALE, xFinal
            * SCALE);
         expTimeline.addPropertyToInterpolate("y", y * SCALE, yFinal
            * SCALE);
         expTimeline.addCallback(new TimelineCallbackAdapter() {
            public void onTimelinePulse(float durationFraction,
               float timelinePosition) {
               // z is interpolated by hand, with a parabolic gravity
               // term that pulls the particle downwards over time
               float t = expTimeline.getTimelinePosition();
               float zCurr = (z + dz * t - 10 * t * t) * SCALE;
               explosion.setZ(zCurr);
            }
         });
         expTimeline.addCallback(new TimelineCallbackAdapter() {
            public void onTimelineStateChanged(TimelineState oldState,
               TimelineState newState, float durationFraction,
               float timelinePosition) {
               if (newState == TimelineState.PLAYING_FORWARD) {
                  // attach the explosion branch group to the live
                  // scene once its timeline starts playing
                  mainSceneGroup.addChild(explosion.getBranchGroup());
               }
            }
         });
         scenarioExplosions.add(explosion);
         scenario.addScenarioActor(expTimeline);
      }
   }

   synchronized (explosions) {
      explosions.put(scenario, scenarioExplosions);
   }
   scenario.addCallback(new TimelineScenarioCallback() {
      public void onTimelineScenarioDone() {
         synchronized (explosions) {
            Set<Explosion3D> ended = explosions.remove(scenario);
            for (Explosion3D end : ended) {
               // detach the branch group of every ended explosion
               mainSceneGroup.removeChild(end.getBranchGroup());
            }
         }
      }
   });
   scenario.play();
}

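The geometry in this method can be checked in isolation. The following plain-Java sketch reproduces its two computations, the latitude-ring placement of particles on a sphere of radius R and the gravity-adjusted z coordinate, without any of the Java3D or Trident machinery (the class and method names are my own):

```java
public class ExplosionMath {
   // Number of particles in the latitude ring at angle alpha: rings near
   // the "equator" of the explosion sphere get more particles, so the
   // overall distribution on the sphere surface stays roughly uniform
   static int ringCount(int number, double r, double alpha) {
      double rSection = Math.abs(r * Math.cos(alpha));
      return Math.max(0, (int) (number * rSection / r));
   }

   // z coordinate at timeline position t: linear outward motion dz * t
   // minus a parabolic gravity term, so particles arc downwards
   static float zAt(float z, float dz, float t) {
      return z + dz * t - 10 * t * t;
   }

   public static void main(String[] args) {
      // the equator ring (alpha = 0) holds the full particle count...
      System.out.println(ringCount(20, 6, 0));
      // ...while rings near the poles hold almost none
      System.out.println(ringCount(20, 6, Math.PI / 2 - 0.1));
      // a particle launched upwards (dz = 3) rises, then falls back
      System.out.println(zAt(5.0f, 3.0f, 0.0f));
      System.out.println(zAt(5.0f, 3.0f, 1.0f));
   }
}
```

The constant 10 in the gravity term is the same one used in the timeline pulse callback above; it is just a visually tuned acceleration, not physical units.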
To recap, here is what each of the dependent libraries does:

  • JMF captures the webcam stream and provides the pixels to put on the screen
  • NYArToolkit processes the webcam stream, locates the marker and computes the matching transformation matrix
  • Java3D tracks the scene graph objects, handles the depth sorting and the painting of the scene
  • Trident animates the location and alpha of the explosion particles