Aurora 1.3.0

December 1st, 2022

It gives me great pleasure to announce the fourth release of Aurora. Let’s get to what’s been fixed, and what’s been added. First, I’m going to use emojis to mark different parts of it like this:

💔 marks an incompatible API / binary change
🎁 marks new features
🔧 marks bug fixes and general improvements

Release notes

There’s still a long road ahead to expand Aurora’s capabilities in 2023 and beyond, with the ribbon / command bar planned as the next big addition. If you’re in the business of writing desktop Compose apps, I’d love for you to take Aurora for a spin. Stay frosty for more features coming in 2023!

Skia shaders in Compose Desktop

September 22nd, 2021

In the past year or so I’ve been working on a new project. Aurora is a set of libraries for building Compose Desktop apps, taking most of the building blocks from Radiance. I don’t have a firm date yet for when the first release of Aurora will be available, but in the meanwhile I want to talk about something I’ve been playing with over the last few weeks.

Skia is a library that serves as the graphics engine for Chrome, Android, Flutter, Firefox and many other popular platforms. It has also been chosen by JetBrains as the graphics engine for Compose Desktop. One of the more interesting parts of Skia is SkSL – Skia’s shading language – which allows writing fast and powerful fragment shaders. While shaders are usually associated with rendering complex scenes in video games and CGI effects, in this post I’m going to show how I’m using Skia shaders to render textured backgrounds for desktop apps.

First, let’s start with a few screenshots:

Here we see the top part of a sample demo frame under five different Aurora skins (from top to bottom: Autumn, Business, Business Blue Steel, Nebula, Nebula Amethyst). Autumn features a flat color fill, while the other four have a horizontal gradient (darker at the edges, lighter in the middle) overlaid with a curved arc along the top edge. If you look closer, all five also feature something else – a muted texture that spans the whole colored area.
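
As a rough sketch (not Aurora’s actual decoration painting, and with hypothetical colors), the gradient part of such a fill can be expressed in Compose with a horizontal gradient brush that is darker at the edges and lighter in the middle:

import androidx.compose.foundation.background
import androidx.compose.foundation.layout.Box
import androidx.compose.foundation.layout.fillMaxWidth
import androidx.compose.foundation.layout.height
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.graphics.Brush
import androidx.compose.ui.graphics.Color
import androidx.compose.ui.unit.dp

@Composable
fun DecorationStrip() {
    // Hypothetical colors, darker at the edges and lighter in the middle
    val edge = Color(0xFF2D5C88)
    val middle = Color(0xFF5B8BBE)
    val gradient = Brush.horizontalGradient(
        0.0f to edge,
        0.5f to middle,
        1.0f to edge
    )
    Box(modifier = Modifier.fillMaxWidth().height(64.dp).background(gradient))
}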

Let’s take a look at another screenshot:

The top row shows the Perlin noise texture, once in greyscale and once in orange. The bottom row shows the brushed metal texture, once in greyscale and once in orange.

Let’s take a look at how to create these textures with Skia shaders in Compose Desktop.

First, we start with Shader.makeFractalNoise that wraps SkPerlinNoiseShader::MakeFractalNoise:

// Fractal noise shader
val noiseShader = Shader.makeFractalNoise(
    baseFrequencyX = baseFrequency,
    baseFrequencyY = baseFrequency,
    numOctaves = 1,
    seed = 0.0f,
    tiles = emptyArray()
)

Next, we have a custom duotone SkSL shader that computes luma (brightness) of each pixel, and uses that luma to map the original color to a point between two given colors (light and dark):

// Duotone shader
val duotoneDesc = """
  uniform shader shaderInput;
  uniform vec4 colorLight;
  uniform vec4 colorDark;
  uniform float alpha;
            
  half4 main(vec2 fragcoord) { 
    vec4 inputColor = shaderInput.eval(fragcoord);
    float luma = dot(inputColor.rgb, vec3(0.299, 0.587, 0.114));
    vec4 duotone = mix(colorLight, colorDark, luma);
    return vec4(duotone.r * alpha, duotone.g * alpha, duotone.b * alpha, alpha);
  }
"""

This shader gets four inputs. The first is another shader (which will be the fractal noise we created earlier). The next two are the light and dark colors, and the last one is the alpha value (for applying partial translucency).

Now we create a byte buffer to pass our colors and alpha to this shader:

val duotoneDataBuffer = ByteBuffer.allocate(36).order(ByteOrder.LITTLE_ENDIAN)
// RGBA colorLight
duotoneDataBuffer.putFloat(0, colorLight.red)
duotoneDataBuffer.putFloat(4, colorLight.green)
duotoneDataBuffer.putFloat(8, colorLight.blue)
duotoneDataBuffer.putFloat(12, colorLight.alpha)
// RGBA colorDark
duotoneDataBuffer.putFloat(16, colorDark.red)
duotoneDataBuffer.putFloat(20, colorDark.green)
duotoneDataBuffer.putFloat(24, colorDark.blue)
duotoneDataBuffer.putFloat(28, colorDark.alpha)
// Alpha
duotoneDataBuffer.putFloat(32, alpha)

And create our duotone shader with RuntimeEffect.makeForShader (a wrapper for SkRuntimeEffect::MakeForShader) and RuntimeEffect.makeShader (a wrapper for SkRuntimeEffect::makeShader):

val duotoneEffect = RuntimeEffect.makeForShader(duotoneDesc)
val duotoneShader = duotoneEffect.makeShader(
    uniforms = Data.makeFromBytes(duotoneDataBuffer.array()),
    children = arrayOf(noiseShader),
    localMatrix = null,
    isOpaque = false
)

With this shader, we have two options to fill the background of a Compose element. The first one is to wrap Skia’s shader in Compose’s ShaderBrush and use the drawBehind modifier:

val brush = ShaderBrush(duotoneShader)
Box(modifier = Modifier.fillMaxSize().drawBehind {
    drawRect(
      brush = brush, topLeft = Offset(100f, 65f), size = Size(400f, 400f)
    )
})

The second option is to create a local Painter object, use a DrawScope.drawIntoCanvas block in the overridden DrawScope.onDraw, get the native canvas with Canvas.nativeCanvas, and call drawPaint on the native (Skia) canvas directly with the Skia shader we created:

val shaderPaint = Paint()
shaderPaint.setShader(duotoneShader)

Box(modifier = Modifier.fillMaxSize().paint(painter = object : Painter() {
  override val intrinsicSize: Size
    get() = Size.Unspecified

  override fun DrawScope.onDraw() {
    this.drawIntoCanvas {
      val nativeCanvas = it.nativeCanvas
      nativeCanvas.translate(100f, 65f)
      nativeCanvas.clipRect(Rect.makeWH(400f, 400f))
      nativeCanvas.drawPaint(shaderPaint)
    }
  }
}))

What about the brushed metal texture? In Aurora it is generated by applying modulated sine / cosine waves on top of the Perlin noise shader. The relevant snippet is:

// Brushed metal shader
val brushedMetalDesc = """
        uniform shader shaderInput;

        half4 main(vec2 fragcoord) { 
          vec4 inputColor = shaderInput.eval(vec2(0, fragcoord.y));
          // Compute the luma at the first pixel in this row
          float luma = dot(inputColor.rgb, vec3(0.299, 0.587, 0.114));
          // Apply modulation to stretch and shift the texture for the brushed metal look 
          float modulated = abs(cos((0.004 + 0.02 * luma) * (fragcoord.x + 200) + 0.26 * luma) 
              * sin((0.06 - 0.25 * luma) * (fragcoord.x + 85) + 0.75 * luma));
          // Map 0.0-1.0 range to inverse 0.15-0.3
          float modulated2 = 0.3 - modulated / 6.5;
          half4 result = half4(modulated2, modulated2, modulated2, 1.0);
          return result;
        }
"""
val brushedMetalEffect = RuntimeEffect.makeForShader(brushedMetalDesc)
val brushedMetalShader = brushedMetalEffect.makeShader(
    uniforms = null,
    children = arrayOf(noiseShader),
    localMatrix = null,
    isOpaque = false
)

And then passing the brushed metal shader as the input to the duotone shader:

val duotoneEffect = RuntimeEffect.makeForShader(duotoneDesc)
val duotoneShader = duotoneEffect.makeShader(
  uniforms = Data.makeFromBytes(duotoneDataBuffer.array()),
  children = arrayOf(brushedMetalShader),
  localMatrix = null,
  isOpaque = false
)

The full pipeline for generating these two Aurora textured shaders is here, and the rendering of textures is done here.

What if we want our shaders to be dynamic? First let’s see a couple of videos:

The full code for these two demos can be found here and here.

The core setup is the same – use RuntimeEffect.makeForShader to compile the SkSL shader snippet, pass parameters with RuntimeEffect.makeShader, and then use either ShaderBrush + drawBehind or Painter + DrawScope.drawIntoCanvas + Canvas.nativeCanvas + Canvas.drawPaint. The additional setup involves dynamically changing one or more shader attributes based on time (and maybe other parameters) and using Compose’s built-in reactive flow to update the pixels in real time.

First, we set up our variables:

// Compile the SkSL snippet into a runtime effect
val runtimeEffect = RuntimeEffect.makeForShader(sksl)
val shaderPaint = remember { Paint() }
// 4 bytes for the single float uniform – our time-based parameter
val byteBuffer = remember { ByteBuffer.allocate(4).order(ByteOrder.LITTLE_ENDIAN) }
var timeUniform by remember { mutableStateOf(0.0f) }
var previousNanos by remember { mutableStateOf(0L) }

Then we update our shader with the time-based parameter:

val timeBits = byteBuffer.clear().putFloat(timeUniform).array()
val shader = runtimeEffect.makeShader(
    uniforms = Data.makeFromBytes(timeBits),
    children = null,
    localMatrix = null,
    isOpaque = false
)
shaderPaint.setShader(shader)

Then we have our draw logic:

val brush = ShaderBrush(shader)

Box(modifier = Modifier.fillMaxSize().drawBehind {
    drawRect(
        brush = brush, topLeft = Offset(100f, 65f), size = Size(400f, 400f)
    )
})

And finally, a Compose effect that syncs our updates with the clock and updates the time-based parameter:

LaunchedEffect(null) {
    while (true) {
        withFrameNanos { frameTimeNanos ->
            val nanosPassed = frameTimeNanos - previousNanos
            val delta = nanosPassed / 100000000f
            // Skip the very first frame, before we have a previous timestamp
            if (previousNanos > 0L) {
                timeUniform -= delta
            }
            previousNanos = frameTimeNanos
        }
    }
}

Now, on every clock frame we update the timeUniform variable, and then pass that newly updated value into the shader. Compose detects that a variable used in our top-level composable has changed, recomposes it and redraws the content – essentially asking our shader to redraw the relevant area based on the new value.
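
The SkSL snippet itself (the sksl string compiled above) is not shown in this post. As a hypothetical stand-in – not the shader used in the linked demos – here is a minimal snippet whose single float uniform matches the 4-byte buffer we fill on every frame:

val sksl = """
  uniform float time;

  half4 main(vec2 fragcoord) {
    // Slide a simple horizontal color wave based on the time uniform
    float wave = 0.5 + 0.5 * sin(0.02 * fragcoord.x + time);
    return half4(wave, 0.3, 1.0 - wave, 1.0);
  }
"""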

Stay tuned for more news on Aurora as it is getting closer to its first official release!

Notes:

  1. Multiple texture reads are expensive, and you might want to force such paths to draw the texture to an SkSurface and read its pixels from an SkImage – see the sketch after these notes.
  2. If your shader does not need to create an exact, pixel-perfect replica of the target visuals, consider sacrificing some of the finer visual details for performance. For example, a large horizontal blur that reads 20 pixels on each “side” as part of the convolution (41 reads for every pixel) can be replaced by double or triple invocation of a smaller convolution matrix, or downscaling the original image, applying a smaller blur and upscaling the result.
  3. Performance is important as your shader (or shader chain) runs on every pixel. It can be a high-resolution display (lots of pixels to process), a low-end GPU, a CPU-bound pipeline (no GPU), or any combination thereof.
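
As a sketch of the first note – assuming the shaderPaint from earlier and the org.jetbrains.skia package names used by recent Compose Desktop builds – the expensive shader chain can be rendered once into an offscreen surface, snapshotted, and the resulting image reused on subsequent draws:

import org.jetbrains.skia.Image
import org.jetbrains.skia.Paint
import org.jetbrains.skia.Surface

// Run the shader chain once into an offscreen surface and snapshot it; drawing
// the returned Image afterwards is a plain, cheap bitmap draw.
fun renderShaderToImage(shaderPaint: Paint, width: Int, height: Int): Image {
    val surface = Surface.makeRasterN32Premul(width, height)
    surface.canvas.drawPaint(shaderPaint)
    val snapshot = surface.makeImageSnapshot()
    surface.close()
    return snapshot
}

// Later, for example inside the Painter shown earlier:
//   nativeCanvas.drawImage(snapshot, 0f, 0f)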

Radiance comes with a number of sample / demo apps that showcase the flexibility and power of its APIs. One of those demos is Lumen. Its main goal is to highlight the feature set of the Trident animation library. Lumen uses the MusicBrainz JSON web service to search for all the albums of a specific artist, and for the list of tracks on individual albums. Sending requests and parsing responses is done with Retrofit and Moshi. Lucent is the port of Lumen to Kotlin.

Let’s see how it works together in Kotlin.

We start by adding the build dependencies on Retrofit and Moshi:

dependencies {
    implementation "com.squareup.retrofit2:retrofit:2.9.0"
    implementation "com.squareup.retrofit2:converter-moshi:2.9.0"
}

Next, we define our service interface that maps to MusicBrainz APIs:


    private interface MusicBrainzService {
        @GET("/ws/2/release?type=album&fmt=json")
        fun getReleases(@Query("artist") artistId: String): Call<ReleaseList>

        @GET("/ws/2/release/{release}?inc=recordings&fmt=json")
        fun getRelease(@Path("release") releaseId: String): Call<Release>

        companion object {
            const val API_URL = "https://musicbrainz.org/"
        }
    }

Note the usage of the fmt=json attribute in all @GET functions, and the usage of @Query and @Path to match the expected endpoint contracts.

The data classes map to the corresponding MusicBrainz entities, using the @field:Json annotation with the matching name attribute, along with a @Json annotation on one of the data classes to properly map it to its JSON tag:


data class SearchResultRelease(
        @field:Json(name = "id") val id: String?,
        @field:Json(name = "title") val title: String?,
        @field:Json(name = "artist") var artist: String?,
        @field:Json(name = "date") val date: String?,
        @field:Json(name = "release-events") val releaseEvents: List<ReleaseEvent>,
        @field:Json(name = "asin") val asin: String?)

data class Area(
        @field:Json(name = "disambiguation") val disambiguation: String?,
        @field:Json(name = "id") val id: String?,
        @field:Json(name = "name") var name: String?,
        @field:Json(name = "sort-name") val sortName: String?,
        @field:Json(name = "iso-3166-1-codes") val iso31661Codes: List<String>)

data class Medium(
        @field:Json(name = "tracks") val tracks: List<Track>)

data class Release(
        @field:Json(name = "id") val id: String?,
        @field:Json(name = "title") val title: String?,
        @field:Json(name = "date") val date: String?,
        @field:Json(name = "media") val media: List<Medium>,
        @field:Json(name = "asin") val asin: String?)

data class ReleaseEvent(
        @field:Json(name = "date") val date: String?,
        @field:Json(name = "area") val area: Area?)

@Json(name = "release-list")
data class ReleaseList(
        @field:Json(name = "count") val count: Int?,
        @field:Json(name = "releases") val releases: List<SearchResultRelease>)

data class Track(
        @field:Json(name = "title") val title: String?,
        @field:Json(name = "length") val length: Int?)

Now we can create a Retrofit object and fire off our request:


        val retrofit = Retrofit.Builder()
                .baseUrl(MusicBrainzService.API_URL)
                .client(getHttpClient())
                .addConverterFactory(MoshiConverterFactory.create())
                .build()

        val service = retrofit.create(MusicBrainzService::class.java)

        val releaseResponse = service.getReleases(artistId).execute()
        val releases = releaseResponse.body()

And to get the list of tracks for a specific album:


    fun doTrackSearch(releaseId: String): List<Track> {
        val retrofit = Retrofit.Builder()
                .baseUrl(MusicBrainzService.API_URL)
                .client(getHttpClient())
                .addConverterFactory(MoshiConverterFactory.create())
                .build()

        val service = retrofit.create(MusicBrainzService::class.java)

        val releaseResponse = service.getRelease(releaseId).execute()
        val release = releaseResponse.body()

        return release!!.media[0].tracks
    }

Where the OkHttpClient is configured like this:


    private fun getHttpClient(): OkHttpClient {
        val okHttpBuilder = OkHttpClient.Builder()
        okHttpBuilder.addInterceptor { chain ->
            val requestWithUserAgent = chain.request().newBuilder()
                    .header("User-Agent", "My custom user agent")
                    .build()
            chain.proceed(requestWithUserAgent)
        }
        return okHttpBuilder.build()
    }

This is it. No messy handling of HTTP requests, no manual parsing of JSON responses. All driven by metadata and encapsulated by Kotlin data classes.
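
To wrap up, here is a hypothetical end-to-end sketch that ties the pieces above together. The artist id is a placeholder, and it assumes that MusicBrainzService, getHttpClient and doTrackSearch from the snippets above are accessible from the same scope:

import retrofit2.Retrofit
import retrofit2.converter.moshi.MoshiConverterFactory

fun main() {
    val retrofit = Retrofit.Builder()
            .baseUrl(MusicBrainzService.API_URL)
            .client(getHttpClient())
            .addConverterFactory(MoshiConverterFactory.create())
            .build()
    val service = retrofit.create(MusicBrainzService::class.java)

    // List the albums of a (placeholder) artist
    val releases = service.getReleases("<artist-mbid>").execute().body()
    releases?.releases?.forEach { album ->
        println("${album.title} [${album.date}]")
        // And the tracks on each of those albums
        album.id?.let { releaseId ->
            for (track in doTrackSearch(releaseId)) {
                println("  ${track.title}")
            }
        }
    }
}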

Radiance 2.5.0

September 3rd, 2019

It gives me great pleasure to announce the third major release of Radiance. Let’s get to what’s been fixed, and what’s been added. First, I’m going to use emojis to mark different parts of it like this:

💔 marks an incompatible API / binary change
😻 marks new features
🤷‍♀️ marks bug fixes and general improvements

Substance

  • 😻 New skins – Nebula Amethyst, Night Shade and Graphite Sunset
  • 🤷‍♀️ Fix for disappearing internal frame title pane buttons
  • 🤷‍♀️ Fix for crash during initialization
  • 🤷‍♀️ Fix for OutOfMemoryError on sliders with large model ranges
  • 🤷‍♀️ Fix for slider tracks under dark skins
  • 💔 Fix for incorrect tracking of state-based alpha values in color scheme bundles
  • 🤷‍♀️ Fix for drop shadows under some skins
  • 🤷‍♀️ Fix for contrast ratio of highlighted content under Sahara skin
  • 🤷‍♀️ Fix for antialiased rendering of pasted text content

Flamingo

Trident

  • 😻 DSL for Trident
  • 🤷‍♀️ Fix for combining looping timelines with .fromCurrent()

Photon

The first Radiance release focused on bringing all the different Swing open-source projects that I’ve been working on since 2005 under one roof. The second Radiance release was about making them work much better together. And this one (code-named Coral) is about closing major functionality gaps that remained until now.

There’s still a long road ahead to continue exploring the never-ending depths of what it takes to write elegant and high-performing desktop applications in Swing. If you’re in the business of writing just such apps, I’d love for you to take this third Radiance release for a spin. Click here to get the instructions on how to add Radiance to your Gradle / Maven / Ivy / Leiningen / Bazel builds. And don’t forget that all of the modules require Java 9 to build and run.