Showing posts with label GradleGoodness:Plugins.

November 8, 2022

Gradle Goodness: Defining Plugin Versions Using Version Catalog

A version catalog in Gradle is a central place in our project where we can define dependency references with their version or version rules. A dependency reference is defined using an identifier with a corresponding dependency definition containing the coordinates of the dependency. We can then reference the dependency by its identifier, for example in a dependency configuration: implementation(libs.spring.core). If we want to change a version we only have to make the change in our version catalog. An added bonus is that Gradle generates type-safe accessors for the identifiers in our version catalog, so we get code completion in IntelliJ IDEA when we reference a dependency from the catalog.
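
As an illustration (the library identifier and version below are made up for this example, not taken from an actual project), a catalog entry and its usage could look like this:

# File: gradle/libs.versions.toml
[libraries]
# Shorthand notation with group, artifact and version.
spring-core = "org.springframework:spring-core:5.3.23"

// File: build.gradle.kts
dependencies {
    // Type safe accessor generated for the identifier spring-core.
    implementation(libs.spring.core)
}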

Besides the dependencies we need to build and test our software, we can also include definitions for Gradle plugins, including their versions. Normally we reference a Gradle plugin using its id and version. For example, in the following code block we include four Gradle plugins, of which three are identified by an id:

plugins {
	`java` // Default Gradle Java plugin
	
	// Include 3 third-party Gradle plugins
	id("org.springframework.boot") version "2.7.5"
	id("io.spring.dependency-management") version "1.0.15.RELEASE"
	id("org.asciidoctor.jvm.convert") version "3.2.0"
}

We can replace these plugin references with values defined in the version catalog. First we must create the file gradle/libs.versions.toml in our project directory. We might already have such a file with definitions for the dependencies we use in our build and tests. Next we add a [plugins] section where we can define our plugin dependencies. We can use the full power of the version catalog here; the only thing we need to remember is to use the id property if we use the longer notation. With the shorthand notation we simply define a string value with the id of the plugin, a colon (:) and the version.

In the following example libs.versions.toml file we define our three third-party plugins using several notations:

# File: gradle/libs.versions.toml
[versions]
# Define version we can use as version.ref in [plugins]
asciidoctor = "3.2.0" 

[plugins]
# We can use shorthand notation with the plugin id and version.
spring-boot = "org.springframework.boot:2.7.5"

# We can use the longer notation option where we set 
# the id and version for the plugin.
spring-dep-mgmt = { id = "io.spring.dependency-management", version = "1.0.15.RELEASE" }

# Here we use the longer notation and version.ref to reference
# the version defined in the [versions] section.
asciidoctor-jvm = { id = "org.asciidoctor.jvm.convert", version.ref = "asciidoctor" }

We only have to change the plugins block in our build file. We use the alias method to reference our version catalog definitions. In IntelliJ IDEA we even get code completion when we start typing. The following code shows how we include the plugins:

plugins {
	`java` // Default Gradle Java plugin
	
	// Using alias we can reference the plugin id and version
	// defined in the version catalog.
	// Notice that hyphens (-) used as separator in the identifier
	// are translated into type safe accessors for each subgroup.
	alias(libs.plugins.spring.boot)
	alias(libs.plugins.spring.dep.mgmt)
	alias(libs.plugins.asciidoctor.jvm)
}

The version catalog is a powerful feature of Gradle. It gives us a single place in our project where we define dependency coordinates, and we get type-safe accessor methods with code completion in IntelliJ IDEA.

Written with Gradle 7.5.1.

February 17, 2021

Gradle Goodness: Setting Plugin Version From Property In Plugins Section

The plugins section in our Gradle build files can be used to define the Gradle plugins we want to use. Gradle can optimize the build process if we use plugins {...} in our build scripts, so it is a good idea to use it. But there is a restriction if we want to define a version for a plugin inside the plugins section: the version must be a fixed string value. We cannot use a property to set the version inside the plugins section. We can overcome this by using a pluginManagement section in a settings file in the root of our project. Inside the pluginManagement section we can use properties to set the version of a plugin we want to use. Once it is defined inside pluginManagement we can use the plugin in our project build script without having to specify the version. This allows us to have one place where all plugin versions are defined. We can even use a gradle.properties file in our project with all plugin versions and use that in pluginManagement.

In the following settings file we use pluginManagement with the project property springBootPluginVersion to set the version of the Spring Boot Gradle plugin:

// File: settings.gradle.kts
pluginManagement {
    val springBootPluginVersion: String by settings // use project property with version
    plugins {
        id("org.springframework.boot") version "${springBootPluginVersion}"
    }
}

Next in our project build file we can simply reference the id of the Spring Boot Gradle plugin without the version. The version is already resolved in our settings file:

// File: build.gradle.kts
plugins {
    java
    application
    id("org.springframework.boot") // no version here: it is set in settings.gradle.kts
}

application {
    mainClass.set("com.mrhaki.sample.App")
}

Finally we can add a gradle.properties file with the project property (or specify it on the command line or environment variable):

# File: gradle.properties
springBootPluginVersion=2.4.2
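
Instead of (or in addition to) the properties file, we can pass the project property on the command line with -P or as an environment variable with the ORG_GRADLE_PROJECT_ prefix. For example:

$ gradle build -PspringBootPluginVersion=2.4.2
$ ORG_GRADLE_PROJECT_springBootPluginVersion=2.4.2 gradle build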

Written with Gradle 6.8.2.

February 16, 2021

Gradle Goodness: Shared Configuration With Conventions Plugin

When we have a multi-module project in Gradle we sometimes want to share dependencies, task configuration and other settings between the modules. We can use the subprojects or allprojects blocks, but the downside is that it is not clear from the build script of a subproject where the configuration comes from. We must remember it is set from another build script, but there is no reference in the subproject to that connection. It is better to put the shared configuration in a plugin and apply that plugin in the subprojects. We call this a conventions plugin. This way it is explicitly visible in a subproject that the shared settings come from a plugin. It also allows Gradle to optimize the build configuration.
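
For comparison, here is a minimal sketch of the subprojects approach this post advises against; notice that nothing in the subproject build scripts themselves reveals that this configuration applies to them:

// File: build.gradle.kts (root project)
subprojects {
    // Shared configuration pushed onto every subproject from the
    // root build script; the subprojects give no hint of this.
    apply(plugin = "java-library")

    repositories {
        mavenCentral()
    }
}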

The easiest way to implement the shared configuration in a plugin is using a so-called precompiled script plugin. This type of plugin can be written as a build script using the Groovy or Kotlin DSL with a filename ending with .gradle or .gradle.kts. The name of the plugin is the first part of the filename before .gradle or .gradle.kts. In our subprojects we add the plugin to the build script to apply the shared configuration. For a multi-module project we can create such a plugin in the buildSrc directory. For a Groovy DSL plugin we place the file in buildSrc/src/main/groovy, for a Kotlin DSL plugin in buildSrc/src/main/kotlin.

In the following example we write a script plugin using the Kotlin DSL to apply the java-library plugin to a project, set some common dependencies used by all projects, configure the Test tasks and set the Java toolchain. First we create a build.gradle.kts file in the buildSrc directory in the root of our multi-module project and apply the kotlin-dsl plugin:

// File: buildSrc/build.gradle.kts
plugins {
    `kotlin-dsl`
}

repositories.mavenCentral()

Next we create the conventions plugin with our shared configuration:

// File: buildSrc/src/main/kotlin/java-project-conventions.gradle.kts
plugins {
    `java-library`
}

group = "mrhaki.sample"
version = "1.0"

repositories {
    mavenCentral()
}

dependencies {
    val log4jVersion: String by extra("2.14.0")
    val junitVersion: String by extra("5.3.1")
    val assertjVersion: String by extra("3.19.0")
    
    // Logging
    implementation("org.apache.logging.log4j:log4j-api:${log4jVersion}")
    implementation("org.apache.logging.log4j:log4j-core:${log4jVersion}")

    // Testing
    testImplementation("org.junit.jupiter:junit-jupiter-api:${junitVersion}")
    testRuntimeOnly("org.junit.jupiter:junit-jupiter-engine:${junitVersion}")
    testImplementation("org.assertj:assertj-core:${assertjVersion}")
}

java {
    toolchain {
        languageVersion.set(JavaLanguageVersion.of(15))
    }
}

tasks.withType<Test> {
    useJUnitPlatform()
}

The id of our new plugin is java-project-conventions and we can use it in our build script for a subproject as:

// File: rest-api/build.gradle.kts
plugins {
    id("java-project-conventions")  // apply shared config
    application  // apply the Gradle application plugin
}

dependencies {
    val vertxVersion: String by extra("4.0.2")

    implementation(project(":domain"))  // project dependency
    implementation("io.vertx:vertx-core:${vertxVersion}")
}

application {
    mainClass.set("com.mrhaki.web.Api")
}

The rest-api project now has all the configuration and tasks from the java-library plugin, as configured in the java-project-conventions plugin, so we can build it as a Java project.
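
For completeness, the root settings file of such a multi-module build could look like the following sketch (the root project name is made up; the subproject names come from the examples above):

// File: settings.gradle.kts
rootProject.name = "sample-multi-module" // hypothetical name

// buildSrc is picked up automatically; only the application
// modules need to be included here.
include("domain", "rest-api")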

Written with Gradle 6.8.2.

November 14, 2016

Gradle Goodness: Adding Task With Rule Based Model Configuration

Gradle has an incubating feature called Rule based model configuration. This is a new way to configure Gradle projects where Gradle has more control over the configuration and the dependencies between configuration objects. This allows Gradle to resolve configuration values before they are used, because Gradle knows about those dependencies. With this new model we don't need the lazy evaluation "tricks" we had to use before. For example, there was an internal convention mapping mechanism for tasks to assign values to a task configuration after the task was already created, and project.afterEvaluate is another mechanism for late binding of task properties. With the new rule based model Gradle can do without these options; we can rely on Gradle resolving all dependent configuration values when we create a task.

In Gradle we already know the "project space", where the Project object is the root of the object graph. For example, repositories are part of the project space. Gradle can get some useful information from the project space, but it is mostly a graph of objects that Gradle can only partially reason about. Then there is the "model space". This is part of a project and we can use it in our build script with the model configuration block. The model space is separate from the project space and contains objects that are managed by Gradle. Gradle knows the dependencies between the objects and how to create and change them. This helps Gradle to optimize build logic.

To help Gradle we must define rules to work with objects in the model space. Each rule is like a recipe that tells Gradle how to work with the model. Gradle builds a graph of models and knows about the dependencies between them. This way Gradle guarantees that model objects are completely configured before being used. For example, if a rule needs a VersionFile model configuration object then Gradle makes sure the VersionFile is created and all its properties are set. So we don't need any lazy or late binding anymore, because Gradle makes sure the properties are set when we want to use them. The rules are defined in a class that extends RuleSource. Such a class is stateless and only contains methods to work with the model objects. Gradle has some specific annotations that can be used on methods to indicate what a method should do.

In our example we have a custom Gradle task VersionFileTask. The task has some properties we want to set via the model space using a model configuration block. We add this task to the list of tasks in our project by using an apply plugin: statement.

Let's first look at the source of the custom Gradle task:

// File: buildSrc/src/main/groovy/mrhaki/gradle/VersionFileTask.groovy
package mrhaki.gradle

import org.gradle.api.DefaultTask
import org.gradle.api.tasks.Input
import org.gradle.api.tasks.OutputFile
import org.gradle.api.tasks.TaskAction

/**
 * Simple task to save the value for the
 * {@link #version} property in a file.
 * The file is set with the {@link #outputFile}
 * property.
 */
class VersionFileTask extends DefaultTask {

    /**
     * Value for version to be saved.
     */
    @Input
    String version

    /**
     * Output file to store version value in.
     */
    @OutputFile
    File outputFile

    /**
     * Actual task actions to save the value
     * for {@link #version} in {@link #outputFile}.
     */
    @TaskAction
    void generateVersionFile() {
        outputFile.parentFile.mkdirs()
        outputFile.text = version
    }

}

Nothing special here. Now it is time to enter the model space of Gradle. First we create an object with properties that is used to configure our VersionFileTask task. We must use the annotation @Managed so Gradle knows this object will be managed in the model space:

// File: buildSrc/src/main/groovy/mrhaki/gradle/VersionFile.groovy
package mrhaki.gradle

import org.gradle.model.Managed

/**
 * Gradle is responsible for creating an implementation
 * for this interface. We use @Managed to let Gradle know.
 * We need to provide the get and set
 * methods following the Java Beans standards for properties.
 * 
 * In the model space Gradle provides an implementation and
 * knows how to create an instance of that implementation
 * and how to invoke the get and set methods to mutate the state.
 */
@Managed
interface VersionFile {
    String getVersion() 
    void setVersion(final String version) 

    File getOutputFile() 
    void setOutputFile(final File outputFile) 
}

Next we create a class with the rules for the model space: VersionFileTaskRules. We can use this class like a plugin in our project with the statement apply plugin: mrhaki.gradle.VersionFileTaskRules. We need two rules to instruct Gradle about our model objects. First we need to make sure an instance of the managed VersionFile interface is created. We do this with the versionFile method. We need another method (createVersionFileTask) to change the list of tasks (Gradle calls this mutate in the model space terminology) using an instance of VersionFile. Gradle knows about the connection between the two methods via the VersionFile object, so it makes sure VersionFile is created before the method createVersionFileTask is invoked:

// File: buildSrc/src/main/groovy/mrhaki/gradle/VersionFileTaskRules.groovy
package mrhaki.gradle

import org.gradle.api.Task
import org.gradle.model.Model
import org.gradle.model.ModelMap
import org.gradle.model.Mutate
import org.gradle.model.RuleSource

/**
 * Class contains several methods to tell Gradle
 * how to create a {@link VersionFile} instance
 * and how to mutate the list of tasks by creating
 * the {@link VersionFileTask} task.
 */
class VersionFileTaskRules extends RuleSource {

    /**
     * Method to tell Gradle that we need an instance
     * of {@link VersionFile} in the model space. The name of the method
     * is also used as in the model space to configure
     * the object. Another name can be used as an argument for the
     * {@code @Model} annotation.
     *
     * @param versionFile The type {@link VersionFile} has a {@code @Managed}
     *                    annotation, so Gradle can provide an implementation.
     */
    @Model
    void versionFile(final VersionFile versionFile) {}

    /**
     * Method to create the {@link VersionFileTask} task and add to list
     * of tasks. The first arguments is the type we want to mutate, the
     * other argument is an input argument used to mutate the list of tasks.
     * 
     * With the {@code versionFile} argument we can pass information to this method
     * that is needed to create the {@link VersionFileTask}. A user can use
     * the {@code model} configuration block in a build file to set values for 
     * the {@link VersionFile} instance.
     * 
     * Gradle will make sure the input argument is created and all properties
     * are set before it is used in this method. So no more {@link afterEvaluate}
     * or convention mappings are needed. Gradle makes sure all input arguments
     * are resolved before they are used.
     * 
     * @param tasks Tasks we want to add a new one to
     * @param versionFile Resolved instance used to configure new task
     */
    @Mutate
    void createVersionFileTask(final ModelMap<Task> tasks, final VersionFile versionFile) {
        tasks.create('generateVersionFile', VersionFileTask) { task ->
            task.version = versionFile.version
            task.outputFile = versionFile.outputFile
        }
    }
    
}

To use the rules we create a simple build.gradle file:

apply plugin: mrhaki.gradle.VersionFileTaskRules

To see the model space managed by Gradle we can invoke the model task. The output shows the current model of our project.

$ gradle model
...
------------------------------------------------------------
Root project
------------------------------------------------------------

+ tasks
      | Type:           org.gradle.model.ModelMap<org.gradle.api.Task>
      | Creator:        Project.<init>.tasks()
      | Rules:
         ⤷ VersionFileTaskRules#createVersionFileTask(ModelMap<Task>, VersionFile)
...
    + generateVersionFile
          | Type:       mrhaki.gradle.VersionFileTask
          | Value:      task ':generateVersionFile'
          | Creator:    VersionFileTaskRules#createVersionFileTask(ModelMap<Task>, VersionFile) > create(generateVersionFile)
          | Rules:
             ⤷ copyToTaskContainer
...
+ versionFile
      | Type:           mrhaki.gradle.VersionFile
      | Creator:        VersionFileTaskRules#versionFile(VersionFile)
    + outputFile
          | Type:       java.io.File
          | Value:      null
          | Creator:    VersionFileTaskRules#versionFile(VersionFile)
    + version
          | Type:       java.lang.String
          | Value:      null
          | Creator:    VersionFileTaskRules#versionFile(VersionFile)
...
$

We see that the model type mrhaki.gradle.VersionFile is created with the method versionFile, together with its properties outputFile and version. The model also shows that the method createVersionFileTask creates the task generateVersionFile of type VersionFileTask.

We set values for the VersionFile properties in our build file:

apply plugin: mrhaki.gradle.VersionFileTaskRules

// Configure model space.
model {
    
    // Configure VersionFile instance created 
    // by method versionFile() from VersionFileTaskRules.
    versionFile {
    
        // Set value for version property of VersionFile.
        version = project.version

        // Set value for outputFile property of VersionFile.
        outputFile = project.file("${buildDir}/version.file")
    }   
}

version = '1.0.1.RELEASE'

We run the model task again and this time we see that the properties version and outputFile are set:

$ gradle model
...
------------------------------------------------------------
Root project
------------------------------------------------------------
...
+ versionFile
      | Type:           mrhaki.gradle.VersionFile
      | Creator:        VersionFileTaskRules#versionFile(VersionFile)
      | Rules:
         ⤷ versionFile { ... } @ build.gradle line 8, column 5
    + outputFile
          | Type:       java.io.File
          | Value:      /Users/mrhaki/Projects/mrhaki.com/blog/posts/samples/gradle/versionrule/build/version.file
          | Creator:    VersionFileTaskRules#versionFile(VersionFile)
    + version
          | Type:       java.lang.String
          | Value:      1.0.1.RELEASE
          | Creator:    VersionFileTaskRules#versionFile(VersionFile)
...
$

Finally we run the task generateVersionFile and check the result:

$ gradle generateVersionFile
:buildSrc:compileJava UP-TO-DATE
:buildSrc:compileGroovy UP-TO-DATE
:buildSrc:processResources UP-TO-DATE
:buildSrc:classes UP-TO-DATE
:buildSrc:jar UP-TO-DATE
:buildSrc:assemble UP-TO-DATE
:buildSrc:compileTestJava UP-TO-DATE
:buildSrc:compileTestGroovy UP-TO-DATE
:buildSrc:processTestResources UP-TO-DATE
:buildSrc:testClasses UP-TO-DATE
:buildSrc:test UP-TO-DATE
:buildSrc:check UP-TO-DATE
:buildSrc:build UP-TO-DATE
:generateVersionFile

BUILD SUCCESSFUL

Total time: 0.864 secs
$ more build/version.file
1.0.1.RELEASE
$

Please remember at the time of writing the Rule based model configuration is still incubating. In future versions things may change.

Written with Gradle 3.2.

November 8, 2016

Gradle Goodness: Custom Plugin Repositories With Plugins DSL

To apply a plugin in our Gradle build script we can use the plugins DSL. The plugins DSL is very concise and allows Gradle to be more efficient and more in control when loading the plugin. Normally the plugin we define is fetched from the Gradle plugin portal. If we have our own repository, for example on the intranet of our company, we have to define that extra repository with a pluginRepositories configuration block in the settings.gradle file of our project.

In the following sample we have a plugin mrhaki.gradle.version-file that is stored in the company intranet repository with the URL http://intranet/artifactory/libs-release/.

// File: settings.gradle
// As the first statement of the settings.gradle file
// we can define pluginRepositories:
pluginRepositories {
    maven { url 'http://intranet/artifactory/libs-release/' }
    gradlePluginPortal() // Include public Gradle plugin portal
}

In our build file we use the plugins DSL to apply the mrhaki.gradle.version-file plugin:

// File: build.gradle
plugins {
    id 'mrhaki.gradle.version-file' version '1.2.2'
}

There is a restriction when we use this approach: the plugin must be deployed to our intranet repository with plugin marker artifacts. Gradle needs these to resolve the value of id to the correct plugin. A plugin marker artifact is a deployment following a specific naming convention with a dependency on the actual plugin code. If we write our own plugin it is best to use the java-gradle-plugin (Java Gradle Plugin Development Plugin), which automatically adds the plugin marker artifacts when we publish our plugin to the intranet repository.
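
As a rough sketch, with a recent Gradle version the build of such a plugin project could look like this (the implementation class, group and coordinates are hypothetical, and the exact publishing mechanism differs in older Gradle versions):

// File: build.gradle (of the plugin project itself)
plugins {
    id 'java-gradle-plugin'
    id 'maven-publish'
}

group = 'com.mrhaki.gradle' // hypothetical group
version = '1.2.2'

gradlePlugin {
    plugins {
        versionFilePlugin {
            id = 'mrhaki.gradle.version-file'
            // Hypothetical class implementing the plugin.
            implementationClass = 'mrhaki.gradle.VersionFilePlugin'
        }
    }
}

publishing {
    repositories {
        // Publish to the company intranet repository.
        maven { url 'http://intranet/artifactory/libs-release/' }
    }
}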

We could also have used the buildscript configuration block in our build.gradle file to define a custom repository for our plugin. Inside the buildscript configuration we use the repositories block to add our intranet repository. But then we cannot use the plugins DSL; we have to fall back to the apply plugin: syntax.
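
A sketch of that alternative (the artifact coordinates of the plugin are hypothetical):

// File: build.gradle
buildscript {
    repositories {
        // Intranet repository that contains the plugin code.
        maven { url 'http://intranet/artifactory/libs-release/' }
    }
    dependencies {
        // Hypothetical coordinates of the artifact with the plugin implementation.
        classpath 'com.mrhaki.gradle:version-file-plugin:1.2.2'
    }
}

// With buildscript we cannot use the plugins DSL.
apply plugin: 'mrhaki.gradle.version-file'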

Written with Gradle 3.1.

February 19, 2016

Gradle Goodness: Using Nested Domain Object Containers

In a previous post we learned how to use the NamedDomainObjectContainer class. We could create new objects using a nice DSL in Gradle. But what if we want to use DSL syntax to create objects within objects? We can use the same mechanism to achieve this by nesting NamedDomainObjectContainer objects.

We want to support the following DSL to create a collection of Server objects, where each server can have multiple Node objects:

// File: build.gradle
apply plugin: com.mrhaki.gradle.DeploymentPlugin

deployments {
    aws {
        url = 'http://aws.address'

        nodes {
            node1 {
                port = 9000
            }
            node2 {
                port = 80
            }
        }
    }

    cf {
        url = 'http://cf.address'

        nodes {
            test {
                port = 10001
            }
            acceptanceTest {
                port = 10002
            }
        }
    }
}

This creates two Server objects with the names aws and cf. Each server has Node objects with names like node1, node2, test and acceptanceTest. Let's look at the Server class, where we have added a nodes property of type NamedDomainObjectContainer as the nested object container. Also notice the nodes method, which allows the DSL syntax for creating Node objects.

// File: buildSrc/src/main/groovy/com/mrhaki/gradle/Server.groovy
package com.mrhaki.gradle

import org.gradle.api.NamedDomainObjectContainer

class Server {

    /**
     * An instance is created in the plugin class, because
     * there we have access to the container() method
     * of the Project object.
     */
    NamedDomainObjectContainer<Node> nodes

    String url

    String name

    /**
     * We need this constructor so Gradle can create an instance
     * from the DSL.
     */
    Server(String name) {
        this.name = name
    }

    /**
     * Inside the DSL this method is invoked. We use
     * the configure method of the NamedDomainObjectContainer to
     * automatically create Node instances.
     * Notice this is a method not a property assignment.
     * <pre>
     * server1 {
     *     url = 'http://server1'
     *     nodes { // This is the nodes() method we define here.
     *         node1 {
     *             port = 9000
     *         }
     *     }
     * }
     * </pre>
     */
    def nodes(final Closure configureClosure) {
        nodes.configure(configureClosure)
    }

}

And the Node class:

// File: buildSrc/src/main/groovy/com/mrhaki/gradle/Node.groovy
package com.mrhaki.gradle

class Node {

    String name

    Integer port

    /**
      * We need this constructor so Gradle can create an instance
      * from the DSL.
      */
    Node(String name) {
        this.name = name
    }
}

To make the DSL work we use a custom plugin so we can add the DSL for creating the objects to our project:

// File: buildSrc/src/main/groovy/com/mrhaki/gradle/DeploymentPlugin.groovy
package com.mrhaki.gradle

import org.gradle.api.Project
import org.gradle.api.Plugin
import org.gradle.api.NamedDomainObjectContainer

class DeploymentPlugin implements Plugin<Project> {

    public static final String EXTENSION_NAME = 'deployments'

    private static final String DEPLOY_TASK_PATTERN = 'deployOn%sTo%s'

    private static final String REPORTING_TASK_NAME = 'reportDeployments'

    private static final String TASK_GROUP_NAME = 'Deployment'

    void apply(final Project project) {
        setupExtension(project)
        createDeploymentTasks(project)
        createReportTask(project)
    }

    /**
     * Create extension on the project for handling the deployments
     * definition DSL with servers and nodes. This allows the following DSL
     * in our build script:
     * <pre>
     * deployments {
     *     server1 {
     *         url = 'http://server'
     *         nodes {
     *             node1 {
     *                 port = 9000
     *             }
     *         }
     *     }
     * }
     * </pre>
     */
    private void setupExtension(final Project project) {

        // Create NamedDomainObjectContainer for Server objects.
        // We must use the container() method of the Project class
        // to create an instance. New Server instances are
        // automatically created, because we have String argument
        // constructor that will get the name we use in the DSL.
        final NamedDomainObjectContainer<Server> servers =
            project.container(Server)

        servers.all {
            // Here we have access to the project object, so we
            // can use the container() method to create a
            // NamedDomainObjectContainer for Node objects.
            nodes = project.container(Node)
        }

        // Use deployments as name in the build script to define
        // servers and nodes.
        project.extensions.add(EXTENSION_NAME, servers)
    }

    /**
     * Create a new deployment task for each node.
     */
    private void createDeploymentTasks(final Project project) {
        def servers = project.extensions.getByName(EXTENSION_NAME)

        servers.all {
            // Actual Server instance is the delegate
            // of this closure. We assign it to a variable
            // so we can use it again inside the
            // closure for nodes.all() method.
            def serverInfo = delegate

            nodes.all {
                // Assign this closure's delegate to
                // variable so we can use it in the task
                // configuration closure.
                def nodeInfo = delegate

                // Make node and server names pretty
                // for use in task name.
                def taskName =
                    String.format(
                        DEPLOY_TASK_PATTERN,
                        name.capitalize(),
                        serverInfo.name.capitalize())

                // Create new task for this node.
                project.task(taskName, type: DeploymentTask) {
                    description = "Deploy to '${nodeInfo.name}' on '${serverInfo.name}'"
                    group = TASK_GROUP_NAME

                    server = serverInfo
                    node = nodeInfo
                }
            }
        }
    }

    /**
     * Add reporting task to project.
     */
    private void createReportTask(final Project project) {
        project.task(REPORTING_TASK_NAME, type: DeploymentReportTask) {
            description = 'Show configuration of servers and nodes'
            group = TASK_GROUP_NAME
        }
    }
}

We also have two custom tasks that use the Server and Node instances that are created by the DSL in our build file. The DeploymentTask is configured from the plugin where the server and node properties are set:

// File: buildSrc/src/main/groovy/com/mrhaki/gradle/DeploymentTask.groovy
package com.mrhaki.gradle

import org.gradle.api.tasks.TaskAction
import org.gradle.api.DefaultTask

class DeploymentTask extends DefaultTask {

    Server server

    Node node

    /**
     * Simple implementation to show we can
     * access the Server and Node instances created
     * from the DSL.
     */
    @TaskAction
    void deploy() {
        println "Deploying to ${server.url}:${node.port}"
    }

}

The DeploymentReportTask references the project extensions to get a hold of the Server and Node objects:

// File: buildSrc/src/main/groovy/com/mrhaki/gradle/DeploymentReportTask.groovy
package com.mrhaki.gradle

import org.gradle.api.tasks.TaskAction
import org.gradle.api.DefaultTask

class DeploymentReportTask extends DefaultTask {

    /**
     * Simple task to show we can access the
     * Server and Node instances also via the
     * project extension.
     */
    @TaskAction
    void report() {
        def servers = project.extensions.getByName(DeploymentPlugin.EXTENSION_NAME)

        servers.all {
            println "Server '${name}' with url '${url}':"

            nodes.all {
                println "\tNode '${name}' using port ${port}"
            }
        }
    }

}

Let's run the tasks task first to see which tasks are added by the plugin. Next we invoke some tasks:

$ gradle -q tasks
...
Deployment tasks
----------------
deployOnAcceptanceTestToCf - Deploy to 'acceptanceTest' on 'cf'
deployOnNode1ToAws - Deploy to 'node1' on 'aws'
deployOnNode2ToAws - Deploy to 'node2' on 'aws'
deployOnTestToCf - Deploy to 'test' on 'cf'
reportDeployments - Show configuration of servers and nodes
...
$ gradle -q deployOnNode2ToAws

Deploying to http://aws.address:80
$ gradle -q reportDeployments

Server 'aws' with url 'http://aws.address':
 Node 'node1' using port 9000
 Node 'node2' using port 80
Server 'cf' with url 'http://cf.address':
 Node 'acceptanceTest' using port 10002
 Node 'test' using port 10001
$

Written with Gradle 2.11.

Gradle Goodness: Create Objects Using DSL With Domain Object Containers

Gradle offers the NamedDomainObjectContainer class to create a collection of objects defined using a clean DSL. The only requirement for the objects we want to create is that they have a constructor that takes a String argument and a name property to identify the object. The value of the name property must be unique within the collection. We create a new instance of a NamedDomainObjectContainer with the container method of the Gradle Project class. By adding the NamedDomainObjectContainer instance to the extensions property of our project, we can use a DSL in our build script to create the objects that go into the container.

The following code shows a simple build script in which we want to create a collection of Product objects. The creation of the NamedDomainObjectContainer object is done in a plugin so we only have to apply the plugin to use the DSL to create Product objects:

apply plugin: ProductsPlugin

// DSL to define new objects of type Product.
products {
    // Create Product with name pencil.
    pencil {
        price = 0.05
    }
    // Create Product with name crayon.
    crayon {
        price = 0.18
    }
}


class ProductsPlugin implements Plugin<Project> {
    void apply(final Project project) {
        // Create NamedDomainObjectContainer instance for
        // a collection of Product objects
        NamedDomainObjectContainer<Product> productContainer =
            project.container(Product)

        // Add the container instance to our project
        // with the name products.
        project.extensions.add('products', productContainer)

        // Simple task to show the Product objects
        // that are created by Gradle using
        // the DSL syntax in our build file.
        project.task('reportProducts') << {
            def products = project.extensions.getByName('products')

            products.all {
                // A Product instance is the delegate
                // for this closure.
                println "$name costs $price"
            }
        }
    }
}

class Product {
    // We need a name property
    // so the object can be created
    // by Gradle using a DSL.
    String name

    BigDecimal price

    Product(final String name) {
        this.name = name
    }
}

We can run the reportProducts task to see the name and price properties of the Product instances:

$ gradle -q reportProducts
crayon costs 0.18
pencil costs 0.05
$

Written with Gradle 2.11.

October 21, 2015

Gradle Goodness: Apply External Script With Plugin Configured Through Buildscript

Suppose we use the Gradle apply from: statement to import another Gradle build file into our build file. The external build file uses a Gradle plugin that needs a buildscript block to define the classpath configuration with the classes needed for the plugin. We cannot apply the plugin by id inside the external build script; we must use the type (class) of the plugin. Otherwise Gradle cannot resolve the plugin from the main build file.

Let's create a simple Gradle build that we want to include in our main Gradle build file:

// File: gradle/extra.gradle
buildscript {
    repositories.jcenter()
    dependencies.classpath 'com.bmuschko:gradle-docker-plugin:2.6.1'
}

// We use the type of the plugin instead of the id.
// This is the class that defines the plugin. We can leave off
// .class, because Gradle uses Groovy.
apply plugin: com.bmuschko.gradle.docker.DockerRemoteApiPlugin

// The following statement doesn't work if this file
// is included via apply from: in another Gradle build file.
// apply plugin: 'com.bmuschko.docker-remote-api'

...

In the following Gradle build file we import this script:

// File: build.gradle
apply from: 'gradle/extra.gradle'
...

Written with Gradle 2.8.

April 19, 2015

Gradle Goodness: Alter Start Scripts from Application Plugin

For Java or Groovy projects we can use the application plugin in Gradle to run and package our application. The plugin adds, for example, the startScripts task which creates OS-specific scripts to run the project as a JVM application. This task is then used by the installDist task that installs the application, and by the distZip and distTar tasks that create a distributable archive of the application. The startScripts task has the properties unixScript and windowsScript that refer to the actual OS-specific script files used to run the application. We can use these properties to change the contents of those files.

In the following sample we add a directory named configuration to the CLASSPATH definition in both start scripts:

...
startScripts {

    // Closure to add an additional element to the
    // CLASSPATH definition in the start script files.
    def configureClasspathVar = { findClasspath, pathSeparator, line ->

        // Looking for the line that starts with either CLASSPATH=
        // or set CLASSPATH=, defined by the findClasspath closure argument.
        line = line.replaceAll(~/^${findClasspath}=.*$/) { original ->

            // Get original line and append it 
            // with the configuration directory.
            // Use specified path separator, which is different
            // for Windows or Unix systems.
            original += "${pathSeparator}configuration"
        }

    }

    def configureUnixClasspath = configureClasspathVar.curry('CLASSPATH', ':')
    def configureWindowsClasspath = configureClasspathVar.curry('set CLASSPATH', ';')

    // The default script content is generated and
    // with the doLast method we can still alter
    // the contents before the complete task ends.
    doLast {

        // Alter the start script for Unix systems.
        unixScript.text = 
            unixScript
                .readLines()
                .collect(configureUnixClasspath)
                .join('\n')

        // Alter the start script for Windows systems.
        windowsScript.text = 
            windowsScript
                .readLines()
                .collect(configureWindowsClasspath)
                .join('\r\n')

    }

}
...

This post was inspired by the Gradle build file I saw at the Gaiden project.

Written with Gradle 2.3.

October 5, 2012

Gradle Goodness: Getting Announcements from Gradle Build

We can use the Gradle announce plugin to send announcements from the build process. We can send messages to Twitter (I don't know if our followers are waiting for this, but if we want to we can), but also to notification applications on our local computer. On Mac OSX Growl is supported, on Linux notify-send, and on Windows Snarl.

The plugin adds an announce object with the announce() method. The method accepts two arguments. The first argument is the message and the second argument is either twitter or local to indicate where to send the announcement.

apply plugin: 'announce'

task info {
    doLast {
        announce.announce "Running $it.name", 'local'
        println gradle.gradleVersion
    }
}

Here we see the announcement shown as a Growl message.

We can also get an announcement object that only sends announcements to the local notification applications. It has a send() method that accepts a title for the announcement as the first argument and the message as the second argument. To get the local announcement object we use announce.local:

apply plugin: 'announce'

task info {
    doLast {
        // Now we can specify a title and message
        announce.local.send "Gradle Info Task", 'Running'
        println gradle.gradleVersion
    }
}

To automatically send out notifications when a task is executed we can implement the TaskExecutionListener interface. From the implementation we can use the announce.local object. In the following example build file we create the class TaskAnnouncer and use the addTaskExecutionListener() method to add it to the TaskExecutionGraph available through gradle.taskGraph:

apply {
    plugin 'announce'
}    

gradle.taskGraph.addTaskExecutionListener new TaskAnnouncer(localAnnouncer: announce.local)

task info {
    doLast {
        println gradle.gradleVersion
    }
}

class TaskAnnouncer implements TaskExecutionListener {
    Announcer localAnnouncer

    @Override
    void afterExecute(final Task task, final TaskState state) {
        String message
        if (state.failure) {
            message = "Failure: $state.failure.message"
        } else if (state.executed) {
            message = 'Done'
        } else if (state.skipped) {
            message = "Skipped: $state.skipMessage"
        }
        send task, message
    }

    @Override
    void beforeExecute(final Task task) {
        send task, 'Ready to run'
    }

    private void send(final Task task, final String message) {
        final String title = "Gradle build: $task.project.name:$task.name"
        localAnnouncer.send title, message
    }
}

Automatically announce build results

To get the build results after running a build we only have to apply the build-announcements plugin to our Gradle build. This plugin uses the local notification applications to send a message with a summary of the build. If the build failed we get a message with the name of the task that failed. For a successful build we can see how many tasks were executed.

apply {
    plugin 'announce'
    plugin 'build-announcements'
}    

task info {
    doLast {
        println gradle.gradleVersion
    }
}

The following screenshots show the result of a successful and a failed build.

Apply for all Gradle builds

To add the plugins to all Gradle builds on our local computer we can create a so-called init script. Init scripts are executed before a project build script. We can place init scripts at several locations. Let's create a new init script announce.gradle in our $USER_HOME/.gradle/init.d directory. If we don't have this directory yet, we can create it ourselves. All files in this directory are treated as init scripts by Gradle and are executed automatically. Here is the content of the announce.gradle script:

rootProject {
    apply {
        plugin 'announce'
        plugin 'build-announcements'
    }    
}