Java 9 and macOS

Some notes for myself as I try to migrate the Mac-specific pieces of my Java SE app to Java 9.

  • Menu bar: in prior versions of Java, a system property relocated the menu bar to where it should be: System.setProperty("apple.laf.useScreenMenuBar", "true"). In Java 9, it's Desktop.setDefaultMenuBar().
  • Full screen mode, Mac style. Previously this was a capability that had to be set explicitly; now windows work as expected: double-click the title bar and it maximises, click the maximise button and it goes into Mac full-screen mode.
  • GestureUtilities remains a mystery; I'll update this if I find the answer. It never worked on Java 7 or 8, as the underlying JNI methods were broken (I'll post the JDK bug reference stating this, and that it's fixed in 9, if I find it again); however, all I can see so far are JavaFX methods for swipe etc.
  • About, Preferences handlers etc. are all set from Desktop now; no need to fiddle with reflection (OSXAdapter being the usual method) to access eawt so the code still compiles on non-Mac platforms.
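For my own reference, the Java 9 calls above can be sketched roughly like this. The class name and handler bodies are my own placeholders, and every call is guarded with a feature check so the same code compiles and runs on non-Mac platforms:

```java
import java.awt.Desktop;
import javax.swing.JMenuBar;

public class MacIntegration {
    // Sketch of the Java 9 replacements for the old eawt/OSXAdapter
    // reflection tricks. Guarded so it runs anywhere.
    public static void install(JMenuBar menuBar) {
        if (!Desktop.isDesktopSupported()) {
            return;
        }
        Desktop desktop = Desktop.getDesktop();
        if (desktop.isSupported(Desktop.Action.APP_MENU_BAR)) {
            // Replaces System.setProperty("apple.laf.useScreenMenuBar", "true")
            desktop.setDefaultMenuBar(menuBar);
        }
        if (desktop.isSupported(Desktop.Action.APP_ABOUT)) {
            desktop.setAboutHandler(e -> { /* show About dialog */ });
        }
        if (desktop.isSupported(Desktop.Action.APP_PREFERENCES)) {
            desktop.setPreferencesHandler(e -> { /* show Preferences */ });
        }
    }
}
```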

One giant leap

I confess, I never did complete migrating my software's website from a Ruby on Rails app, with an embedded local MySQL database, running on a VM in the cloud, to a full cloud native application leveraging lots of AWS goodness. I got as far as putting the installer binaries in an S3 bucket and referencing them from the app, and I only did that because the repo had become bigger than the maximum size allowed by Bitbucket. I then studied all the different options; my real job took me into focusing on cloud native applications and I could see I was barking up yesterday's tree. The setup I'm running today is the day before yesterday's tree. Cloud native all the way is the future.

From those earlier studies I could see it's a big job to migrate something that "just works", with fully automated deployment to the UAT and production sites courtesy of Capistrano, to a new world order while maintaining the simplicity of deployment enjoyed today. I recall that didn't come easy: SSL certs, schema refreshes, Facebook integration for OAuth and so on. Rome wasn't built in a day and neither will the next-gen Drum Score Editor website be.

The first thing is to get to a point where development iterations can be fast, simple and automated. The scope of the app's functions is too broad for a big-bang approach, so we'll be developing in bite-size chunks. When you do that you don't want a massive overhead in testing, packaging and deploying your app, and in cloud native speak that means we need a Continuous Integration / Continuous Deployment pipeline. The first investment should be in getting that up and running, to be as efficient and agile as possible in getting new updates out. Eclipse is my IDE of choice, so to be able to commit and push an update and have the changes pushed out to the CDNs caching at the edge of the web would be ideal!

To build and prove the CI/CD pipeline, we're starting with a static site containing the documentation pages: developed using Eclipse, hosted in GitHub, deployed automatically to a UAT site on S3 using Travis. Why Travis, why GitHub, why S3? Because it's a pattern that appears to be quite mature, so help is at hand courtesy of the usual places Mr Google serves up: Stack Overflow, personal blogs etc.

Travis uses your GitHub account, so integration was relatively easy: it found all my repos there. There was a slight hiccup in that it just sat at the first_sync page and left me wondering, but given I hadn't read a single line of documentation at this stage, I considered it so far so good! I activated the repo I needed in my Travis profile, then set about setting up the static site on AWS.

Plenty of good how-to pages out there on setting up a static site. Wishing to use Zurb Foundation as the front-end framework, I downloaded their distro into a new static web project in Eclipse and followed the AWS docs to set up the static website in S3. Next was to enable a CloudFront distribution, and then I looked at putting a cert on the front. At this point I found my cert provider had been removed from the good guys' club and all my certs would no longer be trusted. Thanks StartCom, or more accurately WoSign, for your failures, including not disclosing the acquisition of StartCom. Leaving it as is for now; I'll revisit certificate renewal and reissue hell across my stuff at a later date. Depending on when you try that link you'll either get the Foundation intro page or more of my docs as this develops!

Next is to get Travis talking to AWS. I followed Renzo Lucioni's how-to to set up the credentials Travis needs on AWS to update the S3 bucket and invalidate the CloudFront distribution, so it picks up the changes and pushes them to the edge. Watch out: although the blog sets the environment variable for the AWS default region, it doesn't include it in the S3 upload params; see the Travis S3 deployment page. I've contacted Renzo, i.e. raised an issue, as his blog is a static website generated from GitHub – great toolchain!

One thing to note: Eclipse insists on creating a folder with your project name, then another under that called WebContent, while the .travis.yml file must go in the root of the git repo. Renzo's example calls for the files to be deployed to reside in a folder called 'build' in the root of the Travis build container. I didn't like the idea of an empty build script, as I'll be putting more in there soon, so I added a line to create the build directory and another to copy the contents of the static site there.

language: python
python:
- "3.5"
cache: pip
install:
# Install any dependencies required for building your site here.
# `awscli` is required for invalidation of CloudFront distributions.
- pip install awscli
script:
# Build your site (e.g., HTML, CSS, JS) here.
- mkdir build
- cp -Rp DocNext/WebContent/* build
deploy:
  # Control deployment by setting a value for `on`. Setting the `branch`
  # option to `master` means Travis will only attempt a deployment on
  # builds of your repo's master branch (e.g., after you merge a PR).
  on:
    branch: master
  provider: s3
  # You can refer to environment variables from Travis repo settings!
  access_key_id: $AWS_ACCESS_KEY_ID
  secret_access_key: $AWS_SECRET_ACCESS_KEY
  # Name of the S3 bucket to which your site should be uploaded.
  bucket: $AWS_BUCKET
  # Include the region here too, not just as an environment variable
  # (the fix to Renzo's example mentioned above).
  region: $AWS_DEFAULT_REGION
  # Prevent Travis from deleting your built site so it can be uploaded.
  skip_cleanup: true
  # Path to a directory containing your built site.
  local_dir: build
  # Set the Cache-Control header.
  cache_control: "max-age=21600"
after_deploy:
# Allow `awscli` to make requests to CloudFront.
- aws configure set preview.cloudfront true
# Invalidate every object in the targeted distribution.
- aws cloudfront create-invalidation --distribution-id $CLOUDFRONT_DISTRIBUTION_ID --paths "/*"

So here’s the summary:

  1. Create a new Eclipse static website project and populate it with the Zurb Foundation download; turn it into a git repo, create the GitHub repo, commit and push.
  2. Create your S3 buckets, CloudFront distribution and DNS entry.
  3. Create your Travis account linked to your GitHub.
  4. Copy in the .travis.yml file from Renzo's blog, adding the tweaks to populate the build directory and fix the region setting.
  5. Commit and push to GitHub, then watch the magic happen in Travis.

Migrating Installer Images to AWS S3 and CloudFront

This is the second in a series of posts which describe the adventures encountered while sticking our heads even further in the clouds.

  • The first article is mostly Project Introduction & Background, describes what we’re doing and why in higher level terms
  • This post is about getting our static data hosted at AWS as our first steps to using the AWS cloud
  • Then we’ll talk about Migrating Site Local MySQL database to AWS RDS
  • And the big kahuna part 1: Establishing the Ruby on Rails Web App Environment on AWS EBS

OK, firstly let's think about the types of static content we want to serve from AWS, then look at how this impacts our web app, and then at the build and distribution workflows for the Drum Score Editor app itself.

We've got two types of static content we want served. Firstly, there are the Drum Score App installer images for each platform, plus example scores and PDFs. We're going to store these in an AWS S3 bucket, primarily to reduce the size of the web app so it can be deployed in later steps using AWS Elastic Beanstalk, which has a hard, untweakable maximum web-app size (500MB at last look).

Secondly, there's the Rails asset pipeline: all the static CSS, JavaScript etc. that goes with an app. Some reading of the various opinions on the interweb reveals that AWS CloudFront is the CDN, which caches copies of static assets closer to the user. You simply specify the origin of the files and it does its magic to make them appear. That origin could be the S3 bucket containing the installer images, or your web site itself, or the assets can be precompiled to an S3 bucket. We'll look at the options, and why we chose the solution we did, later in this article.

Step 1 – getting the installer images into an S3 bucket

Before we can put anything in a bucket, we need to acquire that bucket. Should we just use the AWS Console, as this is a one-off operation? Or do we want different buckets for UAT and production separation, and so should create a reusable script for their creation? Do we need that complexity? Probably not at this stage, given our overall use case.

First we create a bucket for these resources, imaginatively called drumscoreportal-resources, upload our installer image into it using the console (or the AWS command-line tools), and make it public by right-clicking on the uploaded filename. Selecting Properties shows the public name of the file, so to test this worked we copied the link presented, pasted it into a command line and used curl to pull it down. Remember, it's a common installer we want available to everybody; there's no need to control access to it, so no permissions to work out. This all just worked, so in theory we can tweak the download links in the appropriate page in the web app and off we go.

However, everybody does local development, right? You really don't want to run up your AWS network costs by pulling copies down in an iterative dev/test cycle. So we need to make the development environment use local copies while UAT and production use the S3 objects.

Given all we're doing is pulling down the objects via HTTP, we don't need anything cleverer than a local web server serving the files at a similar URL, so the app can switch through Rails environment-specific initialisers. The secrets.yml file is already used to separate which hosts are used for each environment for things like the Facebook and PayPal integrations. It might be impure, but popping a line in there for resource_host and using that in the link tags might be viable.

The download link in the view then becomes

<a href="#{Rails.application.secrets.resource_host}/drumscoreportal-resources/DrumScoreEditor-2.23.dmg" class="button radius" download>Download For Mac OS X</a>

The secrets.yml entry for each environment then specifies the real host for production and UAT, and http://localhost:8080 for development. Why that URL for development? Well, every Mac comes with Python, and from the directory you want to serve files from, the command below works well.

python -m SimpleHTTPServer 8080
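(That command is Python 2's; in Python 3 the module was renamed to http.server, i.e. `python3 -m http.server 8080`. The same trick can be sketched programmatically, which is handy inside a test harness; the function name here is my own, not a stdlib one.)

```python
import http.server
import socketserver
import threading
from functools import partial

def serve_directory(directory, port=0):
    """Serve `directory` over HTTP on localhost, like
    `python -m SimpleHTTPServer` (Python 2) or
    `python3 -m http.server` (Python 3). Returns the running server;
    port 0 picks a free port, readable from server.server_address[1]."""
    handler = partial(http.server.SimpleHTTPRequestHandler, directory=directory)
    server = socketserver.TCPServer(("127.0.0.1", port), handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```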

Simple really: in summary, our web app just serves up a link to the resource in the S3 bucket rather than from its own host. I really need a Rails guru to chime in and say what the best way to set the resource_host variable would be, though. I'm sure secrets.yml isn't meant for this!
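For the record, the sort of secrets.yml entries I mean would look something like this. The hostnames are illustrative placeholders, not my real config; in development the files sit in a local directory named drumscoreportal-resources so the path in the link tag matches:

```yaml
# config/secrets.yml (illustrative values only)
development:
  resource_host: http://localhost:8080
production:
  resource_host: https://s3.amazonaws.com
```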

Last thought before moving on: putting that installer image in the S3 bucket cost money in transfer-in fees, every download costs money, and we've made it publicly available – this worries me.

Step 2 – moving the Rails asset pipeline to S3 & CloudFront

I've chosen not to do this at this stage. What? I thought this was an article about how to do that! Here's the deal: if I follow the pretty simple advice out there to simply use CloudFront, then I'm just running up my costs further.

Well, that's how it seems at the moment: with no control over bandwidth costs driven by user behaviour, or by any malevolent person choosing to run them up, we're exposed enough already. I'll park this for now and return when I better understand what techniques are available for controlling exposure here.

For reference, my current setup (a single VM hosting both the UAT and production websites as Apache virtual sites, with the database as separate schemas in a single MySQL instance) costs less than a tenner a month and has unlimited (subject to fair use) bandwidth. For sure this project is about adding resilience and scalability, but as always there's a decision about cost versus value. We don't understand our total costs yet; perhaps there's a way of modelling potential costs from the Apache and MySQL logs, but we also need to understand what behaviours EBS, RDS, S3 and maybe eventually CloudFront add to those costs.

Scalable Resilient Distribution of Drum Score Editor

AKA getting very cloudy out there! This is the first of a series of posts which will describe the adventures encountered while sticking our heads even further in the clouds.

This instalment is mostly Project Introduction & Background, rapidly followed by the rest of the series (which will become links once written).

Briefly, Drum Score Editor is a GUI app that installs locally on Windows, Mac OS X or Linux. It's written in Java and pretty much lives the write-once-run-(almost)-anywhere dream. OK, no tablets or phones, as the industry pretty much needed a whole bunch of new challenges in that domain – not.

To get it to its users, Drum Score Editor is packaged into native installers for each platform (another process worthy of improvement, and an article one day) and those installers are hosted on a bespoke website. But it's not that simple: there's a free version, which is very usable imho, and then the application of a license key unlocks a bunch of productivity features.

To deliver this to its users a bespoke website has been written, as integrating with the Mac, Windows, Ubuntu etc. app stores is just a bridge too far. Oh, and Apple won't let me host it on theirs: the optional license key is regarded as an in-app purchase, and because the app is cross-platform it doesn't use their technology, so they believe I'm avoiding the 30% fees they charge. The Microsoft one just wanted ridiculous upfront costs. There's probably another article all about this space waiting to be written.

So this website: on the face of it, it's fairly simple. You connect to it and download the installer right from the home page; no tracking, no identity capture, no email spam afterwards, nada. Nothing complicated there: static installer images hosted on the website, accessed from a (sort of) pretty HTML page.

But then there's making a financial contribution in exchange for a license; that's where it gets not-so-simple. Integrations through OAuth2 are needed to register a new account or sign in to an existing one. From there, if the user wishes to acquire a new license we're integrated with PayPal to collect the funds; when that's done, we encrypt a new license key by spawning some Java code which does all the complicated bits, returning the keys to stash in the user's account. Oh, and just to be sure we're good, everything's behind an SSL cert (HTTPS FTW), which also helps protect the site when talking to the aforementioned authentication and payment sites.

So it's not quite as simple as bunging it into a call to AWS Elastic Beanstalk and hey presto. I didn't mention AWS before: that's the target cloudy environment, chosen for its rich etc. etc. – it's the first, biggest and most complete, imho. We're already a bit cloudy, but it's oh so last year and not as comprehensive a solution as AWS offers; not that I'm sure we really need all those bells and whistles, as it's been going fine as a single web app on a VM I rent from a hosting provider for a few years now. Yes, single: so no resilience against provider outages, and no disaster recovery other than the fact I back up the database and email its contents to myself each day. Ahem, yes, data protection, some of that too please.

Had a quick look at what it will take, and the first hurdle is that my current single git repo containing the web app and the installer images is about 1GB in size. The AWS EBS experts will now be jumping up and down saying it's too big, fix it! We will; that's the first step in giving this web app some more professional attention. It's a shame, though, as it breaks the model of a single repo for the whole app and its data. We'll move the installer images to S3 and use CloudFront to serve the content closer to the site's users.

Second phase is to shift the database out of the single-instance web app so it can be accessed by multiple instances of the app. This is going to be trickier. Currently there's a very effective Capistrano integration (seems I forgot to say the web app is written in Ruby on Rails, talking to a MySQL database) which takes care of pushing updates to either the UAT or the production website. Both are hosted on the same VM, by the same Apache2 instance configured with virtual servers. Yeah, not exactly a lot of separation there either; another thing that'll be fixed by getting cloudier. I like to call the current config "just good enough", and the thinking man's server consolidation (1990s style).

Third phase has to be moving the main web app to EBS. There will be plenty of challenges with this, hence wanting to separate out the two major changes above and make sure they're working before this piece of heavy lifting. Just to finish this intro to the project, here's a picture that tries to show the before and after for the whole shebang at a fairly high level.


By the way, just as a footnote: when we're done with this, we'll look at how the many tools and GitHub repos which address the complexity in this space are progressing. Right now, I couldn't find anything that meets the requirements without a complete rearchitecture and rewrite of the web app. Feel free to comment away with recommendations; I'll look at them all, honest!

Oh, and p.p.s.: continuous integration – there must be answers in there for this whole toolchain! Another future post.

A Windows build machine in the cloud

I need a Windows machine for about an hour each time I release my software, to perform the final build and validate it on a Windows platform. The majority of the testing I can do on my Mac, as the software is 99% identical between platforms; thank you, Java.

I used to run a Windows VM on a 128GB SDXC card; it failed. I have a backup somewhere, but I'm on the road on a lengthy business trip and I've promised a new release of the software. What do I do?

I'm no longer concerned about my digital assets when a computer or storage device dies. My software is in the cloud on Bitbucket; my photos are out there somewhere too; my life is managed by Evernote, also out in the cloud. So where should my Windows server be on the rare occasions (once a month on average, maybe) that I need it?

You can now rent a Windows server by the hour. I don't need massive performance, just enough to get a build done and ensure my software installs and functions OK, preferably within an hour to keep rental costs down. A little research shows who the players are and what the costs look like. Here's what I found as of September 2015.

Per-hour analysis (the original table's columns were service name, server type, server cost, disk cost, Windows cost and network cost; the server types and total estimates are what survived):

  • 1 shared CPU / 0.6GB RAM – estimate $0.54
  • 1 shared CPU / 1.7GB RAM – estimate $0.69
  • 1 vCPU / 3.75GB RAM – estimate $0.74
  • 1 vCPU / 0.75GB RAM / 20GB disk – estimate $0.02
  • 1 vCPU / 1.75GB RAM / 40GB disk – estimate $0.08
  • 2 vCPU / 3.5GB RAM / 60GB disk – estimate $0.16
  • Estimate $0.75
  • Estimate $0.77
  • Estimate $0.80

In other words, for my usage it's all very, very cheap. All this was from a quick search of their websites; with so much different language used I may have interpreted some of it inaccurately, but it's slightly more than a SWAG to start from. Time to sign up, write some scripts to automate deployment on each, cycle round one per month, and see in practical terms what the costs come to and how well each service works for my use case.

The Big Move II: Java Deployment Automation with javafx-gradle

Before cracking on to step 2 in the master plan, where we rewrite the app in JavaFX instead of Swing, it might be possible to tidy up how I build releases for the users. Currently I take the jar (previously created by Eclipse and now by Gradle) and then run the javapackager command from Java 1.8 with a whole bunch of parameters to create the install package for each platform; e.g. for Mac it's this:

export JAVA_HOME=`/usr/libexec/java_home`
export JP=$JAVA_HOME/bin/javapackager
CMD="$JP -deploy -srcfiles ./DrumScoreEditor-2.00.jar -outdir ./outdir -outfile DrumScoreEditor -native installer -appclass org.whiteware.DrumScoreEditor -name Drum\ Score\ Editor -Bicon=Drum\ Score\ Editor.icns"
eval $CMD

Run it and you get:

➜ builder.test ./
No base JDK. Package will use system JRE.
Building DMG package for Drum Score Editor
Result DMG installer for Drum Score Editor: /Users/alanwhite/Development/export/builder.test/./outdir/bundles/Drum Score Editor-1.0.dmg
Building PKG package for Drum Score Editor
Building Mac App Store Bundle for Drum Score Editor

This results in a great dmg in outdir/bundles that has the usual drag-to-install experience for Mac. It's also signed, and for the life of me I have no recollection of how it gets hold of my developer certificates from the keychain; probably a smart default/convention (on the to-do list to investigate).

What I'd really like is to have Gradle do all this for me, without me trying to remember each time what I need to do for Mac and Windows. This is what the javafx-gradle plugin promises, I think! That's what we're discovering here; I don't want to gripe, as it may be my lack of context, but the documentation seems very sparse.

First step was to download the 32MB repository and investigate the contents. Then, following the readme and building a couple of the samples, it all seemed to make sense. What is really useful is the FullyExpressed sample, which shows most of the configuration you can specify if the built-in defaults (called conventions these days) aren't suitable.

Then, when adding the plugin to my test Gradle project for Drum Score Editor, it got messy. The only way I could see to integrate it was to copy the file javafx.plugin into the Gradle project directory and reference it in build.gradle using the apply from: syntax. Once through this, I simply wasn't understanding the errors encountered when trying a Gradle build; it seemed to be a clash between jar files and directives in the build.gradle. To save you the same pain, this is the point I realised that if I add the javafx plugin, I have to remove the java plugin from the build.gradle. From there, once I'd specified as a minimum the mainClass in the javafx closure, I could at least get a complete, successful compile-and-deploy cycle, i.e. it produced the Mac app, dmg, pkg etc. Short-lived success though – running the app gave:

LSOpenURLsWithRole() failed with error -10810 for the file /Users/alanwhite/Development/gradletest/build/distributions/

I remember dealing with such obscure errors in the dim and distant past, in the bad old days prior to javapackager, when trying to figure out how to create the Mac app. This one appears to say it can't find a Java runtime, but I learned a while back that there are lots of misleading clues in this space. Turns out I'd mistyped the mainClass in the build.gradle – only telling you here in case anyone googles the error and avoids a wasted hour tracking it down!

Time to move over the last pieces of functionality explicitly stated in the old build scripts above. The name of the app is fairly simple; that's a directive you can see in the final build.gradle below. Icons, however, require some explanation; all became clear when I found a blog post from the author of the javafx-gradle plugin. Don't be put off by the age of some of these posts, btw (it's 2015 as I write this); I've seen comments about the plugin not being maintained. That will be a problem when it stops working. Right now, I'm getting close to liking it!

Having found my original icons that made up the iconset in use on the Mac builds, I copied them into src/deploy/package and prefixed them with shortcut_, and a gradle build generated the icns needed and buried it in the app.

The last piece of customisation is the developer ID for signing the app. I'm not sure how the prior method worked, but I think now it works by my providing my name (as specified in the developer cert); by concatenating it with the well-known strings Apple use, it pulls the identity out of the keychain. Very happy with the progress here. Next: integrating the Windows build.


First the good news: taking all the work performed on the Mac, adding it to a git repo, pushing, and pulling it down to the Windows machine, "it just worked". gradlew did what it was supposed to in loading the correct version of Gradle etc. – a really useful feature.

The problem was that my build was failing when trying to build the installer. I learnt a lesson here after an hour or so poking around the web and trying different things. Important lesson: if something doesn't work, use "gradlew build --debug". The output is verbose, but all I had to work on before this was "error 2" on exec of iscc.exe. Nice, huh? With the debug output I could see that iscc.exe was complaining that the icon being used for setup was too large.

A bit more digging, and it looks like javafx-gradle combines all the different-sized PNGs in src/deploy/package into a multi-layered .ico file so Windows can choose the best-fit resolution. Useful, except that if you've got the full complement of sizes required for a full Mac iconset, the generated .ico is bigger than the 100Kb that Inno Setup supports (taken from its source code on GitHub).
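As a guard for next time, here's a quick check one could run over src/deploy/package before the Windows build. This helper is my own invention, not part of the plugin, and the combined PNG size is only a rough upper bound on the generated .ico:

```python
import os

# Inno Setup rejects setup icons over 100 KB (per its source on GitHub),
# and javafx-gradle folds every PNG in the package directory into one
# multi-layered .ico, so the combined size is what matters.
ICO_LIMIT_BYTES = 100 * 1024

def combined_png_size(package_dir):
    """Rough upper bound on the generated .ico: the sum of all PNG sizes."""
    return sum(
        os.path.getsize(os.path.join(package_dir, name))
        for name in os.listdir(package_dir)
        if name.lower().endswith(".png")
    )

def icon_set_fits(package_dir):
    """True if the iconset should survive Inno Setup's size limit."""
    return combined_png_size(package_dir) <= ICO_LIMIT_BYTES
```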

Through trial and many errors, I thought the best solution for maintaining the conventional directory structure with minimal customisation of the build.gradle file was to use the icon configuration statement inside the Windows platform-specific section, which overrides the convention when invoked. That way there's no need to complicate the Mac build with additional statements, nor compromise on file locations.

This, however, wasn't actually why it worked. It turns out I'd also clumsily deleted a 512×512 PNG at the same time; checking the javafx-plugin source code shows the icons statement doesn't apply within the platform-specific scope statements. Grr, controlled testing needed.

apply plugin: 'eclipse'
apply from: 'javafx.plugin'

dependencies {
    compile fileTree(dir: 'libs', include: ['*.jar'])
}

sourceSets {
    main {
        resources {
            srcDir 'src/main/java'
        }
    }
}

javafx {
    appName 'Drum Score Editor'
    mainClass 'org.whiteware.DrumScoreEditor'

    profiles {
        macosx {
            bundleArguments = [
                'mac.signing-key-user-name' : 'Alan White'
            ]
        }
    }
}

The Big Move

I'd like to have Android and iOS versions of Drum Score Editor; I've been asked many times. To do this, the recommended way according to each of the platform vendors (Google, Apple, Microsoft) is to buy and learn their proprietary implementation languages to get the best seamless experience: completely rewrite your app so it feels like it belongs on their platform, naturally. Drum Score Editor is 25,000 lines of Java; I'm not rewriting that several times!

  • Mac OS X: learn a language called Swift, the Xcode IDE, the Mac OS APIs, the Mac App Store, and proprietary licensing and in-app purchase technology
  • iOS: as above, but with a bunch of different underlying iOS APIs …
  • Microsoft Windows: a different language (C++), different Windows APIs, a different app store, and different licensing and purchasing technologies
  • Android: well, it's written in Java, but not as we know it, Jim. A different build environment, and yet another set of app store and purchasing technologies

Unfortunately that means lots of different skills and lots of code rewritten multiple times; skills I can't afford to employ or learn, and time I'd lose for little functional progress, as Drum Score Editor is my hobby, not some multi-million-dollar IT conglomerate.

However, there are new technologies emerging from the open-source world that might make one piece of software for all platforms possible. Drum Score Editor is written in a language called Java, and its user interface is built using the Swing libraries. There's a new way of writing desktop apps in Java called JavaFX. Once written in JavaFX, you can use various technologies to make your app installable and runnable on Mac OS X, Windows, iPads and Android tablets. Some work will need to be done per platform to make the app feel natural, but the bulk of the effort remains common.

Sidebar: I don't think I'll ever get away from the differing app stores and technologies; I do like the reach the app-store concept gives, e.g. the free 1.97 version of Drum Score packaged for the Mac App Store had 1,077 downloads. Neat, but replicating that on all platforms, and providing an in-app purchase for the studio workflows, is a lot of work.

Looking into the technologies needed to get the bulk of Drum Score's source code shared across all platforms, and to reduce the effort involved in packaging and releasing it, the first thing I need is a cross-platform build system. Gradle is touted as the way to go, with the best integrations for Android and iOS, as well as being designed with Java in mind in the first place. Currently I use Eclipse's built-in build mechanism to produce an executable jar, and then use the javapackager tool to create installers for Mac and Windows. Gradle appears to subsume all of that, and integrates with Eclipse.

So … the plan!
Step 1: convert to Gradle, ensuring I can retain the Eclipse IDE and source code control in Git, and build Mac and Windows packages as before.
Step 2: convert Drum Score Editor to JavaFX, leveraging the development and packaging system from step 1, producing Mac and Windows packages.
Step 3: produce an iOS package, see how usable it is, and introduce platform-specific code in Drum Score Editor for iOS.
Step 4: repeat step 3 for Android.

Step 1: Convert to Gradle
A little tricky figuring it out. There's lots of documentation but it doesn't seem complete; it's hard to pick through, and in some cases various Stack Exchange answers were the only source. Nevertheless, if you're used to working with open source you'll be used to picking around for the ultimate truths.

I wanted to leverage the convention over configuration paradigm as much as possible, without changing actual program source code.

My settings.gradle file looks like this:

rootProject.name = 'DrumScoreEditor'

This ensures the produced jar file containing the application has that name, rather than the name of the directory you've put the code in to build it.

The step 1 build.gradle looks like this so far:

apply plugin: 'java'
apply plugin: 'eclipse'

dependencies {
    compile fileTree(dir: 'libs', include: ['*.jar'])
}

sourceSets {
    main {
        resources {
            srcDir 'src/main/java'
        }
    }
}

jar {
    from configurations.compile.collect { it.isDirectory() ? it : zipTree(it) }
    manifest {
        attributes 'Main-Class': 'org.whiteware.DrumScoreEditor'
    }
}

The dependencies statement is needed because I always keep local copies of the exact versions of the 3rd-party libraries I'm using. We can argue about best practices, but this reduces my overheads: I do a lot of development disconnected from the network, so I need to be self-contained, with as few distractions from changing library versions as possible. This effectively tells the compiler where to look for the 3rd-party libraries, and it's one area where I had to change the project structure.

The sourceSets resources directive is there because the resources (i.e. images) the app loads are currently in a well-known location in the source tree, and this ensures they're copied over into the equivalent location in the class tree that's bundled into the final jar. I could have moved them so the convention would work, but that means a source-code change; it could be as simple as updating a public static constant for the path, but I'm not taking that distraction on just yet.

The jar statement is all about how to configure the produced jar file. Here we say bundle all 3rd party libraries in the jar, and then specify the entry point to the program.

At this stage, ensure you can run your jar; after 'gradle build' it's in build/libs. A 'java -jar DrumScoreEditor.jar' and all was eventually well, once I'd got the build.gradle file looking as above.
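To check the jar directive did its job without launching the app, one can peek at the manifest. This is a small helper of my own, not part of Gradle, and it ignores the manifest's 72-byte line-wrapping rule, which is fine for short attribute values like this one:

```python
import zipfile

def main_class_of(jar_path):
    """Return the Main-Class attribute from a jar's MANIFEST.MF, or None.

    A jar is just a zip; the manifest lives at META-INF/MANIFEST.MF."""
    with zipfile.ZipFile(jar_path) as jar:
        manifest = jar.read("META-INF/MANIFEST.MF").decode("utf-8")
    for line in manifest.splitlines():
        if line.startswith("Main-Class:"):
            return line.split(":", 1)[1].strip()
    return None
```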

A quick 'gradle eclipse' and the necessary files were created to allow a relatively drama-free import into Eclipse using File->Import, Existing Project into Workspace. It ran first time, thanks to having resolved all the runtime libraries first, as per the build.gradle.