Author here! Hope you take a look at the project and find it cool. There's a lot of interesting stuff here. In particular, the video linked on the landing page is a great intro from a Java developer point of view, and the following video is a great intro from a build tool architect point of view:
For those of you who want to learn more about the design principles and architecture of Mill, and what makes it unique, you should check out the page on Design Principles which has links to videos and blog posts where I elaborate on what exactly makes Mill so different from Maven, Gradle, SBT, Bazel, and so on:
I've mentioned this in a few places, but the comparisons with other build tools are best-effort. I have no doubt they can be made more accurate, and welcome feedback so I can go back and refine them. Please take them with a grain of salt
I'm also trying to get the community involved, so it's not just me writing code and running the show. To that end, I have set up a bounty program to pay out significant sums of money (500-2000USD apiece) to people who make non-trivial contributions. It's already paid out about 10kUSD and has another 20kUSD on the table, so if anyone wants to get involved and make a little cash, feel free to take a shot at one of the bounties! https://github.com/orgs/com-lihaoyi/discussions/6
How possible is it to make your tool "zero-config" by default? I see a lot of comments in this thread and elsewhere on twitter asking for essentially `go build`, `go fmt`, `go test` for Java/JVM. I think the language has quite strong convention around directory layout and file naming already, so do you think it would be possible for mill or a mill wrapper to offer the same kind of standardized zero config workflow? I think a JVM tool that gets that right - takes it as far as possible to the golang model - would have a lot of happy users.
Scala CLI has replaced the default runner since Scala 3.5, so you can effectively do `scala run`, `scala fmt`, and so on. On the Java side, I believe JBang provides a very similar developer experience.
Fundamentally it's hard to reconcile both worlds, though. Building non-trivial multi-module projects on the JVM is inherently complex, especially when you throw in multiple build targets, multiple toolchains, multiple platforms...
With simpler build tools (like in Go or Rust) you shift this complexity elsewhere, typically in a Makefile and/or a Docker/OCI based build pipeline, and these can get pretty complex too. Let alone distributed build tools like Bazel.
There's scala-cli, which has become the default Scala run command since Scala 3.5 but is also available separately. It has all those bells and whistles and allows scripts to grow organically. And no matter the name, it handles Java code too.
With scala-cli, there's not even a need to download a Java runtime or a language distribution. You can let the runner do its thing, or pass options to choose the JVM and the language version to use, or even write those options into special headers in the code files. You can also write tests, format code... it's all built-in. And in case the code outgrows the tool and there's a need to migrate to a different build tool, there's even a feature to export the build to sbt or Mill.
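To give a flavour of those special headers, here is a small sketch of scala-cli's "using directives" (the directive syntax and the versions shown are only examples and differ slightly between releases):

    // hello.scala -- a sketch of scala-cli "using directives";
    // the versions below are placeholders, not recommendations
    //> using scala 3.3.4
    //> using jvm 17
    //> using dep com.lihaoyi::os-lib:0.9.1

    @main def hello(): Unit =
      println(s"running from ${os.pwd}")

Running `scala-cli run hello.scala` (or `scala run hello.scala` with Scala 3.5+) fetches the requested JVM, compiler, and dependency before compiling and running the file.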
Not the author, but this is unfortunately a bit more difficult than it sounds. Like for example, where do you get the name of the jar file to build? I guess you could use the name of the root directory, but that may not be ideal.
How do you figure out dependencies? Import statements in .java files give you the packages to import, but those package names could be provided by one or more .jar files and, regardless, the package names need not bear any relation to the jar name or its group/artifact IDs (if pulling from e.g. a maven-style repository, which basically everyone does).
For multi-module projects, how do you figure out the dependencies between the modules, even? Sure, you could probably figure that out by parsing all the .java files in all modules and figuring out what they provide and import, but that would be slower than maven, probably.
You could certainly do this for small, dependency-free programs, but it would be such a niche use case that I don't think it would be worth the time.
Name of the jar? `java build jar/foo` -> foo.jar, `java build src/dog` -> dog.jar.
Dependencies? It's okay to use a dependency list file for these - I guess I don't consider this config compared to the stuff I usually find in a Gradle or Maven file. The thing I'm allergic to is all the stuff that isn't a dependency list.
In Go, these go in a go.mod file that's automatically updated by the build tooling, which runs instantly & has a cache you never need to think about. Go has the advantage of import paths being URLs that specify the dependency too, but I think my Go-For-Java tool would use reverse package import search from an online service to map eg com.foo.bar.something.Potato to the appropriate package "foo-bar" from Maven Central or whatever. Building that index seems like a trivial program to write for the average Java engineer.
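A sketch of what that reverse lookup could look like (purely hypothetical: the index contents and the `lookup` helper below are illustrative, not an existing service or API):

    // Hypothetical reverse index from package prefixes to Maven coordinates.
    // A real version would be generated from the class listings of artifacts
    // on Maven Central (or queried from an online service).
    val index: Map[String, String] = Map(
      "com.google.common"        -> "com.google.guava:guava",
      "org.apache.commons.lang3" -> "org.apache.commons:commons-lang3"
    )

    // Map an imported class name to a coordinate by trying progressively
    // shorter package prefixes.
    def lookup(importedClass: String): Option[String] = {
      val pkg = importedClass.split('.').dropRight(1)
      (pkg.length to 1 by -1)
        .map(n => pkg.take(n).mkString("."))
        .flatMap(index.get)
        .headOption
    }

    // lookup("com.google.common.collect.ImmutableList")
    //   returns Some("com.google.guava:guava")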
The more I think about this "go for java" idea, the more I want to build it "in anger" just to see how off-base I am. Maybe I really am just going to re-implement or wrap sbt, mill, gradle, https://www.jbang.dev/ idk. It just feels like the experience could be an order of magnitude simpler as an end-user with some conventions strictly enforced by the tooling.
There are initiatives like declarative Gradle or JetBrains' Amper. But I assume they'll hit a wall in real-life exactly like Maven does. Think about packaging for instance, I see at least 4 or 5 different ways that are fairly common, and that's for one target only.
Dependency resolution uses Coursier, which is one of the open source JVM dependency resolvers. SBT uses it too, and my last company used it with Bazel
The "ivy" thing is legacy haha. Mill used to use Apache Ivy to resolve dependencies, years ago. Coursier was a better/faster replacement, but names have a tendency to stick around
I’d be interested to read a comparison with Bazel (which you already mention as one of the influences).
For somebody looking to escape from Gradle, Bazel seems like one of the most promising alternatives, as it’s built on sane and sound fundamentals. Although in practice it has plenty of rough edges and annoyances, so maybe there are areas where Mill can do better.
Traditionally I've labelled my OSS projects 1.0 when they've stabilized and the rate of change has greatly reduced. Right now Mill is not there yet, but maybe if at end-2025 we realize no breaking changes have been needed since end-2024, we can call it 1.0
Just a tiny notice: neither the github page nor the website seems to currently contain an “installation” link. The one found by google returns a ‘page not found’ for the current version.
A build tool that is not only fast but configured in a type safe way sounds great. I really like this quote from the "Why use scala" part of the documentation:
> Most developers using a build tool are not build tool experts, and have no desire to become build tool experts. They will forever be cargo-culting examples they find online, copy-pasting from other parts of the codebase, or blindly fumbling their customizations. It is in this context that Mill’s static typing really shines: what such "perpetual beginners" need most is help understanding/navigating the build logic, and help checking their proposed changes for dumb mistakes. And there will be dumb mistakes, because most people are not and will never be build-tool experts or enthusiasts
I gave Mill a try earlier this year. My hope was to escape the nightmare that is Gradle, which I've been using for many years. Mill sounds great in theory (except for the Scala DSL). Unfortunately, I couldn't get a basic Java build to work in half a day, even though I have (admittedly rusty) working knowledge of Scala. It was one obscure error after another. My conclusion was that Java support isn't ready. There was also very little documentation on how to build Java.
In my opinion, using a general-purpose language as the build language of a polyglot build tool is a dead end, both for technical/usability reasons and because the ensuing language wars can't be won. I'm looking forward to the day when a build tool embraces a modern config language such as CUE or Pkl.
Depending on when you tried, it could be worth trying again: support for Java has improved greatly in the last few months, as has the documentation. Come by our discord channel if you get stuck and I can help unblock you
I've had the same gripe about having to keep up with a second language just for the build tool for a while. Try taking a look at JeKa https://jeka.dev/
Mill looks interesting, but, _from a Java development perspective_, it has the same fundamental challenge as Gradle (and most other build systems), which is that its config language _is something other than Java_. That means there's a significant cognitive burden to understand and manage something that one hopes to not have to think about very often.
I find that the pain I experience with Gradle isn't usually about how to do something clever or customized etc, but instead it's when I haven't thought about Gradle syntax in the last 3 months since everything has been silently working, but now I need to figure out some small thing, and that means I need to go re-learn basic Gradle stuff - whether it's groovy, Kotlin, or some aspect of the build DSL - since my mind has unloaded everything about Gradle in the meantime.
Simplifying the semantic complexity of a general purpose build system will always help, but the most useful thing for me would be if the configuration for a Java build were to natively use the Java language directly.
It's intended as a replacement for _scala_ builds. Having a build definition in the native language that doesn't require a different syntax (like a declarative syntax such as maven xml or toml) makes task customization easier for the maintainer of a given project. Unfortunately, it also means that you have to know the language and read the documentation for the build system.
If you want something declarative, there's also bleep[1] in the scala ecosystem. And for single module builds there's scala-cli[2]. It's also possible to use gradle and maven for scala projects, but for a Java-only shop I wouldn't be using mill or bleep because there's no need to introduce a new language just to manage the build. For scala/java/kotlin hybrid projects though, gradle or mill or sbt would be my recommended tool because of how tightly they are coupled with the cross-platform build matrix nature of scala library and build system plugin ecosystems. For larger builds, it's mill or bazel because there's a performance cliff in sbt and gradle, and bleep is too new to have all the standard plugins ported. We use mill at writer.
The intention has changed; Mill now explicitly targets Java and Kotlin as well. It now has dedicated Java/Kotlin docsite sections and examples, and has grown integrations with Palantir-Format, Checkstyle, Errorprone, Jacoco, and all their Kotlin equivalents (ktfmt, ktlint, kover).
Java and Scala (and Kotlin) are remarkably similar from a tooling perspective, so Mill tries to target both using the same shared infrastructure
> It's intended as a replacement for _scala_ builds. Having a build definition in the native language [...] makes task customization easier for the maintainer
Totally agree! But the title of the post says "Mill: A fast JVM build tool for Java and Scala" :) - it certainly looks like a better tool for the Scala community.
For projects that are primarily building Java sources, it'd be nice to have a build system that uses Java code to describe the build. I don't think this exists at the moment.
Even in Java, because the language is relatively verbose, many frameworks fall back to an "inner platform" of magic annotations which have the same problem: e.g. just because you know how Java works doesn't mean your mind hasn't unloaded all the SpringBoot annotation semantics! But despite that it is worth it, because conciseness does matter.
Mill using Scala syntax is like that, but with the added advantage that even if you forget how Scala works, your IDE does not. You can really lean on Intellij or VScode to help you understand and navigate around a Mill build in a way that is beyond what is possible for most build tools: You can autocomplete things, peek at docs, navigate the build graph and module tree, etc. and learn what you need to learn without needing to reach for Google/ChatGPT. I use this ability heavily, and I hope others will enjoy these benefits as well
> Until you need to fix a 3 year old build that has some insane wizardry going on.
My experience with Gradle is that it's the "3 year old build" that is almost certainly a death knell more than the insane wizardry part. My experience:
git clone .../ancient-codebase.git
cd ancient-codebase
./gradlew # <-- oh, the wrapper, so it will download the version it wants, hazzah!
for _ in $(seq 1 infinity); do echo gradle vomit you have to sift through; done
echo 'BUILD FAILED' >&2
exit 1
I dislike Gradle as much as you probably do, but between Maven and Gradle, the one that "vomits" stuff on the command line is definitely Maven.
Gradle errs by going too far to the other end: it just doesn't log anything at all, even the tasks that are actually being run (vs skipped... do you know how to get Gradle to show them?? It's `gradle --console=plain`, so obvious!! Why would anyone complain about that, right?!) or the print outs you add to the build to try to understand what the heck is going on.
Having worked with Maven and Gradle, I'd say Gradle was worse in the average case, but better in the worst case. There are way more Gradle projects with unnecessary custom build code because Gradle makes it easy to do.
On the other hand, when builds are specified in a limited-power build config language, like POM, then when someone needs to do something custom, they have to extend or modify the build tool itself, which in my experience causes way more pain than custom code in a build file. Custom logic in Maven means building and publishing an extension; it can't be local to the project. You may encounter projects that depend on extensions from long-lost open source projects, or long-lost internal projects. On one occasion, I was lucky to find a source jar for the extension in the Maven repository. It can be a nightmare.
The same could happen with Gradle, since a build can depend on arbitrary libraries, but I never saw it in the wild. People depended on major open-source extensions and added their own custom code inside the build.
When I used Maven, extensions had to be published to and pulled from a public repo. We couldn't even use the private repo that we used for the rest of our libraries, because the extension had to be loaded before Maven read the file where our private repo was configured.
Whereas a Gradle build can read Groovy files straight from disk.
I'm using several maven plugins (not extensions) that are defined within the reactor project itself. It works well.
You do need to split your build into multiple projects governed by a reactor but you'll have that anyway as soon as you have more than 1 module. Then you just always build the reactor. Pretty much the same idea as gradle.
Then you don't have a standard build, you have a build with multiple steps that needs to be documented and/or scripted. In an organization where every other project builds in a single step with "mvn package", and people can check out a repo and fire up their IDE and stuff just works, people are going to get bent out of shape because from their perspective, things aren't working out of the box.
A slightly more powerful build tool that supports custom code in the build doesn't force users to script around it. You can create an arbitrarily customized build that builds with the same commands as a Hello World project. (It's a double-edged sword, to be sure, because people don't try as hard to avoid customization as they would with Maven.)
You can, but why should you need to? Why can't the build tool take the plugin code directly off of disk, build it, and use it? This kind of orchestration of manual steps is what build tools are meant to be good at
Sure. But adding ability to self-modify the build drastically increases the complexity of a build tool. Maven developers decided that they want to avoid that.
1: I believe that you encountered errors; programming is packed to the gills with them. But correlation is not causation: just because it did not immediately work in your setup does not mean it's impossible or forbidden
My problem with gradle is that they keep making breaking changes for low-value things like naming of options, so I have to chase deprecation warnings, and can never rely on a distro-supplied gradle version
Gradle devs, please get over yourself and stay backward compatible.
Not to defend Gradle too much, but Groovy is a superset of Java. So if you want, you can just use the regular Groovy dialect and then write Java in your build scripts, it should work.
This is not entirely a solution though, because Gradle's APIs are fairly complicated and change regularly.
(Nitpick, but it’s just a “superficial” superset. The biggest difference is probably doing “multi-methods”, aka the runtime type of an argument deciding which method implementation to call vs java’s static overload resolution.)
The thing that’s great about maven is its declarative nature. You can declare goals and profiles for whatever you need the build system to do.
The main appeal that I can see from mill over maven is the power of dynamic programming over static xml files. Maybe good lsp/ide support will make managing a build system like this bearable?
Yes, IDE support in Mill is key. Without IntelliJ or VSCode, Mill would not be nearly as pleasant to use as it is today.
Mill and Maven both let you declare goals for what you want to do. One does it in XML and one does it in typechecked code. While XML does work, doing things in code with typechecking and full IDE support turns out to be pretty nice as well!
The comparison with Gradle is not up to date. It states that you would end up in an untyped mess of Groovy build files, but statically typed Kotlin files have been the default in Gradle for quite some time now!
https://mill-build.org/mill/0.12.1/comparisons/gradle.html
Author here. Unfortunately this is because my own experience with Gradle is not up to date; I've only lived in the Gradle Groovy world! If anyone is interested in helping out, I have a 1500USD bounty on porting a gradle.kts build to Mill, so we can do a fair up-to-date comparison https://github.com/com-lihaoyi/mill/issues/3670
I never got why people thought Kotlin would help Gradle. It absolutely doesn't.
Groovy was never the problem (Groovy has types, always had, you could use them if you wanted).
Think about it: what do you do with a build tool? You write a little recipe, then you run it. Does that remind you of something? Yes, it reminds you of scripts, like the bash scripts you run all the time in your terminal. And why are scripts almost universally written without types, with no typed alternative (of which there are many) having caught on? Because if you're just going to modify a script and immediately run it, while keeping it short enough that you know what it does without reading a book, how does adding types help you? Quite the contrary: scripts (including build scripts) should be small, and having types all over the place makes them far more verbose than they should be, likely pushing them out of your comfortable local memory in your brain, at which point you need something akin to a "real" programming language and a compiled program, not a script.

Larger programs benefit from types because you don't just run the program, make changes, and run them again, like you do with scripts. You write them, test them, compile them, package them and finally you distribute them to your users, who hopefully only need to configure them, not modify their internals. If your build is that complex, that's exactly what you should be doing instead of trying to shoehorn types into your scripts and expecting them to look like real programs.
Also, the Kotlin DSL just doesn't assist in the most problematic aspect of Gradle: its total lack of discoverability. Try doing something in your Kotlin Gradle file using a plugin you're not familiar with (which is all of them for most of us). It's completely impossible unless you know the DSL of the plugin, just like it was the case with Groovy... Once you know the DSL, it's fairly easy, but even in Groovy you will get auto-completion once you've got to the DSL "entry point", no need for Kotlin. I've been saying this since before they introduced the Kotlin DSL, and now I feel completely vindicated. I've never met anyone who told me "Gradle is so much easier now with Kotlin". But it did mess up plugins I wrote in Kotlin, as now Gradle has a dependency on a very particular version of the Kotlin compiler, and God help you if your plugin was written with a different version in mind.
I disagree with the static vs. dynamic typing part. Modern statically typed languages (like Kotlin, Scala, Rust etc.) are concise and readable. In the case of the Groovy DSL for Gradle it was sometimes hard to get code right or to find a bug. Even IntelliJ struggled at times with this mess of a DSL. So, in my opinion Kotlin is definitely an improvement here!
However, I agree with your second part, the DSL as such. The syntax is arbitrary in many cases and just not easy to remember or to make sense of. It looks like a DSL for the sake of a DSL. Take a look at this example (https://docs.gradle.org/current/userguide/plugins.html):
    plugins {
        application                                     // by name
        java                                            // by name
        id("java")                                      // by id - recommended
        id("org.jetbrains.kotlin.jvm") version "1.9.0"  // by id - recommended
    }
Why are there two ways to reference a plug-in? Why is the version written without parenthesis? Why is version an infix operator? Why not something as simple and consistent as this:
Yes the lack of discoverability, plus the unfamiliar syntax of Groovy, plus names changing between versions, I started with Gradle thinking it would be easier but in the end I'd love to go back to Ant. That was awful to write but at least you could understand it.
I spent years copying essentially the same ant-file across projects, just changing dependencies and target names. It's not really rocket science and unless you're trying to be clever, most java projects can look pretty much the same from a build perspective
What tends to be complex about build requirements that necessitates special purpose tools? Golang seems to be doing fine with just go build and go test. What else are people doing with gradle/maven that requires static typing, DAGs, plugins etc.?
`go build` and `go test` do work, at limited scale and complexity. In Scala there's Scala-CLI which is excellent. If they work for you, you probably aren't the target market for these build tools. Once you start layering on bash scripts, layering on make, layering on Python scripts, layering on manual steps written down in a readme.md somewhere, that's the time when you should consider a proper build tool. And if that has never happened to you in your career, count your blessings :)
Why not just write some boring, pure code to handle the build? Why not write my build system in vanilla LYAH Haskell? It turns out that builds do have some specific requirements that most programs do not need to care about: caching, parallelization, introspection, and so on. Check out the following blog/talk for more details:
Thus "naively" building your project "directly with code" ends up not working, so you do need some additional support. While most build tools end up constructing a complete bespoke programming environment from scratch, Mill tries to leverage the Scala language and JVM as much as possible, so you can re-use all your expertise and tooling (e.g. IntelliJ, VSCode, Maven Central, etc.) almost verbatim while getting all the necessary build-tool stuff (parallelism, caching, introspection) for free. Check out those two links if you want to learn more!
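For a sense of what that looks like in practice, here is a rough sketch of a small Java build in Mill (module and dependency names are illustrative, and exact imports/trait names differ slightly between Mill versions):

    // build.sc -- a sketch of a minimal Mill Java build
    import mill._, javalib._

    object app extends JavaModule {
      // third-party dependencies, resolved from Maven Central via Coursier
      def ivyDeps = Agg(
        ivy"com.google.guava:guava:33.3.0-jre"
      )

      // customizations are plain typechecked overrides the IDE can navigate
      def forkArgs: T[Seq[String]] = Seq("-Xmx1g")
    }

With that in place, commands like `./mill app.compile` and `./mill app.run` get caching and parallelism without any extra wiring.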
Thanks, the post actually answers my question on build requirements. It's a very good write up overall!
If your USP is to solve those layering cases, then why not target other ecosystems like golang/rust as well? Your design philosophy certainly seems to be language agnostic. By calling yourself a build tool for Java and Scala, it gives an impression that this is solving problems specific to those environments, and your adoption also indicates as such. Is it that these communities do not like to adopt such tools or is there something about the JVM ecosystem that tends towards having complex build requirements?
I called it a build tool for Java and Scala because that's what it's good at right now. The software industry is hugely varied, so I can't target everything at once. In particular, this tool started off targeting 100% Scala, but has branched out to Java since they share a lot of concepts (classfiles, jars, assemblies, maven central, etc.)
But you are right that it is not JVM specific! In the docs there is an example of adding Typescript module support, and an initial strawman implementation takes about 100 lines of code. I'm hoping others can extend Mill to places where I do not have the time and expertise
I also opened up a 500USD bounty to add a strawman Python example, so if anyone wants to try their hand at writing the 100 or so lines necessary, here's the link :) https://github.com/com-lihaoyi/mill/issues/3862
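For the curious, a very rough sketch of the shape such an extension takes (not the example from the docs; it assumes Mill's T{} task syntax and a `tsc` binary on the PATH, and everything inside the object is illustrative rather than a built-in API):

    // build.sc -- hand-wavy sketch of a custom language module in Mill
    import mill._

    object web extends Module {
      // Mill watches this directory and invalidates downstream tasks on change
      def sources = T.source(millSourcePath / "src")

      def compile = T {
        val tsFiles = os.walk(sources().path).filter(_.ext == "ts")
        // shell out to the TypeScript compiler, writing into this task's
        // dedicated, cached output directory
        os.proc("tsc", tsFiles, "--outDir", T.dest).call()
        PathRef(T.dest)
      }
    }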
The issue is that most people in jvm land are in a closed bubble and haven't seen anything else.
This is true for build systems as it is for non-OO design, for example.
Most simply don't know better and the rest of us are simply stuck.
Ant and then Maven started simple enough but people always find a way to justify adding more stuff.
Gradle already started complex enough and they keep adding more stuff...
The big thing is that many Java builds are not just blobs of binaries crammed together, but also have structure and metadata, sometimes generated on the fly.
Not all Java builds are simply compiles. There are several projects that rely on processing steps during the Java build. EAR files are jars within jars.
Then, of course, there's all of the dependencies.
The modern Maven-repository-based dependency manager is a blessing and a curse. Drag-and-dropping an artifact into your project inevitably downloads the entirety of the internet. Now you may wish to cull your dependency tree, so that needs to be expressible as well.
The primary benefit of Maven and the pom.xml file is that for a vast majority of applications it just works. Even better, it's become a universal "project" format that many IDEs directly support. It handles "dependency hell" well, in a cross-tool way.
I wish Maven were a bit faster, but, simply, it's as fast as it can be for what it does. A good Ant build just flies, but Ant "doesn't do anything". It's just a bag of steps that it follows (for good and ill), in contrast to Maven's declarative style (for good and ill).
I have no experience with Gradle other than I've never run into enough problems with Maven to justify trying something else. On its surface, it doesn't really appeal to me. I was comfortable with Ant (I have no problem with XML), I'm mostly comfortable with Maven. I've not been unhappy enough with Maven to try and jump back to Ant w/Ivy.
To be blunt, if anything it might be you who live in a closed bubble.
Builds that require ad-hoc functionality are the default. It’s extremely rare that everything fits nicely into “cargo build” or other single-language build tools’ model. And while these often have escape hatches, at that point you have to write imperative code with none of the caching and parallelization that is literally the job of a build tool.
I've been doing jvm apps for almost 20yrs...
What builds need to do and what people made the builds do are completely different things.
I don't remember a single project I was involved in that could not have had a simpler build...
True. In our example we didn’t have to generate online help and pdf manuals from the same asciidoctor sources. But when we chose to, we really needed the customization that Gradle offered.
Which are mostly small, isolated library code, whose purpose is to be easily incorporated into larger programs. That’s the 90% easy path of build tools.
What about a large project built over 6 years by 50 people, and that has to use some obscure technology to communicate with company A, and another one that has an idiotic build step?
You have to keep in mind that Gradle Inc. earns money by providing consulting for complex builds. An easy build tool would destroy this business model ;-)
In the JVM world, the de facto equivalents to `go build` and `go test` are `mvn compile` and `mvn test`, which work 99% of the time.
Other build tools and plugins just compete/fill in for:
* improved build speed / test speed: using a background daemon to reduce startup time, intelligent caching / task reordering to avoid redoing work, etc.
* extra functionality like code generation, publishing, or deployment. Code generation is really big in the JVM world, and there are many ways to deploy an application: jar + libs in a zip file, uber jars, container images, etc.
Curmudgeon here: this was true for a relatively brief period of time. Nowadays I'd say that gradle has (inexplicably to me) taken the lead - and everyone adds custom crap to their gradle builds, making them far less predictable than maven builds used to be.
I guess it's better than the nightmare over in the front-enders' world...
I rarely see Maven files in non-trivial projects that are anything but a confusing mess of XML.
Granted, the constrained abilities do tend to keep folks from writing one-off snowflake build customizations, which is nice. But it still leaves a hell of a lot to be desired. It was, however, leagues ahead of Ant, which wasn't a high bar.
I saw a lot of that too - I just see more and worse with gradle.
Maven gave us two things: good dependency management and convention-driven builds (removing the horrible scripted build stuff in Ant).
Gradle from my point of view took the second one away again and it feels like it was just because people didn't like XML and couldn't be bothered to learn how Maven's build lifecycle actually worked!
My typical mvn session after a month of not touching maven:
% mvn
[ERROR] No goals have been specified for this build.
Uh.
% mvn build
[ERROR] Unknown lifecycle phase "build".
Uh, nope, that wasn't it....
% mvn compile
BUILD SUCCESS
Now trying to run the app... Error: app jar not found.
Reading the README.md. Aha! So I need to install it!
% mvn install
16:59:35,075 [INFO] Building <blahblahblah> [1/32]
...
16:59:38,483 [INFO] -------------------------------------------------------
16:59:38,483 [INFO] T E S T S
16:59:38,483 [INFO] -------------------------------------------------------
^C
// me searching on google how to not run tests
% mvn install -DskipTests=true
// now good...
Ok, let's run some tests. The FlakyTest broke again in CI. Let me run it locally:
% mvn test FlakyTest
17:02:19,481 [ERROR] Unknown lifecycle phase "FlakyTest".
Aaargh, ok, googling it again:
% mvn test -Dtest=FlakyTest
17:03:16,102 [ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:3.3.0:test (default-test) on project blah blah: No tests matching pattern "FlakyTest" were executed! (Set -Dsurefire.failIfNoSpecifiedTests=false to ignore this error.)
The original idea behind maven is nice.
But the defaults are so bad that the whole experience of convention over configuration has been ruined.
Cargo and go build systems took the original maven philosophy and implemented them right, with good UX.
Gradle, SBT and friends took a step back to the times before maven, and went fully the Ant way, doubling down on "configuration over convention". Where "configuration" is actually "programming in a DSL on top of another language on top of Java".
I'm not trying to ignore your point, but FWIW the next line after the one you specified includes, presumably, what a normal person would want to see:
[ERROR] No goals have been specified for this build. You must specify a valid lifecycle phase or a goal in the format <plugin-prefix>:<goal> or <plugin-group-id>:<plugin-artifact-id>[:<plugin-version>]:<goal>. Available lifecycle phases are: pre-clean, clean, post-clean, validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy, pre-site, site, post-site, site-deploy. -> [Help 1]
> [ERROR] No goals have been specified for this build.
> [ERROR] Unknown lifecycle phase "build".
And somehow missed that the error message also lists the valid lifecycle phases. (You could have instead complained that it lists all the lifecycle phases, which is a lot.)
> // me searching on google how to not run tests
How would I know how to skip tests in other build tools? Skipping tests should not be necessary and is done too regularly. I would not consider a tool which immediately points out how to ignore those stupid tests to be good UX.
When I want to run tests, I tell it to run the tests. When I want to install, I tell it to install. I don’t want it to run tests when I tell it to install. I don’t want tools who know better what I want to do than me.
Try to build a Go project that uses Cgo and non-trivial C/C++ libraries. Throw in cross-compilation for more fun. You'll end up with an external build system that invokes Go build as one of the steps.
Go projects normally just tend to be self-contained server-like software that doesn't need a lot of external libraries. But once you step away from that, you're on your own.
I guess my problem with Gradle is that app building should be way simpler than it is. Apps are not something niche anymore, but the tooling is still similar to the embedded software for microcontrollers.
I'm working on a project that encompasses both JVM (Gradle, Kotlin) and Golang.
My hot take: JVM build tools, especially Gradle, are a soup of unnecessary complexity, and people working in that ecosystem have Stockholm Syndrome.
In Golang, I spend about 99% of my time dealing with code.
In JVM land, I'm spending 30% just dealing with the build system. It's actually insane, and the community at large thinks this is normal. The amount of time it takes to publish a multi-platform Kotlin library for the first time can be measured in days. I published my first Golang library in minutes, by comparison.
You speak from my soul! I've been in the Java world for a really long time now and I've been wondering for years why the build tools need to be so complicated and annoying. I know Go, Node.js and a bit of Rust, and all have more pleasant, easier-to-use build tools! The JVM (or GraalVM) as an ecosystem is just fine and probably one of the best, but build tools might be its Achilles' heel. Maybe it would be a good idea for Oracle to invest in that area ...
My experience of JS projects is that build tools are frequently ad-hoc. That is, there simply isn't a general build tool at all, but just a large pile of scripts calling under-documented libraries. Parallelization, caching and quite often even portability are just missing.
To justify this statement consider this blog post I wrote a while ago about porting GitHub Desktop (an Electron app) from its prior build/deployment system to Conveyor [1]. Conveyor is a tool for shipping desktop apps and is implemented as a single-purpose build system. The relevant part is this commit:
The amount of code that can be deleted is huge! Some of it is in-process code that isn't needed with Conveyor (setting up Squirrel etc), but a lot is just shell scripts that happen to be written in JS. Replacing that with a real build system not only simplifies the codebase but means the build steps are fully parallelized, fully incremental, easier to debug, portable (the build can run on any platform), progress is reported in a uniform way and so on.
So whilst the JS ecosystem's approach to build tools may be "simple" in some way, in the sense that there's no dominant build tool like Maven or Gradle, that simplicity does cost you in other ways.
I'm in JVM land. I spend very little time dealing with the build system. It is actually insane how well it works.
Also, why does it matter how long it takes to publish a library for the first time? It sounds like a non-issue to me. I have written dozens of libraries and published them to a local artifactory instance because it simply doesn't matter if your company specific code is accessible to the world or not.
One note from having worked with both that I don’t see mentioned: Golang dependencies are sources you basically pull and compile with your own code. In JVM-land dependencies are precompiled packages (jars). This adds one little step.
...or a big step, if cross-compiling is required (e.g. Kotlin Multiplatform)
I'm surprised there is no source-only dependency solution for JVM -- it'd solve this issue. Pull down the source and build on the fly. Perhaps there is and I'm unaware?
I'm afraid Java/Scala/Kotlin compilers are too slow to make that convenient. Even currently building pure Java projects can take minutes when it's compiling just like 300k lines. What if it had to compile millions of lines from all the dependencies?
The actual compilation step is 100% not the bottleneck - it can go as fast as 10k-50k lines per second! (According to the Mill benchmark, but that’s the Mill-independent part).
Comparatively, Go does “only” 16k lines per second based on some HN comments.
But you’re likely comparing on different hardware though. Go compiling only 16k lines per second is hard to believe for me. Maybe they meant on single CPU core. Rustc compiles over 50k lines per second on my MBP in debug mode and Go must be definitely faster, as everyone knows rust is very slow to compile.
But anyway, you may be right. I just ran mvn install for the second time with no source change on my current project. It took 57 seconds.
The java metric is also from a single core. But you are probably right that it should only be taken as a rough ballpark, but java is definitely in the same ballpark as go in compile speed.
What issue would it solve? The fact that you can build a jar in any OS and then just use that anywhere else is actually a huge benefit of using Java, as you don't force everyone to re-compile your library source code.
Well since the builds tend to be monstrously complicated for some reason, and there’s no standard build tool, maybe it’s more impossible than possible to consider source based distribution. Or it would be like JavaScript where you still need a build and publish step to turn “developer Java / other languages” into “vanilla source distributable Java”.
> The amount of time it takes to publish a multi-platform Kotlin library for the first time can be measured in days. I published my first Golang library in minutes, by comparison.
It's a bit Apple & Orange comparison: publishing a JVM only Kotlin library is quite easy, it's the multiplatform part that takes time.
Last time I published a JVM library I had to Open A Jira Ticket to request the rights to publish a package on the main package registry. Then I had to verify I owned the DNS name prefix for my package by fiddling the DNS records at my hosting provider. It took days just to get authorized! Not including the time needed to like, figure out how to make JARs happen.
Merely as a "for your consideration," GitLab ships with its own Maven repository (along with npm, docker, Nuget, and a bazillion others)[1] so you have total sovereignty over the publishing auth story. I can appreciate going with Central can be a DX win if you're distributing a library, since having folks add <repository> lines to their pom.xml or settings.xml is a hassle, but at least you get to decide which hassle you prefer :-D
Sometimes barrier to entry is good. For example, both npm and cargo struggle with package name squatting and malicious packages that are misspellings of common packages.
Pedantically, that's only one way to resolve a go package - and for sure the more obvious[1] - but the most famous one I know of is gopkg.in/yaml.whatever that uses a <meta> tag to redirect to its actual GH repo, which only the deepest golang ninja would know how to use: compare view-source:https://gopkg.in/yaml.v3 with view-source:https://gopkg.in/yaml.v3?go-get=1
I've been working on Java-based systems for about 20 years now, and I fully relate to that. Same experience.
This is so annoying that I prefer to use Rust over Java even in areas where things like better performance or better type system don't matter. But being able to start a fresh project with one `cargo init` and a few `cargo add` invocations to add any dependencies... well, this is priceless.
Interesting that you ended up going all the way to Rust land instead of just using one of the multiple tools that have been created to help with this, like:
Are you aware of Maven Archetypes[1]? I believe they were the "cookiecutter" before cookiecutter existed, although I am 10000000% on-board that their discovery story is total garbage :-(
But I don’t want to copy a full project with prepopulated list of dependencies chosen by someone else. I want to start small and add dependencies I need.
Init and then what? The story of discovering and adding dependencies is still much worse. Nothing like cargo add/remove or crates.io where I can quickly search dependencies with their descriptions with standardized links to repos and documentation. Actually even Python is nicer in this regard with PyPi and pip install, even though virtual envs are pain.
Interesting. I spend nearly zero time with my maven setup and almost all the time is in coding. I am genuinely curious to know where that 30% time goes? Is it waiting for builds?
I think there are a lot of "JVM Lifers" who are so deep in the ecosystem they are unaware how much better things can be.
Anecdote: I wanted to publish a ~100LoC multiplatform Kotlin library -- just some bindings. I publish these sorts of things for Go with just a "git push".
Steps were:
1. Spend a few hours trying to understand Maven Central/Sonatype, register and get "verified". They're in the middle of some kind of transition so everything is deprecated or unstable.
2. Figure out signing, because of course published packages must be signed. Now I have a secret to keep track of too, great.
3. Discover that there is no stable Gradle plugin for publishing to the "new" Maven Central, it's coming soon... Choose one of the handful of community plugins with a handful of stars on GitHub.
4. Spend a few hours troubleshooting a "Gradle build failed to end" error, which ended up being due to signing not finding a signing key. 3rd party plugin didn't handle errors properly, and a bug in Gradle meant that my secret wasn't picked up from local.properties.
5. Eventually discover that because Kotlin Multiplatform can't be cross-compiled, there is no way to actually publish a multiplatform library without spinning up a bunch of CI runners. And you can't just publish code -- JVM packages have to contain compiled artifacts.
6. Realise this now involves maintaining GitHub Actions and Gradle, which is an ongoing cost.
7. Give up.
The harm that this kind of complexity must be causing to the ecosystem is immeasurable.
Although a lot of it is generic badness, Kotlin Multiplatform isn't the JVM ecosystem. You don't need CI runners to publish a JVM library. The reason it comes up with Multiplatform is because Kotlin defines "Multiplatform" to mean platforms like JavaScript, or their own LLVM based compiler toolchain that bypasses JVMs entirely.
I’d just like to add, NPM gets a lot of flak (mostly deservedly) but it too is still vastly easier than anything in the JVM ecosystem.
Even with all the headaches around modules versus CJS, and JS versus TypeScript, NPM is a lot easier than Gradle. Notably, you have a choice of alternate tools (eg pnpm, yarn, bun) that interoperate pretty well.
I guess my point is, Gradle and Maven are specifically and outstandingly bad.
If you think gradle and maven are bad, you should try Mill! There is more to build tooling than gradle or maven, the field has evolved significantly since those tools launched 15-20 years ago, and Mill tries to do things better
I must be missing something here. Don't the tools you mentioned do a lot less than Gradle? Gradle knows test depends on compile, which depends on code generation (say protobuf) - with caching and change detection. Compare that to chaining up the commands in the `scripts` section of `package.json`.
I could be convinced if those features of Gradle actually worked well, or even worked properly, like dependency management does in e.g. Bazel.
In practice, Gradle really seems to fall down on the basic task of just being able to build stuff in the first place. It feels like you’re constantly fighting version hell just to find a Gradle version and plugins that work together, let alone your actual code dependencies.
And if you actually do need to do something slightly more complicated, like code generation, it’s very difficult to work with and the docs are really bad.
I have no complaints for the well trodden path (e.g. https://github.com/google/protobuf-gradle-plugin). I have also written some custom build steps, and indeed the docs aren't very helpful - but the final implementation is quite simple.
Npm also gets a lot of flak for the low bar it sets for introducing malicious code by impersonating an idling maintainer or presenting yourself as a successor. The friction, the secrets to keep, they are there for a reason.
That's the trick. You publish the source code. And it's still faster to build all dependencies from source than maven / gradle manages to resolve and download the binary dependencies ;)
That's true, Maven is ridiculously slow to resolve dependencies, while Gradle only really works with reasonable speed if you allow it to hog your system with a daemon.
I myself wrote a dependency resolver that matches Maven in functionality, and even a large project that uses Spring Boot and its dozens of dependencies can be resolved in a couple of seconds. About 10x faster than Maven or something like that. If you look at Maven's source code you'll see why. It's the worst kind of Java Enterprise overengineering you can imagine, complete with its own dependency injection framework, everything is pluggable (for no reason, really, do you really need to replace HTTPS for your protocols?? In Plexus you can), to the point that all the de-coupling results in lots of things duplicating functionality everywhere. I am not sure but I would bet Maven parses your POM at least 10 times to do anything due to the de-coupled nature of it.
Maven is actually pretty behind in terms of JVM dependency resolution. Mill uses Coursier, same as my last company did, and when my last company switched from Maven to Coursier we saw a two-order-of-magnitude speedup, with resolution commands that used to take 30min finishing in a few seconds and giving the exact same artifacts and versions
I actually have no idea why these other resolvers are so slow, or why Coursier is so fast, but this slowness is very much a "maven" or "gradle" thing rather than a "jvm" thing. And Mill using coursier does significantly better!
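For reference, programmatic resolution with Coursier looks roughly like this (a minimal sketch of the high-level Fetch API; the coordinate is just an example):

    // a minimal sketch of Coursier's high-level Fetch API
    import coursier._

    val jars = Fetch()
      .addDependencies(dep"com.google.guava:guava:33.3.0-jre")
      .run() // resolves transitively, downloads, and caches the artifacts

    jars.foreach(f => println(f.getAbsolutePath))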
There are quite a few cases: the moment you touch another language, resources that require a compile step (e.g. xml schemas to code DTOs, protobuf, all that kind of stuff), sometimes even the code itself requires generation.
Build tools sit in an unhappy corner of the design space where they provide features not found in the core of regular programming languages, and which are so generally useful that there's a temptation to make them very abstract, but then they often lack some of the features that let regular programs scale well.
The key feature that justifies their existence is parallel and incremental execution of DAGs of world-mutating tasks. This is an awkward fit with most programming languages, hence the prevalence of DSLs. But people don't want their build system to become a general purpose programming language, because they don't want to think about build systems at all and because programmers don't buy programming languages anymore, so, this causes a big design tension between generality (people want to use build systems to automate many things) and deliberately limiting the expressive power to try and constrain the design space and thus tooling investment required.
Java is in an awkward place because the JDK was born in the 90s on UNIX, by people who thought make is a sufficiently good solution. You still see remnants of this belief in the official Java tutorials, in JEPs, and in the fact that OpenJDK itself is compiled using an autotools based build system! (fortunately it's one of the nice make based build systems out there).
The problem with make is twofold:
1. It assumes a CLI that's both powerful and standardized provided by the host OS. Windows violates this assumption but Java is meant to be portable to Windows.
2. "Plugins" are CLI tools or scripts and so make implicitly assumes that subprocess creation is cheap. But process creation on Windows is expensive, and starting up JVM programs is also expensive due to the JIT compiling.
Therefore make just doesn't work well in the JVM ecosystem. At the same time, the Java project wasn't providing any competing solution, so the wider open source community was left to fill in the gaps. These days language developers provide build tooling out of the box as part of the base toolset along with the compiler, but Java still doesn't.
So - you ask, what are people doing with Gradle/Maven that requires all those features. The answer is: everything! Gradle builds frequently orchestrate dozens of different tools as part of a build pipeline, build documentation websites, do upload and deployment, download and manage dependencies, run security scanners and license compliance checkers, analyze dependency graphs, modify compiler behaviors, and so on.
Additionally Gradle isn't specific to Java, or even JVM apps. It can also be used to compile C/C++ programs, run native code compilers like Kotlin/Native, and it abstracts the underlying platform so Gradle builds aren't tied to UNIX.
It's not clear to me how this is better than Gradle. And I hate Gradle.
At first glance, Mill looks like it has many of the pitfalls of Gradle:
- Plugins: Creates the temptation to rely on plugins for everything, and suddenly you're in plugin dependency hell with no idea how anything actually works.
- Build scripts written in a DSL on top of a new language: Now I have to learn Scala and your DSL. I don't want to do either!
- Build scripts written in a language that can be used for code too: Versioning hell when the compiler for the build system needs to be a different version to the compiler for the actual project code. See: Gradle and Kotlin
Author here! The issue here is that builds, and many other "just configuration" scenarios, are fundamentally complex. So many projects that start off as "just XML" or "just YAML" end up implementing their own half-baked programming language interpreter inside of their XML/YAML/JSON/whatever.
There is a reason why Bazel went with Python/Starlark, why Pulumi and CDK and friends are getting popular. Fundamentally, many of these use cases look surprisingly like programming languages: maybe not immediately, but certainly after you've dug in a bit. And having a properly designed purpose-built programming language (e.g. Starlark) or a flexible general purpose language (e.g. Typescript, Kotlin, Scala) does turn out to be the least-bad option
I agree that Bazel did pretty well with Starlark, but the reason that’s sane is because it’s not Python, though the syntax is similar. It avoids getting into trouble with people using Python language features that would result in upgrade hell and annoy other programmers who aren’t Python experts.
(Though, debugging complicated Starlark code can still be difficult.)
Why do you call these other languages “better”? They’re different, but I’m not sure why either of the ones you mentioned would be better for this use case.
I'm afraid that no current config language is an obvious fit for Mill. That's because Mill is fully reactive and doesn't distinguish between build configuration and execution by design.
There is basically no DSL. You simply write what a build needs, e.g. you write a function `collectCFiles()` that collects every file with extension `.c`. You then issue a command like `gcc ${collectCFiles()}`. And pretty much that’s it - you can use shell commands, or do anything in scala (or java or whathaveyou). You simply have your functions return either a value (e.g. a checksum) or a location, which is the only mill-specific logic.
So your compileC() function will simply invoke your collectCFiles() function, and this invocation implicitly creates a dependency between these tasks. You have written literally the simplest way to describe your build logic. But in the background mill will cache your functions’ inputs and outputs, and parallelize and re-run only those tasks that need it, which is what a build tool should do.
The implementation may not be the theoretical best, but I think the idea is pretty much the perfect build system out there.
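A minimal sketch of what that description could look like as an actual Mill build file (assuming Mill's T{} task syntax and a gcc on the PATH; the function names follow the comment above and are otherwise illustrative):

    // build.sc -- sketch of the collectCFiles()/compileC() idea described above
    import mill._

    object clib extends Module {
      def sources = T.source(millSourcePath / "src")

      // "collectCFiles": just a function returning the .c files
      def collectCFiles = T {
        os.walk(sources().path).filter(_.ext == "c").map(PathRef(_))
      }

      // Calling collectCFiles() here is what creates the dependency edge;
      // Mill caches both tasks and re-runs them only when their inputs change
      def compileC = T {
        val out = T.dest / "a.out"
        os.proc("gcc", collectCFiles().map(_.path), "-o", out).call()
        PathRef(out)
      }
    }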
First invocation may be. Subsequent builds are very fast, unless someone decided to write random bullshit into the build scripts that execute at config time, making the config process impure.
I’m mostly thinking of Android projects. If I have time I’ll try some speed tests with a new basic project. But I don’t think I’ve even once done something in Android Studio and thought “huh, that was surprisingly fast”. Maybe some of the hot reloading stuff is okay (when it actually works).
Mill's early goal was to be a saner sbt, incidentally also fixing the parts of sbt that are/were unreasonably slow due to questionable design decisions.
Maven has never been relevant to the Scala ecosystem given most of the community has pretty much moved straight from ant to sbt. Only a few Spark related projects stubbornly use Maven, which is a major pain given the lack of cross-building abilities. Slow dependency resolution and inefficient use of Zinc merely add insult to injury.
Yeah... that's my experience with Scala all around - it's abysmally slow, especially if you use any sort of "metaprogramming"... (one of the reasons I stay clear of the language)
I had worked out that math for “like pip but actually works” but few people were conscious that pip didn’t quite work reliably for large and complex projects — I didn’t think it was possible to sell it.
Uv won hearts and minds because it was uncompromisingly fast: people did not really care that it had a correct resolving algorithm or that it was really reliable because it is not written in Python and thus can't trash its own dependencies (maybe a solvable problem in that the build tool can have its own virtualenv, but isn't it nice for your package manager to be a binary that can't get dependencies screwed up no matter how hard the users try?)
The last time I checked, sbt was much faster than mill for incremental builds. Mill has a faster cold startup time, but sbt uses classloader tricks to reuse jitted classes so that it doesn't have to reload the scala standard library. When running tests continuously and rerunning on save, sbt was much faster than mill for equivalent projects. I haven't tested in three years or so though. But I would encourage people to make a simple project in sbt and mill and run `sbt ~test` and compare it to `mill -w test`. In the past, I found that after a few iterations, sbt could respond to changes in a few hundred milliseconds while mill would take multiple seconds to retest the same code. That difference really adds up when you are iterating on a problem.
That said, I have come to believe that the jvm is a bad platform for a build tool. Everything that touches the jvm becomes bloated and slow, particularly for startup. I no longer write scala because of my frustration with the bloat (and scala adds its own bloat on top of the jvm).
The test thing is just a matter of defaults: in Mill, subprocess testing is the default and in-process testing is opt-in via .testLocal.
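For anyone who hasn't seen a Mill build file, a hedged sketch of what that looks like (module name invented, assuming Mill's JavaModule/TestModule API): `mill app.test` forks a test JVM by default, while `mill app.test.testLocal` runs the same tests in-process.

    package build
    import mill._, javalib._

    object app extends JavaModule {
      // `mill app.test` runs tests in a forked JVM (the default);
      // `mill app.test.testLocal` runs them in-process instead.
      object test extends JavaTests with TestModule.Junit4
    }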
I also believed that a lot of existing JVM tooling is bloated and slow, so we are in agreement! Mill tries to be different, so do give it a chance if you can. There is life beyond Maven, Gradle, and SBT
sbt is one of the worst engineering mistakes I've ever witnessed. It was a constant source of esoteric ergonomics and frustration for no clear reason other than being the pet project of someone who really loved implicits.
Build times always were my biggest Scala complaint. Arguably code base specific, as I suspect it had a lot to do with all the macros and type level metaprogramming, but, if that kind of thing is possible and customary, an average working programmer who doesn't control the codebase they show up to is going to end up stuck dealing with it.
It's a lot easier to build a language that works great, but only in the hands of a single skilled careful owner, than a language that stands up to the abuse of many careless temporary users, and still gets from point A to point B reliably.
Like the difference between a sportscar and a rental company or police fleet sedan.
Big fan of Mill! (Coming from Maven, Gradle and SBT.) The 1:1 mapping of build tasks to output files is the big one for me, as it makes understanding other people's builds so much easier, going through them step by step.
Author here. It does! I started working on Mill when I started learning Bazel, during my first months at Databricks. There's a lot of cross-pollination of ideas there, from my 7 years adopting and maintaining the Bazel build at Databricks, but I haven't had time to do a proper head-to-head comparison. Hopefully someone else can though!
Bazel is a rat's nest of complexity. In my experience it does what it does very well once set up, but setting it up is tremendously complicated, much of it IMO incidental complexity.
Rolling out Bazel at my prior employer took about one person-decade of engineering time. I've talked to other companies that tried to roll it out and failed. Bazel is hard
And Bazel is not really getting any easier! Like most projects, it is getting more complex over the years as features accrete. I think there is space for a tool like Mill for less sophisticated users who can't afford to spend a person-decade rolling out their build tool
There's a big difference between the complexity of setting up Bazel and the experience of using Bazel at a company where Bazel has been set up for you. As a user I love Bazel. When working with Java, I use the java_binary and java_library rules. When coding in Go, there's nothing new to learn regarding the build, just use go_binary and go_library instead. Everything is repeatable, builds and tests are cached, it's easy to query the build dependency tree, etc.
A few startups are offering "Bazel build/test as a service." It's one way to eliminate the work involved in setting up Bazel for an organization.
Sorry to hijack this thread a bit, but I currently work at a Scala shop and have grown to like writing it. I worked at a Clojure-heavy place previously. This tool looks neat.
Has anyone at the senior level recently moved on from Scala to other languages? Any issues finding jobs or learning the new role?
Author here! Hope you take a look at the project and find it cool. There's a lot of interesting stuff here. In particular, the Video linked on the landing page is a great intros from a Java developer point of view, and the following video is a great intro from a Build Tool Architect point of view:
* https://www.youtube.com/watch?v=UsXgCeU-ovI
While Mill is focusing on JVM for now, it is very extensible and I have a strawman demo of adding a Javascript toolchain in ~100 lines of code https://mill-build.org/mill/0.12.1/extending/new-language.ht...
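Not the docs example itself, but to give a flavour of what such an extension can look like, here is a hypothetical trait that shells out to an external compiler and that any build could mix in (names and details are invented; the real ~100-line version is on the linked page):

    package build
    import mill._

    // Hypothetical "new language" toolchain: fetch a compiler, then invoke it.
    trait TypeScriptModule extends Module {
      def sources = Task.Source(millSourcePath / "src")

      // Install the TypeScript compiler into this task's own output folder
      def tscInstall = Task {
        os.proc("npm", "install", "typescript").call(cwd = Task.dest)
        PathRef(Task.dest)
      }

      // Compile all .ts sources; cached and re-run only when inputs change
      def compile = Task {
        val tsc = tscInstall().path / "node_modules" / ".bin" / "tsc"
        os.proc(tsc, os.walk(sources().path).filter(_.ext == "ts"), "--outDir", Task.dest).call()
        PathRef(Task.dest)
      }
    }

    // Using it in a build is then a one-liner:
    object client extends TypeScriptModule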
For those of you who want to learn more about the design principles and architecture of Mill, and what makes it unique, you should check out the page on Design Principles which has links to videos and blog posts where I elaborate on what exactly makes Mill so different from Maven, Gradle, SBT, Bazel, and so on:
* Mill Design Principles https://mill-build.org/mill/0.12.1/depth/design-principles.h...
I've mentioned this in a few places, but the comparisons with other build tools are best-effort. I have no doubt they can be made more accurate, and welcome feedback so I can go back and refine them. Please take them with a grain of salt
I'm also trying to get the community involved, so it's not just me writing code and running the show. To that end, I have set up a bounty program, so pay out significant sums of money (500-2000USD a piece) for people who make non-trivial contributions. It's already paid out about 10kUSD and has another 20kUSD on the table, so if anyone wants to get involved and make a little cash, feel free to take a shot at one of the bounties! https://github.com/orgs/com-lihaoyi/discussions/6
How possible is it to make your tool "zero-config" by default? I see a lot of comments in this thread and elsewhere on twitter asking for essentially `go build`, `go fmt`, `go test` for Java/JVM. I think the language has quite strong convention around directory layout and file naming already, so do you think it would be possible for mill or a mill wrapper to offer the same kind of standardized zero config workflow? I think a JVM tool that gets that right - takes it as far as possible to the golang model - would have a lot of happy users.
Scala CLI has replaced the default runner since Scala 3.5, so you can effectively do `scala run`, `scala fmt`, and so on. On the Java side, I believe JBang provides a very similar developer experience.
Fundamentally it's hard to reconcile both worlds though. Building non-trivial multi-module projects on the JVM is inherently complex especially when you throw in multiple build targets, multiple toolchains, multiple platforms...
With simpler build tools (like in Go or Rust) you shift this complexity elsewhere, typically in a Makefile and/or a Docker/OCI based build pipeline, and these can get pretty complex too. Let alone distributed build tools like Bazel.
- https://scala-cli.virtuslab.org
- https://www.jbang.dev
There's scala-cli, which has become the default Scala run command since Scala 3.5 but is also available separately. It has all those bells and whistles and allows scripts to grow organically. And no matter the name, it handles Java code too.
With scala-cli, there's not even a need to download a Java runtime or a language distribution. You can let the runner do its thing, or pass options to choose the JVM and the language version to use, or even write those options into special headers in the code files. You can also write tests, format code... it's all built-in. And in cases the code outgrows the tool and there's a need to migrate to a different build tool, there's even a feature to export the build to Sbt or Mill.
Not the author, but this is unfortunately a bit more difficult than it sounds. Like for example, where do you get the name of the jar file to build? I guess you could use the name of the root directory, but that may not be ideal.
How do you figure out dependencies? Import statements in .java files give you the packages to import, but those package names could be provided by one or more .jar files and, regardless, the package names need not bear any relation to the jar name or its group/artifact IDs (if pulling from e.g. a maven-style repository, which basically everyone does).
For multi-module projects, how do you figure out the dependencies between the modules, even? Sure, you could probably figure that out by parsing all the .java files in all modules and figuring out what they provide and import, but that would be slower than maven, probably.
You could certainly do this for small, dependency-free programs, but it would be such a niche use case that I don't think it would be worth the time.
Name of the jar? `java build jar/foo` -> foo.jar, `java build src/dog` -> dog.jar.
Dependencies? It's okay to use a dependency list file for these - I guess I don't consider this config compared to the stuff I usually find in a Gradle or Maven file. The thing I'm allergic to is all the stuff that isn't a dependency list.
In Go, these go in a go.mod file that's automatically updated by the build tooling, which runs instantly and has a cache you never need to think about. Go has the advantage of import paths being URLs that specify the dependency too, but I think my Go-for-Java tool would use a reverse package-import search against an online service to map e.g. com.foo.bar.something.Potato to the appropriate package "foo-bar" from Maven Central or whatever. Building that index seems like a trivial program to write for the average Java engineer.
The more I think about this "go for java" idea, the more I want to build it "in anger" just to see how off-base I am. Maybe I really am just going to re-implement or wrap sbt, mill, gradle, https://www.jbang.dev/ idk. It just feels like the experience could be an order of magnitude simpler as an end-user with some conventions strictly enforced by the tooling.
There are initiatives like declarative Gradle or JetBrains' Amper. But I assume they'll hit a wall in real life exactly like Maven does. Think about packaging, for instance: I see at least 4 or 5 different ways that are fairly common, and that's for one target only.
I couldn’t quickly find how dependency resolution and versioning works in Mill. Can you give any pointers?
Also, what’s with the “ivy” on https://mill-build.org/mill/0.12.1/comparisons/maven.html ? Any relation to Apache Ivy?
Dependency resolution uses Coursier, which is one of the open source JVM dependency resolvers. SBT uses it too, and my last company used it with Bazel
The "ivy" thing is legacy haha. Mill used to use Apache Ivy to resolve dependencies, years ago. Coursier was a better/faster replacement, but names have a tendency to stick around
Hi, it looks pretty interesting! There's a broken link on the homepage though: "Mill vs sbt" links to the Gradle page
Good luck for your project!
I’d be interested to read a comparison with Bazel (which you already mention as one of the influences).
For somebody looking to escape from Gradle, Bazel seems like one of the most promising alternatives, as it’s built on sane and sound fundamentals. Although in practice it has plenty of rough edges and annoyances, so maybe there are areas where Mill can do better.
The current version is 0.12.1.
What's required for v1.0?
Traditionally I've labelled my OSS projects 1.0 when they've stabilized and the rate of change has greatly reduced. Right now Mill is not there yet, but maybe if at end-2025 we realize no breaking changes have been needed since end-2024, we can call it 1.0
Just a tiny notice: neither the GitHub page nor the website seems to currently contain an "installation" link. The one found by Google returns a 'page not found' for the current version.
Does it support Quarkus (esp. native build)?
There is no reason it would not: https://github.com/alexarchambault/mill-native-image
It should! I haven't had a chance to create an example myself yet, but there's a 500USD bounty open if anyone wants to take a crack at it https://github.com/com-lihaoyi/mill/issues/3549
Just a note that the author, Li Haoyi, is a fantastic contributor to the Scala community.
He has written multiple useful libraries. Out of many JSON libraries, his was the most intuitive and practical.
His book is excellent too. I bought it when it came out. It is worthy of a plug: https://www.handsonscala.com/
I miss working on Scala projects. Sadly I rarely see new ones these days.
Does IntelliJ plugin finally work on Scala 3? About 2 years ago it was half broken.
A build tool that is not only fast but configured in a type safe way sounds great. I really like this quote from the "Why use scala" part of the documentation:
> Most developers using a build tool are not build tool experts, and have no desire to become build tool experts. They will forever be cargo-culting examples they find online, copy-pasting from other parts of the codebase, or blindly fumbling their customizations. It is in this context that Mill’s static typing really shines: what such "perpetual beginners" need most is help understanding/navigating the build logic, and help checking their proposed changes for dumb mistakes. And there will be dumb mistakes, because most people are not and will never be build-tool experts or enthusiasts
I gave Mill a try earlier this year. My hope was to escape the nightmare that is Gradle, which I've been using for many years. Mill sounds great in theory (except for the Scala DSL). Unfortunately, I couldn't get a basic Java build to work in half a day, even though I have (admittedly rusty) working knowledge of Scala. It was one obscure error after another. My conclusion was that Java support isn't ready. There was also very little documentation on how to build Java.
In my opinion, using a general-purpose language as the build language of a polyglot build tool is a dead end, both for technical/usability reasons and because the ensuing language wars can't be won. I'm looking forward to the day when a build tool embraces a modern config language such as CUE or Pkl.
Depending on when you tried, it could be worth trying again: support for Java has improved greatly in the last few months, as has the documentation. Come by our Discord channel if you get stuck and I can help unblock you
Thanks for the offer! I’ll give Mill another shot the next time Gradle drives me crazy. :-)
I've had the same gripe about having to keep up with a second language just for the build tool for a while. Try taking a look at JeKa https://jeka.dev/
Mill looks interesting, but, _from a Java development perspective_, it has the same fundamental challenge as Gradle (and most other build systems), which is that its config language _is something other than Java_. That means there's a significant cognitive burden to understand and manage something that one hopes to not have to think about very often.
I find that the pain I experience with Gradle isn't usually about how to do something clever or customized etc, but instead it's when I haven't thought about Gradle syntax in the last 3 months since everything has been silently working, but now I need to figure out some small thing, and that means I need to go re-learn basic Gradle stuff - whether it's groovy, Kotlin, or some aspect of the build DSL - since my mind has unloaded everything about Gradle in the meantime.
Simplifying the semantic complexity of a general purpose build system will always help, but the most useful thing for me would be if the configuration for a Java build were to natively use the Java language directly.
It's intended as a replacement for _scala_ builds. Having a build definition in the native language that doesn't require a different syntax (like a declarative syntax such as maven xml or toml) makes task customization easier for the maintainer of a given project. Unfortunately, it also means that you have to know the language and read the documentation for the build system.
If you want something declarative, there's also bleep[1] in the Scala ecosystem. And for single-module builds there's scala-cli[2]. It's also possible to use Gradle and Maven for Scala projects, but for a Java-only shop I wouldn't be using Mill or bleep, because there's no need to introduce a new language just to manage the build. For Scala/Java/Kotlin hybrid projects, though, Gradle or Mill or sbt would be my recommended tool because of how tightly they are coupled with the cross-platform build-matrix nature of the Scala library and build-system plugin ecosystems. For larger builds, it's Mill or Bazel, because there's a performance cliff in sbt and Gradle, and bleep is too new to have all the standard plugins ported. We use Mill at Writer.
1. https://bleep.build/docs/
2. https://scala-cli.virtuslab.org/
The intention has changed; Mill now explicitly targets Java and Kotlin as well. It now has dedicated Java/Kotlin docsite sections and examples, and has grown integrations with Palantir-Format, Checkstyle, Errorprone, Jacoco, and all their Kotlin equivalents (ktfmt, ktlint, kover).
Java and Scala (and Kotlin) are remarkably similar from a tooling perspective, so Mill tries to target both using the same shared infrastructure
> It's intended as a replacement for _scala_ builds. Having a build definition in the native language [...] makes task customization easier for the maintainer
Totally agree! But the title of the post says "Mill: A fast JVM build tool for Java and Scala" :) - it certainly looks like better tool for the Scala community.
For projects that are primarily building Java sources, it'd be nice to have a build system that uses Java code to describe the build. I don't think this exists at the moment.
One option is `bld`. They added IntelliJ support since I looked at it last, so that's nice.
https://github.com/rife2/bld
Even in Java, because the language is relatively verbose, many frameworks fall back to an "inner platform" of magic annotations, which have the same problem: e.g. just because you know how Java works doesn't mean your mind hasn't unloaded all the Spring Boot annotation semantics! But despite that it is worth it, because conciseness does matter.
Mill using Scala syntax is like that, but with the added advantage that even if you forget how Scala works, your IDE does not. You can really lean on Intellij or VScode to help you understand and navigate around a Mill build in a way that is beyond what is possible for most build tools: You can autocomplete things, peek at docs, navigate the build graph and module tree, etc. and learn what you need to learn without needing to reach for Google/ChatGPT. I use this ability heavily, and I hope others will enjoy these benefits as well
My problem with gradle is that its configuration language is a programming language.
Sounds amazing in practice. And it is. Until you need to fix a 3 year old build that has some insane wizardry going on.
> Until you need to fix a 3 year old build that has some insane wizardry going on.
My experience with Gradle is that it's the "3 year old build" that is almost certainly a death knell more than the insane wizardry part. My experience:
Contrast that with https://github.com/apache/maven-app-engine (just to pick on something sorted by earliest push date, some 10 years ago):

I dislike Gradle as much as you probably do, but between Maven and Gradle, the one that "vomits" stuff on the command line is definitely Maven. Gradle errs by going too far to the other end: it just doesn't log anything at all, even the tasks that are actually being run (vs skipped... do you know how to get Gradle to show them?? It's `gradle --console=plain`, so obvious!! Why would anyone complain about that, right?!) or the print-outs you add to the build to try to understand what the heck is going on.
Having worked with Maven and Gradle, I'd say Gradle was worse in the average case, but better in the worst case. There are way more Gradle projects with unnecessary custom build code because Gradle makes it easy to do.
On the other hand, when builds are specified in a limited-power build config language, like POM, then when someone needs to do something custom, they have to extend or modify the build tool itself, which in my experience causes way more pain than custom code in a build file. Custom logic in Maven means building and publishing an extension; it can't be local to the project. You may encounter projects that depend on extensions from long-lost open source projects, or long-lost internal projects. On one occasion, I was lucky to find a source jar for the extension in the Maven repository. It can be a nightmare.
The same could happen with Gradle, since a build can depend on arbitrary libraries, but I never saw it in the wild. People depended on major open-source extensions and added their own custom code inside the build.
> it can't be local to the project
It certainly can be, in the same repository.
When I used Maven, extensions had to be published to and pulled from a public repo. We couldn't even use the private repo that we used for the rest of our libraries, because the extension had to be loaded before Maven read the file where our private repo was configured.
Whereas a Gradle build can read Groovy files straight from disk.
I'm using several maven plugins (not extensions) that are defined within the reactor project itself. It works well.
You do need to split your build into multiple projects governed by a reactor but you'll have that anyway as soon as you have more than 1 module. Then you just always build the reactor. Pretty much the same idea as gradle.
> When I used Maven, extensions had to be published to and pulled from a public repo.
You can just `mvn install` them locally into your local repository.
Then you don't have a standard build, you have a build with multiple steps that needs to be documented and/or scripted. In an organization where every other project builds in a single step with "mvn package", and people can check out a repo and fire up their IDE and stuff just works, people are going to get bent out of shape because from their perspective, things aren't working out of the box.
A slightly more powerful build tool that supports custom code in the build doesn't force users to script around it. You can create an arbitrarily customized build that builds with the same commands as a Hello World project. (It's a double-edged sword, to be sure, because people don't try as hard to avoid customization as they would with Maven.)
You can, but why should you need to? Why can't the build tool take the plugin code directly off of disk, build it, and use it? This kind of orchestration of manual steps is what build tools are meant to be good at
Sure. But adding ability to self-modify the build drastically increases the complexity of a build tool. Maven developers decided that they want to avoid that.
It can, for plugins. GP is talking about extensions which you typically don't need.
As the most extreme counterexample of your ... experience[1], someone made a plugin that allowed writing pom files in languages other than XML: https://github.com/takari/polyglot-maven/tree/polyglot-0.7.2...
With an especial nod to https://github.com/takari/polyglot-maven/tree/polyglot-0.7.2... given this submission
1: I believe that you encountered errors, programming is packed to the gills with them, but correlation is not causation in that just because it did not immediately work in your setup does not mean it's impossible or forbidden
My problem with gradle is that they keep making breaking changes for low value things like naming of options, so I have to chase deprecation warnings, and can never rely on a distro supplied gradle version
Gradle devs, please get over yourself and stay backward compatible.
Not to defend Gradle too much, but Groovy is a superset of Java. So if you want, you can just use the regular Groovy dialect and then write Java in your build scripts, it should work.
This is not entirely a solution though, because Gradle's APIs are fairly complicated and change regularly.
(Nitpick, but it’s just a “superficial” superset. The biggest difference is probably doing “multi-methods”, aka the runtime type of an argument deciding which method implementation to call vs java’s static overload resolution.)
The thing that’s great about maven is its declarative nature. You can declare goals and profiles for whatever you need the build system to do.
The main appeal that I can see of Mill over Maven is the power of programmatic build logic over static XML files. Maybe good LSP/IDE support will make managing a build system like this bearable?
Yes, IDE support in Mill is key. Without IntelliJ or VSCode, Mill would not be nearly as pleasant to use as it is today.
Mill and Maven both let you declare goals for what you want to do. One does it in XML and one does it in typechecked code. While XML does work, doing things in code with typechecking and full IDE support turns out to be pretty nice as well!
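As a hedged illustration of "goals in typechecked code" (module name, options, and the extra task are all made up here, assuming Mill's JavaModule API and a recent Task API): built-in targets can be overridden and new cached tasks added, and a typo in any of them is a compile error the IDE flags immediately.

    package build
    import mill._, javalib._

    object app extends JavaModule {
      // Overriding a built-in target; the override is typechecked
      def javacOptions = Seq("-Xlint:all", "-Werror")

      // A custom cached task ("goal") layered on top of the built-in ones
      def gitVersion = Task {
        os.proc("git", "rev-parse", "--short", "HEAD").call().out.text().trim
      }
    }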
The comparison with Gradle is not up to date. It states that you would end up with an untyped mess of Groovy build files, but statically typed Kotlin files have been the default in Gradle for quite some time now! https://mill-build.org/mill/0.12.1/comparisons/gradle.html
Author here. Unfortunately this is because my own experience with Gradle is not up to date; I've only lived in the Gradle Groovy world! If anyone is interested in helping out, I have a 1500USD bounty on porting a gradle.kts build to Mill, so we can do a fair up-to-date comparison https://github.com/com-lihaoyi/mill/issues/3670
I believe you have influence over the syntax highlighting on GitHub of .mill files by informing it they're actually Scala, which would make reading those files much nicer IMHO: https://github.com/github-linguist/linguist/blob/v8.0.1/docs...
Or, I believe you can submit a PR to linguist to make it globally registered: https://github.com/github-linguist/linguist/blob/v8.0.1/CONT...
You can maintain your competitive advantage over Gradle by not constantly breaking backwards compatibility, by the way.
I never got why people thought Kotlin would help Gradle. It absolutely doesn't.
Groovy was never the problem (Groovy has types, always had, you could use them if you wanted).
Think about it: what do you do with a build tool? You write a little recipe, then you run it. Does that remind you of something? Yes, it resembles scripts, like the bash scripts you run all the time in your terminal. And why are scripts almost universally written without types, with no typed alternative (of which there are many) having caught on? Because if you're just going to modify a script and immediately run it, while keeping it short enough that you know what it does without reading a book, how does adding types help you?

Quite to the contrary: scripts (including build scripts) should be small, and having types all over the place makes them far more verbose than they should be, likely pushing them out of the comfortable local memory in your brain, at which point you need something akin to a "real" programming language and a compiled program, not a script. Larger programs benefit from types because you don't just run the program, make changes, and run it again, like you do with scripts. You write them, test them, compile them, package them and finally distribute them to your users, who hopefully only need to configure them, not modify their internals. If your build is that complex, that's exactly what you should be doing instead of trying to shoehorn types into your scripts and expecting them to look like real programs.
Also, the Kotlin DSL just doesn't help with the most problematic aspect of Gradle: its total lack of discoverability. Try doing something in your Kotlin Gradle file using a plugin you're not familiar with (which is all of them, for most of us). It's completely impossible unless you know the DSL of the plugin, just like it was with Groovy... Once you know the DSL, it's fairly easy, but even in Groovy you get auto-completion once you've reached the DSL "entry point"; no need for Kotlin. I've been saying this since before they introduced the Kotlin DSL, and now I feel completely vindicated. I've never met anyone who told me "Gradle is so much easier now with Kotlin". But it did mess up plugins I wrote in Kotlin, as Gradle now depends on a very particular version of the Kotlin compiler, and God help you if your plugin was written with a different version in mind.
I disagree with the static vs. dynamic typing part. Modern statically typed languages (like Kotlin, Scala, Rust etc.) are concise and readable. In the case of the Groovy DSL for Gradle it was sometimes hard to get code right or to find a bug. Even IntelliJ struggled at times with this mess of a DSL. So, in my opinion Kotlin is definitely an improvement here!
However, I agree with your second part, the DSL as such. The syntax is arbitrary in many cases and just not easy to remember or to make sense of. It looks like a DSL for the sake of a DSL. Take a look at this example (https://docs.gradle.org/current/userguide/plugins.html):
Why are there two ways to reference a plug-in? Why is the version written without parentheses? Why is version an infix operator? Why not something as simple and consistent as this: How does the DSL help here? Is it more readable? Easier to learn or to remember? Just look at Guice for how nice a DSL can look with pure Java:
I really wish Java had build tools with better developer experience. I wish Mill the best of luck!

Yes, the lack of discoverability, plus the unfamiliar syntax of Groovy, plus names changing between versions. I started with Gradle thinking it would be easier, but in the end I'd love to go back to Ant. That was awful to write, but at least you could understand it.
+1
I spent years copying essentially the same ant-file across projects, just changing dependencies and target names. It's not really rocket science and unless you're trying to be clever, most java projects can look pretty much the same from a build perspective
What tends to be complex about build requirements that necessitates special purpose tools? Golang seems to be doing fine with just go build and go test. What else are people doing with gradle/maven that requires static typing, DAGs, plugins etc.?
Author here!
`go build` and `go test` do work, at limited scale and complexity. In Scala there's Scala-CLI which is excellent. If they work for you, you probably aren't the target market for these build tools. Once you start layering on bash scripts, layering on make, layering on Python scripts, layering on manual steps written down in a readme.md somewhere, that's the time when you should consider a proper build tool. And if that has never happened to you in your career, count your blessings :)
Why not just write some boring, pure code to handle the build? Why not write my build system in vanilla LYAH Haskell? It turns out that builds do have some specific requirements that most programs do not need to care about: caching, parallelization, introspection, and so on. Check out the following blog/talk for more details:
* Blog Post: Build Tools as Pure Functional Programs https://www.lihaoyi.com/post/BuildToolsasPureFunctionalProgr...
* Video: Mill: a Build Tool based on Pure Functional Programming https://www.youtube.com/watch?v=j6uThGxx-18&list=PLBqWQH1Miw...
Thus "naively" building your project "directly with code" ends up not working, so you do need some additional support. While most build tools end up constructing a complete bespoke programming environment from scratch, Mill tries to leverage the Scala language and JVM as much as possible, so you can re-use all your expertise and tooling (e.g. IntelliJ, VSCode, Maven Central, etc.) almost verbatim while getting all the necessary build-tool stuff (parallelism, caching, introspection) for free. Check out those two links if you want to learn more!
Thanks, the post actually answers my question on build requirements. It's a very good write up overall!
If your USP is to solve those layering cases, then why not target other ecosystems like golang/rust as well? Your design philosophy certainly seems to be language agnostic. By calling yourself a build tool for Java and Scala, it gives an impression that this is solving problems specific to those environments, and your adoption also indicates as such. Is it that these communities do not like to adopt such tools or is there something about the JVM ecosystem that tends towards having complex build requirements?
I called it a build tool for Java and Scala because that's what it's good at right now. The software industry is hugely varied, so I can't target everything at once. In particular, this tool started off targeting 100% Scala, but branched out to Java since the two share a lot of concepts (classfiles, jars, assemblies, Maven Central, etc.)
But you are right that it is not JVM specific! In the docs there is an example of adding Typescript module support, and an initial strawman implementation takes about 100 lines of code. I'm hoping others can extend Mill to places where I do not have the time and expertise
I also opened up a 500USD bounty to add a strawman Python example, so if anyone wants to try their hand at writing the 100 or so lines necessary, here's the link :) https://github.com/com-lihaoyi/mill/issues/3862
To be blunt, nothing.
The issue is that most people in JVM land are in a closed bubble and haven't seen anything else. This is true for build systems as it is for non-OO design, for example. Most simply don't know better, and the rest of us are simply stuck.
Ant and then Maven started simple enough but people always find a way to justify adding more stuff. Gradle already started complex enough and they keep adding more stuff...
The big thing is that many Java builds are not just blobs of binaries crammed together, but also have structure and metadata, sometimes generated on the fly.
Not all Java builds are simply compiles. There are several projects that rely on processing steps during the Java build. EAR files are jars within jars.
Then, of course, there's all of the dependencies.
The modern Maven-style, repository-based dependency manager is a blessing and a curse. Drag and drop an artifact into your project and it inevitably downloads the entirety of the internet. Now you may wish to cull your dependency tree, so that needs to be expressible as well.
The primary benefit of Maven and the pom.xml file is that for the vast majority of applications it just works. Even better, it's become a universal "project" format that many IDEs directly support. It handles "dependency hell" well, in a cross-tool way.
I wish Maven were a bit faster, but, simply, it's as fast as it can be for what it does. A good Ant build just flies, but Ant "doesn't do anything". It's just a bag of steps that it follows (for good and ill), in contrast to Mavens declarative style (for good and ill).
I have no experience with Gradle other than I've never run into enough problems with Maven to justify trying something else. On its surface, it doesn't really appeal to me. I was comfortable with Ant (I have no problem with XML), I'm mostly comfortable with Maven. I've not been unhappy enough with Maven to try and jump back to Ant w/Ivy.
To be blunt, if anything it might be you who live in a closed bubble.
Builds that require ad-hoc functionality are the default. It's extremely rare that everything fits nicely into "cargo build" or another single-language build tool's model. And while these often have escape hatches, at that point you have to write imperative code without the caching and parallelization that are literally the job of a build tool.
I've been doing JVM apps for almost 20 years... What builds need to do and what people make the builds do are completely different things. I don't remember a single project I was involved in that could not have had a simpler build...
True. In our example we didn't have to generate online help and PDF manuals from the same Asciidoctor sources. But when we chose to, we really needed the customization that Gradle offered.
> It’s extremely rare that everything fits nicely into “cargo build”
161538 crates do not agree with you ;)
Which are mostly small, isolated library code, whose purpose is to be easily incorporated into larger programs. That’s the 90% easy path of build tools.
What about a large project built over 6 years by 50 people, and that has to use some obscure technology to communicate with company A, and another one that has an idiotic build step?
> Which are mostly small, isolated library code, whose purpose is to be easily incorporated into larger programs.
Which is how software should be generally made - from small and simple things, not from huge behemoths that contain the kitchen sink and brew coffee.
The apps are also built using cargo build.
For your outlandish use cases you can use Bazel.
You have to keep in mind that Gradle Inc. earns money by providing consulting for complex builds. An easy build tool would destroy this business model ;-)
They transitioned to a product company 5+ years ago. https://gradle.com
Gradle's complexity comes from at least two places:
1. The original vision of solving complex multi-technology/language Enterprise builds.
2. Poor early design decisions that they never recovered from.
In the JVM world, the de facto equivalents of `go build` and `go test` are `mvn compile` and `mvn test`, which work 99% of the time.
Other build tools and plugins just compete/fill in for:
* improved build speed / test speed: using a background daemon to reduce startup time, intelligent caching / task reordering to avoid redoing work, etc..
* extra functionality like code generation, publishing, or deployments. Code generation is really big in the JVM world, and there are many ways to deploy an application: jar + libs in a zip file, uber-jars, container images, etc...
Curmudgeon here: this was true for a relatively brief period of time. Nowadays I'd say that gradle has (inexplicably to me) taken the lead - and everyone adds custom crap to their gradle build making them far less predictable than maven builds used to be.
I guess it's better than the nightmare over in the front-enders' world...
I rarely see Maven files in non-trivial projects that are anything but a confusing mess of XML.
Granted, the constrained abilities do tend to keep folks from writing one-off snowflake build customizations, which is nice. But it still leaves a hell of a lot to be desired. It was, however, leagues ahead of Ant, which wasn't a high bar.
I saw a lot of that too - I just see more and worse with gradle.
Maven gave us two things: good dependency management and convention-driven builds (removing the horrible scripted build stuff in Ant).
Gradle from my point of view took the second one away again and it feels like it was just because people didn't like XML and couldn't be bothered to learn how Maven's build lifecycle actually worked!
Like I say, I'm a curmudgeon...
My typical mvn session after a month of not touching maven:
The original idea behind Maven is nice. But the defaults are so bad that the whole experience of convention over configuration has been ruined. Cargo and Go's build systems took the original Maven philosophy and implemented it right, with good UX.
Gradle, SBT and friends took a step back to the times before maven, and went fully the Ant way, doubling down on "configuration over convention". Where "configuration" is actually "programming in a DSL on top of another language on top of Java".
I'm not trying to ignore your point, but FWIW the next line after the one you specified includes presumably what a normal person would want to see
I am open to the fact that maybe catastrophically old versions of Maven did not include that help text, but certainly since 3.0 from 14 years ago https://github.com/apache/maven/blob/maven-3.0/maven-core/sr...

> [ERROR] No goals have been specified for this build.
> [ERROR] Unknown lifecycle phase "build".
And somehow missed that the error message also lists the valid lifecycle phases. (One could instead complain that it lists all lifecycle phases, which is a lot.)
> // me searching on google how to not run tests
How do I know for other build tools how to skip tests? Skipping tests should not be necessary and is done too regularly. I would not consider a tool that immediately points out how to ignore those stupid tests as good UX.
When I want to run tests, I tell it to run the tests. When I want to install, I tell it to install. I don’t want it to run tests when I tell it to install. I don’t want tools who know better what I want to do than me.
Try to build a Go project that uses Cgo and non-trivial C/C++ libraries. Throw in cross-compilation for more fun. You'll end up with an external build system that invokes Go build as one of the steps.
Go projects normally just tend to be self-contained server-like software that doesn't need a lot of external libraries. But once you step away from that, you're on your own.
I guess my problem with Gradle is that app building should be way simpler than it is. Apps are not something niche anymore, but the tooling is still similar to the embedded software for microcontrollers.
I'm working on a project that encompasses both JVM (Gradle, Kotlin) and Golang.
My hot take: JVM build tools, especially Gradle, are a soup of unnecessary complexity, and people working in that ecosystem have Stockholm Syndrome.
In Golang, I spend about 99% of my time dealing with code.
In JVM land, I'm spending 30% just dealing with the build system. It's actually insane, and the community at large thinks this is normal. The amount of time it takes to publish a multi-platform Kotlin library for the first time can be measured in days. I published my first Golang library in minutes, by comparison.
You speak from my soul! I've been in the Java world for a really long time now, and I've been wondering for years why the build tools need to be so complicated and annoying. I know Go, Node.js and a bit of Rust, and all of them have more pleasant, easier-to-use build tools! The JVM (or GraalVM) as an ecosystem is just fine and probably one of the best, but build tools might be its Achilles' heel. Maybe it would be a good idea for Oracle to invest in that area ...
My experience of JS projects is that build tools are frequently ad-hoc. That is, there simply isn't a general build tool at all, but just a large pile of scripts calling under-documented libraries. Parallelization, caching and quite often even portability are just missing.
To justify this statement consider this blog post I wrote a while ago about porting GitHub Desktop (an Electron app) from its prior build/deployment system to Conveyor [1]. Conveyor is a tool for shipping desktop apps and is implemented as a single-purpose build system. The relevant part is this commit:
https://github.com/hydraulic-software/github-desktop/commit/...
The amount of code that can be deleted is huge! Some of it is in-process code that isn't needed with Conveyor (setting up Squirrel etc), but a lot is just shell scripts that happen to be written in JS. Replacing that with a real build system not only simplifies the codebase but means the build steps are fully parallelized, fully incremental, easier to debug, portable (the build can run on any platform), progress is reported in a uniform way and so on.
So whilst the JS ecosystem's approach to build tools may be "simple" in some way, in the sense that there's no dominant build tool like Maven or Gradle, that simplicity does cost you in other ways.
[1] https://hydraulic.dev/blog/8-packaging-electron-apps.html (Disclosure: Conveyor is a commercial product made by my company)
I'm in JVM land. I spend very little time dealing with the build system. It is actually insane how well it works.
Also, why does it matter how long it takes to publish a library for the first time? It sounds like a non-issue to me. I have written dozens of libraries and published them to a local artifactory instance because it simply doesn't matter if your company specific code is accessible to the world or not.
One note from having worked with both that I don't see mentioned: Golang dependencies are sources you basically pull and compile with your own code. In JVM-land, dependencies are precompiled packages (jars). This adds one little step.
...or a big step, if cross-compiling is required (e.g. Kotlin Multiplatform)
I'm surprised there is no source-only dependency solution for JVM -- it'd solve this issue. Pull down the source and build on the fly. Perhaps there is and I'm unaware?
I'm afraid Java/Scala/Kotlin compilers are too slow to make that convenient. Even currently building pure Java projects can take minutes when it's compiling just like 300k lines. What if it had to compile millions of lines from all the dependencies?
The actual compilation step is 100% not the bottleneck - it can go as fast as 10k-50k lines per second! (According to the Mill benchmark, but that’s the Mill-independent part).
Comparatively, Go does “only” 16k lines per second based on some HN comments.
But you're likely comparing on different hardware. Go compiling only 16k lines per second is hard to believe for me; maybe they meant on a single CPU core. Rustc compiles over 50k lines per second on my MBP in debug mode, and Go must definitely be faster, since everyone knows Rust is very slow to compile.
But anyway, you may be right. I just ran mvn install for the second time with no source change on my current project. It took 57 seconds.
The Java metric is also from a single core. You are probably right that it should only be taken as a rough ballpark, but Java is definitely in the same ballpark as Go in compile speed.
What issue would it solve? The fact that you can build a jar in any OS and then just use that anywhere else is actually a huge benefit of using Java, as you don't force everyone to re-compile your library source code.
Well since the builds tend to be monstrously complicated for some reason, and there’s no standard build tool, maybe it’s more impossible than possible to consider source based distribution. Or it would be like JavaScript where you still need a build and publish step to turn “developer Java / other languages” into “vanilla source distributable Java”.
> The amount of time it takes to publish a multi-platform Kotlin library for the first time can be measured in days. I published my first Golang library in minutes, by comparison.
It's a bit Apple & Orange comparison: publishing a JVM only Kotlin library is quite easy, it's the multiplatform part that takes time.
Last time I published a JVM library I had to Open A Jira Ticket to request the rights to publish a package on the main package registry. Then I had to verify I owned the DNS name prefix for my package by fiddling the DNS records at my hosting provider. It took days just to get authorized! Not including the time needed to like, figure out how to make JARs happen.
In go: `git push` to a public repo
In js: `npm publish` after making an NPM account
Merely as a "for your consideration," GitLab ships with its own Maven repository (along with npm, docker, Nuget, and a bazillion others)[1] so you have total sovereignty over the publishing auth story. I can appreciate going with Central can be a DX win if you're distributing a library, since having folks add <repository> lines to their pom.xml or settings.xml is a hassle, but at least you get to decide which hassle you prefer :-D
In fairness, GitHub also finally got on board the train, too: https://docs.github.com/en/actions/use-cases-and-examples/pu...
1: https://docs.gitlab.com/ee/user/packages/maven_repository/
Sometimes a barrier to entry is good. For example, both npm and cargo struggle with package-name squatting and malicious packages that are misspellings of common packages.
This isn't an issue in the Go ecosystem, because the package name is the GitHub repo.
I don't think a high barrier to entry is overall good, in fact I think it encourages larger more complex packages to justify the maintenance burden
Pedantically, that's only one way to resolve a go package - and for sure the more obvious[1] - but the most famous one I know of is gopkg.in/yaml.whatever that uses a <meta> tag to redirect to its actual GH repo, which only the deepest golang ninja would know how to use: compare view-source:https://gopkg.in/yaml.v3 with view-source:https://gopkg.in/yaml.v3?go-get=1
1: err, modulo that go.mod stuff that secretly adds a version slug to an otherwise normal github URL -- I'm looking at you, Pulumi: https://github.com/pulumi/pulumi/blob/v3.137.0/sdk/go.mod#L1
In rust: `cargo publish` after making an account on crates.io
I've been working on Java-based systems for about 20 years now, and I fully relate to that. Same experience.
This is so annoying that I prefer to use Rust over Java even in areas where things like better performance or better type system don't matter. But being able to start a fresh project with one `cargo init` and a few `cargo add` invocations to add any dependencies... well, this is priceless.
Interesting that you ended up going all the way to Rust land instead of just using one of the multiple tools that have been created to help with this, like:
* Spring Boot (it has a UI to create projects where you pick Java version, DB, build tool, some libs etc): https://spring.io/guides/gs/spring-boot
* JHipster - the nuclear option, pick what you want a la carte: https://www.jhipster.tech/
* JBang - a cute CLI for this: https://www.jbang.dev/
* Maven Archetypes - the old fashioned way (existed before "create-app" kind of tools appeared): https://maven.apache.org/guides/introduction/introduction-to...
And most IDEs also have "new project" wizards.
Are you aware of Maven Archetypes[1]? I believe they were the "cookiecutter" before cookiecutter existed, although I am 10000000% on-board that their discovery story is total garbage :-(
1: https://maven.apache.org/archetype/index.html and https://maven.apache.org/archetype/maven-archetype-plugin/us...
But I don’t want to copy a full project with prepopulated list of dependencies chosen by someone else. I want to start small and add dependencies I need.
It’s like LEGO vs Playmobil. I want LEGO. ;)
How does that differ from `gradle init`?
Init and then what? The story of discovering and adding dependencies is still much worse. There's nothing like cargo add/remove or crates.io, where I can quickly search dependencies, with their descriptions and standardized links to repos and documentation. Actually, even Python is nicer in this regard with PyPI and pip install, even though virtual envs are a pain.
https://central.sonatype.com/ and https://mvnrepository.com/ are exactly that
Interesting. I spend nearly zero time with my maven setup and almost all the time is in coding. I am genuinely curious to know where that 30% time goes? Is it waiting for builds?
> the community at large thinks this is normal
Half are ignorant. Other half are like me and just stuck with no options.
But the tooling ecosystem on the JVM truly is horrific.
I think there are a lot of "JVM Lifers" who are so deep in the ecosystem they are unaware how much better things can be.
Anecdote: I wanted to publish a ~100LoC multiplatform Kotlin library -- just some bindings. I publish these sorts of things for Go with just a "git push".
Steps were:

1. Spend a few hours trying to understand Maven Central/Sonatype, register and get "verified". They're in the middle of some kind of transition so everything is deprecated or unstable.

2. Figure out signing, because of course published packages must be signed. Now I have a secret to keep track of too, great.

3. Discover that there is no stable Gradle plugin for publishing to the "new" Maven Central, it's coming soon... Choose one of the handful of community plugins with a handful of stars on GitHub.

4. Spend a few hours troubleshooting a "Gradle build failed to end" error, which ended up being due to signing not finding a signing key. The 3rd-party plugin didn't handle errors properly, and a bug in Gradle meant that my secret wasn't picked up from local.properties.

5. Eventually discover that because Kotlin Multiplatform can't be cross-compiled, there is no way to actually publish a multiplatform library without spinning up a bunch of CI runners. And you can't just publish code -- JVM packages have to contain compiled artifacts.

6. Realise this now involves maintaining GitHub Actions and Gradle, which is an ongoing cost.

7. Give up.
The harm that this kind of complexity must be causing to the ecosystem is immeasurable.
Although a lot of it is generic badness, Kotlin Multiplatform isn't the JVM ecosystem. You don't need CI runners to publish a JVM library. The reason it comes up with Multiplatform is because Kotlin defines "Multiplatform" to mean platforms like JavaScript, or their own LLVM based compiler toolchain that bypasses JVMs entirely.
Very true, although it definitely feels like part of the ecosystem since it uses the same project structure, build tooling etc.
I’d just like to add, NPM gets a lot of flak (mostly deservedly) but it too is still vastly easier than anything in the JVM ecosystem.
Even with all the headaches around modules versus CJS, and JS versus TypeScript, NPM is a lot easier than Gradle. Notably, you have a choice of alternate tools (eg pnpm, yarn, bun) that interoperate pretty well.
I guess my point is, Gradle and Maven are specifically and outstandingly bad.
If you think gradle and maven are bad, you should try Mill! There is more to build tooling than gradle or maven, the field has evolved significantly since those tools launched 15-20 years ago, and Mill tries to do things better
I must be missing something here. Don't the tools you mentioned do a lot less than Gradle? Gradle knows test depends on compile, which depends on code generation (say protobuf) - with caching and change detection. Compare that to chaining up the commands in the `scripts` section of `package.json`.
EDIT: another comment making this point: https://news.ycombinator.com/item?id=41969847
I could be convinced if those features of Gradle actually worked well, or even worked properly, like dependency management does in e.g. Bazel.
In practice, Gradle really seems to fall down on the basic task of just being able to build stuff in the first place. It feels like you’re constantly fighting version hell just to find a Gradle version and plugins that work together, let alone your actual code dependencies.
And if you actually do need to do something slightly more complicated, like code generation, it’s very difficult to work with and the docs are really bad.
I have no complaints for the well trodden path (e.g. https://github.com/google/protobuf-gradle-plugin). I have also written some custom build steps, and indeed the docs aren't very helpful - but the final implementation is quite simple.
Npm also gets a lot of flak for the low bar it sets for introducing malicious code by impersonating an idling maintainer or presenting yourself as a successor. The friction, the secrets to keep, they are there for a reason.
> I published my first Golang library in minutes, by comparison.
For what platform(s)?
Or did you really just push the source code?
That's the trick. You publish the source code. And it's still faster to build all dependencies from source than it takes Maven/Gradle to resolve and download the binary dependencies ;)
That's true: Maven is ridiculously slow to resolve dependencies, while Gradle only really works at reasonable speed if you allow it to hog your system with a daemon.
I myself wrote a dependency resolver that matches Maven in functionality, and even a large project that uses Spring Boot and its dozens of dependencies can be resolved in a couple of seconds. About 10x faster than Maven or something like that. If you look at Maven's source code you'll see why. It's the worst kind of Java Enterprise overengineering you can imagine, complete with its own dependency injection framework, everything is pluggable (for no reason, really, do you really need to replace HTTPS for your protocols?? In Plexus you can), to the point that all the de-coupling results in lots of things duplicating functionality everywhere. I am not sure but I would bet Maven parses your POM at least 10 times to do anything due to the de-coupled nature of it.
Maven is actually pretty behind in terms of JVM dependency resolution. Mill uses Coursier, same as my last company did, and when my last company switched from Maven to Coursier we saw a 2 order of magnitude speedup, with resolution commands that used to take 30min finish in a few seconds to give the exact same artifacts and versions.
I actually have no idea why these other resolvers are so slow, or why Coursier is so fast, but this slowness is very much a "maven" or "gradle" thing rather than a "jvm" thing. And Mill using coursier does significantly better!
I recommend the "build systems a la carte" paper for a good overview of the various problems build systems address
There are quite a few cases: the moment you touch another language; resources that require a compile step (e.g. XML schemas to code DTOs, protobuf, all that kind of stuff); sometimes even the code itself requires generation.
Build tools sit in an unhappy corner of the design space where they provide features not found in the core of regular programming languages, and which are so generally useful that there's a temptation to make them very abstract, but then they often lack some of the features that let regular programs scale well.
The key feature that justifies their existence is parallel and incremental execution of DAGs of world-mutating tasks. This is an awkward fit with most programming languages, hence the prevalence of DSLs. But people don't want their build system to become a general purpose programming language, because they don't want to think about build systems at all and because programmers don't buy programming languages anymore, so, this causes a big design tension between generality (people want to use build systems to automate many things) and deliberately limiting the expressive power to try and constrain the design space and thus tooling investment required.
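To make that key feature concrete, here is a toy sketch of memoized evaluation over a task DAG. The names and structure are purely illustrative and not modelled on any particular tool; a real build system would also persist the cache, invalidate entries when inputs change on disk, and walk independent branches in parallel:

```scala
import scala.collection.mutable

// A task names its inputs and computes an output from their results.
final case class BuildTask(name: String, inputs: Seq[BuildTask], run: Seq[String] => String)

object TinyBuild {
  private val cache = mutable.Map.empty[String, String]

  // Evaluate a task, memoizing results so shared upstream tasks run only once.
  def eval(task: BuildTask): String =
    cache.getOrElseUpdate(task.name, task.run(task.inputs.map(eval)))
}
```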
Java is in an awkward place because the JDK was born in the 90s on UNIX, by people who thought make was a sufficiently good solution. You still see remnants of this belief in the official Java tutorials, in JEPs, and in the fact that OpenJDK itself is compiled using an autotools-based build system! (Fortunately it's one of the nicer make-based build systems out there.)
The problem with make is twofold:
1. It assumes a CLI that's both powerful and standardized provided by the host OS. Windows violates this assumption but Java is meant to be portable to Windows.
2. "Plugins" are CLI tools or scripts and so make implicitly assumes that subprocess creation is cheap. But process creation on Windows is expensive, and starting up JVM programs is also expensive due to the JIT compiling.
Therefore make just doesn't work well in the JVM ecosystem. At the same time, the Java project wasn't providing any competing solution, so the wider open source community was left to fill in the gaps. These days language developers provide build tooling out of the box as part of the base toolset along with the compiler, but Java still doesn't.
So, you ask, what are people doing with Gradle/Maven that requires all those features? The answer is: everything! Gradle builds frequently orchestrate dozens of different tools as part of a build pipeline, build documentation websites, do upload and deployment, download and manage dependencies, run security scanners and license compliance checkers, analyze dependency graphs, modify compiler behaviors, and so on.
Additionally Gradle isn't specific to Java, or even JVM apps. It can also be used to compile C/C++ programs, run native code compilers like Kotlin/Native, and it abstracts the underlying platform so Gradle builds aren't tied to UNIX.
That's why it's so complicated.
Build systems exist in two Turing-complete rabbit holes / slippery slides:
Configuration and workflow execution.
It's not clear to me how this is better than Gradle. And I hate Gradle.
At first glance, Mill looks like it has many of the pitfalls of Gradle:
- Plugins: Creates the temptation to rely on plugins for everything, and suddenly you're in plugin dependency hell with no idea how anything actually works.
- Build scripts written in a DSL on top of a new language: Now I have to learn Scala and your DSL. I don't want to do either!
- Build scripts written in a language that can be used for code too: Versioning hell when the compiler for the build system needs to be a different version to the compiler for the actual project code. See: Gradle and Kotlin.
Author here! The issue here is that builds, and many other "just configuration" scenarios, are fundamentally complex. So many projects that start off as "just XML" or "just YAML" end up implementing their own half-baked programming language interpreter inside of their XML/YAML/JSON/whatever.
Examples:
* Github Actions Config Expressions https://docs.github.com/en/actions/writing-workflows/choosin...
* CloudFormation Functions https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGui...
* Helm Chart Templates https://helm.sh/docs/chart_best_practices/templates/
There is a reason why Bazel went with Python/Starlark, and why Pulumi and CDK and friends are getting popular. Fundamentally, many of these use cases look surprisingly like programming languages: maybe not immediately, but certainly after you've dug in a bit. And having a properly designed purpose-built programming language (e.g. Starlark) or a flexible general purpose language (e.g. TypeScript, Kotlin, Scala) does turn out to be the least-bad option.
I agree that Bazel did pretty well with Starlark, but the reason that’s sane is because it’s not Python, though the syntax is similar. It avoids getting into trouble with people using Python language features that would result in upgrade hell and annoy other programmers who aren’t Python experts.
(Though, debugging complicated Starlark code can still be difficult.)
So why not use Starlark? :)
Starlark is great, but so is Scala. People underestimate how big the ecosystem is even for a niche language like Scala:
- Global publishing and distribution infrastructure
- IDE support in multiple IDEs
- A huge ecosystem of third party packages, both Scala and Java
- Excellent standard libraries, both Scala's and Java's
- Good performance.
- Tooling! JProfiler is great. Others use YourKit or JFR.
- Mill leans heavily on Scala's FP/OO hybrid style with types, while Starlark provides none of that and is purely untyped procedural code.
Just wanted to mention that there are much better config languages than Starlark by now: CUE, Pkl, etc.
Why do you call these other languages “better”? They’re different, but I’m not sure why either of the ones you mentioned would be better for this use case.
Modern config languages offer strong validation and advanced IDE support, which is essential for a great user experience.
https://pkl-lang.org/intellij/current/highlights.html
I was going to mention Cue, but I’ve only read about it, not used it, and couldn’t actually say whether it’s better.
I'm afraid that no current config language is an obvious fit for Mill. That's because Mill is fully reactive and doesn't distinguish between build configuration and execution by design.
> end up implementing their own half-baked programming language interpreter inside of their XML
Greenspun's tenth rule.
There is basically no DSL. You simply write what a build needs, e.g. you write a function `collectCFiles()` that collects every file with extension `.c`, then issue a command like `gcc ${collectCFiles()}`. And that's pretty much it - you can use shell commands, or do anything in Scala (or Java or what have you). You simply have your functions return either a value (e.g. a checksum) or a location, which is the only Mill-specific logic. So your compileC() function will simply invoke your collectCFiles() function, and this invocation implicitly creates a dependency between the two tasks. You have written literally the simplest possible description of your build logic. But in the background Mill will cache your functions' inputs and outputs and parallelize the tasks that need to re-run, which is exactly what a build tool should do.
The implementation may not be the theoretical best, but I think the idea is pretty much the perfect build system out there.
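To make that concrete, here is a rough sketch of what such a setup might look like as a Mill module. The `cSources`, `collectCFiles`, and `compileC` names and the `src/` layout are my own illustration for this thread, not anything Mill ships with:

```scala
import mill._

object clib extends Module {
  // Declare the directory holding the .c sources so Mill can watch it for changes.
  def cSources = T.sources(millSourcePath / "src")

  // Calling cSources() here is what creates the dependency edge; the result is
  // cached and only recomputed when something under src/ changes.
  def collectCFiles = T {
    cSources()
      .map(_.path)
      .filter(os.exists(_))
      .flatMap(os.walk(_))
      .filter(_.ext == "c")
      .map(PathRef(_))
  }

  // compileC depends on collectCFiles simply by invoking it. Mill re-runs gcc
  // only when the inputs change, and independent tasks can run in parallel.
  def compileC = T {
    val out = T.dest / "a.out"
    os.proc("gcc", "-o", out, collectCFiles().map(_.path)).call()
    PathRef(out)
  }
}
```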
The first advantage the homepage lists is:
> Mill can build the same Java codebase 5-10x faster than Maven, or 2-4x faster than Gradle
Speed per se can be a good selling point (having to wait for slow builds is really annoying).
I can't really comment on anything else though as I just stumbled upon it here in HN ;)
The goal should be more like 50x faster than Gradle. Gradle is ludicrously slow (at least in every single Gradle project I’ve had to work with).
The first invocation may be. Subsequent builds are very fast, unless someone decided to write random bullshit into the build scripts that executes at configuration time, making the configuration phase impure.
I’m mostly thinking of Android projects. If I have time I’ll try some speed tests with a new basic project. But I don’t think I’ve even once done something in Android Studio and thought “huh, that was surprisingly fast”. Maybe some of the hot reloading stuff is okay (when it actually works).
Are we talking about Maven with its cache extension?
https://github.com/apache/maven-build-cache-extension
Because in my experience, this makes Maven very, very fast.
AFAIR the author made a rather unfair comparison: a simple compile vs a full Maven build (which executes a lot of additional stuff).
For Scala (of which this is probably the main target) Maven builds are especially slow. I would not be surprised if that was his early focus.
Mill's early goal was to be a saner sbt, incidentally also fixing the parts of sbt that are/were unreasonably slow due to questionable design decisions.
Maven has never been relevant to the Scala ecosystem, given that most of the community pretty much moved straight from Ant to sbt. Only a few Spark-related projects stubbornly use Maven, which is a major pain given the lack of cross-building abilities. Slow dependency resolution and inefficient use of Zinc merely add insult to injury.
Yeah... that's my experience with Scala all around - it's abysmally slow, especially if you use any sort of "metaprogramming"... (one of the reasons I stay clear of the language)
Speed wins for dev tools.
I had worked out the math for "like pip but actually works", but few people were conscious that pip didn't quite work reliably for large and complex projects, so I didn't think it was possible to sell it.
uv won hearts and minds because it was uncompromisingly fast: people did not really care that it had a correct resolution algorithm, or that it was really reliable because it is not written in Python and thus can't trash its own dependencies (maybe a solvable problem, in that the build tool can have its own virtualenv, but isn't it nice for your package manager to be a binary whose dependencies can't get screwed up no matter how hard the users try?).
It's great to see continuing innovation in the Java space!
One tool I've been using to speed up maven is mvnd, the maven daemon. It's a drop in replacement for mvn with impressive speedups.
https://github.com/apache/maven-mvnd
I'm not sure I like the daemon approach. But the cache extension provided me fantastic gains: https://github.com/apache/maven-build-cache-extension
Mill uses the same daemon design as Gradle and mvnd. You do hit edge cases occasionally, but overall it works great
The last time I checked, sbt was much faster than mill for incremental builds. Mill has a faster cold startup time, but sbt uses classloader tricks to reuse jitted classes so that it doesn't have to reload the scala standard library. When running tests continuously and rerunning on save, sbt was much faster than mill for equivalent projects. I haven't tested in three years or so though. But I would encourage people to make a simple project in sbt and mill and run `sbt ~test` and compare it to `mill -w test`. In the past, I found that after a few iterations, sbt could respond to changes in a few hundred milliseconds while mill would take multiple seconds to retest the same code. That difference really adds up when you are iterating on a problem.
That said, I have come to believe that the jvm is a bad platform for a build tool. Everything that touches the jvm becomes bloated and slow, particularly for startup. I no longer write scala because of my frustration with the bloat (and scala adds its own bloat on top of the jvm).
The test thing is just a matter of defaults: in Mill, subprocess testing is the default and in-process testing is opt-in via `.testLocal`.
I also believe that a lot of existing JVM tooling is bloated and slow, so we are in agreement! Mill tries to be different, so do give it a chance if you can. There is life beyond Maven, Gradle, and SBT.
sbt is one of the worst engineering mistakes I've ever witnessed. It was a constant source of esoteric ergonomics and frustration for no clear reason other than being the pet project of someone who really loved implicits.
I've used mill for some Scala projects in the past and I give it 5/5.
Nice try, I'm still not going to write scala code.
Build times always were my biggest Scala complaint. Arguably code base specific, as I suspect it had a lot to do with all the macros and type level metaprogramming, but, if that kind of thing is possible and customary, an average working programmer who doesn't control the codebase they show up to is going to end up stuck dealing with it.
It's a lot easier to build a language that works great, but only in the hands of a single skilled careful owner, than a language that stands up to the abuse of many careless temporary users, and still gets from point A to point B reliably.
Like the difference between a sportscar and a rental company or police fleet sedan.
Scala was a Porsche, but most of us need Camrys.
Big fan of Mill! (Coming from Maven, Gradle, and SBT.) The 1:1 mapping of build tasks to output files is the big one for me, as it makes understanding other people's builds so much easier, going through them step by step.
Why not compare it to bazel/pants/buck2 as well?
Mill seems to have taken some inspiration from those as well.
Author here. It does! I started working on Mill when I started learning Bazel, during my first months at Databricks. There's a lot of cross-pollination of ideas there, from my 7 years adopting and maintaining the Bazel build at Databricks, but I haven't had time to do a proper head-to-head comparison. Hopefully someone else can though!
What drove you away from Bazel?
I would expect anyone considering migrating away from a "legacy" tool like maven would consider a "modern" tool like Bazel first.
Bazel is a rat's nest of complexity. In my experience it does what it does very well once set up, but setting it up is tremendously complicated, and much of that is IMO incidental complexity.
Rolling out Bazel at my prior employer took about one person-decade of engineering time. I've talked to other companies that tried to roll it out and failed. Bazel is hard.
And Bazel is not really getting any easier! Like most projects, it is getting more complex over the years as features accrete. I think there is space for a tool like Mill for less sophisticated users who can't afford to spend a person-decade rolling out their build tool.
There's a big difference between the complexity of setting up Bazel and the experience of using Bazel at a company where Bazel has been set up for you. As a user I love Bazel. When working with Java, I use the java_binary and java_library rules. When coding in Go, there's nothing new to learn regarding the build, just use go_binary and go_library instead. Everything is repeatable, builds and tests are cached, it's easy to query the build dependency tree, etc.
A few startups are offering "Bazel build/test as a service." It's one way to eliminate the work involved in setting up Bazel for an organization.
Sounds great!
Sorry to hijack this thread a bit, but I currently work at a Scala shop and have grown to like writing it. I worked at a Clojure-heavy place previously. This tool looks neat.
Has anyone at the senior level recently moved on from Scala to other languages? Any issues finding jobs or learning the new role?