Since most of the code and folder structure are automatically generated, this leaves developers little room to decide how to organize their project.
There is an old way and a new way to write it. Neither is well documented, which makes it hard to find examples and tutorials online, as you're never sure which style they are using.
It's hard to write a complex set of CMake files for a large project from scratch without constantly referring to the documentation or searching the web, even for experienced users.
Usually, make puts the resulting binaries in the same place where it finds the sources. With CMake, the recommended way to build is 'out-of-source': put the build products somewhere else. This keeps the source tree clean and lets you use faster storage for building (like a ramdisk).
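An out-of-source build is just a separate build directory; a minimal sketch (directory names are arbitrary, and this assumes cmake is installed):

```shell
mkdir -p build      # all generated files will live here
cd build
cmake ..            # configure the project whose CMakeLists.txt is one level up
cmake --build .     # build; the source tree stays untouched
```

Deleting the entire build is then just removing the build directory.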
With CMake, one can easily customize the build and install process by introducing option() flags or by generating a custom config.h with configure_file().
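A minimal sketch of that pattern (the option name and file names here are illustrative, not from any particular project):

```cmake
option(USE_FANCY_FEATURE "Build with the fancy feature" OFF)

# Substitute #cmakedefine / @VAR@ placeholders in config.h.in
# with the current values of the corresponding CMake variables.
configure_file(config.h.in config.h)
```

The generated config.h lands in the build tree, which fits the out-of-source convention.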
ccmake is also a handy tool to manage these options.
CMake itself suffers from technical debt and also lets users shoot themselves in the foot with quick-and-dirty hacks. This can lead to huge headaches when the project scales or is ported to more platforms.
Running CMake takes a very long time. In projects consisting of multiple independent packages (e.g. ROS projects), CMake can take more time than the actual compilation or the unit tests.
CMake comes with the fantastic ncurses GUI ccmake out of the box. Large codebases including WeeChat use it, to the point that developers never need to touch CMake code directly, though it is highly readable should they wish to examine it.
GNU Make is not limited to building packages. It can also be used to install or remove a package, generate tags for it, or do anything else that can and should be done programmatically.
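For example, installation and tag generation can be plain phony targets (the program name and paths here are made up):

```make
.PHONY: install uninstall tags

install: myprog
	install -m 755 myprog /usr/local/bin/myprog

uninstall:
	rm -f /usr/local/bin/myprog

tags:
	ctags -R .
```

Each task is then just `make install`, `make uninstall`, or `make tags`.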
Very useful for cross-compiling for multiple targets simultaneously.
Easy to customize build flags depending on targets, target files, external flags, or computed results.
Suffix-based rules for build chains.
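Both points fit in a few lines of makefile; a sketch with illustrative target names and flags:

```make
# Target-specific variable: only the debug target (and what it builds)
# gets the extra flags appended.
debug: CFLAGS += -g -O0
debug: myprog

# Old-style suffix rule describing how any .c becomes a .o.
.SUFFIXES: .c .o
.c.o:
	$(CC) $(CFLAGS) -c $< -o $@
```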
With Make, it's easy for a user who wants to compile a project to do so. All they need to do is run make, and GNU Make takes care of the rest without the user even needing to know what's happening.
Other build tools need wrapper modules to do certain tasks. The biggest disadvantage of these wrapper modules is that they bind you to a specific version of the tool. With Make you don't have that problem: there is no need for wrappers, and nothing ties you to a particular version, so you can use any version of Make that you want.
The learning curve is rather steep and the documentation could be improved. If it weren't for the wealth of free software projects that demonstrate how the tool can and should be used, the task of picking up autotools would be very hard.
Common bugs are silent in make, such as misspelled variable names or wrong dependencies. Several features make it easy to shoot yourself in the foot, such as target-specific variables, which are carried over to the dependencies of that target.
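Two contrived examples of these silent failures:

```make
OBJECTS = app.o common.o
all: $(OBJETCS)    # misspelled: expands to nothing, so make quietly does no work

# The flag added for `app` also leaks into building its prerequisite
# common.o, even though other targets may depend on common.o without it.
app: CFLAGS += -DAPP_ONLY
app: app.o common.o
	$(CC) $(CFLAGS) -o $@ $^
```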
"Recursive make" is a common makefile coding pattern used to invoke another session of make. Since a session of make only reads in one top-level makefile, this is an easy and natural way to build makefiles for projects made of several submodules.
But this pattern causes a lot of problems, chiefly that you need to partition the dependency tree into several smaller trees. This prevents dependencies from being expressed correctly between instances, and it causes parts of the dependency tree to be calculated multiple times, which hurts performance. These and many other problems with recursive make are explained very well in a classic article called Recursive Make Considered Harmful.
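The pattern itself is only a few lines (subdirectory names assumed here), which is exactly why it is so tempting:

```make
SUBDIRS = libfoo app

.PHONY: all $(SUBDIRS)
all: $(SUBDIRS)

# Each sub-make reads only its own makefile, so dependencies that
# cross directory boundaries are invisible to it.
$(SUBDIRS):
	$(MAKE) -C $@
```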
If you need support for building releases, detecting system capabilities, and running a full test suite on your release tarball to ensure that it actually works, then you can easily move up to full GNU automake and autoconf.
Cannot incrementally modify files (e.g. a LaTeX PDF, or vising and lighting Quake maps, where the same BSP file serves as both input and output), and will not delete stale files (e.g. rm build/*.o).
Builds only use input files that are explicitly declared in the build specification. On Linux, Bazel runs tools in a sandboxed environment that contains only the minimum necessary files required.
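In a BUILD file, those inputs are listed explicitly; a sketch in Starlark with made-up target and file names:

```starlark
# Only the files named in srcs/hdrs enter the sandbox; referencing an
# undeclared header from greeter.cc would fail the build.
cc_library(
    name = "greeter",
    srcs = ["greeter.cc"],
    hdrs = ["greeter.h"],
)

cc_binary(
    name = "hello",
    srcs = ["main.cc"],
    deps = [":greeter"],
)
```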
Tried to build TensorFlow with an up-to-date Bazel? You may be out of luck: recently it could be compiled only with a single (and not the most recent) version of Bazel.
Buck is not made available through the official repositories of major Linux distributions such as Debian, which suggests it's an obscure tool with limited adoption and even smaller relevance.
Gradle is first and foremost a dependency programming tool. Gradle will make sure that all declared dependencies are properly executed for whatever task you run in your setup. The code can be spread across many directories in any kind of file layout.
Gradle has full integration with JetBrains IntelliJ IDEA.
IDEA understands multi-module Gradle builds and automatically maintains the IDEA modules within the project.
You also have the option to run unit tests with the built-in JUnit/TestNG test runner, or to delegate running the tests to Gradle using the same visualization as the built-in runner.
Android Studio's build system is an Android plugin for Gradle. What's more, the Android Gradle plugin can be installed and run even on machines that don't have Android Studio, which lets you build Android apps anywhere (for example, on continuous integration servers).
Gradle follows the convention-over-configuration paradigm, making life easier for developers by having a number of decisions already made out of the box.
Since Gradle uses its own Groovy-based DSL rather than XML, Gradle scripts tend to be shorter than those of build tools that use XML. Boilerplate is also considerably smaller because the DSL is designed to solve a specific problem: moving software through its lifecycle, from compilation through static analysis and testing to packaging and, finally, deployment.
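For comparison, a complete build script for a simple Java project can be this short (Groovy DSL; the test dependency coordinate is just an example):

```groovy
plugins {
    id 'java'            // convention over configuration: standard source
}                        // layout, compile/test/jar tasks all come for free

repositories {
    mavenCentral()
}

dependencies {
    testImplementation 'junit:junit:4.13.2'
}
```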
It claims to be language-agnostic and to support C++ out of the box. Technically, yes. In practice, it is a build tool built by Java developers for Java developers, and you can really feel it. It's small things, but there are a lot of them, and they can grow into big pains.
It seems to add far too much complexity to projects. The build system has a tendency to be more complex than the actual projects that it's being used to build.
Build your project, run the tests, create a release tarball, unpack it with read-only sources, build it and run the tests. This should be the minimum standard for every build system, yet it seems hard to reach.
You can write code for your build system in Ruby. While not my choice for general programming, Ruby is powerful and expressive. Given some knowledge of Ruby, you can create powerful Rake extensions that result in your average target only needing a few lines in the rakefile in spite of having complex behaviors (Is the library for public consumption, or only for use within the current repo/tier? Compile certain files on certain platforms? Link to libraries published from other repos? etc.).
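A minimal self-contained sketch of a Rake file task (the file names and the upcasing "compile" step are invented for illustration; rake ships with Ruby, so this runs as a plain script):

```ruby
require 'rake'
include Rake::DSL   # makes `file`/`task` available at the top level

# A file task reruns only when a prerequisite is newer than the output.
file 'out.txt' => ['in.txt'] do |t|
  File.write(t.name, File.read(t.prerequisites.first).upcase)
end

File.write('in.txt', 'hello rake')
Rake::Task['out.txt'].invoke
puts File.read('out.txt')   # prints "HELLO RAKE"
```

In a real rakefile the body would invoke a compiler instead of upcasing text, but the dependency mechanics are the same.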
For large codebases or with complex extensions, Rake can become quite slow. I'm aware of one codebase on which it can take 15 minutes to determine that no changes have been made and no recompilation is necessary.