Autotools recommendations

Actually, distro maintainers will hate you, because it forces them to track down every source of every macro used inside the configure.ac or .m4 files every time they touch one of those files in a backport.

Just use meson, it’s actually readable.

/a distro packager of 21 years


This is a strange forum. A guy comes and asks for information about Anjuta + Autotools and is told not to use Autotools. I try to give my help, since I actually like Autotools, but my answer gets split into another thread.

The original poster of course wonders why he/she should not use Autotools, but gets asked whether he/she is a troll.

In the meanwhile in this split discussion (which was born as one of the few actual answers to the original question), it looks like I have to pass an exam for liking Autotools, although I repeat that liking a build system is purely subjective.


I am ready to switch to meson/CMake/whatever then. But you would need to explain to me how I can port the following features from Autotools to the alternative build system:

  1. Optionally forking the original code while building and creating a split package of the library, removing from both the public header and the code the functions that rely on I/O operations (for embedded systems) – currently configure --without-io-api does that
  2. Amending the library on the fly so that it can be compiled for platforms that do not provide the C standard library (freestanding environments), guessing the missing data types accordingly (we must use compile checks only, or otherwise it won’t work when cross-compiling) – currently configure --without-libc does that
  3. Exporting the version number everywhere so that I can write it only once every time I release a new version (by everywhere I mean source code, package.json, documentation, and so on) – currently configure --enable-extended-config does that
  4. Same as above for the binary version (libtool version info)
  5. Renaming the package during build – currently that is managed by Autotools transformations
  6. Making it work under Windows and updating the current MSYS2 libconfini package accordingly
  7. Generating the .def and .exp files under Windows
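Not an answer for every item, but as a hedged sketch of how the first few could map onto Meson (the option names, project name and version below are made up; only the Meson functions themselves are real):

```meson
# meson_options.txt – hypothetical options mirroring the configure flags
option('io-api', type: 'boolean', value: true,
       description: 'Build the I/O-dependent part of the API')

# meson.build
project('example', 'c', version: '1.2.3')   # the version is written once, here

cc = meson.get_compiler('c')

# Compile-only check: nothing is executed, so it also works when cross-compiling
have_size_t = cc.compiles('#include <stddef.h>\nsize_t s;', name: 'size_t exists')

conf = configuration_data()
conf.set_quoted('PROJECT_VERSION', meson.project_version())  # exported to C code
conf.set10('WITH_IO_API', get_option('io-api'))              # drives the source split
configure_file(output: 'config.h', configuration: conf)
```

Whether the more exotic items (libtool version info, package renaming, .def/.exp generation) have equally direct equivalents is a separate question.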

I probably forgot to mention a similar number of other features, but I don’t have a very good memory.


If you touch only the Makefile.am files, the configure script will not be re-generated, and thus m4 will not be required. If you touch the configure.ac file instead, the m4 code is run again. What concrete scenario of backporting asks you to touch the configure.ac file?

Just to make it clear, for Autotools to break in your scenario we need all the following conditions to be true:

  1. A package needs backporting
  2. For very strange reasons such backporting requires you to edit the configure.ac file
  3. The configure.ac file makes use of exotic macros

You must be a very unlucky distro maintainer. How many times in your lifetime did this odd event happen?

No. My first sentence links to the autotools FAQ answering the original question EXACTLY.

YOU then came and blew this up into a semi-theological discussion about belief systems.

You know, like in scientific papers: links to direct quotes, references to web pages from distributions. You claim, you have to deliver. It is not my job to read between the lines of some random opinion site like Hacker News.


Please everyone let’s turn down the heat, remember that we are all colleagues here…

I think that would be very interesting, e.g. for GuixSD. It could be a good way to modernize it.

CMake and Meson both support build options, variables and compile checks, and have good support for Windows. We can go through it if you decide you want to try something new and pick one. It’s up to you, though.


This has happened to me more than once; I don’t know exactly how many times. It’s been much less of a problem recently, since most desktop packages no longer use Autotools. But for Autotools projects, patches touching configure.ac are pretty common, and that necessarily requires running autoreconf. Because we don’t run autoreconf for all packages in Fedora, it is very common that this results in packages that fail to build. It might be due to an API break in autoconf-archive, or more rarely in autoconf itself. But usually it’s just caused by an m4 macro being altogether missing. It’s a pain to try to hunt down which package provides a particular macro. Inevitably it results in a confusing syntax-related error message: only if you remember to use AX_REQUIRE_DEFINED religiously will you get a reasonable error message that tells you that you are missing a macro (but not where to find it). God forbid you receive a “I can’t build your project” bug report from a user missing autoconf-archive and therefore AX_REQUIRE_DEFINED…
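For readers who have not met it, AX_REQUIRE_DEFINED turns the silent “undefined macro leaks into configure as literal text” failure into an explicit error at autoreconf time. A minimal sketch (the macro names are real; the project name is made up):

```m4
# configure.ac (illustrative)
AC_INIT([example], [1.0])

# Abort at autoreconf time with a clear message if the macro is undefined,
# instead of letting the literal string AX_CHECK_COMPILE_FLAG end up in the
# generated configure script and produce a confusing shell syntax error.
AX_REQUIRE_DEFINED([AX_CHECK_COMPILE_FLAG])
AX_CHECK_COMPILE_FLAG([-Wall], [CFLAGS="$CFLAGS -Wall"])

AC_OUTPUT
```

The catch mentioned above remains: AX_REQUIRE_DEFINED itself ships with autoconf-archive, so a user missing the archive loses the guard too.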


It got split into another thread because the answers were spiralling into the usual “tabs vs spaces” and “emacs vs vi” thread, and it literally took two replies to get there.

Splitting topics is the least disruptive option before closing them, or deleting replies.

Having said that, I’m all for closing this particular discussion.



The original poster might have missed it, which might explain why they did not answer with “Thanks for the FAQ”, but answered with “Why do I need to switch from Autotools?”.

What I did was answer a person who asked about Autotools-related stuff, as a person who actually likes Autotools and is even mildly enthusiastic about it. But I agree with @jfrancis that we should all cool down.


It would require an incredible amount of work in my case, without any certainty that it will be possible or worth it. What I needed was to “have full control over the generated makefile”, as you said in this comment. Once I had that, I exploited the fact that Autotools relies on m4 macros and released my solutions as a separate m4 macro collection, so that everyone could benefit. This kind of micro-extensibility is what I find missing in other build systems.


There is an “original sin” in the Autoconf Archive. For inexplicable reasons the AX maintainers decided to add a configure.ac to their repository, so developers thought “Oh! It’s a package!”, and started to depend on it instead of exploiting it (by manually copying and pasting the macros that they needed).

This is one of the reasons that made me create the Not Autotools project (see also this discussion).


It’s fine, I was not really protesting against the split, I only found the accumulation of split + various answers unusual.

You can make external modules and macros in CMake such as the ones maintained by KDE. In Meson it’s preferred to contribute extra modules upstream.
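As a sketch of what consuming such an external module collection looks like on the CMake side (using KDE’s extra-cmake-modules as the example; treat the exact invocation as an illustration rather than a verified recipe):

```cmake
# CMakeLists.txt – pulling macros from KDE's extra-cmake-modules (ECM)
cmake_minimum_required(VERSION 3.16)
project(example VERSION 1.2.3 LANGUAGES C)

find_package(ECM REQUIRED NO_MODULE)              # locate the module collection
list(APPEND CMAKE_MODULE_PATH ${ECM_MODULE_PATH})

include(ECMSetupVersion)                          # one of the ECM modules
ecm_setup_version(PROJECT                         # export the project version into
    VARIABLE_PREFIX EXAMPLE                       # a generated header, roughly what
    VERSION_HEADER example_version.h)             # AC_INIT + config.h give you
```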

But anyway this discussion is getting way off-topic for Anjuta and GNOME.

I don’t think letting an IDE edit the build system automatically is a good idea. In Visual Studio there is an XML-based build system that is not human-editable, and to change a build parameter you have to go through several windows and tool options instead of writing a single line; then, a few releases later, they might move that option somewhere else, and because of that many learning resources became unusable within a few years (worst of all, the VS compiler doesn’t have a CLI at all).

In my opinion the IDE way is not good for building stable software: it adds layers of complexity and unnecessary abstractions, because IDEs always try to generalize. I don’t see any IDE solving this issue, not Anjuta, Builder, VS, NetBeans, Eclipse, or IntelliJ.

Although Autotools comes with its own problems, its dependency on Perl is more stable than a dependency on Python, and Autotools is mature.

Since you are good at Autotools I can’t recommend anything new to you, but for any beginners reading this thread I recommend the following resources:



Although I do agree in general with this sentence, I also think that we should not pull down good ideas when they come. Anjuta was an excellent idea, which carried a lot of potential improvements for both itself and Autotools, and it’s a pity that it lacks maintainers at the moment.

I have tried to replace it with GNOME Builder, but the only thing I get from installing GNOME Builder is that I am forced to install flatpak, although I hate not using pacman for installing packages (once you install flatpak on Arch a lot of duplicate packages appear in GNOME Software, one copy coming from Arch and the other from Flathub), and in my opinion the AUR is a much better way to deal with bleeding-edge software. Besides that, there is a total lack of support for Autotools, so it is useless for me. And since I am firm in not wanting flatpak installed on my machine, I always have to uninstall GNOME Builder after installing GNOME.

Nothing against GNOME’s way of building software, but it should be clear that Anjuta and GNOME Builder have different scopes. The first is a general purpose IDE written for GNOME that relies on GNU Autotools, the second is an IDE designed to build software that complies with GNOME’s current way of building software.

EDIT I wanted to test if things were still like that after my comment, so I just installed GNOME Builder on Arch again, and it did not force me to install flatpak this time, so I will have to take back my previous statement.

SECOND EDIT Autotools support has improved a lot since last time I tried GNOME Builder, so I take back most of my comment.

Just to make sure people after you don’t get a false understanding of what GNOME Builder can or cannot do: GNOME Builder is perfectly capable of building Autotools projects. Here is an example from your project @madmurphy

All I had to do was clone the repo, hit build, and switch to the build terminal to execute the tests. You are probably right in the sense that not all convenience functions are available. For example, we can’t derive tests from an Autotools project. But working with an Autotools project is perfectly possible.

GNOME Builder does not force you to install anything flatpak-related unless you open a flatpak project. In that case it helps by setting everything up for the user so the project builds automatically. I think you can cancel it, switch to the native build, ignore the flatpak manifest and go from there. Nothing is forced at all here.


You are right, I checked and edited my comment accordingly. Arch no longer forces you to install flatpak along with GNOME Builder.

So I have nothing against Builder anymore :slightly_smiling_face:

Just to make it clear that I did not dream it, this was Arch GNOME Builder’s PKGBUILD from March 25th:

You can see that it depended on flatpak-builder, which in turn depends on flatpak.

In my experience there are working CLI tools; have you seen MSBuild?

Since you are trying Builder with Autotools, tell us how it compares to Anjuta, and whether you would recommend that Anjuta users migrate to Builder.

From what I understand, the whole topic of this thread is about finding ways and tips to make an IDE automate some Autotools tasks. Although I don’t believe in this approach, I want to know what problem you face specifically, since you started this topic. It is better to give details and concrete cases where things need improvement.


Hi guys,
As the person who originated this discussion I can say the following:

Builder is not capable of building (or, better yet, reporting a meaningful error for) a project that was made with Autotools/Anjuta.

And I just started a thread about this.

Thank you.

I did not use Builder enough to compare, and in general I tend to do the heavy lifting without an IDE. It seems to me that while the graphical interface of Builder is more polished, Anjuta has many more options that are specific to Autotools.

As it must support different build systems, I am not sure Builder can ever reach the same level of introspection. But if, like me, you are used to doing everything by hand, I am also not sure you will ever really miss that. For beginners Anjuta is great because it teaches them how to translate the basic structure of a project into a functioning Autotools project. Builder at the moment cannot do the same. For example, I did not find any option to add an Automake target to the project via the IDE. Something like,

examplesdir = $(docdir)/examples
examples_DATA = \
	my_example1.c

which is possible via Anjuta – and that is how I learned how Autotools works. However Anjuta needs maintenance, or otherwise it will be dead soon. I guess a big help from the community is what Anjuta needs.

Personally I became a fan of meson/ninja. CMake is also ok-ish, though I don’t like it as much.

As for autotools, autoconf, automake, GNU configure, m4 macros… I’d love to see them go away completely. I am not saying “./configure --options” is bad; I like the simplicity. But I would never want to maintain this myself. It’s just too much ugliness and messiness…


Here we are in the realm of opinions, but I think that neither Meson nor CMake comes even close to being the ultimate solution for a build system. But I also think that bringing more competition between build systems is a good thing, and therefore both Meson and CMake do a good job.

Whatever the final solution will be, it will have to pass through a major standardization of operating systems first, with Microsoft Windows becoming more Unix-like or finally disappearing, more consistency between the BSD and GNU flavors of the Unix-like world, and Google not attempting to reinvent the wheel.

Personally, I believe that a build system must be developed in osmosis with a compiler, so I am not even sure whether a perfect build system able to support different programming languages and compilers can ever exist – unless compilers get standardized too, with MSVC being rewritten from scratch and Clang starting to support all its unsupported languages.

As for the future of Autotools, you might want to have a look at the discussion that took place on Autoconf’s mailing list right before the current version (2.71) was released, accompanied by a reignition of enthusiasm. The starting point was the following blog post written by Zack Weinberg:

Strengths, weaknesses, opportunities, and threats facing the GNU Autotools

I’ve been a contributor to GNU projects for many years, notably both GCC and GNU libc, and recently I led the effort to make the first release of Autoconf since 2012 (release announcement for Autoconf 2.70). For background and context, see the LWN article my colleague Sumana Harihareswara of Changeset Consulting wrote.

Autoconf not having made a release in eight years is a symptom of a deeper problem. Many GNU projects, including all of the other components of the Autotools (Automake, Libtool, Gnulib, etc.) and the software they depend upon (GNU M4, GNU Make, etc.) have seen a steady decline in both contributor enthusiasm and user base over the past decade. I include myself in the group of declining enthusiasts; I would not have done the work leading up to the Autoconf 2.70 release if I had not been paid to do it. (I would like to say thank you to the project funders: Bloomberg, Keith Bostic, and the GNU Toolchain Fund of the FSF.)

The Autotools are in particularly bad shape due to the decline in contributor enthusiasm. Preparation for the Autoconf 2.70 release took almost twice as long as anticipated; I made five beta releases between July and December 2020, and merged 157 patches, most of them bugfixes. On more than one occasion I was asked why I was going to the trouble—isn’t Autoconf (and the rest of the tools by implication) thoroughly obsolete? Why doesn’t everyone switch to something newer, like CMake or Meson? (See the comments on Sumana’s LWN article for examples.)

I personally don’t think that the Autotools are obsolete, or even all that much more difficult to work with than some of the alternatives, but it is a fair question. Should development of the Autotools continue? If they are to continue, we need to find people who have the time and the inclination (and perhaps also the funding) to maintain them steadily, rather than in six-month release sprints every eight years. We also need a proper roadmap for where further development should take these projects. As a starting point for the conversation about whether the projects should continue, and what the roadmap should be, I was inspired by Sumana’s book in progress on open source project management (sample chapters are available from her website) to write up a “strengths, weaknesses, opportunities, and threats” analysis of Autotools.

This inventory can help us figure out how to build on new opportunities, using the Autotools’ substantial strengths, and where to invest to guard against threats and shore up current weaknesses.

Followup discussion should go to the Autoconf mailing list.


Strengths

In summary: as the category leader for decades, the Autotools benefit from their architectural approach, interoperability, edge case coverage, standards adherence, user trust, and existing install base.

  • Autoconf’s feature-based approach to compiled-code portability scales better than lists of system quirks.
  • The Autotools carry 30+ years’ worth of embedded knowledge about portability traps for C programs and shell-based build scripting on Unix (and to a lesser extent Windows and others), including variants of Unix that no other comparable configuration tool supports.
  • Autoconf and Automake support cross-compilation better than competing build systems.
  • Autoconf and Automake support software written in multiple languages better than some competing build systems (but see below).
  • Autoconf is very extensible, and there are lots of third-party “macros” available.
  • Tarball releases produced by Autotools have fewer build dependencies than tarball releases produced by competing tools.
  • Tarball releases produced by Autotools have a predictable, standardized (literally; it’s a key aspect of the “GNU Coding Standards”) interface for setting build-time options, building them, testing them, and installing them.
  • Automake tries very hard to generate Makefiles that will work with any Make implementation, not just GNU make, and not even just (GNU or BSD) make.
  • The Autotools have excellent reference-level documentation (better than CMake and Meson’s).
  • As they are GNU projects, users can have confidence that Autotools are and will always remain Free Software.
  • Relatedly, users can trust that architectural decisions are not driven by the needs of particular large corporations.
  • There is a large installed base, and switching to a competing build system is a lot of work.


Weaknesses

In summary: Autoconf’s core function is to solve a problem that software developers, working primarily in C, had in the 1990s/early 2000s (during the Unix wars). System programming interfaces have become much more standardized since then, and the shell environment, much less buggy. Developers of new code, today, looking at existing configure scripts and documentation, cannot easily determine which of the portability traps Autoconf knows about are still relevant to them. Similarly, maintainers of older programs have a hard time knowing which of their existing portability checks are still necessary. And weak coordination with other Autotools compounds the issue.


Autoconf proper

  • Autoconf (and the rest of the Autotools) are written in a combination of four old and difficult programming languages: Bourne shell, the portable subset of Make, Perl, and M4. Competing build systems tend to use newer, more ergonomic languages, which both makes it easier for them to get things done, and makes it easier for them to attract new developers.
  • All the supported languages except C and C++ are second-class citizens.
  • The set of languages that are supported has no particular rationale. Several new and increasingly popular compiled-code languages (e.g. Swift and Rust) are not supported, while oddities like Erlang are.
  • Much of that 30 years’ worth of embedded knowledge about portability traps is obsolete. There’s no systematic policy for deciding when some problem is too obsolete to worry about anymore.
  • Support for newer platforms, C standard editions, etc. is weaker than support for older things.
  • Autoconf’s extensibility is unsystematic; many of those third-party macros reach into its guts, and do things that create awkward compatibility constraints on core development. Same for existing configure.acs.
  • The code quality of third-party macros varies widely; bad third-party macros reflect poorly on Autoconf proper.
  • Some of the ancillary tools distributed with Autoconf don’t work well; most importantly, autoupdate (which is supposed to patch a configure.ac to bring it in line with current Autoconf’s recommendations) is so limited and unreliable that it might be better not to have it at all.
  • Feature gaps in GNU M4 hold back development of Autoconf.

The Autotools as a whole

  • There are few active developers and no continuing funders.
  • GNU project status discourages new contributors because of the paperwork requirements and the perceived lack of executive-level leadership.
  • There is no continuous integration and no culture of code review. Test suites exist but are not comprehensive enough (and at the same time they’re very slow).
  • Bugs, feature requests, and submitted patches are not tracked systematically. (This is partially dependent on FSF/GNU infrastructure improvements which are indefinitely delayed.)
  • There’s a history of releases breaking compatibility, and thus people are hesitant to upgrade. At the same time, Linux distributions actively want to force-upgrade everything they ship to ensure architecture support, leading to upstream/downstream friction.
  • Guide-level documentation is superficial and outdated.
  • Building an Autotools-based project directly from its VCS checkout is often significantly harder than building it from a tarball release, and may involve tracking down and installing any number of unusual tools.
  • The Autotools depend on other GNU software that is not actively maintained, most importantly GNU M4, and to a lesser extent GNU Make.
  • Coordination among the Autotools is weak, even though the tools are tightly coupled to each other. There are portions of codebases that exist solely for interoperability with other tools in the toolchain, which leads to overlapping maintainer and reviewer responsibility, slow code review and inconvenient copyright assignment processes multiplying, and causing confusion and dropped balls. For instance, there is code shared among Autoconf, Automake, and/or Gnulib by copying files between source repositories; changes to these files are extra inconvenient. The lack of coordination also makes it harder for tool maintainers to deprecate old functionality, or to decouple interfaces to make things more extensible; maintainers do not negotiate policies with each other to help. For instance, Autoconf has trouble knowing when it is safe to remove internal kludges that old versions of Automake depend on, and certain shell commands (e.g. aclocal) are distributed with one package but abstractly belong to another.
  • Division of labor among the Autotools, and the sources of third-party macros, is ad-hoc and unclear. (Which macros should be part of Autoconf proper? Which should be part of Gnulib? Which should be part of the Autoconf Macro Archive? Which should be shipped with Automake? Which tools should autoreconf know how to run? Etc.)
  • Automake and Libtool are not nearly as extensible as Autoconf is.
  • Unlike several competitors, Automake only works with Make, not with newer build drivers (e.g. Ninja).
  • Because Automake tries to generate Makefiles that will work with any Make implementation, the Makefiles it generates are much more complicated and slow than they would be if they took advantage of GNU and/or BSD extensions.
  • Libtool is notoriously slow, brittle, and difficult to modify (even worse than Autoconf proper). This is partially due to technical debt and partially due to maintaining support for completely obsolete platforms (e.g. old versions of AIX).
  • Libtool has opinions about the proper way to manage shared libraries that Linux distributions actively disagree with, forcing them to kludge around its code during package builds.
  • Alternatives to Libtool have all failed to gain traction, largely because Automake only supports building shared libraries using Libtool or an exact drop-in replacement.


Opportunities

Because of its extensible architecture, install base, and wellspring of user trust, Autotools can react to these industry changes and thus spur increases in usage, investment, and developer contribution.

  • Renewed interest in Autotools due to the Autoconf 2.70 release.
  • Renewed interest in systems programming due to the new generation of systems programming languages (Go, Rust, D, Swift(?), Dart(?), etc.) may create an opportunity for a build system that handles them well, particularly if it handles polyglot projects well (see below).
  • Cross-compilation is experiencing new appeal because of the increasing popularity of ARM and RISC-V CPUs, and of small devices (too small to compile their own code) based on these chips.
  • The Free software ecosystem as a whole would benefit from a reconciliation between the traditional model of software distribution (compiled code with stable interfaces, released as tarballs at regular intervals, installed once on any given computer and depended on as shared libraries and/or binaries) and the newer “depend directly on VCS checkouts and bundle everything” model described below. Autotools contributors have the experience and knowledge to lead this effort.
  • Funding may be available for projects targeting the weaknesses listed above.


Threats

These threats may lead to a further decrease in Autotools developer contribution, funding, and momentum.

  • Increasing mindshare of competing projects (CMake, Meson, Generate-Ninja, …).
  • Increasing mindshare of programming languages that come with a build system that works out of the box, as long as you only use that one language in your project. (These systems typically cannot handle a polyglot project at all, hence the above opportunity for a third-party system that handles polyglot projects well.)
  • Increasing preference for building software from VCS checkouts (perhaps at a specific tag, perhaps not) rather than via tarballs.
  • Increasing mindshare of the software distribution model originated by Node.js, Ruby, etc. where each application bundles all of its dependencies. While this is considered a profoundly bad idea by Linux distribution maintainers in particular (because it makes it much harder to find and patch a buggy dependency) and makes it harder for end-users to modify the software (because out-of-date dependencies may be very different from what their own documentation—describing the latest version—says), it is significantly more convenient for upstream developers. Competing build systems handle this model much better than Autoconf does.

Thanks to Sumana Harihareswara for inspiration and editing.

Followup discussion should go to the Autoconf mailing list.


This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.