Ninja anyone?

Dear all,

I was wondering if there’s any strong reason why we’re using the Make generator for CMake instead of, e.g., Ninja?

I’ve been playing a bit with Ninja lately and must say I like how it seems to be quite a bit quicker for recompilations (i.e. during the regular dev cycle: after the initial build, you modify a file, recompile, modify again, recompile, etc.).



We’ve been thinking about it. Are you volunteering to add a Ninja recipe and an O2 recipe that builds with Ninja? :slight_smile:

Ninja is fantastic. The problem used to be that it could not handle Fortran. That might no longer be the case, and we should definitely investigate.

I can try to do that, yes.


I had tried it out with O2 and indeed it was nice and fast, but the blocker was Fortran.
Go for it!

Just to be sure: not counting our external dependencies (like Geant3 and some generators), we don’t use Fortran in O2, do we?

Correct, we don’t. If it is just for O2 we should be safe.

OK so we will need:

  • A proper Ninja recipe, picking Ninja from the system if possible and installing it otherwise (using prefer_system appropriately). The recipe should also properly check that the system version is recent enough.
  • A proper O2 recipe update (let’s start with it maybe) to use Ninja.
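As a very rough sketch of the first point, following the usual alidist recipe layout (the version, the prefer_system pattern, and the version-check regex below are illustrative assumptions, not a tested recipe):

```yaml
package: Ninja
version: "1.8.2"            # assumed version; pick the current one
source: https://github.com/ninja-build/ninja
prefer_system: ".*"
prefer_system_check: |
  # Accept the system ninja only if it is recent enough (assumed cut-off).
  ninja --version | grep -E '^1\.([89]|[0-9]{2})\.' || exit 1
---
#!/bin/bash -e
# Ninja bootstraps itself with its own configure script.
./configure.py --bootstrap
mkdir -p "$INSTALLROOT/bin"
cp ninja "$INSTALLROOT/bin/"
```

The exact prefer_system_check would of course have to match whatever minimum version we decide to require.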

Concerning the last point, I think we could experiment with the Common repository for starters: there’s no Fortran code there and updates are infrequent, making it an excellent testbed.

We will need to tell CMake to generate Ninja build files:

cmake $SOURCEDIR -G Ninja blahblah...

Then instead of make and make install we can directly use:

cmake --build . ${JOBS:+-- -j$JOBS}  # "make"
cmake --build . --target install     # "make install"

The above cmake statements are generic and also work with GNU Makefiles. I honestly don’t know if the installation procedure can be sped up by running it in parallel.

I am especially curious to see how our macOS installation time improves (if it does) with Ninja!

Thanks for contributing!

An alidist PR is indeed in the works along those lines :wink:
Will try to post it today.


PR done! Let’s see how it goes :wink:

I did some timings while putting the PR in place, see

To be really thorough, one might also check that all build artifacts (binaries, libraries) are actually the same in the make and ninja cases. I might actually do that next, as it could be useful if we want to evaluate more build systems at some point.
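One way to sketch such a comparison (the directory names and the checksum approach below are mine, just to illustrate the idea): hash every file in each build tree and diff the two listings.

```shell
#!/bin/sh
set -e

# Demo setup: stand-ins for real build trees. In practice, point
# these at the make and ninja build (or install) directories.
mkdir -p make-build ninja-build
printf 'binary-contents' > make-build/libO2.so
printf 'binary-contents' > ninja-build/libO2.so

# Hash every regular file, with paths relative to each tree root,
# so the two listings are directly comparable.
hash_tree() {
  (cd "$1" && find . -type f -exec sha256sum {} + | sort -k2)
}

hash_tree make-build  > make.sums
hash_tree ninja-build > ninja.sums

# diff exits non-zero if any artifact differs.
if diff make.sums ninja.sums > /dev/null; then
  echo "build trees are identical"
else
  echo "build trees differ"
fi
```

Caveat: object files and binaries often embed timestamps or absolute paths, so identical sources can still produce differing checksums; comparing only the final installed binaries and libraries (or their disassembly) is usually more meaningful.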

Do you understand why system time is so high in both cases?

@eulisse : no

Actually, finding out where the time is spent in our build might be of some interest.

I don’t know how to do it with ninja proper, but apparently shake is able to consume ninja files and seems to have a very nice build-profiling output (a query-able report.html).
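For the record, ninja itself does leave a machine-readable trace: a .ninja_log file in the build directory with per-edge start and end times. A small sketch, assuming the tab-separated v5 log format (start ms, end ms, mtime, output, command hash; check the header line of your own .ninja_log before relying on this), that lists the slowest targets:

```shell
#!/bin/sh
# Sketch: per-target build durations from a ninja build log.
set -e

# Stand-in log; in a real build, read the .ninja_log in the build
# directory instead of generating this sample.
printf '# ninja log v5\n'           >  sample.ninja_log
printf '0\t1200\t0\tfoo.o\tabc\n'   >> sample.ninja_log
printf '100\t5300\t0\tbar.o\tdef\n' >> sample.ninja_log

# Duration (ms) and output name for each edge, slowest first.
awk -F'\t' '!/^#/ { printf "%8d ms  %s\n", $2 - $1, $4 }' \
    sample.ninja_log | sort -rn > durations.txt
cat durations.txt
```

This only gives per-target wall-clock durations, not the nice aggregated report that shake produces, but it requires nothing beyond awk.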

@costing could be interested in this.

Is it easy to integrate it into our builds? It would be nice to have such profiles produced automatically. We do that already with coverage.

Can we get the report somewhere? Does it include only compilation time, or also the overhead? I have the impression that something silly (e.g. starting bash with its interactive configuration) dominates over the compile time.

That was precisely my request, see above. I’d like to have a badge for it, as you have done with Coverage.

On build time: I have done some research and opened a separate topic earlier:

@eulisse : the report I was referring to should now be available at

As far as I can tell, the report covers everything. If you go to the “command plot” drop-down menu you’ll see 3 main areas: C++ compilation (green), cd (red, which is not actually cd but dictionary/rootmap generation, AFAIK), and “:” (light blue), which I have not yet figured out ;-). Using the “command table” you get the numbers for each command (the numbers are the unparallelized ones, if I understand correctly, i.e. the total build did not take 35 minutes!)

c++     459 ×  35m24s  90.17%
cd       47 ×   3m01s   7.67%
:       183 ×  36.61s   1.55%
cmake     1 ×   8.37s   0.36%
cc        1 ×   0.13s   0.01%

@dberzano : I’m no expert on shake itself (I only installed it last week), but basically, once shake is installed (which requires the Haskell platform…) and CMake has generated the Ninja files, it’s just a matter of running shake -jN --profile (the -j is important: shake uses 1 core by default, whereas ninja, for instance, uses everything available by default). So I’d say it should be easy to integrate, yes.

OK, if you don’t mind I’ll try doing it for the next WP3 as a demonstrator!

Oh, and to be clear, what I timed was just the build (i.e. no install).