Brad King 1296a0eada Ninja: Fix inter-target order-only dependencies of custom commands
Custom command dependencies are followed for each target's source files,
and their transitive closure is added to the corresponding target.  This
means that when a custom command in one target has a dependency on a
custom command in another target, both will appear in the dependent
target's sources.  For the Makefile, VS IDE, and Xcode generators this
is not a problem because each target gets its own independent build
system that is evaluated in target dependency order.  By the time the
dependent target is built the custom command that belongs to one of its
dependencies will already have been brought up to date.

For the Ninja generator we need to generate a monolithic build system
covering all targets so we can have only one copy of a custom command.
This means that we need to reconcile a custom command's target-level
ordering dependencies across all the targets in which it appears, keeping
only the least-dependent common set.  This is done by computing the set
intersection of the dependencies of all the targets containing the custom
command.  However, we previously intersected only the direct dependencies,
so any target-level dependency that was not added directly to every target
into which a custom command propagates was discarded.

Fix this by computing the transitive closure of dependencies for each
target and then intersecting those sets, which yields the common set of
dependencies.  Also add a test covering a case that fails when the
target ordering dependencies are incorrectly dropped.
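
As an illustration only (the target and file names here are invented and
this is not the actual test added by the commit), the kind of project that
exercises this case looks roughly like:

  cmake_minimum_required(VERSION 3.0)
  project(OrderOnlyDemo NONE)

  # 'producer' owns a custom command that generates produced.txt.
  add_custom_command(OUTPUT produced.txt
    COMMAND ${CMAKE_COMMAND} -E touch produced.txt)
  add_custom_target(producer DEPENDS produced.txt)

  # 'consumer' owns a custom command whose input is the output of the
  # command above, so that command is pulled into consumer's sources too.
  # The target-level ordering dependency added by add_dependencies() must
  # survive into the single generated build.ninja.
  add_custom_command(OUTPUT consumed.txt
    COMMAND ${CMAKE_COMMAND} -E copy produced.txt consumed.txt
    DEPENDS produced.txt)
  add_custom_target(consumer ALL DEPENDS consumed.txt)
  add_dependencies(consumer producer)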

If you are thinking about adding a new testcase, here is a small checklist
you can run through to find the proper place for it. Go through the list from
the beginning and stop once you find something that matches your test's needs;
e.g. if you will test a module and only need the configure mode, use the
instructions from section 2, not 3.

1. Your testcase can run in CMake script mode, i.e. "cmake -P something"

Put your test in the Tests/CMakeTests/ directory as a .cmake.in file. It will be
put into the test binary directory by configure_file(... @ONLY) and run from
there. Use the AddCMakeTest() macro in Tests/CMakeTests/CMakeLists.txt to add
your test to the test runs.
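
As a hedged illustration (the file name and the checks are invented, not an
existing CMake test), such a script might look like:

  # Tests/CMakeTests/ExampleTest.cmake.in -- runs as "cmake -P ExampleTest.cmake"
  # after configure_file(... @ONLY) has copied it to the test binary directory.
  set(expected "a;b;c")
  set(actual "")
  list(APPEND actual a b c)
  if(NOT "${actual}" STREQUAL "${expected}")
    message(FATAL_ERROR "list mismatch: got '${actual}', expected '${expected}'")
  endif()
  message(STATUS "ExampleTest passed")

It would then be registered with AddCMakeTest() in
Tests/CMakeTests/CMakeLists.txt alongside the existing entries.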

2. Your test needs CMake to run in configure mode, but will not build anything

This includes tests that build something using try_compile() and friends,
but nothing that expects the results of add_executable(), add_library(), or
add_test() to actually be built or run.

If the test configures the project only once and it must succeed then put it
into the Tests/CMakeOnly/ directory.  Create a subdirectory named after your
test and write the CMakeLists.txt you need into that subdirectory. Use the
add_CMakeOnly_test() macro from Tests/CMakeOnly/CMakeLists.txt to add your
test to the test runs.
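
As a hedged illustration (directory and variable names are invented), a
single-shot configure-only test could look like:

  # Tests/CMakeOnly/ExampleCheck/CMakeLists.txt (hypothetical)
  cmake_minimum_required(VERSION 3.0)
  project(ExampleCheck C)

  include(CheckCSourceCompiles)
  check_c_source_compiles("int main(void) { return 0; }" EXAMPLE_C_WORKS)
  if(NOT EXAMPLE_C_WORKS)
    message(FATAL_ERROR "a trivial C program did not compile")
  endif()

It would then be registered in Tests/CMakeOnly/CMakeLists.txt with a call
along the lines of add_CMakeOnly_test(ExampleCheck).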

If the test configures the project with multiple variations and verifies
success or failure each time then put it into the Tests/RunCMake/ directory.
Read the instructions in Tests/RunCMake/CMakeLists.txt to add a test.
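
A hedged sketch of the usual layout (the case names here are placeholders):

  # Tests/RunCMake/Example/RunCMakeTest.cmake (hypothetical)
  include(RunCMake)
  run_cmake(GoodCase)   # GoodCase.cmake is expected to configure cleanly
  run_cmake(BadCase)    # BadCase.cmake plus BadCase-result.txt and
                        # BadCase-stderr.txt describe the expected failure

Each run_cmake(<case>) configures <case>.cmake in a fresh build tree and
compares the exit code and output against the matching -result.txt and
-stderr.txt files when those are present.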

3. If you are testing something from the Modules directory

Put your test in the Tests/Modules/ directory. Create a subdirectory there
named after your test. Use the ADD_TEST_MACRO macro from Tests/CMakeLists.txt
to add your test to the test run. If you have put your test in
Tests/Modules/Foo then you add it with ADD_TEST_MACRO(Module.Foo Foo).
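
For example (a hedged sketch; the module exercised and the source file are
placeholders, with main.c assumed to sit next to the CMakeLists.txt):

  # Tests/Modules/Foo/CMakeLists.txt (hypothetical)
  cmake_minimum_required(VERSION 3.0)
  project(Foo C)

  include(CheckIncludeFile)           # the module behavior under test
  check_include_file("stdio.h" HAVE_STDIO_H)
  if(NOT HAVE_STDIO_H)
    message(FATAL_ERROR "CheckIncludeFile did not find stdio.h")
  endif()

  # ADD_TEST_MACRO(Module.Foo Foo) builds this project and runs the 'Foo'
  # executable as the actual test.
  add_executable(Foo main.c)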

4. You are doing other stuff.

Find a good place ;) If in doubt, mail cmake-developers@cmake.org and ask for
advice.