
build: new system to build dependencies locally if needed #4242

Open
wants to merge 2 commits into master
Conversation

lgritz
Collaborator

@lgritz lgritz commented Apr 24, 2024

High level TLDR:

  • If checked_find_package doesn't find a dependency (or it is not an
    acceptable version), it looks for src/cmake/build_<pkg>.cmake and
    if that exists, includes it. It can do anything, but is expected to
    somehow provide the dependency so that a second find_package will
    find it and then proceed as if it were a system install.

  • I've implemented these scripts so far for Imath, OpenEXR,
    OpenColorIO, fmt, and robin-map, that download, build, install the
    package in OIIO's build area. More to come later?

  • This is really simple with a new build_dependency_with_cmake macro,
    much simpler than ExternalProject_Add, as I've seen it used
    elsewhere.

  • Just look at any of the new build_blah.cmake files to see how simple
    it is for each new dependency we set up this way (a minimal sketch
    follows this list).

  • By default, pre-installed packages it can find always take precedence
    over building locally. So if you have all the dependencies already
    installed, none of this should behave any differently than before.
    But there are variables that let you override on a package-by-package
    basis, giving the option of never building locally, building locally
    if the package is missing, or forcing a local build to always happen.
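
For illustration, here is a hedged sketch of what one of these build_<pkg>.cmake recipes looks like in spirit, using a hypothetical package "Foo" (the repo URL, tag, and the exact keyword arguments of build_dependency_with_cmake shown here are assumptions, not the PR's literal code):

    # Hypothetical build_Foo.cmake -- download, configure, build, and install
    # "Foo" into OIIO's local dependency area, then ask checked_find_package
    # to re-run find_package so the fresh copy is found like a system install.
    build_dependency_with_cmake (Foo
        GIT_REPOSITORY https://github.com/example/Foo    # assumed repo URL
        GIT_TAG v1.2.3                                   # assumed version tag
        CMAKE_ARGS
            # Build a static lib that can be absorbed into libOpenImageIO.
            -D BUILD_SHARED_LIBS=OFF
            -D CMAKE_POSITION_INDEPENDENT_CODE=ON
        )
    # Tell checked_find_package to run find_package again to pick it up.
    set (Foo_REFIND TRUE)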


Various details:

A bunch of cmake things (including checked_find_package) have been
moved into a new file, dependency_utils.cmake.

build_Imath.cmake, build_OpenColorIO.cmake, build_OpenEXR.cmake,
build_Robinmap.cmake, and build_fmt.cmake implement local builds of
those packages. They're very simple, and lean heavily on common
infrastructure of build_dependency_with_cmake, which also can be found
in dependency_utils.cmake.

Robinmap and fmt are extra simple because we use them as header-only
libraries.

For Imath and OpenEXR, I build them as static libraries, so they will
be incorporated into libOpenImageIO (and/or _Util) libraries to be
totally internal to them, there should be no symbols exposed outside
our libraries. This should mean that the resulting libOpenImageIO
should be perfectly safe to link in an application that also links
against OpenEXR, Imath, or OpenColorIO, even different versions
thereof, without any interference. Note that none of those packages
are used in our public APIs, only internally.

OpenColorIO was a little trickier. It builds its own dependencies as
static libraries that are internalized, but OCIO itself is a dynamic
library. So we end up having to make it part of our install, but I use
OCIO's build system to make a custom symbol namespace and a custom
library name, so it still should not interfere with any other OCIO
linked into the application.

We'll see how it goes for future dependencies we want to add. Header-only
use, static libraries incorporated and hidden, and a dynamic library
renamed and given a custom symbol namespace are all techniques that
work well. I'm not sure I'd advocate doing local builds of any
dependency that we can't incorporate in one of these ways, but I guess
we'll cross that bridge when we get to it.

New option_utils.cmake has two handy new utilities: set_cache() is
much like the built-in set() when making a cache variable, and
set_option() is much like the built-in option(). The biggest difference
is that both allow an environment variable of the same name, if it
exists, to supply the default value. This is something that cmake does
with many of its own controls, like CMAKE_BUILD_TYPE, but the built-in
set() and option() make no provision for letting users do it.
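
A minimal sketch of the env-var-default idea (not the actual implementation, which may differ in detail):

    # Like the built-in option(), but if an environment variable of the same
    # name exists, use it as the default value instead of the hardcoded one.
    macro (set_option name description default)
        if (DEFINED ENV{${name}})
            set (_default "$ENV{${name}}")
        else ()
            set (_default "${default}")
        endif ()
        option (${name} "${description}" "${_default}")
    endmacro ()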

checked_find_package() has moved to dependency_utils.cmake, and has
been enhanced to take several new options, and also so that if the
enclosed find_package() fails and there is a
src/cmake/build_PKG.cmake, it will run it to build the dependency
itself in the build area. If that build_PKG sets a variable called
PKG_REFIND, it will try find_package again to find the one just built.
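
In outline, the fallback flow is roughly this (a simplified sketch; _version_min stands in for whatever minimum version the caller requested, and the real macro handles many more options):

    # Simplified core of checked_find_package (pkgname ...):
    find_package (${pkgname} ${_version_min})
    if (NOT ${pkgname}_FOUND
        AND EXISTS "${CMAKE_SOURCE_DIR}/src/cmake/build_${pkgname}.cmake")
        # Not found (or too old): build it ourselves in the build area.
        include ("${CMAKE_SOURCE_DIR}/src/cmake/build_${pkgname}.cmake")
        if (${pkgname}_REFIND)
            # The local build asked us to look again; now it should be found.
            find_package (${pkgname} ${_version_min} REQUIRED)
        endif ()
    endif ()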

build_dependency_with_cmake() is given a git repo and tag, and
basically clones the repo, checks out the tag, configures, builds, and
installs it (all in our own build area).
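
The heart of that amounts to little more than the following (a hedged sketch: argument parsing, error checking, and platform details are omitted, and the _pkg_* and *_LOCAL_*_DIR names follow the style visible in the diff but are partly illustrative):

    # Clone the requested tag, then configure, build, and install it into
    # OIIO's local dependency area.
    execute_process (COMMAND ${GIT_EXECUTABLE} clone --depth 1
                     --branch ${_pkg_GIT_TAG} ${_pkg_GIT_REPOSITORY}
                     ${${pkgname}_LOCAL_SOURCE_DIR})
    execute_process (COMMAND ${CMAKE_COMMAND}
                     -S ${${pkgname}_LOCAL_SOURCE_DIR}
                     -B ${${pkgname}_LOCAL_BUILD_DIR}
                     -D CMAKE_INSTALL_PREFIX=${${pkgname}_LOCAL_INSTALL_DIR}
                     ${_pkg_CMAKE_ARGS})
    execute_process (COMMAND ${CMAKE_COMMAND}
                     --build ${${pkgname}_LOCAL_BUILD_DIR}
                     --target install)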

@lgritz lgritz added the "build / testing / port / CI" label (Affecting the build system, tests, platform support, porting, or continuous integration.) Apr 24, 2024
@lgritz lgritz marked this pull request as draft April 24, 2024 17:37
@lgritz lgritz changed the title from "draft: build: new system to build dependencies locally if needed" to "build: new system to build dependencies locally if needed" Apr 24, 2024
@lgritz
Collaborator Author

lgritz commented Apr 24, 2024

I've marked this as a draft because I noticed that I'm having trouble building opencolorio in this manner on Windows. Needs some minor fixes, but then I'll remove draft status. But I'm confident this is 95% in final form, so by all means start taking a look if you're curious about the approach I'm trying.

@ssh4net
Contributor

ssh4net commented Apr 26, 2024

Is it mandatory from now on to build everything as static libs, or can OIIO and the required libs still be built as dynamic libs as before?

@fpsunflower
Contributor

I like the idea of making the build easier -- at the same time, it feels to me like this is kind of reinventing the wheel. I worry that this setup will be brittle and need constant tweaking (though maybe that is par for the course no matter what we do).

For someone new to the project, I think it would be nicer to see calls to the ordinary cmake find_package than discover we've rolled our own flavor that they need to study. This would likely be easier for package maintainers as well.

As an alternative -- would it be worth seeing how many custom "find*.cmake" we can actually remove if we rely on new enough versions of the target libraries (and cmake itself)? How close to a fully "modern cmake" setup could we get?

@zachlewis

I've only tested on linux so far, but it works like a dream. I can't tell you how much easier this makes things for us...!

@fpsunflower
Contributor

fpsunflower commented Apr 30, 2024

I did a bit of reading and some people suggest using the FetchContent family of functions in cmake.

Did you investigate those? Is there a reason why you rolled your own version?

I'm curious what the pros and cons are. From a cursory glance, it seems to be designed for this exact problem.

@fpsunflower
Contributor

Another cmake feature that came up when reading up a bit on this was ExternalProject. I'm not sure I fully grok the differences between that and FetchContent but it seems like another option that might be relevant here.

Likewise there is also ExternalData which could be handy to use for some of the testsuite data (which is a bit clunky to download and setup manually).

@lgritz
Collaborator Author

lgritz commented Apr 30, 2024

@ssh4net:

Is it mandatory from now on to build everything as static libs, or OIIO, and required libs as before can be built as dynamic libs?

None of this is mandatory. It only kicks in if important dependencies are not found, or are not adequate versions. I prefer static libs for this particular use because I don't want OIIO to need to also install dynamic libs of the dependencies, which might interfere with system libraries.

@zachlewis:

I've only tested on linux so far, but it works like a dream. I can't tell you how much easier this makes things for us...!

Thanks! I'm still working out some issues for the Windows CI, but that's less that it's difficult and more that I haven't had the time to chase it down in the last several days.

@fpsunflower:

This is my 4th or 5th rewrite of this, including doing full implementations using both the ExternalProject and FetchContent families of functions (and the CPM project as well). Every one had its own set of awkward tradeoffs, and I didn't find any of them as simple and clean as what I've done here. You can see those others implemented variously in OpenEXR (which uses FetchContent, I think, to get Imath) and OpenColorIO (which uses ExternalProject for all its dependencies). Both seem to require considerable setup before the ExternalProject/FetchContent, and considerable extra work after, to fix up everything just right. Whereas I feel like my stab at it is a really minimal way to get EXACTLY what you would get if it found the project externally -- in fact, after doing a build, it just does another find_package, so it really does read the exported cmake files of a clean build of the dependency (none of the others work that way).

Here is OpenColorIO's code to build OpenEXR when not found: https://github.com/AcademySoftwareFoundation/OpenColorIO/blob/main/share/cmake/modules/install/InstallOpenEXR.cmake
It's based on ExternalProject_Add and you can see how much ridiculous cruft must surround it. Every one of their dependencies needs a similarly impenetrable file. Compare that to my build_OpenEXR.cmake in this PR, and you can perhaps understand how I arrived at this solution after trying the others.

Note that the general setup I have where it looks externally, then if not found, runs whatever is in the build_blah.cmake, can do anything in that build file. Nothing stops one of the build_blah files from doing ExternalProject or FetchContent or CPM or anything else if it wants. So it's trivial to switch the methodology of that part if we decide down the road that I'm just totally misunderstanding those and they are easy after all. I allow for the possibility that I (and maybe OpenColorIO's authors and those of every other project I looked at) have misunderstood ExternalProject/FetchContent and made it overly complicated. I could be wrong.

@lgritz
Collaborator Author

lgritz commented Apr 30, 2024

One issue with FetchContent/ExternalProject is that it's very awkward if the dependency is cmake based, and moreover must have its cmake build system carefully written to be safely included as a subproject. My scheme makes no such assumptions.

@fpsunflower
Contributor

I see. That's disappointing to hear. From the advertising it sounded like those commands were solving the exact same problem in a built-in way that would (hopefully) be more familiar to anyone doing the build and more likely to work across platforms out of the box.

From what I was reading -- the whole point of ExternalProject is that you get to invoke cmake again and control its environment exactly. I'm not sure why it would require anything special from the target project (in fact it claims to work with any build system, not just cmake). Once the recursive cmake configure, build, install succeeds, it should be in the build folder as if it had been installed on the system and should be able to be picked up as such. The only catch is that you are getting a completely independent cmake session, so any arguments you want to forward (like compiler, build type, etc...) you have to pass in manually. That could definitely get verbose, although I guess some of that could be automated.
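
(For reference, a bare-bones ExternalProject usage, as a generic sketch rather than anything from this PR, looks like the following. The catch raised later in the thread is that the download and build happen at build time, not configure time, so find_package cannot see the result during the same configure pass.)

    include (ExternalProject)
    # Generic example: build and install fmt into a private prefix at build time.
    ExternalProject_Add (fmt_external
        GIT_REPOSITORY https://github.com/fmtlib/fmt.git
        GIT_TAG        10.2.1
        CMAKE_ARGS     -D CMAKE_INSTALL_PREFIX=${CMAKE_BINARY_DIR}/deps/dist
                       -D CMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}
                       -D FMT_TEST=OFF
        )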

FetchContent, on the other hand, definitely seems a bit more fragile because, if I understood it correctly, all it does is add the fetched projects' targets to yours. So I could definitely see that a not-so-well-behaved project could make a mess. But for a well-behaved project with modern cmake throughout, you get (in theory) a clean dependency graph of all the targets, and they all execute within the same cmake session (for better or worse).

In any case, didn't mean to derail the conversation -- it could be that over time the official cmake way of doing things will mature and will be able to be swapped in as you said.

@lgritz
Collaborator Author

lgritz commented Apr 30, 2024

So here is my understanding:

FetchContent just gets stuff and puts it in your tree. It doesn't invoke any build as far as I know; you'd need to add all that logic. But it does it immediately at configure time, so it may be OK for a header-only library. I believe that the integration between FetchContent and find_package is fairly new, and we can't count on a cmake that recent unless we make our cmake minimum much higher than any of the other related projects.
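
(For comparison, the typical FetchContent pattern is a generic sketch like this, not code from this PR:)

    include (FetchContent)
    FetchContent_Declare (fmt
        GIT_REPOSITORY https://github.com/fmtlib/fmt.git
        GIT_TAG        10.2.1
        )
    # Downloads at configure time and adds the project's targets to this build.
    # Only CMake 3.24+ can first try find_package() via FIND_PACKAGE_ARGS.
    FetchContent_MakeAvailable (fmt)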

ExternalProject gives you the works, including build steps, etc., but the problem is that the download doesn't happen until the build step itself, which is problematic for two reasons -- first, many distros and build setups have firm rules against needing internet access beyond the configure step, and second, it means that you can't use the targets or anything else you would've learned from the config files for the rest of your build step.

I also tried CPM, which at first appeared to be exactly the wrapping and simplification I wanted, but I found that it really only works for very well structured cmake projects, and I had trouble getting it all quite right with the interdependencies of OpenEXR, Imath, and OpenColorIO.

What I found trying to explore these in detail is that they are pretty good under certain conditions: (1) well structured dependencies that have already made themselves subproject-friendly, (2) independent dependencies, so you don't need to worry much about a package that is both a direct dependency of yours and also of other dependencies, (3) your aim is unconditional building of dependencies rather than the somewhat complex set of rules we have about preferring system installs. I just didn't find the level of control I preferred, with all the little features I wanted, in a way that could be expressed compactly for each dependency and didn't require me to fight to find exactly the right secret incantation to make it work. So I wrote my own from scratch that covers exactly what we need, with a very simple use-site notation.

I dunno, I could be wrong. But I couldn't make them work without feeling like I was jumping through some ugly hoops. What I would like to do is get this scheme integrated, and then it is easy for us to experimentally swap out any individual dependency's build_dependency_with_cmake call with a FetchContent or ExternalProject to see if that works better.


I love that you're pulling in v2.3.2 🤘

Just a thought -- it might be nice if there were a way to optionally build + bundle PyOpenColorIO under the OpenImageIO namespace; i.e., from OpenImageIO import PyOpenColorIO as ocio or somesuch (when building OIIO python bindings, obviously). It almost seems like a missed opportunity not to bundle PyOpenColorIO, considering the trouble you've gone through to get to this point. I think it could be a handy way to "backdoor" a newer version of PyOpenColorIO into a DCC's environment than whatever version of PyOpenColorIO the DCC itself ships with, if any. If that makes sense.

Collaborator Author

So far, I've been thinking of it strictly in terms of building things needed for OIIO internals; I hadn't really considered intentionally making them externally visible in any way, and I'm less sure how to make that "secure" in terms of not breaking anything else on people's systems.

VERSION_MIN 3.1
PRINT IMATH_INCLUDES Imath_VERSION)
VERSION_MIN 3.1
BUILD_LOCAL missing
Contributor

Just an observation: BUILD_LOCAL could potentially be an option for Windows builds and be used for REQUIRED dependencies. It could simplify the process discussed in #3816 and avoid falling back on Bash, vcpkg, etc. Do you see it as a use case?

Collaborator Author

Yes, definitely

# Install the project, unless instructed not to do so
if (NOT _pkg_NOINSTALL)
execute_process (COMMAND ${CMAKE_COMMAND}
--build ${${pkgname}_LOCAL_BUILD_DIR}
Contributor

On multi-config generators (MSVC), CMAKE_BUILD_TYPE is ignored. In this case we should specify the build type with the --config parameter to avoid mixing release and debug builds.
Imath-3_1_d.lib(half.obj) : error LNK2038: mismatch detected for '_ITERATOR_DEBUG_LEVEL': value '2' doesn't match value '0' in argparse.obj
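
Something along these lines, as a sketch using the variable names visible in the hunk above (the install target invocation is an assumption):

    # On multi-config generators the configuration is chosen at build time,
    # so pass --config explicitly rather than relying on CMAKE_BUILD_TYPE.
    execute_process (COMMAND ${CMAKE_COMMAND}
                     --build ${${pkgname}_LOCAL_BUILD_DIR}
                     --config ${CMAKE_BUILD_TYPE}
                     --target install)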

Collaborator Author

Yes, thanks, I believe I've already done that in the version I'm currently using/editing, but I haven't pushed to have it append to this PR for several days. I still am having some Windows issues I was trying to work through before updating.

Contributor

I hesitated on passing the same CMAKE_BUILD_TYPE while building required dependencies - as these are well tested external dependencies, and in the majority of cases building them in Release mode should be preferred. Do you think --config should take a separate build type to avoid long reconfiguration times if the main CMAKE_BUILD_TYPE gets modified?

Collaborator Author

I think you have a good point. I believe there are two use cases here: (1) even when building a debug version of OIIO, you want optimized dependencies because those are dependable anyway; (2) you're doing a deep debug and the problem may indeed be one where you need to trace into a dependency.

I think you're right, though, that (1) is probably the more common case. So maybe I'll add a separate control for the build mode of any locally built dependencies.

Or maybe it's moot? I propose the following hypothesis: the kind of users who need and will use the auto-built dependencies are probably doing a one-time build of our package that they're maybe not too familiar with, whereas the kinds of people who are likely to want a debug build of OIIO are probably more sophisticated users who already have separately installed copies of the dependencies. The strong circumstantial evidence is that anybody building OIIO over and over again probably doesn't want a fresh download and build to happen for dependencies every time they clean the build area of their OIIO. So it's also possible that this doesn't really matter, because there is rarely an intersection between people doing debug builds and who need the on-the-fly dependency builds.

I'll add some control for this once I work out my Windows woes.

CMAKE_ARGS
-D CMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}
# Build static libs
-D BUILD_SHARED_LIBS=OFF
Contributor

On MSVC - building only a static lib with proj_add_compile_definitions (OPENEXR_DLL) added in src/cmake/externalpackages.cmake results in unresolved symbols:
error LNK2019: unresolved external symbol "__declspec(dllimport) ...
Commenting out this line resolves the problem, but it seems the correct way to handle this is to make the definition conditional on whether the BUILD_LOCAL missing option is active.
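
A sketch of the kind of conditioning being suggested (the OpenEXR_BUILT_LOCALLY flag is hypothetical, and the project's own wrapper macro may be used instead of the stock command):

    # Only tell MSVC to expect dllimported OpenEXR symbols when linking
    # against a shared OpenEXR, not a locally built static one.
    if (MSVC AND NOT OpenEXR_BUILT_LOCALLY)   # hypothetical flag
        add_compile_definitions (OPENEXR_DLL)
    endif ()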

Collaborator Author

Yeah, thanks, found that yesterday, it was really messing with me!

Contributor

Perfect! Glad you've tracked this one down.

@lgritz lgritz marked this pull request as ready for review May 21, 2024 21:07
@lgritz
Collaborator Author

lgritz commented May 21, 2024

Updated and changed from draft status to "ready to review".

I finally have it working on all platforms including Windows (though with the compromise of using dynamic libs on Windows for now -- maybe somebody who knows more than me can return later to try making it work with static libs on Windows also, but this is fine for now).

So at this point, all of Imath, OpenEXR, OpenColorIO, fmt, and robin-map will download and build (at fairly modern versions) if not found on the system at OIIO-build time or not as recent as our minimum required versions. When not on Windows, they all do so with static libs, so they don't expose the rest of the system to any libraries or symbols that may conflict with other versions of those dependencies. On Windows, it has to use dlls for now, but I do give them custom library names and symbol namespaces so that, I hope, they don't conflict there, either.

Future expansion might lean toward increasing the set of packages for which we add the capability to automatically build if not found.

# If the local build instructions set <pkgname>_REFIND, then try a find
# again to pick up the local one, at which point we can proceed as if
# it had been found externally all along.
if (${pkgname}_REFIND)
Contributor

Just wanted to say - this is by far my favorite line. Once the package is found, the (CMake) configuration time gets a significant boost on subsequent runs compared to FetchContent. Nothing is touched afterwards during the build, thanks to find_package. Well done!

Collaborator Author

Thanks. It's one step better than that, even: By default, the download and build of the dependency will occur in OIIO's 'build/deps' area, but you can override that with the CMake variable OpenImageIO_LOCAL_DEPS_ROOT; or, if that is not explicitly set, it will check an environment variable of that same name.

So if you set that to someplace outside of the build area, then the dependency builds will persist even if you blow away your OIIO build area or do a 'make clean', or if you switch build modes or other options. (But at the obvious cost of it just reusing that build; it is not smart enough to know it should rebuild if you change the OIIO build files to indicate that it should be pulling a different version of the dependency.)
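
That override chain boils down to roughly this (a sketch; the variable name comes from the comment above, and the default path is assumed):

    if (NOT DEFINED OpenImageIO_LOCAL_DEPS_ROOT)
        if (DEFINED ENV{OpenImageIO_LOCAL_DEPS_ROOT})
            # Fall back to an environment variable of the same name.
            set (OpenImageIO_LOCAL_DEPS_ROOT "$ENV{OpenImageIO_LOCAL_DEPS_ROOT}")
        else ()
            # Default: keep local dependency builds inside OIIO's build area.
            set (OpenImageIO_LOCAL_DEPS_ROOT "${CMAKE_BINARY_DIR}/deps")
        endif ()
    endif ()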

"Build type for locally built dependencies")

if (MSVC)
# I haven't been able to get Windows local dependency builds to work with
Contributor

What problems did you run into while building static libs on Windows? Was it not worth the effort because it significantly complicated the process? Is it related to symbol visibility?

Collaborator Author

I hesitate to give much detail here, because my recollection is incomplete, and any problems I had are likely due to my lack of experience with Windows' quirks. I feel confident that somebody with real Windows development experience could probably get it working with static libs rather quickly.

"Additional dependencies to consider optional (semicolon-separated list, or ALL)")
set_option (${PROJECT_NAME}_ALWAYS_PREFER_CONFIG
"Prefer a dependency's exported config file if it's available" OFF)
set_cache (${PROJECT_NAME}_DEPENDENCY_BUILD_TYPE "Release"
Contributor

With the "Release" option - I'm seeing an error while building a debug version on Windows: cmake --build . --config Debug

Imath_v_3_1_10_OpenImageIO_v2_6_2.lib(half.obj) : error LNK2038: mismatch detected for 'RuntimeLibrary': value 'MD_DynamicRelease' doesn't match value 'MDd_DynamicDebug'

Should the environment variable ${PROJECT_NAME}_DEPENDENCY_BUILD_TYPE be defined first and also be used in --config?
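
That is, something like the following sketch, reusing the variable names shown in the hunk above:

    # Drive the dependency build with its own build type, for both the
    # configure step and the multi-config --config selection.
    execute_process (COMMAND ${CMAKE_COMMAND}
                     --build ${${pkgname}_LOCAL_BUILD_DIR}
                     --config ${${PROJECT_NAME}_DEPENDENCY_BUILD_TYPE})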

Collaborator Author

Ah, I see. I was thinking that mostly people would want the dependencies to be optimized even if they were building a debug OIIO. But now I see that this is my Unix bias speaking -- I think maybe that in Windows land, debug and optimized builds are not freely mixable?

Maybe it's better if we initialize the build mode of dependencies to default to that of the present OIIO build. (But still allow override.)

Collaborator

I mix debug and release all the time on Windows. The thing to keep in mind is that there are various types of debugging flags. It's the mixing of the runtime that is a problem. Making a debug build with no optimization and including symbols (what we normally do on unix) won't cause issues.
The frustrating part is that you need to make sure that all statically linked dependencies follow the same convention and most projects default to the convention that "debug" builds also use the debug runtime.

So one fix might be to modify the OIIO cmake so that debug builds don't link to the debug runtime. That is fine for shared libraries of OIIO. But if OIIO is used as a static lib, the consumer of OIIO will now also be required to make the same change and I don't think OIIO should impose this.
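
(One generic way to express that fix in CMake 3.15+, as a sketch and not something this PR does:)

    if (MSVC)
        # Use the release (non-debug) MSVC runtime even in Debug builds, so a
        # debug OIIO can link against release-built static dependencies.
        set (CMAKE_MSVC_RUNTIME_LIBRARY "MultiThreadedDLL")   # always /MD
    endif ()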

We could add a cmake option to pick whether debug builds should use debug runtimes and have it default to yes so by default this works as most users would expect. However, if you're the type of user that uses static libs and knows how to make debug builds that don't use debug runtimes, it's not that hard to just run ccmake and change the "release" mode to use /Od and other debug flags.

So all this to say I agree with lg's above proposal.

Collaborator Author

I pushed an update that changes the default build mode for dependencies to match that of the main project.

The different Windows debugging incompatibilities are a bit over my head, so I'll leave it to somebody who knows that platform better to come back and enhance this with more flexibility later.

@cedrik-fuoco

cedrik-fuoco commented May 27, 2024

Hello! Just wondering what the reasons are that Conan or Vcpkg were dismissed as a solution for managing dependencies?

@lgritz
Collaborator Author

lgritz commented May 27, 2024

Externally built dependencies, or automatic locally built dependencies?

For external, we don't dismiss either. In fact, we use vcpkg on our Windows CI, and a little bit of Conan on our Linux CI (for some packages). The "local build" facilities are for when they aren't found externally using whatever dependency management the user or their system has set up.

I have to admit that I'm not crazy about vcpkg in practice for the CI because, as far as I know, there are no cached binaries, no easy way to have it build only the mode you care about (e.g. for our CI, I want to build only release mode, but I seem to have no choice but to burn GHA minutes for vcpkg building debug even though I'm not going to use it), and there's no fine control over which version of each dependency I get or what other build-time controls are used. I'm not sure how (or if) those can be overridden when invoking vcpkg, without doctoring the individual recipe files of every dependency.

But if you're happy with whatever version of the dependencies vcpkg is pegged to, and the build options it uses, and don't mind that it'll build both debug and optimized (which is fine if you're going to do that build once and then use it for weeks or months), it's totally fine.

Conan -- Again, nothing wrong with it. I'm not experienced enough with it to know if it's any better in terms of easy command-based overriding of version or build options of the dependencies without doctoring recipe files or whatnot. I did look at it potentially as the basis of this PR, and to be honest, I don't remember the specific reason I chose to mostly roll my own. I think maybe it just didn't give me all the controls I needed, or maybe it was my fault and I didn't investigate deeply enough, or perhaps I thought I could cobble something together faster than it would take me to deeply learn the Conan way.

I wish I had a more principled answer to explain. But the long and short of it was that I looked at several solutions: cmake ExternalProject_Add, CPM, Conan, and some others. I found strengths and weaknesses in them all and it was hard to decide among them, and in the end I broke the paralysis by taking a shot at writing the dead simplest solution I could think of to exactly solve our use case with no additional complexity or cognitive load.

@cedrik-fuoco

Thanks for your answer! I think the complexity and cognitive load are very valid reasons. My intention is not to change your mind, but I was curious about the thoughts behind the choices. I think that needing finer control with vcpkg or Conan is a common issue.

I have to admit that I'm not crazy about vcpkg in practice for the CI because, as far as I know, there are no cached binaries, no easy way to have it build only the mode you care about (e.g. for our CI, I want to build only release mode, but I seem to have no choice but to burn GHA minutes for vcpkg building debug even though I'm not going to use it), and there's no fine control over which version of each dependency I get or what other build-time controls are used. I'm not sure how (or if) those can be overridden when invoking vcpkg, without doctoring the individual recipe files of every dependency.

You can use the default triplets (or the community triplets) to build in Release or Debug only. That would definitely save you those GHA minutes.

I don't recall if it is enabled by default, but vcpkg does have a local binary cache that can save you a lot of build time.

But if you're happy with whatever version of the dependencies vcpkg is pegged to, and the build options it uses, and don't mind that it'll build both debug and optimized (which is fine if you're going to do that build once and then use it for weeks or months), it's totally fine.

I had similar issues regarding transient dependencies. For example, if you have multiple dependencies that use zlib, I don't recall a way to control the version of zlib for all your dependencies - but you can control the version of your direct dependencies using manifest mode (within the versions that are made available by the package).

Conan -- Again, nothing wrong with it. I'm not experienced enough with it to know if it's any better in terms of easy command-based overriding of version or build options of the dependencies without doctoring recipe files or whatnot. I did look at it potentially as the basis of this PR, and to be honest, I don't remember the specific reason I chose to mostly roll my own. I think maybe it just didn't give me all the controls I needed, or maybe it was my fault and I didn't investigate deeply enough, or perhaps I thought I could cobble something together faster than it would take me to deeply learn the Conan way.

I've worked with Conan for a while, and while I like the concept of it, it can feel like you don't have all the low-level control that you would like sometimes (e.g. the way the CMake generators are implemented).

With Conan, you can control the version of your direct and transient dependencies - which is a big plus over Vcpkg, but it does have a bigger cognitive load.

@lgritz
Collaborator Author

lgritz commented Jun 2, 2024

You can use the default triplets (or the community triplets) to build in Release or Debug only. That would definitely save you those GHA minutes.

@cedrik-fuoco Could you elaborate? What do I need to do in particular to a line like

vcpkg install tiff:x64-windows

to make it only build the release version?

@cedrik-fuoco

cedrik-fuoco commented Jun 2, 2024

You can use the default triplets (or the community triplets) to build in Release or Debug only. That would definitely save you those GHA minutes.

@cedrik-fuoco Could you elaborate? What do I need to do in particular to a line like

vcpkg install tiff:x64-windows

to make it only build the release version?

vcpkg install tiff:x64-windows-release or vcpkg install tiff --triplet x64-windows-release (same behavior, different syntax) @lgritz

It uses the Community triplets, but they come with Vcpkg: https://github.com/microsoft/vcpkg/tree/master/triplets/community

@lgritz
Collaborator Author

lgritz commented Jun 2, 2024

@cedrik-fuoco Thanks, that's very helpful!

A few minor fixes for things I noticed as I started to actually use
these new utilities in other branches I'm working on.

Signed-off-by: Larry Gritz <lg@larrygritz.com>