I have a psychological tic that makes me reluctant to use large libraries (like GLib or Boost) in lower-level languages like C and C++. My thinking goes:
A large library will, from the code performance perspective:

- take more space on disk and in RAM, if it needs runtime binaries (some libraries, like Boost, don't require runtime binaries; they're "header-only"). While the OS will load only the actually used parts of the library into RAM, it can still load more than you need, because the granularity of what gets loaded is the page size (only 4 KB on my system, though);
- take more time for the dynamic linker to load, if, again, it needs runtime binaries. Each time your program starts, the dynamic linker has to match every function you call from the external library with its actual address in memory. That takes some time, but only a little (it does add up at the scale of loading many programs, such as the startup of a desktop environment, but you have no choice there);
- and yes, it will cost one extra jump and a couple of pointer adjustments at runtime each time you call an external function of a shared (dynamically linked) library. (A sketch illustrating these costs follows this list.)
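To make the first two costs concrete, here is a minimal sketch for a POSIX system: it prints the page size (the granularity at which library code is mapped into RAM) and then resolves one symbol by hand with dlopen/dlsym, which is essentially the per-function lookup the dynamic linker performs for you at load time. The library name "libm.so.6" is a Linux-specific assumption; substitute any shared library you have.

```cpp
// Build on Linux with: g++ linker_costs.cpp -o linker_costs -ldl
#include <dlfcn.h>    // dlopen, dlsym, dlclose, dlerror (POSIX)
#include <unistd.h>   // sysconf
#include <cstdio>

int main() {
    // Demand paging works in page-sized chunks, so even a single
    // used function can pull a whole page of library code into RAM.
    long page = sysconf(_SC_PAGESIZE);
    std::printf("page size: %ld bytes\n", page);

    // Looking up a symbol by name is the per-function work the
    // dynamic linker does for every external function you import.
    void *lib = dlopen("libm.so.6", RTLD_LAZY);  // Linux-specific name
    if (!lib) { std::fprintf(stderr, "%s\n", dlerror()); return 1; }

    auto my_cos = reinterpret_cast<double (*)(double)>(dlsym(lib, "cos"));
    if (my_cos) {
        // This call goes through a pointer: the same kind of
        // indirection the PLT adds to every shared-library call.
        std::printf("cos(0) = %f\n", my_cos(0.0));
    }

    dlclose(lib);
    return 0;
}
```

(On glibc systems you can also run a program with LD_BIND_NOW=1 set, which forces all of that symbol resolution to happen up front at startup and makes the load-time cost easier to observe.)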
And, from a developer's performance perspective, it will:

- add an external dependency. You will be depending on someone else. Even if that library is free software, modifying it will cost you extra effort. Some developers of veeery low-level programs (I'm talking about OS kernels) hate to rely on anyone; that's their professional quirk. Thus the rants.
However, that can also be considered a benefit. If other people are used to Boost, they will find familiar concepts and terms in your program and will be more effective at understanding and modifying it.
Bigger libraries usually contain library-specific concepts that take time to understand. Consider Qt: it has signals and slots, plus the moc-related infrastructure. Compared to the size of the whole of Qt, learning them takes only a small fraction of your time. But if you use only a small part of such a big library, that overhead can be an issue; a minimal example of what such a concept looks like is sketched below.
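As a concrete taste of that vocabulary, here is a minimal counter class in the spirit of the introductory example in Qt's documentation. Everything Qt-specific in it (Q_OBJECT, the signals and slots sections, emit) only compiles because moc generates supporting code behind the scenes:

```cpp
#include <QObject>

// Q_OBJECT is the hook for moc, Qt's meta-object compiler: it scans
// the class and generates the extra C++ that implements the signal
// and the runtime meta-object. A plain C++ compiler alone can't
// build this file.
class Counter : public QObject {
    Q_OBJECT
public:
    int value() const { return m_value; }

public slots:                   // slots are ordinary member functions...
    void setValue(int value) {
        if (value != m_value) {
            m_value = value;
            emit valueChanged(m_value);   // ...which may emit signals
        }
    }

signals:                        // declared here, implemented by moc
    void valueChanged(int newValue);

private:
    int m_value = 0;
};
```

Wiring two counters together is then a single line, `QObject::connect(&a, &Counter::valueChanged, &b, &Counter::setValue);`, after which `a.setValue(12)` updates `b` as well. None of this is hard, but it is exactly the kind of library-specific vocabulary you must absorb before such code reads naturally.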