While trying to fit an application into a limited space, I noticed that the compiler/linker does not appear to remove functions that are never called.
I have a number of C files that are shared between several applications, but not all of the functions in them are called by every application.
If I comment these functions out, the application shrinks, so I am pretty sure the linker is not removing these (or at least some of these) unused functions. This is regardless of the optimisation level I have set for the compiler.
Ideally I want the compiler/linker to remove these functions automatically.
I have trawled through the options on armlink but I can't seem to find what I want.
I am pretty sure I am missing something obvious, so can anyone explain how I achieve this?
Thanks
Maybe you have sections with "--keep" enabled? Check your linker settings.
What I do in my projects is:
In the Projects->Options->C/C++ tab, 'Misc Controls' I add:
--feedback=!UnusedFunctions!.Txt
In the Projects->Options->Linker tab, 'Misc Controls' I add:
--feedback=!UnusedFunctions!.Txt
It's worth building the project a few times to get rid of all of the unused functions, and of the functions that are only called by unused functions.
Works very well.
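For reference, outside of uVision the same mechanism can be driven from the command line. This is only a sketch, not a verified build recipe; the file names are placeholders, and the exact option syntax should be checked against the RVCT/armcc documentation for your tool version:

```
armcc   --feedback=UnusedFunctions.txt -c utils.c -o utils.o
armlink --feedback=UnusedFunctions.txt main.o utils.o -o app.axf
```

The linker writes a list of unused functions into the feedback file during one build; on the next build the compiler reads that file and places those functions where the linker can discard them, which is why more than one build is needed.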
but I think the OP was after a way to have the tools do it all automagically...?
Thanks, Mr Shy, that is exactly what I wanted; it works well.
"but I think the OP was after a way to have the tools do it all automagically...?"
Reasonably automagic - You specify the option, you get the result.
I don't think uVision recognises waves of magic wands or telepathic requests yet.
Just a note, but traditional linkers work on object files, not on the functions/variables inside object files. So any support by a linker for stripping out individual functions or variables within an object file is quite an advanced feature.
Because of this, a traditional C run-time library (CRTL) often contains thousands of tiny C files, producing thousands of object files, just so the linker can select on a symbol-by-symbol level what to include in the final binary.
Isn't it straightforward for the compiler to place each function and static-storage-duration variable into its own section (or whatever the smallest unit the linker works with is called)? That way the linker can track dependencies and remove unused functions and variables. One argument against this that I can think of is that it would limit the compiler's optimisation scope to function boundaries. The mechanism used in RealView allows the compiler to extend optimisation scope to the whole source code and remove unused functions. This comes at the cost of an extra recompile.
Yes, it's possible to do this in either the linker or the compiler.
I just wanted to note that it does take some extra steps in the tools to perform removal of unused functions. And these extra steps can't be taken for granted. They obviously are very valuable for embedded programming since the available space is so important.
And apparently those files each contain more than one separately useful function. Sorry, but that's just not the right way to build a library. Particularly not if you're going to try and re-use that library of routines for real, so it'll be used with more than one version of one compiler.
Yes, some tool chains can be convinced to break open individual compiled source files and use only part of them. But it's not a good plan to rely on that feature being available everywhere.
The whole idea of a library, as opposed to just compiling a few big fat object files with your whole set of utility functions in there, is that it really does consist of those many small object files, and gives you a way of handling them efficiently. The linker picks from the library what it needs. No more, no less.
In the case at hand, one considerable advantage is that you would get the size reduction on every build, not just the last one in a long sequence of complete re-builds --- how will you ever build "official" program images if every re-compile may produce a different one?
"--- how will you ever build "official" program images if every re-compile may produce a different one?"
Different code sizes on different builds have been witnessed before; e.g.,
http://www.keil.com/forum/15677/
For our work, it is important to be able to re-build an image file for a project that is 100% identical to the one built at the time of release. I have projects going back way more than 15 years which I can still re-build and create a binary image for that is identical to the one produced originally.
Using the --feedback switch I stated earlier, I've found two builds to be enough for the compiler/linker to accumulate sufficient information for the smallest image to be created. So my release procedure currently requires me to build a few times until the binary image from the last build is identical to the previous one. At the moment, my projects are small enough and my PC fast enough for this to be an acceptable cost.
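That "build until the image stops changing" procedure is easy to script. Below is a sketch; `build_until_stable` is a made-up helper name, the build command and image file are placeholders, and the final line merely simulates a build for demonstration:

```shell
# Rebuild until two successive images are byte-identical.
# $1 is the build command to run, $2 the image file it produces.
build_until_stable() {
    prev=""
    for i in 1 2 3 4 5; do
        eval "$1"
        sum=$(sha256sum "$2" | cut -d' ' -f1)
        if [ "$sum" = "$prev" ]; then
            echo "image stable after $i builds"
            return 0
        fi
        prev=$sum
    done
    echo "image still changing after 5 builds" >&2
    return 1
}

# Simulated build: every run produces the same bytes, so the
# image is confirmed stable on the second pass.
build_until_stable 'echo stable > app.bin' app.bin
# prints "image stable after 2 builds"
```

Comparing hashes of successive images also gives an audit trail for the release procedure: recording the final hash makes it easy to verify years later that a re-build is 100% identical to the shipped binary.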