It isn’t really that hard; it is just extra work.
One option is, instead of shipping all your .o files, to merge all the non-LGPL objects into a single object using `ld -r` (a “partial link”). That way you don’t have to ship a complex script or Makefile to link many object files, and you hide more internal details of the non-LGPL part.
You could use an approach like this:
(1) have an initial link step in your build process which links all the non-LGPL object files into one
(2) have a second link step which statically links the output of (1) with all the LGPL static library archives to produce an executable
(3) still release the executable output by (2) as a release artefact as normal
(4) separately, create an archive (ZIP/TAR/etc) containing all the object file inputs to (2), and release that archive as a release artefact as well
(5) also include in the archive (4) a shell script which does the equivalent of (2) at the end-user site
(6) also for each LGPL library in the archive (4), include its source code
Now, for each released version, you make available both the executable from (2) (maybe incorporated into an installer or package) - and also the archive from (4). Almost nobody is ever going to download and use that “link kit” archive. But by making it available, you are legally complying with your obligations under the LGPL.
So this is the thing - there is nothing technically difficult here, it is just extra work to set it up, maintain it, make sure it doesn’t break, and fix it when it does. And all that extra work is being done not for any real end-user requirement, just for legal compliance. You can see why a more attractive option would be either to just use dynamic linking (if possible), or else to look for an alternative under a more permissive license - either way, you avoid all this extra busywork.