ggn wrote:a) Future proofness. The ELF format is going to stay for a long while, but what about a.out?
Don't know, but I don't really care.
That's exactly the reason this build was started: we don't know when/if a.out support will go away, and right now we don't really care either - brownout will work for decades. So far we have had to change zero lines of code across two major revisions of gcc.
ThorstenOtto wrote:Ppl have been using gcc 4.6.4 to compile Atari software for quite some years now, and in most cases there is still no urgent need to switch to a newer compiler version.
Nobody shoved this gcc down their throats either.
ThorstenOtto wrote: There are only some packages that would require C++11 or something, which 4.6.4 does not support. For those you could use gcc 7.x generating MiNT executables as before. There is no need to worry about removal of a.out support until
a) that really happens,
b) no one is able to bring it back in via patches, and
c) that compiler gains some new feature that your current software cannot do without.
That's your opinion and I don't really agree with it - firefighting problems only when they happen usually leads to really bad choices. Besides, why are we arguing hypothetically about something that's already done and released, i.e. a bridge between ELF binaries and TOS binaries?
Also again, people are free to do what they want. When we started this there was no gcc 6.2 or 7.1 version in sight, so this is how we proceeded. If someone had released their version(s) sooner you wouldn't see this at all.
ThorstenOtto wrote:As long as any of the above does not happen, you could still use gcc 7.x
Your approach has some serious disadvantages.
- you can't use *any* of the existing libraries. Everything has to be recompiled, including mintlib, gemlib, and every GNU package you can think of. I don't see any of those libraries.
Why is that a disadvantage? I thought the whole idea of using *nix stuff is that people are able to compile their own libraries.
Also, you're a bit incorrect there - mintlib is compiled by the build script. I also had a go at gemlib but got stuck somewhere. As gemlib and the others weren't a big concern for me, I simply gave up. But hey, my efforts are up here: https://bitbucket.org/ggnkua/bigbrownbuild/src
- if someone needs it they can take what I did and fix it further. Or they can start from scratch. You know, open source and all that?
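To give an idea of what "just recompile it" means in practice for a typical autoconf-based library: something along these lines. This is only a sketch - the m68k-ataribrown-elf tool triple and the install prefix below are assumptions from my own setup, so adjust to whatever your toolchain actually uses.

```shell
# Hypothetical cross-build of an autoconf-based library with the
# ELF toolchain. The tool triple and prefix are ASSUMPTIONS, not
# verified against any particular install.
export PATH="$HOME/brown/bin:$PATH"
./configure --host=m68k-ataribrown-elf --prefix="$HOME/brown/m68k-ataribrown-elf"
make
make install
```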
ThorstenOtto wrote:- ppl have been using rather old packages (zlib 1.2.5 etc) for quite some time now, without anyone trying to update them to new versions. What makes you think that this situation will suddenly change?
Again, I'm not forcing this build of gcc on anyone. This build was mostly done for the benefit of two people. We just decided it would be nice to release it to the community. Plus, if the libs are open source it's simply a matter of recompiling them, right? And it shouldn't be that hard to fix, right? I mean, *nix people are usually skilled at fixing build systems?
ThorstenOtto wrote:- if gcc -o hello hello.c does not produce a working executable, you are in big trouble. You would have to change each and every Makefile for that extra step of invoking your brownout helper.
See my point above - *nix people are used to a bit of torture, so that won't be a big hurdle.
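Besides, the "extra step" amounts to two commands instead of one. A rough sketch - note that the tool prefix and the brownout switches here are assumptions, check your own install:

```shell
# Hypothetical two-step build: link an ELF binary first, then let
# brownout turn it into a TOS executable. Tool names and flags
# here are ASSUMPTIONS, not gospel.
m68k-ataribrown-elf-gcc -o hello.elf hello.c
brownout -i hello.elf -o hello.tos
```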
ThorstenOtto wrote:Not sure what feature(s) you have in mind, but I'm quite sure you can achieve the same using a.out object files, or some script analyzing the linker map.
Which doesn't say much, really! Pitting one non-existent script against a non-existent feature - does it matter in the end?
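Just to illustrate what "some script analyzing the linker map" would even look like: a minimal sketch that pulls symbol addresses out of a GNU ld -Map listing. The sample map below is made up for illustration; real maps from your linker will have more noise.

```shell
# Made-up sample of a GNU ld -Map listing, for demonstration only.
cat > sample.map <<'EOF'
.text           0x0000000000001000     0x1f2
 .text          0x0000000000001000     0x100 main.o
                0x0000000000001000                main
 .text          0x0000000000001100      0xf2 helper.o
                0x0000000000001100                do_work
EOF
# In a map file, lines carrying only an address and a name are
# symbol definitions; section lines also carry a size column, so
# they fall through the pattern below.
awk '/^[[:space:]]+0x[0-9a-fA-F]+[[:space:]]+[A-Za-z_][A-Za-z0-9_]*[[:space:]]*$/ { print $2, $1 }' sample.map
```

This prints one "symbol address" pair per line (here: main at 0x1000, do_work at 0x1100); anything fancier - sorting, size accounting, whatever the hypothetical feature needs - is a pipe away.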
Also, another possible feature I have written down is executable packing, so I won't have to keep adding packers with various random switches to each build script. But again, this will be added if people ask for it or we need it ourselves.