Did I already mention how slow programs have become? Not only at runtime, but also at compile time.

On you can find the source code of, a framework to monitor the network back then. It originated from monitoring the buildds on .

However, see & compare the build times of e.g. binutils over time...

The database still exists, but the project itself does not.


The 4 buildds aahz, arrakis, elgar and kullervo were each equipped with an MC68060 and 128 MB RAM, and some had SCSI (arrakis, aahz & kullervo).

Kullervo was the very first autobuilder in Debian's history (AFAIK).
I joined kullervo with arrakis in approx. March 2000. Back then kullervo had a backlog of maybe 2000 packages. It also hosted the first version of what later became
It consisted of some Rexx and bash scripts.

I operated until approx. 2015. The project had become too complex with all of its scripts and needed a redesign.

I also operated several Debian autobuilders for the m68k port, without being an official DD, which was something "special" back then and wouldn't be possible today, I think.

I still think that was unique and a benefit for the Debian project.

That's a small part of in Open Source Software and Debian in particular.

What's your ?

Ah, what I intended to say at the beginning:

When we talk about we should also think about code complexity and code size. Just letting the compiler do the optimizations results in larger packages and longer build times, and thus in more wasted energy.

Invest the energy instead into designing optimized code right from the start - inside your brain. Think for yourself about where you can optimize your code. Don't leave that work to your compiler's optimizer.

From the ... database you can find examples where the build times increased significantly without much benefit.

I don't think binutils gained so many new features between versions 2.22 and 2.24 as to justify the increase in build time from 8 hrs to >2 days.

There is another table that also tracks installed sizes and such, btw...

@ij that argument has one issue:

If I write everything in Ruby, I have no compile times at all! But if I then run my company on Ruby on a cluster with 20 nodes, instead of running everything on one node with, for example, Rust... I waste more energy simply by running a slower language!

This is not a black-and-white tradeoff at all; there are shades of grey in between! I still agree with you that thinking about resources in the first place can help with #greenit, though!

@musicmatze Well, the point was about compilers and compiled packages. Interpreted software is not applicable in this context.

We should also talk about static vs. dynamic linking, and about "build everything from source all the time".

and are good (well, actually bad) examples of how the choice of programming language alone determines the waste of resources.

@veer66 No, since this would still require *me* to rebuild all libraries instead of using dynlibs built by someone else.
