Does the software industry learn?

24 Jan 2022

"Institutional knowledge" - the information that a company collectively knows - is a familiar concept to anyone involved in hiring processes. When an individual leaves who has knowledge the organisation needs, companies will organise offboarding sessions to keep that knowledge within the institution. Maybe they'll even try to hire someone with similar experience.

Lots of companies similarly try to optimise for "institutional learning", especially smaller firms. This makes a lot of sense - smaller companies don't have the resources to buy in extra experience, so they focus on rapidly expanding the experience of their existing employees. It also fits well with the agile philosophy of fast development cycles to maximise your knowledge of customers' needs.

But what I would really like to be able to track is "Industry knowledge".

Software, like a small company, seems very good at learning new things. It's practically the first question developers ask each other: "what do you want to learn next?". Everything is New and Shiny and Disruptive. Technologies are king, and the newest is the best. See: Rust, NFTs, [Bit|Doge|Lite]Coin, etc.

In keeping up with the latest and greatest, software developers have got very used to learning as part of their profession. And since that applies to process changes as well, it seems like "industry learning" is something software as a whole does quite well.

But what about "industry knowledge"? Because the profession is so young, there are few universally accepted practices and standards. Industry trends generally come about by replacing the old with the new, not by incremental improvement.

For learning to improve our practice most effectively, it should build on the totality of previous experience, not just the most recent state. But I very rarely see articles that look back at past languages or technological fads and examine current trends through that lens.

I want to read content picking apart COBOL and Prolog, analysing them for what they did right and wrong, and telling me how it relates to React and Golang.

And, above all, I'd really love for that kind of content to be normalised within tech culture. We need more historians and librarians in our ecosystem, and fewer blue-sky thinkers.

After all, if I remember rightly, those who fail to learn from history are likely to be worse at stuff.
