Copyright and the monetization of science are killing off the history of computing

Why do computer programmers start counting at 0 rather than 1? (As in, why do they count 0, 1, 2, 3 and so on rather than 1, 2, 3, 4?)
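
For readers who have never run into the convention itself, here is a minimal C sketch of what zero-based counting looks like in practice. This is only my illustration of the mechanics, not Hoye’s historical answer: in C, an array index is an offset from the start of the array, so the first element naturally sits at offset 0.

```c
#include <stdio.h>

int main(void) {
    int a[4] = {10, 20, 30, 40};

    /* In C, a[i] is defined as *(a + i): the index is an offset
       from the start of the array, so the first element is a[0]. */
    printf("%d\n", a[0]);     /* 10: the first element, at offset 0 */
    printf("%d\n", *(a + 0)); /* 10: the same element, written as an offset */
    printf("%d\n", a[3]);     /* 40: the fourth element, at offset 3 */

    return 0;
}
```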

Apparently there’s a lot of lore out there that tries to explain why, but one Mike Hoye decided to actually find out. He found the answer.

Something else he discovered is that a lot of research is hidden behind paywalls, which is nice if you’re a rich university but bad for society:

Part of the problem is access to the historical record, of course. I was in favor of Open Access publication before, but writing this up has cemented it: if you’re on the outside edge of academia, $20/paper for any research that doesn’t have a business case and a deep-pocketed backer is completely untenable, and speculative or historic research that might require reading dozens of papers to shed some light on longstanding questions is basically impossible. There might have been a time when this was OK and everyone who had access to or cared about computers was already an IEEE/ACM member, but right now the IEEE – both as a knowledge repository and a social network – is a single point of a lot of silent failure. “$20 for a forty-year-old research paper” is functionally indistinguishable from “gone”.

Having legitimate access to what lies behind a paywall does not always help. Earlier this year, well-known computer programmer Aaron Swartz, who was responsible for a number of technologies you are using for free this very second, killed himself after being indicted for downloading documents that he had every right to access. The public prosecutor had asked the court to lock him up for 50 years. I still don’t understand that story; I will have to look into it further one day.

Hoye, in the meantime, was “reduced to emailing retirees to ask them what they remember from a lifetime ago because I can’t afford to read the source material.”
