Ramblings

The Accidental Programmer

Introduction

I did not become a programmer through study or certification, but through necessity. In an era when memory was counted in bytes, documentation was sparse, and failure was immediate and visible, the work still had to be done. Programs were written not to be elegant, but to run — often unattended, often overnight, and sometimes with consequences if they did not. What follows is not a catalogue of best practice, but a record of how things were made to work by someone learning in public, on live systems, with no safety net and no formal claim to expertise.

(12 January 2026)

The first tool: HP-41CV

Before there was any notion of a personal computer, there were programmable calculators like the HP-41CV. These were not computers in the modern sense, but they were programmable in a way that felt immediate and contained. Programs were short, visible, and tied directly to a task. Errors produced wrong answers rather than broken systems, and the feedback was instant. It encouraged procedural thinking and numerical caution long before either had names.

Lightweight tools can produce meaningful results.

(25 January 2026)

A tool for scaffolding: Mössbauer Spectroscopy

The scaffolding here was not for the Mössbauer spectroscopy itself, but for the colleague for whom I wrote the software.

The program, written in BASIC, controlled the spectrometer through an HPIB interface and then collected the data back through the same interface. The data could be displayed directly on a monitor as a spectrum and saved to a floppy disk.

Though invisible to most, the program allowed the experiment to run efficiently and gave me another early lesson in making tools work quietly behind the scenes.

(25 January 2026)

A tool for structured building: Turbo Pascal

Solving the non-ideal diode equation required many tools, both hardware and software, often in combination and sometimes in parallel.

A carefully structured approach was necessary, and Turbo Pascal provided the discipline to maintain that structure as the work moved between machines.

(26 January 2026)

The tool that never was: NLP

Something that first drew me toward what would later be called artificial intelligence was the purchase of Turbo Prolog in the late 1980s. Its non-linear, declarative style was unlike anything I had encountered before. It suggested that a computer might answer questions, not by matching keywords, but by reasoning about the structure of a sentence. Without realising it at the time, I had begun exploring natural language processing.

In practice, the gap between idea and implementation was large. Prolog’s recursive mechanisms were difficult to reason about, and later attempts using C ran into practical limits when parsing grammars on small machines. Progress was possible in small demonstrations, but fragile and easily broken.

At the time, natural language processing was often presented as being just around the corner. In reality, it demanded levels of memory, processing power, and robustness that simply did not yet exist outside carefully controlled examples. What worked convincingly in papers and demonstrations rarely survived contact with untidy, real input. NLP was promising, but premature.

Even so, those early experiments were not wasted. The experience later proved useful when building the mathematical syntax parser for Master Grapher for Windows (see below).

(26 January 2026)

A tool outside the box: TSR

A colleague from the Department of Music wanted a drill-and-practise program to help students learn perfect pitch.

An electronic keyboard was connected to a PC sound card through a MIDI interface. To allow the lesson program to accept input from the keyboard, a small auxiliary program was written. It waited for the sound card to signal that data was available, then passed that data on to the lesson program for evaluation.

This interface program, written in C, was a terminate-and-stay-resident program (TSR). It lived in a region of memory outside the boundary of MS-DOS, leaving the available MS-DOS memory for the lesson program.

(28 January 2026)

A tool for building a monument: Master Grapher for Windows

Windows programming required a totally new skill set. To build this educational tool, both C and C++ were used.

This monumental task required the development of a mathematical syntax parser and a grapher - components that ultimately mattered more than the interface itself.

(26 January 2026)

Reflections

  • The end of the Master Grapher for Windows project also marked the death of the Accidental Programmer. I never wrote another line of serious code after that.

(25 January 2026)

Endnote

Somewhere along the way, software ceased to be mentioned in academic publications. Early on, a program was short enough, idiosyncratic enough, that it felt natural to describe it, or at least acknowledge its existence. Later, as specialist packages became indispensable, they also became invisible. A paper would speak of models and results, while the thousands of lines of code that made those results possible quietly disappeared.

I am not referring here to word processors or spreadsheets, but to the specialist software without which much modern research simply could not be done. In many cases, the intellectual content is inseparable from the behaviour of that software. Treating it as a neutral tool is convenient, but not entirely honest.

If future historians of science were to reconstruct our work from the published record alone, they might conclude that computation played only a minor role. They would be quite wrong.

(25 January 2026)