21 August 2025

AI: The Computer Revolution, Again

I'm lucky to have experienced personal computing from the beginning: from Sinclair's Black Watch, ZX80, ZX81 and Spectrum, through Pick R83 and MS-DOS 3.3, to today's herding into Microsoft's pen for survival under Windows 11.

We are perhaps the last human generation to believe we could understand electronic digital computers at every level, from transistors and logic gates through to automated online banner-ad auctions, Bitcoin mining, botnets, etc.  It's got so complex that newcomers have to choose which slice of the tottering stack to specialize in; code is becoming ephemeral, beyond any one human's scope to keep on track.

In the 16-bit home computing era, computers became our toys, while futurists were still telling us we needed to learn binary arithmetic at school to prepare for tomorrow's careers.  Hobbyists asked to justify the time and effort they lavished on their FREDs (Folking Ridiculous Electronic Devices) would reply, "Look, I can create a page of printed text in minutes!" ...not counting the minutes spent waiting for code to load from audio cassette at one end, and for the noisy printer at the other.

So it is now with "AI", i.e. the extension of expert systems' pre-loaded wisdom to machine learning.  We play with ChatGPT etc. as toys, while bigger budgets put AI to work; new but transient careers beckon, and early adopters may find the skills they built in advance as misaligned as trying to apply only binary arithmetic to higher-level programming languages.

Performance is not yet there; early AI-capable laptops and PCs are as rare and costly as the first round of "multimedia" (sound card, CD-ROM, video playback, a handful of available titles) before Windows 95. Nvidia's monster AI chips are as unattainable for us as 3dfx's dedicated 3D accelerators were way back in the day, when affordable "Windows Accelerator" graphics cards just couldn't cut it for the new 3D games.  Industrial-grade AI thrives on ye olde IBM mainframe budgets of half a century ago.

This time round, I'm content to watch from the sidelines. If I were 20 years old again, I'd have jumped into Android when toy smartphones escaped Apple's iron grip, I'd have years of Linux under my belt, and I'd be very actively involved in "playing with AI".

