Your leaking thatched hut during the restoration of a pre-Enlightenment state.


Hello, my name is Judas Gutenberg and this is my blaag (pronounced as you would the vomit noise "hyroop-bleuach").


   experience with a good video card
Saturday, March 30 2013
For the past week or so I've been trapped in the reticulated web of Wikipedia entries about early computers and computing. It all began as a way to find out more about the PDP-8, an early computer I kept seeing references to. The PDP-8 was a 12-bit computer that was about the size of a refrigerator and was made entirely of discrete components (transistors, diodes, resistors, etc.). Its memory consisted of arrays of ferrite cores. Understandably, computers made that way were much simpler than even the simplest computing devices available today, and it's been amusing for me to see the hoops that programmers had to jump through to write usable programs. An example of a typical PDP-8 programming challenge was its calling convention: the call instruction stored the return address in the first word of the subroutine being called, meaning that subroutines couldn't easily call other subroutines (let alone themselves) and subroutine libraries could not be placed in read-only memory (ROM).
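That calling convention can be sketched as a toy simulation (Python; the memory size and addresses here are made up for illustration — the real machine used 12-bit words and an instruction called JMS):

```python
# Toy model of the PDP-8 calling convention: the call writes the
# return address into the subroutine's first word, and the subroutine
# returns by jumping indirectly through that word.
memory = [0] * 64   # stand-in for core memory; one int per word
SUBR = 40           # made-up address of a subroutine's first word

def jms(pc, subr):
    """Simulate a call from address pc: store the return address
    (pc + 1) in the subroutine's first word, then run the body."""
    memory[subr] = pc + 1    # self-modifying: clobbers the subroutine's word 0
    memory[subr + 1] += 1    # stand-in for the subroutine's actual work
    return memory[subr]      # 'JMP I SUBR': return via the stored address

resume_at = jms(10, SUBR)    # call from address 10
print(resume_at)             # → 11: execution resumes just after the call
print(memory[SUBR])          # → 11: the return address, written into core
```

Because the return address lives inside the subroutine itself, a recursive or re-entrant call overwrites the saved address, and the code can't sit in ROM — the two limitations mentioned above.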
But even fancy mainframes made in the era of discrete electronics (such as the PDP-10) had the sorts of compromises that seem almost comical from a present-day perspective. Some of these were simply a consequence of hand-wired equipment: beyond a certain point, no more complexity could be added without introducing unfixable faults or an untenable level of unreliability.
My interest in old computers also carried me into Wikipedia pages about the origins of calculators. I remember the first LCD calculators that started appearing in the late 1970s, and well into the 1980s I fetishized and experimented with calculators (for example, building digital circuits to automate keypresses) until I was eventually able to buy a far more flexible (if primitive) computer. The best calculator I found in my research was the Casio AL-1000, an early programmable calculator that, like the PDP-8, included ferrite core memory (less than 500 bits of it) and transistorized (not integrated-circuit) circuitry (which was spread out over ten circuit boards). The display consisted of Nixie tubes, which are gorgeous (though completely impractical in modern applications). Here's a video clip of someone with an AL-1000 some 42 years after it was built:

Today I found myself reading about the evolution of supercomputers such as the various models made by Cray. To make a supercomputer with discrete transistors isn't an easy task, and compressing macroscopic circuitry into the smallest possible space made for some cooling challenges, particularly when a computer is drawing 110 kilowatts of power. It seems like a lot of bother considering that the alternative is to just wait ten or fifteen years for commodity PCs to reach the level of computational power you are trying to achieve today. I get about 20 GFLOPS out of Woodchuck, my main computer (running a Core 2 Quad processor at 2.4 GHz), which is twice as fast as an ETA10 from the late 1980s.
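That 20 GFLOPS figure lines up with a back-of-the-envelope estimate (a sketch; the two floating-point operations per core per cycle is an assumed sustained rate, not the chip's theoretical SSE peak):

```python
# Rough sustained-throughput estimate for a Core 2 Quad at 2.4 GHz.
cores = 4
clock_ghz = 2.4
flops_per_cycle = 2   # assumed sustained FLOPs per core per cycle

gflops = cores * clock_ghz * flops_per_cycle
print(gflops)  # → 19.2, close to the ~20 GFLOPS observed
```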

As I mentioned in yesterday's account, I've installed Microsoft SQL Server Express 2008 on Lemur, my old Athlon64 box. Also on that box are a number of video games that I occasionally play over the course of obsessive two-or-three-day streaks, though between these I go for months and even years without playing games at all. With the games right there and the box all booted up, it was easy to fire them up and have a go at them. One of them is Grand Theft Auto: San Andreas (released in 2004), and my preferred way to play it is to find some defensible spot in the scenery, hole up, and blast away at society until the tanks come to get me. But I noticed something odd today when I played this and another game: the computer would occasionally just die. To get it working again, I'd have to disconnect it from wall power for a few seconds. I thought maybe there was a problem with one of the memory sticks, but some testing showed that not to be the case. So then I replaced the power supply. This helped: the power failures became less frequent and no longer required me to disconnect the power cord to get things working again. But the problem wasn't completely eliminated.
I tried replacing the video card, installing a powerful AGP-based ATI Radeon 9800 that had been given to me by a friend after he upgraded to a more modern (non-AGP) motherboard. It had a noisy fan and produced a lot of heat, but it made gorgeous high-resolution modes possible in Grand Theft Auto, and these rendered as smoothly as live video. This might seem odd, but (not being a gamer) I'd never actually seen such high-quality realtime computer-generated graphics before; who knew such technology had been just gathering dust in my laboratory for the past four or five years? It demonstrated something I hadn't actually known: a video card matters much more for good game play than the processor does. The Athlon64 I was using dates to a purchase I made back in 2005, and since I never buy processors at the bleeding edge of technology, its design is probably about ten years old. While the processor maxed out whenever I played the game, this didn't seem to affect play at all.
Despite all that, even with the fancy new video card, the computer would still occasionally die. And after about a half hour of play, weird triangle-shaped artifacts started appearing on the screen, encrusting themselves around the computer-generated people like weird alien armor. Eventually there would be too many of these artifacts for me to be able to see what I was doing, and I'd have to quit. Cooling the video card with an additional fan did no good, though not playing the game for a while would buy me another half hour of artifact-free play. Researching what exactly was wrong with the computer necessitated playing lots of games; this was perhaps the first time I'd ever had an excuse for wasting my time in this way.

Meanwhile, outside it was a sunny day with temperatures in the 50s. At one point I went out in the driveway and lay down next to Eleanor in the sun, the way one does at this time of year. Ramona saw this, waddled over, climbed onto my chest, and lay down on top of me. And then Clarence the cat came over and rubbed against us all the way he likes to do when he feels the need to celebrate. I tried to get Ramona off of me, but all she ended up doing was stepping on Clarence and making him complain.

On Facebook today, Sara Poiron sent me a link to a video of something new and ridiculous out of the mouth of Rick Santorum. The clip was presented by someone calling herself Susie Sampson and featured a few other interviews with Christianist conservatives who worry about Obama's provenance and his designs on the constitution. That particular clip wasn't all that great, but for some reason I let it keep rolling, and it loaded other clips of Susie doing her schtick one after the other. Though they were short, there seemed to be an endless supply of material. Susie Sampson has a young quizzical face and a slightly-unruly mop of straight blond hair. She speaks in a fake southern accent and affects a primly-sarcastic right-wing persona as she interviews various people, microphone in hand. Her clips are full of great (and occasionally stunning) moments, usually involving crazy people saying crazy, ignorant things. Susie is good on her feet and sometimes comes back with clever follow-ups that bring out yet more crazy. It's all tightly-edited, which means there is not much waiting around between such moments. Given how good she is, it's surprising that this was the first I'd seen of her.
