Archive for April, 2008

Why Reading Books Matters to a Programmer

Tuesday, April 29th, 2008

The programming book market these days is small. Nothing like what it was eight years ago or so. And apparently, programmers don’t read books (any more).

It’s mostly true. But of course, there are still books worth reading. I’m going to take as read the easy arguments: let’s assume that you’ve already pared away all the “learn such-and-such in 24 hours”, anything with “for dummies” or “for teens” or equivalent in the title, and any book that weighs more than your head.

What you are left with is two kinds of books: timeless or beautiful texts that everyone agrees will still be worth reading in 20 years (and that in many cases have already passed this test), and the tougher category to decide on: the middling sort of book, usually specific to a particular technology or language. Many of these are very good books – some were even the defining texts on first release – but either the technology has moved on, or there is so much great information out there on the web that spending $50+ on a book becomes a proposition worth thinking about.

The answer is: you should absolutely use all the information on the web, pulling it in as you “page fault”, and googling solutions as you go, in a practical way. AND you should absolutely read these books.

Reading and doing are not mutually exclusive ways of learning. They are complementary. And reading is more useful than many pragmatist programmers realise. Sure, you’re not going to learn practical things like debugging or how to wrangle libraries and packages just by reading. But you’re not going to get much wider context just by doing, either.

One of the first programming (culture) books I bought was the New Hacker’s Dictionary. It says this about reading:

Although high general intelligence is common among hackers, it is not the sine qua non one might expect. Another trait is probably even more important: the ability to mentally absorb, retain, and reference large amounts of ‘meaningless’ detail, trusting to later experience to give it context and meaning. A person of merely average analytical intelligence who has this trait can become an effective hacker, but a creative genius who lacks it will swiftly find himself outdistanced by people who routinely upload the contents of thick reference manuals into their brains.

The crucial part here is that reading gives a wider context to doing. This is why programmers should read, and reading and retaining (or at least being able, on second encounter, to remember and put in context) technical information is an important skill to have.

So read those books. Read code and blogs too. It may sound daunting (“How can I possibly remember everything?”) but the more you read and remember, the easier it will get, because brains are great at putting things in context and linking to things that they already know. And then when you get to doing, you’ll see the bigger picture, and perhaps you’ll also remember about a looming pitfall you read about, but which the hastily-constructed web tutorial you’re currently perusing glosses over.

Hardy Heron

Saturday, April 26th, 2008

After a couple of days of slow servers, today it picked up a bit and I installed Hardy Heron without problems. Well, with a couple of minor problems. First, my /boot partition is still just a little too small, so I had to move a few things around to manage the recreation of the initrd file. Second, there were some issues with dependencies in a couple of the GHC library files. But that was also fairly easily sorted.

I haven’t played a lot yet with everything that’s new; just noticed a few minor changes for now. Everything still seems to work!

LA Times Book Festival

Saturday, April 26th, 2008

I went with the family to the LA Times Book Festival today. We parked in the same lot as last year, avoiding the snarls off the 405 at Wilshire (signed UCLA) and heading up to Sunset to come in from the north side.

This year, we had got tickets ahead of time for three panel sessions: two for me and one for Mrs Elbeno (who would have also had two but for the fact that her second choice was cancelled). The two panels I attended were on poetry.

The first was “Walking the Line”, on how and why poets break their lines where they do. It featured David St John moderating, with Marvin Bell, Elaine Equi and Jean Valentine participating. It was an interesting discussion for a beginning poet like myself to hear, especially to compare free verse with metrical poetry.

After a nice chicken sandwich from the Organics To Go stall, the second panel I went to was “The Poet’s Voice” featuring Eloise Klein Healy moderating, with Mark Doty, Amy Gerstler and a curmudgeonly (but entertaining) Albert Goldbarth participating. The subject matter here was more wide-ranging, because the panel weren’t really interested in a narrow interpretation of the topic.

A quick potter around the stalls (avoiding the crazies) and it was time for Mrs Elbeno to get to her panel, which was about new fiction. After that we meandered around a few more stalls before heading back to the car and home. Mini-Elbeno stayed awake for a while, but was out a few miles from home and slept for an hour or so all told, totally zonked by the day out.

Nostalgia corner: teletext

Wednesday, April 23rd, 2008

I want to share with my US readers a little-known thing (at least over here) that goes by the name of teletext. This was (still is) a system in the UK that was sort of like the Internet before the Internet. Way back in the 70s the BBC invented a system for transmitting textual information over broadcast TV.

Basically, you take a 40-column, 24-line screen of characters (here “characters” means alphanumeric symbols but also various codes for blocky graphic shapes in 7 foreground/background colours) and you encode it in the PAL analogue TV signal. It lives in the space between frames – technically known as the vertical blanking interval. Once you’ve done that, you simply broadcast the set of pages cyclically, and the TV viewer can hit the “text” button on their TV remote, type in a page number, and wait a few seconds for the TV to receive that page.

In the early 80s, not every TV had the ability to decode and display the teletext signal, but ours did. When you flipped into text mode, the default was page 100 which was the master index. The major subject indices were on the hundreds, with news being first. The news pages in particular often were themselves subdivided so that when you punched in the first news page (101) you’d see, e.g. 5/14 indicating you were on subpage 5 of 14. Each time page 101 was rebroadcast in the cycle (approximately every minute) you’d get a new subpage and gradually work through the news story. Sometimes you had to read quickly before the page changed – but if you were a slow reader, you could always press the “hold” button which prevented the TV teletext buffer from updating. The downside was that when you unheld, you would usually have missed the next subpage and would skip ahead in the story.
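The page carousel is simple enough to model: the broadcaster loops through its set of pages, and a viewer requesting a page just waits for it to come around again. Here is a toy sketch in Python – the page contents are blank placeholders and the real system also interleaves subpages, but the page numbers (100, 101, 888) are the classic ones:

```python
from itertools import cycle

# Toy model of a teletext carousel: each page is a 40-column, 24-line
# grid of characters, broadcast cyclically. Contents here are blank
# placeholders; only the page numbers come from the real service.
PAGES = {
    100: [" " * 40] * 24,   # master index
    101: [" " * 40] * 24,   # news (first subject index)
    888: [" " * 40] * 24,   # subtitles
}

def wait_for_page(wanted, carousel):
    """Watch the broadcast until `wanted` appears; return (slots waited, page)."""
    for slots, page_no in enumerate(carousel):
        if page_no == wanted:
            return slots, PAGES[page_no]

carousel = cycle(sorted(PAGES))     # broadcasts 100, 101, 888, 100, ...
slots, screen = wait_for_page(888, carousel)
print(slots)                        # 2 other pages go by before 888 arrives
```

In this model, the “hold” button amounts to simply not reading any further pages off the carousel until you release it – which is exactly why you'd usually miss the next subpage when you did.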

Teletext was (is) brilliant. Just like the Internet today – you could get news, sports, weather, TV guide, movie times, flight arrivals and road travel info, and subtitles for TV programmes. We liked to turn on subtitles (page 888) for a lot of things – they were just fun – especially for live events like the Eurovision Song Contest. On such occasions it became obvious that the people doing the live subtitles were working with some kind of phonetic input system, and many a howler would result from near misses with what was actually said!

But the fun and games sections of teletext were the best thing of course. In the early days, this was limited to control codes which hid parts of the page until you pressed “reveal” on the remote. But around the early 90s or so, someone had the bright idea of “fastext” whereby the TV could store multiple pages and precache pages in order to quickly jump to them according to a simple hyperlink system. The remote had four colour buttons: red, blue, green and yellow. Each page could have four hyperlinks therefore, accessed by pressing the fastext buttons.

The standard use of fastext was to assign forward/back/subject index/master index type functionality to the four buttons, but fastext also opened up a whole world of fun and games in the form of Bamboozle – a multi-choice quiz where 3 of the buttons would go to the “wrong” page and the correct answer button would go to the next question.

We played Bamboozle every day for years in the 90s. We loved Bamber Boozler, the avuncular blocky yellow host. Getting all the questions right was surprisingly difficult, and we normally scored in the high teens (out of twenty). Getting all 20 first time, which we did a couple of times a week, was cause for pride. We did figure out a way to cheat: because there was still a finite time between displaying the question page and receiving the fastext-jump pages, if you were quick, you could make multiple guesses with the colour buttons and work out which one went to the “odd one out” page number. This technique was by no means foolproof, though: often the TV would receive the required pages quickly (the “wrong” page in particular often seemed to arrive almost instantly – no doubt by design), and then your attempt at a quick cheat was worse than a considered guess. Serve you right!

From time to time I still miss teletext – the Internet hasn’t quite made it into the living room yet, even though I have a Wii which means I can technically browse on my TV. It’s just not as integrated, simple, or niche-perfect as teletext is. Next time you’re in the UK, turn on your hotel TV and hit that text button…

The world’s worst-performing application

Tuesday, April 22nd, 2008

A few weeks ago I got a PC upgrade at work. My new PC is pretty much top of the line: nVidia 8800 GTX, 3GB of RAM, and 8 CPU cores. It’s pretty nifty.

Which makes it all the more incredible that Outlook 2007 is literally the slowest, poorest-performing application I’ve ever had the misfortune to have to use. I am fully up to date with all the service packs which claim to improve its performance, but they don’t help. I’ve disabled all superfluous add-ons, and RSS feed functionality. The bottom line seems to be that if I want it to run anywhere approaching acceptably, I need to make my mailbox (pst file) smaller.

Is it just me, or is this ridiculous? It beggars belief that I even have to be aware of how my mail is stored in files, let alone manually have to manage them and reduce them. It is simply astounding that, disregarding years of computer and web trends teaching us “put everything in a heap and use good search & management tools to deal with it”, someone at MS deliberately made working with large files slower:

To accommodate new features, Outlook 2007 introduced a new data structure for .pst files and for .ost files. In this new data structure, the frequency of writing data to the hard disk increases as the number of items in the .pst files or in the .ost files increases.

Note You cannot create a .pst file or an .ost file without this new data structure.

What? I mean, what?! Google can search over 8 billion web pages in a small fraction of a second, but because of this boneheaded decision, my spiffy PC takes literally over 5 seconds to change folders in a mail store on my local hard drive? Did everyone at MS somehow think that mail stores were not going to get any bigger?

So because of this, I have manually split my pst files. I’ve taken my files that were organised by subject matter and have been more or less forced to break them down further by time period, which amounts to almost a completely arbitrary form of division.

Well, at least I’ve still managed to get Lookout to work. I suppose that’s something, anyway. No thanks to MS.


Thursday, April 17th, 2008

They say soon we will run out of helium.
Most people think, “What’s the big dealium?
So our voices won’t squeak
When we inhale a leak –
Balloons will just lose some appealium.”

But balloons aren’t the sole use of helium:
For industrial stuff, it’s idealium.
When released without thought,
It can never be caught,
But escapes from the earth with great zealium.

My Experiences with Radioactive Elements

Wednesday, April 16th, 2008

I’ve recently been reading about the elements, and particularly about the rarer ones. Radioactive elements tend to be among the rarer ones on earth. In my life, not counting glow-in-the-dark radium dials/paint, radon emissions from e.g. granite, or other household traces of radioisotopes, I’ve had two interesting encounters with radioactive elements.

The first time I met a radioactive element was in high school, as part of my physics class. Of course we studied radioactivity, α-, β- and γ-decay; used a cloud chamber, etc. According to my teacher, the element we used was protactinium. I’m wondering now whether it really was. Protactinium is extremely rare, and as I understand it, basically occurs in uranium ore as a daughter product of a radioactive decay chain. Theodore Gray, well-known element collector and purveyor of fine posters, says:

“Astatine, francium, actinium, and protactinium are irritating to element collectors. … The problem is that astatine, francium, actinium, and protactinium are absolutely impossible to collect in any meaningful sense of the word. They are so fantastically radioactive and short-lived that if you had a visible quantity of any of them, you would be dead and then it would vanish before your body was cold.”

On the other hand, Wikipedia says that 231-Pa has a half-life of 32,760 years, and:

“In 1961, the United Kingdom Atomic Energy Authority was able to produce 125 g of 99.9% pure protactinium, processing 60 tons of waste material in a 12-stage process and spending 500,000 USD. For many years, this was the world’s only supply of the element. It is reported that the metal was sold to laboratories for a cost of 2,800 USD / g in the following years.”

It is therefore perhaps plausible that a fraction of a gram of 231-Pa made its way to my high school for use in physics class.
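A quick back-of-the-envelope check, using the half-life quoted above, shows why such a sample would have stayed useful for decades – a school sample produced in the early 1960s would have been essentially undiminished by the 1980s:

```python
# Exponential decay: fraction remaining after t years is (1/2)^(t / half-life).
HALF_LIFE_YEARS = 32_760            # Pa-231 half-life, per the quoted figure

def remaining_fraction(years):
    """Fraction of a Pa-231 sample left after `years` of decay."""
    return 0.5 ** (years / HALF_LIFE_YEARS)

# After 20 years, over 99.95% of the sample remains.
print(remaining_fraction(20))       # ≈ 0.99958
```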

The next time I met a radioactive element was in first-year physics at university. I think the element this time was uranium-238, one of the more common radioactive elements. And the application was a cool one: we repeated Ernest Rutherford’s experiment, firing α-particles at gold foil and thereby debunking the “plum-pudding” model of atomic structure.

The rest of my first year physics practicals were to do with gyroscopes and various kinds of electronics, and after the first year, physics was no longer on my syllabus – it was all computer science, all the time. So I haven’t played with any radioactive substances since then.


Monday, April 14th, 2008

Sorry for the relative blog silence recently. Here’s what I’ve been doing lately:

Vecto get!

Tuesday, April 8th, 2008

Xach has released version 1.2.1 of Vecto, featuring circle arc paths, and I’m sure a whole lot more on the drawing board.

(asdf-install:install :vecto)

A bit of a new look

Friday, April 4th, 2008

I broke out the programmer art skills and tweaked the header image to something a bit nicer than the default, and while I was at it, made myself an icon for the site.

If the icon doesn’t show up for you, don’t blame me. Get yourself a standards-compliant browser. I don’t care about your pathetic requests for /favicon.ico.