Our brains are networks of neurons, and our memories are patterns of structure and impulses instantiated in them. (The possibly pedantic choice of verb there is a nice way to say "contained," while illustrating that I - at least - have pretty much no idea exactly how. One of the many ironies of introspection: I can remember, but I can't tell you how. I just know that things are connected, in intriguing, inscrutable, surprising and often delightful ways.)
In discussions of computer programming, instantiation often comes up. As do networks (sometimes "neural networks"), and structure. Such discussions often come under the heading "software engineering" rather than "computer programming." Many children have figured out how to program computers, after all, so an adult enterprise needs a more important name. Engineering certainly sounds important.
As a hardware engineer, I confess to a substantial bias against the entire concept of "software engineering." Normally, I keep this to myself, and limit myself to direct criticism of some of the most egregious products that come from the field. As a reader of material on the World Wide Web, you know what I mean, I'm sure.
This morning, while stitching together my personal network of news, between The New York Times and "I, Cringely," I came across an intersection between these often disconnected universes of hardware and software that made me want to write.
It seems Toshiba has been building laptops for more than a decade with a flaw in the microcontroller for the floppy disk drives. They'd copied ("reverse engineered"?) this flaw from NEC, along with the useful parts of the code. NEC fixed their chips quite some time ago, but Toshiba didn't get around to it, in part because the probability of a failure happening and going unnoticed was quite small. Errors come up in computer systems a lot more often than you ever notice, thanks to various kinds of "error correction" that get built into firmware, microcode and software. It's only when multiple errors happen, or the error correction can't figure out what went wrong, or the programmer didn't anticipate the error, that you get one of those wonderful error monologues saying something like "XYZ performed an illegal operation."
Anyway, the flaw was there, and an error could happen, and the error might not get caught and corrected, which would result in a bad byte being written into the file on a floppy disk. Which... might cause a spelling error, or it might cause a program to not work, or not work correctly... who knows? Something would go wrong.
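To make the "caught and corrected" business concrete, here's a minimal sketch in Python (not Toshiba's actual firmware; real floppy controllers use CRCs, and the function and data names here are made up for illustration). A single XOR parity byte shows both sides of the coin: one flipped bit gets detected, but two errors that cancel each other slip through silently, which is the kind of rare, unnoticed failure at issue.

```python
def xor_checksum(data: bytes) -> int:
    """Fold all the bytes together with XOR into one check byte."""
    check = 0
    for b in data:
        check ^= b
    return check

# Pretend this is a sector written to a floppy, with its check byte.
sector = bytearray(b"hello, world")
stored_check = xor_checksum(sector)

# A single flipped bit is detected: the recomputed checksum differs.
corrupted = bytearray(sector)
corrupted[3] ^= 0x10
assert xor_checksum(corrupted) != stored_check

# But two errors that cancel out pass the check undetected --
# a bad byte gets written, and nobody notices.
corrupted2 = bytearray(sector)
corrupted2[3] ^= 0x10
corrupted2[7] ^= 0x10
assert xor_checksum(corrupted2) == stored_check
print("single error detected; double error slipped through")
```

Real controllers use stronger codes (CRCs) precisely to shrink that undetected-error window, but no finite check byte can close it entirely.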
Well, Toshiba hadn't fixed the problem, even though they'd known about it for a while. It was hard to reproduce, for one thing, and it didn't seem important enough to take the trouble. But somebody figured out that the flaw was there, and they filed a class action lawsuit. They didn't sue Toshiba because they'd suffered some damage from an actual failure, but rather because they'd been sold a "defective product." They'd been sold a product that might fail, and might cause them some harm. And since this same situation applied to lots of Toshiba's customers, they made it a class action suit. The claim was that Toshiba had sold 5 million defective machines since 1987.
You don't have to be right to sue someone, of course. You just have to want to. (I'm tempted to say "might makes right" here, but I'll resist that temptation.)
And usually, big corporations can and will spend more for lawyers' services than your average man on the street, so it doesn't pay to do this sort of thing. Usually, but not always. In this case, Toshiba was wary of the U.S. system of jurisprudence and the very real possibility that a jury would find against them and make them pay a very, very large sum of money. Toshiba was worried that their company would be bankrupted by a jury verdict for billions of dollars, so they decided to settle under terms that will cost them "only" about one billion dollars.
The two gentlemen who'd been sold a defective product and brought suit will be compensated for their trouble, to the tune of about $25,000. Not too shabby for the remote possibility of a mistake in a file on a storage device that we hardly use anymore. The next sentence in the New York Times report is the one that got me, though:
But their attorneys, led by the Beaumont law firm of Orgain, Bell & Tucker, stand to make $147.5 million.
That just floored me. This isn't justice, it's a damn lottery, and Orgain, Bell & Tucker just won it. Pay off the expenses, give the staff five years' severance pay, turn off the lights and head for the beach. We are done working.
If they're not done working, it's a personal problem, a bit of compulsion in their gambling behavior. I mean, what would be the point? $20 million's not really enough to fund a lifestyle, you need 50?
A laptop computer and its floppy disk drive are hardware, of course. Even though microcode for a controller is a sort of software, the worry here is that a unit of stored information on a floppy disk will be wrong. And the risk applies not just to one particular file, but any file you might decide to put on a floppy.
We have high expectations for hardware -- automobiles, sleepwear, strollers, televisions, toasters. We don't tolerate much in the way of latent defects in those things.
Computer hardware is also held to a high standard, although typically not that high; computers always act through some kind of software, and our standards for software are definitely not as high, conditioned by our experience.
It might have something to do with those disclaimers we see in print and in "dialogue" boxes before we get started. The dialogue is rather one-sided (and SHOUTED):
NO WARRANTIES. XYZ Corp. expressly disclaims any warranty for the SOFTWARE
PRODUCT. THE SOFTWARE PRODUCT AND ANY RELATED DOCUMENTATION IS PROVIDED
"AS IS" WITHOUT WARRANTY OR CONDITION OF ANY KIND, EITHER EXPRESS OR
IMPLIED, INCLUDING, WITHOUT LIMITATION, THE IMPLIED WARRANTIES OR
CONDITIONS OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, OR
NONINFRINGEMENT. THE ENTIRE RISK ARISING OUT OF USE OR PERFORMANCE OF THE
SOFTWARE PRODUCT REMAINS WITH YOU.
And, "we're not liable for anything," either:
LIMITATION OF LIABILITY. In no event shall XYZ Corp. or its suppliers be
liable for any damages whatsoever (including, without limitation, damages
for loss of business profits, business interruption, loss of business
information, or any other pecuniary loss) arising out of the use of or
inability to use the SOFTWARE PRODUCT, even if XYZ Corp. has been advised
of the possibility of such damages.
There is, of course, the disclaimer that this attempt to exclude all liability by just saying so may not be allowed in some jurisdictions.... but you can just hear their lawyers saying "they agreed!" can't you? The trouble is that at the user end of things, the "YES" button is exactly equivalent to the "ON" button. What would be the point in saying "Cancel"? There is no opportunity for "dialogue," just turning it ON, or OFF.
My point is that if and when we really do have a dialogue about defective software -- when the courts are presented with a class-action suit for all the genuine harm that's been caused, who needs to talk about latent defects! -- then it will make sense to start talking about software and engineering in the same breath.
Tom von Alten tva_∂t_fortboise_⋅_org