David Isenberg returns to the debate about "fixing" the net. I've not been following this debate closely (although I blogged my initial reaction), but I feel that much of the discussion misses the fact that large assemblages of code that are supposed to run indefinitely, and so the net itself, are in some ways more like biological systems than like the simple, exactly describable engineered systems of the past. Code gets added, patched, disabled, copied and modified (cf. gene duplication), and becomes dead because nothing calls it any longer (cf. pseudogenes). Other code (viruses) latches onto functionality (receptors) to do its selfish deeds. An army of engineers (the immune system) constantly scans for invaders and crashes, and makes patches. The big difference is that biocode has no engineers to patch it; mutation and selection do the work over time. Still, every large long-running software system I have known resists attack and improves through incremental replacement of parts, with lots of trial and error, not by wholesale redesign.
The internet "fixers'" goal is no more realistic than a plan to avoid disease by redesigning one's genome and rebooting one's body.