
Life in Hell

Fans of Matt Groening (pronounced 'graining') will get the reference.

Anyway, I am posting to gripe about yet another generic problem in my saga of endless updates to a very complicated small business and research system.

Here's the background:

I have six servers and nine workstations. In addition, I use two servers and a workstation via VPN at a client site. There are other people on the systems, but for the most part they are set up so that I can work across all of them as if they were one gigantic workstation. Operating systems currently installed:

FreeBSD (Remote Server for ssh, ftp, http, database, mail)
CentOS (Red Hat Enterprise Linux rebuild)
Fedora
Ubuntu Server
Ubuntu Desktop
Custom Linux for a wireless router

Windows 98
Windows 2000 Pro
Windows XP Pro
Windows Vista Home Premium
Windows Vista Business
Windows Server 2003
Windows Terminal Server Edition

I also have various live-boot CDs and USB sticks that I use (like DSL), and an ancient notebook running Windows 95. These are not all on the network proper. Believe it or not, there are even more systems that are used to boot machines from time to time.

The list above, though, represents what I have to manage on an ongoing basis. Older machines and devices simply require firmware updates every now and then. That means the hair-raising process of flashing the BIOS or ROM, which can sometimes kill a machine.

With that many devices, hardware is failing all the time. Things you don't expect, such as motherboard network adapters, just die without warning. Hard disks usually give some warning before they croak, but not always. Every now and then, the BIOS will reset on a machine and it has to be re-configured again. When devices require upgrades (like to gigabit LAN or 300Mbps wireless, or even just a new hard disk), all kinds of things just break. Of the last dozen manufacturer-supplied drivers I have installed, nearly every one was out of date, broken, and in need of an upgrade. Invariably, the software helpfully tells you that you need new software and then directs you to a web page that does not exist.

I can live with all of the above stuff. What I can't live with are all the dead-stupid design flaws that make it almost a full time job just to maintain what is essentially a large workstation.

Last night, for instance, I told Ubuntu 6.06 Desktop to upgrade itself. After a few questions and confirmations it went merrily on its way, saying it would take about 2 hours to do the update. This is on a connection that downloads 30 megabytes a minute. OK. I can take the bullet. It is just an adjunct workstation anyway. I did all the nonsense at 2:00 AM and then went to sleep. When I woke up, I found that it had more than an hour to go and had stopped dead asking me if I wanted to replace a file (that needed replacing). Having something like this in software is a show-stopping design flaw. It has been about ten hours since I started the upgrade and it is still running. It could have finished last night, except somebody who wrote the installation routine (for whatever piece it was) decided that they simply could not wait for an answer to that question, and everything ground to a halt. This should never happen.
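For what it's worth, apt can be told up front to take defaults instead of stopping to ask. A minimal sketch of an unattended upgrade (shown here as a dry run that only prints the command, so nothing on your system is touched):

```shell
#!/bin/bash
# Sketch of an unattended upgrade that never stops to ask questions.
# DEBIAN_FRONTEND=noninteractive makes debconf take its defaults, and the
# dpkg --force-conf* options keep or merge existing config files instead
# of prompting. Swap echo for eval to actually run it.
UPGRADE_CMD='DEBIAN_FRONTEND=noninteractive apt-get -y \
  -o Dpkg::Options::="--force-confdef" \
  -o Dpkg::Options::="--force-confold" dist-upgrade'
echo "Would run: $UPGRADE_CMD"
```

This is exactly the kind of thing the installer could have done for me by default: gather the answers (or the defaults) up front and then run to completion.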

Windows, BTW, is no better. The only reason it goes faster is because I KNOW it is going to stop every 10 minutes to one hour (it can't really say when) to ask something that could have been asked at the end, or at the beginning for that matter. That means all the Windows upgrades are basically a half-day down the drain babysitting the constant re-boots. I have been programming at all levels for more than a quarter of a century (some of my code is actually in the Linux OS and in applications on Windows, Linux and other operating systems besides). I can honestly see hardly any reason that a system should ever need to re-boot except for a design problem either in hardware or software. Windows can't seem to even install end-user programs in user space without re-booting. What is the deal with that?

Anyway, the big (abstract) gripe here is that more than ten years ago all software should have become incapable of these annoyances: stopping dead in the middle of a long installation, requiring a system reset, and failing to allow cut/paste/drag/drop operations where they are needed most. (What is the deal with an error message that requires you to transcribe its text by hand???)

A huge bugaboo with me is that in Windows I am constantly having the focus taken away for no valid reason by one of the dozens of programs I run. It happens right in the middle of typing and since I touch-type and multi-task it sometimes happens that a bunch of evil keystrokes end up going into the program that stole the focus.

Every single dialog of any sort should:

1) Support cut, paste, and logging of all text and message conditions. The text has already been typed in once by the developer; it should not require typing (or even saving) by the user.

2) Have a NEVER BOTHER ME AGAIN checkbox. If messages are just plain critical, they should be logged to another program so I can get to them as I am able.

3) Have roll-forward/roll-back and timeout so that if installing and not answered within (say) ten minutes it makes the best choice it can, saves backout information or roll-forward information and returns to the caller so it can continue processing.

4) Nothing should ever steal the focus. At the very least, a program should wait until the user is not in the middle of input.
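Point 3 is easy enough to sketch even in plain shell. Here is a hypothetical installer prompt that falls back to a safe default after a short timeout (or when there is no interactive user at all), instead of stalling the whole run:

```shell
#!/bin/bash
# Sketch of rule 3: ask, but never block forever. If no answer arrives
# within the timeout (or stdin is not interactive), take the safe default
# and record the decision so it can be rolled back later.
answer="keep"                      # safe default: keep the existing file
if read -r -t 3 -p "Replace config file? [y/N] " reply; then
  case "$reply" in y|Y) answer="replace" ;; esac
fi
echo "decision: $answer (logged for roll-back)"
```

Run non-interactively, `read -t` simply times out (or hits EOF) and the script carries on with the safe default. That one pattern would have saved me the eight wasted hours last night.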

Every major software vendor should have a 'sequencing team'. This team should review every bit of software during design, test and maintenance to ensure that these stupid, stupid, stupid sequencing errors do not exist. It seems bizarre to have to form a special team for this, but after more than a decade of constant blunders by every single vendor large and small it is clear that they need it.

I am preparing development guidelines for my company and have a whole bunch of annoyances listed already. If you leave a comment here with one of your pet peeves and it is not already on the list, I will add it to that (LONG) list.

