Most of us are so familiar with Windows or macOS these days that we probably just accept or work around issues we barely even notice any more. But watching my elderly father use his PC years ago, and seeing the things he struggled with, really highlighted some basics that, 30 years after Windows 95 launched, still seem to be a bad but accepted state of affairs. For example:

When you boot up a computer, there's a period where the mouse pointer changes from a 'wait' symbol to an arrow, but the computer's still far from ready. I used to watch my Dad click on whatever he wanted to launch as soon as the arrow appeared – nothing would happen, so he'd click again. Still nothing, so he'd try something else… then after a minute or so of this, the first program would open, followed by another instance of it, then another – so he'd click the 'close' icon, but nothing would happen because it was still busy… and so on. Why can't the OS wait until things are actually idling before telling you it's ready for you?

Then there's the 'percentage complete' bar when installing software etc. I know there are lots of variables involved, but surely we can do better than something that whizzes up to 57%, then just sticks there for 5 minutes without explanation, then plods to 100%, where it sits for several more minutes apparently doing nothing while categorically telling you it's 100% complete. Maybe it's indicating progress in terms of the number of steps it needs to complete rather than time, but how are people meant to know that, and in what way is that even useful?
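To illustrate why step-count progress bars behave so strangely, here's a minimal sketch. The step names and durations are entirely made up for illustration: the point is that if every step is counted as an equal slice of 100%, a few quick steps make the bar whizz forward, then one long step makes it appear stuck, even though the installer is working fine throughout. Weighting by estimated duration (the second function) gives a steadier readout, at the cost of needing duration estimates up front.

```python
# Illustrative only: hypothetical install steps with (made-up) durations in seconds.
STEPS = [
    ("copy files", 2),
    ("extract archive", 1),
    ("register components", 30),
    ("finalize", 15),
]

def step_count_progress(steps):
    """Report progress as completed steps / total steps.

    Every step counts equally, regardless of how long it takes,
    so the displayed percentage bears little relation to elapsed time.
    Returns a list of (elapsed_seconds, displayed_percent) pairs.
    """
    total = len(steps)
    timeline, elapsed = [], 0
    for i, (_name, duration) in enumerate(steps, start=1):
        elapsed += duration
        timeline.append((elapsed, round(100 * i / total)))
    return timeline

def time_weighted_progress(steps):
    """Report progress weighted by each step's estimated duration.

    The bar advances in proportion to time actually spent,
    assuming the estimates are roughly right.
    """
    total_time = sum(d for _, d in steps)
    timeline, elapsed = [], 0
    for _name, duration in steps:
        elapsed += duration
        timeline.append((elapsed, round(100 * elapsed / total_time)))
    return timeline

# Step counting hits 50% after only 3 of 48 seconds, then "sticks";
# time weighting stays roughly in step with the clock.
print(step_count_progress(STEPS))
print(time_weighted_progress(STEPS))
```

In other words, the bar that stalls at 57% probably isn't lying about its internal step count – it's just reporting a number that means nothing to the person watching it.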

I'm sure there's other stuff too…
