Wednesday, 07 October 2015

Getting to grips with your computer’s ‘default’ mode

A client asked me a question the other day, and much to my embarrassment the answer almost eluded me.  The question was “What is the meaning of the word ‘default’?”  Sometimes I have difficulty explaining things that are so common and familiar to me that I assume everyone else is equally familiar.

There is no mystery as to why confusion could exist as to the meaning of this word.  If your home library happens to include a fifty-year-old dictionary of the English language, the definition you are likely to find in it is that “default” means “failure in performance,” and that is what the word meant then.  Starting a few decades ago, though, computer engineers started using the word “default” to describe a value or a setting that was automatically chosen.  It might have been less confusing if computer technicians had seen fit to choose a different word whose established meaning was closer to the intended one, such as “preset.”  Apple Computer had the right idea in 1982 when they instructed their programmers, “Please do not ever use the word default in a program designed for humans.  Default is something the mortgage went into right before the evil banker stole the Widow Parson’s house.”  Unfortunately, nobody listened, and today we still use the word “default.”

One of the definitions for “default” found online reads: “A default, in computer science, refers to a setting or a value automatically assigned to a software application, computer program or device, outside of user intervention.”  In other words, “default” almost always comes into play when there are several possible choices, and one of those options has already been chosen for you.
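To make that idea concrete, here is a minimal sketch in the Python programming language.  The names and numbers (print_document, font_size, 12) are purely illustrative, not taken from any particular program: the function has a preset, or default, font size of 12 that applies whenever the person using it does not pick one.

    def print_document(text, font_size=12):
        # If the caller does not choose a size, the preset value of 12 is used.
        print(f"Printing at {font_size}pt: {text}")

    print_document("Hello")                  # no choice made, so the default (12) applies
    print_document("Hello", font_size=18)    # an explicit choice overrides the default

The same pattern appears in everyday settings screens: your word processor already has a font, a paper size and a margin chosen for you, and those choices stay in effect until you change them.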
