> First, the so-called Y2K "Bug" is not a bug. I really hate this description
> of the issue. It is not a bug, it is an inability of many computers to
> display dates the way we humans wish them to be displayed. You can't say a
> system has a bug if it is simply doing what it was designed to do.
OK, pardon my ignorance, because I'm not in the software field, but I
really am curious...
Obviously a system can only do what it was designed to do. Whether what
it was designed to do is what the designer actually *intended* to design
it to do is another story. So, since software can only do what we tell
it to do, I always thought a bug was when something hadn't been
programmed properly, causing the software to do something unwanted,
even though it was actually following instructions.
In that case, wouldn't faulty "instructions," such as "these two numbers
for the year always have a '19' in front of them" be considered a bug?
And if I'm wrong, what *is* the definition of a bug?
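
To make that concrete, here's a rough sketch in C (purely something I'm
inventing as an illustration, not code from any actual system) of the
kind of faulty "instruction" I have in mind:

    #include <stdio.h>

    /* Hypothetical example: a date-display routine that keeps only
     * the last two digits of the year and hard-codes the "19"
     * century prefix. */
    static void print_date(int day, int month, int yy)
    {
        /* Fine for 1900-1999, but the century is assumed rather
         * than stored, so the year 2000 comes out as "1900". */
        printf("%02d/%02d/19%02d\n", day, month, yy);
    }

    int main(void)
    {
        print_date(31, 12, 99);  /* prints 31/12/1999, as intended */
        print_date(1, 1, 0);     /* prints 01/01/1900, meant 2000  */
        return 0;
    }

The routine follows its instructions to the letter; the instructions
themselves are just faulty, which is exactly the situation I'm asking
about.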