I see in many books attempts to ease the apparent pain of using computer terminology.
With the help of Tim Berners-Lee, the Internet was popularized by the creation of the World Wide Web.
This simple statement becomes this convoluted paragraph:
With the assistance of Tim Berners-Lee, a computer technology was developed that allowed computers to communicate with each other through what became known as the World Wide Web, which people could connect to through new software such as America Online and CompuServe that came on floppy diskettes. Thus came the existence of the Internet.
Authors continue to be extremely cautious about introducing computer terminology in their writing. But the truth is, who doesn’t know what the Internet is these days? Who doesn’t know what software is? And when authors do use the terminology, they often surround it with metaphors that try to liken it to tasks once done by hand. “The Internet, like a pair of telephone wires, …” “With the advent of the microprocessor, computers once the size of rooms became smaller than the ‘a’ in this book…” This is the virtual world we’re talking about here. There is no substitute for these things.
No. Heck, no. If you’re going to include words like “axle” and “spigot” in a book and don’t bother defining them, then don’t bother with “die size” or “parallelization” either. Suck it up and make people learn the jargon. Don’t talk to them as if they were elderly people.