For most of its history, the computing industry has lagged behind the seemingly inexorable march of its own fundamental technologies. It is widely accepted, for example, that microprocessor performance advances on a regular, rhythmic basis: Moore's law famously observed that transistor counts double roughly every two years. Under these conditions, the earliest businesses devoted to computer hardware and software could never exploit the full potential of the technologies they offered. As soon as they began to unlock that potential, it was time to move on to entirely new standards!
In the modern era, computing companies are finally catching up with the curve of technological evolution in the industry. Although advances still arrive on a regular basis, enterprises are better equipped to take advantage of each one. As a result, end users enjoy greater functionality and productivity at every stage of a new technology’s growth. But this change has not led to broad acceptance of new technology. In fact, it could be argued that people are as suspicious of new technology now as they were at the very start of the modern computing age, decades ago!
Why do some “computer people” — especially seasoned technical experts — shy away from rapid adoption of new technologies? After all, for any technology to take root in society, some vanguard must come out in support of it. There are many reasons why this level of social support never seems to emerge at the same speed that the technology itself does.
One of the main concerns for technicians and engineers is whether a technology is reliable. Does it do what it is intended to do? Does it do so quickly, effectively and transparently? It took years for wireless networking to shed the stigma of unreliable performance, for example. Hours spent diagnosing, debugging and restoring technology that isn’t ready for widespread deployment are hours spent spinning in circles. Technical personnel take this seriously when they make purchasing decisions.
When reliability becomes a mantra among many people at once, it hardens into technological conservatism. Conservatism is a hallmark of businesses that rely on technical tools but do not truly understand those tools at the executive level. When firms find themselves increasingly bound to technology yet have few “evangelists” who can speak to its most effective uses, they play it safe. This creates a ripple effect that reduces investment in innovative technology and slows its adoption.
Reliability and conservatism combine when it comes time to introduce a new technology to millions of end users. Every new device requires people to learn new skills — after all, when was it ever necessary to type with your thumbs before the invention of the smartphone? Every new skill, in turn, creates conditions under which end users might fail or encounter unexpected glitches. Each possible failure, whether mechanical failure or user error, inordinately complicates the work of delivering technical support.
Last but not least is the issue of price. While computing organizations can now manufacture new devices almost as fast as the market is ready to use them, the cost of replacing technology every six months would never be acceptable, either to enterprises or to individual users. To maintain profits, technology firms must occasionally stand back to prevent consumer burnout.
Kristopher Mullen holds a master’s degree in computer science and works on a CAD development team building the latest software. He wrote this article on behalf of PETAP.