December 2011 (Vol. 44, No. 12) pp. 116, 114-15
0018-9162/11/$31.00 © 2011 IEEE
Published by the IEEE Computer Society
The Profession and Digital Technology
What is the computing profession, and where is it going?
This column started in the last year of the past millennium. In the opening essay (June 2000), I stressed the great need for computing professionals to clearly define the nature of their profession and to accept the social responsibility that goes along with being a professional.
In this, the last essay of The Profession column, my aim is to review those two aspects and to point to past essays in the column that are particularly relevant. To keep such pointers unobtrusive, they are shortened to a publication date noted in parentheses.
While increasing numbers of students are graduating from universities with degrees in areas such as computer science and information systems, the IEEE Computer Society's membership hasn't been increasing. Why not?
I believe this is occurring because such graduates, and indeed the general public, aren't clearly aware of the existence of a computing profession. This confusion is greatly encouraged by the stupid IT initialism.
A software engineering profession does seem to be emerging. For example, a recent issue of American Scientist features an article titled "Empirical Software Engineering," which begins by suggesting that "software engineering is now at a turning point comparable to the dawn of evidence-based medicine" and that "interest in it has exploded over the past decade" (tinyurl.com/AmScSwE).
The article depicts software engineering as combining and contrasting a focus on programmers with a focus on what they produce. My argument against this view was first given in this column exactly 11 years ago and has been repeated many times since. Briefly, just as other engineers use and supervise technicians, so should software engineers use and supervise programmers (Sept. 2002). I don't understand why this isn't generally accepted.
The design and manufacture of computers seems to be adequately covered within the traditional electronic branch of engineering.
However, the nature of the computing profession, if software and electronic engineering are considered separate from it, is complicated by the increasing overlap with the communications field. Digital technology and its needs are now dominating the communications field (tinyurl.com/CcScMgz), somewhat as the Internet and smartphones are dominating computing. And like the computing field, both software and electronic engineering support communications. Should the two fields combine?
In any case, the burgeoning growth of digital technology and the proliferation of people exploiting it have led to a blurred perception of people who work in the computing field.
Most computer users merely run programs as a routine part of their working or personal life. In a way, whether receptionists, supermarket checkout clerks, online game players, teachers, students, website builders, or professionals, they're really program users. For example, my dentist recently used a remarkably sophisticated set of programs to design and make a porcelain crown for one of my teeth.
Such computer users aren't computing professionals. The computing professional, it seems to me, needs to be someone who acts as an intermediary between software engineers and users who need serious computing systems designed and implemented to meet their professional needs. Even scientists need help from computing professionals (tinyurl.com/NtrScPr).
In other words, computing is best seen as a secondary profession, one that helps other professions get maximum benefit from the application of digital technology. If this is accepted, then university computing education should give students a thorough grounding in two professions: software engineering for both computers and networks, and whatever profession the individual student wishes to work in (Jan. 2007).
Whether or not my suggestions about the computing profession are valid, the contrast between the proliferation of digital machinery and the somnolence of professional computing organizations is surely highly significant. Something needs to be done. The social consequences of the proliferation are highly dangerous, and helping to counter the dangers needs the attention of a healthy computing profession.
The spread of digital technology is having many effects of various kinds. Whether any particular effect is harmful or beneficial is subjective. For someone with a profitable use of computers in business, that use is beneficial, but the effect of that use on others might well be harmful (tinyurl.com/MagPrGm).
The mounting use of the Web to acquire news and other information is beneficial for Web-based firms like Google, which can offer more specific services than newspapers and television. It's also great for advertisers because their online messages can be more effective, partly because it's more difficult to avoid them. But for the website user, these messages can have a harmful effect (tinyurl.com/GnGmAdv).
The effects of digital technology can be immediate or delayed, and they can be social or personal, local or global. There currently are many such effects, and there will be many more.
One huge class of social effects hinges on money. Money has always been, in principle, a digital technology. For a long time, coins had actual value, and representing value is supposed to be money's role. But when banks were established to hold and protect coins and bullion, they turned to issuing bank notes that had no intrinsic value, and the rot started to set in (Jan. 2009). Banks and other financial institutions have enthusiastically adopted modern digital technology, which has amplified the rot in many ways.
The credit crunch of a few years ago was caused by private banks, supported by government, creating fractional reserves—that is, money without anything of real value to back it (tinyurl.com/GnGMDbt). This is akin to borrowing money without any immediate exchange of value, as in the case of credit cards, or with insufficient surety, as in the case of overmortgaged homes. Very dangerous.
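The arithmetic behind fractional-reserve money creation is a simple geometric series: each deposit is partly held in reserve and the rest lent out, becoming a new deposit. A toy sketch (the function name and figures are illustrative, not from the column) shows how a modest reserve ratio multiplies the money supply:

```python
def money_created(initial_deposit, reserve_ratio, rounds=100):
    """Simulate fractional-reserve lending: each round, the bank keeps
    reserve_ratio of the deposit and lends out the rest, which then
    returns to the banking system as a fresh deposit."""
    total = 0.0
    deposit = initial_deposit
    for _ in range(rounds):
        total += deposit
        deposit *= (1 - reserve_ratio)
    return total

# With a 10% reserve ratio, $1,000 of base money supports roughly
# $10,000 of deposits -- the series converges to deposit / ratio.
print(round(money_created(1000, 0.10)))  # → 10000
```

The lower the reserve ratio, the larger the multiple of money circulating without any corresponding exchange of real value, which is the danger the column points to.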
In financial institutions such as stock exchanges, modern digital technology allows transactions where no immediate value is involved because computers can be programmed to transact with other computers (tinyurl.com/MAgHFTr). Money is a social artifact intended to preserve as well as transmit value, but there's nothing social about trade between computers.
The use of digital technology to exploit monetary transactions benefits the people who are in charge of the exploitation. Such people become super rich. "Between 1947 and 1979, productivity in the US rose by 119 percent, while the income of the bottom fifth of the population rose by 122 percent. But from 1979 to 2009, productivity rose by 80 percent, while the income of the bottom fifth fell by 4 percent. In roughly the same period, the income of the top one percent rose by 270 percent" (tinyurl.com/GnGMWlt).
Given that possession of money has become decoupled from the representation of value, digital technology is of obvious value in criminal activities, such as the laundering of money (tinyurl.com/MAgMyLn), the avoidance of taxation (tinyurl.com/MAgCrpTx), and the marketing of illegal drugs (tinyurl.com/MAgInDr).
One of the social problems of digital technology is that it's equally available to enterprising criminals and enterprising businessmen. How long will it be before burglars and murderers begin exploiting robots and drones? Then there's terrorism (Nov. 2001) and climate change (Feb. 2005).
As far as the computing profession is concerned, there's a responsibility for anticipating harmful and criminal uses of digital technology and for pressing for measures that will at least minimize their incidence.
One of the earliest personal effects of digital technology resulted from its use in automating, fully or partly, many kinds of paid work. Not only did this change jobs, but it also put people out of work. Indeed, management often saw the potential of automation for reducing the workforce, and simply directed systems analysts and programmers to take as much of the work and responsibility as possible away from other employees.
Occasionally, responsible professionals succeeded in persuading management to focus instead on helping users do a better job (Nov. 2004). Unfortunately, this approach is rarely adopted.
Until about a decade ago, the computing industry focused on the needs of government and industry. Nowadays, a much larger industry is dominated by personal use: consumerization. The growth of smartphone use is extremely rapid, and tablets are also coming to the fore, yet traditional PCs and laptops are still very popular (tinyurl.com/EcTqySy).
Of course, the most conspicuous personal effect of consumerization is the general use of PCs in everyday life. This is somewhat like the earlier effects of television. People spend more time using the computer and less time interacting socially. There's an obvious effect on personality and physical health, especially with video-gaming (tinyurl.com/WpVGmAd).
Smaller devices like smartphones and iPods have become like items of clothing. People are plugged into them for most of their waking hours. When you see them walking toward you in the street, they look somewhat like robots—no life in their walk and no eye contact—and they sometimes don't look around when crossing a road (tinyurl.com/MAgPdAc). Some people talk or text on their mobile phones while driving their cars, even though they know that it's both dangerous and illegal (tinyurl.com/MAgMbAc).
One of the alarming things about the new small computers is that many children carry them about all the time and even use them surreptitiously in class. This is alarming because children's and teenagers' brains are plastic, continually developing according to their experiences. If they spend an inordinate or inappropriate amount of time interacting with a computer, their brains develop to do this more effectively, and they fail to develop skills related to interpersonal interaction. There's some evidence "among US college students … [of] a trend of decreasing empathy during the same time that social networking has risen to prominence" (tinyurl.com/MAgSGfd).
Lack of empathy—computer-induced autism—is only too likely to occur among young people who use videogames and social media like Twitter and Facebook excessively. Some would argue that using such media helps develop social skills, but the relative absence of face-to-face conversational interaction means that it's selfishness and self-consciousness that are being developed.
It's difficult to see what a professional can responsibly do to help counteract these effects of computer consumerization. Certainly, the dangers can be documented and publicized. In addition, odd specific things can be done, such as combining a workstation and a treadmill (tinyurl.com/JLvTmWs).
One extremely important social effect to tackle is computer-induced autism. It might be possible to use digital technology to help alleviate the disease, given that neuroplasticity is possible in later life (Sept. 2009). But it would be much more effective to prevent it from happening in the first place. This can be done at school with support from the computing profession.
The problem to be solved, especially in early schooling, is how to free teachers and parents from direct responsibility for students' knowledge acquisition so they can focus on fostering the development of empathy. The solution lies in using computer-delivered drills to develop basic abstract skills and knowledge that can then provide the basis for social interaction in the classroom and at home (Mar. 2008, Sept. 2009).
Developing the programs to do this and instructing teachers and parents in their proper use would be the most beneficial social responsibility the computing profession could undertake. It would be tremendously effective if it were done internationally.
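A computer-delivered drill of the kind proposed above can be very simple. The sketch below (function names and the single-digit addition format are mine, chosen only for illustration) generates deterministic drill items and scores a pupil's responses, leaving the teacher free for classroom interaction:

```python
import random

def make_drill(n_items, seed=0):
    """Generate a reproducible list of single-digit addition items."""
    rng = random.Random(seed)
    return [(rng.randint(0, 9), rng.randint(0, 9)) for _ in range(n_items)]

def score_drill(items, answers):
    """Return the fraction of answers that are correct."""
    correct = sum(1 for (a, b), ans in zip(items, answers) if a + b == ans)
    return correct / len(items)

items = make_drill(5)
print(score_drill(items, [a + b for a, b in items]))  # a perfect run → 1.0
```

A real drill program would add pacing, feedback, and progression through skills, but even this skeleton shows how routine knowledge acquisition can be handed to the machine.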
My ranting here might seem rather negative. However, I do feel that much could be achieved for and by the profession through the IEEE Computer Society. Readers' support for the Society's activities, helping both to expand and to guide them, could be of great value.
Anyone volunteering to take an active part in the Society can be sure of the help and encouragement of some wonderful people. I am confident of this because I could not have kept up this column over the years without the support of volunteer authors, letter writers, the editorial board, the various editors in chief, and especially the editorial staff. The experience has been gratifying and the people memorable.
Selected CS articles and columns are available for free at http://ComputingNow.computer.org.
Neville Holmes is an honorary research associate at the University of Tasmania's School of Computing and Information Systems. Contact him at firstname.lastname@example.org.