Letters

Issue No. 3, May/June 2003 (vol. 20), pp. 8-13
Published by the IEEE Computer Society
SO YOU WANT TO BE IN THE MOVIES?
Warren Harrison hit the proverbial "nail on the head" with his January/February From the Editor column, "The Software Developer as Movie Icon." I'm a recent college graduate who managed to find gainful employment in software development after graduation. Some of my friends who are still in college pursuing computer science degrees have little idea what's in store for them when they enter the workforce. When I read your article, I couldn't help but think about my own education and how easily it divides into two markedly different eras: the first two years and the second two.


During the first two years, all my computer science courses taught basic programming, data structures, and algorithms. During the second two years, however, courses began to involve more and more teamwork, which I found to be "hoops to jump through." My Software Design and Documentation course seemed the pinnacle of useless activity. Who needs to gather requirements—aren't they always provided for the developer? Who needs to "design" a system—can't we simply code our modules on the fly? And while we're at it, why bother commenting my code or naming my variables in a sensible fashion? I'll be able to figure out what I was thinking.
It's been eight months since I entered the workforce. All these questions and more have been easily answered one hundred times over. Maintaining others' code, tracing bugs in code I wrote six weeks ago (never mind six months ago), and dealing with the entire development process (from business process redesign meetings to change requests in a production environment) have all been interesting new learning experiences.
I now find it unfortunate that there was not more of a marriage between my college's information technology curriculum and its computer science curriculum. IT majors might know how to estimate project budgets and write project proposals, and CS majors might know how to properly implement a design specification or find a workaround for some project limitation, but after graduation those skill sets don't necessarily overlap.
Thank you for bringing to light some of the problems out there in our (relatively) young field. Here's to the hope that the study of software engineering keeps growing strong.
James Bogosian, RIS Regulatory Systems, Merck Research Laboratories; james_bogosian@merck.com
I agree wholeheartedly with your assessment that, in general, Hollywood does a poor job of portraying our profession. Your examples are indeed typical. Something you didn't mention, though, is the myriad bit parts played by often unknown actors in movies that have nothing directly to do with software but nevertheless portray software professionals as incompetent, overly zealous, antisocial, or otherwise "nerdy." These portrayals, while insignificant to the movie plot, add to the damage done to the software profession's image and thereby dissuade young people from seeking such careers.
Bob Pedigo, Consultant; rmpedigo@ieee.org
Warren Harrison's January/February column wasn't quite what I expected. I might meet the stereotype from the movies in some ways, but not at all in others. I am a professional software developer for the 20th largest company in the world.
When I went to college, I envisioned writing most of my code single-handedly. And in my real job, I do.
When I went to college, I envisioned designing relatively complex systems from simpler parts, and in my real job, I do.
You did list some skills that I certainly need from time to time. I must be organized just to keep up with my huge workload. I need some social skills, to get along with coworkers who I think have disliked me since a previous life. Occasionally, I must be able to put together coherent presentations and speak in front of a small crowd. Most importantly, I must be able to sit through endless hours of meetings to pick out the little bit of meat that I need to know to actually design a program to do what the client wants. Most people who want a system to perform some function don't know how to dissect it like a programmer does. And of course, I as a programmer have no idea about EPA regulations, bank transaction protocols, and so on until someone teaches me. Sometimes it can take a bit of work to get the two skill sets to meet in the middle to the point that I can actually start writing code.
Noah Silva, Programmer analyst; nsilva@atari-source.com
You're absolutely right that today's employers expect developers to have good team spirit and quite good communication skills. A more important question is, Are excellent technical skills still important nowadays? Developers who are required to have customer contacts and give regular presentations undoubtedly have less time available to improve their software development skills.
Geert Poels, Senior software developer; geert.poels@skynet.be
Your article was entertaining and dead-on accurate.
I manage a group of developers at a software company. I agree with your assessment that interpersonal skills and the ability to communicate effectively are every bit as important as technical skills in terms of what makes a solid professional developer.
Michael Wisniewski, Supervisor, interface development, Applied Systems, mwisniewski@appliedsystems.com
Great article on the software developer as movie icon!
You say that most of the kids in universities don't "learn the business" or gather real-world experience with respect to teams, specifications, or development processes. However, most computer science professors have little, if any, real-world experience! So they wouldn't be very effective at teaching these concepts anyway. This is what co-op and internship jobs are for.
John Selbie; jselbie@hotmail.com
I concur that we generally do not present a clear idea of what coders or software architects do. I took a roundabout path myself to getting "Software Developer" on my business card. I clearly remember expressing interest in high school in pursuing a career in computer science, but concluding that, even though I didn't really know what a programmer did, I didn't want to do it.
John D. Verne, Software developer, tactical development, MKS; jverne@mks.com
I enjoyed your article contrasting people's perception of software developers versus the actual personality requirements of real software developers. My question to you is this: If software development is not the right career path for "brilliant, socially awkward young people who code on the fly [and] think they can single-handedly develop ultracomplex systems," where do you see those people fitting in as productive citizens of private industry? This question comes from a physicist turned economist, turned marketer, turned AI researcher, and currently doing a stint in software development.
Francisco Gutierrez, Artificial intelligence, software developer, The Dante Group; francisc@alumni.caltech.edu
I read your article on programmers portrayed in film, a very interesting piece. As far as I can tell, you're basically saying that our profession attracts the wrong kind of people based on the media image.
OK, that may be.
But what would be the right line of work for the people currently coming to computer science?
Mikko Kurki-Suonio, Software engineer; mikko.kurki-suonio@iki.fi
Your article hit the nail on the head. I work as a developer. I have found it hard to believe, but true, that in big corporations with larger teams, you need really good political skills to survive, which means you need to be a really good communicator.
Ilango Veerasingam, Information systems engineer, The Vanguard Group; ilango_veera@vanguard.com
As a mechanical engineer by training and software engineer/sys admin by trade, I think part of the problem is that anybody can be a software developer because it's so cheap to tool up.
Ultimately, the process of engineering software really isn't that different from engineering anything else. The difference is, you won't find Boeing or General Motors hiring a music major to do aerospace or mechanical engineering just because they've ridden on an airplane or have been driving for the last 20 years. But a semiconductor company will hire music majors to code just because they've been hacking (poorly, at that) for the last 20 years. These hacks live the five bullet points you listed.
Although I agree that Hollywood sensationalizes software engineers, I would look more closely at the people who actually work in the industry. I think you'll find more hackers than engineers.
Jiann-Ming Su, Development team, systems administrator, Emory University General Libraries, Systems Division; js290@bellsouth.net
I do believe that the computer science field in college suffers horribly from being nothing like the industry. Those of us (like me) who were there during the dot-com boom and had coded before coming to college at least have a concept of what real development is like. By contrast, those with no previous experience get thrown into an algorithms class where the focus is on independently developing short, hackish programs that use clever tricks! By the time they even get to higher-level classes, the damage has been done—their coding style is sloppy, and they are not used to showing their code to other people or even interacting with others. In addition, this type of course structure discourages people who are more social and like to work in teams from joining computer science. Those intro classes really scare them away. The essence of software engineering is not the algorithms; it's the interaction and integration between people and their code.
Artem Pyatakov, Computer science student, Princeton University; pyatakov@princeton.edu
When I entered the workforce three years ago as a full-time engineer, I thought I was going into my dream job. I was going to get paid to write software. What could be better?
I fell in love with programming in elementary school because it was something I was exceptionally good at, and when I was programming, I didn't have to talk to other kids. I had a lot of trouble relating to my peers, largely because I wasn't surrounded by kids whose parents were also engineers—I was surrounded by the kids of truck drivers, ranchers, and nonengineering professionals. Even if I had wanted to talk to others about my programming hobby, almost no one I knew had any interest in or knowledge of computers.
In this kind of an environment, I was encouraged to spend hours by myself with the computer and a few manuals, climbing the daunting initial learning curve that's required to be able to program a computer well. The computer never made fun of my braces to see if I'd break down and cry. It was easier and more rewarding for me to learn how to program than to learn how to relate to people better. The social circumstances I grew up in and the nature of learning to program made me want to be a programmer. I only identified with Matthew Broderick's character in WarGames and Jeff Bridges' character in Tron because I was already like them.
Looking for media influences is not going to answer your question. Look instead at the psychology and environment that make someone a good programmer. There is a better answer there.
Jimmy Rimmer, Software engineer, CenterComm; jrimmer@centercomm.com
I am a computer engineer working as an independent consultant, and I've found there is a difference between what the software development process really is and what people have come to believe it is. I remember a discussion with my friends right after Jurassic Park (Part 1) came out. The real fantasy in that movie was not the fact that dinosaurs could be "cloned" by mixing DNA strands from prehistoric, fossilized blood and current amphibians, but the fact that a six-year-old could sit at a Unix terminal, just say, "Oh … this is Unix System …," and begin controlling all of the park's access controls and systems. I agree that this type of portrayal of systems and software by films is detrimental to attracting the right people into software development, but movies would not sell otherwise. Do you think The Matrix would have been such a hit if it had real C++ code scrolling across the screens and real user training sessions instead of "downloading" knowledge to your brain?
Jose Robles, Consultant; j.robles@computer.org
The Art of Enbugging
The minute I laid eyes on the cover of this year's first issue, "The Art of Enbugging" intrigued me. I started skimming the article to satisfy my curiosity and was quickly captivated. Needless to say, I did not let it go until I finished reading it twice. It is a great article—concise, easy to read, and extremely useful.
When software developers switch from using a structured methodology to an object-oriented one, the first thing they must learn is how to write what the authors call "shy code." This is such a simple, natural metaphor that it can be extremely useful and powerful when mentoring experienced structured designers. This is not to say that writing shy code does not apply to structured programming. It should. Often, however, the structured program is not really that structured, and the functions are all gossiping about a global memory that sits naked with no curtains pulled over its windows. Learning to write shy code is not an easy task because the programmer is used to "living" (that is, designing and coding) in a "small town" (application), where everybody knows everybody and where one can't sneeze on one side of town without the other side hearing everything about it. This might be acceptable in small applications, but in today's complex software, it's a sure road to failure.
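To make the metaphor concrete, here is a minimal C++ sketch of the two styles; the Park class and all its names are purely illustrative, not from the article:

```cpp
// Gossipy: "naked" global state that any function can read and mutate.
struct ParkState {
    int visitorCount;
    bool gatesOpen;
};
ParkState g_park;                                  // everybody in town can see it

void admitVisitor() { g_park.visitorCount++; }     // any function pokes at it directly
void closeForStorm() { g_park.gatesOpen = false; }

// Shy: the same data hidden behind behavior; callers ask, they never rummage.
class Park {
public:
    void admitVisitor() { ++visitorCount_; }
    void closeForStorm() { gatesOpen_ = false; }
    bool isOpen() const { return gatesOpen_; }
private:
    int visitorCount_ = 0;                         // state stays behind the curtains
    bool gatesOpen_ = true;
};
```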
Another technique I found useful in identifying gossipy objects is analyzing the sequence diagrams that describe their interaction in the system. This technique also ties into Demeter's law, described by the authors. Sequence diagrams describing a gossipy object tend to look like a fork: the problematic object sits at the base of the fork while the other objects sit at the end of each tine. This visual pattern is easy to identify, and it is effective. The pattern occurs because the gossipy object keeps prying into the other objects' state, telling one what the other one knows, or using the information to perform an almighty function. Better-behaved objects, obeying Demeter's law, mind their own data and then pass responsibility to another well-behaved object with which they have a direct relationship. The sequence diagram describing well-behaved objects looks like a staircase, where each step represents an object's processing followed by a message to the next one, and so on.
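In code, the fork and the staircase look something like this minimal C++ sketch (the Customer and Wallet classes are hypothetical):

```cpp
#include <stdexcept>

class Wallet {
public:
    void deduct(double amount) {
        if (amount > balance_) throw std::runtime_error("insufficient funds");
        balance_ -= amount;
    }
private:
    double balance_ = 100.0;
};

class Customer {
public:
    Wallet& wallet() { return wallet_; }              // gossipy: hands out internals
    void pay(double amount) { wallet_.deduct(amount); } // shy: delegates instead
private:
    Wallet wallet_;
};

// Fork: one caller pries into Customer's internals (violates Demeter's law).
void chargeGossipy(Customer& c, double total) {
    c.wallet().deduct(total);   // caller -> Customer -> Wallet, all from one spot
}

// Staircase: each object does its own step and passes the message along.
void chargeShy(Customer& c, double total) {
    c.pay(total);               // Customer delegates to its own Wallet
}
```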
Besides writing shy code, there is another effective antidote to software enbugging: software antibugging. We know we enbug our software. No matter how careful or experienced we are, in the end we make mistakes because we're human. Just as we take a flu vaccine before we catch the flu, we can write code that detects bugs for us and lets us eliminate them before they cause harm. There are many well-documented ways to antibug software, including asserts and class invariants, exception handling, smart pointers, and advanced error-handling mechanisms. Asserts help you code your assumptions and tell you when those assumptions are wrong. Exception handling is unforgiving and forces you to handle all your errors where they appear, as soon as they appear. Smart pointers prevent resource leaks and the use of uninitialized data. Finally, smart error-handling mechanisms can log errors by type and priority, work in release builds, and tell you the exact time and place an error occurred even when the software was running unattended on a user's machine. Because this kind of code doesn't add new features, doesn't correct existing functionality, and in general is invisible to users, it tends to be left out. Antibugging your software costs resources, but they are resources well spent: they will save you many long nights in the office trying to fix bugs before the morning release.
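As a minimal illustration, here is a C++ sketch of three of these techniques; the Buffer class is hypothetical:

```cpp
#include <cassert>
#include <memory>
#include <stdexcept>
#include <vector>

class Buffer {
public:
    explicit Buffer(std::size_t capacity) : data_(capacity) {
        // Exception handling: a bad argument cannot be silently ignored.
        if (capacity == 0)
            throw std::invalid_argument("Buffer capacity must be nonzero");
    }

    void write(std::size_t index, int value) {
        assert(index < data_.size() && "index out of range"); // coded assumption
        data_[index] = value;
        assert(invariant());                                  // class invariant check
    }

private:
    bool invariant() const { return !data_.empty(); }
    std::vector<int> data_;
};

int main() {
    // Smart pointer: the buffer is released on every exit path,
    // so there is no leak and no dangling use.
    auto buf = std::make_unique<Buffer>(64);
    buf->write(0, 42);
}
```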
Magdin Stoica, Technical lead, Atlantis Systems International; magdins@computer.org
Dave Thomas and Andy Hunt respond:
Thank you for your kind comments. The image of naked global data sitting with the windows open will stay with us for a while. Your technique of using sequence diagrams to identify promiscuous objects is an interesting one, something we'll investigate further. Regarding your point about antibugging: we agree, and promote many of these techniques in our books, articles, and courses.
However, every line of code written is a potential bug, and the more antibugging code you write, the greater the chance that you'll introduce even more problems into your system. As with most things, defensive coding taken to extremes can be dangerous.
When to Make a Type
The column by Martin Fowler ("When to Make a Type") in the January/February 2003 issue was valuable because, although most modern languages let you define new types, this facility is often incompletely understood. I remember being taught that a type is a set of values and the operations on those values. Many programmers define a plethora of new types (sometimes every declaration seems to have its own type!) but then fail to define the relevant operations. The fact that a type has two aspects must be stressed.
I cannot agree with Martin's suggestion that types such as money should be built into a mainstream programming language. Traditionally, certain basic, useful types (often supported by hardware) have always been built in. Beyond that, the programmer should define extra types, as necessary. A well-designed language will make this easy to do.
It is important in software engineering to separate concerns, in this case to separate how things are expressed from what is expressed. It is the language's job to provide the "how." Monetary calculations, albeit important, belong to the "what" category, so these facilities should be provided by defining a Money class. Of course, the compiler writer could provide such a class so as to, for example, target a particular market. In this way, useful types could effectively come with the language without being part of it. Remember that although facilities for monetary calculations are essential to some people, a great many others have no use for them at all—and might well object to being forced to pay for them.
The money example is very helpful. Because every money value corresponds to a (nonnegative) integer, the money type is often derived from a built-in integer type. But the operations on money are not identical to those on integers, as anyone who has tried to use square dollars will have discovered. A feature of a well-designed modern language should be the ability to say (using the money example for dollars), "I want a type that has the values of the integer type, I want it to have the addition operation, but I do not want to be able to multiply dollars by dollars." Another sometimes overlooked concept is the idea of copies of types. Thus, a program that handles both dollars and euros should have a type for each. Although the two types will be identical in that they have the same values and operations, the types are distinct, making it impossible to add 2 euros to 3 dollars while forgetting about the exchange rate.
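A minimal C++ sketch of such a type, with all names purely illustrative: the Currency tag makes Dollars and Euros distinct types even though their values and operations are identical.

```cpp
#include <iostream>

template <typename Currency>
class Money {
public:
    explicit Money(long cents) : cents_(cents) {}
    long cents() const { return cents_; }

    // Money + Money of the same currency is meaningful...
    Money operator+(Money rhs) const { return Money(cents_ + rhs.cents_); }
    // ...as is scaling by a plain number...
    Money operator*(long factor) const { return Money(cents_ * factor); }
    // ...but Money * Money ("square dollars") is simply never defined,
    // so the compiler rejects it.

private:
    long cents_;
};

struct USD {};   // tag types carry no data, only identity
struct EUR {};
using Dollars = Money<USD>;
using Euros   = Money<EUR>;

int main() {
    Dollars d(300);
    Euros   e(200);
    Dollars sum = d + Dollars(150);   // fine: $3.00 + $1.50
    // Dollars bad = d + e;           // compile error: no implicit exchange rate
    // auto square = d * d;           // compile error: square dollars don't exist
    std::cout << sum.cents() << " cents\n";
    (void)e;
}
```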
A little point is that the distinction between ordinal and cardinal numbers, which applies to time and also to ages, is not always realized. Thus 2003 AD is different from the duration of 2003 years that have passed since AD 0. Adding durations to get a longer duration, or adding a duration to an age to get an older age is valid, but adding two ages is probably an error.
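C++'s standard std::chrono library encodes exactly this ordinal/cardinal split, distinguishing time points from durations; a brief sketch:

```cpp
#include <chrono>

int main() {
    using namespace std::chrono;

    // Durations (cardinal): adding two durations yields a longer duration.
    hours shift(8);
    auto doubleShift = shift + shift;      // fine: 16 hours

    // Time points (ordinal): a point plus a duration is a later point...
    auto now = system_clock::now();
    auto later = now + shift;              // fine

    // ...but adding two time points has no meaning, and the library
    // deliberately defines no such operator:
    // auto nonsense = now + later;        // compile error

    (void)doubleShift;
    (void)later;
}
```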
It is true that the definition of new types can sometimes seem never-ending. A rule I use is that new types should be defined for data that is visible at the design level but not for data that exists entirely within algorithms. This is, however, a rule of thumb, so it is not uncommon to find preexisting types used for global data and specially defined types for local data. A more complete set of criteria might be

    • Does this data have a range of values different from any other type yet defined?

    • Does it have a set of operations different from any other type yet defined?

    • Must it be kept distinct from other data that has the same values and operations?

    • Last and least: Is the data's scope such that a change to it would affect a large part of the software?

Chris Pursglove, PolyAdic Software; cpursglove@aol.com