I disagreed with his points earlier, and I disagree with his points and his reasons now. I think I stand in a pretty good position to do so. As an instructor, I'm not in a position comparable to his; I'm just a part-time high school and middle school instructor. Otherwise, however, I am an aerospace engineer. My work covers a wide range of systems for aircraft, launch vehicles, satellites, and probes. I've done design work for nuclear systems and man-rated systems. I've been doing safety-critical design work for about 28 years now. My work includes mechanical, electronic, and software design--usually more than one at a time on any given project.
So let's take a look at what the "Anti-Java Professor," Dr. Robert Dewar, is saying this time around.
On a high level, the presentation suggests that universities are turning out what are in essence incompetent script-kiddies by using Java in the core curriculum classes for Computer Science. The claimed cause is all those libraries that Java has, insulating the student from "real" programming.
He states that Java programmers aren't employable because their function is too easily outsourced. Without the ability to create "complex systems," their skills aren't up to snuff. His concept of what makes a "complex system" seems circumscribed by an aerospace/defense mindset: he says that Java does not appear in complex systems, only in "trivial" applications, and points to code used in aircraft as the place where Java doesn't appear. C and Ada are called out as the "suitable" languages.
I've been around long enough to remember when C was the latest hazard to our aircraft's software. I remember when our poor students were being underserved by that poor language, Pascal, when the only things you found in "real" applications were languages like JOVIAL, PL/I, machine code, and assembly language. The fuss he makes about Java sounds like the fuss over barbed wire to me. That awful, poorly structured, type-unsafe language C was the end of the world to some. You didn't find it in aircraft 25 years ago. It had all these library functions--how could the programmer know what lurked inside them? How could you validate code written on top of them?
Well, it's been managed. There are compilers and libraries for C that are validated and well documented. Now there's C code flying.
Does that make C a better language? No. There are plenty of other uses for C, from writing entertainment software to system code, that don't require the same level of code validation as an aircraft's instrumentation, communication, and control systems. C came to aircraft because it was an effective language for writing maintainable code, and because there was a huge pool of labor to draw on.
Java has those advantages, too. Maybe it's time to get a JVM validated for use with safety-critical code? That's all it would take to get Java in the air. In fact, it seems to me that a validated runtime environment like a JVM, with the appropriate low-level reliability features, could be a safer software environment than one based on C, or even Ada. In most systems with the space and processing time available (which is many of them now), a base environment with applications running on top of it is typical. In some cases this is a high-reliability RTOS, in others it's a specially written core system. There's no reason it couldn't just as well be a hi-rel JVM.
There's also the matter of limiting one's perception of "complex systems" to aircraft code. Distributed database systems and transaction processing systems don't count as "complex"? Why, because the programmers don't necessarily have to code every algorithm from scratch? Java has been most successful on the server, in some cases acting as a translation layer between an underlying application and the web presentation (what I'm presuming Dr. Dewar considers "trivial"); in other cases it's far more than that--enough to be called "complex" in my book. Does anyone die when this code goes down? Not necessarily, but that's not a measure of complexity.
Does Java make programmers incompetent?
Dr. Dewar claims so. He lists some questions he'd use to screen applicants. What's interesting to me is that I think the average BSCS grad would have had just as much trouble with these questions 30 years ago as they would today. I also consider them to be just as narrowly focused as his view of complex systems. Paraphrased, the questions are:
How would you find an error caused by improper compilation by the compiler?
How would you identify and deal with a bug in your processor's implementation?
For as long as I can remember, the undergrad CS program has required only two classes that even address these areas: one in hardware, one in assembly language. Both are brief introductions to their subject. There isn't time in a one-quarter or one-semester class to get beyond that, particularly when it's the first exposure for the majority of students. Operating systems classes are no longer about system-level code, and haven't been for a long time. There's far more to talk about in an operating systems class than the implementation of program loaders and I/O drivers these days.
The first question could be covered to some degree through the use of a plain debugger. I think the solution he'd look for would be to go a step further and look at the object code. Usually a student will have gone through these steps at some point in their education, but they aren't, and haven't been, routinely performed in most programming classes at the undergraduate level, no matter what language is used.
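To make that concrete, here's a minimal sketch of the kind of exercise I have in mind--my own illustration, not anything from Dr. Dewar's talk. A suspect routine is checked against a deliberately naive reference implementation; if the two binaries disagree at different optimization levels, you drop down to the generated code (objdump -d, or your debugger's disassembly view) to see what the compiler actually emitted. The routines and test values here are invented for the example.

```c
/* miscompile_check.c -- a sketch of differential testing for a suspected
 * compiler bug. Build the same source at two optimization levels, e.g.:
 *   gcc -O0 -o check_O0 miscompile_check.c
 *   gcc -O2 -o check_O2 miscompile_check.c
 * If the two runs disagree, inspect the emitted code with
 * "objdump -d check_O2" (or your debugger's disassembly view).
 */
#include <stdio.h>
#include <stdint.h>

/* The routine under suspicion: a "clever" overflow-safe average. */
static uint32_t average_fast(uint32_t a, uint32_t b)
{
    return (a & b) + ((a ^ b) >> 1);
}

/* A deliberately naive reference the optimizer has little room to touch. */
static uint32_t average_ref(uint32_t a, uint32_t b)
{
    return (uint32_t)(((uint64_t)a + (uint64_t)b) / 2u);
}

int main(void)
{
    /* Exercise boundary values, where miscompilations tend to surface. */
    const uint32_t samples[] = { 0u, 1u, 2u, 0x7FFFFFFFu, 0x80000000u, 0xFFFFFFFFu };
    const size_t n = sizeof samples / sizeof samples[0];
    int failures = 0;

    for (size_t i = 0; i < n; i++) {
        for (size_t j = 0; j < n; j++) {
            uint32_t fast = average_fast(samples[i], samples[j]);
            uint32_t ref  = average_ref(samples[i], samples[j]);
            if (fast != ref) {
                printf("MISMATCH: avg(%u, %u) fast=%u ref=%u\n",
                       samples[i], samples[j], fast, ref);
                failures++;
            }
        }
    }

    printf("%d mismatches out of %zu cases\n", failures, n * n);
    return failures ? 1 : 0;
}
```

Nothing exotic--just a harness, a reference, and a willingness to read the disassembly when the numbers don't line up. That's the skill in question, and it isn't tied to any particular teaching language.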
Hardware classes at the undergraduate level spend a lot of time on flip-flops and gates. The better ones talk about processor and system architectures. Identifying bugs in silicon is usually well out of scope.
There's also this thing called the "internet." Seldom do we have to be pioneers when it comes to using compilers and processors anymore; it used to be that this sort of information traveled slowly. Likewise, if you're writing code for aircraft, you're not going to be an early adopter. Looking up known issues is likely to be a far more useful skill than finding them yourself the hard way. And if you're coding for ASICs, well, expect some OJT and experience between the time they hand you your sheepskin and the time you lay hands on code.
When I went through college, we did have some undergraduates who knew this stuff. Not one of them learned it in school. When you came into any class, you could quickly pick out the enthusiasts who made a hobby of computers and electronics. And this group mostly didn't wait until they had a degree to go out and get a job using what they knew.
There are study programs that encourage a lot of hands-on work and get students out into internships in a real working environment. These are great, but a student can do the same thing on their own as well. By the time I went to college I had held a job as a programmer, a job as an electronics technician, and had run my own company selling electronic and computer systems of my own design. In the case of universities and colleges, my own feeling is that waiting for a "senior project" to do practical work is waiting too long. The problem is that the curriculum is so filled with classes of low value that it prevents a student from both getting a good GPA and holding down a job relevant to their major. It's possible, but it's a good way to burn out a promising programmer, too. Unfortunately, accreditation boards don't seem to reward institutions for getting students into real work, but they do seem to support lots of non-major classes.
An attack on the Java programming language as the source of the job market's ills is misplaced at best. Java isn't the problem. It doesn't make headlines to say that the problem is the same as it always has been: non-major requirements for degrees prevent any depth in a four-year program. The other half of this is a problem that's really no problem at all: Only a fraction of in-major graduates are ever going to be drawn to work that goes beyond creating applications, mash-ups, or other high level code. This is both because of personal preferences and because of the market itself.
The real tragedy occurs when a student goes through a program, gets a degree, and thinks that makes them prepared for any programming job. The answer to this is the same as it always has been. Students need to get their heads out of the school zone. They need to stop relying on the university as their primary source of education. The resources for self-education are better than they have ever been. The opportunities for doing real work with either hardware or software outside of school, whether as a hobby or in an entry-level job, are the best they've ever been.
This is where the real education happens. It also makes the university education more valuable. Students come prepared to learn, and with a desire to learn, when they already have some background. They get the full advantage of the broadening aspects of college. They can interact with their professors at a much higher level, because they aren't having to learn the vocabulary first and they already have at least a general grasp of the basics. The elements that fit into theoretical frameworks are familiar to them--they aren't memorizing them by rote and hoping that it'll all make sense somewhere down the road.
Among the solutions Dr. Dewar calls for to address the problem of unprepared programmers is one-on-one mentoring. This isn't realistically available to most undergraduates, at least not in school. It is available to interns, however. It's also available to hobbyists who are able to find a group of fellow enthusiasts.
One of the biggest lessons I've learned in my many years of working, interviewing, and managing projects is that specific technical skills are far less important than outlook and character. Technical skills are the easiest skills to teach. They're also the skills that must be updated and replaced regularly.
Actual on-the-job work that's going to be handed to a new hire is typically work that already has an established procedure. That means it's going to be relatively easy to teach, and that the situations that arise in the performance of that work are pretty well known. That includes the identification of compiler and hardware bugs. What becomes of paramount importance is the team itself and its willingness to share information, and the character of the new worker and their willingness to partake of the information and shared lore of the rest of the group.
This isn't to say that one would want to hire someone to program C or assembly who doesn't have any relevant experience. But it also doesn't mean that using Java makes them a bad candidate.
It may be time-consuming to interview candidates, but turning away a candidate can be an educational process. If candidates are turned away without getting a sense of where they failed in the interview, then it's just a waste. But if they come away knowing that they couldn't answer a battery of questions on handling registers in C, then they have information they can act on.
They may come away with the knowledge that the job was not for them. Or they can buck up, go home, download a C compiler and some resources, and get cracking. They may end up delivering pizzas for a few weeks, but chances are that'll give them even more incentive to press on with their own studies. In less than a semester, they can acquire the skills to manipulate registers in C (a sketch of the basic idiom follows), debug object code, and familiarize themselves with the practices of writing hi-rel code, no matter what language they studied in school.
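For the register-manipulation part, the core idiom fits on a page. Here's a hedged sketch using invented addresses and bit layouts for an imaginary UART peripheral--on real hardware these would come from the part's datasheet or vendor headers, not from anything here.

```c
/* uart_regs.c -- a minimal sketch of memory-mapped register access in C.
 * The addresses and bit assignments below are invented for illustration.
 */
#include <stdint.h>

/* Invented register addresses for a hypothetical UART peripheral. */
#define UART_BASE     0x40001000u
#define UART_STATUS   (*(volatile uint32_t *)(UART_BASE + 0x00u))
#define UART_DATA     (*(volatile uint32_t *)(UART_BASE + 0x04u))
#define UART_CTRL     (*(volatile uint32_t *)(UART_BASE + 0x08u))

/* Invented bit assignments. */
#define UART_STATUS_TX_READY  (1u << 0)
#define UART_CTRL_ENABLE      (1u << 0)

/* Enable the peripheral without disturbing other control bits
 * (read-modify-write on a volatile register). */
static void uart_enable(void)
{
    UART_CTRL |= UART_CTRL_ENABLE;
}

/* Busy-wait until the transmitter is ready, then write one byte.
 * "volatile" keeps the compiler from caching the status read. */
static void uart_putc(char c)
{
    while ((UART_STATUS & UART_STATUS_TX_READY) == 0u) {
        /* spin; in hi-rel code this loop would carry a timeout */
    }
    UART_DATA = (uint32_t)(unsigned char)c;
}

void uart_puts(const char *s)
{
    uart_enable();
    while (*s != '\0') {
        uart_putc(*s++);
    }
}
```

The vocabulary--volatile, read-modify-write, polling a status bit--is what an interviewer is really probing for, and it's learnable in weeks with a cheap development board, regardless of what language the degree program used.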
If a student comes out of a university not knowing how to expand their education in this way, then the university has done them a far greater disservice than the use of any language you care to name.
Debating Java's use in education is a sure headline-grabber these days. It's a cheap and easy ploy, and it's intellectually dishonest. Java is not the problem, any more than Pascal was when it replaced Fortran, or C was when it replaced Pascal in the educational curriculum. In each case the universities were making the change to serve their students.
The problem is the wall between education and reality. Some institutions do a lot to get past this, but not many, and even that's no guarantee that a student is going to get any advantage. If students are using Java to write "trivial" web apps, then at least they are on the web. They're getting far more exposure to reality than any student who spent the same time writing a bubble sort in Fortran, or Pascal, or C. For them the ideas of networking, clients, and servers are like air: they're just there. They start with a lot of knowledge they don't even realize they have. Sure, they can't debug a bad instruction in a processor, but how many CS graduates of the past 30 years could? The ones with an EE minor, maybe.
And if you want to limit the time you spend on interviews, put a section in the application for hobbies and magazines read. If you don't see something like electronics, robotics, or microcontrollers among the hobbies, or something like Circuit Cellar Ink, Embedded Development, or other trade journals among the magazines, the candidate probably shouldn't be on your 'A' list if you want someone who can debug assembly and processors.
I've updated this article to use the proper form of address for Dr. Dewar. At the time I wrote it, I failed to find a reference for this. Since then a reader, "aquerman," has provided me with the appropriate information. No disrespect toward Dr. Dewar was intended through the earlier failure to use his proper title, regardless of our divergent views on the role of Java with respect to programmer incompetence.