Two computing practitioners from an Ada shop in New York, Dr. R. B. K. Dewar and Dr. E. Schonberg, who are also professors emeritus at New York University, have recently slammed Java as a first programming language. Their article has received quite a bit of attention and sparked wide discussion.
I think they are completely barking up the wrong tree.
Dewar and Schonberg report some observations, and then jump to conclusions that are not in any way supported by those observations or their argument.
Specifically, they state that today’s students lack certain skills (low-level programming and formal methods), and then go on to blame the use of Java as an introductory language for this problem.
To state my conclusion upfront: they describe a badly designed curriculum, and then blame a single programming language for the curriculum’s problems.
The fallacy in Dewar and Schonberg’s argument is mainly that they imply that the use of Java means that low-level material, such as pointer arithmetic and runtime complexity, is no longer being taught at all.
While it is true that this is the case in some institutions, and that where it happens it is indeed a problem, it is not necessarily connected with the use of Java as a first language. True, Java isolates the programmer from direct pointer manipulation, so this is an area that is not practiced in introductory Java courses. However, I have never seen anyone argue that this means the material should not be taught anymore.
In many good institutions that I am familiar with, including my own, this material has been moved to other, often second-year, courses. From what they describe, this may not be the case at NYU. That is then indeed a hole in the curriculum. But blaming bad curriculum design on the programming language is a simplistic and misleading argument that lacks real insight.
Dewar and Schonberg complain that “the Java programming courses did not prepare our students for the first course in systems”. Well, tough luck. If they expect C programmers, a Java course indeed would not prepare them well. Nobody said that teaching Java would magically produce C programmers. If we want students to be competent C programmers (and we certainly want that for our computer science students, but not necessarily for the multimedia students) then we need to teach C as well.
They are essentially complaining that, now that we no longer teach our students C, they no longer know C. Well, then teach them C before you expect them to use it. But this still does not amount to an argument for why Java should not be the first language. Java as a first language has many advantages over C (or Ada, which the authors, somewhat desperately, try to present as a necessary modern mainstream language).
The authors state that “Because of its popularity in the context of Web applications and the ease with which beginners can produce graphical programs, Java has become the most widely used language in introductory programming courses”. Here, they fundamentally fail to understand why so many educators have adopted Java as a teaching language.
The truth is, it was neither because of the web, nor because of graphics, but because Java represents a reasonably clean, manageable, nicely designed and very well supported implementation of a modern programming paradigm: object orientation. It certainly represents object orientation, which educators decided they want to teach, better than C, C++, or the authors’ favourite: Ada.
Java allows the most important concepts of programming, such as abstraction and professional-quality program construction, to be discussed first, so that low-level details can be filled in later, in courses using other languages.
Java clearly was and is a much better choice of language for introductory teaching than C, C++, or Ada, from the late 90s, when most schools switched to it, to the present day (despite a longish list of detailed criticisms that I might have).
Dewar and Schonberg want us to custom-train students for their little niche of applications: security-sensitive real-time systems. Sure, that is an important field, and computer scientists need to be trained for it. But it is not all there is, as the authors seem to think. The field of computing has grown significantly since Dewar and Schonberg’s view of it seems to have solidified.
The fact is that introductory computing courses are now taken by people from what are (or should be) different disciplines: computer scientists, software engineers, multimedia experts, electronic engineers, applied computing students, network computing students, and a lot more, with differing degree names in different parts of the world. The field has developed.
Some institutions still try to squeeze everything into a single degree, but that does not really work anymore; it has passed breaking point. The field has grown so much that we really need different degrees. Every one of those needs a first programming course. Clearly not every one of them needs Ada. Or even C.
Blaming the first language for a failure to design a good curriculum, one in which the other necessary languages are taught at the appropriate point, is a naïve argument that misses the point and helps no one.