I recently spoke with a software developer about the proper education of software programmers, prompted by several recent articles on the topic published on Embedded.com: "Students need to learn multiple programming languages," by Greg Gicca, and "The education of embedded software engineers," by Robert Dewar.
In the course of our conversation, he pointed to an article in the November 2012 issue of IEEE Computer magazine, "Debugging on the Shoulders of Giants," as an example of how computer programming should be taught: not driven by the specifics of learning a particular programming language, but aimed at general principles that can be applied in any computing environment, where no one language is viewed as better or worse than another but simply as a tool created to accomplish specific tasks.
Much of the IEEE article was about the effort at the U.S. Air Force Academy to duplicate the original IAS computer (the machine behind the von Neumann architecture), designed and programmed 70 years ago by a team of scientists and engineers led by John von Neumann at the Institute for Advanced Study in Princeton, N.J. The USAF project was part of an effort to create materials and tools for a course at the Academy on "Great Ideas in Computing." This included IASSim, an emulator for use by college freshmen to help them program in IAS assembly language.
In the process of creating the course building blocks, the authors, Barry Fagin and Dale Skrien, had to go back to the original documentation for the IAS computer, convert it into machine-readable form for the emulator, recreate the original programs written for the IAS computer, and then execute them.
The article chronicles the steps they went through, the programs they ran, and what they learned about how von Neumann and his IAS project team created the architecture, designed the instruction set, and wrote the programs they wanted to run. Along the way they discovered how the original team stumbled upon many of the tools, procedures, and methodologies that are commonplace now, or, in their absence, how they debugged the code manually, the errors they made, and why and how they found them.
Of all the many scientists and engineers of the last hundred years, John von Neumann is the one I hold in the most awe, not just for his sheer intellect but for the many areas of science and technology he was interested in and to which he made significant contributions: not just computer science, but mathematics, quantum mechanics, fluid dynamics, economics, game theory, genetics and the structure of DNA, self-replicating machines, and statistics.
Beyond the breadth of his interests and the influence of his ideas was his ability to move back and forth between the abstract arena of scientific investigation and the hands-on aspects of applying those ideas in the world of engineering.
That ease of movement between theory and application was brought home to me as I read what the authors found out about the problems von Neumann and the Princeton team faced and how they solved them: I/O limitations, memory and instruction format, number representation, self-modifying code, the pros and cons of formal methods, instruction set design, and orthogonality.
But what most impressed me were the efforts he and his team made at debugging the code they developed, despite the problems they faced and the fact that they were creating the tools and procedures they needed on the fly, as situations developed, all without the tools today's programmers take for granted.
Von Neumann and his team wrote programs for 15 problems to run on the IAS computer, ranging from relatively simple ones (algebraic expressions, parameters and subroutines, iteration, BCD-to-binary conversion, sorting and merging lists, and double-precision sums) to more complex ones such as Newton's method for calculating square roots and Lagrangian interpolation.
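To appreciate what the IAS team was hand-coding in raw assembly, consider Newton's method for square roots, one of the harder problems on that list. A minimal modern sketch in Python (purely illustrative; the function name and tolerance are my own, not from the original IAS programs) looks like this:

```python
def newton_sqrt(a, tolerance=1e-12):
    """Approximate sqrt(a) with Newton's iteration: x <- (x + a/x) / 2."""
    if a < 0:
        raise ValueError("square root of a negative number")
    if a == 0:
        return 0.0
    x = a if a >= 1 else 1.0  # initial guess
    # Each step averages x with a/x; the iteration converges quadratically.
    while abs(x * x - a) > tolerance * a:
        x = (x + a / x) / 2
    return x
```

A handful of lines in Python, but on the IAS machine the same algorithm meant managing fixed-point number representation, instruction layout, and convergence testing by hand, which is part of why the low error count Fagin and Skrien report is so striking.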
Given that they were breaking new ground, and given some of the tough mathematical problems they attempted, the number of errors that Fagin and Skrien found was surprisingly low: seven programs were error free, and in several others the errors were merely typographical. The errors found in a few of them owed as much to the mathematics involved, and how to represent it in code, as to actual coding mistakes. As the authors note, given that a machine implementing the IAS architecture was not built for another five years, in the early 1950s, "the relatively small number of errors in the code is quite remarkable."
Aside from giving me yet another reason to admire von Neumann, what impressed me as I read the article was how closely the USAF "Great Ideas in Computing" course seems to reflect the approach espoused by Gicca in his article: "Understanding just a single language promotes solutions that only approach a problem from a single perspective," he writes. "Knowing multiple languages allows the problem to be looked at from a variety of perspectives so that multiple solutions can be compared and the most natural solution for the problem can be selected."
The closing paragraph of the IEEE Computer article is also worth thinking about: "Our exploration into the IAS machine makes us wonder if some sort of exposure to older machines makes sense for future computer designers. After all, those who do not learn from computer history are condemned to repeat it."