Programming Languages: Is newer always better?
I constantly hear the belief that modern programming languages and environments are better than older ones. More productive, easier to use, and so on. It would stand to reason: nobody would make a new programming language with worse features than an already existing one. Or would they?
Everyone seems to take this as fact. But surprisingly, it isn't. There are many features in older programming languages which are not present in today's languages. I predict these features will be re-invented by the next generation of programming language authors, and everyone will think they are geniuses for having come up with these ideas. But at the same time, those new languages will omit most of the good points of today's languages. This cycle can go on forever.
It's like the cycle that tends to play out between "the network" and "the standalone computer".
- Central – IBM used to make mainframe computers, which one would access from terminals, i.e. central computing power, distributed usage.
- Local – But those computers were slow because they were remote. Then e.g. Sun invented the "workstation". The PC then followed. Local power to everyone.
- Central – Then the web happened. Suddenly everything was remote again. "All you need is a browser!". No local software installation nightmare. (Perhaps) independence from the single operating system vendor.
- Local – And now "using the web offline" is back in fashion. So that'll be local computing again then.
A few facts, for those who think there was no programming before JavaScript and the web:
- 1957 – Fortran released: expressions, variables, loops, subroutines
- 1959 – LISP released: treating functions as data, enabling higher-order programming
- 1967 – Simula 67 released: Object-oriented programming
Consider the following:
- Variable bounds. Ada, developed for the American military with a high emphasis on program correctness, allows one to define bounds on variables: for example, an array with index between 1 and 100, a value between 0 and 10, or a number not more than 5. Most variables, in reality, have allowed ranges. Why not express them in the program? It's more self-documenting, and it allows the run-time, and to an extent the compiler, to check the constraints. Isn't the minimization of bugs something that affects more than just the military? (A sketch of the idea follows this list.)
- Strict typing. If you know an object being passed to a function is a "User", it's no good being passed an "Email Address". The sets of operations those objects support are completely different, so even if the programming language is "advanced" enough to accept the parameter, the first method call on the object will fail. Why not express that and let the compiler check it? C++ has been able to do this since 1983, so let's use that rather than Perl, which can't. Recently I read an article joking about casting everything to a string, but in reality that's the default behaviour (in fact the only behaviour) of all scripting languages. (A sketch follows this list.)
- Knowing what's going on. In C, it's well defined what 0 means, what the string "abc" in a program means, and so on. Ask a C programmer whether 0==NULL, and ask a PHP programmer whether 0==null, and note a) their reaction times and b) whether they're correct. The C programmer will answer quickly and correctly; the PHP programmer will not. Who do you think writes programs with fewer subtle bugs? (A small example follows this list.)
- Enumerated types. Is a user "active", "disabled" or "inactive"? Having such sets of options is common to all domains. C has had enumerated types since ANSI C (1989) and Lisp since 1959. Why did Java have to wait until Java 5.0 (in 2004), and why do we have to create unreadable programs in languages like Ruby, which can't do them at all? For example, what does the call error_log("user not found", 2) do in PHP? What does the 2 mean? (A sketch follows this list.)
- No compiler. Every byte in an interpreted language costs time to interpret. So it makes sense to use short variable names and fewer comments, for run-time efficiency. Is this the sort of programming style one should be encouraging?
- No linker. In a linked language you can build big libraries, and only the functions used by the program (or used by the functions the program uses) will be included in the final executable. In Java, PHP etc., all the code you use is available all the time, taking up memory. I am often criticized for writing "too many libraries", or for code being "too object-oriented", in scripting languages, which is a fair criticism, as that code will run slower. But is it really an improvement to remove this function-pruning feature, so that bad programming practices produce more efficient code?
- Multiple compile errors. Why do modern programming languages such as PHP only tell you the first error in your program, then abort? This is laziness on the part of the compiler writer. Old compilers tell you all the errors in your program, so you can correct them all in one pass, instead of correcting one, retrying, correcting the next, retrying, and so on.
- Formatted strings. There is nothing wrong with the format concept behind C's sprintf, originating from 1972. You can print numbers and strings, and specify precision, field length and so on. (Apart from the inability to reorder parameters.) Why did C++ introduce the << notation? (At least you can still use printf in C++.) Why was this re-invented, worse, in .NET? Why did Java have to wait until Java 5.0 to get this feature? Why do we have to reinvent the wheel (worse) all the time? (A comparison follows this list.)
- Auto-creation of variables. When programming languages like C were created, the authors decided that using a variable without declaring it was an error. This caught all sorts of mistakes, such as misspelled variable names. Why have these decisions been forgotten, so that every scripting language lets you just use variables without declaring them? This means hours of searching for bugs when you simply misspell a variable name, something that will happen to everyone at some point. We're only human, and we have to take that into account. (A small example follows this list.)
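To make the variable-bounds point concrete, here is a minimal C++ sketch of the idea. Ada builds range constraints like "range 1 .. 100" into the language itself; the Bounded class below is only a hypothetical stand-in, enforcing the same kind of bound at run time.

```cpp
#include <iostream>
#include <stdexcept>

// Hypothetical sketch: a range-checked integer that enforces its bounds at
// run time, in the spirit of Ada's range constraints.
template <int Low, int High>
class Bounded {
public:
    Bounded(int value) : value_(value) {
        if (value < Low || value > High)
            throw std::out_of_range("value outside declared bounds");
    }
    operator int() const { return value_; }
private:
    int value_;
};

int main() {
    Bounded<1, 100> index = 42;          // fine
    std::cout << index << "\n";
    try {
        Bounded<0, 10> score = 11;       // violation is caught immediately
        std::cout << score << "\n";
    } catch (const std::out_of_range& e) {
        std::cout << "caught: " << e.what() << "\n";
    }
}
```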
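For the strict-typing point, a minimal sketch: the User and EmailAddress types below are hypothetical, but they show how distinct types let the compiler reject a mixed-up argument before the program ever runs.

```cpp
#include <iostream>
#include <string>

// Hypothetical types: making "User" and "EmailAddress" distinct means the
// compiler rejects a mixed-up argument, instead of the first method call
// failing at run time.
struct EmailAddress { std::string value; };
struct User         { std::string name; };

void sendWelcome(const User& user) {
    std::cout << "welcome, " << user.name << "\n";
}

int main() {
    User user{"alice"};
    EmailAddress address{"alice@example.com"};

    sendWelcome(user);        // compiles and runs
    // sendWelcome(address);  // compile error: EmailAddress is not a User
}
```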
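On the 0==NULL question: in C and C++ the comparison is well defined, whereas in PHP 0 == null is only true because of loose type juggling. A small illustration, assuming a mainstream compiler:

```cpp
#include <cstddef>
#include <iostream>

int main() {
    const char* p = NULL;                // NULL is a null pointer constant
    std::cout << (p == 0) << "\n";       // prints 1: the comparison is well defined
    std::cout << (0 == NULL) << "\n";    // prints 1 on common compilers
    // In PHP, 0 == null is also true, but only via loose comparison
    // (null is coerced to 0); 0 === null is false.
}
```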
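For enumerated types, a minimal C++ sketch; UserStatus and describe are illustrative names, but they show how the compiler keeps bare numbers like the 2 in error_log out of the program.

```cpp
#include <iostream>

// Illustrative enumerated type: the set of valid user states is explicit,
// instead of being encoded as a bare number.
enum class UserStatus { Active, Disabled, Inactive };

const char* describe(UserStatus status) {
    switch (status) {
        case UserStatus::Active:   return "active";
        case UserStatus::Disabled: return "disabled";
        case UserStatus::Inactive: return "inactive";
    }
    return "unknown";
}

int main() {
    std::cout << describe(UserStatus::Disabled) << "\n";
    // describe(2);  // compile error: 2 is not a UserStatus
}
```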
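To compare the two formatting styles, here is a small example printing the same line with printf and with iostream manipulators; note how much more of the formatting has to be spelled out in the << version.

```cpp
#include <cstdio>
#include <iomanip>
#include <iostream>

int main() {
    int    count = 7;
    double price = 3.14159;

    // C's format string: width and precision stated compactly in one place.
    std::printf("%2d items at %8.2f each\n", count, price);

    // The same line with iostream manipulators.
    std::cout << std::setw(2) << count << " items at "
              << std::setw(8) << std::fixed << std::setprecision(2) << price
              << " each\n";
}
```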
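Finally, on auto-creation of variables, a tiny example of the kind of typo that declaration requirements catch; totalAmount is of course just an illustrative name.

```cpp
#include <iostream>

int main() {
    int totalAmount = 0;
    totalAmount += 10;
    // totalAmmount += 5;   // misspelled: C and C++ reject this at compile time;
    //                      // a language that auto-creates variables would silently
    //                      // make a new one, and the bug would surface much later.
    std::cout << totalAmount << "\n";
}
```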
The above is a list of things that have got worse over the last two decades, i.e. they haven't just failed to get better by staying the same; they have actually got worse.