
The New Age of Programming

June 23, 2004

This article was contributed by Joe Klemmer

A long time ago in a galaxy far, far away I was a programmer by profession. This was in the days of mainframe COBOL programs. Over the years I have gradually found myself becoming a systems administrator. While I do less "real" programming now, I have picked up a number of different languages like Perl, PHP, shell, C/C++, etc. Recently I have found myself thinking about the differences between programming languages, and it seems to me that there are two basic kinds of languages. For lack of better terms I'll call them "Standards Based" and "Internet Based". These two branches have some interesting differences that might not be apparent at first glance.

What do I mean by Standards Based languages? These are languages that are generally defined by ISO standards committees. For the purposes of this piece we'll consider the following languages to be in this category: Ada, C, C++, COBOL, Pascal and Smalltalk. As for Internet Based languages, we'll use Java, Perl, PHP, Python, Ruby and Tcl/Tk. So what are the significant differences between these two camps? What are the advantages and disadvantages of using one or the other?

One of the advantages of the Internet Based languages is the fact that they have grown up and proliferated on the Internet. Having been designed and built to work in the online world, they can easily do things that other languages can't do, or must be shoehorned into doing. Even though C and C++ are quite capable of dealing with the 'Net, they aren't as at home there as, say, Java. With the Internet Based languages you can develop and implement a system or application much more easily than with the others. The developers of these languages also had the advantage of being able to learn from the older languages, making what had previously been difficult much simpler. There are also some downsides to the Internet Based languages.

The most noticeable one is, ironically enough, one of their supposed strengths: the speed with which they evolve. The language definitions for PHP, Python, Ruby and even Perl and Java change at light speed. A language is lucky to last a few years before being massively updated. While this is fun for developers doing small, cutting-edge work and those doing R&D, it's not so good if you need to build an application for a large production system. I've been involved with a massive online system built in Java. The application works fine, but the amount of work the developers have to do to maintain and enhance it with older versions of Java is not insignificant.

With the older languages there's much more long-term stability. You don't find major changes in the language definitions happening at such a fast pace. A program written in Ada or C++ twenty years ago will still compile and run on today's platforms. These languages are not stagnant, however. The current C standard is C99, and it may surprise you to know that the most recently updated language is COBOL, with the current standard dated 2002. Standards Based languages, by definition, are standardized and stable. This does make for slow adaptation to changes in the IT/IS world, such as the arrival of WiFi and other new technologies. This adaptability/stability trade-off is, as I said earlier, both an advantage and a disadvantage for each of the two models of languages.

Programming languages are tools. Different languages have their own strengths and weaknesses. Most seasoned developers have an idea of these issues. However, the speed of a language's evolution is often an overlooked aspect. Sometimes, slow and steady is better than fast and new.


an exaggeration?

Posted Jun 24, 2004 5:17 UTC (Thu) by mbp (guest, #2737) [Link]

A C++ program from 20 years ago would have been from very early indeed in that language's development, before the first release of the standard. Would such a thing still build on today's compilers? It might, if it were in the C subset, but I don't know if a complex program would work.

http://cm.bell-labs.com/cm/cs/who/dmr/chist.html

Tim Bray describes Java as "the next COBOL". I think that's pretty accurate, in both a positive and a negative sense: it'll probably still be running in twenty years, but it's not exactly fun.

http://sourcefrog.net/weblog/software/languages/java/next-cobol.html

The New Age of Programming

Posted Jun 24, 2004 8:25 UTC (Thu) by ekj (guest, #1524) [Link] (3 responses)

I am sorry to sound negative, but does this article actually have a point? Beyond the pompous title you actually end up saying very little.

Some languages are defined by ISO standards, others are not. Those that are tend to be older languages that evolve more slowly than those that aren't.

That's more or less your entire article in one paragraph.

Then you flesh it out by adding lots of blabla that we've all heard a gazillion times already, and that doesn't really inform anyone about anything. Cute-sounding but ultimately useless stuff like "languages are tools", which is obvious to anyone, or "Different languages have their own strengths and weaknesses", which just says that different languages are different.

The New Age of Programming

Posted Jun 24, 2004 10:27 UTC (Thu) by nix (subscriber, #2304) [Link] (1 responses)

<blockquote>
Some languages are defined by ISO standards, others are not. Those that are tend to be older languages that evolve more slowly than those that aren't.
</blockquote>
... and some languages, like Scheme, started out with a standard and then proliferated onto the Internet. (C++ has done this too, to an extent: see the Boost project.)

The New Age of Programming

Posted Jun 24, 2004 10:28 UTC (Thu) by nix (subscriber, #2304) [Link]

... and some people are too stupid to hit the 'HTML' button.

Sorry. :(

The New Age of Programming

Posted Jun 24, 2004 15:33 UTC (Thu) by teddylwn (guest, #4318) [Link]

That is indeed too negative, but you do have a point: it is surprising to see such an article on LWN. But if the author of the article posted the source of a 20-year-old C++ program, I would gladly forgive his lack of a point :-)

languages vs libraries

Posted Jun 24, 2004 9:05 UTC (Thu) by davidw (guest, #947) [Link] (3 responses)

A flexible programming language will adapt to new situations and needs via libraries. If the core language is small and flexible (think C) then it's easy to keep reusing it for new things that the original designers never considered.

That's one of the things I really like about Tcl - a very simple core, with an extremely flexible syntax (control structures, for instance, are regular commands). In other words, you can add new constructs to the language... like Lisp, to some degree. Python has a nice small core, but OTOH keeps getting bits of syntax welded onto it (list comprehensions, for instance).

The lack of that ability to repurpose is something I don't like about PHP. It's very firmly wedded to one particular niche, where it happens to work well, but there is nothing about the language itself that is very exciting, or any better than other languages that could have been used for the same task.

In closing, Java(TM) is not an internet language, nor a standards language. It is a proprietary Sun Microsystems product.

languages vs libraries

Posted Jun 24, 2004 14:44 UTC (Thu) by khim (subscriber, #9252) [Link] (1 responses)

> That's one of the things I really like about Tcl - a very simple core, with an extremely flexible syntax (control structures, for instance, are regular commands)

This may be so, but when I see "simple core, flexible syntax" combined with no compatibility between versions (in either direction), no support for a number of packages on new versions and no support for a number of packages on old versions, "disaster" is the only thing that comes to my mind. Try to find Debian's discussion on Tcl and you'll see that it's even worse than Perl in this regard...

languages vs libraries

Posted Jun 24, 2004 16:21 UTC (Thu) by davidw (guest, #947) [Link]

This is FUD. You may not like Tcl, but please stick to the facts. Tcl is quite backwards compatible, supports binary packages through several versions without recompiling, and has zillions of packages for it.

Tcl's big problem is that it is no longer cool, and so people like to rag on it. Not that it doesn't have defects, but they aren't the ones you listed -- you appear to just be repeating something you read secondhand.

languages vs libraries

Posted Jun 24, 2004 14:49 UTC (Thu) by mmarsh (subscriber, #17029) [Link]

> A flexible programming language will adapt to new situations and needs via libraries.

Amen, brother. I do most of my heavy-lifting programming in C++, since that's what I used primarily as a grad student cutting my teeth on sizeable chunks of code. Consequently, I've built up a set of libraries to handle such things as networking, setting up distributed systems, automatic ASN.1 serialization, and event-based concurrency. Could these have been included in the language? Sure. I could use Java and remove the networking and serialization libraries at least. The fact is, however, that I don't have to. The packages only have to be written once, and let's not kid ourselves that Java's networking interface is a logical consequence of the language design—it's a package that just happens to be included in the distribution and defined as part of the "standard".

Thanks for the feedback

Posted Jun 24, 2004 19:09 UTC (Thu) by X-Nc (guest, #1661) [Link] (4 responses)

Hmm... I guess that I just wasn't as clear about what I was saying as I thought I was. It is interesting that most of the comments are focusing on the trees and missing the forest, but again, that's likely my fault. My intent was to write something that would be fairly non-technical so that non-programmers or new programmers could follow it. Should I try to be more technical in future articles?

What kind of topics would readers of this section find interesting? Reviews? Point-of-view? Historical? I'm very interested in any feedback.

Joe

feedback

Posted Jun 24, 2004 20:35 UTC (Thu) by nicku (subscriber, #777) [Link] (1 responses)

Many LWN subscribers are probably quite experienced programmers, and I would imagine that there would be a fairly small number of non-programmers here. Yes, I think it's right to say that we like technical stuff.

feedback

Posted Jun 25, 2004 0:52 UTC (Fri) by X-Nc (guest, #1661) [Link]

Can't argue with that. I'll try to make future articles more technical in depth. FWLIW, I have been writing for beginning or non-techie readers so much recently that I need to practice writing meatier articles.

Thanks.

Thanks for the feedback

Posted Jun 25, 2004 11:23 UTC (Fri) by teddylwn (guest, #4318) [Link] (1 responses)

I don't think the problem is "technical" versus "non-technical". The front page of LWN is sometimes technical, sometimes not, but it gives a remarkably in-depth and concise account of events related to Linux. And the reader comments often start with "well said".

Your article reminded me of some of Jerry Pournelle's columns (sorry if I misspelled his name) in Byte magazine in the mid-80s. It's entertaining, but does not have real content. Not the kind of article I subscribe to LWN for.

There are more than enough things going on in the "development" community; some of them are not easy to follow from the projects' web sites and mailing lists. For example, I was disappointed by the lack of coverage of the XFree86 - Xorg transition in X11 development. Especially because there is little information filtering out, there is a real opportunity for investigative journalism. Lwn could provide a real service to the community by providing in-depth analysis on a few chosen topics on the development page, in a way similar to the kernel page. It does that from time to time already - for example, I found the coverage of SSA in gcc excellent - but definitely not every week.

Thanks for the feedback

Posted Jun 25, 2004 15:22 UTC (Fri) by X-Nc (guest, #1661) [Link]

> Lwn could provide a real service to the community by providing
> in-depth analysis on a few chosen topics on the development page,
> in a way similar to the kernel page.

I'm sure that the LWN editors would love to have the kind of in-depth coverage for all their sections that the kernel page has. Jon is deeply involved with kernel development, so it's fairly easy for him to write about it. The other LWN editors, it is my understanding, are in a much less techie position and don't have the time to spend digging deep into the guts of some project or distro. As for the contributing authors, myself included, I'd love the chance to get into real meaty subjects and dig deeply into projects & distros. However, I only have a few minutes a week to devote to writing. I have to work my "day job" in order to pay the rent. LWN can't afford to pay my salary (meager as it is).

I am going to try to do more in-depth stuff, especially on distros and various app/util reviews. My only other strength is that I know a lot of the history of Linux. I started with it in November 1991. A while ago I'd written a loose, fluffy piece on the Linux history that I have experienced over the years. I might try to clean it up and flesh it out into something solid. Maybe. One of these days.

The New Age of Programming

Posted Jun 24, 2004 21:29 UTC (Thu) by mly (guest, #2171) [Link] (3 responses)

It seems to me that the article writer makes a serious mistake in bundling a lot of diverse languages into two groups like that, and then drawing general conclusions about all languages in each group from clearly limited experience with odd samples in each group. It's only misleading.

While C++ has been defined in standard documents, it took a lot of years to decide on a standard, and more years for compiler vendors to conform. For instance, compilers still don't agree on whether...

for (int i = 0; i < 5; i++) {
    // whatever
}

...means that i is a local variable inside the for-loop block or outside it. Many programmers still avoid C++ because they consider it too difficult to write maintainable cross-platform programs with it. Compilers are too different.

It seems to me that people who need to work in cross-platform environments often prefer C or Python.

I don't have a lot of experience with Java, but I've certainly heard some horror stories about upgrades and incompatibilities there.

I agree that e.g. Python is developed at a much faster speed than COBOL or C++, but I've rarely experienced that ten year old Python programs fail to run in the latest interpreter. The Python community members maintain a lot of code, and are very reluctant to cause maintenance problems for themselves. I think it's just the same with Perl, Tcl etc.

Backward compatibility

Posted Jun 25, 2004 18:33 UTC (Fri) by giraffedata (guest, #1954) [Link] (2 responses)

>I've rarely experienced that ten year old Python programs fail to run in the latest interpreter.

But remember to consider the other direction of compatibility: Have you seen a recent Python program fail to run on a ten year old interpreter?

I distribute a small amount of software in Perl. I frequently break it by using a new Perl feature that my users' old systems don't have. Neither the interpreter nor the documentation helps me know when I'm using a new language feature that my users might not be able to handle.

By contrast, I distribute a large amount of C/libc code, which I test with gcc -ansi. I virtually never have the same problem.
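To make that concrete, here is a minimal sketch of the approach (the file name and the extra warning flags are illustrative, not taken from any particular project): write to the C89 subset and let gcc -ansi -pedantic flag anything newer, which is exactly the early warning the Perl toolchain doesn't give.

/* portable.c - illustrative only.
 * Build with: gcc -ansi -pedantic -Wall portable.c
 * C99-only constructs ("//" comments, declarations mixed with
 * statements, "for (int i = 0; ...)") would draw diagnostics here. */
#include <stdio.h>

int main(void)
{
    int values[3];   /* C89 style: all declarations before any statements */
    long sum = 0;
    int i;

    for (i = 0; i < 3; i++)
        values[i] = i + 1;
    for (i = 0; i < 3; i++)
        sum += values[i];

    printf("sum = %ld\n", sum);
    return 0;
}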

So I will tend to stick to C and Bourne Shell for software that is for the masses.

I think a fundamental point of the article is that in the one camp, you have people who not only can but must use the latest and greatest technology; and in the other, you have people who are stuck in the past, which is sometimes a good place to be.

Backward compatibility

Posted Jun 26, 2004 1:06 UTC (Sat) by mly (guest, #2171) [Link]

>> I've rarely experienced that ten year old Python programs fail to run
>> in the latest interpreter.

>But remember to consider the other direction of compatibility: Have you
>seen a recent Python program fail to run on a ten year old interpreter?

I don't think I've heard of anyone still using a ten year old Python interpreter, but sure, a program using new Python 2.3 features won't run on a RedHat 7 box with Python 1.5.2 for instance.

You have the same kind of problem with C++. For some reason, C++ was placed in the same camp as C. There is now COBOL for .NET and OO COBOL; I'm pretty sure programs written for these systems won't run on any old IBM either. The only Pascal dialect in wide use is Delphi's Object Pascal. How compatible with anything else is that? If the article author wanted to say that recently invented features can't always be used on old platforms, I agree, but the bundling of languages into two groups like that has little to do with this problem.

If you rely on any third party libraries, you probably have the same issue with C. At least if the libraries don't come as pure ANSI C source code with no dependencies on other system components. The typical C way to avoid this is not to use third party libraries. Instead every programmer reinvents the wheel and writes his own undocumented libraries for everything. (I'm only exaggerating a little here. ;)

It all depends on who your target users are, what kind of features you need, and how you deploy your software. It's only for certain subsets of computer users that you can expect C source code and shell scripts to be useful as deliverables.

You can't really expect a Python runtime environment to be installed everywhere anyway. You certainly can't ship simple Python scripts to the typical Windows user and expect them to run them. One approach for Python applications is to deliver the runtime environment and needed libraries with the application, and install it in a special location for just this program. (Disk is cheap, right?)

In a Linux environment you might expect people to be able to get Python if they don't have it. If you develop your programs using Python 1.5.2 and avoid a tiny number of things that changed since then (assert and yield became keywords) you should be pretty safe. That's still a much less primitive tool than ANSI C and shell script.

The problems of deploying software in a constructive way can't be solved in as simplistic a way as dividing programming languages into two broad groups and selecting anything from one of the groups. For an insightful analysis of this problem I recommend Luke Hohmann's Beyond Software Architecture: Creating and Sustaining Winning Solutions http://www.amazon.com/exec/obidos/ASIN/0201775948/

Backward compatibility

Posted Jul 1, 2004 2:41 UTC (Thu) by roelofs (guest, #2599) [Link]

But remember to consider the other direction of compatibility: Have you seen a recent Python program fail to run on a ten year old interpreter? ... By contrast, I distribute a large amount of C/libc code, which I test with gcc -ansi. I virtually never have the same problem.

But 10 years ago, there were still a lot of pre-ANSI compilers floating around. SunOS, for example, shipped with a bundled compiler ("cc"), so a lot of people never bothered to install gcc. Maybe I'm stretching things a bit, but then again, maybe you are, too. ;-) Shift that 10-year window back a few years, and you'll land right in the middle of the K&R/ANSI changeover.

(Believe it or not, this exact issue came up recently: should Info-ZIP keep K&R compatibility--which has some serious uglification issues--or finally nuke it in favor of more readable code? I favor the latter, but some folks think any sacrifice of portability is bad.)
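For readers who never saw the pre-ANSI style, here is a small illustrative pair of definitions (not taken from Info-ZIP) showing what the two styles look like:

/* K&R (pre-ANSI) definition style: parameter types are declared
 * separately, and there are no prototypes to catch bad calls. */
int add(a, b)
    int a;
    int b;
{
    return a + b;
}

/* ANSI/ISO definition style: types appear in the parameter list
 * and can be checked against a prototype. */
int multiply(int a, int b)
{
    return a * b;
}

Projects that need to build with both kinds of compiler usually wrap their declarations in a macro that expands to a real prototype on ANSI compilers and to empty parentheses on K&R ones, which accounts for much of the "uglification" being discussed.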

Greg


Copyright © 2004, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds