It's interesting, yet not unexpected, to see how a next-generation computer language becomes everything the previous generation wanted to be.
Java became what Ada wanted to be (write once, run anywhere), and JavaScript has become what Java applets wanted to be (web browser executable code).
Apple Languages
Over the last 15 years or so, Apple has tried new languages while never giving up its tried-and-true Objective-C. One nice thing about Objective-C, which eased its learning curve, is that it's a superset of ANSI C: almost any ANSI C code will compile and run as Objective-C.
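Here's a contrived sketch of what that means in practice (my own example, not Apple's): plain ANSI C and Objective-C message sends living side by side in the same .m file.

    #import <Foundation/Foundation.h>
    #include <stdio.h>

    int main(int argc, const char *argv[]) {
        @autoreleasepool {
            /* Plain ANSI C compiles unchanged in a .m file... */
            int i, total = 0;
            for (i = 1; i <= 10; i++) {
                total += i;
            }
            printf("C says the sum is %d\n", total);

            /* ...right alongside Objective-C message sends. */
            NSString *message = [NSString stringWithFormat:@"Objective-C agrees: %d", total];
            NSLog(@"%@", message);
        }
        return 0;
    }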
Apple's current Cocoa API goes back to the birth of NeXT in the late 1980s. In the early days, the Application Kit and Foundation Kit, which carried over into Cocoa, were the bee's knees. If you didn't code on NeXT technology before 1995, then you probably didn't write object-oriented code, commercially, until Java was released by Sun Microsystems. If you've ever coded in Objective-C, then you've probably figured out that all of those classes beginning with NS came from NeXTSTEP.
In the late 1990s, Java was too big for Apple to ignore, so they integrated it into WebObjects 3.5. Java and Objective-C are very similar. Java code maps almost one-to-one to Objective-C, and the two languages can be "bridged" so that code written in one can automatically and seamlessly call code written in the other.
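To give a rough feel for that one-to-one mapping, here's a contrived example of my own (the Dog class is hypothetical; the equivalent Java lines are shown as comments):

    #import <Foundation/Foundation.h>

    // A hypothetical Dog class, just to show the correspondence.
    @interface Dog : NSObject
    @property (copy) NSString *name;
    @end

    @implementation Dog
    @end

    int main(void) {
        @autoreleasepool {
            Dog *dog = [[Dog alloc] init];

            // Java:  dog.setName("Rex");
            [dog setName:@"Rex"];

            // Java:  String name = dog.getName();
            NSString *name = [dog name];

            NSLog(@"%@", name);
        }
        return 0;
    }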
Trivia: The communication between Java and Objective-C is handled by a Java to Objective-C Bridging Specification, aka a .JOBS file. Get it?!?
War of the Languages
The best feature of Java is that it's a statically (strongly) typed language.
The best feature of Objective-C is that it's a dynamically (weakly) typed language.
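Here's a small sketch of what that difference looks like from the Objective-C side (again, a contrived example of mine, not anyone's canonical demo):

    #import <Foundation/Foundation.h>

    int main(void) {
        @autoreleasepool {
            // Static typing: the compiler checks this message against
            // NSString's declared interface at compile time.
            NSString *greeting = @"Hello";
            NSLog(@"length = %lu", (unsigned long)[greeting length]);

            // Dynamic typing: id can hold any object, and the message is
            // resolved at run time based on the receiver's actual class.
            id something = greeting;
            if ([something respondsToSelector:@selector(uppercaseString)]) {
                NSLog(@"%@", [something uppercaseString]);
            }
        }
        return 0;
    }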
This type of thinking can lead to holy wars between coders as they argue over which language is better. To avoid sending mixed messages, Apple made it clear about a dozen years ago that Java was its server language of choice, while desktop apps would be developed in Objective-C.
Apple did experiment with Java desktop apps when Mac OS X was released ten years ago by integrating Java into the operating system. But Java, just like Flash, isn't a good desktop environment for many reasons. If memory serves, Steve Jobs referred to Flash as a CPU hog and Java as a pig. Both Java and Flash run inside virtual machines, which are isolated runtime environments layered on top of your operating system. It's the equivalent of speaking to someone through a translator: the virtual machine must first be loaded, and then it translates the Java or Flash code into machine language at run time. Compare that to Objective-C, which is compiled directly into machine code that runs natively.
It seems strange that a language developed almost 30 years ago is still cutting edge today, until you realize that Unix has been around since 1969.