Thursday, April 26, 2018

Apple's Language Holy War

In the late 1990s, we used to joke about language holy wars at Apple. Apple had purchased NeXT, in December 1996, for WebObjects and the NeXTSTEP operating system (which became Mac OS X and was recently rebranded as macOS). Since NeXTSTEP's release in the late 1980s, the OS had been built on Objective-C (an object-oriented [OO] superset of ANSI C). In the mid-1990s, Java came along from Sun Microsystems, and it quickly became the dominant mainstream OO language, leading to a holy war between Objective-C and Java.

WebObjects was originally written in Objective-C, but, by version 3.5 in 1997, it was fully bridged with Java using the cheekily named JOBS (Java to Objective-C Bridging Specification). A WebObjects developer could write code in Java, and almost every Java object had a corresponding Objective-C object wrapped and running in the background.


Java vs. Objective-C

Around WWDC 2000 or 2001, Apple settled the holy war by stating that Objective-C would be used on the client (Cocoa desktop development) and Java would be used on the server (WebObjects server development). But we'd still argue about the pros and cons of the two languages.

The strength of Java was that it was a strongly typed language. The strength of Objective-C was that it was a weakly typed language. So, the pros and cons were subjective; it really depended on your needs. Objective-C would let a developer "touch the metal," meaning a developer could write code that interacted with the computer's low-level memory. This was very powerful, but it placed a lot of responsibility on the software developer, who had to manually manage the program's memory and could use pointer arithmetic. Pointer arithmetic allows a developer to directly touch values in the memory of a computer. If the developer made a miscalculation, such as terminating a string incorrectly, the program could crash.

The selling point of Java, a language otherwise very similar to Objective-C, was that it didn't use memory pointers. Instead, Java code ran inside a virtual machine that acted like a sandbox between the executable code and the operating system. Since Java couldn't directly touch computer memory, it used references instead of pointers. The big joke in Java was that, if you tried to call a method on an instance variable that was null, you'd throw a NullPointerException, which was a poor choice for an exception class name since Java didn't have pointers. That exception class should have been named NullReferenceException. An excellent solution for avoiding these bugs, called optionals (option types), was implemented in Apple's Swift programming language three years ago. But I digress.
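The joke is easy to reproduce in a few lines of Java (a hypothetical sketch; the class and variable names are mine):

```java
public class NullDemo {
    public static void main(String[] args) {
        String name = null; // a null reference -- Java has no pointers
        try {
            // Calling a method on a null reference throws the
            // misleadingly named NullPointerException.
            int len = name.length();
            System.out.println(len);
        } catch (NullPointerException e) {
            System.out.println("Caught NullPointerException");
        }
    }
}
```

Running this prints "Caught NullPointerException" -- the exception name talks about a pointer that, by the language's own design, doesn't exist.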

Since Java ran inside a virtual machine, it was a little more complicated to talk directly to the OS. For example, if you needed to access the computer's file system, then you probably shouldn't hard code something like "c:/ProgramFiles/tmp" since that wouldn't work if your Java code ran on a Mac ("c:/" is the path to the main hard drive on a Windows computer, whereas macOS doesn't care about the physical drive but, rather, the medium being accessed, with a path like "/Volumes/Macintosh HD/Users/jmoreno/tmp").

Since the path to a file or folder (directory) was different on each OS, the software developer had to use global values, such as system properties, that the Java virtual machine populated when it started up (on Windows, it's "c:/" and on macOS it's "/Volumes/Macintosh HD").
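In Java, the portable approach is to build paths from the properties and separators the JVM supplies at startup instead of hard-coding them. A minimal sketch (the "tmp" folder name is just an illustration):

```java
import java.io.File;
import java.nio.file.Path;
import java.nio.file.Paths;

public class PortablePaths {
    public static void main(String[] args) {
        // The JVM populates these for the host OS at startup.
        String home = System.getProperty("user.home"); // e.g. a /Users/... path on macOS
        String sep  = File.separator;                  // "/" on macOS, "\" on Windows

        // Paths.get joins components with the correct separator for the OS,
        // so the same code works on Windows, macOS, or Linux.
        Path tmp = Paths.get(home, "tmp");
        System.out.println(tmp);
        System.out.println("Separator on this OS: " + sep);
    }
}
```

The same discipline applies to line endings, temp directories, and user names -- the JVM exposes all of them as system properties so the code never has to guess.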

Not hard coding OS paths requires a bit of discipline, but it keeps the software developer honest and prepared if their code needs to run on a different OS than was originally intended. This type of discipline was key, in 2005, when Apple switched the Macintosh CPU from IBM's PowerPC chip to Intel's CPU. To the public's surprise, Steve Jobs announced that Apple had been secretly developing Mac OS X for both CPUs and that the time had come for Apple to switch to the same CPU that Windows ran on. This had the side effect of allowing Windows to run natively on a Mac using Apple's Boot Camp utility software.


Code Reviews

I have written a lot of sloppy code in my time, and I discovered that group code reviews, weekly or biweekly, were a great help. This was the place where we could show off our code to the rest of our team, and the rest of the team could question anyone about the code they wrote. Most teams don't go out of their way to review someone else's code if it works as expected. Typically, it's not until a particular software developer has left a team that someone else has to read and review the departed team member's code. This can raise a lot of questions as to what the original purpose of the code was.

Good coding practices and discipline will pay dividends years down the road, so take the time to do it right. If not now, then when?
