Brad Cox
According to this Wired article, Objective-C is now the third most popular programming language on earth.
This is both surprising and sad.
Objective-C has been around a long time (at least in computer years) - probably since around 1981 or '82. It was created by Brad Cox after he worked with Smalltalk at ITT.
I have written about Smalltalk here before - it was created at Xerox PARC and, through the Alto computer, was a forerunner of virtually everything you use on a computer today.
C, the number one programming language on earth according to the article, is not "object oriented" - hence it's not as attractive as other languages for certain types of programming. C++ exists as an object-oriented alternative to C (C++ was originally implemented, like Objective-C, as a translator that converted C++ source to C before actually compiling it).
I recall using Objective-C in the late 1980s - it was crufty then and it's still crufty now.
The worst thing about all this is that Apple (and possibly NeXT before it, though I do not know for sure) had to make Objective-C work with something called "the event loop" and "memory management."
In modern computers you can think of the event loop as the part of a program that waits around for something to happen - the mouse to move, the network to deliver a packet, a key to be pressed. The event loop then takes information about that "event" and passes it on. Programs can have multiple event loops, i.e., parts of a program that wait for specific kinds of events.
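Stripped of everything else, an event loop is just an endless wait-and-dispatch cycle. Here is a minimal, runnable sketch in plain C (the layer Objective-C sits on top of), with a key press standing in for any event; Event, wait_for_next_event, and dispatch are all made up for illustration:

```objc
#include <stdio.h>

typedef struct { int key; } Event;      /* stand-in for mouse/key/network events */

/* Block until something happens - here, a key arriving on stdin. */
static Event wait_for_next_event(void) {
    Event e = { getchar() };
    return e;
}

/* Pass information about the event on to whoever handles it. */
static void dispatch(Event e) {
    printf("got key event: %c\n", e.key);
}

int main(void) {
    for (;;) {                           /* the loop that "waits around" */
        Event e = wait_for_next_event();
        if (e.key == EOF) break;         /* no more events - stop looping */
        dispatch(e);
    }
    return 0;
}
```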
Objective-C, until recently, had what's called "manual" memory management. Programs use memory to store things, e.g., the text I am typing. As I type I need more storage, so I might, for example, allocate a larger chunk of memory than I am currently using, copy what I am working on into it, and "release" the old storage. So over time I am constantly allocating and releasing memory.
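That allocate-copy-release dance looks like this - a minimal sketch in C (which Objective-C extends), where grow is a made-up helper:

```objc
#include <stdlib.h>
#include <string.h>
#include <stdio.h>

/* Need more storage: allocate a larger chunk, copy the work in
   progress into it, and "release" the old storage. */
static char *grow(char *old, size_t used, size_t new_cap) {
    char *bigger = malloc(new_cap);
    if (!bigger) return old;         /* real code would handle failure */
    memcpy(bigger, old, used);       /* copy what we are working on */
    free(old);                       /* release the old chunk */
    return bigger;
}

int main(void) {
    char *text = malloc(6);
    strcpy(text, "hello");           /* the text I am typing */
    text = grow(text, 6, 64);        /* typing more - need a bigger chunk */
    strcat(text, ", world");
    puts(text);
    free(text);                      /* done - release the storage */
    return 0;
}
```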
Much like a chemistry lab with a limited amount of glassware: the glassware gets dirty, eventually you run out of glassware (memory), and then the dirty glassware must be washed and reused. With Objective-C as implemented by Apple you must make explicit calls to allocate and free memory.
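In pre-ARC Objective-C those explicit calls look like this - a minimal sketch, compiled with -fno-objc-arc, where NSMutableString stands in for any heap-allocated object:

```objc
#import <Foundation/Foundation.h>

int main(void) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    NSMutableString *buf = [[NSMutableString alloc] init]; // alloc: we own buf
    [buf appendString:@"the text I am typing"];
    NSLog(@"%@", buf);
    [buf release];  // wash the glassware: every alloc needs a balancing release

    [pool drain];
    return 0;
}
```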
The only problem is that the mechanisms for keeping track of whether a particular piece of glassware is in use, as opposed to simply dirty and ready for washing, are archaic, manual, and hard to keep track of.
The end result is that it's easy to either A) allocate memory and inadvertently set things up so that it is never reclaimed (dirty glassware that's no longer needed but never gets washed) or B) allocate memory so that it gets reclaimed before you are done with it (glassware grabbed for washing in the middle of an experiment).
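Both failure modes in miniature, again in pre-ARC Objective-C (the fatal line in B is left commented out so the sketch still runs):

```objc
#import <Foundation/Foundation.h>

int main(void) {
    // (A) a leak: allocated, then lost track of - [leaked release] never runs,
    // so this memory is never reclaimed (dirty glassware nobody washes)
    NSMutableString *leaked = [[NSMutableString alloc] initWithString:@"dirty"];
    (void)leaked;

    // (B) a premature release: the object is reclaimed while still in use
    NSMutableString *gone = [[NSMutableString alloc] initWithString:@"beaker"];
    [gone release];               // retain count hits zero; object deallocated
    // [gone appendString:@"!"];  // use-after-free: would crash or corrupt memory
    return 0;
}
```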
C++ has similar issues, but memory allocation and release are explicit - you always know when you are doing what.
Java has "automatic garbage collection" which means that the Java runtime environment does the right thing all the time by correctly tracking how a given chunk of memory is being used.
(To be fair, Apple has released a new version of Objective-C with automatic memory management like Java's - but so far I have found it unsuitable for commercial software.)
In any case you get used to the Apple model eventually, and you mostly have to worry about correctly managing chunks of memory where they get allocated and freed frequently.
But, as they say, come on Apple! It's the 21st century!
C++ was never designed for UI-based programming (though it can be used for that). Presumably NeXTSTEP (the basis for OS X and iOS) was...
In any case the problem with Apple's approach can be thought of like this: imagine a series of labs (dozens of them) where each lab has its own glassware. The glassware can be used anywhere in the building - but it can only be washed, without breaking it, in the lab it belongs to.
Now the assistants in each lab who wash the labware cannot tell whether a piece belongs to their lab until they wash it - and if they wash one that is not theirs, it breaks. All the labware looks the same (just like memory allocated anywhere in Objective-C).
There are no marks on the labware to tell what lab it goes with.
Now imagine researchers running around between labs, bringing pieces and parts of their experiments and labware with them. Soon someone will forget to return a beaker to the lab it belongs to and "boom" - when an assistant goes to wash it in the wrong lab, the glassware breaks.
So what Apple has done, instead of adopting a modern self-managing memory system, is develop an elaborate set of rules and tools to manage things.
Legions of "inspectors" that run around constantly accounting for labware and its location. A giant overhead of bureaucrats and functionaries to help "manage things."
Special tools that attach hidden tracking dongles to each piece of labware so that a complex computer can tell you if anything has been misplaced.
To further compound the problem, imagine that in each lab every researcher wears a protective suit, so you can't tell whether they're from your lab or another.
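Concretely, here is a minimal sketch of what those rules and functionaries amount to in pre-ARC Objective-C: the ownership convention (you own what you alloc, copy, or retain, and you must release it) plays the inspector, and the autorelease pool is the bureaucrat that cleans up at the end of each pass through the event loop:

```objc
#import <Foundation/Foundation.h>

int main(void) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    NSString *owned = [[NSString alloc] initWithFormat:@"%d", 42];  // ours: alloc
    NSString *borrowed = [NSString stringWithFormat:@"%d", 42];     // autoreleased

    [borrowed retain];   // to keep it past this "event," we must take ownership
    NSLog(@"%@ %@", owned, borrowed);
    [owned release];     // balance the alloc
    [borrowed release];  // balance our retain

    [pool drain];        // the pool releases every autoreleased object it holds
    return 0;
}
```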
To me it always seemed like the "nanny state" for programming and programmers.