I have spent the last couple of days trying to prove to myself that I had not gone insane...
(Skip this if you are not interested in computers and programming, but take away this: imagine if one day you discovered that your car would not start if you were wearing your red jacket. Of course the red jacket itself was not really the issue; the real issue was that, unbeknownst to you, someone accidentally left a jammed key fob in the pocket, and that messed up the car's ignition system. You would think you were going insane, because how could the car know what jacket you were wearing? The point being that the unknown key fob is making it appear that the jacket is the issue. This is a story about how you eventually figure out the real problem...)
I have been working with Apple software to develop an iPad application. Apple's software is based on a programming language called Objective-C, developed by Brad Cox. This is a holdover from the days when Steve Jobs was at NeXT Computer, a company Jobs started after leaving Apple in the late 1980's.
Objective-C is an unpleasant version of the language C. C is a programming language that has been popular since the 1970's, and C++ is its main evolutionary branch for language advancement.
Unfortunately NeXT adopted Objective-C, and when Apple bought NeXT and Jobs returned, NeXT's software became the platform for what is today OS X. Objective-C is an "object oriented" language - which I will not explain here - that is based on the concept of sending "messages" to "objects".
So conceptually you can think of a "door". I can tell the door to "open" or to "close" by sending the "door" a message. This, of course, is all just fine and dandy. The problem with Objective-C and Apple is that in order to make software development work on their platforms they have to tinker with the machinery associated with just how this simple concept works...
So in Objective-C you might send the door a message like this:
[ door close ]
The "door" is the receiver and "close" is the message. I might create a "door" like this:
door = [[Door alloc] initWithType: wooden];
What this does is allocate a new "door" object and initialize it to be of type "wooden".
In this model the "door" holds a value which represents the door object in the computer's memory.
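To make this concrete, here is a minimal sketch of what such a Door class might look like. The Door class, the "wooden" type and the initWithType: method are all made-up names for this example, not anything from Apple's actual frameworks:

#import <Foundation/Foundation.h>

// Made-up door types for the example.
typedef enum { wooden, steel } DoorType;

@interface Door : NSObject {
    DoorType type;
}
- (id)initWithType:(DoorType)aType;
- (void)close;
@end

@implementation Door
- (id)initWithType:(DoorType)aType {
    self = [super init];
    if (self) {
        type = aType;
    }
    return self;
}
- (void)close {
    NSLog(@"The door closes.");
}
@end

int main(void) {
    Door *door = [[Door alloc] initWithType: wooden];  // allocate and initialize the door
    [door close];                                      // send the door the "close" message
    return 0;
}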
The first problem comes in when the alloc message, which causes a new door to be created, fails.
Objective-C represents this by setting the door's value to something called nil. So, perhaps we are out of doors, and instead of there being some kind of exception or error we just get nil.
The next issue is that, in common usage, Apple software doesn't care much to check whether things have failed, so instead of saying something like:
if (door == nil) { ... error ... }
they just rely on the fact that in Objective-C you can send any message you want to nil. When nil receives a message it simply does nothing at all.
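To see just how quietly this happens, here is a tiny sketch using Foundation's NSString (any object behaves the same way when its value is nil):

#import <Foundation/Foundation.h>

int main(void) {
    NSString *name = nil;   // pretend an allocation failed and we never noticed

    // Sending messages to nil is perfectly legal in Objective-C.
    // Nothing happens, no error is raised, and the results quietly come back as 0 or nil.
    NSUInteger length = [name length];          // length is 0
    NSString *upper = [name uppercaseString];   // upper is nil

    NSLog(@"length = %lu, upper = %@", (unsigned long)length, upper);
    return 0;
}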
Unfortunately this seems to be a lifestyle choice for Apple programmers. Over the years Apple has expanded the Objective-C concept into something called Cocoa. Cocoa and, to some extent, the iPhone platform rely on Objective-C along with a lot of enhancements and additions to make their software function. So the notion of things simply ignoring situations that are "wrong" is now embedded into these platforms.
So in Cocoa you have a much richer and more complex system for dealing with things like windows on the display, disks and networking. This system includes something called delegates. A delegate, like my example of the "door" above, can receive messages. So, for example, you might have a mouse on your computer.
In the code in the computer that deals with the mouse you might have a mouse object. That mouse object might assign a delegate to receive messages, e.g., the fact that the mouse moved.
You might ask why have a delegate for a mouse when the mouse should be handling things on its own?
The reason for this is that most modern computer systems like Windows, OS X and Linux have something called "event loops" in them to monitor the computer's activity. The event loop monitors whatever the computer is doing; for example, it watches the information that the mouse is sending to it as it moves (perhaps over a wireless connection or through a USB cable). If a mouse object in Objective-C has asked the computer to notify it about things concerning the mouse, then the event loop will take the mouse information and send it to the mouse's delegate.
Now at least to me this seems wrong. Why not just tell the mouse what to do? Why bother to create software that requires the mouse to have a delegate to do its work for it?
What's even worse in Objective-C is that you need to tell your code, when you are creating a mouse object for example, that your object is officially prepared to receive delegate notifications. That way the rest of the computer knows what to do when mouse changes occur.
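Here is a rough sketch of what that looks like. The Mouse class, the MouseDelegate protocol and the Tracker class are all invented for illustration - the real Cocoa classes are far more elaborate - but the @protocol declaration and the <MouseDelegate> conformance marker are the standard Objective-C machinery I am describing:

#import <Foundation/Foundation.h>

// An invented protocol describing the messages a mouse delegate may receive.
@protocol MouseDelegate
- (void)mouseMovedToX:(int)x y:(int)y;
@end

// The invented mouse object holds a reference to its delegate.
@interface Mouse : NSObject {
    id <MouseDelegate> delegate;
}
- (void)setDelegate:(id <MouseDelegate>)aDelegate;
- (void)simulateMoveToX:(int)x y:(int)y;   // stands in for what the event loop would do
@end

@implementation Mouse
- (void)setDelegate:(id <MouseDelegate>)aDelegate {
    delegate = aDelegate;
}
- (void)simulateMoveToX:(int)x y:(int)y {
    // If no delegate was ever set, this message goes to nil and silently does nothing.
    [delegate mouseMovedToX: x y: y];
}
@end

// The class that wants the notifications must declare that it conforms to the protocol.
@interface Tracker : NSObject <MouseDelegate>
@end

@implementation Tracker
- (void)mouseMovedToX:(int)x y:(int)y {
    NSLog(@"mouse moved to (%d, %d)", x, y);
}
@end

int main(void) {
    Mouse *mouse = [[Mouse alloc] init];
    Tracker *tracker = [[Tracker alloc] init];
    [mouse setDelegate: tracker];
    [mouse simulateMoveToX: 10 y: 20];
    return 0;
}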
What made my life miserable for the last day and a half was that some code I had been working on did not have the delegate part, yet it continued to work. Since I used some code from Apple as a starting point I foolishly assumed that what was in it was right - and it was, to a point - but not to the point I was relying on.
Normally when you write good code you put all of the things that are relevant to something, like delegation, together in one place. That way when someone looks at the code they can see the intention.
Since 90% of all software costs have to do with support and maintenance and not with writing new code, making it clear what your intention was when you wrote the code is key. But in the case of Apple I think that the sloppy Objective-C model has done them wrong.
In the code I started with everything but the information about the delegation was in one file. The delegation was in another file off by itself. Now it was my fault for not seeing it. Sadly I assumed that everything I needed was all in one place because that's how I work. I have software that has been in the field in production for literally a decade or more and I cannot rely on my memory to tell me what I was thinking when I wrote it.
So I copied the code (without the necessary delegate) and it worked fine - somehow the Apple software figured out that the delegation was supposed to work and made it so (without any sort of notification to me, of course). So the code morphed over time into a complex system unrelated to the original sample code and, all the while, it continued to work - until yesterday.
Yesterday I made a change that was quite innocuous. My code was running over the wireless network just fine. I wanted to change it to work over a USB cable, and I just had to remove some things in one spot and add a bit of code to communicate in the new way. All of this even worked, except that some of the delegate notifications that had been working stopped - of course, without any error or explanation.
I went over and over the code, checking and rechecking my work. (Of course, since the code had long since diverged from the sample code I had started with, I never saw nor would see the missing necessary piece.) Finally, in a bout of total frustration after many hours of messing about unsuccessfully, I took the offending functionality that was not working and wrote a new, small program to test just it.
I hooked up the code to a button on a simple display so that it would run when I pressed the button. But lo and behold, while I was building the project the compiler (a program that turns the human-readable instructions into "machine language" so they will actually be understood by the computer) pointed out that I was missing the declaration that a delegate was needed:
Controller.m:31: warning: class 'Controller' does not implement the 'NSStreamDelegate' protocol
Hmm. My real project doesn't offer this message - it merely partially (unreliably) works - even though I can tell my real project the same thing about delegates.
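For the record, here is roughly what the compiler was asking for. NSStreamDelegate and the stream:handleEvent: method are the real Foundation protocol and callback; the Controller class and the way the stream gets set up are just a sketch of my situation:

#import <Foundation/Foundation.h>

// Declaring <NSStreamDelegate> here is the piece that was missing: it states
// that this class is officially prepared to receive stream events.
@interface Controller : NSObject <NSStreamDelegate> {
    NSInputStream *inputStream;   // sketch: however the stream actually gets created
}
@end

@implementation Controller
// The run loop delivers stream activity here once the delegate is set, e.g.:
//   [inputStream setDelegate: self];
//   [inputStream scheduleInRunLoop: [NSRunLoop currentRunLoop]
//                          forMode: NSDefaultRunLoopMode];
- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode {
    switch (eventCode) {
        case NSStreamEventHasBytesAvailable:
            NSLog(@"bytes available on %@", aStream);
            break;
        case NSStreamEventErrorOccurred:
            NSLog(@"stream error: %@", [aStream streamError]);
            break;
        default:
            break;
    }
}
@end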
And this is my big beef with Apple.
Things work right if you know what I call the proper "Mumbo Jumbo" (magic, voodoo, etc.). The tools Apple offers mostly tell you what you need to know, but not always - just like sending messages to nil. While in general I think the Apple hardware is excellent, I am not really pleased with the development environment from the perspective of being able to know what's correct and what's not. If Xcode and the run-time system can tell you, or can notice you are doing something wrong, they should tell you instead of simply remaining silent...
All of this, of course, leads you to believe that you must question your own sanity.