
Friday, June 29, 2012

Thoughts on Self Publishing...

Over the last few years on this blog I have talked, among much else, about how the industry of "print" is largely going away - changed by technology: Groupon, WiFi, smartphones, and so on.  Will it ever vanish completely?  No - my mom, who is in her eighties, will still get hardcopy statements and bills.  (As a matter of fact I do too - they work better in a tax audit - but that's a separate story.)

Recently I wrote here about Synthodeon, LLC and MIDIProbe - my first effort to enter the self-publishing world of iTunes and iPhone apps.  I also wrote about the CD I produced (see the "Fall to Earth" ad at the right of this blog).

But my efforts are just a drop in the bucket in terms of self publishing.

The entire set of "publishing industries" - literary, music, and so on - is going through an interesting process: the elimination of the middle man.

I just recently read about "On the Island" - a successful self-published novel (see this).  But there are many others in literature and music.

So why is this happening?

I think to a large degree technology has changed the economics of taste.

Before self publishing was possible (it's been available via print-on-demand for some time) there was a significant cost to produce and market books.  No one wanted to spend a million dollars promoting a book which would only sell a thousand copies - no matter how good it was.

So publishing houses arose that gambled on what would sell - not on what was good, not even necessarily on what people would like - but on what would sell or could be sold.  The rights were bought, the investment made in production and marketing - and out pops a best seller.

The only problem with this is that while people might buy this best seller it might not be the "best" a "best seller" could be.  Maybe the publisher thinks that what readers would find really interesting is "not ready" for commercial sale - so they don't buy it.

But today that's different.  I can upload a file to Amazon for free and maybe people will buy it - music, art, literature - not just commercial products.

I can buy Google AdWords for a hundred dollars and let people know.  I can pay a professional proofreader to fix up my work or a professional reviewer to review it.  And suddenly I have a product.

Now I have read a good number of $0.99 Amazon Kindle books over the last year.  They have their flaws, but I'd say that of the thirty or so I have bought I have only abandoned maybe two or three.

So that's roughly a ten percent failure rate - at $0.99.  I don't think the failure rate is much better spending $8.95 USD at Borders or Barnes & Noble.

So that's about $60.00 USD for those thirty books (some cost $2.99, a few $12.99, and some are free) versus about $400.00 USD at bookstore prices.

At Amazon, at least, reader reviews are the most helpful guide - Amazon will recommend books, but you can't tell much from that alone.

Reviewers at Amazon do a good job I think.

And to a large degree in the modern world, at least, they act like a publisher - and all for free.

Quite honestly I don't see how publishing as we know it today will survive.


Thursday, June 28, 2012

Emily, Meet Mr. Dotcom...

Over the last six months or so I have been writing about Kim Dotcom and www.megaupload.com.

Megaupload is the New Zealand company seized by the US for supposed copyright infringement.  Unfortunately for the US, according to the Sydney Morning Herald, the raid and seizure on Mr. Dotcom was illegal.  The reason for this was simple: the warrants were too vague.  Vague about the offense and vague about what was to be taken.

Further, the judge ruled that the act of taking any copies or clones of www.megaupload.com data off to the US was illegal because the original warrants were illegal.

Kind of a blow here for Hollywood and the FBI.

Ultimately, as I noted in other posts (see the posts with "megaohoh" as a label), the only real criminal in the case of megaupload is you - or rather, folks like Emily, whom I wrote about yesterday.

Mr. Dotcom did not acquire games or movies and store them on his website for others to download.

You did that.

Mr. Dotcom did not give the URLs of uploaded movies and games to others.

You did that.

Hollywood and the RIAA know that there is little chance of success in prosecuting every Emily in the US - there are probably about 30 million of them.  Most of the Emilys have dabbled in "shared music" sites over their underage lives - so in the eyes of the law all are as guilty as Mr. Dotcom.

But all this makes bad press.

And all these kiddies are also the best customers of Hollywood and RIAA members.

So what do you do when your best customers are stealing your product?

You go after anyone you can find that might be involved and has deep pockets - like Mr. Dotcom - regardless of the law or legality.

(Note that the US case against Mr. Dotcom has more serious problems besides an illegal search: among other things, Mr. Dotcom's company has no US office or legal presence and no warrant was legally served on his company by US officials, which leaves the ability of the US to even prosecute megaupload in question.)

Until the invention of CDs none of this was an issue.

Cassette tapes were too low-quality for the RIAA to concern itself with - every kid with a cheap cassette recorder could record a song - but the quality was always worse than the record.

CDs were alright at the start because they were expensive and players were audio devices.

But that changed by about 1990 when anyone could buy a computer with an audio-capable CD drive.

Soon after, technology for burning audio CDs became available, as did cases of 100 blank CDs for $50 USD and "Rip. Mix. Burn." from Apple.

The final nail was the internet and file sharing programs.

By 2000 or so the same became true of DVDs and movies.

Today it's true for games as well.

Like Emily, most kids who were teenagers in the last fifteen years have probably never bought many CDs because they were simply too expensive.  Sure, a few artists like the Beatles and Metallica held out for a period of time in order to preserve cash flow - but for the most part everyone else caved in (and eventually so too did the Beatles).

Emily doesn't have the concept of "ownership" in the digital age.  Kind of like someone in the Midwest not having an idea about western "water rights."  Water is simply there and available - you turn on the tap and out it comes.  Sure, someone is paying at some point - mom and dad if you're at home - but you can run down to the local river or lake and take all the water you are likely to need for free.

Perhaps you are violating some law by parking your car near the lake and scooping up a 5 gallon jug of water - but no one cares.

In the west, of course, this is not true.  Water rights are detailed and important.  But probably not at the individual level unless you own a large farm requiring irrigation.

Another part of the problem is the very notion of "the cloud."  I can put my music in the "cloud" and it's available wherever I and a "cloud-based" device are.  This makes the music seem ubiquitous - again like water.  I turn on a tap and out it flows.

At the end of January 2012 I wrote "Some are More Legally Equal than Others."

In the post I linked to a YouTube video of Neil Young's "Down by the River."  The video, which I postulated was infringing, is still there today.  So I imagine that no one has filed a takedown notice with YouTube.

Does that mean no one cares?

It's hard to say.

Certainly musicians perform this song in bars around the world.

Does anyone care if these performances violate some law?

Again, it's hard to say.

The problem is that as the songs become iconic in some sense the listeners "take them over" from the artists and the record companies.  The songs become part of the culture and social fabric.  And this is not just for young people.

I think the digital age has made this process of subsuming art into the culture much more efficient.

When I was young you bought a record - you might wear it out playing it over and over.  You might break it taking it to your friend's house to play.  You memorized the words and the melody.  You memorized the guitar solo.  But since it was only a record there was only so much you could do with it.

Today that process requires no physical activity - you can share music sitting at home in your dark basement.  Yet the emotional connection to the song is still there.

That emotional connection is what being a fan is about.  Someone might remember what they were doing when they heard that song: a first love, a death, overcoming some significant obstacle in their lives.

In today's "cloud" that does not change - only the means of accessing the songs.

Mr. Dotcom simply created a tool, like Google, where like-minded folks could share what they would otherwise share in person.  Of course, the "cloud" is much bigger than your neighborhood, and the cost to transfer a song from point A to point B is much less than jumping on your Stingray bicycle with your album and riding to your friend's house.

But the various steps using the cloud are still the same as is the emotional result in 2012.

At the end of the day people will never stop sharing music (or games or movies) because there is an emotional connection to them.  One that supersedes people's ability to distinguish "theft" from a feeling of "ownership."  People share common events in their lives and music connects them whether the RIAA likes it or not.

And Google, Mr. Dotcom and "Rip. Mix. Burn." will always be there to help them because there is money to be made when people lose the ability to distinguish "emotional ownership" from "theft."

The only real question left is simple: when does personal use become theft?


Wednesday, June 27, 2012

Dear Emily...

Yesterday I wrote about how I thought that "security" had become a replacement for respect, privacy and morality here in the 21st century.  Today I'd like to take this idea a little further with respect to music.

A producer/sound engineer I know posted this link on Facebook to a blog post by a National Public Radio (NPR) intern named Emily.  Emily is an almost-21-year-old who, by her own description, is an "avid music listener, concertgoer, and college radio DJ."  Emily has some 11,000 songs in her personal "library."

What's more interesting, though, is Emily's idea of stealing music, er, rather what's not stealing (from her blog): "But I didn't illegally download (most) of my songs. A few are, admittedly, from a stint in the 5th grade with the file-sharing program Kazaa. Some are from my family. I've swapped hundreds of mix CDs with friends. My senior prom date took my iPod home once and returned it to me with 15 gigs of Big Star, The Velvet Underground and Yo La Tengo (I owe him one)."

Clearly Emily sees that Kazaa (a now-ancient and defunct "free" music-sharing network) was not exactly above board.  But beyond that she seems to have little idea of what it means to legally acquire music.

If you read the entire blog post it seems clear that Emily has some idea that what she's doing is not exactly "right" with respect to the copyright owners of the music files she has - but she's not exactly sure.  In her closing paragraph she says she'd like a music library, hopefully with "... more money going back to the artist than the present model."

So if we think about the cost of Emily's library with respect to the artists, we find she's paid for about $150 USD worth of product.  But she has some 11,000 songs - at an average iTunes price of $0.99 USD per track, nearly $11,000.00 USD worth of material.

The bottom line is that Emily sees no real problem in all this and merely hopes for the best for all artists in the future.

Of course, I am not the first to comment on Emily's activities: see this "Letter to Emily" as an example.

Emily is not alone, I think, in her perspective on what theft is with regard to music - nor, for that matter, movies.  The ubiquity of internet content and the ease with which it can be accessed has, as Emily admits, given her a different perspective than an old geezer like me.

She admits to growing up in a time when, from her perspective, there was no such thing as a CD (or vinyl, for that matter).  To her the entire world of music is digital - no album art, no CD sleeves, nothing.  Just digital noise in a file.

Clearly whoever raised Emily did little (or perhaps couldn't do much) to turn her against the tide of "free music."

But what's Emily left with?

She cannot even comprehend that acquiring music "freely" is the same as stealing it.  Certainly there is some apparent nascent idea in her head that the artist is "not getting paid," but it seems clear that from her perspective it's simply some "abstract" fault in society.

So given this background would we expect Emily to have any sort of nascent respect for the digital privacy or rights of others?

My guess is no - but to be fair to Emily she may have some basic notion that friends handing her a CD of passwords and logins might be "wrong" in some way.

So it's little wonder that everybody and their brother is so concerned about security, nor is it surprising that everyone feels they must "lock down" their computer systems and data.

Yet even the "Letter To Emily" starts out with "My intention here is not to shame you or embarrass you...."

What's up with that?  Why not shame and embarrass her?  She's stealing.

The author of the piece claims to be a college teacher.  I wonder if he would see a problem with simply photocopying textbooks and passing out the result without compensating the authors.

In fact, should he write a textbook or paper, how would he feel if the same were done to him?

Today we see the signs of music everywhere - ear buds, iPhones, clouds of music, and so on.

Unfortunately everyone's "feel good" vibe seems to come at a very high price to those who actually produce what makes them feel good.

As a musician, someone who makes a living from creating intangible software, and as a performer I can see all sides of this.

I wrote "ASCAP" a while back and have covered the actions of the RIAA.

In general I think that strong-arm tactics never work.

Musicians instead need to educate everyone else about what it means to be properly compensated - but it will take a generation so that Emily's offspring know better...

Tuesday, June 26, 2012

Morality and Respect for the Key

RSA SecurID keys are as ubiquitous as water for most anyone with a corporate affiliation - whether as an employee, customer, client, etc.

Basically large organizations have their IT departments set up "secure" VPNs so that outsiders (or remote employees or departments) can access their systems.

But RSA keys are not all that secure - a paper described in this Ars Technica article (link to the Crypto 2012 paper as PDF here) shows how to bypass the key in as little as a few minutes.  Similar techniques are applied in the PDF to many other forms of secure IDs.

Now all of these cracks are basically tricks involving ways to "get around" the difficulty of factoring large numbers, which is the basis of the RSA encryption scheme.

But that's not the point of this post.

Instead I have been wondering for quite a while now when technological "devices" and "security" became a substitute for morality.

Fifty years ago the notion of security was quite different than it is today.

If you had private documents, money, photos, stocks, bonds, and so on you kept them in a safe.  Usually a large, heavy metal box with an old fashioned dial on the front that you turned back and forth to a specific set of numbers in order to open it.

If you didn't want a safe the local bank would rent the equivalent to you in the form of a safety deposit box.

Beyond that most things were "locked up" in a cabinet or desk drawer.

Computers, networks, routers, VPNs and all the rest had yet to be invented.  If you did have a computer it was likely simply inaccessible outside the locked room where it operated.

Locks and keys were fairly ineffective and the basic idea was that "locks kept honest people honest."

This meant that if you had integrity and were "tempted" to forgo that integrity and pry into something not yours, the fact that it was locked up served as a reminder that it was not yours to view.  The goal of having something in a locked drawer was not that the drawer could not be penetrated (most desks from those times had simple locks which could easily be forced) - instead it was to let others know that the contents were private.

The fact that something was "private" was taught in childhood.

You learned to respect the property of others.  If you forgot, the lock served as a reminder.

Of course, things left lying around in the open might be available to prying eyes...

The UNIX crypt command was the first commonly available encryption software that I am aware of.  It was part of the Bell Labs UNIX distributions available in the 1970's.  It was a software implementation (as far as I know) of the German Enigma encryption device from World War II.  It could encrypt and decrypt a UNIX file.  Since the only things in such files in those days were geek stuff, there was little reason to take it too seriously (that, and the fact that the Allies had broken the Enigma code in WWII anyway).
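(For fun, here is a toy Python sketch of the rotor-machine idea - entirely my own construction, not the actual crypt(1) algorithm.  The trick is a self-reciprocal wiring, like Enigma's reflector, plus a position that advances with each character, so the very same function both encrypts and decrypts.)

```python
import string

LETTERS = string.ascii_lowercase

# Self-reciprocal wiring (shift-by-13, an involution like Enigma's
# reflector): applying it twice returns the original letter.
WIRING = {a: b for a, b in zip(LETTERS, LETTERS[13:] + LETTERS[:13])}

def rotor(text: str, key: int = 7) -> str:
    out = []
    for i, ch in enumerate(text.lower()):
        if ch not in WIRING:
            out.append(ch)                  # pass spaces/punctuation through
            continue
        s = (key + i) % 26                  # the "rotor" advances each letter
        stepped = LETTERS[(LETTERS.index(ch) + s) % 26]
        reflected = WIRING[stepped]         # through the reciprocal wiring
        out.append(LETTERS[(LETTERS.index(reflected) - s) % 26])
    return "".join(out)

message = "attack at dawn"
scrambled = rotor(message)
assert rotor(scrambled) == message          # the same key decrypts
```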

There were of course military encryption technologies at that time but even knowledge of them was thought to be a crime.  The CIA was thought to have the worlds largest computers and a staff of cryptographers.   Basically all this was reason to "stay away" from military encryption.

The invention of RSA public key encryption in 1978 changed all this.

The RSA model makes it easy for anyone to create and use a two-part cipher.  RSA has a "public" key and a "private" key.  Everyone is allowed to know your "public" key; no one but you knows the "private" key.  The keys are related through the factoring of very large integers.

RSA messages are based on "signing" or encrypting with either the public or the private key.

If I want to send you a message I can encrypt it with your public key.  Only you can decrypt it, because only you have your "private" key.  To ensure you know the message came from me, I can first encrypt (sign) it with my private key.  Then, after you decrypt the message with your private key, you decrypt the result with my public key - and you know the message can only be from me.

This differed significantly from having "keys" which had to be exchanged before messages could be sent.
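The whole scheme is small enough to sketch in a few lines of Python.  The code below is my own toy construction - tiny primes, no padding, nothing like production RSA - but it shows the sign-then-encrypt round trip described above:

```python
# Toy RSA sketch: the primes, exponent, and names here are mine, chosen
# only for illustration.  Real RSA uses enormous primes and padding.

def make_keypair(p, q, e=17):
    n = p * q
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)              # modular inverse (Python 3.8+)
    return (e, n), (d, n)            # (public key, private key)

my_pub, my_priv = make_keypair(61, 53)       # n = 3233
your_pub, your_priv = make_keypair(89, 97)   # n = 8633

message = 42                          # must be smaller than both moduli

# I "sign" with my private key, then encrypt with your public key.
signed = pow(message, my_priv[0], my_priv[1])
sent = pow(signed, your_pub[0], your_pub[1])

# You decrypt with your private key, then verify with my public key.
received = pow(sent, your_priv[0], your_priv[1])
verified = pow(received, my_pub[0], my_pub[1])
assert verified == message            # only my private key could produce this
```

The asymmetry is the whole point: the signing step can only be undone with my public key, and the encryption step only with your private key.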

But all this technology aside something changed over perhaps the last decade or so.

The ubiquity of computers has created boundless opportunity for people to access others' private material - whether on a desktop computer or the internet.

And, it would seem, because private materials and data are seen as so "available," the old notions of respect for privacy have vanished.

Access to any number of things, personal and otherwise, is now most likely a simple password away.

No longer are there moral strictures telling you that accessing someone's private data is wrong.

Instead there's an endless parade of ever-stronger cipher-based technologies and ever-more-invasive laws specifically targeting "hacking" - which today has come to mean, for the most part, breaking into a digital system and stealing password data.

No longer is this considered to be counter to common morality.

Hacking to discover the "evils of US Imperialism," as in the case of Julian Assange, is considered heroic.

It seems to me that we have replaced the notion of traditional morality with strong cryptography.

The assumption is that someone will steal your private data unless you actively work to prevent it.  Conferences like Crypto 2012, in the name of openness, actively publish ways to crack security at public companies that store your banking, credit card, stock and investment, and other personal information.

But this means as a race (of humans) and as a society we have discarded the notions of "privacy" and "respect for the property of others."

Ever more clever lawyers have found ways to allow those perpetrating these crimes to escape justice - mostly by using clever legal tricks to circumvent traditional laws, e.g., simple theft, when applied to electronic hacking.

So lawmakers have retaliated, so to speak, with ever more specific hacking laws.  Of course, these turn out to be overly general in ways not imagined and have the effect of actually making legal behavior a crime.

Today the law and crime are involved in an ever-expanding dance to see which side can outdo the other.

Leaving those of us outside this world as the victims.

And the strictures against "cruel and unusual punishment" leave the criminals with virtually no punishment - after all, isn't punishment cruel by definition?

So the result is that these crimes are met mostly with non-punishment punishments - such as being barred from using a computer for life - which are, of course, unusual and unenforceable.


Monday, June 25, 2012

Identical Twins: Not so Identical After All

Many years ago I heard about interesting studies of identical twins separated at birth: stories of reunited twins smoking identical brands of cigarettes, married to partners with the same name, things like that.

I doubt that much scientific work was done to assess those sorts of things.

But one thing science did assess was the IQ comparison of identical (monozygotic - from the same egg) twins.

Interestingly, it turns out that IQ correlates most strongly between identical twins so long as they are well fed and taken care of.  However, if you don't feed a child or don't provide a healthy environment, things don't go as well.

The initial 1990 study was very controversial - so much so that it was not funded via "traditional" means.  And, according to this WSJ article, it received a great deal of criticism.  Subsequent studies, however, have shown that the initial study, conducted by Thomas Bouchard and Nancy Segal at the University of Minnesota, was quite accurate.

The bottom line is that identical (monozygotic) twins raised apart in similarly "healthy" and "well fed" environments have similar IQs around 70% of the time.  Take away "healthy" or "well fed" and this falls off dramatically.

However, identical twins raised together had an 88% chance of similar IQs.

If you start looking at fraternal twins, siblings, cousins, and so on, the numbers drop off rapidly.

So, setting aside the discussions of preferences for mates and cigarettes, what does this say?

I think that the interesting numbers are not the rates of similar IQs but instead the rates of dissimilar IQs.

For example, if 70% of identical twins raised apart have similar IQs, what about the 30% that don't?

More interesting, what about the 12% (the complement of the 88% with similar IQs when raised together) that don't?

So you have identical genetics to your identical twin sibling but you don't share a similar IQ?

It would seem the first thing this says is that, regardless of genetics, your personal results with IQ and brain development may vary significantly.

Why could this be?

For one thing, the structure of your brain is determined in part by the stimulation you receive as a young child, say under the age of two or three.  So at least theoretically one twin could be "favored" in some way over the other during childhood, causing the other to receive less stimulation.

On the other hand, one twin may find their environment more compatible with their constitution than the other, creating a situation where one grows more than the other.

Yet even identical twins do not have the same fingerprints.

Our brains on average (according to Wikipedia) have some 10^14 synapses - as children we have some 10^15.

As we grow our brains change and we lose connections (or the numbers stabilize).

Synapses are not simple logic gates like you might find in a computer.  They are influenced by chemicals in the brain as well as by its electrical activity.

And no one really has any good idea about how, for example, our memories actually work in terms of neural function.  Or how we process audio or visual data.

The bottom line, of course, is that it's impossible to ensure that each child receives exactly what's required, because even genetically identical children raised in the same household have dissimilar IQs 12% of the time.

In fact, we don't even know how to guarantee the same outcome if we tried.  (One would expect on average that identical twins were raised more or less identically, i.e., one was not locked in the closet.)

It seems to me like this is a case of Wolfram's "Computational Irreducibility" in the sense that we have no way to work backwards from a given set of twins and their IQs to see how things got that way.

It also seems, based on the fact that the fingerprints of identical twins are not identical, that parts of a human being's physical structure vary even given identical genetics - i.e., are there parts of your brain that are like your fingerprints?

Some science has recently revealed that even identical twins are not identical at other levels: see this.  Which is interesting...  Perhaps genetics is not as "strong" as we thought in its influence over how we grow.

So I guess at even the most basic level we are always going to be different...


Friday, June 22, 2012

From Plum Brook to ゴジラ Gojira (Godzilla)

I find these old pictures fascinating.  This one, taken in 1981, is from an article at Wired.com discussing NASA's Plum Brook Nuclear Facility.  (This view makes it look like some of the old Ohio Turnpike rest stops.)

Plum Brook was built to research atomic airplanes.

It is located in Sandusky, Ohio on the site of the original Plum Brook Station - land given to veterans of the War of 1812.  Subsequently Plum Brook Station was converted to handle US Government ordnance and later, in the 1950's, converted into a nuclear research facility.

Atomic airplanes were abandoned before the Plum Brook facility and its reactor were completed, but President Kennedy reinvigorated the program with his atomic rocket program in 1961.

The Plum Brook reactor went "critical" for the first time in June of 1961.

The program continued into January of 1973, when the site was "temporarily closed."  It never reopened and today has been mostly "decommissioned" (as documented here by NASA).

Nuclear aircraft research ended with the Convair X-6 program (from Wikipedia):


The idea was to use the General Electric HTRE-3 nuclear-powered jet engines to propel the aircraft.  At the bottom are two traditional jet engines.  Air is taken in at the front, passed into the reactor (above the two engines) to cool it, and the heated air is passed back into the jet engines to drive the turbines (from Wikipedia):


Test versions of this engine still exist at the Idaho National Laboratory (from Wikipedia):


In September of 1951 project MX-1589 became the first and only flight test of a nuclear reactor in the US.  The reactor did not power the plane but instead was used to evaluate shielding for the crew.  There were some 47 flights over southwest Texas and New Mexico.

The program was canceled in 1961.

Recently I watched the original Japanese ゴジラ Gojira (Godzilla) movie.



Key to the plot is the fact that Gojira (Godzilla) is unleashed on Tokyo by a nuclear explosion.


While the US was developing these atomic flight systems the Japanese - the first victims of US atomic weapons - were developing a voice of protest against nuclear weapons.  I find it fascinating that the Japanese, despite being recent victims of the atomic bomb, apparently had little problem fully embracing nuclear energy for power a few decades later.


There is a scene in Gojira where the "dangers" of atomic weapons are discussed.  There is an eerie (and ironic) predictive aspect to the scene: the same fears could be applied to the Japanese themselves and Fukushima some 55 years later.

Thursday, June 21, 2012

Apps, Children, and Legal Loop Holes

First US Patent circa 1790 for making Potash
It's always been interesting to see what people think about "owning ideas."

Here in the US there are two forms of government-sponsored "idea ownership."

The first is copyright law.  I can write something unique (or create art) and I can ask the government to recognize this with a copyright.

There is also patent law.  Originally this was for "inventions" - physical devices.  (In fact, early on the US Patent Office required working inventions before issuing patents.)  A patent is a document that describes some sort of device or process (today this includes software as well).

In both the case of patent and copyright the registered "owner" is said to own the intellectual property defined by the patent or copyright.

Today the primary uses of this seem to be the right to license and the right to sue.

As a software developer I create an application.  I make sure I record a copyright for this software and I ensure that when it runs it emits a copyright notice (the notice may be included in the executable code as well).  As the owner I can cause this software to be licensed to others: to an end user or, as in the case of MIDIProbe for example, sold through the iTunes store.

Copyright ensures that Apple, for example, does not suddenly claim ownership of my application.

In my experience copyright is fairly easy to manage.  In the case of an iTunes app, for example, I can show via backups the entire history of the product's development, which in general makes it clear who owns it (notwithstanding some other agreements or documentation to the contrary).

Copyright is usually clear because a copyrighted work carries a copyright notice - a piece of software or a work of art (for songs and physical art you can register the work with the US government without marking the actual item).

Patents are a somewhat different story from copyright.

In a patent you make one or more claims, usually about how something works: the wizzel connects via the froom to the bandersnatch, as the wizzel rotates the bandersnatch rotates causing...

Once you own a patent no one else can use the same process as your patent describes.


In the case above no one else in the US can have a wizzel rotating a bandersnatch via a froom.

But what does that mean?

To start with, it means that the patent office could not find another example of the same thing - though that does not mean there isn't one somewhere.  If, in my case, a wizzel is an electric motor, a froom a shaft, and a bandersnatch a wheel, then it's unlikely, but not impossible, that I will receive a patent, because the notion of an electric motor driving a wheel via a shaft is neither new nor unique.

Unfortunately, if the patent examiner does not delve deeply enough into my terminology and discover this he or she may grant me a patent.  And given such a patent I may then go off and attempt to sue others using the motor/shaft/wheel concept.

Now this example is pretty simple-minded.  But imagine some complex 100,000-line software program instead of the motor/shaft/wheel.  Imagine the software, for example, scans some images and picks out some aspect of the image, e.g., a fingerprint.

How might a patent examiner know what else has been done along these lines? 

What about other uses of image processing unrelated to fingerprints?

What about equivalent algorithms that some other patented (or rejected) invention uses?

The problem here is that things become complicated quickly: first in the patent office, because there are so many inventions submitted each year (millions), and second because accurately evaluating these requires vast and advanced knowledge in each field the invention addresses.

Third, upon a patent being granted, there is no certainty that someone else has not also patented the same idea in a different form.

Unfortunately, as I wrote about with Google and Oracle, disagreements arise ("Google's Waterloo - US Patent 6,061,520")

(I was both right and wrong about this.  I predicted Google would lose, and they did - with damages of $0 USD.  Oracle is appealing - which is why they took the $0 - so I guess the jury is still out...)

So, unlike copyright, patent law creates unpredictable outcomes - not to say there can't be copyright disputes (such as this one in Ars Technica about an author reusing their own work).

So think about copyright in this context.  A Scientific American article dated December 8, 2011 reads: "Both experiments are said to have seen evidence of the long-sought Higgs, pointing to a particle mass of around 125 billion electron volts, or 125 GeV. (125 billion electron volts is roughly the mass of 125 hydrogen atoms.) Such results would not constitute an ironclad discovery quite yet, being below the required '5 sigma,' a measure of statistical reliability. But the two experiments are rumored to have seen signals of 2.5 sigma and 3.5 sigma, which together would give a strong hint."

Now compare a second article: "In December, rumors circulated regarding hints of the Higgs around 125 gigaelectronvolts (GeV), roughly 125 times the mass of a proton. While those rumors eventually turned out to be true, the hard data only amounted to what scientists call a 3-sigma signal, meaning that there is a 0.13 percent probability that the events happened by chance. This is the level at which particle physicists will only say they have 'evidence' for a particle."

Try Googling "ironclad discovery quite yet, being below the required 5 sigma" - many, many blog articles turn up using exactly this phrasing, for example.

I see quite a bit of this when I research my blog entries.  Both passages seem very similar - quoting similar numbers, related values, and so on.  Of course, both are probably based on yet another article, perhaps from the LHC group...

The problem is how close is too close in copyright?
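One crude way to put a number on "too close" - a toy measure of my own, certainly not any legal standard - is to count how many word 3-grams two passages share:

```python
# Jaccard overlap of word 3-grams: 0.0 means no shared phrasing,
# 1.0 means the passages use identical three-word sequences.

def ngrams(text: str, n: int = 3) -> set:
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a: str, b: str) -> float:
    A, B = ngrams(a), ngrams(b)
    return len(A & B) / len(A | B) if (A or B) else 0.0

a = "being below the required 5 sigma a measure of statistical reliability"
b = "results are below the required 5 sigma a measure of statistical reliability"
print(f"overlap: {similarity(a, b):.2f}")   # high - verbatim-reuse territory
```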

Then there is this Ars Technica article claiming that an author wrote the same words twice for two different employers.  Thinking about the 125 GeV reports above, how different can you make two separate reports about that event sound?

On the one hand, I don't think it's fair if someone plagiarizes the work of another.

On the other, how different do we expect two articles written about the same event, even by the same author, to be if they are to be factual?

The problem, I think, is that patent and copyright law were created during a time when movable type was still relatively new and there was not a lot of "printed" content in the world.  This made it possible to actually examine particular cases in detail.

Similarly, patents required physical inventions - something that was costly in the early 1800's.  You didn't get involved in serious R & D unless you had the funding, and there was little point in stealing someone else's work.

Today, according to this WSJ article, 13-year-olds are writing and selling apps online.

What do they know about copyright or patent law?

Yet if they hit it big with a game, for example, you can imagine the field day the lawyers will have...

At the end of the day no amount of "law" will fix this - only morality.

But today's model as often as not seems to be "I'll grab what I can and let the lawyers protect me while I do..."

Not something either the copyright or patent laws of the US were designed to support.

Wednesday, June 20, 2012

The Higgs Boson versus "Climate Science"

Particles in the Standard Model (from Wikipedia)
I was interested today to see that the Large Hadron Collider has produced some interesting results - perhaps indicating that there is such a particle as the "Higgs Boson."

Particle physics is interesting to me for another reason.

To me it represents what's still right about science... let me explain.

In grade school in the 1960's we all grew up with the "atomic" model of matter: protons, electrons and neutrons - electrons spinning around a little nucleus like planets around the sun.  Of course, at higher levels of education there was much more: strong and weak forces, electromagnetism, and others.  There were also many kinds of experiments being conducted with exotic technologies like "cloud chambers," atom smashers, and so on.

All of these things produced data which physicists had to explain.  Why was there a weak force, why was there electromagnetism, and so on.

To do this, starting around 1960, a theory known today as the "Standard Model" was developed.

The idea was that there was a theoretical set of "particles" that interacted with each other in predictable ways - mostly by a notion of "exchanging particles."  Along with the "theory" are detailed mathematical descriptions of the particles and their behaviors.

The last particle predicted but not yet observed by this theory is the "Higgs Boson."

So the "physics community" built the Large Hadron Collider (LHC), at a cost of some $10 billion USD or so in order to have a device that could detect one.  (The Higgs Boson requires an enormous amount to observe and no existing device available to physicists was suitable.)

Now physicists are not like climate scientists (or doctors or medical scientists, for that matter) in that they insist their equations be demonstrably accurate.  This means that the results of these LHC experiments must have a high degree of certainty that what is being observed, the Higgs Boson in this case, is not some random other event.

To measure this certainty physicists use something called a sigma, derived from the notion of standard deviation.

Basically this is a way to determine the confidence level of your measurement.

In the case of the LHC they like to see at least 5 sigma (which corresponds to a confidence level of about 0.9999994).
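The sigma-to-confidence conversion is just the normal distribution at work; a minimal Python sketch (my own) reproduces the figures quoted in these articles:

```python
import math

def sigma_to_confidence(n_sigma: float) -> float:
    """Two-sided confidence level of an n-sigma result under a
    normal distribution."""
    return math.erf(n_sigma / math.sqrt(2))

for s in (2.5, 3.0, 3.5, 5.0):
    print(f"{s} sigma -> confidence {sigma_to_confidence(s):.7f}")
# 5.0 sigma -> confidence 0.9999994, matching the figure above
```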

Now the equations and the Standard Model don't necessarily "explain" physics as we know it today, but they do offer an accurate (5 sigma) and predictive model based on observation.  (For example, I wrote about "The End of Certainty" last post, which offers insight into completely different ways of thinking about physics.)

So for me the question becomes: why isn't something like "climate science" (or medical science) held to these same rigorous standards?

In that last post I mentioned this Nature article.  It talks about global ocean temperature.

Basically the article discusses data sets, adjustments to data sets, and attempts to compare various mathematical climate models, matching these to observed data.

Sadly you don't find much related to confidence levels or sigmas.

The reason, of course, is simple.  There is no equivalent "Standard Climate Model" that matches (in concept) the  physics "Standard Model."

Now remember that the physics "Standard Model" is simply a set of mathematical tools that matches, with a high degree of certainty, the real world.  It does not "explain" the real world (and in fact there are many things in physics, like the two-slit experiment, which seem quite unexplainable given our current knowledge base) but merely offers accurate tools to predict it at a certain level.

In climate science there are no such predictive tools or theories.

If climate science had such a tool it would be able to predict rainfall or temperature to within a few hundredths of an inch or of a degree.

The excuse is that climate science "projects" what the climate will do rather than predicting it (see this in Scientific American).

To my mind this is like Phlogiston theory from the 1600's or aether theories of the 1900's.

Both of these theories "projected" reasons for things: phlogiston for combustion (claiming that a "fire element" contained in combustible matter was released by burning) and aether for the transmission of light (claiming there was a substance permeating the universe through which light traveled).

People believed these theories and to some extent they created reliable predictions for a while - at least until they were disproved by experiment.

Physicists learned from these theories.  They learned that accuracy, repeatability and predictability were important when making predictions.

While they could "project" that the aether was bearing the light from point A to point B it turned out that their projections were wrong.

So aside from the other criticisms of "climate science" there is also the notion of "projection" versus "prediction."

(Now I don't believe that the Standard Model is the be-all and end-all of physics - there are issues with it and, as I have stated, it does not offer explanations for many things.  So clearly more science is required.  But our modern world, replete with cameras, computers, cell phones, and so on, relies on the fundamental predictability of the mathematics of the Standard Model.)

The Standard Model is, then, I guess simply a tool.

And so is "climate science."

The problem then boils down to this: which tool is sharper?

The one that can cause the CCD in your 8-megapixel phone to take a picture (how often when you push the button does the phone fail to work?) or the one that cannot accurately predict tomorrow's weather?  (I suppose "climate science" is in that sense like astrology - which could predict "Venus rising in the seventh house" at some point in the future (or whatever) without knowledge of calculus or Kepler's laws.  Perhaps we should all revert to astrology...?)

To me today's "climate science" is simply the phlogiston theory of 400 years ago.

Physics has learned a lot in that time, perhaps the same knowledge and tools should be applied to "climate science."

Don't you think that if "climate science" could build their own equivalent of the LHC and prove itself correct they would already have done so?  After all it would only cost $10 billion USD.

Tuesday, June 19, 2012

The End of Certainty

So for a couple of weeks I have been plodding through Ilya Romanovich Prigozhin's book "The End of Certainty."  I wrote about Prigozhin in my last article, on Mugs and his health travails.  I haven't gotten through all of it yet and am now toiling through the really difficult parts.

Prigozhin's premise is that the arrow of time as we perceive it is due to what he describes as a lack of reversibility, i.e., irreversibility.

The thinking goes something like this. 

Suppose we have Newton's law F = ma (force = mass times acceleration).  Now acceleration is a change in velocity over some period of time.

(Think of it this way: you are driving along at 55 mph and you decide to press the accelerator to the floor for ten seconds - during those ten seconds your velocity will increase, given a reasonable, non-junk car; that's your acceleration.  The mass of your car is, for purposes here, its weight.  So during those ten seconds the force with which you might hit something increases, say as you go from 55 mph to 70 mph.)

Now given our scenario the laws of physics don't say whether time with regard to acceleration is moving forward or backward.

That's right - if time moves "forward" (positively) then our car goes faster.

But if time goes "backward" (negatively) then our car slows down.

Now, as vaunted as Newton's law is, nowhere does it require time to move only forward.

This is not only true of Newton's law but also Quantum Mechanics and virtually all the rest of physics.

The direction of time is not constrained to move forward.
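You can see this in a few lines of code.  The sketch below (my own, with illustrative values) integrates the car example forward for ten seconds, then flips the sign of the time step and integrates again - the equations happily run "backward" and land right back on the starting state:

```python
# Newton's F = ma doesn't care which way dt points.

def integrate(x, v, a, dt, steps):
    for _ in range(steps):
        x += v * dt + 0.5 * a * dt * dt   # position update
        v += a * dt                       # velocity update
    return x, v

x0, v0 = 0.0, 24.6            # ~55 mph, in meters/second
a, dt, n = 0.67, 0.01, 1000   # gentle acceleration for ~10 seconds

x1, v1 = integrate(x0, v0, a, dt, n)    # time running "forward"
x2, v2 = integrate(x1, v1, a, -dt, n)   # time running "backward"

print(x2, v2)   # back to (0.0, 24.6), up to float rounding
```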

So Prigozhin asks, and presumably answers, why this is.  Certainly in terms of observations we never see time moving "backward" (negatively).  In fact, as far as I know, no experiment has ever shown time operating in reverse.

Yet as best as today's science, physics and mathematics can describe there is no reason it can't.

Prigozhin describes time's perceived behavior as "irreversibility."

So what does this mean?

We can take a simple example: a billiard ball being rolled on a table.  Someone picks it up, rolls it, it rebounds off the sides a few times and rolls to a stop.

Now we can "model" this with a computer quite accurately.  In fact, there are billiard programs available that work just like the real thing - replete with friction, physics and all the rest.

The problem is that the models used by these programs are by and large "reversible" (save for the injection of "randomness" in the program, which for billiard balls is not necessarily significant or observable, so we will assume for this discussion that it's not present).

So I can run the program backwards (either by design or by hand with the debugger).  In either case the result will be that the ball returns to its starting point (the equivalent of time running backward).

But in reality this is not possible.

One reason is that there is a change in the thermodynamic entropy of the billiard ball "environment."  Picking up the ball, rolling the ball, bouncing the ball off the edges all involve the transfer of heat.

Again, this might be simulatable on the computer, but Prigozhin claims in part that the thermodynamic aspects of this contribute to the irreversibility.

So what does this say about human "models" of things that simulate physical aspects, e.g., the billiard ball model or something much larger, say a climate model?

For one thing it says that these models simulate our equations for the physical aspects of these things - not the things themselves.

So we as humans have created these equations to describe things.  It's little wonder, then, that when we use them to model the actual physical world the equations operate exactly as we expect.

Unfortunately, even in the world of Quantum Mechanics, this match-up between equations and observed reality doesn't offer a description of, or insight into, the true nature of what's being simulated.  While the equations might predict what we see, they don't tell us why we see what we see.

And clearly, through Prigozhin's work, the equations must be incomplete, because they are reversible and reality is not.

I think that there is also a connection here between Wolfram's "Computational Irreducibility" and something that is physically "irreversible".  Almost as if Wolfram has discovered that, given certain equations and algorithms, there is a point at which you cannot peer back beyond the "irreversibility" of a given physical situation.

So we can think about the following experiment.  Inside a hermetically sealed, closed, shielded box which is totally isolated from the rest of the world I place my billiard table, a ball, and some means to cause the ball to roll on the table.  While I may be able to calculate exactly where the ball will end up, my calculations, at least given our level of knowledge today, will remain reversible while the situation inside the sealed box will not.  Not really different from the Large Hadron Collider, for example.

So what does all this say?

For one thing, it says that "modeling," particularly of large-scale physical systems, is basically a bogus, non-scientific endeavor, because without some notion of irreversibility the models only simulate the researchers' equations and not physical reality.  In turn this means that, by definition, our models don't offer understanding of why something is the way it is.

For another, it says that something like the pressure of a gas in a sealed container is likely governed by Wolfram's "Computational Irreducibility," because the behavior of the gas (and the molecules that make it up, their energy, momentum, and so on) is "irreversible."

What inspires me to think about this and to study authors like Prigozhin are articles like this one at Wired about humans causing global warming.  From the source article: "Large-scale increases in upper-ocean temperatures are evident in observational records. Several studies have used well-established detection and attribution methods to demonstrate that the observed basin-scale temperature changes are consistent with model responses to anthropogenic forcing and inconsistent with model-based estimates of natural variability..."

Science studying science-created, fully reversible models. 

Models based as well on admittedly incomplete data.

There is far more here than I can cover in a single post, but at least I have tried to scratch the surface of what I am thinking.  It's also difficult to present this material without all the mathematical complexities that motivate it.

Prigozhin's work is now more than a quarter century old and "The End of Certainty" was written nearly 15 years ago.  It seems clear to me that his work reaches down to a fundamental level upon which all of human understanding pivots.

I am personally interested in the relationship between Wolfram's work on "Computational Irreducibility" and how it works with Prigozhin's ideas - particularly in the area of what can and cannot be "known" about computational systems.

I guess in some ways what is most "profound" about all this is how little we can know about something as simple as the output of one of Wolfram's basic automata, even though we have a full and complete description of it.

If we cannot understand something that simple how can we understand something as complex as physical reality?

Monday, June 18, 2012

NeverSeconds, Politics, Lunch and Nagging Questions...

A "NeverSeconds" school lunch image by Martha.
I always find the most interesting things on Wired.  Today I found an article about something called "NeverSeconds" - a blog about school lunches.

The idea is that a 9-year-old Scottish girl named Martha has decided to take a camera to her school, photograph her lunches, and post them - with ratings, no less.

Her helpful dad makes various notes on some posts and kids from all over the world send in their lunch photos as well.

After some time the school officials find out and ask her to stop.  A "ruckus" breaks out, with various back-and-forth between Martha's parents and the school officials.

Martha becomes involved with a British charity called Mary's Meals - a charity serving meals to children world wide.

At the bottom of the blog is a page counter that ticks up about once every second, with close to six million hits as of today (June 18).

What's troubling about all this is that it's very similar in nature to "Catfish," which I wrote about a while back in "Wolf's Reality Test: Crocumentaries and the Fall of Civilization."

Catfish is a movie about Nev Schulman - a New Yorker who lives with his brother Ariel and Henry Joost in New York.  The story centers on Abby Pierce, an eight-year-old child artist from Ishpeming, Michigan, who creates wondrous paintings.  As things progress we discover that the entire "Abby" operation, including the paintings, is actually run by Abby's mother Angela - for what purpose is never exactly clear - supposedly to fulfill her emotional needs with friends on Facebook.

The point of my post is not that there is anything wrong with Martha or NeverSeconds. But I do wonder about the separation between the reality of an internet "presence" (like the characters in Catfish create) and the "real world."

I have grandchildren in the age range of both Abby and Martha.

Their mother is not interested in creating exposure for her children like the parents of both Abby and Martha have.

Is this really children expressing themselves or exploitation?

I cannot really say that I blame her - particularly with all the crimes related to kidnapping, perversion, and so on popular with trolling internet pedophiles.

Now I write blogs and whore for internet attention - but with the explicit purpose of exposing myself for business reasons; and I am an adult.

Martha's and Abby's parents have allowed their children to become "internet famous" without, I think, really understanding or appreciating the full ramifications of what that means.

The adult author of the Wired article is admittedly captivated by "NeverSeconds" - which is also odd.


While in general I think everyone who pays into "public school" (here in the US "public" means open to anyone who lives in the district; in Martha's case "public" means what Americans would call "private") should be interested in what children are being fed, it's hard to understand what makes this particular blog so appealing.


I suggest that the author march down to the local school and attempt to involve herself in what's going on there because I think that would be far more interesting.


Clearly nine year old Scottish Martha has uncovered the politics of something as supposedly simple as a nutritious school lunch.


Too bad no one else in the US has the same interest in finding out what's going on in their own local school district.

 

Friday, June 15, 2012

Inspired Design for Apple Apps

It's nice to see that thatgamecompany is going to provide its games like Flower and Journey on platforms beyond the PS/3.

A while back I purchased a PS/3 console so that I could better understand the "gaming" user interface (UI) model.  Prior to working on the Synthodeon products most of my work involved the software engines "under the covers" rather than the UI.  Certainly at Lexigraph we've built web sites, worked on client art work, and so on so we understand graphics and visual display issues, but until Synthodeon I really never worked on integrating the art and programming aspects together into a single "platform."

(I sort of missed the "gaming" thing - I am probably a little bit too old, and I was also heavily involved in game hardware in the mid-1970's - both in design and repair.  We worked on hardware games that looked like this:




I worked on a similar game - the difference was that the "race track" was vertical and there were multiple cars.  The game was Midway Games' "Wheels," I think.


These games were implemented in 7400-series logic chips on a multi-layer circuit board, probably 12" x 18".  The board generated video that was sent directly to the TV portion.


The display resolution of the whole screen is probably about one square inch's worth of iPhone display - and the iPhone is in full color; these games were black and white.


The boards were built and tested by hand.)

So in my lifetime things have changed quite a bit in this industry.  For example, all the computing power on earth in the mid-1970's is probably encompassed in a single PS/3 - or in an apartment building full of kiddie PS/3 users.

Games by thatgamecompany include Flower, which I posted about here before in "Flower," and Journey.


The Synthodeon UI splash screen for MIDIProbe looks like this


While not the same I think that perhaps there is a shared design "inspiration."

(Of course, the notion of not "stealing" someone else's design is always present as well.  While you can be a fan you don't want to be too much of a fan.  The soon-to-be-released Mac version of MIDIProbe looks like this:


Note that the app does not look like an iPhone or iPad - though I tried to capture the feel of both.  I also chose a different, game-inspired paradigm for the close and hide window elements.  But this has yet to get past Apple...  and that's another story in and of itself.)

Now MIDIProbe is more or less a tool and not a game - but Journey provided a lot of inspiration for "clean" and "simple" during the UI design phase.

I think one thing that leading-edge game design (which I think thatgamecompany does) is driving is the notion of simplicity and beauty in an "everyday" thing.  (I am not into things like "God of War," where you have a beautiful UI with which you rip off people's heads or pull out their guts.)

My father was an architect and as a child I was exposed to things like Bauhaus - a German design school ("arts and crafts" in the design of buildings) he was interested in.

Bauhaus produced furniture and interiors along these lines:



So you not only have to write software that performs a given function, but you have to think about it like a design-school project as well - making the job more complicated.

In any case the point of this is that my trusty PS/3 is helping me move along into the future by giving me ideas about how to display what I need to have my apps do and inspiring my artistic side.

As an interesting aside, living in a space like the one above was not so easy.  As you can see, leaving anything lying around on the floor or a chair disrupts things - so you have to clean up all the time.  Similarly there are limits to what kind of furniture you can have, i.e., it's not likely anyone will be putting a "comfy" couch in this room.

Similarly the bed above is not for those who like "soft" for sleeping.

One of Apple "big things" for Apps is outstanding graphics, i.e., they want your app to not only function but to also look exceptional.  Sadly today, especially with 650,000 or so apps this is not the case.  It seems clear that most were not created by trained designers and they mostly appear "cartoonish" is nature.

Thursday, June 14, 2012

Crop Circle Fun

A while back I watched a 2002 documentary on, of all things, crop circles.  It showed how a small group of MIT students was able to create a crop circle with a variety of credible properties in about four hours.

I am fascinated by how, even given direct evidence of people demonstrating how they actually made crop circles as a hoax, people still believe.

The basic idea that people follow to make the circles is covered in the image above right.  You take a 2" x 8" board (or the equivalent in your culture) about 4-5 feet long.  You drill two holes - one at each end - and pass a rope through to make a "handle" like the image depicts.

You then walk forward: pick the board up with the rope, push it forward while keeping it under your foot, and step down.  This lays the plants underfoot forward (in the same direction you are traveling).  You then repeat this continually.

(To a child growing up in farm country this is a totally obvious maneuver.  As a small kid you had to do this with your foot to make a path through fields of wheat - which often pissed the farmer off.)

The big trick here - and why it's called a crop circle - is the second rope.

The second rope is attached to something at the center of the circle, e.g., a stake, a person, etc.  The person stomping down the plants holds the other end of the rope (perhaps with a knot so that their position does not change) while moving around the center of the circle.

Now as children in the 1960's we had a toy called a Spirograph:


It came with a bunch of plastic "gears" with holes in them.  You placed the gears inside larger "wheels" - also with gear teeth - and you placed a pen into one of a variety of holes in the "gears."



There was also the "Super SpiroGraph:"



As you moved the pen around, the motion of the pen point was governed by the size of the gears and the location of the hole the pen was in.

The result, as you can see from the cover of the box, is "crop circles:"



Notice they all (for the most part) revolve around a center point (the center of the SpiroGraph "circle" wheel).

People always wonder how this could be done quickly and in the dark (assuming that a human couldn't get it done so it must be aliens).

Of course, with a pencil and string you can easily duplicate the "SpiroGraph" concept and test out your designs in the comfort of home with inexpensive tools from the dollar store.  Once it's working you merely "scale up" the string sizes to rope sizes and go about your work in farmer John's field.
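If you'd rather skip the pencil and string, the same curves fall out of the standard hypotrochoid equations.  Here is a minimal sketch in Python - my own toy, not any particular app's code - where R is the fixed ring, r the rolling gear, and d the offset of the pen hole from the gear's center:

    import math

    def spirograph_points(R=105, r=30, d=25, steps=2000):
        """Yield (x, y) points along a hypotrochoid ("Spirograph") curve."""
        turns = r // math.gcd(R, r)  # revolutions before the curve closes
        for i in range(steps + 1):
            t = 2 * math.pi * turns * i / steps
            x = (R - r) * math.cos(t) + d * math.cos((R - r) / r * t)
            y = (R - r) * math.sin(t) - d * math.sin((R - r) / r * t)
            yield (x, y)

    # Dump the points; feed them to any plotting tool to see the pattern.
    for x, y in spirograph_points():
        print(f"{x:.2f}\t{y:.2f}")

Scale the coordinates from inches up to feet and you have a field plan.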

There are tons of iPhone apps that do the "spirograph" functionality as well.

Of course there are still many "doubters" who are convinced that aliens are doing all the heavy lifting.

Chief among their "evidence" are things like this:


These purport to show that the crop circle plant stems have been somehow altered (full disclosure on how this must be aliens here).

Since it's impossible to prove a negative, I am not going to attempt to show that aliens didn't do it.

All this despite English pranksters Doug Bower and Dave Chorley revealing in 1991 that since 1978 they had personally made more than 200 crop circles (using the techniques I outlined above plus a few other tricks).

Are there natural phenomena creating circles?  Perhaps in some cases - but most crop circles are too regular and formulaic to be natural.

Yet despite all of this people still "believe" (see this in Italy).

Today there are, of course, genuine "crop circle researchers" who invest a lot of time, effort, and resources into "crop circle research."  But this, like many other kinds of "research," has some problems.

First off, without capturing how these are made, i.e., unhoaxed video of an alien craft creating one, and by ignoring the obvious (hoaxers coming forward with plausible explanations), it's a little difficult to imagine what kind of research can be done.  The planet is, after all, an "open system."

So while alien craft could be causing these circles so could humans.

This human habit of "believing" in something regardless of evidence to the contrary is interesting, and it seems to persist in science too, e.g., climate research.  Here we pick what we think is causing some effect we observe and figure that it "must be" the cause, e.g., man-made CO2.

Again, the earth is an open system and without concrete causal data how can we know?

It seems like pseudoscience is taking over the planet - kind of like the "self-based morality" you see today: "I can do what I want" and now "I can believe what I want" - which is fine as long as you don't pretend that it's actually science and fact.

The real reason I think crop circles persist is that the surface of the earth on which they can be created is vast, and normally the work is done at night in the dark - so there is only a small chance of someone coming upon the activity (no one typically walks about in the middle of a large grain field at night).

I'd like to write more but an alien craft just landed outside in my field and I have to go...

Wednesday, June 13, 2012

Tools for Musicians: Synthodeon's MIDIProbe

So, after about eighteen calendar months of effort the first Synthodeon, LLC app is in the iTunes App store.  You can visit www.synthodeon.com for more details.  The app, MIDIProbe, and its functionality are described in detail here on that site.

MIDIProbe is a high-performance tool for tracking MIDI events in an iOS (iPhone, iPad) or (soon to be released) Mac environment.



MIDIProbe - Synthodeon, LLC
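For the curious: "tracking MIDI events" means watching the raw bytes that MIDI devices exchange.  A MIDI channel message is a status byte (message type in the upper four bits, channel in the lower four) followed by one or two data bytes.  Here is a toy decoder, sketched in Python for brevity - an illustration of the protocol, not MIDIProbe's actual code:

    def decode_midi(msg):
        """Decode a basic MIDI channel message into something readable."""
        status, *data = msg
        kind = status & 0xF0           # upper nibble: message type
        channel = (status & 0x0F) + 1  # lower nibble: channel (1-16)
        if kind == 0x90 and data[1] > 0:
            return f"Note On  ch{channel} note={data[0]} velocity={data[1]}"
        if kind == 0x80 or (kind == 0x90 and data[1] == 0):
            return f"Note Off ch{channel} note={data[0]}"
        return f"Other    ch{channel} status=0x{status:02X} data={data}"

    print(decode_midi(bytes([0x90, 60, 100])))  # middle C pressed
    print(decode_midi(bytes([0x80, 60, 0])))    # middle C released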

Now it didn't take eighteen months to create MIDIProbe - that took only about six weeks.

Though it took a lot of time to get the whole "iPhone App" schtick down - from creating the proper icons to following the Human Interface Guidelines (HIG) - the real work was in developing and/or acquiring the software libraries that allow identical apps to run on both iOS and Mac OSX, and in developing the suite of applications that are to follow MIDIProbe.

It took some time to get Synthodeon, LLC set up and running (thanks in part to www.LegalZoom.com) - I really did start a new business there and it cost only $100 USD plus the cost of various state filings.

The iTunes Connect experience, where you upload your app, is much like the CD Baby experience that I used to launch my CD Fall to Earth last year.  iTunes Connect is a bit more picky and you have to be a registered developer ($99 USD) to do it.

(As a side note, it's interesting to think about the comparative experience of doing a CD versus an app.  The costs to launch, at least for me, were roughly comparable.  Both require a significant investment in artwork and presentation.  Both require you to have at least some mastery of content.)

In addition, and probably most disconcerting, you have to have your app "reviewed" by Apple to ensure that it's what it says it is.  They check for crashing, proper use of the HIG, etc., so there is always some concern that you overlooked something.

Right now version 1.0 is in the app store.  But that will change as soon as I complete the work required to get the iSym version of MIDIProbe to run on Mac OS X 10.6 (Snow Leopard).  It currently runs on 10.7 (Lion), but a lot of music/MIDI types still run the older 10.6 version due to issues with Lion (see the Synthodeon blog).

I created iSym using "Chameleon."  Basically, Chameleon is a set of Mac OSX functions that bring iOS functionality to Mac OSX.  iOS and Mac OSX are to a large degree related; for example, many of the basic OS functions and Objective-C libraries are compatible.  But iOS uses something called UIKit as opposed to the Mac notion of NSWindow - there are no "windows" save the screen in iOS, so much of the idea behind the Mac OSX windowing machinery is not needed per se.  The developer of Chameleon created a layer on top of the Mac frameworks to bring UIKit to OS X.

Chameleon's UIKit is not a complete port of iOS - just enough to do a simple UI.  And there are many subtle differences in how things are handled - particularly in the realm of audio and events.  But an app like MIDIProbe can be done in about a dozen files (I have written a number of libraries that I use as well - also portable).
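To give a feel for what such a compatibility layer looks like, here is a toy sketch - in Python for brevity, with made-up class names; the real Chameleon is Objective-C on top of the Mac frameworks.  The idea is simply to re-create the API the app expects using whatever the host platform provides:

    class NativeWindow:
        """Stand-in for the host toolkit's drawing surface."""
        def draw_rect(self, x, y, w, h):
            print(f"native rect at ({x}, {y}) size {w}x{h}")

    class UIView:
        """The iOS-style API the ported app codes against."""
        def __init__(self, frame):
            self.frame = frame              # (x, y, width, height)
            self._backing = NativeWindow()  # shim: delegate to the host

        def set_needs_display(self):
            self._backing.draw_rect(*self.frame)

    # App code reads like iOS code but runs on the host platform:
    view = UIView(frame=(0, 0, 320, 480))
    view.set_needs_display()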

My hope is to bring the same functionality to Windows at some point as well - for there is a far larger market there.  Chameleon itself is pure OSX code - but the model should be relatively portable, and I can see the path to completing it.

The Apple iOS development environment changes about every six months or so, requiring some effort to keep up and keep things straight.  Apple seems to make a concerted effort to usher everyone along into new releases and to retire old ones - I suppose for the purpose of eliminating legal issues, such as the location-logging issue I blogged about here a while back.

Although I devote a lot of time to Synthodeon these days I am still quite active with Lexigraph and its customers, though the current economy has not done its customers any favors.

What's interesting here is that it's quite a transition even from PDF software development to App development.  In the Lexigraph world the customers have been around, in some cases, for a decade.  They are corporate in nature; you deal with a series of people over long periods of time - with corporate acquisitions, with accounting, with problems, with crises, and so on.  The software is tied inexorably to the customers' work - you test with content from them, you build systems that integrate with their environment.

In the App world everything and everyone is anonymous.

Synthodeon, LLC, and Fall to Earth, for that matter, already have active customers.  I don't have any idea who they are; I know some are in foreign countries; I will probably never speak with them by voice; I might see a review they write or get an email.

You invest an enormous amount of time and effort into something where there is no way to tell what the payback will be.  In my case, for example, one of the goals of releasing Fall to Earth was to use it as a demonstration platform for the App products I was building - after all, what point is there to creating a music product if you don't (or can't) use it yourself?

I have a few tools to track the level of activity on the sites and blogs - but it's really hard to tell what people are doing and thinking.

For me it was important to get this first app launched.

There are more to follow, as well as customized hardware and music-related performance gear.  But it will take time to get all this completed.

I guess in today's world my only "calling cards" for Synthodeon, LLC are my music and my apps - far different than the days of Lexigraph with expensive and troublesome employees, phone sales, printed business cards and literature, air travel, trade shows, and so on.

Tuesday, June 12, 2012

Wither Apple

Wikipedia's "MacBook Pro" entry is out of date...  :-(
After nearly a decade, the venerable aluminum Mac laptop with a 17" display is gone from the Apple store.

According to various sources the reasons include the 15" is less bulky, the new 15" Retina display is plenty dense, and so on and so forth.

I have used the 17" laptop for nearly its entire life and I am saddened by this turn of events.

The 15" is simply too small for real work, real demos, and so forth.  While its nice to have a lot of pixels there's really no point in having them for lots of things - like programming, demos, and web surfing.

Currently I use two in my office - one with Lion, one with Snow Leopard.  This is because the newest Xcode releases don't work reliably for maintaining an existing customer base.

You see, Apple is very busy advancing the leading edge.  So busy, in fact, that those who rely on their technology for an installed base have problems.  Problems because corporate inertia means that in-the-field life cycles are longer than Apple's life cycles.  So just as a release is deploying in the "corporate field," Apple is releasing the next, somewhat incompatible version of OS X or iOS.

(The "desktop" Apple boxes are facing a somewhat similar fate as well - lack of updates - paltry updates...)

I have noticed this as part of the Cook regime.

I think that Steve Jobs understood that the supply lines leading up to the leading edge needed to be maintained because without them the leading edge would wither and die.

Recently my son was telling me about his experience at a big .NET developer show.  All the Microsoft presenters were running Mac laptops.

These 17" laptops are probably the best computers I have ever used.  Relatively fast and reliable.  No, they are not the lightest thing around, but then again I can run my business from one.  I can see everything I need to see: I usually have at least two dozen apps running at the same time.

While pixel density is nice, it's not helpful if it makes everything smaller - and each new laptop with a higher-density display shrinks more of the OS output down.

To me it seems like Apple is grasping at straws to squeeze every last drop of revenue from their iOS products and in the process cutting off the supply lines behind.

Hey Apple, how about something new instead?

iOS and iPhones and iPads are great...  but they are old news at this point.

You can keep adding gimmicks like Siri to them, but so what?

The Android phone market is now larger.  Sure, it's not as beautiful or elegant, but you know what?  People are buying them.

I recently bought a 17" Dell laptop for around $1,500.00 USD - probably about 60% the price of a MacBook Pro.  I spend the extra money for the MacBook because it's easier to use, it works well with Time Machine, and so on.  There is enough value there to justify the purchase due to the time I save.

(My standard thought is: add up all the time on Windows you spend watching the hourglass cursor spin.  I'll wager you waste two man-weeks of time with Windows each year - more than enough, at even $750.00 USD a week in time, to justify the extra cost.)
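A quick sanity check on that claim, with my made-up figures spelled out (swap in your own):

    # All figures are my guesses from above, not measurements.
    weeks_wasted_per_year = 2           # man-weeks lost to the hourglass
    value_per_week_usd = 750
    hardware_premium_usd = 2500 - 1500  # rough MacBook-vs-Dell price gap
    waste_usd = weeks_wasted_per_year * value_per_week_usd
    print(f"time lost: ${waste_usd}/yr vs. hardware premium: ${hardware_premium_usd}")

At these rates the MacBook pays for its premium in well under a year.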

But with Lion this is changing - I now spend more time watching the "beach ball" spin than ever.

It will probably get worse with the next release.

With its recent peak in stock price I think Apple has reached the apogee of its existence.  Jobs is gone and Cook is too focused on the bleeding edge and not enough on the supply lines.

Cook is far too worried about the stock price and not worried enough about the future and technical innovation.  iTV, or whatever it's called today, is a bust.  There are only smartphones now - no more iPods.  Which leaves Apple with really only one main product: phones.  iPads to me are basically phones that you can watch a movie on but can't use to make a call - which is kind of how Apple treats them.

I'd like to see a 17" iPad-ish laptop - something that would work standalone as an iPad or as a display with a detachable keyboard.  iPads suck for reading technical things - the display is too small to get a large diagram open and visible (and no, it does no good to see it in teeny tiny type).  No doubt the market is "too small" for this sort of thing - hence no "innovation."

The only real thing holding me back from jumping ship to Android is my distaste for Java.

Soon the only Apple products available will be phones with 4" 50 megapixel displays - no more computers, no more laptops - all useless because no one can see all the pixels in a 4" megapixel display.

Some claim there will be a new 17" laptop along in a few months when the "retina" display for that format is cheap enough...  great. 

What do I do in the meantime if I need a new machine?

Thanks Apple.

Monday, June 11, 2012

Global Sea Level, North Carolina and Responsible Public Policy

I came upon this fascinating article over the weekend: "N. Carolina Senate Decides to Include Science in Sea Level Projections After All."  According to this source the North Carolina government has been chastised and ridiculed over the fact that it prohibits any “rule, ordinance, policy, or planning guideline that defines sea level or a rate of sea-level rise within a coastal-area county” that falls outside of an official state rate of rise - i.e., if you make public policy (law) in North Carolina you must use the state's definition of sea-level rise.

The argument seems to be related to whether or not the sea level will rise 8 inches or 39 inches in the next 85 years.

No doubt the North Carolina politicians are using figures like this to make their estimates:

As you can see, it shows that sea level has risen some 200 millimeters over the last hundred and thirty years.  In North Carolina, somebody in state government took a ruler, drew a line off to the right following the trend of those one hundred and thirty years, and came up with 8 inches more of sea level rise by 2100.
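Here is the arithmetic behind that ruler, as a sketch - using the ~200 mm over ~130 years I read off the chart (my reading, nothing official):

    # Back-of-the-envelope linear extrapolation, "ruler" style.
    rise_mm, years = 200.0, 130.0   # figures read off the chart above
    rate = rise_mm / years          # ~1.5 mm per year
    projection = rate * (2100 - 2012)
    print(f"{rate:.1f} mm/yr -> {projection:.0f} mm "
          f"(~{projection / 25.4:.1f} inches) by 2100")

Fit the whole record and you get around five inches; fit only the steeper recent decades (roughly 2-3 mm per year) and you land nearer the 8-inch figure.  The point is the method, not the exact number.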

Then they attempted to make public law that said, in effect, we're going to work as if this is the case for long-range planning - i.e., how far from shore do I put a public highway given a thirty- or one-hundred-year life expectancy for the project?

For officials sworn to uphold the laws of the state, I am not sure what else should be done.

But others disagree (like those at Ars Technica, Scientific American, and the "Colbert Report").

They believe that, due to global warming/climate change, the trend line above is about to go vertical and the sea level will rise by 39 inches by 2100 instead.

So what should a law maker do?

Apparently, at least according to the sources listed above, run around like Chicken Little - waving their arms, claiming the world is about to end - and spend taxpayer money on what might happen.

By this reasoning we should be spending public money building bunkers to save us from alien invasions and ghost attacks as well - because, well, they might happen too.

Wikipedia has a nice historical chart on sea level:


You can see from this that, as the glaciers melted at the end of the last ice age, the sea level went up some 120 meters (120,000 mm).  Of course, because the line levels out over the last eight thousand years, humans tend to think that this must be the "normal" sea level - but I doubt this is the case.

There are many sources showing that glaciation is cyclical, so one suspects that sea level varies significantly in a cyclical manner over time as well (source here - assuming that CO2 drives temperature and temperature drives sea level directly):

Unfortunately glaciers and varying sea levels wipe out a lot of direct historical evidence about what's going on in the climate.

Articles like this suggest that in fact glaciers that are receding today were a lot smaller in the last few thousand years.

Other articles suggest that as ice sheets melt in the Arctic the plankton below bloom in the presence of sun and CO2 and change the climate.

(I linked only one for each case above, but there are many more for you to Google.)

So scientists have to guess.

And since we're talking about 120,000 mm of sea level change, a 1% error would be 1,200 mm - roughly 47 inches, even more than the 39-inch projection.

Now, last time I checked, climate scientists, weather forecasters, and fortune tellers were unlikely to come within an accurate 1% estimate of anything they work on.  For example, there are many web sites comparing climate science predictions to fact.  Most show something like this:


(The small purple dots at the bottom are the actual climate data; the bright green, orange, and blue lines are the "predictions."  The error here is significantly larger than 1%.  And yes, temperature is actually rising - but so what?  Without knowledge and evidence of the cause it's no more significant than temperature falling or staying the same...)

So based on this is it wise to make public policy (and spend tax dollars) on climate science?

Would you want your money spent on what might happen, even at a 5% error rate?  (I make up the 5% based on a combination of historical facts: cyclical climate including ice ages, corresponding temperature swings, changes in various geological factors, etc.)

I think the responsible answer is no.

Now, in North Carolina I did not see it reported anywhere that the laws in question were for all time and could never be changed based on new evidence.  All I saw was a bunch of politicians saying, in effect, "I'm responsible for public funds, and without some kind of actual evidence I am going with law based on proven past history rather than speculation."

North Carolina's budget appears to run close to break-even according to this link - and certainly, year to year, runs within about 10% of even, even given today's economy.

(Imagine - a responsible state government with a balanced budget.  Seems like they understand the concepts of estimation and evidence.)

The bottom line here is that believers in climate science dogma want to "shout down" responsible government.  But without some shred of evidence that things will change dramatically, why would responsible adults do anything but what North Carolina has done?

Certainly aliens might invade, but do I want public funds spent on that today?

What about the Yellowstone volcanic caldera erupting?  It surely might happen and, I would argue, there is better evidence for this as a threat to humanity than climate change.

However, since this is not tied directly to evil "human activity and fossil fuels," no one cares.  But wouldn't a volcanic ash cloud the size of the Louisiana Purchase be a problem?

Yup, I think so...