Tuesday, November 30, 2010

Has Apple reached the tipping point in the Enterprise?

Business has traditionally been a tough market for Apple, with Windows having a near lock on IT.

But the iPad is proving to be too attractive for business to ignore, and reports like 4 Reasons Enterprise IT Should Support the iPad and Gartner to CEOs: Seize the iPad Opportunity Now indicate that the barriers to Apple products are falling.

With the recent introduction of the second-generation MacBook Air, I expect a lot of enterprise employees will be showing up to work with one, intending to replace their work-provided Windows machines.  I wouldn't underestimate the demand from the senior ranks for IT support for the Mac, and that will open the door to wider adoption lower down.

Still, the relative cost advantage of Windows machines may keep enterprises from outfitting the entire organization with Macs.  But enough people will either have the clout to get one on the company's dime or will buy one themselves that the Mac laptop will no longer stand out as the oddball in the conference room...

My prediction: Sales of the MacBook Air for the holiday season will exceed all expectations, and in January talk will begin about Apple growing to at least one-third of the PC marketplace.

Sunday, October 17, 2010

Is "The Social Network" the "Wall Street" of the 2000s?

In the movie "Wall Street", Oliver Stone captures the excesses of the 1980s "Corporate Raider", distilling the essence of that character down to Gordon Gecko.  Gecko is a self-made man, in control of his fate, and lives a life more exciting and wealthy than any of us can hope to lead.

Stone intended Wall Street to be a cautionary tale about how the combination of greed and amorality leads to an unhappy ending (in Gekko's case, a very long jail term, per Wall Street 2).  Yet the movie had a very surprising and opposite result: legions of future Wall Streeters saw that movie and decided that they, too, wanted to be masters of the universe and become fabulously rich from finance.  That we are faced with a large financial crisis now as a result of those Wall Streeters' greed and recklessness can only stand as a sad testament to the inadvertent effects of the 1980s movie.

Comes now the movie "The Social Network", and the parallels between it and the movie Wall Street are striking.  We have a young man who appears willing to act amorally to achieve his goals.  We have an older mentor who introduces him to the ways of the world.  We have people whose lives and fortunes are severely affected by the double dealing of the movie's protagonist.  And we have the requisite acquisition of money, fame, and women.  Only the story stops now, at the upswing of the arc, and besides paying a few people off, the protagonist has succeeded beyond his wildest dreams.

Which leads me to my question: is "The Social Network" the new "Wall Street"?  Will legions of young folks decide that the way to fame, fortune (and, it seems, the beds of college coeds) is through technology start-ups?

I hope so.  Because where the baby Wall Streeters learned that they could get rich by moving other people's money around and taking a bit off the top, the baby Social Networkers will learn that they can get rich by creating a product that meets a real need in the marketplace.

But I also doubt it.  Because working on Wall Street may be tough and competitive, but it's still a corporate job; the only risk you take is that you won't progress as far or as fast as you'd like.  The path Zuckerberg and Facebook took was an all-in bet, and, oddly enough, the "Wall Street" offspring are very risk-averse when it comes to their own fortunes...

Tuesday, April 27, 2010

Why Apple hates Flash Apps

It should have dawned on me earlier, but if you look at how Adobe AIR applications are written, it is very similar to the way Apple builds applications with Objective-C.  The two environments are conceptually much closer to each other than Apple's is to, say, Microsoft's C# or to Java.

Both Apple's and Adobe's environments make it easy to add special graphical effects to your application, and both are really good at producing much more immersive applications than you can build in other development environments.  At the same time, each has its own look -- you can recognize an AIR app very easily next to a native iPhone app, much like you can tell a Java app from a native one on any platform.

If I were Apple, I would dislike AIR apps for a couple of reasons.  First, I'd be unhappy about Adobe bringing to all platforms the kind of capabilities that had been exclusively Apple's.  Second, I wouldn't want people getting used to the AIR look and feel on my platforms instead of my native one.

If AIR had produced completely native-looking apps on the iPhone, I don't think there'd be half the problem...

Tuesday, April 20, 2010

The iPad and the PC's dirty little secret

Pretend for a moment you're not a techie.  You don't have a computer science / engineering degree.  You haven't spent years maintaining PCs, installing software, ... you are, let's say, Uncle George.

Pray tell me, as a mere user of PCs, what is the significant improvement from Windows 95 to Windows 7? Saying it's newer is not an answer, nor any of the newer-in-disguise answers (e.g., Windows 95 is not supported by Microsoft, new hardware doesn't come with Windows 95 drivers, etc.).  Saying it's faster is explaining Intel's advancements, not Windows.

How do you tell Uncle George in what way the progression from Windows 95 through 98, Me, XP, and Vista, and finally to 7, has changed the way he uses computers -- in a way that he understands and recognizes?

You can't, of course.  Put Uncle George on Windows 95 on your dual-core machine (assuming you could find all the right drivers for it), and he would probably just tell you 95 is a lot faster.

This is not an anti-Microsoft complaint; put Uncle George on Linux, and the only bash he'll use is when he takes a sledgehammer to the computer.

OK, so if Windows hasn't improved his perceptible user experience ("more stable" and "less malware" are both questionable), has it at least gotten easier for him to maintain his system?  Has the progression to Windows 7 greatly improved Uncle George's ability to administer his own machine?  (Is that laughter I hear?)

After 15 years of Windows improvements (we'll skip Windows 3.x out of charity for Microsoft), the PC is still basically unmaintainable by common people.  Buy a new PC, load it up with anti-virus software and Norton Utilities, set it all up to go, and within a few years the system will be slower, buggier, and more likely to be infested than ever.  PCs have not been made usable by the common man.  (The common woman has a shot at it, I believe ;-) ).  Remember streaming video?  So does Uncle George, but it got jerky and slow a few years ago for no good reason, and he's given up on it.  Why does this happen to PCs?

Look at other things of similar complexity: just because Uncle George gets lost and drives down the wrong road, it doesn't mean his car develops a malevolent mind of its own and starts shooting at other cars.  Cars don't run on Windows.

And that's where the iPad comes into play.  So many people jump on Apple's case about its vetting of applications and its rules for developers.  And, yes, Apple is thinking of its shareholders first.  But it is definitely thinking of Uncle George second.  You can't get a virus on an iPad.  No program you install is going to keep Netflix from crisply streaming video.  The iPad isn't going to insist on rebooting right this very second because some third-tier application has decided to upgrade underneath you.  I guarantee you that the performance of your iPad will be as good on its third anniversary as it was on its unboxing day.

And that's the dirty secret of PCs.  When Uncle George figures this out, he may wonder why he bothers with a PC at all.

Saturday, April 10, 2010

Apple's limiting developer tools for iPhone

When Lee Brimelow said "Go screw yourself Apple", he was reflecting what many people feel about Apple's decision to limit the use of development tools to cut out the use of Flash & Flex in building iPhone/iPod/iPad applications.

But I think that is also a short-sighted view of Apple's motivation.  I have a law about observing people's actions: "just because you don't understand the rationale doesn't mean it's irrational."  For Apple to limit developer tools out of spite would be irrational.

So what could the rational rationale be?

Well, I think a big clue is that this happened right at the same time as the iPhone OS 4 preview.  What did Apple show off that would cause them to want to limit development tools, and why would they care?

I think you can point to two things: Multitasking and UI experience. 

It's clear that Multitasking is a big thing -- it was first on the list.  And it's clear that Multitasking was hard to get right -- there's no reason to have delayed it this long except to find the right way to do it.  And why is Multitasking hard?  Because Apple is very concerned about the user experience, and Multitasking is, generally, a big drag on the user experience of media, especially video.  So Apple spent a long time figuring out how both the foreground application and the background applications could deliver a seamless user experience.

No matter what you might say, adding additional layers -- such as a Flash interpreter -- is going to degrade application performance.  In a single-tasking system, either the cost isn't enough to be noticeable, or, if it is, only that application is penalized.  But in a multitasking system, non-native code can affect not only its own performance but the quality of that Skype call running in the background.  And remember, the penalty is not just in speed but in size; both become important in a multitasking environment.
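
To make the cost of that extra layer concrete, here's a minimal sketch -- my own illustration, using a two-opcode bytecode I made up for the purpose, not anything from Apple's or Adobe's actual runtimes -- of why interpretation is slower than native code: every interpreted "instruction" pays a decode-and-dispatch tax that a native loop never does.

    #include <stdio.h>
    #include <time.h>

    /* A made-up two-opcode bytecode: OP_ADD adds its operand to the
       accumulator; OP_HALT ends the program. */
    enum { OP_ADD, OP_HALT };

    /* Run the bytecode once per iteration, paying a decode-and-dispatch
       cost on every instruction -- the interpreter's "extra layer". */
    static long run_interpreted(const int *code, long iterations)
    {
        long acc = 0;
        for (long i = 0; i < iterations; i++) {
            const int *pc = code;
            int running = 1;
            while (running) {
                switch (*pc++) {
                case OP_ADD:  acc += *pc++; break;
                case OP_HALT: running = 0;  break;
                }
            }
        }
        return acc;
    }

    /* The same arithmetic with no interpretation layer. */
    static long run_native(long iterations)
    {
        long acc = 0;
        for (long i = 0; i < iterations; i++)
            acc += 1;
        return acc;
    }

    int main(void)
    {
        const int program[] = { OP_ADD, 1, OP_HALT };
        const long n = 100000000L;  /* 100 million runs of each */

        clock_t t0 = clock();
        long a = run_interpreted(program, n);
        clock_t t1 = clock();
        long b = run_native(n);
        clock_t t2 = clock();

        printf("interpreted: %ld in %.2fs\n", a, (double)(t1 - t0) / CLOCKS_PER_SEC);
        printf("native:      %ld in %.2fs\n", b, (double)(t2 - t1) / CLOCKS_PER_SEC);
        return 0;
    }

Build it with optimizations off (e.g., cc -O0 interp.c) so the compiler doesn't simply fold the native loop away; even this crude comparison will typically show the interpreted path running several times slower.  And a real interpreter also carries its dispatch machinery and runtime around in memory -- that's the size penalty.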

Second on the list is the UI experience.  Native iPhone apps share a very similar UI experience, but a non-native tool is going to produce a different UI.  You might argue that it's better, or better suited to the task at hand, but nonetheless it is different.  And Apple is introducing new interaction models as well, from iAds to background processing -- how likely are Flash developers to adopt those?

I think it's a better bet that Apple's desire to deliver a consistent experience -- consistent performance with a consistent UI -- is at the heart of the change in development tools.  And ensuring that developers use -- and know -- a set of tools native to the environment helps ensure that consistency.