Tuesday, April 27, 2010

It should have dawned on me earlier, but if you look at how Adobe AIR applications are written, the approach is very similar to the way Apple builds applications with Objective-C. The two environments are conceptually much closer to each other than, say, Apple and Microsoft's C#, or Apple and Java.
Both Apple's and Adobe's environments make it easy to add special graphical effects to your application, and both are really good at producing much more immersive applications than you can build in other development environments. At the same time, each has its own distinct look -- you can recognize an AIR app very easily next to a native iPhone app, much like you can tell a Java app from a native one on any platform.
If I were Apple, I would dislike AIR apps for a couple of reasons. First, I'd be unhappy about Adobe bringing to every platform the kind of capabilities that had been exclusively Apple's. Second, I wouldn't want people getting used to the AIR look and feel on my platforms instead of my native one.
If AIR had produced completely native-looking apps on the iPhone, I don't think there'd be half the problem...
Tuesday, April 20, 2010
The iPad and the PC's dirty little secret
Pretend for a moment you're not a techie. You don't have a computer science or engineering degree. You haven't spent years maintaining PCs, installing software, and all the rest. You are, let's say, Uncle George.
Pray tell me, as a mere user of PCs, what is the significant improvement from Windows 95 to Windows 7? Saying it's newer is not an answer, nor are any of the newer-in-disguise answers (e.g., Windows 95 is no longer supported by Microsoft, new hardware doesn't come with Windows 95 drivers, etc.). Saying it's faster is crediting Intel's advancements, not Microsoft's.
How do you tell Uncle George how the progression from Windows 95 through 98, Me, XP, and Vista, and finally to 7, has changed the way he uses computers -- in a way that he understands and recognizes?
You can't, of course. Put Uncle George on Windows 95 on your dual-core machine (assuming you could find all the right drivers for it), and he would probably just tell you 95 is a lot faster.
This is not an anti-Microsoft complaint; put Uncle George on Linux and the only bash he'll use is when he takes a sledge hammer to the computer.
OK, so if Windows hasn't improved his perceptible user experience ("more stable" and "less malware" are both questionable), has it gotten easier for him to maintain his system? Has the progression to Windows 7 greatly improved Uncle George's ability to administer his systems? (Is that laughter I hear?)
After 15 years of Windows improvements (we'll skip Windows 3.x out of charity for Microsoft), the PC is still basically unmaintainable by common people. Buy a new PC, load it up with anti-virus and Norton utilities, set them all up to go, and within a few years the system will be slower, buggier, and more likely to be infested than ever. PCs have not been made usable by the common man. (The common woman has a shot at it, I believe ;-) ). Remember streaming video? So does Uncle George, but it got jerky and slow a few years ago for no good reason, and he's given up on it. What is it that happens to PCs?
Look at other things of similar complexity: just because Uncle George gets lost and drives down the wrong road, it doesn't mean his car develops a malevolent mind of its own and starts shooting at other cars. Cars don't run on Windows.
And that's where the iPad comes into play. So many people jump on Apple's case about its vetting of applications and its rules for developers. And, yes, Apple is thinking of its shareholders first. But it is definitely thinking of Uncle George second. You can't get a virus on an iPad. No program you install is going to keep Netflix from crisply streaming video. The iPad isn't going to insist on rebooting right this very second because some third-tier application has decided to upgrade underneath you. I guarantee you that the performance of your iPad will be as good on its third anniversary as it is on the day you unbox it.
And that's the dirty secret of PCs. When Uncle George figures this out, he may wonder why he bothers with a PC at all.
Saturday, April 10, 2010
Apple's limiting developer tools for iPhone
When Lee Brimelow said "Go screw yourself Apple", he was reflecting what many people feel about Apple's decision to limit the development tools allowed for building iPhone/iPod/iPad applications, cutting out Flash and Flex.
But I think that is also a short-sighted view of Apple's motivation. I have a law about observing people's actions, which is "just because you don't understand the rationale, that doesn't mean it's irrational." For Apple to limit developer tools out of spite would be irrational.
So what could the rational rationale be?
Well, I think a big clue is that this happened right at the same time as the iPhone OS 4 preview. What did Apple show off that would cause them to want to limit development tools, and why would they care?
I think you can point to two things: Multitasking and UI experience.
It's clear that Multitasking is a big thing -- it was first on the list. And it's clear that Multitasking was hard to get right -- there's no reason to delay it except for trying to find the right way. And why is Multitasking hard? Because Apple is very concerned about the user experience. And, generally, Multitasking is a big drag on the user experience of media, especially video. So Apple spent a long time trying to figure out how both the foreground application and the background application can deliver a seamless user experience.
No matter what you might say, adding extra layers -- such as a Flash interpreter -- is going to degrade application performance. In a single-tasking system, either the overhead isn't enough to be noticeable or, if it is, only that application pays the penalty. But in a multitasking system, non-native code can affect not only its own performance but also the quality of that Skype call in the background. And remember, the penalty is not just in speed but in size -- both of which become important in a multitasking environment.
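To make that layering cost concrete, here's a minimal sketch in plain C -- nothing to do with Apple's or Adobe's actual runtimes, and the tiny "instruction set" is invented purely for illustration -- comparing a natively compiled loop against the same arithmetic pushed through a toy bytecode interpreter. The per-instruction dispatch is the extra layer:

    /* Illustration only: a toy "VM" invented for this post, not Apple's or
       Adobe's actual runtime code. It contrasts a natively compiled loop with
       the same arithmetic executed through a per-instruction dispatch loop. */
    #include <stdio.h>
    #include <time.h>

    enum { OP_ADD, OP_LOOP, OP_HALT };      /* made-up instruction set */

    static long run_native(long iterations) {
        long total = 0;
        for (long i = 0; i < iterations; i++)
            total += 1;                     /* compiled straight to machine code;
                                               a real compiler may even fold this
                                               loop away, which only strengthens
                                               the point */
        return total;
    }

    static long run_interpreted(long iterations) {
        /* Toy program: add 1, loop until the counter runs out, then halt. */
        const int program[] = { OP_ADD, OP_LOOP, OP_HALT };
        long total = 0, remaining = iterations;
        int pc = 0;
        for (;;) {
            switch (program[pc]) {          /* the extra layer: dispatch per step */
                case OP_ADD:  total += 1; pc = 1;             break;
                case OP_LOOP: pc = (--remaining > 0) ? 0 : 2; break;
                case OP_HALT: return total;
            }
        }
    }

    int main(void) {
        const long n = 50000000L;           /* 50 million iterations */
        clock_t t0 = clock();
        long a = run_native(n);
        clock_t t1 = clock();
        long b = run_interpreted(n);
        clock_t t2 = clock();
        printf("native:      %ld in %.2fs\n", a, (double)(t1 - t0) / CLOCKS_PER_SEC);
        printf("interpreted: %ld in %.2fs\n", b, (double)(t2 - t1) / CLOCKS_PER_SEC);
        return 0;
    }

On most compilers and machines the interpreted version takes several times longer to produce the same answer, and it carries the interpreter and its program around in memory as well. On a single-tasking device only that one app pays for it; on a multitasking device, those cycles and that memory come out of whatever is running in the background.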
The second thing on the list is the UI experience. Native iPhone apps share a very similar UI experience, but a non-native tool is going to produce a different UI. You might argue that it's better, or better suited to the task at hand, but nonetheless it is different. Apple is also introducing new interaction models, from iAd to background processing. How likely are Flash developers to use those?
I think it's a better bet that Apple's desire to deliver a consistent experience -- consistent performance with a consistent UI -- is at the heart of the change in development tools. And ensuring that developers use -- and know -- a set of tools native to the environment helps ensure that consistency.