The real iPhone conspiracy

So I’ve used a Mac for a while and I’m just starting on iPhone development and a blinding flash of the almost-obvious strikes me. This is not the Blackberry killer or the Palm killer, it’s the long-fuse Microsoft killer.

Remember the monkey dance? Ballmer yelling “Developers, developers, developers!” while jumping around like a man possessed and sweating profusely (one could be excused for suspecting some cholinergic poison, but he lived through it, so that is not the answer). Right. I mean, he’s right. Developers are what make or break a platform, but now he’s losing them, so he really has no reason to celebrate.

When Apple designed the iPhone, they could have created a special development system and language for it, but even though it might have been easier, they didn’t. They chose to tweak the development environment for OSX to include the iPhone, which by necessity also meant putting OSX on the iPhone. The result is that if you want to develop for the iPhone, you have to get a Mac (strike 1), learn OSX (strike 2), learn Objective-C (strike 3), and learn Cocoa (strike 4), and by then you’re so deeply immersed in the Mac environment that you won’t find your way out again. Since you can run your Windows stuff, including Visual Studio, just fine under Parallels or Fusion, you don’t need that Dell or HP machine for anything anymore, and you’re not sorry to see it go. In other words, you’ve got a developer who clearly isn’t going to like going back to .NET development. I mean, once you’ve used both environments (Xcode/Cocoa/Objective-C vs .NET/Visual Studio) it’s practically impossible to enjoy .NET anymore. It’s so far behind and so clunky in comparison that it’s almost a joke.

So, every developer you task with iPhone development is almost certainly lost from the .NET camp forever. This I can’t prove, but I’m convinced of it. But now the question is: who are these developers? Do they already develop for the Mac or are they from the “other” side? Again, by the seat of my pants, I’m convinced that a very large and increasing proportion come from large enterprise .NET development organisations that need to add an iPhone client to their large systems. See where this is going?

It’s only just begun.

Update: I suddenly realized that I fused two unrelated events together in my mind. Steve Ballmer did the monkey dance and yelled “Developers, developers…!” at two different, equally traumatizing, occasions. I’m not sure that’s any better, though. It’s all very disturbing.

The end of .NET? I can’t wait.

Ok, I admit, that title is a bit over the top, but that is still how I feel. Developing for .NET is becoming less and less fun and far too expensive. The only reason to do it is that customers expect products for .NET, but under slowly increasing pressure from developers, that is going to change. It may take a while, but it will happen. There are a number of reasons for this.

.NET development is single-platform. Admittedly the largest platform, but a platform that increasingly has to share the market with others. And already, according to some, there’s more sales potential for small developers in the OSX market than in the Windows market, due to a number of factors: customers that are more willing to buy and to pay for software, less competition in each market segment, and so on.

.NET development is also entirely dependent on Microsoft’s development tools, and those are increasingly expensive. For reasonable development, you need an IDE, a good compiler, version control, a bug tracker, coverage analysis, profiling, and a few more things. We used to have most of that in the regular Visual Studio, but recently MS has removed all the goodies and plugged them into the Team System only, which carries an obscene price tag (in Sweden around USD 13,000 + VAT for the first year…). This means that a regular one-man development shop can barely afford the crippled Visual Studio Professional at USD 1,500 for the first year. Sadly, there aren’t even any decent and affordable third-party products to complement VS Pro so that it becomes a “real” development suite. And with every version of Visual Studio this only gets worse: more and more features are added to the Team suite and removed from the Pro. This is not the way to breed a happy following.

Meanwhile, OSX comes with Xcode, which is almost as good as Visual Studio Pro, and is free. Objective-C is also a much more modern language with more depth than any .NET language, even though it is actually older. But, sadly, it’s not cross-platform either, and I don’t see how you can get the Windows fanboys of the Scandinavian healthcare scene to even consider another platform. The same probably goes for most other industries.

I’m no fan of Java, but on the other hand I’ve never worked much with it, so that opinion doesn’t count. Eclipse, the IDE often used for Java development, is cross-platform, very capable, and open to other languages such as Python, Flex, and many more. Yes, I know, in theory so is Visual Studio, but how many real languages do you have there? You’ve got one: Basic, masquerading as C#, J#, and, um, Basic.

Using Eclipse on any platform, you’ve got a really good chance of covering the lineup of tools you need (profilers, coverage, version control) without much pain and without breaking the bank. And you can write cross-platform, integrated, larger systems.

So, I guess it’s time to bite the bullet. I really like Xcode and OSX, I really know C# and .NET, but I really only believe in Java, Flex, Python, Perl, and C++ under Eclipse for enterprise development in vertical markets. And in Xcode under OSX for regular shrinkwrapped desktop apps.

Not even Silverlight is very attractive, and that is largely due to the marketing and pricing of the tools for it. A small developer organisation can’t afford it. Flex and AIR look like serious contenders, though.

GotoMeeting runs on the Mac!

I love GotoMeeting, but until now I had to run it under Windows. Before, you could only run it as a viewer under OSX, but the latest version has a full-function Mac client. I opened GotoMeeting under OSX (Firefox), then under XP (Firefox) on the same machine in a full-screen virtual window, and I got this beautiful effect as the image was recursively drawn out into the distance. It looks a bit like a very deep cinema with the rows populated by Mac icons up front and in the back.

In furtherance of Mac pimping: RAID

What do you give to a Mac Pro that has it all? A hardware RAID card, of course. Outside of the sheer pimping factor, there’s a business reason, too, of course. Yeah. Like less danger of losing stuff.

Before getting one, I studiously read forums, user groups, ratings, and such, and came away with the clear impression that all of them have some problems. Among all the stories, it seemed that the Apple card had the fewest features and the most small problems, but also the fewest really huge problems. So I went with Apple, as usual. The Mac Pro RAID card is the most expensive one, by a fairly wide margin, and it frequently has battery problems. Also, it won’t allow your Mac Pro to sleep fully, and if you sleep it partially, it often hangs and won’t wake up. But since my Mac, even with a full complement of drives, isn’t exactly noisy (you can’t hear it at all beyond a couple of meters), I can live without sleep. (You can take that several ways, all of them intentional.) I do shut it down for the night, though.

The card itself is impressive. Fairly heavy, reinforced by an aluminium profile along the edge, and with that distinctive Apple hardware smell. (Wonder what that is, actually…) The big chunk of aluminium you see in the pic is the battery. In the images, the battery cable is still unconnected.

The card comes with a very good hardware installation manual and a few little tools, but to work comfortably you’ll need a really long Phillips-type screwdriver, which is not supplied. The installation involves taking out the front fan assembly to get at the iPass cable (a cable connecting the drive backplane to the motherboard, which needs to be rerouted to connect to the RAID card instead), but that’s not a big deal. How much you need to disassemble depends on whether you have an early 2008 8-core (I do) or an older Mac Pro; in the older model you also need to loosen and slide out the memory cage. I didn’t have to do that.

I did go beyond the recommended procedure and took out the first SATA backplane connector as well, which made it a whole lot easier to untangle the iPass cable. Then I put the connector back in, of course. This maneuver provided for a whole lot more slack than the manual described, so I’m sure I could have put the card in any slot, not just the top slot. (The specs say that the card has to go in the top slot due to the short iPass cable.) Not that I see a reason to put the card anywhere else, but who knows?

It’s widely reported that the battery charges very slowly. In my case, it took some six hours to reach a full charge; I let it run overnight. According to the specs, it’ll go through a deplete/recharge cycle every three months. During that cycle, the write cache will be disabled, slowing the system down somewhat.

The system I installed it on had four 500 GB drives in it. I emptied drives 2 through 4, backing up to a couple of 320 GB Western Digital “My Passport” USB drives. I started the backup procedure days ahead of time, of course. It takes forever. I also backed up the system boot drive using SuperDuper to two external drives.

Apple mentions that you can migrate the system drive to the RAID array, but only if that drive has already been formatted using the RAID card. In other words, if you have a Mac Pro running without a RAID card, you can’t migrate that system drive. You have to reformat with the card, then restore from a backup.

Reformatting and then preparing the single volume for 4 x 500 GB took something like nine hours altogether. After that, I did a restore of the system drive, which took a while as well (two hours? I don’t remember). The Disk Utility on the Leopard install DVD lets you restore SuperDuper images, making it all very convenient. Total space is around 1.2 TB in one single volume.


Well, the system now takes much longer to boot, maybe a minute or two. According to miscellaneous group postings, that’s due to the RAID card spinning up the drives one by one. Makes sense. But once the login box comes up, things really take off. From logging in to getting a fully live system with all the menu bar items populated is a matter of seconds now. Jeez. You could see the socks fly.

Since I develop on Windows, I run anything between two and five Windows instances under Parallels and VMware Fusion on this machine. Opening a saved 1 GB Windows XP instance takes around 7 seconds now. It takes around 4 seconds to quit it all the way until Parallels has exited. I can get two XPs and a 2000 up and fully running in less than 15 seconds, using Butler. You get the message: this machine has become unbelievably snappy.

The drawbacks?

The slow initial boot? Doesn’t bother me much. The lack of a sleep function bothers me a little more, since I can’t just leave everything open like I used to. I used to reboot this machine every three weeks or so. On the other hand, I used to have a heck of a time remembering where I got all those open windows from, where I started each process, etc., so just having to do it again every day makes me a little more aware of what I’m actually doing. I see that as a good thing, in a way.

I have two 23″ Apple Cinema screens on this machine, and the only way of powering them down now without shutting down the system is via the power switch on the side of the screens. For these switches to work, the USB cable to the screens has to be connected, but there aren’t that many USB connectors on this system (two in front, three in back) and only the ones in the back can be reached from the pigtail cable arrangement these monitors have. I ended up connecting one of them to a back-side port, and using a USB extension cable to connect the other one to the back of the first monitor. I use the other two back-side USB ports to connect two external USB hubs, not wanting to chain them too much. The two USB ports up front aren’t very convenient for fixed cabling, so I leave those for the occasional USB stick.


This card provides RAID level 5 (and 1 and 0 and 1+0, if I remember correctly), which doesn’t allow expansion on the fly. That is, if I want to replace the 500 GB drives with 1 TB drives or more later, I have to copy off the entire 1.2 TB volume to external storage, switch drives, initialize the new drives and volume, then restore. Uh-oh… sounds like a project to me. You can’t even take out the old drives, put them in another cabinet, and restore from there, since they can only be read using the Mac Pro RAID card now. But this problem is common to all RAID 5 implementations. Raid-X, Raid-X2, Drobo’s stuff, and other proprietary solutions do get around it in various ways, but that’s all for external NAS storage.
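As an aside, the usable space for each of those RAID levels is easy to compute. Here’s a back-of-the-envelope helper (my own sketch, nothing to do with Apple’s tools), assuming identical drives:

```python
# Back-of-the-envelope RAID capacity helper (my own sketch, not an Apple tool):
# usable space for the levels the card offers, given identical drives.
def usable_capacity_gb(level: str, drives: int, size_gb: float) -> float:
    if level == "0":                     # striping: all space usable, no redundancy
        return drives * size_gb
    if level == "1":                     # mirroring: one drive's worth, n-way copies
        return size_gb
    if level == "1+0":                   # mirrored pairs, striped together
        if drives % 2:
            raise ValueError("RAID 1+0 needs an even number of drives")
        return (drives // 2) * size_gb
    if level == "5":                     # one drive's worth of space goes to parity
        if drives < 3:
            raise ValueError("RAID 5 needs at least 3 drives")
        return (drives - 1) * size_gb
    raise ValueError(f"unsupported RAID level: {level}")
```

For my four 500 GB drives under RAID 5, that gives 1500 GB on paper; the roughly 1.2 TB I actually see presumably reflects the card’s own overhead and decimal-versus-binary accounting, but I haven’t verified the exact breakdown.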

Pimp state?

In summary, the system looks like this now: Mac Pro early 2008, dual quad-core 2.8 GHz, 16 GB RAM, 1.2 TB single volume on a Mac Pro RAID card, NVIDIA 8800 GT, and two 23″ Apple Cinema displays. Hm… what’s next?

Not so good video card

My Mac Pro came with an ATI 2600 XT card, which turns out to be not so great. We’re having a heat wave in Sweden right now, and that card is definitely getting the vapors. The symptom is that the machine freezes and has to be hard booted to snap out of it. The most reproducible way of getting there is to run World of Warcraft in full screen. And you’re not going to tell me that I can’t run WoW during the summer holidays. That’s ridiculous.


Why apps will get slower

New machines come with multicore processors. Mine has eight cores, which ought to be plenty fast. Unless the apps only use one of them, of course. Since the number of cores goes up pretty quickly with each generation, while the speed of each core remains more or less the same, and the workload of the apps goes up, the net effect for a singlethreaded app is that its performance goes down with each new generation of hardware. So, please, fellow developers, get a grip and go multithreaded now.
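To make the point concrete, here’s a minimal sketch in Python (my choice of language for illustration; nothing to do with any real unpacker’s code) of how a batch of independent jobs can be spread over the cores instead of walking through them one at a time:

```python
# Minimal multithreading sketch (illustration only, not any real app's code):
# a batch of independent archives is embarrassingly parallel, so a thread
# pool can keep several cores busy instead of hogging one at a time.
import concurrent.futures
import zlib

def expand(blob: bytes) -> int:
    """Stand-in for unpacking one archive: decompress it, return its size."""
    return len(zlib.decompress(blob))

def expand_all(blobs):
    # zlib releases the GIL while (de)compressing, so even plain Python
    # threads can run these jobs on multiple cores at once.
    with concurrent.futures.ThreadPoolExecutor() as pool:
        return list(pool.map(expand, blobs))
```

The sequential version, `[expand(b) for b in blobs]`, is exactly the one-core-at-a-time pattern you see in the CPU meter below.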

For the last half hour I’ve been watching grass grow, or rather watching the Mac OSX Stuffit Expander unpack exercise files. These exercise files are for Final Cut Express HD and consist of 12 sitx files, each around 240 MB. …ah, it just finished. Looking at how the CPUs are loaded during the execution of the Expander, it’s no mystery why it’s so slow:

As you can see, there’s a 100% CPU hogger walking from core to core. It’s even clearer just as the walking ends and the process is done:

Interestingly, during this half hour, Safari hung (which in itself isn’t too unusual) and Parallels, which was running two idle XP instances in the background, crashed. Normally, this machine is stability itself, apart from the occasional Safari hangup, and I’ve never before seen Parallels crash like this, so I think there’s a connection.

Now, if you look at how a righteous app like iMovie ’08 works, you’ll see something like this (while creating movies):

I could run WoW with totally normal performance even while iMovie was going full blast. No crashes or hangs either. I wouldn’t be surprised if the system is most stable when all cores have some headroom left, while a 100% load on any core is destabilizing. I’m just guessing here.

Memory mix on Mac Pro

I have a hard time finding info on exactly which combinations of memory do exactly what on the Mac Pro 8-core. As far as I understand, full memory speed is only achieved with four memory modules, since the machine can access four in parallel. I did get four 2 GB modules (800 MHz) from OWC and I’m very happy with them, but then there are the two 1 GB modules (also 800 MHz) that were delivered with the machine, and what do I do with those? If I plug them in on the top memory board as the manual advises, will accesses to these two modules be slower than accesses to the rest of memory, or will all memory access slow down?

To find out, I did a very unscientific test. I plugged in the two 1 GB modules, so I had a total of 10 GB of RAM in the machine, then booted it up. Ran a few programs and then started WoW. Had my character run around in a circle in Gadgetzan and saw a consistent jerkiness. The framerate according to WoW was 29 fps.

Powered down, pulled the two 1 GB modules, booted up again and went back into WoW. Again ran around in a circle in Gadgetzan and had 60-65 fps all the time. Gone was the jerkiness. Conclusion: it’s not worth it to plug in pairs of RAM you’ve got lying around. Stick to foursomes.

Oh, BTW, if your machine becomes sleep-challenged (reboots when you try to wake it from sleep) after swapping memory, do the pull-all-cables-wait-15-seconds-then-plug-in-again routine, plus the parameter RAM reset (cmd-opt-P-R), one or more times. At least, that fixed it for me.