Subversion server on Snow Leopard server

As I already bragged about, I got me one of those delicious little OSX Mini Snow Leopard Server boxes. So sweet you could kiss it. I just got everything together to make it run a subversion server through Apache, too, and as a way to document that process, I could just as well make a post out of it. Then I can find it again later for my own needs.

First of all, the Subversion server is already part of the OSX Snow Leopard distribution, so there is no need to go get it anywhere. Mine seems to be version 1.6.5, according to svnadmin. Out of the box, however, Apache is not set up to connect to Subversion, so that needs to be fixed.

We’ll start by editing httpd.conf to make Apache load the SVN module. You’ll find the file at:

/private/etc/apache2/httpd.conf
Uncomment the line:

#LoadModule dav_svn_module libexec/apache2/mod_dav_svn.so

Somewhere close to the end of the file, add the following line:

Include "/private/etc/apache2/extra/httpd-svn.conf"

Now we need to create that httpd-svn.conf file. If you don’t have the “extra” dir, make it, then open the empty file and add in:

<Location /svn>
  DAV svn
  SVNParentPath /usr/local/svn
  AuthType Basic
  AuthName "Subversion Repository"
  AuthUserFile /private/etc/apache2/extra/svn-auth-file
  Require valid-user
</Location>

Save and exit. Then create the password file and add the first user:

sudo htpasswd -c /private/etc/apache2/extra/svn-auth-file username

…where “username” is your username, of course. You’ll be prompted for the desired password. You can add more users with the same command, dropping the -c switch (the -c creates the file, so using it again would wipe the existing entries).

Time to create the svn folder and repository. Create /usr/local/svn, then create your first repository inside it:

sudo svnadmin create /usr/local/svn/firstrep

Since Apache is going to access this, the repository should be owned by the Apache user, www. Do that:

sudo chown -R www /usr/local/svn/firstrep
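If you want a sanity check at this point, Subversion can verify the repository structure; a quick sketch, using the repository path from above:

```shell
# Walks every revision in the repository and reports corruption, if any.
# On a freshly created repository, only revision 0 exists.
sudo svnadmin verify /usr/local/svn/firstrep
```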

Through Server Admin, stop and restart the Web service and check that no errors appear. Then use your fav SVN client to check that things work. Normally, you’d be able to address your Subversion repository using a URL like:

http://yourserver/svn/firstrep
Finally, don’t forget to use your SVN client to create two folders in the repository, namely “trunk” and “tags”. Your project should end up under “trunk”.
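If you’d rather do that from the command line, the folders can be created directly against the repository URL; a sketch, assuming a server named “yourserver” and the “firstrep” repository from above:

```shell
# svn mkdir against a URL commits immediately, no working copy needed
svn mkdir -m "Create initial layout" \
  http://yourserver/svn/firstrep/trunk \
  http://yourserver/svn/firstrep/tags

# Verify that the folders are there
svn list http://yourserver/svn/firstrep
```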

Once up and running, this repository works perfectly with Panic’s Coda, which is quite capable of keeping an entire website under source control. If you don’t know Coda, it’s a website editor of the text editor kind, without much in the way of fancy graphic tools, but it does help with stylesheets and stuff. It’s for the hands-on developer, you could say.

The way you manage a site in Coda is that you have a local copy of your site, typically a load of PHP files, which are version controlled against the subversion repository, and then you upload the files to the production server. Coda keeps track of both the repository server and the production server for each site. The one feature that is missing is a simple way of having staged servers, that is, uploading to a test server and only once in a while copying it all up to the production server. But that can be considered a bit outside of the primary task of the Coda editor, of course.

You could say that if your site isn’t mission critical, but more of the 200 visitors a month kind, you can work directly against the production server, especially since rolling back and undoing changes is pretty slick using the Coda/subversion combo. But it does require good discipline, good nerves, and a site you don’t really, truly need for your livelihood. You can break it pretty bad and jumble up your versions, I expect. Plus, don’t forget, the database structure and contents aren’t any part of your version control if you don’t take special steps to accomplish that.

Coda doesn’t let you access all the functionality of subversion. As far as I can determine, it doesn’t have provisions for tagging and branching, for instance. But it does have comparisons, rollbacks, and most of the rest. The easiest way to do tagging would be through the command line, or possibly a GUI SVN client; there are several for OSX. I’m just in the process of testing the SynchroSVN client. Looks pretty capable, but not all that cheap.

The cutest little muscle machine ever

I got me that brand new Apple Mini with Snow Leopard OSX Server unlimited edition included. This is such an adorable machine, you wouldn’t believe it. It has everything you can wish for in a server, as far as I can make out after just a couple of hours with it. It’s super easy to set up and to monitor. It’s small, it’s beautiful, it’s almost totally noiseless, and seems to use hardly any power. When you feel the case, it’s just barely warmer than the environment, and the same goes for the power supply. When I switch off everything else in the room, I can only hear the server running from less than a meter’s distance. It seems to produce about the same noise level as my 13″ white MacBook does when it’s just started and perfectly cool. In other words, practically inaudible. Still, it’s running two 500 Gb drives in there, which I’ve set up as a mirrored (RAID 1) set.

I’ll probably brag about this system some more once I get to know it better. But meanwhile, it’s the nicest computer purchasing experience I’ve ever had. Except for the Mac Pro. And the MacBook. And the iMac, of course. And the iPhone. And Apple TV.


What’s up with Snow Leopard and file sizes?

Yes, I know Snow Leopard changed the way they calculate file and volume sizes, but what I’m seeing here is too weird to be explained by that. I’ve got a few image files in a folder on my desktop and the file sizes I’m seeing with ls -al are:


Now watch the png file sizes when I look at it using Finder:


Oops… WTF was that??! A display bug! Let’s try again after juggling the column widths so the selection bar straightens itself out again:


Just to be sure, I opened up the info panel on the first file:


Yes, truly, here it says 109,207 bytes while ls -al says 15,843 bytes for the same file. And yes, I’ve checked and double checked and triple checked, I do indeed look at the same file. Doing a spotlight search also only returns one image. Uploading the image to a webserver and checking through Transmit shows the 15k size. Here it is, the file, from a webserver, so you can check for yourself.

So why is Finder reporting a size value seven times larger?

Update a little later: yes, I used ls -al@ to find the resource fork and that is what is making the difference. Maybe Finder should have the option of showing that separately, at least in the inspector? Maybe I should read the man pages before posting? Maybe I should wonder what exactly is in those resources? Maybe I should just shut up and crawl under a rock?
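For the record, the fork is visible from Terminal in a couple of ways; a sketch, assuming the file is called image.png:

```shell
# -l@ lists extended attributes with their sizes;
# the resource fork shows up as com.apple.ResourceFork
ls -l@ image.png

# The fork is also addressable as a path of its own
ls -l image.png/..namedfork/rsrc
```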

Yet another update: I used 0xED to look into the file and the fork. The fork is full of Adobe info, since I used Photoshop CS4 to convert from a BMP to PNG. And, obviously, when uploading the image using Transmit, that fork is stripped off. Well, now I know that Photoshop saves a load of info in a resource fork, possibly including info I don’t want it to save. Can’t see any obvious way of excluding that in the Photoshop save dialog box. So take care when passing on images to others that you strip off the resource fork first. Somehow.

Update about “Somehow”, this is how to do it: create an empty file, copy it over the resource fork, then delete the empty file. Like so, in terminal, assuming the file is image.png:

touch empty
cp empty image.png/..namedfork/rsrc
rm empty
Anything but games are illegal?

I’m having this most surrealistic dialog with a very agreeable iTunes support person, about invoicing. The thing is, I bought a few apps from the iTunes app store, among them OmniFocus for the iPhone, but the invoice (or “receipt”) I got from Apple doesn’t mention sales tax at all. Just the net amount in Swedish crowns. It is, however, correctly addressed to my company.

As practically anyone realizes, this is super weird and means I can’t recover the sales tax when I enter this document into my accounting. So I wrote to iTunes support and asked for a correct invoice. The ensuing conversation follows (I took out the name of the iTunes representative).


The end of .NET? I can’t wait.

Ok, I admit, that title is a bit over the edge, but still that is how I feel. Developing for .NET is increasingly becoming not fun and far too expensive. The only reason to do it is because customers expect products for .NET, but under slowly increasing pressure from developers, that is going to change. It may take a while, but it will happen. There are a number of reasons for this.

.NET development is single platform. Admittedly the largest platform, but a platform that is increasingly having to share the market with other platforms. And already, according to some, there’s more sales potential for small developers in the OSX market than in the Windows market, due to a number of factors like customers that are more willing to buy and to pay for software, less competition in each market segment, etc.

.NET development is also entirely dependent on Microsoft’s development tools, and those are increasingly expensive. For reasonable development, you need an IDE, a good compiler, version control, a bug tracker, coverage analysis, profiling, and a few more. We used to have most of that in the regular Visual Studio, but recently MS has removed all the goodies and plugged them into the Team System only, which carries an obscene price tag (in Sweden around USD 13,000 + VAT for the first year…). This means that a regular one-man development shop can barely afford the crippled Visual Studio Professional at USD 1,500 for the first year. Sadly, there aren’t even any decent and affordable third party products to complement VS Pro so it becomes a “real” development suite. And with every version of Visual Studio this only gets worse. More and more features are added to the Team suite and removed from the Pro. This is not the way to breed a happy following.

Meanwhile, OSX comes with XCode, which is almost as good as Visual Studio Pro, and is free. Objective-C is also a much more modern language with more depth than any .NET language, even though it is actually older. But, sadly, it’s not cross platform either and I don’t see how you can get the Windows fanboys of the Scandinavian healthcare scene to even consider another platform. Same probably goes for most other industries.

I’m no fan of Java, but on the other hand I’ve never worked much with it so that opinion doesn’t count. Eclipse, the IDE often used for Java development, is cross platform, very capable, and open for other languages such as Python, Flex, and many more. Yes, I know, in theory so is Visual Studio, but how many real languages do you have there? You’ve got one: Basic, masquerading as C#, J#, and, um, Basic.

Using Eclipse on any platform, you’ve got a real good chance of covering the line of tools you need (profilers, coverage, version control) without much pain and without breaking the bank. And you can write cross-platform integrated larger systems.

So, I guess it’s time to bite the bullet. I really like XCode and OSX, I really know C# and .NET, but I really only believe in Java, Flex, Python, Perl, C++ under Eclipse for enterprise development in vertical markets. And in XCode under OSX for regular shrinkwrapped desktop apps.

Not even Silverlight is very attractive, and that is largely due to the marketing and pricing of the tools for it. A small developer organisation can’t afford it. Flex and AIR look like serious contenders, though.

Mac Vista

Since I have both Parallels and Fusion running, I found it useful to try out Vista under Fusion. According to tests, Vista runs better under Fusion, while Parallels’ forte is XP. No, I haven’t verified that, I’m entirely happy assuming those tests are right.

Windows Vista is kinda pretty, even though I don’t see any aqua or aero effects, or whatever they’re called. (Here’s a full size view on how it looks while it is huffing and puffing its way through the initial 49 updates of the OS.)

But even the Mac Pro I’m running now, with its 8 Gb of RAM, is starting to page out when I’m running Vista. Admittedly, I’m running a few more things at the same time…

task bar, pretty full

… as you can see. Two XPs (1 Gb and 768 Mb, respectively) are running, plus Photoshop, Transmit, the Fusion VM with Vista (1 Gb), Skype, Safari, OmniFocus, Mail, NeoOffice Writer, iTunes, Preview, and Activity Monitor. As you can see on the activity pie chart, there’s just a gig of blue and nothing green to be seen. I guess the next move will be to fill’er up to 16 Gb of RAM, and if that’s not enough, I have to go to 32 Gb, which this machine is supposed to handle. I’m sooo spoiled…

Just like on OSX, I just now created a separate admin account and changed my regular account from admin status to “standard”. I’m very curious to see if it’s workable. While running as admin, I got a deluge of “Please approve this action” dialog boxes. Let’s see what happens if I’m not an admin and I try to install Open Office.

First, it blocked the download, warned me about the dangers of the internet, but it’s easy enough to approve it and proceed. Then it warned me about the installation program being unsigned, fair enough. Then it asked me for an admin logon to install the program (perfect!). And then the installation threw up an error box:

Failed Open Office installation

Did a quick Google and didn’t find anything about this error (“Wrapper.CreateFile failed with error 123”). Interestingly, after clicking “OK”, the installation proceeded, where I would have expected it to abort, rewarding me with:

Installation Completed dialog box

And, hey, it seems to work! Ok, so far so good under a limited user account. That is definitely good news.

Next little test, let’s open Task Manager:

Task manager

No problem, up it comes. Fine. Now let’s click “Resource Monitor”, which I know only admins can use:

Needs permission

Darn it, I’m bloody impressed! Instead of having to do that cumbersome “RunAs…” stuff, Vista does exactly what OSX does, asks for admin credentials (it even put in the right admin user name, which I rubbed out with a bit of yellow mud just above the password prompt). And up comes the resource monitor.

Phew, never thought I’d have to say I like this very limited first look at Vista after what I’ve heard about it. I think I’ll keep exploring it.

Now, let’s look at what Vista sees as the underlying machine, that is what Fusion pretends to be:

Basic info about computer

It sees a dual 64 bit processor and 1 Gb of RAM. Nice. If you look at the very bottom you see a new MS policy of automatically activating the OS, instead of letting the user do it. Normally I wouldn’t care, but if you’re running MSDN copies, you should be aware of this, since you often don’t want to waste activations on every installation you do. Vista isn’t going to wait for you to approve if I interpret that statement correctly. (I’ll wait and see if it goes ahead without asking or not after another three days, will keep you updated about it.)

So, what about the “Windows Experience Index”? It says “1.0” here, can’t be lower than that. Hm. Better check out the details:

Windows Experience details

Ah, now I see, the overall rating is equal to the lowest rating, which I got for gaming graphics. I have enabled “3D graphics” but I get no Aero. I think Fusion doesn’t support Aero yet, but I couldn’t find anything on the web to confirm that, so I may be wrong. Apart from that, I find the above scores pretty impressive. Vista, at first blush, seems usable on this machine, not too sluggish, but then nothing is sluggish on this setup, really.

Parallels or Fusion?

Way back when, I used to use VMware desktop on my Dell for development. When I switched to the Mac, I naturally selected Parallels desktop to let me run Windows instances under OSX. A couple of days ago I was offered a review license for VMware Fusion, so I tried it out to see if it’s better than Parallels, even though I actually have very few complaints about Parallels.

So what I’m comparing here is Parallels Desktop 3.0 for Mac and VMware Fusion 1.1.1. My comparison isn’t in any way exhaustive, just a first impression after a few days of use and for a fairly limited application, namely software development and backups and stuff.

Parallels about

VMware Fusion about

The machine I’m running these guys on is my brand new Mac Pro with dual quadcore Xeons at 2.8 GHz and 8 Gb of RAM. On this machine it’s hard to have any software perform poorly, so I wouldn’t be able to detect much in the way of inefficiencies, if there are any. Nice for me, but it hobbles my advice somewhat. For details on the machine, see my earlier entries on “Mac XP”.

I’m running an old Win 2000 and two instances of Windows XP under Parallels. One XP is equipped with MS SQL Server Developer Edition and Visual Studio 2005, while the other one harbours Visual Studio 2008. They have 1 Gb and 768 Mb of RAM, respectively, allocated in Parallels. The Win 2000 has just 512 Mb, but I don’t use that one much.

For this comparison, I created a third Win XP and gave it 512 Mb of RAM. I plan on using this VM as a “utility VM”, containing stuff like backup software. The first thing I installed in it was Retrospect 7.5 for Windows that came bundled with my Netgear ReadyNAS+ (5 clients included) and then I purchased and added a further 5 client licenses. So it can now backup 10 clients, mixed Mac OSX and Win clients.

It turns out that Fusion is a pretty good choice for this “utility” VM, since it allows me to allocate 2 virtual CPUs. Retrospect does exploit multiple CPUs if you have them, so this allows Retrospect to use two of the eight cores I have in the machine. Parallels would limit Retrospect to just one core.


Running Retrospect in one of the other XP VMs would make that VM go very slowly. All its activities would be limited to one core on the Mac and I would have a nasty time working in Visual Studio in the same VM at the same time. Having Retrospect run on two cores in its own VM allows me to work in the other VMs without noticing any slowdown at all. It’s great! In all fairness, running Retrospect in a Parallels VM would have had the exact same result, except Retrospect would have run slightly slower.

I’m usually writing quite a bit of multithreaded code, making it practically necessary to run on a multiprocessor to avoid subtle bugs. That would seem to mandate Fusion. But I don’t know how faithfully it emulates two CPUs. Does it interrupt right in the middle of memory accesses like a true multiprocessor machine would do, or is it more civilized than that? The lack of documentation about this is a problem, just as it is for hyperthreaded CPUs. In both these cases, it’s very unclear how closely they mimic a true multiprocessor machine.

As far as simply running most software goes, I think both these products do a grand job. I’ve not encountered any problems with either, but remember I’ve done much more on Parallels than on Fusion. The network setups are also practically identical, with choices for host sharing, bridged, and host only. The intricate and flexible network configurations we see in VMware’s Windows product aren’t found in Fusion (yet). What’s also lacking is decent snapshot management in Fusion. You can take snapshots and revert, but there’s no management of multiple snapshots like in the Windows product or in Parallels.

Miserable keyboard handling

Now for my real beef with both of these products: the keyboard. Obviously, most keystrokes should be passed on to the virtual machine, some should be converted, and some intercepted and sent to the host OS. Both these products have made a mess of this even though it ought to be simple to get right.

In Parallels, the command and control keys swap nicely on the left side of the keyboard while remaining unswapped on the right side. Kinda confusing, but I don’t mind getting used to it. But function keys are a real problem. Sometimes I succeed in getting them through to the VM using different combinations of command and control or something, then I can’t remember exactly what I did. It also varies according to exactly which function keys we’re talking about. The function keys that have predefined uses for dashboard, exposé, and similar, behave differently from other function keys. Parallels does have a menu where one can select magic key combinations to send to the VM, but it would have been great to have these keystrokes pass right through under their own steam, so to speak. Having to select function keys from a menu is good for once or twice, but gets old real quick. This is how it looks under the “Actions” menu in Parallels:

The Actions menu

Under VMware, there’s a setting in the “Preferences”, which means that it is the same for all VMs under VMware:

Preferences in VMware

As you can see, there’s a single checkbox “Enable Mac OS keyboard shortcuts” and it works admirably. A little too admirably, in fact. Once you deselect it, all keystrokes go to the VM, including command-tab. Now there’s no point in passing command-tab to the VM, since Windows doesn’t know what a command key is. But it makes sure I can’t easily switch between apps on the Mac. This is ridiculous, since Windows reacts to alt-tab, so they could just as well have left command-tab for OSX. The new Mac keyboards also have a special “Fn” key where the useless “Help” key used to be. That ought to be exploited by Parallels and Fusion somehow, but isn’t.

I don’t know which of the systems, Parallels or VMware, got the key settings most wrong; it’s a close call. To me it’s obvious they really could spend a little effort in getting this right, since it’s the one thing that makes working with VMs hard; everything else is almost perfect. Having options allowing all keystrokes to pass to the VM except command-tab and possibly control-space (which I use for QuickSilver), would be absolutely great. Allow the user to freely define another couple of magic combinations that should not pass to the VM, and you’re set.

Converting: defeated by Mickeysoft

Both products are able to convert a VM from the other product to its own system. Both of them take forever to do it, but seem to do a good job of it, ultimately. But Windows isn’t happy about it, since it sees the conversion as a move to another machine and then insists on needing a new activation. Just to be a real PITA, Windows only gives you three days to reactivate, if it was already activated. Considering that you get 60 days to activate (for the MSDN version of XP), you are actually severely punished for having activated your XP in the first place before the conversion. How very nice of MS. Actually, WGA being what it is, it’s not a good idea to convert Windows installations from Parallels to VMware or vice versa at all. Actually, if you can avoid activating at all, that’s even better, but it limits you to 60 days per setup.

Dock difference

Parallels shows an actual live image of the VM’s screen in the dock and in the task switcher, so even though a VM is hidden behind a stack of other apps, I can keep an eye on the dock icon and see if some compile has finished or a dialog box is waiting for input:

Parallels dock icon

Fusion, on the other hand, just shows a Fusion logo, missing an opportunity to display something useful:

Fusion dock icon


My current conclusion is that both products are great and work just fine. Both need serious work in the keyboard handling. Fusion has dual CPUs, a major advantage, especially on a multicore machine. Parallels has better snapshot handling and really useful dock icons.

Parental controls done right

Just gave my iMac to my 6-year-old daughter, so I got a chance to explore the parental controls in Leopard. Which is why I gave her this machine in the first place.

Now, this stuff is done right all the way.

Parental controls logo

First, having a limited user account on a Mac is not a problem for anything, which is a major first step. Then I set up her account to have “parental controls”, which is just a checkbox to click. Then I used system preferences on *my* Mac Pro, went to Parental controls, saw her machine with her account listed and logged in twice, once as admin on my machine, once as admin on hers.

After that, I can select, one by one, the apps she can use. For Safari, I can enter the websites she can access (I can approve new sites on the fly on her machine using my password, for that once or permanently). For iChat I can set which users she can chat with, except I simply disabled it for now. For Mail, I can set which email addresses she can write to and receive from.

Selecting allowed apps

Interestingly, if she receives mail from an unapproved mail address, it’s redirected to my account and my Mac Mail shows me the email and asks me if this source is allowed to write to her. If I approve, inside Mac Mail, the address is added to her list of approved emails and she gets the mail in the next round. Same thing if she writes a mail to an unapproved address, she gets a popup saying it’s not approved and gives her the choice of asking for permission. If she does, I get the mail and again get the chance to approve it or decline.

I can also set how many hours per day she can use the machine. (Buggy, see update below.) One setting for weekdays, another for weekends, and there is even a setting for excluded hours, for instance after bedtime.

Hours per day setting

Lastly, there’s very complete logging. Every app she used, number of times and for how long, plus date and time. Every website visited, including the URL parameters, so I can, from my own machine, see exactly *which* videos she watched on YouTube, for instance.

Log of URLs

Not everything is as totally controlled; for instance, Skype is an allow/decline kind of thing. I can’t lock down who she adds, but I sure can check visually every now and then. If I’d excluded Skype and only allowed iChat, I would have had total control, but I can’t expect everyone to go get a Mac. Not just quite yet, anyway.

There’s just one thing missing and that is that I don’t get copies of all her email. But OTOH, that would be too intrusive, I think, especially since she can’t receive or send to anyone I haven’t already approved.

I’m amazed at how simple and well done this is in Leopard. Not really a surprise, but still. To me it’s worth giving her the iMac, just for this one thing.

Update on Feb 13, 2008: the time limits on use seem to only work intermittently and are thus unreliable. Depending on use they may not kick in. To my daughter’s delight, she seems able to keep watching YouTube videos forever, regardless of settings.

Mac XP: totally cool

I finally got the Mac Pro fully installed. Using the migration utility on a Time Machine backup screwed up the mailboxes in Mail somehow, so I had to hook up the old iMac as a firewire target and copy it all over again. Then it worked.

Some apps needed re-licensing; Adobe especially is a mother when it comes to this. You need to de-activate the old installation before you can activate on a new machine. Max two activated machines per serial number. I can’t imagine what you would have to go through if the old machine isn’t bootable so you can de-activate from it. Moral of the story: don’t ever reformat the old machine before you’ve exercised all your apps thoroughly.

Some other apps needed a new license entry, but none of the others needed deactivating on the original machine. Zinio reader was its usual obnoxious self, needing all kinds of manual removals of files in library and plists and stuff, plus reinstallation. This is about the fifth or tenth time I’m doing this with Zinio, and they really ought to get their act together.

Parallels needed a little fiddling to get the “shared networking” working. The trick is to go into System Preferences, then Network, where the system automatically wants to activate en3, which is what is missing if you migrate Parallels from another machine. Problem solved.

As a first test, I started up three instances of Windows and had one of them compile a hefty DLL using Borland C++. At the same time, the other instances remained totally responsive, while OSX itself didn’t slow down in the least. Everything remained snappy as can be. Took a shot of the screen real estate with the CPU graphs (8 of them, stacked…) on top of the left Windows XP. You can admire the screenshot here. The current version of Parallels isn’t able to use or emulate more than one CPU per instance, but I think they will provide for that later. Let’s hope.

Mac XP: rest of stuff arrives

Just now, the rest of the parts arrived, that is four 500 Gb drives and 4 x 2 Gb RAM, all from OWC. I was a bit impatient and forgot to snap pictures of it all before putting it into the box. But you all know how hard drives and RAM look, right? Except this particular RAM has a lot of black fins on it (see earlier pictures).

Anyway, I took out the 2 Gb RAM that was in the machine and replaced it with 4 modules of 2 Gb each. I’ve read somewhere that memory access is the fastest on these machines if you have 4 modules. Maybe there is a 64 bit wide bus for each module and the computer has a 256 bit wide memory bus? Actually, I don’t know, but it didn’t seem worth slowing down the memory bus for just a meagre extra 2 Gb, so out they went. My next Mac Pro (yeah, right) can then get these for a 4 Gb start in life.

I also took out the single 320 Gb drive that was in the machine when it arrived from Apple. Put in 4 x 500 Gb Seagates instead. All the drives had a jumper in place limiting them to a transfer speed of 1.5 Gb/sec. I seem to remember the Mac Pro can handle 3 Gb/sec so I took them out. Fiddly in the extreme, had to use a knife to get them out.

Started up the machine, but discovered I couldn’t open the optical drives with the classic Mac keyboard, no reaction as I hit the eject button. (I’d taken the new keyboard and used it for the iMac.) Had to switch to the new keyboard and then I could open the drive to get the install DVD in there.

When installing, you have to select “Disk Utility” first and format the drives. Time to figure out how to divvy up these drives. My first inclination is to use two of the drives like “regular drives” and save the last two drives for Time Machine. Yes, I do run regular backups through Retrospect to an external NAS, but still, Time Machine is really nice and useful. TM on the iMac sucks too much computing and disk power, however, so I’m hoping this isn’t the case on the Pro. Especially if it can use two internal 3 Gb/s drives, it ought not to get in my way, even though I’m running several Windows instances under Parallels (every change in those gigantic virtual machine files triggers a backup of the whole file, maybe 5-10 Gb at a time. Heavy. I’m not too sure this is a good idea.)

Right now, the Pro is migrating from my iMac using the Time Machine backup on an external 500 Gb Lacie Porsche. It still has another hour to run. Interesting factoid: the little Lacie Porsche makes more noise than the entire Mac Pro under it. Fascinating.