I’ve been a big fan of VMware Fusion on my Macs since I converted my home computers to mostly Mac a few years ago. I love the product because it lets me manage everything on one piece of hardware… well, almost everything. The one exception has been debugging in Visual Studio. I simply cannot step (F11) through code in my virtual environment, and that’s pretty much become a deal breaker.
That said, I’ve overlooked it because I really love the single-hardware approach. So, what got me thinking this approach might not be the best idea any longer? A few weeks ago, I, like millions of others, upgraded to Mountain Lion for $29. My upgrade ended up costing me an additional $49. Why? Because after upgrading to Mountain Lion, VMware Fusion was unusable and had to be upgraded to version 4.1.3 for $49.99.
I’ve got a MacBook Air and an iMac, so I’m looking at $100 every time an upgrade comes around, and another one is available now. It doesn’t make sense any longer. I’ve spent quite a bit on Fusion thus far to keep up to date, but I’m likely at the end of the road unless the pricing comes down significantly.
Over the next 3 years I’d likely pay $300 minimum to keep 2 Macs up to date with Fusion releases, assuming 1 release per year, not to mention any Apple OS updates that force me to upgrade. It’s a safe bet to assume that will occur at least once in the next 3 years, so perhaps $400 to remain up to date.
I actually purchased a new HP ProBook for $479. It runs Visual Studio 2012, SQL Server 2012 Standard Edition, IIS, and Windows 7 just fine on a Core i3… just what I need. In the end, the notebook is the better deal for me. It should easily last 3 years, and the pricing is nearly equivalent in that I’m comparing a new piece of hardware with an OS to a single piece of software over 3 years. On top of that, I get the full debugging features I really need.
Sorry, Fusion, I’m migrating back to a standalone Windows machine.