My MacBook Air (2013) games better than my old Dell Studio 1556
by Harshvardhan J. Pandit
blog gadgets
I own a 2013 MacBook Air, which for its category is an ultraportable laptop, with significant compromises in graphics. It has no separate graphics card and no separate graphics memory, just an Intel HD Graphics 5000 chip integrated with the processor that can handle some of the work. Or at least that's the impression I always had. My other, older laptop is a Dell Studio 1556 that I bought in 2010, which makes it 6 years old at this point. It has a dedicated graphics card, an ATI Mobility Radeon HD 4570, with 512MB of dedicated graphics memory and a GPU clock of 680MHz against Intel's 200MHz. The other week, when I was in Cork, I had my Mac with me (it's always with me), and I wanted to play Borderlands 2 (which I play every Monday), which is installed on the Dell. Since I only had the Mac, I figured I might as well see whether I could play it, or whether it was even playable.
So I installed it via Steam, and it took a while (>2 hours), during which I watched Kill Bill Vol. 1 on TV, so I won't complain about that. Once installed, it crashed quite a lot at the main screen, though a restart fixed that (for some reason, Macs seem to need restarts far too often these days, edging a little too close to Windows). And there it was, running smooth and crisp on a tiny, thin, low-power laptop, much, much better than on the Dell. To say I was surprised would be an understatement.
I immediately fired up my browser (no IE, huh!) and decided to do this properly. So I got some graphics benchmarks for the Mac and the Dell (here and here) and got down to working out why the Mac could be better at graphics than the Dell. Sure, when I had bought the Dell, the graphics card had been a mobile one (meaning it was low power and not all that great), but since the Mac and the Dell are both laptops, I decided to ignore this point. I had always been under the impression that a dedicated graphics card could outperform any embedded chipset, which was the reason I had been gaming all these years (ever since 2014) on the Dell instead of the Mac. That, and my policy that work laptops should not contain games. But since I'll be buying a new Mac soon enough, and the 'work' I'm currently doing involves pretty much just writing and reading, I think I'll manage for a while.
The reason the MacBook Air can do better graphics turned out to be simple: better technology. Advances in processor manufacturing have made chips far more power-efficient (compared to 2009), which allows cramming more onto the die than before. The Intel graphics are part of Haswell, which was the best architecture (for consumers, anyway) in 2013. The ATI Mobility chipset, meanwhile, is a traditional GPU architecture shrunk and trimmed down to cut power. Although the ATI card has dedicated memory, the shared memory and faster CPU access that the Intel graphics enjoy mean faster graphics data transfers. Plus, its memory bus is twice as wide as the ATI's. Think of the memory bus as road lanes: more lanes means less traffic and more throughput.
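To make the road-lanes analogy concrete, here's a rough back-of-the-envelope calculation of peak memory bandwidth. The bus widths and transfer rates below are my own approximations for illustration (dual-channel LPDDR3-1600 shared memory for the Air, a 64-bit bus to the Radeon's dedicated memory), not figures pulled from a spec sheet:

```python
# Rough, illustrative memory bandwidth comparison.
# The bus widths and transfer rates are assumptions for the sake of the
# example, not verified specs; the point is the formula, not the exact numbers.

def bandwidth_gb_s(bus_width_bits: int, transfer_rate_mt_s: int) -> float:
    """Theoretical peak bandwidth = (bus width in bytes) x (transfers per second)."""
    return (bus_width_bits / 8) * transfer_rate_mt_s / 1000  # GB/s

# Intel HD 5000 sharing dual-channel LPDDR3-1600 system memory (128-bit total)
intel = bandwidth_gb_s(bus_width_bits=128, transfer_rate_mt_s=1600)

# ATI Mobility Radeon HD 4570 with an assumed 64-bit bus to its dedicated memory
ati = bandwidth_gb_s(bus_width_bits=64, transfer_rate_mt_s=1600)

print(f"Intel HD 5000 (shared):  ~{intel:.1f} GB/s")
print(f"Radeon 4570 (dedicated): ~{ati:.1f} GB/s")
```

At the same transfer rate, doubling the bus width doubles the theoretical bandwidth, which is exactly the extra set of lanes in the analogy.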
For someone who is pretty interested in, and supposedly in the know about, chipsets and processor technology, this just goes to show what bias and misinformation can do. I had always assumed that dedicated graphics cards were the only way to get decent graphics performance on a laptop. This sudden change in perspective also made me question what other biases I might be carrying with me. Some perspectives are never questioned simply because no one ever checks them. I could have just been happy that some game worked on the Mac, but this (agonizing) habit of questioning and intrigue always leads to changes in thinking, either by learning something new or by breaking old shackles down.
What was your last assumption that simply made you stop and wonder how it had stuck in your head for so long?