In January 2006, Apple took an important step toward success in the business world: it began transitioning the Mac to Intel processors. In so doing, the company paved the way for Macs to natively run Windows and Windows applications. Initially this capability came via Apple’s dual-boot system, called Boot Camp, and soon after via Parallels Desktop, software that ran Windows in a virtual environment. Either way, it eliminated an obstacle that had kept the Mac out of most workplaces: the need to run applications not built for the Mac’s operating system.
Alongside support for other business standards in networking and communication tools, this opened the door for the eventual acceptance of Apple in the enterprise. Granted, the iPhone’s later success helped, but the ability to run Windows was the golden ticket for the Mac in the workplace.
A decade and a half later, Apple upended that strategy by moving to its own ARM-based chips. Boot Camp isn’t available on Apple silicon Macs, but the ability to run Windows on the Mac still exists, thanks to virtualization. Microsoft even recommends Parallels as an official solution, alongside its own Cloud PC technology, for organizations that still need to run the Windows OS or Windows apps on a Mac. The question, however, is this: do Macs in business even need this ability any longer?
That question may sound surprising or even shocking — typing it even felt a bit heretical — but in a “mobile first, cloud first” world (to borrow Microsoft’s onetime tagline), one in which businesses and IT departments are trying to adapt to post-Covid realities and where most IT budgets are being stretched, it’s a question that should be asked.
The answer, for most, is no. For a great many companies, we’re in a world where Windows is optional — and sometimes the other options are better.
How’d we get here?
In looking at this question, it’s important to have a sense of perspective. We’re talking about a 15-year span between Windows being the center of the business universe and Windows being just another option. A lot happened during those years, and getting to this point involved a number of critical shifts.
For the most part, it was a different Apple product that changed the rules. A year after Apple moved the Mac to Intel, it unveiled the iPhone, and a year after that, it unveiled the App Store. Two years later, while most of the world was wondering what impact the iPad was going to have, Apple debuted its MDM platform. The introduction of Apple MDM was more significant than most of the IT world recognized at the time, and we’ll get to that in a bit. First, let’s look at what happened on the non-technical side of the workplace after 2007.
The iPhone really got its legs in 2008 with things like 3G, multiple carriers around the world, and the App Store, and it was truly revolutionary in the work world. This was the first time that employees across virtually every field had the ability to pick a piece of technology to use at work. For almost any task, there’s an app for that. Can’t use the corporate network? Use your mobile carrier. Need to transfer work documents from the PC on your desk to the phone in your pocket? Use a cloud provider or good old-fashioned email.
The iPhone, with Android on its heels, forced its way into the workplace whether IT departments wanted to support it or not. It was the catalyst for that “mobile first” world and the consumerization of IT — or, as we call it today, digital transformation.
While the iPhone was shifting the world in one way, cloud computing was shifting it in another. The advent and broad adoption of SaaS applications and the “as a service” mindset was breaking down the Windows desktop and application world, and both tech giants and new players were capitalizing on that change. Google, Dropbox, Slack, and even Microsoft itself showed us that business computing could be incredibly flexible, and the vast majority of it could be done in a browser.
A browser might not always be ideal, but it replaced Windows as the line in the sand for what was absolutely required to conduct business.
Alongside the browser was the App Store, which could deliver those same cloud services through interfaces far better than a browser could offer. The rise of browser-based computing, the mobile revolution, and the fact that macOS and iOS are two sides of the Apple coin all contributed to a perfect storm in which developers and users could build a new business ecosystem.
iPhone manageability opens the door
Then there was Apple MDM. Introduced in 2010, the iOS device management framework gave IT a way to manage iPhones (and eventually other Apple devices) in the workplace. When we talk about MDM, we’re usually talking about mobile devices (MDM does stand for mobile device management), but nowadays the same MDM protocols that manage iPhones and iPads also manage Macs (and Apple TVs).
MDM wasn’t the first time that Apple decided to play alongside PCs in IT’s technical sandbox — the company had offered support for Windows file and network sharing, Active Directory authentication, and Exchange years before it transitioned to Intel processors. But MDM was a key factor in Apple as a whole gaining traction in business.
With smartphones taking the world by storm, it was inevitable that people would bring in their own devices. Apple had seen that and gotten out in front with a framework to secure, manage, and support those devices. With the same framework covering both macOS and iOS, the company managed to slip Mac support and acceptance in alongside the iPhone.
As a side note, Apple also (indirectly) ushered in enterprise support for Android devices and even Chromebooks. By providing its MDM framework for third-party vendors to use in their own mobile management services, the company opened the door for unified endpoint management (UEM) platforms that can typically manage devices running Windows, macOS, Android, iOS, iPadOS, ChromeOS, Linux, and more.
Employee choice, Covid, and maintenance costs bolster Mac acceptance
All of this change led companies to support the Mac and helped launch the employee choice movement, in which workers and managers can select the computing device they feel most comfortable and productive using. At first the companies joining this trend were outliers, but as large corporations like IBM began instituting employee choice, the idea began to gain real traction, becoming a selling point for potential new hires. Today, employee choice programs are mainstream and expected. Not offering computing options has become a hindrance to companies seeking talent.
Covid and the shift to remote work have also had an impact, both by encouraging workers to use whatever they needed to get the job done during extended lockdown periods and by empowering them to choose how they wanted to work — home office, preferred apps, personal smartphone, and Mac or PC.
Then there’s the issue of cost. Mac advocates have long made the case that Macs save money in the long run despite their higher up-front cost. Data from IBM and from Forrester demonstrates and quantifies the fact that Macs do save money — sometimes significant amounts of money — in support costs.
Is Windows necessary? Not so much
All of this bolsters the case for having Macs in business, but that case doesn’t need to be argued much today; the verdict has been in for a while. Does that really mean businesses don’t need Windows?
Painting with a broad brush, they don’t. There are always going to be exceptions, and sometimes they are big ones, but the vast majority of business tasks no longer require any particular platform. Any computing device on the market will do the trick. Whether it’s Office documents, collaboration apps, virtual meetings, or other business software, each vendor has not just one solution but several.
Microsoft’s drive into services and fluid computing models, where information spans apps, locations, formats, and devices, points to the fact that even the company that makes Windows knows it’s far from the only game in town, that the paradigm has shifted and isn’t changing back. To maintain relevance, to say nothing of maintaining dominance, Microsoft has to play on every front. And what’s true for Microsoft is true for all software vendors.
Nor is Microsoft even the default choice for business software and collaboration today. For every key business technology it offers, there are competing options that work just as well or better. For Microsoft 365, there’s Google Workspace; for Teams, there’s Slack and Zoom; for Active Directory, there’s Okta; for Intune, take your pick of UEM vendors. Everyone on the field knows that one of the most important rules of the game is to support business users on any device they choose.
This isn’t to say that there is no need for Windows or that there will never be a situation where a Mac user will need to run a Windows app. Many organizations still have ongoing digital transformation programs, and not all legacy apps (particularly in-house custom apps), workflows, or business practices have migrated away from Windows. But if they haven’t already, then they likely will in the future. Even the most archaic apps need to be able to run on recent PCs, and there’s always a tipping point where the cost of patching and supporting older technologies begins to outweigh starting from scratch with modern tools.
So back to the question: do Macs need to be able to run Windows in order to be true business machines? With few caveats, the answer is no — and as for those outlying instances where the answer is still yes, we’ll have to check back in a year or two or five to see if the need is still there.