It started with a realization: My entire digital life lay in the hands of Apple. My phone. My computer. But also, the really important things: my credit cards; my photos; my passwords; my journal; my email.
For a long time, I didn't see this as a problem. In fact, it was a benefit. I liked having everything centralized and interlinked. I liked (loved) the ecosystem. The UX. The design. It was, and still is, the best in the industry, at least in my view.
But when you put all your eggs in one basket, you're really trusting that the basket is going to hold. At some point in the last few years, I started doubting the integrity of mine. Apple started making decisions that gave me pause. Some of their product releases felt less polished than before, they seemed to be optimizing more and more for shareholder-first services revenue, and there were concerning reports of users being inexplicably locked out of their accounts. The prospect of being kicked out of my centralized digital life made me very uneasy.
I had dug myself into an Apple grave, as I took to calling it. I was totally at the mercy of whatever they decided to do with my hardware, my software, my data—everything. And, of course, they also loved to control what I could and could not load onto my devices. So, I started to think about ways to rebuild my digital life within a safer, more independent structure. What followed was a nearly year-long journey that would ultimately reconfigure my preconceived notions about computing and personal agency—though maybe not in the way that you think.
My first longings beyond Apple's walled garden were for an Android phone. But not just any Android phone: I wanted the modular, repairable one. After spending so long with iPhones—until recently, among the least repairable handsets on the market—I was ready for something I could fix myself. Something that guaranteed an extremely long software lifespan. A Fairphone—European-bred and focused on repairability and ethical manufacturing—fit the bill. It was the foremost example of a philosophical 180 from the Apple ethos, and I wanted to dive in headfirst.
The first few days with my first-ever non-iOS smartphone were exhilarating. I could pop off the back! The battery was right there! I would never again have to pay through the nose for a repair; I could just get the parts cheaply and install them myself. This would be the last phone I would ever have to buy.
The software was equally inspiring. I could torrent things on my phone! It had USB-C and video out, so I could hook it up to a TV and watch the things I had torrented! I could download unauthorized apps, run exclusively open source software, and totally renovate the home screen into a text-first minimalist launching pad. It was really cool and really freeing.
But that exhilaration didn't last long. I quickly began to realize that the whole experience was disjointed. Unpolished. Rough around the edges. Of course, I knew going into this that Android simply couldn't touch iOS for UX—but I was really surprised at how things actually felt after switching.
The camera became my first major pain point. It was awful. Really, truly awful. The tone and color balance were always wrong. The sharpening was strange. And, I am not exaggerating one iota, you simply could not take a photo of anything in motion, because it would come out blurry. Every. Time. The app itself was janky and jittery and bugged out constantly.
Theoretically, the beauty of Android is that you can install another camera app. Which I tried. In fact, I tried over a dozen different apps before I realized that Android (or at least the Fairphone) doesn't expose the hardware APIs third-party apps need to use the camera to its fullest extent. So, if you zoomed into any of the images taken by those other, arguably usable, camera apps, you would see that pixel binning didn't work properly and that everything was a smudgy mess.
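For the curious: what I mean by "the right hardware APIs" is roughly the Camera2 capability level a device advertises to third-party apps. Below is a minimal Kotlin sketch, purely illustrative and not something I ran at the time, that queries that level for each camera. Whether the Fairphone actually reports LEGACY or LIMITED is my assumption, but a level below FULL is exactly the kind of ceiling that no amount of app-swapping gets you past.

```kotlin
import android.content.Context
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager
import android.hardware.camera2.CameraMetadata

// Illustrative sketch: list the Camera2 "hardware level" each camera reports.
// A LEGACY or LIMITED level means features like per-frame manual control or
// RAW capture are often unavailable to third-party apps, no matter how good
// the app is. Whether a given Fairphone reports one of those levels is my
// assumption here, not something I logged at the time.
fun dumpCameraHardwareLevels(context: Context) {
    val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
    for (id in manager.cameraIdList) {
        val characteristics = manager.getCameraCharacteristics(id)
        val level = when (characteristics.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL)) {
            CameraMetadata.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY -> "LEGACY"
            CameraMetadata.INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED -> "LIMITED"
            CameraMetadata.INFO_SUPPORTED_HARDWARE_LEVEL_FULL -> "FULL"
            CameraMetadata.INFO_SUPPORTED_HARDWARE_LEVEL_3 -> "LEVEL_3"
            CameraMetadata.INFO_SUPPORTED_HARDWARE_LEVEL_EXTERNAL -> "EXTERNAL"
            else -> "UNKNOWN"
        }
        println("Camera $id exposes hardware level: $level")
    }
}
```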
The browser situation proved equally frustrating, since I didn't want to use Chrome. Why, then, did I choose Android? Because I knew that I could swap Chrome out for any other browser I preferred. But, as I was learning, this freedom came with an (admittedly, industry-wide) asterisk: first-party apps are always better. They integrate seamlessly with the hardware and other apps, and they're more polished than anything you'll find on the app store. Yes, I could (and did) switch to Firefox, but Firefox isn't a first-party app. It doesn't perform like one. Its UX conventions clash with the rest of the OS. Its visuals don't match. Nothing about it feels native. (Here's where Android fans start steaming. I'm missing the point of modular systems!) Sure, Firefox works, but why can't I just have Chrome with an ad blocker?
While these might seem like niche, nitpicky problems, they're the exact sort of things that matter to me. And all these little rough edges and weird "no, that actually doesn't work" moments start to erode your trust in the foundation of the platform. Even when things mostly function, they don't function in a way that feels reliable. Android gave me more control, sure, but at what cost? It often felt as if I were forced to take control, because nothing behaved sanely unless I invested the time to swap out each piece of the greater system. When a system's inherent flaws start to compound, both in hardware and software, you find yourself constantly second-guessing whether it's going to perform when it matters.
This gets to the heart of what Android fails to prioritize: the polished, last-mile UX that makes a system feel coherent and trustworthy. Android may be open, but it is not artful. (Which reminds me of a particular bugbear: there's this permanent Google search bar that you can't remove from the default home screen. It's just there. Forever. You can't delete it. The best you can do is use a third-party launcher that, of course, is not optimized in the way that the first-party one is. It feels unpleasant and very at odds with the platform's promise of customizability.)
I decided that Android was a strong illustration of the tension between positive liberty (the freedom to do what you want) and negative liberty (freedom from interference). In trying to provide the former, Android often fails to deliver the latter.
I also realized something else: in my haste to escape the grave I had dug for myself in Apple's garden, I had run into the arms of Google—another tech giant with its own complicated relationship to user privacy. I had swapped one megacorp for another. Was that really helping things? Not only were hardware problems and the core functionality of third-party apps now things I had to actively think about for the first time in years, but I was dealing with them inside an ecosystem that fundamentally respected me less than the one I had just left.
And it was this realization that prompted me to, strangely, double down in another way: if mobile independence wasn't the answer, perhaps desktop computing held more promise. I was going to try Linux.
Open source. Democratic. Free from Google, Apple, Microsoft—the whole gang. My own computer doing only the things I wanted it to do. This was surely the antidote. And in a lot of ways, it was.
I started off, as all novices do, with Ubuntu. You see, Linux isn't just one OS—it's the name for a collection of many. Many, many. Ubuntu is the flavor of Linux most often associated with "it just works" simplicity. So when I nuked my desktop and fired up the USB installer, I was quite relieved when I didn't have to use a single terminal command to get the thing onto my hard drive. First Linux myth: busted.
System installed, I was immediately taken by the UI. It was clean, it was straightforward, it was minimal. It was very obviously not invested in getting me to click on ads and trials and first-party subscription services. But I think I liked it mostly, embarrassingly enough, because it was Mac-like. Foreshadowing, perhaps. But at the time, it was cool to have a powerful desktop run something other than Windows.
I was also seeing things that made me not only happy with my choice, but more optimistic about and reinvigorated by computing as a discipline. Chief among them: the open source software and the development work that went into it. Wherever I looked on my Linux setup, I saw the fingerprints of people, not corporations, undergirding the entire system. It felt like a return to something more human.
You see, corporate OSes are analogous to products of mass production. Mass production makes us feel alienated because it strips us of any tangible connection with the people on the other side of the assembly line. Before the Industrial Revolution, objects were wonkier, lines weren't as straight, and patina was a virtue. You could look at any chair or pane of glass you owned and see something human, some fingerprint of a creator, in it. Linux as a system has these fingerprints all over. You don't have to dig to see the scrutable choices made by individual contributors who wanted to make the system better for everyone. And that's touching.
You also don't have notifications constantly bleeping, or programs included that are just portals for data collection, or ads in your taskbar. You don't feel tracked or commodified. You just have a human system, built on a human level, to be used by humans. This is the best thing about Linux and open source: it's a beautiful paradigm that highlights the power of collective action. I felt inspired to contribute both time and money to projects that I considered to be doing good work, and to give significantly more than I would have for just another paid app.
But the twin problems of boutique, by-hand creation have always been robustness and consistency. For all the ills of mass production, an assembly line is tuned for optimal performance. Ubuntu and the like are not. You might counter that, unlike physical goods, software can be infinitely replicated without loss of quality, which should theoretically give open-source software an advantage in iteration and consistency, letting people build on "good" until they reach "perfect." Yet the coordination costs of distributed development often produce exactly the kind of inconsistency and roughness that came to characterize my time with the system. So, as with Android, the jagged edges eventually did me in.
Take, for instance, my Wi-Fi card. Even after I got it working, it would occasionally drop connections without warning. Then there was the issue of window flickering—a minor annoyance at first, but one that became increasingly grating over time. Managing packages and dependencies became a chore, especially when different software required conflicting versions of the same libraries. My Nvidia GPU didn't play well with anything, and drivers were shaky at best. Codecs for media playback were hit or miss, and you could forget about any modern video feature like super resolution or frame generation.
The more time I spent tweaking these issues, the more I fell into a sunk cost fallacy. Each hour spent troubleshooting made me more convinced that this was normal, sustainable behavior. But it wasn't. I often found myself happily surprised when something functioned the first time, without any tweaking. This is not a healthy expectation for your computer. It should fulfill your needs, and in the best case, anticipate them, without you having to second-guess or correct the execution. Linux was all second-guessing, all the time. My conclusion was that Linux was a hobby, not a workstation.
This brings me back to a fundamental question: Is it better to have a functional-at-best experience that you own or an exceptional one that you don't? Openness and closure are means, not ends, and shouldn't be treated as moral imperatives. The debate over closed versus open systems often gets lost in what I would consider a weird telos, a system-architectural millenarianism, preaching that one of the two systems will eventually emerge victorious due to the inherent contradictions, if not wrongness, of the other—while ignoring the everyday user experience right in front of us. And in the end, you can't run a computer on beautiful dreams alone.
Over the course of my year of alternate computing, I became convinced that UX and design are, in fact, what I value most in computing. I value them more than performance. More than independence. More, even, than operational security.
So, I'm mildly ashamed to admit, I eventually switched out my Android (by then a Sony Xperia, also a lemon) for another iPhone and my Linux distro (by then Pop!_OS, less of a lemon than Fedora or openSUSE but roughly on par with Ubuntu) for Windows on my desktop.
And, I'm more ashamed to admit, it felt great. It felt right. I no longer had to second-guess my system or spend any time at all troubleshooting or interpreting opaque UX. I knew that everything would just work.
But my time with these other systems helped me see that we are living through a great consolidation. The dream of the early internet was decentralization—anyone could run their own server, build their own tools, control their own digital destiny. But the reality of modern computing is consolidation around a few major platforms. The stack has become so deep—from silicon to machine learning models—that only the largest companies can maintain competitive platforms end-to-end.
The question isn't whether to participate in this consolidation, but which consolidator to choose. That realization alone makes me want to run screaming back to the platforms I've just given up on. Would a Google Pixel have offered the polish I craved while maintaining some independence? Would learning to code have given me the tools to truly master Linux? These questions still nag at me. Even now, I get the itch to try everything again.
But, at the end of the day, I want my computers to work. Freedom and control are enticing, but they exact their own toll. Outside of the Apple ecosystem, hardware problems still exist and sometimes they're quite pronounced. First-party apps will always have a UX advantage, so why settle for third-party? Simplicity and intuitive function are perhaps the core virtues of a computer. It's better to have the right option pre-selected than be given the choice of a lot of bad ones.
So here I am, back in my Apple grave. Maybe one day I'll try to dig myself out again, but for now, I'm content with a system that, really and truly, just works.
Published December 4, 2024