The Interface Trap: When Technology Stops Being a Tool
I turned on my TV the other night and waited. Not for the show. For the TV itself. Eight seconds of boot animation, then a home screen I didn't ask for, then an app grid, then Netflix, then a profile selector, then a content menu. By the time I was watching something, I'd made six decisions and navigated three layers of interface just to do what a power button used to handle on its own.
It wasn't always like this. I remember turning a TV off on ESPN and turning it back on to ESPN. The device remembered. It had one job and it did it without ceremony. Now the TV doesn't remember anything, because the TV isn't really a TV anymore. It's an operating system that happens to display video. And the operating system has its own priorities: promoted content tiles, ad placements, app partnerships, engagement funnels. My intent to watch ESPN is somewhere on that list, but it's not at the top.
This pattern is everywhere, and once you start noticing it, you can't stop.
Screens on Top of Screens
My home alarm panel used to be a keypad. Physical buttons, instant feedback, muscle memory. You punched in four digits and the system armed. Eyes closed, half asleep, didn't matter. It worked the same way every time.
The new one is a touchscreen. It has weather. It has a photo slideshow. It has schedules and zones and submenus. It lags when you tap it. The core function (arm the alarm) now competes for screen space with a weather widget I never asked for and a firmware update notification that won't go away.
This is not an upgrade. In control theory terms, you've increased system complexity while reducing feedback fidelity. The old keypad gave you tactile confirmation on every press. The touchscreen gives you a half-second delay and a hope that your input registered.
Cars have gone the same direction. Climate control used to be a knob. You turned it, felt the click, and the temperature changed. Eyes stayed on the road. Worked with gloves. Worked offline. Worked every single time.
Now it's a tap on a touchscreen, then a submenu, then a slider. There's lag. You glance away from the road to find the right icon. If the software crashes (and it does), you lose climate control entirely. We've moved a safety-critical function into a latency-prone abstraction layer and called it modern.
The "Smart" Appliance Problem
My washing machine needs an app. Let that sit for a second.
A washing machine needs three things: power, water, and a spin cycle. It has needed these three things for decades. It doesn't need WiFi onboarding, firmware updates, account creation, or a companion app that sends push notifications when the cycle ends. I'm thirty feet away. I can hear it.
But the app exists because hardware margins are thin. The machine itself is a commodity. What isn't a commodity is the data it collects, the ecosystem it locks you into, and the recurring software relationship it creates between you and the manufacturer. Your dishwasher has a cloud strategy now. Not because you need it, but because their business model does.
This is financial engineering disguised as user innovation. The incentive structure favors complexity, so complexity is what you get.
The Voice Interface Lie
I own Apple HomePods. I genuinely wonder if Apple engineers use them. They don't respond half the time. They mishear simple commands. They lag. They lose connection to each other. They forget what room they're in. A device marketed as frictionless control is, in practice, an exercise in repeating yourself louder and slower until you give up and use your phone.
The deeper issue isn't that voice assistants are buggy (though they are). It's that voice interfaces are fundamentally probabilistic. A physical light switch is 100% accurate. You flip it, the light changes state. Every time. A voice command travels through a microphone, gets processed by natural language models, makes a network round trip, interprets your intent with some confidence score, and then maybe does what you asked. We replaced deterministic control with inference and called it progress.
That's not a technical complaint. It's a philosophical shift: from agency to interpretation. The device is no longer executing your command. It's guessing what you meant.
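The contrast can be sketched in a few lines of Python. Everything here is invented for illustration: the class names, the 0.8 confidence threshold, the single random score standing in for a whole speech pipeline. Real assistants are vastly more complex, but the shape of the failure is the same. A switch cannot misunderstand you; an inference pipeline can, and some fraction of the time it will.

```python
import random

# Deterministic control: a light switch. Same input, same result, every time.
class LightSwitch:
    def __init__(self):
        self.on = False

    def flip(self):
        self.on = not self.on  # the state change is guaranteed
        return self.on

# Probabilistic control: a toy stand-in for a voice assistant.
# The 0.8 threshold is a made-up number, not a measurement of any product.
class VoiceAssistant:
    CONFIDENCE_THRESHOLD = 0.8

    def __init__(self, light):
        self.light = light

    def handle(self, utterance):
        # Speech recognition and intent parsing each add uncertainty;
        # collapse all of it into a single confidence score.
        confidence = random.random()
        if confidence < self.CONFIDENCE_THRESHOLD:
            return "Sorry, I didn't catch that."  # the command is dropped
        self.light.flip()
        return "Done."

switch = LightSwitch()
switch.flip()  # on, every time
switch.flip()  # off, every time

assistant = VoiceAssistant(LightSwitch())
results = [assistant.handle("turn on the light") for _ in range(100)]
failures = results.count("Sorry, I didn't catch that.")
# Some fraction of identical commands simply fail. The switch never does.
```

The point isn't the numbers; it's the structure. One path has no branch where your intent gets dropped. The other has that branch built in.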
Why This Keeps Happening
Three forces drive the pattern, and none of them are about making your life easier.
The first is software scale bias. Engineers are trained to build for extensibility. Every problem looks like it deserves a platform, an API, a settings menu. Simplicity gets engineered out because it doesn't demonstrate technical ambition. A knob doesn't get anyone promoted.
The second is the feature arms race. More features mean easier marketing. "Smart" is a selling point on a box, even if the smartness makes the product worse at its core job. Nobody puts "does one thing reliably" on a product page, even though that's what most people actually want.
The third is screen-native design culture. A generation of designers grew up building apps. When they design physical products, they default to what they know: touchscreens, menus, interfaces. The assumption is that every interaction deserves a UI. It doesn't. Some interactions deserve a button.
The result is predictable. Tools become platforms. Platforms become ecosystems. Ecosystems become fragile, interdependent, and slow.
The Cognitive Tax You're Already Paying
Every "smart" device in your house adds cognitive overhead you didn't sign up for. State ambiguity (is the system armed or not?). Update anxiety (will this firmware break something?). Connectivity dependency (does the thermostat work if the WiFi goes down?). Menu navigation for tasks that used to require zero thought.
Multiply that across your car, your alarm, your lights, your speakers, your washer, your dishwasher, your thermostat, your TV. You are now managing a distributed system inside your own home, except without any of the observability tools or control primitives that actual distributed systems engineers rely on.
You've accidentally become the SRE of your own kitchen, and nobody asked if you wanted the job.
What Good Technology Actually Looks Like
There's a reason mechanical watches still exist. Film cameras still exist. Cast iron still exists. These aren't nostalgia objects. They're durable because they embody a design principle that modern consumer tech has abandoned: the best technology disappears into physical affordance.
Good design reduces decision trees. It minimizes abstraction layers. It favors mechanical feedback over digital signaling. It keeps critical controls local and deterministic. It does one thing, and it does it without asking you to navigate a menu first.
The critique here isn't that technology is bad. I build software for a living. The critique is that progress which increases friction is not progress. Software without reliability is regression. And screens, increasingly, are the new bureaucracy: layers of interface between you and the thing you're actually trying to do.
The question worth asking is simple. When did we start accepting that turning on a TV should require six decisions? When did we agree that a washing machine needs a login? When did we stop expecting our tools to just work?
And more importantly: when did we stop noticing that the complexity was never for us?