Small-Scale Question Sunday for June 23, 2024

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.

Does anyone have an explanation for why Microsoft products in particular feel so bad?

I've been using a Macbook Pro since 2015 as my personal computer. It does everything I want with nearly no issues. The phrase "it just works" is a perfect encapsulation of my feelings. (This isn't just Apple fanboy-ism speaking; everything else I own like TV, phone, watch, etc. is non-Apple).

I've recently had to use a Windows computer for work purposes and it has been a god awful experience. It frequently won't wake from sleep, it often overheats while I charge it overnight, sometimes it won't connect to Wi-Fi and the Wi-Fi settings button is completely missing/inaccessible, getting it to change the scroll direction for the mouse wheel takes 5 more steps than it should, etc. I've also had to use Microsoft Teams instead of Zoom and Microsoft Outlook instead of Gmail, both of which have been massive downgrades along every dimension (lag, dropped calls for Teams; awful UI, lag, and terrible search/organizational functions in Outlook).

Imagine an insane genius who hates end users, who constantly schemes up ways to make things worse, and whose design ethos is 'whatever our customers will hate most and find maximally inconvenient', and you wouldn't be too far off as to why.

Some of the problems are downstream of hardware vendors. Wake-from-sleep and charging problems are well-documented as hard to reproduce, and are probably specific to firmware sleep-mode support on various hardware being dumb as a sack of hammers. While it'd be nice for Microsoft to enforce various OEM requirements more stringently (or even to take a harsh hand in enforcing BIOS behaviors other than the stupid TPM clusterfuck), I think at least part of this trades off against the increasing variety of capabilities and options that motherboard manufacturers have been able to give out, and while some of that latitude has given us Asus, I'd still take that over Apple charging an arm, a leg, and a first-born child for a soldered-on NVMe module.

((And knowing MS, we'd probably lose Framework and keep Asus as the new standard.))

But there is a general trend toward worse user interfaces, in places that are software-specific. In addition to the problems you've already mentioned, Windows Start Search searches the web before local files or even applications, settings menus repeatedly disappear or get moved for little cause, and a variety of first-party apps range from bad to hilariously error-prone.

The standard argument is that Microsoft is no longer selling products, but advertisements, and these decisions either reflect those pressures (adding more interstitial buttons to change a setting provides more marketable data, somehow), or are downstream of the old pressures no longer existing (Microsoft no longer cares if you prefer an old UI, because they're not selling to you).

These are probably part of it, but the same issues have shown up at other companies as well, including many that either don't face the same pressures or have always been in the ad business. I think the structure of managing programmers has itself gone tango uniform, especially in FAANG environments, such that building a slick new view has become far more important to careers than any underlying functionality, either intentionally or as a side effect of increased review of code that could invoke regression testing. There are some fair arguments in favor of this behavior over the old one to some degree -- the Windows Format Disk dialog, rather famously, was the result of putting code over UI, with corresponding Fun edge cases -- but the pendulum has definitely swung too far in one direction.

Is it a Dell by any chance? My work uses them, and everything about that company, from its hardware to its support, is almost aggressively mediocre.

I too use a Mac, but I have an older, higher-end Windows box for gaming, which works mostly smoothly and without issues. Things are more hit-and-miss with Windows since the OS and hardware don't come from the same company; Windows is trying to make a universal OS that works on a wide range of hardware, and that brings special challenges. Hence I prefer the MacBook Air experience for everyday (non-gaming) computing.

I don't know. I wanted to stay with Windows, but I also wanted a premium-feeling laptop. Since I use it all the time, I wanted it made out of metal, and to get aluminum construction, the MacBook Air was much cheaper. So that's what I've used for the last five years.

I have never really used a Mac, but the reason it and my OS of choice (Linux) are better for me than Microsoft's products is probably value alignment.

If you go and buy a chocolate bar, you can expect it to taste good. This is because the company that makes the chocolate is interested in making money, and must satisfy the preferences of its customers for them to keep buying it. Thus the design of the chocolate is two steps removed from being aligned with you: the degree to which your tastes align with those of the average customer, and the degree to which the average customer's spending tracks the taste of the chocolate. These steps are likely pretty small – your tastes probably don't differ massively from the general public's, and buying candy is a pretty straightforward transaction with lots of competition.

For software (and especially massive products like operating systems), the situation is entirely different. The consumers are diverse: some are interested in video games, some in office work, some in servers, and so on. You are probably not very close to the center here; Microsoft makes most of its money from hosting and cloud services, not OS sales. The second step is really bad as well; operating system sales suffer from practices like vendor lock-in, OEM preinstallations, sunk costs, piracy, and painful friction when switching systems. The result is a system that doesn't do what you want, because "what you want" is pretty different from the incentives Microsoft actually has.

Apple makes most of its money from selling actual products where the quality is strongly connected to the amount they sell. Moreover, they are targeted at consumers and professionals like you!

Linux is even more extreme. The modal user is a meganerd like myself, and the second step doesn't even exist – a whole lot of the development is done by people solving their own problems, which is by definition perfectly aligned with their needs. This results in an ecosystem that can compete in quality for its users even though significantly fewer resources are involved than in the other two.

(You can incidentally do the same analysis for any other domain. If your tastes don't strongly align with the general public and the general public buys whatever the company puts out, then it's a pretty safe bet that the product will be terrible.)

The answer to your question is that Microsoft only controls one component of the entire user experience, the OS. Most of what you describe, and most of what people complain about, is the result of hardware manufacturers and third party software contributing to the experience.

Honestly, it sounds more like they gave you a crappy computer than like there's any problem with Microsoft. I haven't used Teams, so I don't know if the problems you describe are endemic to the software or downstream of the issues you seem to be having with your computer, but I'd say that in general it isn't fair to judge a company based on one product that isn't anywhere near a flagship. I agree with you on Outlook, but I've used both it and Gmail, and both have their problems depending on what you're trying to do. Microsoft Office in general is miles ahead of what little competition there is, and I say this as someone who has tried to implement non-Microsoft solutions for business purposes and been forced to use Google products by employers. I worked for a company that wanted us to use G Suite, and the whole thing was garbage to the point that it was easier to just do the work in Office and convert it than to use G Suite directly. It all depends on what you're comparing it to.

takes 5 more steps than it should

Lol, if only it was that good for everything.

Something I ran into literally right now: I wanted to make letter-sized (8.5x11") slides in PowerPoint. The process is:

  1. Change the system of measurement to inches (optional):
    • Gain inspiration and mysterious clues from their help page.
    • Quit PowerPoint.
    • Navigate through Control Panel -> Time and Language -> Date, Time and Regional Formatting -> Additional date, time & regional settings -> Region/Change Date, time, or number formats -> Additional Settings -> Measurement System = U.S.
  2. Change the slide size:
    • View -> Slide Master -> Slide Size -> Custom Slide Size
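For what it's worth, step 1 can also be done without the Control Panel trek: the measurement system is stored in the registry. A hedged sketch, run from a Windows command prompt (the `iMeasure` value under `HKCU\Control Panel\International` is documented as 0 = metric, 1 = U.S., though I haven't verified behavior across every Windows version):

```shell
:: Registry shortcut for changing the measurement system (Windows cmd).
:: Assumption: Office apps read iMeasure at launch, so quit PowerPoint first.
:: 0 = metric, 1 = U.S.
reg add "HKCU\Control Panel\International" /v iMeasure /t REG_SZ /d 1 /f
```

Either way, PowerPoint has to be restarted before the ruler switches to inches, which is presumably why the official instructions tell you to quit it first.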

I have no idea how anyone is supposed to know that the ruler in PowerPoint (but not Word!) is set within the language settings in Windows. Or that you should start at "View" to get formatting options. It feels like they went from "If you don't know something, you shouldn't be afraid to ask for help" to "If our users don't know something, they should be required to ask for our help".

It feels like they went from "If you don't know something, you shouldn't be afraid to ask for help" to "If our users don't know something, they should be required to ask for our help".

It makes more sense when you realize that "desktop computing and office suites", as a general paradigm, peaked around the year 2000. UI/UX research was still going strong at large companies like Microsoft and IBM, and paying dividends in terms of customer satisfaction (because they still had to compete: if MS shit out complete garbage, Lotus 1-2-3 was still a viable alternative).

But the problem is that after that, we never rewrote anything; we just started stacking complexity upon complexity upon complexity, such that nobody actually remembers what the starting point was in the first place. This is why all new frontiers in computing have been "just stack another VM on it": you're starting from a clean enough state that you can actually make something useful for once.

This is part of the reason people use Macs: the UX paradigm hasn't changed since 1984. Microsoft has had three in that same time, one for the original Windows, one for '95, and one for 8 (tablets). This is why, Windows NT aside (the best versions of Windows ever released were NT 4 and NT 5/5.1, usually known as Windows 2000/XP), everything MS does feels like (and for the most part, is) a regression.

A lot of this is due to massive barriers to entry: this software is actually quite complex, and while some standards, like MS' OOXML, are open, there are a bunch of quirks to/bugs in MS' renderer that make copying it word-for-word impossible and that are, naturally, undisclosed. There has also been no major innovation (AI isn't it, by the way) in the way the software should work, because the ways in which we interact with computers have not meaningfully advanced: pixel densities and other interactions in VR aren't there yet (though the Vision Pro is a good beta test), AI is not meaningfully going to change this aside from fuzz testing (which companies won't even use, because there's no reason for them to compete on UX), and we don't have any neural linking yet. And network effects for "you have to know this software to get a job" mean that it's not possible to meaningfully compete on UX, so any company in a position to do so won't spend any money on it themselves, which makes the problem worse.

So all that remains are non-profit organizations trying to drive desktop development forwards in the shadow of the academics that created the computers in the first place, but they've been screwing around bikeshedding a replacement shell for 10+ years now (and... to be fair, they kind of do need to do this, as X Windows is technologically cool but also very crusty and not particularly secure) so I don't think they're going to amount to much.

I think that your experience is not representative.

I've recently had to use a Windows computer for work purposes and it has been a god awful experience.

I can't say that Windows is the best software in existence, but I haven't had any major problems with it.

I've also had to use Microsoft Teams instead of Zoom and Microsoft Outlook instead of Gmail, both of which have been massive downgrades along every dimension

I haven't had any major problems with these programs, either. (It would be nice if Outlook had Gmail-style labels rather than just folders. But in my experience such functionality is necessary only very rarely in the workplace. And, when it is necessary, Outlook does allow you to copy-and-paste an email and put the copy into a different folder as a workaround.)

Isn't that what categories are for?

Categories can't be nested.