VoxelVexillologist
Multidimensional Radical Centrist
I don't know the financial details at play, but money is often very fungible. It could easily be "letting all the tuition funds pay for humanities" or something like that. Although I've also heard universities complain informally about earmarked gifts: rich benefactors want new named buildings, not expensive repairs to existing ones.
> Without sufficient ICs a university could easily decide that STEM research is too expensive, while cash cow race-to-the-bottom courses in Sociology or Psychology become more attractive. There are fierce internal battles between factions within every university, and if STEM research becomes a big money loser, its advocates will lose more battles.
When I was in college, several of the STEM professors brought up how many adjunct sociology professors their research grants were funding from the half-ish of the money that was "indirect". It was quite a few, because relatively little of tuition ends up paying for professors. I'm not saying this hypothetical couldn't happen, but that the reverse of it (STEM research is a cash cow for high-ranked multidisciplinary universities) has been true before.
Imagine if you handed someone from 100 years ago a smartphone or modern networking technology. Even after explaining how it worked, it would take them some time to figure out what to do with it.
I came of age right as the Internet was taking off. But I've started watching classic movies and TV, and I think the "information at my fingertips" effect has happened so gradually that we don't fully appreciate its impact, even pre-LLM. One TV episode from the '90s I watched recently had one character tell another to travel to the state capital to find and photocopy dead-tree legal references, which was expected to take a day. My world today is radically different in a number of ways:
- State laws are pretty easily accessible via the Internet. I'm not sure how the minutiae of laws were well known back then. Are our laws themselves different (or enforced differently) because the lay public can now be expected to review, say, health code requirements for a restaurant?
- Computerized text is much more readily searchable. If I have a very specific question, I can find key words with ctrl-f rather than depending on a precompiled index. The amount of information I need to keep in my brain no longer includes things like exact quotes, just enough to find the important bits back quickly. The computer already put a bunch of white-collar workers out of jobs, just gradually: nobody needs an army of accountants with calculators to crunch quarterly reports, or humans employed to manually compute solutions to math problems.
- The Internet is now readily accessible on the go. Pre-iPhone (or maybe pre-BlackBerry), Internet resources required (I remember this) finding a computer to access them. So the Internet couldn't easily settle arguments in real conversation. The vibe is different, and at least in my circles, the expectation of precision in claims seems much higher. IRL political arguments didn't go straight to citing specific claims in quite the same way.
I sometimes feel overwhelmed trying to grasp the scope of the changes even within my own lifetime, and I find myself wondering things like what my grandfather did day-to-day as an engineer. These days it's mostly a desk job for me, but I don't even know what I'd be expected to do if you took away my computer: it'd be such a different world.
> The "community organizer" calls their local representative who calls a friendly reporter.
I have started almost completely writing off news stories about filed lawsuits for this reason. The bar to file is so low that, except in situationally interesting cases (IMO: those raising novel legal questions), I just can't be bothered with "patient says doctor didn't tell them about side effects" or "plane crash survivor sues airline". It genuinely seems to me like many of these articles are really (~~good~~ effective) lawyers exercising their networks with journalists to put public pressure on their counterparties to settle, or, less often, to raise the prominence of their clients. It'd be one thing if the articles had substantive analysis and outside facts, but they typically just lazily repeat one party's claims from the filing.
I'm curious if you have any thoughts on what Amazon would have based their design on, if not Android? Yes, much of its generally-accepted functionality isn't open source -- I've seen Google claim this makes updates, including security updates, easier to ship without relying on OEMs, which sadly makes sense, but it also helps their moat.
Even if you chose something else, I doubt "write an OS from scratch" was in the cards, and I assume you'd end up with Linux or BSD as the base, with a very slight chance of some commercial embedded platform.
I'm not sure I agree with "proprietary wins" everywhere. It certainly wins in the short term in lots of markets. Anything sold directly to lay users is often dominated by proprietary offerings precisely because of those paychecks: games, "apps", and such. But I can think of a number of markets where "loosely organized volunteers" have mostly-gradually won because the cost of copying software is functionally zero.
Four decades ago you were buying, for any computer you bought, the operating system (even if bundled), a BASIC interpreter, maybe a compiler, and maybe even, briefly, a TCP/IP stack and a web browser. These days, the open source model has wholesale swallowed some of these markets. We're down to two modern web rendering engines, Firefox's and Chrome's; everything else nontrivial is bolted onto those engines. There are, last I counted, four commonly used C/C++ compilers -- MSVC, ICC, Clang, GCC -- of which the last two, both open source, probably make up the vast majority of the market. Most devices that aren't Windows PCs (a shrinking market) or Apple products are running on a Linux kernel, and those that aren't are probably some BSD (or purely embedded). I can't imagine paying to use a programming language these days, and I'm pretty sure Matlab and such are losing ground to Python. There also isn't a shortage of academics working with non-proprietary tools and publishing cutting-edge, if not generally user-friendly, stuff.
IMO the lesson I take from this is that the proprietary model can win in the short term, and stay relevant for a long time, but it at least seems to me that even entrenched, expensive professional tools are slowly losing ground to free (as in beer, which is coincidentally often as in freedom) alternatives on a more generational time scale: Matlab to Python, with KiCad and Blender as examples of tools I expect to (mostly?) displace commercial alternatives in the next couple decades. As software expectations get more complex, the make-buy calculation changes when "buy" includes leveraging existing non-proprietary offerings. I don't know if I'd completely stan RMS here, though, since there are commercial source-available packages (Unreal Engine, for example) that somewhat have a foot in both camps.
> What old alliance are you thinking of?
Having re-watched the Lord of the Rings movies fairly recently, I can't help but feel like all of the comments about "the Old Alliances" there felt very relevant to today's geopolitical situation, even though the books were published after WWII. Some of the significant political moments felt very relatable: political leaders either feeble in their old age and fed questionable information by disloyal advisors, or stewards uninterested in the worldly success of their constituents. There is a real sense that the alliances have frayed and that, should the beacons be lit over one kingdom's reluctance, the forces of good will choose to fall divided rather than answer and stand together against the forces of Sauron. And this all takes place against the backdrop of the Elves, one of the members of the alliance, choosing to withdraw completely from the surrounding world and board ships to somewhere else. I'm not sure whether I'd map Europe to Gondor or to the Elves in this situation: both feel fairly pertinent at different times.
Obviously it's not a perfect allegory, nor do I think it was meant to be: monarchy isn't really that popular an idea these days, wingnut "God-Emperor" memes aside, and right-wing leadership falls far short of Theoden or Aragorn. And I'm not sure how well the populace maps allegorically either, given that orcs, goblins, and such are rigidly typecast as followers of evil: to those wanting to cast immigration as "the forces of darkness" here, Europe and America are hardly homogeneous kingdoms, and never really have been.
> It's ironically the Russians with the Su-57 that have fielded the first advanced post-Cold War aircraft, although I will also give the Super Hornet at least partial credit
Is there a reason you're discounting the F-35 here? Even going by the start of its development cycle (1995), it's clearly "post-Cold War", and there are far more of them in service (for a decade now) than Su-57s.
> Look I fully agree with you that the current setup for the EU is a stupid mess that only hamstrings itself, but you decided to join this stupid mess by signing stupid mess treaties and devolving your powers to stupid mess institutions.
Americans have been making similar comments about Federalism on this side of the pond for probably two centuries now. While some of those criticisms ring true, I think it'd be wrong to dismiss the American Experiment as having failed on that account. Americans are still having those very same arguments over our "stupid mess institutions" even now.
I'm not convinced that the idea of the EU is what's failing in practice. The most obvious difference I can point to is American chutzpah, which somehow seems more important than even the intra-EU language barriers.
> Europe has picked a side in the American culture war, and it is the far left side.
While I think your statement here has a strong ring of truth to it with respect to Kulturkampf and the dispositions of the cultural elites, I generally find claims that "center in Europe is far-left in America" to be true only for a very limited definition of the political spectrum. "Center in Europe" includes certain elements that I'd wager the average Republican considers far-right: Literal hereditary monarchs (many such cases, some established within living memory)! Official state churches (many such cases)! States collecting taxes on behalf of churches! Blasphemy laws!
Sure, "center in Europe" also looks a lot more friendly to carbon taxes than even the DNC, and endorses a shorter workweek, more worker protections (although German unions differ from their American counterparts in ways I find interesting), firearms restrictions (not uniform across the EU, but generally stricter than in the US -- although Sweden has problems with hand grenades that seem unbelievable to an American!), and so forth. I don't think the statement is completely wrong, just oversold.
> Look, we all know what he means by 'universal human rights', and editors in sociology journals know what he means, and reviewers know what he means
I feel like this is getting at why the political divide has become the way it is: a generation ago, I suspect even lay members of the public would understand and (broadly, if not uniformly) agree on "universal human rights", with Americans probably citing either the Constitution or the Declaration of Independence. Today, the ivory tower definition has moved on at least a bit, and while the academics probably agree with each other, the lay public has started noticing that The Powers That Be have tweaked the definitions out from under them, and they don't feel they've been consulted or heard on how it impacts them through issues like refugee status or (youth) gender medicine.
If you wanted to stir the pot, you could ask those folks how they feel about the demolition of the Anti-Fascist Protection Rampart (Antifaschistischer Schutzwall).
The terminology was selected because at least some of the dudes in question say "no", and asking about "MSM" is (supposedly) easier than trying to convince the men in question that they are gay. And that's okay.
> it is a mid sized country like Holland and stopped getting involved in far away places.
The Netherlands still has its own small overseas territories in the Caribbean.
I've long wondered whether antihistamines would be counterproductive when you're actually sick: the inflammatory responses aren't false positives then, and are presumably useful. I have seen some suggestions that fever reducers for minor fevers may be counterproductive. But it's a bit outside my area of expertise to actually find literature on antihistamines.
Almost every time I've seen government make a promise like that, the "end of 2025" gets pushed out 3 months, then to September for the federal Fiscal Year, then delayed indefinitely. The Sequester is maybe the only time I've actually seen something like that go into effect. Not to say it couldn't be done, but I think it'd be much less likely to go into effect that way -- independent of my feelings about whether or not it's a wise choice to do so.
All the blacksmithing demos I've ever seen have some serious ritual about how you are not supposed to hit the hammer on the anvil. Only with the workpiece between.
On one hand, I think the Holocaust does read differently if the exact same victims died of plausibly-deniable famine: it's seemingly unique on the basis of the deliberate industrial murder, even though accounts generally count malnutrition and exposure deaths. On the other, this just incentivises malicious incompetence going forward, and doesn't necessarily reduce actual body counts. I'm not sure exactly where I'd put it, but IMO there is a line past which incompetence should be assumed to be malice when it comes to mass murder.
Is it really that far from precedent, though? Clinton's ATF and FBI burned a bunch of women and children to death in Waco. Obama ordered drone strikes on American citizens abroad. Cheney actually shot a man while in office. Two of those successfully ran for re-election afterwards.
> They’ve never been to a ghetto at all with or without police, they don’t know anything about people who live there.
Maybe at the federal level, but at the local level some of the most experienced people are certainly civil servants. The police are mostly responding to calls in the bad parts of town, as are the paramedics and firefighters. Even the health inspectors are boots-on-the-ground, visiting all the establishments in the city on a regular basis -- much more so than your corporate desk jockeys or even service workers. Maybe your plumbers and electricians make it out to those areas too, though.
I've heard New Yorkers blame this, at least partially, on the lack of green-field sites. Boring tunnels for the subway, or really anything, has to worry about hitting undocumented, unmapped utility lines that have been in use for a century.
But also on unions (the risk of a subway strike prevents real automation improvements for efficiency and safety) and your standard bureaucratic bloat. Political pressure could at least fix this side of things, and maybe even "whoops, we hit a gas line and will have to shut down heating for a ten-block area for two weeks".
Is there any formal requirement that "asylum" implies anything other than safe(r) living arrangements? I'm not aware of any treaty or international law that would require free travel permissions or a right to work within the granting nation. As far as I can tell, dumping refugees into camps of some sort (hopefully hospitable ones) is pretty common in other parts of the world.
"I'm being persecuted by my government" can be fixed with, in theory, three hots (meals) and a cot in the Nevada desert. Presumably the current strategy of work permits, free travel rights, and housing assistance was at some point deemed easier, cheaper, or nicer, and that's why we do it. But I don't see why we're bound to it beyond the usual process for changing actions of Congress or the Executive.
On the other hand, I know people who arrived in the US as refugees as young kids (from Iran in the late '70s and the Balkans in the '90s, for example) and have gone on to do great things for the country. I'm not opposed to the program on principle, even if I question its lack of guardrails as of late.
There has been a bit of political pressure against rigid "expiration" dates on groceries, out of concern that manufacturers are overly conservative on the dates, leading to excess waste. On one hand, there is some wiggle room on many foods that become less desirable but remain edible (brown spots on my bananas! Stale bread!), and manufacturers are assumed to gain from incentivising consumer waste (buying a new loaf of bread to replace the stale one). Using "best by" (I often also see "sell by") is seen as better expressing peak freshness without implying the food is unusable.
I'm sure it's even more complicated for things that need refrigeration. I've had milk in the fridge curdle before the date on the package, but sometimes be fine a week after. Ultimately "is it still good" at the consumer level is best answered with a Mark I Human Nose (and eyes, and taste), although the date makes sense if you're managing sealed packages in bulk in the supply chain.
If you're referring to a cable modem (possibly integrated into a single device with your router), the protocol to complain about is called DOCSIS. I'm not an expert on that one, but searching suggests connection times on the order of minutes regardless of connection speed.
This is, I think, the answer I was looking for. Ctrl-F doesn't find everything (I've had to search non-indexed dead-tree books before), but it's a huge force multiplier.