
Culture War Roundup for the week of November 11, 2024

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


I think immortality is achievable and worth working on, but Kurzweil seems to have some unrealistic beliefs about our current progress. In one of his books, he made the tenuous argument that technology always accelerates simply because technology is used to develop technology, so as technology improves, technological progress accelerates. That's the entire basis for his extrapolating all kinds of progress curves in ways that aren't supported by anything else (other than a few empirical examples like Moore's law, which has slowed down and must physically end soon). He seems to be projecting his beliefs about current progress on longevity onto these curves, despite evidence that the actual rate of progress is slower than the curves imply.

What he is not doing is hoping for a discrete jump in the rate of technological progress due to an artificial intelligence breakthrough. He is saying that when we reach longevity escape velocity, it will be because the trend we are currently seeing has continued to that point. I think an honest assessment would say that the current trend is not good and that something needs to change for us to reach that goal.
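For anyone unfamiliar with the term, "longevity escape velocity" is the point where remaining life expectancy grows by at least one year per calendar year. Here's a minimal sketch of that condition; the gain-per-year figures are made-up assumptions for illustration, not anyone's actual estimates:

```python
# Minimal illustration of "longevity escape velocity": the point where remaining
# life expectancy grows by at least one year per calendar year.
# The gain-per-year figures below are made-up assumptions for illustration.

def remaining_life_expectancy(start_remaining: float, annual_gain: float, horizon: int) -> float:
    """Remaining life expectancy after `horizon` years with a constant medical gain per year."""
    remaining = start_remaining
    for _ in range(horizon):
        remaining = remaining - 1 + annual_gain  # a year passes, medicine adds `annual_gain` back
    return remaining

print(remaining_life_expectancy(30, 0.25, 10))  # slow progress: 22.5, still shrinking
print(remaining_life_expectancy(30, 1.5, 10))   # past escape velocity: 35.0, growing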

Moore's law has slowed down, but his point is about a general trend of compute per dollar improving, not the specific trend of transistor miniaturization. He traces a similar pattern of acceleration back to the dawn of life on Earth: long epochs of tiny creatures, followed by larger and more complex life, and then bang! It's the Anthropocene, goodbye to all the land-mammal biomass that isn't us or ours. Even before transistors there was acceleration in compute capacity through electromechanical computing. Presumably the acceleration will continue into photonics or some other method, perhaps after a delay or with a sudden jump. You could argue that it's still accelerating: if you include the software and architectural improvements in the newest GPUs, their effective compute/$ for AI tasks is rising faster than before.
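To make the compounding claim concrete, here's a minimal sketch of what a constant doubling time in compute per dollar implies; the baseline and the 2.5-year doubling period are illustrative assumptions, not Kurzweil's figures or any benchmark:

```python
# Toy extrapolation of compute per dollar under a constant doubling time.
# Baseline and doubling period are illustrative assumptions, not real data.

def compute_per_dollar(years_out: float, baseline: float = 1.0,
                       doubling_years: float = 2.5) -> float:
    """Relative compute/$ after `years_out` years of steady exponential growth."""
    return baseline * 2 ** (years_out / doubling_years)

for years in (0, 5, 10, 20):
    print(f"{years:>2} years out: {compute_per_dollar(years):8.1f}x today's compute/$")
```

The whole argument stands or falls on whether that doubling time actually holds; if progress slows, the curve flattens rather than continuing to explode.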

I think his AI predictions have turned out quite well: his original prediction was AGI by 2029, which looks conservative if anything; many today give a date of 2027, assuming all goes according to schedule. Singularity by 2045 is even more conservative. He was saying this back in the early 2000s, so his reading of trend lines clearly has some merit.

His health practices, however, will probably not stand the test of time.

a general trend of compute per dollar improving

That stopped happening once Intel stopped being competitive. Compute per dollar has flatlined (unless you're Apple and are effectively buying TSMC its machines); that's why new AMD CPUs are twice the price of the old ones despite not being twice as fast, and Nvidia's products in particular have the same or worse price/performance ratios than they did five years ago.

This seems trivially not true.
https://ourworldindata.org/grapher/gpu-price-performance

I do agree that no amount of better hardware performance can overcome poorly-written software.

If Nvidia's products have the same or worse price/performance ratio as five years ago, then why are they the biggest company in the world today when they were a minnow five years ago? Shouldn't it be the other way around?

For some tasks, there's no difference. My favourite game, Civ IV, can run on 20-year-old hardware. It runs a little faster on a modern CPU, but that's about it.

The 4090 is not really for gaming; it's for mucking around with advanced image generation, AI and training consumer-level LoRAs. For some things the 4090 really is the cheapest way to run them: there is zero performance per dollar for anything below 24 GB of VRAM, just as there is zero performance per dollar from a GeForce 3 for most modern tasks. It doesn't even run modern OSes; you'd be better off with whatever integrated graphics comes with your CPU.

NVIDIA boasts a 25x energy-efficiency gain over the previous generation for its flagship AI processors. OK, that's advertising: round it down to 10x or even 5x, and that's still a huge improvement.

https://www.nvidia.com/en-au/data-center/gb200-nvl72/?ncid=no-ncid

Off topic, but what's the best intro to Civ4 for someone who can't even parse the map? Aside from just playing the game solo, which is obviously the best way.

Sulla's tutorial got me started with Civ IV; I recommend it in the strongest possible terms.

There are beginner playthroughs on YouTube: https://youtube.com/watch?v=CgBnpbaQFo4 or https://youtube.com/watch?v=_f-pwq6cKwk?list=PLs3acGYgI1-vw-A3LHOb_BDQxKNtv1tze

There's a text guide here (this would be the best IMO for getting started, in terms of efficient reading): https://forums.civfanatics.com/threads/beginner-help-the-basics.648469/

There's a slightly more advanced tactic/strategy guide here: https://forums.civfanatics.com/threads/sisiutils-strategy-guide-for-beginners.165632/

The game manual is here: https://forums.civfanatics.com/resources/civ-4-manual.12753/

The map is pretty straightforward. It's all about getting three resources: food, commerce and production. There's a little button you can press to show per-tile yields, and another that highlights special resources.

You get the most value from founding cities near food resources, so they can quickly grow and get pops working other tiles: hills, mineral resources and forests for production, or luxury resources, coast and rivers for commerce. Commerce becomes wealth, culture, espionage and, most of all, research; you control exactly where it goes with sliders.
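If it helps to see the bookkeeping laid out, here's a toy sketch of how worked tiles and the commerce sliders combine; the yield numbers and slider settings are made up for illustration, not taken from the game files:

```python
# Toy model of Civ IV city yields: worked tiles produce food/production/commerce,
# and sliders split raw commerce into research, culture, espionage and gold.
# All numbers here are illustrative, not actual game data.

tiles_worked = [
    {"name": "grassland farm", "food": 3, "production": 0, "commerce": 0},
    {"name": "hill mine",      "food": 0, "production": 3, "commerce": 0},
    {"name": "river cottage",  "food": 2, "production": 0, "commerce": 3},
]

totals = {k: sum(t[k] for t in tiles_worked) for k in ("food", "production", "commerce")}

sliders = {"research": 0.7, "culture": 0.1, "espionage": 0.0, "gold": 0.2}  # must sum to 1.0
outputs = {k: round(totals["commerce"] * share, 1) for k, share in sliders.items()}

print(totals)   # e.g. {'food': 5, 'production': 3, 'commerce': 3}
print(outputs)  # where that commerce actually goes
```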

Shouldn't it be the other way around?

This isn't saying absolute performance hasn't increased: it has.

What I'm saying is that a 4090 performs roughly twice as well as a 3090, but at roughly twice the price. That's the same price/performance ratio; it's just that the right tail of the graph keeps growing.

https://www.pcworld.com/article/1364477/nvidia-rtx-4090-vs-nvidia-rtx-3090.html

The GeForce RTX 4090’s $1,599 MSRP is significantly less than the $1,999 whopper of a price that the RTX 3090 Ti launched with. It’s also $100 more than the original RTX 3090’s debut $1499 price. Good news, however – the RTX 3090 Ti has dropped to a much lower $1099 for the Founders Edition, and sometimes can be found for less. The 3090 can often be found for under $900, and even closer to $700 if you’re OK with a used graphics card.

The 3090's price fell precisely because of the 4090, through market forces. Today, the 4090 seems to be back up to around $2,000 due to the AI boom and sanctions/sanctions-busting. But anyone would rather have a 4090 for $2,000 than a 3090 Ti for $2,000. In theory, you could get a 4090 for $1,600 versus a 3090 Ti for $2,000, which is a very good deal. Progress continues.
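To put rough numbers on the ratio argument, here's a back-of-the-envelope comparison using the prices quoted above; the relative-performance figures are ballpark assumptions for illustration, not benchmarks:

```python
# Rough price/performance comparison using launch MSRPs and street prices
# mentioned above; relative-performance figures are ballpark assumptions.
cards = {
    "RTX 3090 (used, ~$800)":      (1.0, 800),
    "RTX 3090 Ti ($1,999 launch)": (1.1, 1999),
    "RTX 4090 ($1,599 launch)":    (2.0, 1599),
    "RTX 4090 (AI-boom ~$2,000)":  (2.0, 2000),
}

for name, (relative_perf, price_usd) in cards.items():
    print(f"{name:<30} {1000 * relative_perf / price_usd:5.2f} perf per $1,000")
```

On those made-up numbers, a launch-price 4090 beats a launch-price 3090 Ti handily, while the comparison against a heavily discounted used 3090 is much closer, which is roughly the shape of the disagreement in this sub-thread.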

When the 5000 series emerges, the 4090 will fall to the $1000-1500 range too.

Secular falls in GPU prices (and improvements in price/performance) are being suppressed by high demand, but they're still observable.