This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
Sure, but the inputs are growing rapidly. There's still plenty of room at the bottom; the fundamental limits of computing are very generous. All our chips are still basically 2D!

Maybe our current machines can only produce a few nice-to-haves like this. But the next generation will produce more and better. Parameters get cheaper as transistors get smaller, architectures get better, and algorithms improve. The amount of money we put in continually grows, and our training methods improve as well. We're already starting to reap interest on the 'architecture improvement' front. Compound interest starts slowly, but it gets powerful very quickly.
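The compound-interest point above is easy to see with hypothetical numbers. A minimal sketch (the 30% per-cycle improvement rate is purely illustrative, not a claim about actual AI progress):

```python
# Illustrative only: a small compounding improvement rate per
# generation starts slow but snowballs over many cycles.
def capability_after(cycles: int, rate: float = 0.3, start: float = 1.0) -> float:
    """Relative capability if each cycle improves on the last by `rate`."""
    return start * (1 + rate) ** cycles

# Early gains look modest...
print(capability_after(3))   # roughly 2.2x after 3 cycles
# ...but the same rate compounds into something much larger.
print(capability_after(20))  # roughly 190x after 20 cycles
```

The interesting (and contested) question is whether the effective `rate` itself rises as the tools improve, which is what the rest of the exchange is about.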
The human brain shows you can do a hell of a lot with 20 watts, at tens of hertz, on a shoestring materials budget, fitting the whole thing through a woman's hips! We have every element on the periodic table, endless lasers, acids, and refinement techniques; we have gigawatts and gigahertz, and thousands of cubic meters to spend. Our methods are incredibly primitive compared to what's already proven possible; there's so much low-hanging fruit we've yet to find.
The question is not whether current technology will help you make better technology, or whether AGI is theoretically possible. The question is how quickly change happens, and to what extent advances make future advances faster: you have better tools, but the problem has also become harder. So far, it seems to me the latter effect is winning out. GPT-4 can (allegedly) write working code, use documentation, fix bugs, etc. But is it good enough to make writing GPT-5 substantially easier or faster than writing GPT-4 was?
Well, I doubt 'Open'AI would tell us; they like keeping things secret nowadays. Nevertheless, existing demonstrated capabilities seem to be accelerating progress. I'm not a subject-matter technical expert, but it seems this is happening: https://www.hpcwire.com/2022/04/18/nvidia-rd-chief-on-how-ai-is-improving-chip-design/
I can't judge how significant this is because I'm not an expert. But my intuition is that compound interest balloons outward, and there's plenty of physics/computing headroom for it to expand into. This is a fundamentally new kind of compound interest, different from whatever input scaling we were already doing to keep up with Moore's law. In addition to quantitatively increasing the wealth and human intellect going in, we get some qualitatively superior (albeit specialized) inhuman intellect too.