Yes indeed, that todo link should've been replaced with a link to a transcript of Emad's recent Interview.
I failed to find the transcript in my browser history, so I've relinked the video in its place.
Seeing as no one else has discussed this, I'll try to give a brief overview of the Drama that has taken place in the Stable Diffusion community over the last few days.
On 7th Oct, most of the source code, private models, and proprietary technology of NovelAI is leaked on /sdg/.
NovelAI's anime art generation stack was (is) significantly better than the open source tech available, so tooling is quickly made available to work with the leaked models. More specifically, the developer for the most popular offline Stable Diffusion tool, AUTOMATIC1111, immediately implements features to work with NovelAI's "Hypernetworks".
Within a day, AUTOMATIC1111 is accused of code theft and banned from the Stable Diffusion community, a decision explicitly backed by Emad. This causes a lot of Drama to happen,
- Because the stable-diffusion-webui project is extremely popular; no other open source tool comes close in functionality,
- Because it's ambiguous whether any code was truly stolen or not,
- Because NovelAI was discovered to have illegally copied open source code in their leaked repositories, an error that was admittedly quickly reverted by their devs,
- Because of the optics of the situation -- Stability AI backing a closed-source company over a popular open source figure?
The drama expands further when links to stable-diffusion-webui are scrubbed from /r/stablediffusion, causing the former moderators to reveal that the subreddit's moderation team had experienced a quiet takeover by representatives of Stability AI. It is additionally claimed that a similar process occurred for the SD Discord server.
- To their credit, Stability AI has relinquished control of the subreddit to the original mod team.
And as an extra bonus, the coomers using SD have gone on high alert after Emad remarked in an interview that the release of SD V1.5 would be delayed to "handle edge cases" & (de)bias the model against "topless women".
Insofar as I have a take on all of this, it's going to be some blend of "holy shit Emad please stop doing bad PR" and the Seven Zillion Witches problem. I find it horrific that the mutually agreeable agenda of "let's keep AI open and available" is doing its best to self-destruct via the usual processes of internet catfights and the amplification of minor differences within what ought to be a united ingroup.
If you're of what might be referred to as the "pro-HBD" persuasion around here, how would the world look different if there were not meaningful cognitive/behavioral differences between ethnic groups?
We would not exist. God would be real. Cartesianism would be accurate. At the most basic level, it is extremely difficult to put "HBD is fake" and "evolution and scientific materialism are real" into the same boxes of reality.
GDB:
- is not easy to learn
- is even less easy to learn if you are a part of the modern GUI/webapp/the-fuck-is-a-shell generation (so, the problem statement at hand)
- doesn't even scale to larger projects, so you can hardly say you'll use it in a real job
Compare it with, let's say, the chrome debug console. Or the vscode debugger for python. They're far more intuitive than `x/10g`, `info all-regs`, `b 0x1234`, `ni` ×100, etc.
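For anyone in that generation who hasn't touched GDB, here's a rough gloss of the commands just named, written out as a GDB command script. The long-form names and the arguments (`$sp`, `*0x1234`, the repeat count) are placeholders I've filled in for illustration, not anything from a real session:

```
# examine 10 "giant" (8-byte) words at the stack pointer, printed as hex
x/10gx $sp
# dump every register, including floating-point and vector state
info all-registers
# set a breakpoint at a raw address
break *0x1234
# execute the next 100 machine instructions, stepping over function calls
nexti 100
```

Every one of those has a click-or-hover equivalent in the GUI debuggers above, which is exactly the point.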
Hi, I just want to leave a stub response: you seem right, and I failed to type up a proper response after reading this 2 days ago.
Roughly speaking, I see your point and agree that it's possible we're just climbing a step further up on an infinite ladder of "things to do with computers".
But I disagree that it's the most likely outcome, because:
- I think the continued expansion of the domain space for individual programmers can be partially attributed to Moore's Law. More Is Different; a JavaScript equivalent could've easily been developed in the 80s but simply wasn't, because there wasn't enough computational slack at the time for a sandboxed, garbage-collected, asynchronous scripting language to run complex enterprise graphical applications. Without the regular growth in computational power, I expect innovations to slow.
- Cognitive limits. Say a full stack developer gets to finish their work in 10% of the time. Okay, now what? Are they going to spin up a completely different project? Make a fuzzer, a GAN, a SAT solver, all for fun? The future ability of AI tools to spin up entire codebases on demand does not help with the human learning process of figuring out what actually needs to be done. And if someone makes a language model to fix that problem, then domain knowledge becomes irrelevant and everyone (and thus no one) becomes a programmer.
- I think, regardless of AI, that the industry is oversaturated and due for mass layoffs. There are currently weak trends pointing in this direction, but I wouldn't blame anyone for continuing to bet on its growth.
Is there something misleading with the way I phrased my comment? I don't understand why multiple people have succeeded in reading "programmers will be completely replaced by AI" into my words.
And this isn't a nitpicking thing. It is an extremely important distinction; I see this in the same way as the Pareto Principle. The AI labs are going to quickly churn out models good enough to cover 95% of the work the average software engineer does, and the programming community will reach a depressive state where everyone's viciously competing for that last 5% until true AGI arrives.
Your first paragraph misses how hard it is for human programmers to achieve those things, if it is even possible under current circumstances (find me a program that can acquire farmland & construct robots for it & harvest everything & prepare meals from raw materials). Even hiring an army of programmers (AI or no) would not satisfy the preconditions necessary for getting your own food supply, namely having an actual physical presence. You need to step beyond distributed human-level abilities into superhuman AI turf for that to happen.
> If it does then it will be smart enough to self-modify,
This does not work out the way you think it will. A p99-human tier parallelised unaligned coding AI will be able to do the work of any programmer, and will be able to take down most online infrastructure by virtue of its security expertise, but it won't be sufficient for a Skynet Uprising, because that AI still needs to solve for the "getting out of the digital box and building a robot army" part.
If the programming AI were a generalised intelligence, then of course we'd all be fucked immediately. But that's not how this works. What we have are massive language models that are pretty good at tackling any kind of request that involves text generation. Solve for forgetfulness in transformer models and you'll only need one dude to maintain that full stack app instead of 50.
Why are the majority of programmers so enthusiastic about machines that can code, but not artists?
Because they aren't. They're collectively deluding themselves into believing in the «soul» and that programming will never be automated by AI. Just like certain artists are.
I am a programmer. OpenAI scares me. I'm putting every effort I've got into the Grind, because I think the industry's due for a phenomenal crash that'll leave the majority in the dumps. You are free to disagree.
NO WAY, you were real?
Start a substack. Please. Perfection is the enemy of good, and you are really good.
Misogynist (in the feminist sense) would be more accurate. There is zero mention of anything related to getting laid.
Sizzle50's various posts on BLM were really great, but I think everyone here has discussed that to death.
Instead, I'll link SayingAndUnsaying's longpost on Hawaiian Racial Dynamics, which will be new & novel for a lot more readers.
https://astralcodexten.substack.com/p/open-thread-243
Certified hoax. Quokkas shocked.
That's actually really cool, wow.
Leave the rest of the internet at the door.
Or could you at least have something more substantial to talk about than, "redditors upvote dumb shit, news at 11"?
Premise #2: Within the unit of people we care about, we care about everyone equally.
I think this premise specifically is inherently anti-utilitarian. How can you assign the same utility to each individual when there's so much variance? When the actions and roles and beliefs and experiences of two people can differ so greatly?
> The comments on the youtube video seem to suggest that this song is actually serious and not a parody,
What comments are you reading? I saw mostly references to freedom truckers, lockdown protestors, and "G-d". Looks right-wing to me.
The main concern here is that we're headed for a future where all media and all human interaction is generated by AI simulations, which would be a hellish dystopia. We don't want things to just feel good - we want to know that there's another conscious entity on the other end of the line.
I can see this as a Future Problem, but right now the "conscious entity on the other end" is simply a prompt writer. There is a sense of community to be gained from indulging in and working on AI generation together. I think it is misleading to apply the bugman/we-will-be-in-pod argument to text-to-image tools, because new means of human interaction are forming as a result of them.
Also, some of us just hate the majority of conscious entities and are happier with what simulations we can get. This obviously doesn't apply to you or Vaush, but I wonder what brings you both to so viciously condemn the estranged, the alienated, the anti-social.
> I've yet to see anyone blow up their life with (legal) porn,
What kind of observation would qualify, to you? Does blowing your life savings on OnlyFans count? Missing a national exam because you fell asleep after a jack at dawn? Ending up arrested for molestation because you thought real life would be just as easy?
I don't even care about any of these, because they're edge cases. But when people are willing to condemn all kinds of behaviours except masturbation, I just don't get it.
Your post made sense to me, but I think that's a result of me agreeing with 90% of it. It might help if you broke up your stream of consciousness into proper paragraphs and subpoints.
Why do you care about what/who's fault it is? You have goals -- accomplish them or don't.
I don't see this as superstitious/magical. You are basically pressing the "purge all thoughts" button by spamming your brain with a single repeated concept.
(your first two links are the same)
Okaay I have no idea what's going on with the comment box. The link I have in there right now when I click the edit button is:
but it's getting rendered as