
Friday Fun Thread for June 14, 2024

Be advised: this thread is not for serious in-depth discussion of weighty topics (we have a link for that), and it is not for anything Culture War related. This thread is for Fun. You got jokes? Share 'em. You got silly questions? Ask 'em.


I remember there being stuff like SETI@home and Folding@home that outsourced massive computation tasks to volunteers. Are there similar projects that promise to train LLMs without corporate supervision?

I recall this kind of distributed training being considered by Stable Diffusion hobbyists after its public release in 2022, but it was deemed impractical because every training step modifies the entire model at once, so doing the work piecemeal wasn't an option. I wonder if LLMs suffer from the same issue, since the underlying architecture is similar.
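To illustrate the "every update touches the whole model" problem, here's a toy sketch of my own (not anything from those SD discussions): a single SGD step on a small dense network generically changes every parameter in both layers, which is why volunteers can't each own a disjoint slice of the model and train it independently.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 8))   # layer 1 weights
W2 = rng.normal(size=(8, 1))   # layer 2 weights

x = rng.normal(size=(1, 4))    # one training example
y = np.array([[1.0]])

# Forward pass with a tanh hidden layer.
h = np.tanh(x @ W1)
pred = h @ W2
err = pred - y                 # d(loss)/d(pred) for 0.5 * squared error

# Backward pass: gradients flow to BOTH layers from one example.
gW2 = h.T @ err
gW1 = x.T @ ((err @ W2.T) * (1 - h**2))

lr = 0.1
W1_new = W1 - lr * gW1
W2_new = W2 - lr * gW2

# In the generic case every single entry of both weight matrices moves,
# so a worker holding only "its" chunk of weights can't apply the update.
all_moved = bool(np.all(W1_new != W1) and np.all(W2_new != W2))
```

So the cheap way to distribute training is data parallelism (every worker holds a full model copy and gradients get averaged), which is exactly what's painful over volunteer-grade internet links.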

I wonder if merging sibling models is feasible, if not for the current generation, then for some future one.

I wonder if that's been squeezed out by crypto mining. Or the perception that anyone who wants your compute is going to be crypto mining. Or just by the proliferation of GPUs.

If I weren't such a brainlet, I'd be looking into the possibility of using AI training as proof-of-work for crypto, like last year. But both of these subjects are way above my IQ level.