Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?
This is your opportunity to ask questions. No question too simple or too silly.
Culture war topics are accepted, and proposals for a better intro post are appreciated.
Should I wait for an Nvidia 5090, or buy (and hide!) a 4090 now before a GPU-eating AGI is summoned, or, more likely, NGOs/associated state actors/alignment (((researchers))) make purchasing impossible? Currently on a 1060 6GB.
Buy a 3080. If you don't have enough money, save; anything slower is an objectively terrible deal. The 4070 will likely be a 3080 equivalent (just like the 3070 was a 2080 Ti equivalent), but it's also going to cost as much as used 3080s currently do anyway, so it's up to you whether you want to wait.
The problem with the 3070 is that it's 80% of the price for 60% of the performance; the same is true of the 3060 relative to the 3070, and of the 3050 relative to the 3060. (This is the root of why tech bloggers complain about every new GPU launch: none of these cards is the "3080, but for 400 USD" that everyone actually wants but can't buy.)
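To make that cascade concrete, here's a quick back-of-envelope sketch; the 700 USD base price is a made-up placeholder, and the 80%-price/60%-performance ratio is just the one above applied at every step down the stack:

```python
# Rough perf-per-dollar check. The base price is illustrative, not market data;
# each step down the stack keeps 80% of the price but only 60% of the performance.
price, perf = 700.0, 1.0
for card in ("3080", "3070", "3060", "3050"):
    print(f"{card}: ${price:.0f}, {perf:.2f}x perf, "
          f"{perf / price * 100:.3f} perf per $100")
    price, perf = price * 0.8, perf * 0.6
```

The perf-per-dollar number falls at every tier, which is the whole complaint.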
ML workloads run just fine on it; sure, the 4090 is twice as fast, but going from 1 iteration per second to 10 is a much larger leap than going from 10 to 20, and even the 4090 still can't run Facebook's LLM anyway, so there's no real loss there.
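The diminishing-returns point in concrete terms, using the same 1/10/20 iterations-per-second numbers from the sentence above:

```python
# Wall-clock time saved per iteration when the iteration rate jumps.
def seconds_per_iter(iters_per_second: float) -> float:
    return 1.0 / iters_per_second

for old, new in [(1, 10), (10, 20)]:
    saved = seconds_per_iter(old) - seconds_per_iter(new)
    print(f"{old} -> {new} it/s saves {saved:.3f} s per iteration")
# 1 -> 10 it/s saves 0.900 s per iteration; 10 -> 20 saves only 0.050 s.
```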
I can't guess whether Nvidia has another doubling of performance in the pipeline for the 5000 series, but if they do, it's probably going to cost roughly 1.5x what the 4000 series did, because otherwise they'd likely end up cannibalizing their Tesla sales. Either way, it's probably not going to completely tank the resale value of that 3080, which should remain a perfectly serviceable card far into the next console refresh cycle. (That's the other half of the reason these cards are so expensive: the 8800 GTX of 2006 was twice as expensive as the 7 series, but it also had twice the performance and wouldn't be meaningfully topped for a very long time, so it was absolutely worth the initial asking price.)
OP can also consider AMD GPUs if he won't be using CUDA. Depending on his market, he might be able to get a 6000-series card cheaper than the Nvidia counterpart, especially used.
Okay, we don't ban Joo-posting per se (as a browse through recent threads will clearly show), but using the triple parentheses like that to wave vaguely at (((Jews))) when making some sort of partisan culture war point is a very clear example of Not Speaking Plainly.
If you want to say you suspect Jews of nefariousness, you actually have to spell out that it's Jews, not weasel around with ((())), and then you have to back that up just like any other boo outgroup assertion.
What are you going to do with it?
If the answer is "play games", get a 3070 or something similar. That'll bring you up to modern standards just fine.
If it's for running your own instance of GPT-whatever, I guess there's a case for waiting, but I doubt it'd make a big difference.
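For what it's worth, here's a minimal sketch of what "running your own instance" actually looks like, assuming PyTorch and Hugging Face transformers are installed; "gpt2" is just a stand-in checkpoint, and the 2-bytes-per-parameter figure is a rough fp16 rule of thumb, not a guarantee:

```python
# Minimal local-inference sketch; swap "gpt2" for whatever checkpoint you
# actually want to run. VRAM, not raw speed, is usually the binding constraint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

name = "gpt2"  # placeholder model name
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForCausalLM.from_pretrained(name, torch_dtype=torch.float16).to("cuda")

# fp16 weights take ~2 bytes per parameter, before activations and KV cache,
# which is why a 6 GB card rules out the big models long before speed matters.
params = sum(p.numel() for p in model.parameters())
print(f"{params / 1e9:.2f}B params ~= {params * 2 / 1e9:.1f} GB of weights in fp16")

inputs = tokenizer("The best GPU to buy right now is", return_tensors="pt").to("cuda")
out = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```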
If you're doing this as some sort of weird hedge against a Basilisk, uh, I don't think there's much point. Either it doesn't happen, and you've wasted hundreds of dollars on a chunk of math rocks, or it does, and you will have bigger problems.
While you're at it, speak plainly instead of waving in the general direction of spooky groups you don't like.
These "AGI ruins the world in the next five years" posts are so tired.
Anyway, flagship Nvidia GPUs tend to have 40-50% more compute than their previous-generation counterparts. Realistically, if you're not going to be doing some mondo gaming, video editing, or deep-learning training/inference, you don't even need a 4090.
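A quick compounding check on that 40-50% figure, for the "wait for the 5090" question; the uplift range is just the rough one quoted above, not a measured benchmark:

```python
# Skipping a generation roughly squares the per-generation uplift.
for uplift in (1.4, 1.5):
    print(f"one generation: {uplift:.2f}x, two generations: {uplift ** 2:.2f}x")
# 1.4-1.5x per generation compounds to roughly 2.0-2.25x over two generations.
```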
In simple words, don't buy a 1500 USD GPU exclusively over AGI worries (buy it for any other reason but that!). NVIDIA actually did you a solid by clearly demarcating their consumer line of GPUs from the server line (the server cards don't even have video output). If institutions do limit GPU ownership in the near future (which would be stupid for many reasons), the consumer line might not really get caught up in it, given that the US's historical bans on selling A100s and H100s to China targeted the server parts. Your GTX GPUs are probably safe. And Nvidia is not stupid: if such regulations do pass, they can just keep a few server GPUs in their lineup plus some new "gaming" GPUs that happen to have a lot of tensor cores, and I don't think the bureaucracy would be able to keep up with that.