Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?
This is your opportunity to ask questions. No question too simple or too silly.
Culture war topics are accepted, and proposals for a better intro post are appreciated.
Notes -
Ten years ago, a brand-new processor would have been Haswell- or Broadwell-era, and while you could get machines that held 32GB of RAM, the common H81 chipset only supported up to 16GB. Going to 32GB would not have been standard, and it'd probably cost you upwards of 250 USD in RAM alone.
But more centrally, VSCode's linter and IntelliSense implementation is perfectly fine for mid-sized projects without a boatload of dependencies, in certain languages. Get outside of those bounds and its RAM usage can skyrocket. Python tends to get hit hard (as does Java, to be fair) because of popular libraries with massive and somewhat circular dependency graphs, but I've seen large C++ projects go absolutely tango uniform, with RAM usage upwards of 10GB.
Yes, it is usually an extension problem, but given that you'll end up needing to install a few extensions for almost every language you work with just to get projects compiling (never mind debugging!), and that it's often even Microsoft-provided extensions (both vscode-cpptools and vscode-python have bitten me, personally), that doesn't actually help a lot. Yes, you can solve it by finding the offending extension and disabling it, and sometimes there are even alternative extensions for the same task that do work.
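If you want to see the damage before playing whack-a-mole in the extensions panel, something like this rough sketch works. (Hedged: matching on `extensionHost` / `Code Helper` is an assumption from my own machines; process and command-line names vary by platform and VSCode version.)

```python
# Rough sketch: list the fattest VSCode helper processes by resident memory.
# Requires psutil (pip install psutil). Matching on "extensionHost" /
# "Code Helper" is an assumption -- names differ by OS and VSCode build.
import psutil

candidates = []
for p in psutil.process_iter(['pid', 'name', 'cmdline', 'memory_info']):
    try:
        cmd = ' '.join(p.info['cmdline'] or [])
        name = p.info['name'] or ''
        if 'extensionHost' in cmd or 'Code Helper' in name:
            candidates.append((p.info['memory_info'].rss, p.info['pid'], cmd[:80]))
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        continue

# Print the five biggest offenders, largest first.
for rss, pid, cmd in sorted(candidates, reverse=True)[:5]:
    print(f"{rss / 2**30:5.2f} GiB  pid={pid}  {cmd}")
```

VSCode's own "Developer: Show Running Extensions" command gives a similar view from inside the editor, and once you've identified a culprit, relaunching with `code --disable-extension <extension-id>` lets you confirm it without uninstalling anything.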
The normal case isn't much worse, and is sometimes better, than alternatives like IntelliJ/PyCharm. But the worst cases are atrocious, and they're not just hitting some rando on a GitHub issue with a weird outlier use case.
My PC built in 2016 with Skylake (2015) had 64GB of RAM. The one I assembled in 2010 had 32. And with developer salaries being what they are, it was always affordable, even in Eastern Europe.
32GB was possible on Sandy Bridge processors (technically 2011), but mid-range Westmere and Nehalem processors only supported 16GB(ish) for most of the consumer market, and even the high-end Bloomfield capped at 24GB. I'm not saying you didn't do it -- I've got a couple Xeon systems from that era floating around that could have -- but it was absolutely not a standard use case.
A more normal midrange system would be closer to 4GB, with 8GB as the splurge. You'd probably end up spending over 400 USD in RAM alone, plus needing to spec up your motherboard to support it (thanks, Intel, for the fucky memory-controller decision).
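Back-of-envelope check on both cost figures, with the caveat that these per-GB prices are from memory rather than a sourced price history (roughly 8 USD/GB for DDR3 circa 2015, 25 USD/GB circa 2010):

```python
# Rough sanity check on the RAM-cost claims above. Both per-GB prices
# are assumptions from memory, not sourced data.
for year, usd_per_gb, gb in ((2015, 8, 32), (2010, 25, 16)):
    print(f"{year}: {gb} GB of DDR3 ~= ${gb * usd_per_gb}")
```

That lands near 250 USD for 32GB in 2015 and right around 400 USD for just 16GB in 2010, which is why I call it non-standard.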
So it probably was Sandy Bridge. It certainly wasn't a Xeon. Too many years. I remember having a Core 2 Duo in 2006 or '07 with 8GB; I remember that the PC I built in 2016 had 64 (which I still haven't changed, since the performance growth in everything but GPUs has been pathetic); and I remember that it replaced a PC with 32, so that one was probably early 2011. It's also possible I built one in 2010 with 16 and then one in 2012 with 32.
Anyway, RAM was peanuts compared to the payroll for developers, so it didn't make any sense not to pump their workstations.
Wth, my 2018 PC only had 16 until I recently upgraded to 32. I think you were in the top fraction of a percent of users.