birb_cromble
I don't know if I'd call them good, per se, but two horror movies that I've recently seen that very earnestly tried to do something new were "Glorious" and "Anything for Jackson".
In the former, a man gets trapped in a highway rest stop with an eldritch monstrosity, and if he doesn't satisfy the monster's physical needs through a glory hole, the world will end. The monster is voiced by JK Simmons, who is having the time of his life.
In the latter, a pair of grieving grandparents draw on the power of Satan to perform a "reverse exorcism" on a pregnant woman to try and bring back their dead grandson.
I have permanent damage to one knee, one shoulder, one hip, and my neck from past injuries. I also have some pretty wicked arthritis to top it off. While I can drag a deer carcass multiple miles through the snow, it hurts more now than it did when I was in my 20s.
Are there any turkey hunters here? I'm getting old and beat up enough that deer hunting is losing its shine, but I really like having a freezer full of game.
How is turkey hunting? It always feels like it's a sea of trigger happy nut jobs out in the woods, but I've never actually done it.
His biggest concern is that he has another son that he had much later in life. He's afraid he won't see him graduate from high school.
Interesting. How complex are these contracts that they need that many lawyers to handle them?
If they currently have 30 lawyers working for them and 3,000 total employees, that's one lawyer for every 100 employees. That's, to put it mildly, an insane ratio.
I think a lot of this comes down to the fact that nobody really has any idea where risk is going to be priced into these business models.
All the AI companies are trying to push it onto the end user to the maximum extent possible - they'd like to keep humans around to function as accountability sinks and not much else. That works great if you accept that the "agent" isn't intelligent and has no agency.
The thing is, if you're claiming that your models are so self-aware that they deserve their own retirement plan, sooner or later somebody's going to believe it and claim that either the AI or the corporation has some form of liability for it. That's some incredibly novel legal ground, and I wouldn't doubt that a fairly large number of those lawyers are wargaming defenses to that right now.
According to Microsoft and OpenAI, AGI happens when OpenAI earns $100 billion in revenue.
can anyone explain NFTs to me in a way that makes sense?
The best use case I've seen is for transferable, non-physical assets that provide value in the real world.
A concrete example would be a lifetime ticket to a band's shows.
Son of a bitch. I hadn't even considered that.
The entire Oracle trajectory makes sense to me. Oracle was very profitable for a long time because they were a legal extortion company that grudgingly shipped a database. Once MS SQL Server and PostgreSQL got good enough to compete, and once new companies wised up enough to avoid Oracle in the first place, the writing was on the wall. If they didn't diversify they'd die. This maps to the increasingly wild Hail Mary throws they've been making. Each attempt that failed to materialize hyper growth made the next attempt even more important. After their cloud offering ended up a distant fourth place and NFTs didn't pan out, what was left? AI, obviously.
Completely ignoring the potential of the technology, Oracle really has no place in the market. They don't have a foundation model: they aren't even trying to make one. At best, they seem to be aiming for some kind of position as a utility company for GPGPU compute. Less charitably, they seem to be selling their credibility by laundering dodgy debt through their corporate credit rating.
I'd assumed that everyone involved in this was as greasy as Ellison, and much like most of the tech industry, I have learned not to make the mistake of anthropomorphizing Larry Ellison.
Doing some more reading, it looks like Anthropic employs a disproportionate number of rationalists and effective altruists. Even if you think those two philosophies have some good parts, they definitely have some peculiar failure modes, and some are worse than others. This is the same philosophy that Anthropic's chief philosopher holds.
I'm not a fan of this at all. I grew up around true believers. I even held the snakes. They're scarier than a con man, because at least the con man has predictable goals.
In short, thank you for bringing this to my attention, and fuck you for putting this evil in my head.
https://finance.yahoo.com/news/margins-foot-traffic-down-target-184105183.html
My understanding is that they expanded too much, too fast, and now they have a lot of stores that aren't producing.
Is Target a good comparison, given that they're bleeding foot traffic like a gut-shot deer?
I'm a lot more optimistic about Anthropic's business plan than OpenAI's. They recognized that getting their hooks into enterprise users from the get go is a better strategy than having hundreds of millions of users who don't pay.
Do the tech companies even really have that high of a P/E ratio anymore? Microsoft's doesn't really stand out. Amazon's seems a little high, but not absurdly so.
Are you referring to OpenAI's plans for massively increased expenditures in the future?
I am.
shipping
I wonder what that's going to look like in a week. It seems like something should happen.
I've heard good things about it, but the synopses I've seen feel like the target market is atheists who like dungeons and dragons a lot. I only really meet one of those criteria, so I'm a little wary. Is that too uncharitable of a description?
Software giant Oracle corporation is laying off thousands of workers and killing their Texas data center plans, per Reuters and Bloomberg. It appears that their capital expenditures have gotten ahead of their ability to pay for them and now they face the regrettable need to say it out loud shortly before markets close on a Friday afternoon.
In December, the company said it expects capital expenditures for fiscal 2026 to be $15 billion higher than the $35 billion figure the company estimated during its first-quarter earnings call.
The layoffs will impact divisions across Oracle and may be implemented as soon as this month, the Bloomberg report said, citing people familiar with the matter. Some cuts will be aimed at job categories that the company expects will shrink due to AI.
This may be indirectly tied to the Iran conflict, as Middle East sovereign wealth funds have begun pulling back from investment.
I'm interested to see the fallout of this one. My understanding is that the Ellison clan is fairly tight with the Trump admin.
Beyond that, I have concerns that this may be the match that lit the fuse on AI spending. I have spent the last six months trying to figure out why these valuations made any sense whatsoever. The expense profile of companies like Anthropic and OpenAI looked a lot more like Caterpillar to me than Salesforce. When it came to Oracle, I couldn't make sense of it at all.
The only three explanations I had were that I was:
- Missing critical information
- Retarded
- Right
I still don't know which one it is.
Some of you here are clearly smarter and more educated than me. What do you think I'm missing here? My gut prediction is that this spirals into an even bigger flight from capital in the next six months, which causes holy hell on the retail market because the average investor is more leveraged now than they have been at any point in my lifetime. I'm also assuming it'll kill quite a lot of "LLM Wrapper" companies, like the one run by fear porn expert Matt Shumer.
I assume Google will be OK.
Beyond that, I don't have any idea.
Any predictions?
What are the actual consequences for the world?
Fish stocks collapse even faster than they already are
But Children of Time is one of my favorite modern SF books, so we're gonna fight.
Have you noticed that Tchaikovsky has kind of collapsed into telling a single story lately? I finished Shroud recently, and it seems like he's using the AND THEN SUDDENLY THERE WAS AN EMERGENT COLLECTIVE CONSCIOUSNESS trope more frequently.
I'll never turn down an excuse to recommend Roadside Picnic.
Also American. The people I know who meet that kind of millionaire next door criteria (eg: frugal, older) tend to not get into the eight figure range. It could be a regional thing: I'm in a pretty LCoL area. The people who I'm describing all ended up here because of the nearby college.
Maybe he was full of shit on that one, then. I'm not a billionaire, so it's hard to speak from experience.
I think he meant it more in the way that you can't swing for the fences on day one. You have to aim for millions before you get to billions.
In my own life experience, I think I have a lot more distaste for the "mere millionaires" than the billionaires. Maybe it's because I've met a bunch of people in the 10 - 50 million dollar net worth range and only one billionaire.
Nonetheless, every single person I know who is worth more than $10 million reached that position through inheritance or marriage.
They don't work, and seem to flutter through life thinking they'll become an Expert at Something, but usually get bored before they actually get anywhere. The most dedicated of them might get another degree, but it seems to stop there.
Beyond that, they're insulated from consequences in a way that warps their minds. Most of them think they're quite brilliant investors, for example. Unlike me, they're able to afford big, risky bets that I can't take unless I want to stake my entire future on it. They'll then crow about their enormous unrealized gains on their portfolio, failing to note that every single investment they've made other than Nvidia is down 10 - 50%. They could have literally made more money investing in a basic-bitch whole market fund. It doesn't matter though, because they've staked enough of it as collateral for tax-free loans that they're set for life.
They frequently don't even realize what is realistic or not. I was talking about burnout once with one of them, and he gave me his sure-fire stress solution: just take a year off and go to Europe. I didn't have the heart to tell him that even if I could afford to take a whole year off work and start the job hunt when I got back, I still didn't have a family chalet that I could use rent free during my visit.
The one billionaire I have met, however, was sharp as hell. He clearly knew both his industry and finance inside and out, and simultaneously had the self-awareness to realize that he was not normal. He knew he had a lot of help getting to the point where his wealth could really snowball, and he recognized that luck was involved. Someone asked if he had advice on becoming a billionaire. The answer was long, but one of the most important things he said was that you can't do it without taking risks. He also made it clear that you aren't going to even be able to take those risks unless you're a multi millionaire first. It was a very different conversation compared to people who fell into money.
My favorite thing about Dredd is that Stallone refused to wear the helmet. Urban refused to take it off.