They occasionally do this, but psychiatrists don't like it because they see it as schools pawning off their disciplinary problems on doctors rather than solving them themselves. I can't find it, but the local news did a story a few years back after one of the school shootings about how some local districts had adopted zero tolerance policies and were sending any kids who exhibited violent tendencies to Western Psych at the drop of a hat. The doctors they interviewed basically said that the ED is there for people who have acute mental health crises and not kids who got into fights. So what was happening was the kids were waiting for hours at the bottom of the triage list and when the doctor concluded they didn't meet the criteria for admission they were sent home. But the school got to say they referred him to psych immediately and didn't take any chances.
The upshot of what the one doctor was saying was that long-term behavior problems are the kind of thing that needs to be dealt with over the course of months or even years, and that psychiatric hospitals aren't equipped for that. He said that if the schools were concerned they needed to hire their own mental health staff that could work with students and parents to resolve the problems. I can tell you right now that this isn't going to happen because the incentives are aligned against it. If a school hires its own counselors and starts its own program for troubled youth then it's going to cost a lot of money and if one of those kids ends up doing something terrible the program is going to be put under a microscope and probably won't come out looking good. If they say "we sent him to Western Psych after we saw the red flags" then their insurance will pay for it and Western Psych can explain to the media why the treatment didn't work.
Realistically, though, the doctors were right: Not all problems are mental health problems. If a guy keeps getting into fistfights at bars that don't cause any serious injury we don't send him to the nuthouse. It's a criminal matter. And realistically we don't even do that much in a situation like that; while misdemeanor battery has around a five year max in most jurisdictions, first offense you can likely plead down to disorderly conduct. After that you'll get a combination of fines, probation, and suspended jail sentences until you either get into a fight while on probation or the judge looks at the rap sheet and simply loses patience. The most you might get in the way of treatment is court-ordered anger management classes (I know three people who have completed these and they all say it works). I've never heard of anyone going to Western over a barfight unless there are obvious extenuating circumstances.
Helen Reddy had 19 chart hits in the 1970s that spent a cumulative 245 weeks on the Hot 100. Barry Manilow had 15 hits that spent 233 weeks on the charts. Pink Floyd had 2 chart hits in the 70s. As far as total weeks are concerned, that's not in my source since they're not in the top 100 of 70s artists, so I consulted AI and got contradictory results. DeepSeek confidently tells me that:
Pink Floyd had several songs chart on the Billboard Hot 100 during the 1970s, but their cumulative weeks on the chart were relatively modest compared to some other artists of the era. Here are the songs that charted and their respective weeks on the Hot 100:
- "Money" (1973) - 13 weeks
- "Us and Them" (1973) - 1 week
- "Time" (1973) - 1 week
- "Have a Cigar" (1975) - 4 weeks
- "Another Brick in the Wall (Part II)" (1979) - 23 weeks (though this song peaked in early 1980, it spent most of its chart run in 1979)
Adding these up:
- "Money": 13 weeks
- "Us and Them": 1 week
- "Time": 1 week
- "Have a Cigar": 4 weeks
- "Another Brick in the Wall (Part II)": 23 weeks
Total cumulative weeks on the Billboard Hot 100 in the 1970s: 42 weeks.
Note: The exact number may vary slightly depending on the specific chart data source, but this is a close approximation.
ChatGPT was more circumspect:
During the 1970s, Pink Floyd had limited presence on the Billboard Hot 100 singles chart. Their song "Money," released in 1973 from the album The Dark Side of the Moon, reached number 13 on the Billboard Hot 100. However, specific data regarding the total number of weeks "Money" remained on the Hot 100 during the 1970s is not readily available. It's worth noting that "Another Brick in the Wall, Part II" became a number-one hit, but it was released in late 1979 and achieved its peak chart position in 1980. Therefore, its chart duration primarily falls outside the 1970s. Overall, Pink Floyd's presence on the Billboard Hot 100 during the 1970s was minimal, with "Money" being their primary entry.
Gemini's response was similar to that of ChatGPT, except with the added caveat that "To get the precise number, you would need to consult detailed Billboard chart archives".
To be clear, DeepSeek was the worst of these in that it confidently gave an incorrect answer. The other two gave technically correct responses that nonetheless don't qualify as answers. Pink Floyd only had one chart hit in the 1970s, "Money", which spent 15 weeks on the chart between 5/19/1973 and 7/28/1973, peaking at No. 13. While "Another Brick in the Wall" was released in 1979, it wasn't released until November 30, and it didn't debut on the charts until January 19, 1980. This was not difficult information for me to find on my own, considering that Billboard publishes it on their own website. "Not readily available" my ass. The Gemini response pisses me off more, though, because Google has detailed Billboard chart archives in the magazine scans that are available on Google Books for all to see. Apparently Gemini's training data doesn't even include their own archives. For the record, "Us and Them" was released as a single in 1974 but didn't chart, "Time" was the B-side of "Us and Them", and "Have a Cigar" was released as a single but also did not chart.
Anyway, getting back to my original point, while Pink Floyd sold a lot of albums, their music just isn't the kind of immediately arresting, memorable thing that @coffee_enjoyer is describing. They didn't get played on the kind of AM Top 40 stations that most college undergraduates were listening to. (Yes, Pink Floyd was, and to a large degree still is, popular among college students, and "College Rock" has largely become a synonym for the kind of independent music that gets played on college radio. But this is the minority. Most college kids listen to Top 40 or other contemporary radio and aren't particularly tuned into progressive music.) Their current iconic status is based on people who bought albums they spent 40 minutes listening to, not catchy radio hits. They aren't particularly memorable, their music is just intriguing enough that it demands multiple listens.
Yes, but to my knowledge, John Paul Jones never wrestled, so that didn't seem relevant. The move from music to wrestling isn't unprecedented either, with Sid Vicious taking up the sport after getting kicked out of the Sex Pistols. As an aside when Sandra Bullock was married to Jesse James I didn't know who that was and just assumed she was married to Road Dogg Jesse James, who had a second act himself as tight end for the Steelers. Currently experiencing a second act is Cam Ward, who decided to try his hand at football with the Miami Hurricanes after a long career as goaltender for the Carolina Hurricanes.
By that logic we should value Barry Manilow and Helen Reddy over Pink Floyd, and Neil Diamond over the Rolling Stones.
What's even more amazing is that he went back to England and ended his career as a rock musician, playing bass and occasional keyboard for Led Zeppelin. This wasn't entirely without precedent, however, as Booker T. Washington had recently had his own career revival as a soul musician, having a series of hits with the Stax record label with backing band the MGs. Remarkable men both.
I don't think this would have the results you're looking for. Most people are more likely to remember pithy doggerel than the works of say, Alfred Lord Tennyson. The highest-value poems would invariably be dirty limericks.
Oh boy. As you may know, I'm an attorney, and before I proceed I want to give some general disclaimers. First, I'm not your attorney and none of what I'm about to say should be taken as legal advice. I don't know your exact situation or even the state in which you reside, so I'm not in a position to give specific advice. As for my qualifications, I had my own practice between November 2019 and May 2023 and I did estate planning and administration work, but not exclusively. This work was in Pennsylvania, which does not have ToD deeds. For a decade, including when I had my own practice, I did oil and gas title work. While this may not seem like it has much to do with estate planning, a large part of it was dealing with the consequences of poor estate planning and figuring out how to clean everything up. I did this in Pennsylvania, Ohio, and West Virginia. OH and WV have ToD deeds, but I didn't see them much, for reasons I will make clear below. I currently do litigation in PA and WV primarily and still do estate work very occasionally, but it's more of a side gig where someone will ask me about a will and I'll do it through my firm, or a coworker's friend will ask them and I'll get it because I'm one of the three people here who have done that kind of work. I don't mess around with anything that involves the Federal Estate Tax or the word "irrevocable", but I've been to plenty of seminars involving this kind of stuff so I have a decent working knowledge. Basically, I know enough about it to know that it's a liability minefield I don't want to get involved in.
With that out of the way, I'd generally recommend against ToD designations for real estate. Certain charlatans like Suze Orman try to convince everybody that probate is the worst thing in the world and is to be avoided at all costs, but that's not necessarily true. One of the most common questions I got when I did estate planning is a variation of the following, which I actually fielded a call about a couple of months ago: A woman's husband recently died. She has three children, including a 36-year-old unmarried son who lives with her. She already had a will that left the house to the son, but asked me about conveying the house to him now instead. Her intention was to avoid the 4.5% inheritance tax.
My answer was an immediate and unequivocal "no". Her plan would only work if her son stayed in the house and continued to live there indefinitely after her death, and he continued to be a responsible single guy in good health with no financial difficulties. The most obvious issue, though, was the tax issue. If the son continues to live in the house after she dies then it's not a problem. But the woman is only in her 60s; she could easily live another 20 or even 30 years. Suppose the son buys his own house in the meantime. When his mother dies, he's stuck owning a house that he doesn't live in. If he sells it, he pays capital gains tax with no primary-residence exclusion, since he no longer lives there. To make matters worse, a gifted house comes with the donor's carryover basis, so he's paying tax on decades of appreciation. On the other hand, if he inherits the house in that situation and wants to sell it, he can take advantage of the step-up in basis and only pay capital gains tax on the difference between the sale price and the market value at the time of death, which is likely to be close to zero. He'd still have to pay inheritance tax, but this is only 4.5% of the value as opposed to the 20%+ rate he'd be paying in capital gains tax.
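To put rough numbers on that comparison, here's a back-of-the-envelope sketch. Every figure is made up purely for illustration (the sale price, the mother's basis, and flat assumed tax rates), and it assumes a gifted house carries the donor's original basis while an inherited one gets a stepped-up basis. Obviously not tax advice.

```python
# Hypothetical round numbers -- illustration only, not tax advice.
sale_price = 300_000
carryover_basis = 60_000      # assumed: mother's original basis passes with a gift
stepped_up_basis = 300_000    # assumed: basis resets to market value at death

LTCG_RATE = 0.20              # assumed flat long-term capital gains rate
INHERITANCE_RATE = 0.045      # PA inheritance tax rate for transfers to children

# Route 1: lifetime gift, son later sells a house he no longer lives in
gift_route_tax = (sale_price - carryover_basis) * LTCG_RATE

# Route 2: inherit under the will, then sell at roughly date-of-death value
inherit_route_tax = (sale_price * INHERITANCE_RATE
                     + (sale_price - stepped_up_basis) * LTCG_RATE)

print(gift_route_tax)     # 48000.0
print(inherit_route_tax)  # 13500.0
```

With these assumptions, the "clever" gift costs more than three times as much in tax as simply inheriting the house and paying the 4.5%.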
Beyond tax considerations, though, this woman would risk dealing with what I call the five Ds:
- Death: If her son dies before she does, the beneficiaries of his estate will become the owners of the property. The woman probably assumes that she'll outlive her son and even if she doesn't, he'd leave his estate either to her or another family member, but he could leave it to anyone. It could end up in the hands of a charity or an ex-girlfriend who is disinclined to let this woman keep living there for free.
- Divorce: If her son were to get married, the house could be an asset subject to distribution in any subsequent divorce proceeding.
- Disability: If her son becomes disabled, owning a significant asset will affect his eligibility for SSDI, Medicaid, and various other benefits.
- Debt: If the son were to file for bankruptcy, the house would be an asset subject to distribution to creditors. Chapter 13 bankruptcy allows debtors to protect equity in a home by entering into a payment plan instead of liquidating the estate. The catch is that the payment plan has to raise at least as much money as the creditors would get in a Chapter 7. This realistically isn't an issue, since most people filing for bankruptcy don't own their houses free and clear; they've already mortgaged them to the hilt. If the house isn't complete shit it probably forces him into a 100% plan, which could be unfeasible depending on the amount of debt. Worst case scenario he's forced to sell the house to cover the debt. On the other hand, if he doesn't own the house it's likely a no-asset Chapter 7 or a straightforward Chapter 13.
- Dumb: People do dumb things all the time. He could mortgage the house to buy a boat and leave her vulnerable if he can't make the payments. He could neglect to pay property taxes. He could try to save a little money by not insuring the property. He could decide to rent out an extra bedroom to a hobo. Those examples are downright idiotic, but even well-intentioned gestures can fit this category. Say he wants to do some kitchen renovations. His mother thinks he's just paying for them, but in reality he took out a home equity loan. Six months later he loses his job and can't make the payments. Now she's looking at foreclosure as the result of actions she had no control over.
ToD deeds were created as an attempt to mitigate the effects of the five Ds. By creating a revocable future interest in the property instead of an irrevocable present interest, the beneficiary can't really do anything to affect the property while the grantor is still living. Sounds good, but this creates its own problems: by taking assets out of probate, any issues must be dealt with outside of the probate process. Probate isn't a boogeyman. It's a process specifically put in place to deal with these kinds of issues. Wills give you the flexibility to provide precise instructions regarding your intentions, and allow you to appoint an executor to ensure that those instructions are carried out. Probate courts provide a forum to resolve any issues that arise. Outside of probate court and its centralized process, you're out of luck. Here are just a few issues I can think of off the top of my head, using the above case as an example:
- Instead of conveying the house outright, the woman executes a ToD deed naming her son as the beneficiary. Several years later, the son becomes disabled and cannot work, and relies on government benefits. The mother then dies. The son now has an asset that cuts off his eligibility. Had the house been transferred by will, she could have included a provision creating a testamentary trust in the event that any named beneficiary was receiving benefits at the time of her death, and the trustee would have been able to ensure that the house remained property of the trust for the son's benefit and that he could continue living there while receiving benefits.
- The woman executes a ToD deed conveying the house to her son and two other children in equal proportion upon her death, at which time the house is worth $300,000. Two of the children want to sell the house and get their $100,000 shares. But the son, who is still living there, doesn't want to sell, and correctly claims that as part owner he has the right to the premises. He further refuses to buy out his sisters' interests. If the sisters want anything out of the deal, they'll have to file a partition action, which will cost five figures and could take years to resolve. They're also unlikely to get their full shares, since the son will be able to claim any mortgage payments, taxes, repairs, insurance, or other allowable expenses he put into the house over the course of his time living there. The house will be sold at auction, invariably resulting in a lower sale price than could be had if it were properly marketed. A will could expressly include buyout provisions (I usually included these if a child was living in the family home), expressly direct the executor to sell (though he could sell to the son), or give any number of other guidelines. Even in the absence of these, this is a dispute the probate court would be able to resolve before title ever transfers. It could get complicated, but nowhere near as complicated as a partition.
- The son gets married and has a child. The woman executes a ToD deed naming the son as beneficiary and the child as contingent beneficiary. The son predeceases the woman, and the woman then dies while the child is still a minor. The child's mother is still alive, but she now has to petition a court to establish a legal guardian for the child's estate, so that the real property can be managed for the child's benefit until she reaches legal majority. This is a complicated and expensive procedure. If the guardian wishes to sell the house to use the money for the child's ongoing support, they need a court order. If they sell the house and get the cash, they're required to invest the money and spend only the interest; if they need to dip into the principal, they need a court order. They need to file an annual accounting with the court. A will, on the other hand, would contain automatic trust provisions for the event that a minor had to inherit a major asset. The trustee could be named in advance, and the trust set up shortly after death without court involvement. The trustee doesn't need court approval to act, and the accounting requirements are much looser.
- The woman executes a ToD deed with her three children as beneficiaries in equal proportion. The house is the only item of value in the estate. Shortly after the woman's death, the children sell the house to a third party. They do not consult an attorney because they believe that since they aren't opening an estate and there's only one asset they don't need to. A year later, a man claiming to be a creditor of the woman calls the son, asking about the money he is owed. After the son tells him that his mother passed and no estate was opened, the man discovers the ToD deed and subsequent sale to the third party. He then sues all three of the woman's children for their pro-rata share of the debt. If the woman had a will, or died intestate, the estate would have been advertised and the creditor would have had a chance to make a claim. The executor could have settled the matter out of the proceeds of the sale before the money was distributed.
These are just a few things I can think of off the top of my head. The point is, DIY estate planning is a bad idea. I talked to a lot of people, smart people, who thought they were doing something really smart by avoiding paying a lawyer to have a proper estate plan done. These people usually ended up doing things that would cost their estates significantly more than the most expensive estate planning lawyer in the area would charge. A couple thousand bucks may sound like a lot, but you have no idea how easy it is to spend that much when an estate goes haywire. People who tell horror stories about probate are usually referring to instances where something got fucked up and the matter was held up or needed to be litigated. These are unfortunate circumstances, but in no case was there some easy self-help fix that could have avoided the situation. Please, consult with an attorney as soon as you can.
You knew, but did you care? I think that's what ultimately did the platform in. Seeing pictures of people you actually knew was one thing. Seeing the wedding pictures of someone you hadn't talked to in a decade was just crap that cluttered up your feed. There was a certain novelty to it for a while, but as soon as people realized that that was as far as the relationship was ever going to go (or, more ominously, that they had no interest of pursuing the relationship any further), the novelty wore off and people stopped caring. I'm not going to lose any sleep over the fact that I no longer know what some guy I was sort of friends with in college is up to now.
I think Facebook simply died a natural death that can't really be attributed to anything Zuckerberg did or didn't do. I don't think it's a coincidence that the demise of Facebook roughly corresponds to the rise of Reddit as a mass-market phenomenon. Though the platforms seem very different, they essentially serve the same purpose — a time suck for bored people. People who used to spend their free time scrolling Facebook now spend it scrolling Reddit, and Reddit offers more in the way of content than Facebook ever could. Message boards have existed since the dawn of the internet, but they were mostly specialized. Now, everyone has a whole universe of them in one convenient place, and the more popular subs like AmItheAsshole aren't the kind of thing that can exist as a stand-alone site.
That and there was just a general weariness about some of the shit that went on there. I'm not talking about politics, except in the sense that everyone had a friend that posted about nothing but politics and you didn't give a shit about their opinions regardless of whether you agreed with them or not. And then there were the people who posted nothing but memes. And the people who posted nothing but pictures of their kids. This was all relatively benign, though. The worst was the people who overshared personal information, or hinted at personal problems without giving details, all of it for the express purpose of generating lazy sympathy. The politics was often the most interesting thing about it, because at least it gave you the chance to engage in a way similar to how you would in person. But even in person, the guy who always has to bring up politics is annoying.
So the normal discussions that you would have with these friends were few and far between. Then sponsored content began taking up a greater and greater percentage of your news feed (I don't look at my account often anymore, but when I do I'm lucky to get one or two posts from friends, even if profile checks show a significant number of them still posting regularly; it's really something to behold). So people lose interest and go to places that aren't as irritating. Also, like Reddit, they changed to a "modern" interface that does the site no favors. The best thing they could do is go back to the 2010 UI. But it won't happen.
Even if HR does initial screens, they aren't throwing the resumes of qualified applicants in the circular file just because they're (probably) white. Most of it is throwing out the massive volume of garbage applications from people who have no hope of getting the job in any universe. Usually they don't even do a great job at this, especially if this work is outsourced to a recruiting company. My brother had a manager who was completely incompetent but only ended up getting fired after it was discovered that he was sharing personal information of female employees with people who didn't need to know about it. A friend of ours (who used to work with my brother) works for a company that was looking to hire a manager and the hiring team was complaining that all their staffing company was doing was sending them this loser's application over and over again.
Fire them for what? Affirmative Action doesn't give private companies license to ignore Title VII. Any "Affirmative Action hires" you can find are likely going to be marginal cases where the resume was similar to that of a qualified non-minority candidate. The upshot is that any AA on the part of your HR department isn't going to be consequential to the point where it's worth laying off the majority of your HR department so you can pay them unemployment on top of the increased rates you're going to be paying an outside contractor to do the work. Not to mention that this outside contractor isn't going to be as familiar with your company and its policies as your existing staff. My firm outsources its billing to a third-party firm and my boss has hour-long weekly Zoom meetings with them just to make sure they're doing what we need them to do. And this is a relatively small firm. In any event, let's not pretend you're going to give some company in India major say in hiring decisions.
Jimmy Carter's death. The Flag Code calls for the flag to be lowered to half staff for 30 days following the death of a former president.
This is a total myth that was fabricated by the Right to excuse the absolutely inexcusable behavior of Trump supporters. If you spent the summer of 2020 watching Fox News point to a few high-profile incidents of police cowardice or listening to NPR's defund-the-police nonsense then it's understandable how you would get that impression. But if you watched local news or actually paid attention to what was happening you'd have seen that there was no shortage of people who were arrested and charged. Hell, here in Pittsburgh there were news reports on an almost weekly basis consisting of a grainy photograph of people the police were looking for in connection with spray painting buildings, or throwing rocks at police, or some other minor crime that wouldn't even merit a mention in the newspaper, let alone a media-assisted manhunt. I can't speak to this happening in every city, but I know the same was true for Los Angeles and Atlanta, and the Feds were looking for a ton of people as well, which is interesting considering that they only had jurisdiction over a small percentage of the total rioters.
The reason you didn't see many high-profile convictions is that the BLM protestors were at least smart enough to commit their crimes at night and make some attempt at concealing their identities. For all the effort police put into tracking these people down, if there's no evidence, there's no evidence. The Capitol rioters, by contrast, decided to commit their crimes during the day, in one large group, in an area surrounded by video cameras. Then they posed for pictures and videos and posted them on social media. Were these people trying to get caught? Which brings me to the dismissals. Yes, a lot of the George Floyd riot cases were dismissed, and conservatives like to point to this as evidence of rioters being treated with kid gloves. But the prosecutors often had no choice. The tactic of the Pittsburgh Police (under the administration of Bill Peduto, no one's idea of a conservative) was to simply arrest everyone in the immediate vicinity the moment a demonstration started to get out of hand. Never mind that they didn't have any evidence that most of these people had committed a crime. If a crowd throws water bottles at the police and they arrest everyone they can get their hands on, good luck proving that a particular person threw something. Unless you have video or a cop who is able to testify, you're entirely out of luck. So they'd arrest a bunch of people and then the DA (Stephen Zappala, no one's idea of a progressive) would drop the charges against the 90% against whom they had no evidence. In any event, I didn't hear about Biden or any liberal governor offering to pardon any of these people.
Seriously. The Capitol rioters were morons operating under the assumption that their sugar daddy Trump would bail them out because he agreed with their politics. If he wanted to give clemency to people who got swept up in the crowd and trespassed where they shouldn't have, I could understand that. But by pardoning people who assaulted police officers, broke windows, and the like, he shows a complete disrespect for law enforcement and the rule of law. And all of it coming from a guy who is supposedly all about law and order. It's absolutely disgraceful.
This is a typical suburban street in the Pittsburgh area. Every house has an attached garage (technically an integral garage, but whatever). The garages on the downhill side of the street aren't visible, and the garages on the uphill side are below the grade of the main level. This is more a consequence of topography than anything else, but the whole problem is solved in flat areas by designing neighborhoods with alleys in the rear where you can have a garage and a place to put out the garbage and not have to worry about aesthetics.
And no, cars don't have to be stored outside, but it's still better to garage them. It was 3° this morning but my car was 55° when I left for work. It's also snowed about every other day since Christmas and I don't have to spend 10 minutes clearing my car before work, when I'm least in the mood to do so. I don't roast in the summer. I don't have to worry about punk kids trying door handles. When a storm rolls in and branches are blowing around I don't have to worry about them hitting my car. And walling in the garage isn't the answer. Aesthetically speaking, you're probably not going to be able to match the existing exterior, so you either find something "close enough" or use something totally different; either way, it screams "this used to be a garage". And since most fire codes prohibit running ductwork in garages, you have to retrofit it for HVAC. It's also probably not insulated, so now you have that to deal with.
There being other ways to utilize the interior space is only a benefit if you're actually going to use it. I already have a basement I don't really use for anything; I'm not parking my car outside so I have more wasted interior space.
Justice for what, though? If what he's technically being accused of would have been attributed to some faceless administrator whose name didn't come up until the middle of the investigations, few people would care about whether this person technically lied about funding an organization that may or may not have been funding gain of function research into coronaviruses. No, the ire directed at Fauci is almost entirely due to the recommendations he made during the pandemic and the people who didn't like them. These people had no love for Fauci before he was dragged in front of the committee and were looking for an excuse to nail his ass to the wall.
So what's the problem? Where is the historical inaccuracy? Yes, it's a work of fiction, but works of fiction are often based on real historical facts. The producers probably included it because it elucidates their point better than some dry as dust historical tract about how raw materials from The Congo were often used to produce military equipment. They didn't alert you that it was a work of fiction, but is this really necessary? If a documentary about WWI were done in the same style but quoted "For Whom the Bell Tolls" instead, would you insist that they flash "Work of Fiction" in yellow Impact font on the screen just to remove any ambiguity? And who are they supposed to be propagandizing, anyway? You can't stream it without paying extra, unless you have Kanopy, which most people technically have access to for free but don't know about and probably wouldn't be interested in. I'd be more concerned about historical movies that clean up the plot for narrative convenience and leave the viewer with an incorrect impression. These aren't even trying to pretend to be documentaries, but the fictionalized movie version ends up being cultural canon.
You just had to be there. There's nothing bad about sax solos per se, but by 1992 they had gone out of style and were a reminder of the 80s, which wasn't held in high esteem. Sax solos in pop records were ersatz soul. In R&B records they had worn out their welcome. But mostly it was just an ick factor involving anything associated with the 80s. The article isn't meant to be taken too seriously.
Ah, you're referring to the Bad Saxophone Solo (BSS). I stayed up much later than I should trying to find this incredibly on-point article posted to a now defunct file sharing site back in 2002. I'm posting it here not only for the enjoyment of everyone on the site, but so I can find it without searching the depths of the Internet Archive and its dead links. It goes to show how much hip musical tastes have changed in the past 25 years, and is a bit of a time capsule (it would be unthinkable now for a serious critic or music fan to shit on Hall & Oates, but back then they were punching bags). Enjoy:
Amongst the many horrible things to emerge from the cultural swamp of the 1980s (Reaganomics, crack, leg-warmers, the Coreys Haim and Feldman, Winger), there is nothing in the world of Rock music worse than the Bad Saxophone Solo. Unremittingly phony and invariably devoid of any shred of real emotion or creative expression, this sonic assault on all that is worthwhile is more destructive and more widespread than one could imagine in their most horrific nightmare.
Perhaps the most mysterious aspect of the Bad Saxophone Solo (BSS) is its origins. By all accounts, no matter when it was first laid to wax, all BSS seem directly evolved from Kenny G.'s 1986 smooth-jazz hit "Songbird." So awful that it seems to exist outside of time, this incomprehensible morass of suck is ground zero for all Bad Saxophone Solos ever. Spreading the BSS from Smooth Jazz throughout the world of popular music, "Songbird"'s evil is so pervasive that not even the collective din of Charlie Parker, John Coltrane, Roland Kirk, Eric Dolphy, Joe Henderson, and Lester Young all simultaneously spinning in their graves non-stop since its inception can drown out its malignant influence.
The BSS' powers are truly formidable: after but a few seconds of its aural assault, a cheesy-but-catchy pretentious prog-pop tune like Supertramp's "The Logical Song" is rendered so muzacky and faux-funky as to make the theme from Night Court seem like a vintage George Clinton production. The Bad Saxophone Solo has even been known to crop up within the confines of otherwise decent songs. In the middle of a relatively quality tune like the Pogues' "Summer in Siam," the schlocky BSS is like a fire hydrant at a dog-show - a piss-soaked novelty distracting all attention away from the true talent and refinement therein.
But, let's back up. What exactly is the BSS? The Bad Saxophone Solo is an insidious but elusive blight. On the wings of some Joe Cool sunglasses-wearing, bandana-ed jackass's overly emotive stage gesticulations it alternately glides or skronks and wails its way into your brain. Before long you're staring vacuously into space, tuning out not just it but the entire world around you, because the truly Bad Saxophone Solo is literally mind-numbing. Which song contained its gut-wrenching sound? And how exactly did its pseudo-bluesy/soulful melodic interpolation go? You don't know, because, like elevator music (even the worst of which is a preferable alternative), a Bad Saxophone Solo convinces the brain on an essentially primal level that sensory stimulus is a bad thing. In order to avoid the BSS (along with some of its multi-media counterparts like the Bad Hotel Painting and the Local Car Dealership Commercial) the brain attempts to ignore it and in the process closes itself off to the world around it.
Alas, the world, and unfortunately the BSS, is still there, and upon recovery blame must be placed in order for any true healing to begin. Some culprits are obvious. The music of Glenn Frey is a good place to start. Often, as one begins to surface out of the depths of a Bad Saxophone Solo-induced stupor, vague memories of the drab tones of this former Eagle's laughably idiotic music will linger. Was it "The Heat Is On" that so dulled your senses, or could it have been the pummel-your-forehead-repeatedly-against-a-spackled-concrete-wall tones of that soft-rock atrocity "You Belong To the City?" You don't know, and that's the point. Like an aural lobotomy, the very nature of the Bad Saxophone Solo prevents its victim from remembering its exact source. What's more, prior knowledge of the stopped-up commode that is Frey's musical canon may not be enough to help the victim sort out what just happened. Even an experienced BSS victim is subject to the confusion and chaos that follows a severe attack, often mistaking the music of Frey for other sources (such as serial-BSS conveyors like Huey Lewis and the News or Hall and Oates).
The experience can be excruciating. A typical Bad Saxophone Solo experience finds the victim awakening - like a sorority-girl the morning after a Rohypnol-enhanced date-rape - groggy and disoriented but acutely aware that they've been fucked and that it was a far from pleasurable experience. Drooling uncontrollably and just steps away from catatonia, the unlucky listener will, for example, catch the last few endlessly repeated chords of the George Thorogood blues-rock abortion that is "Bad to the Bone." Many victims are unable to believe that this song could actually get any worse, but indeed, its atrociously soulless and completely forgettable Bad Saxophone Solo makes it so.
The question remains: why would the Bad Saxophone Solo do this? What is its goal? The answer may be revealed deep within the lyrics of one particularly saccharine and nauseous BSS carrier. To the casual observer, Wham!'s "Careless Whisper" might be dismissed as the lonely musings of two men, one struggling with the desire to thwart anonymity and the other struggling to stop getting caught having anonymous homosexual sex in public bathrooms. But "Careless Whisper" is so much more than that. It is actually both a purveyor of the BSS and an unintentional post-modern treatise on the plight of the Bad Saxophone Solo victim. The "whisper" at issue here is not just, as would at first seem the case, the hushed words of a gossiping lover. The "whisper" is in fact the bleating, faux-soothing tones of a particularly bland Bad Saxophone Solo. "No, I'm never gonna dance again…" reveals "Careless Whisper"'s narrator, unveiling the ultimate harrowing result of the BSS. The BSS exists to prevent (often with great success) its victim from any further enjoyment of music. Ever. Especially music that contains saxophones. The ugly truth is that, for the BSS victim, "guilty feet have got no rhythm."
There is no known cure for the Bad Saxophone Solo, and no band or musical style is safe from its cancerous grasp. Great bands like Pink Floyd, David Bowie, and, yes, even the Rolling Stones have bowed to its hokey will. It is inescapable. Even if one were to explicitly avoid elevators, dentist chairs and movie soundtracks, the BSS would still creep up unannounced on Classic Rock radio - perhaps even in an overblown "life on the road" Bob Seger ballad. Worse yet, though the frequency of the BSS has diminished since the onset of the 90s (3rd Wave Ska Revival notwithstanding), it is quickly being replaced by an even more deadly variant: the dreaded Rock and Roll Scratch-DJ Turntable Solo (RRSDJTS). For the love of God, please, beware.
-Robert Whiteman
As a straight investment, maybe, but who is buying this stuff as a straight investment? Collectibles have the advantage of having intrinsic utility that these meme coins don't. In the 1980s my mother was gifted a series of limited edition Norman Rockwell commemorative plates that would supposedly increase exponentially in value over the years. They didn't, but they were displayed in my parents' dining room for at least 30 years. I don't think Trump Coin or whatever has that kind of value.
Elon doesn't know what he's talking about. I used to work as an adjudicator for the PA Disability Determination Bureau, and the investigation is so thorough that getting disability through fraudulent means is effectively impossible. The evaluation is mostly based on the claimant's actual medical records and, if those are insufficient, the bureau will schedule an examination. Some information comes from the claimant themself, but most of this is clarification about medical treatment or conditions which are noted on the medical records but that they aren't claiming disability for. They do fill out an ADL form, but this is only really taken into consideration in the event that the claim is borderline; in that case it might tip the balance toward an approval but only if the condition is significantly limiting their ADLs in a way that one would expect the condition to limit them. In any event, most ADLs show some limitations but not nearly the kind that would be sufficient to tip the balance in that circumstance. For example, if a guy is claiming disability for a back problem there's probably going to be something about how they don't move around very well and can't lift heavy objects. They probably aren't going to claim that the pain is so bad that they can't get out of bed and have to have someone else do housework for them.
Realistically, the only way you're getting disability on the initial application is if you're over the age of 50, have a job that involves physical labor, and haven't done any other kind of work in the past 20 years. If you're under 50 it's assumed you can adjust to other work, so if you're capable of doing a sedentary job that doesn't require any special qualifications you're denied. If you're over 50 and you have an office job you're denied. If you're over 50 and you have a job that involves physical labor but there's a similar job that uses the same skills but doesn't involve physical labor you're denied. If you're over 50 and you generally work a job that involves physical labor but you worked the register at your brother's convenience store a decade ago when he was just starting out and needed extra help, you're denied.
Some of the cases I can remember: One approval of the more typical kind involved a 55-year-old black guy who worked as a welder his entire career and had back problems. Claims of back problems are common, but this guy had serious problems documented on x-ray and had undergone at least one surgery. He tried to go back to work after the surgery but had to stop. Another case involved a guy in his 30s with brain cancer who was in such bad shape I couldn't talk to him directly and had to get the information from his sister (cases like this are flagged upon intake so they can be approved quickly). One case involved a 16-year-old girl who had severe psychological and emotional problems to the point that her mother couldn't take care of her and she was put into a group home, but her behavior was so bad that she kept getting kicked out of them. She had been admitted to Western Psych repeatedly over the past few years. I only spoke to her briefly; most of my communication was with her mother, who spent most of the conversation on the brink of tears as she talked about how she didn't know what she was going to do about her daughter and how scared she was about the future. I honestly don't know if this is an approval because I left before the case was resolved. I was pushing hard to get it out the door because it was clear to me that this girl would never be capable of working but my supervisor was skeptical because, if memory serves, while she had frequent hospitalizations it had been close to a year since the last one so maybe things were improving.
Now, I saw plenty of bullshit as well, but it was obvious bullshit that resulted in a denial. The modal case for this was some kid in his early 20s who never worked for any length of time and never had any education beyond high school who was trying to claim disability for psych problems despite having never seen a psychiatrist. He might be taking some kind of antidepressant but it was always prescribed by a PCP and didn't follow any kind of psych workup. That presents a complication, since we can't deny the claim without any psychiatric evidence, so we'd have him evaluated by a psychiatrist who would invariably conclude that the kid had garden-variety anxiety and depression but nothing that would prevent him from working. Psych claims usually require a longitudinal history of progressively worsening problems, or else some kind of huge psychic break that's unavoidable. But most of the cases are people who obviously have problems, just not of sufficient severity to render them disabled. The determination office is basically a denial machine, and most of the claims that are approved at the initial stage are ones so obvious that no one could possibly claim they were fraudulent. There are also a small number of people who have already retired and later have a health problem and figure they'll file just to see if they qualify.
Now, once you get beyond the initial determination stage and into appeals, the success rate is much higher. However, if you're appealing then you have an attorney and the case is heard by an administrative judge who issues an opinion. I think that the reason for this is that few of the bullshit claims get appealed, so the cases the judge sees are of overall higher quality than those seen at the initial stage, especially since most of the severe cases are sent to a different department for fast-tracking. An adjudicator who spends all day dealing with marginal claims basically turns into a denial machine. I saw a statistic that claims 38% of initial applications that reached the adjudication stage (i.e., not denied for technical reasons, which half of all initial claims are) were approved. This seems way too high. Granted, the numbers are from ten years after I stopped working there and I can't speak to how they do things in other states, but in my office it was like 20%, 25% tops. And that includes expedited cases and cases adjudicated by people who have been there for 20 years and think they can tell an approval from a denial based on gut feeling.
I wasn't there long, but in my time there I evaluated hundreds of claims, and I never once saw anything I thought looked fraudulent. As I said, there were bullshit claims, but these were obviously bullshit, and in any event the claimants weren't lying about anything. It's one of those things that just isn't worth it. The average SSDI benefit is $1,200/month, and the average SSI benefit is $800/month. And if you make more than something like $1,200/month from a regular job your benefits get cut off. So the reward one gets from perpetrating a fraud on the system is a life of bare subsistence living. One thing I can't speak to is fraud at the technical level, for example, people hiding income or assets so they qualify for SSI. Given the high rate of technical rejections, it's clear both that SSA is doing thorough investigations and that people aren't even trying to hide much. I'm not saying that fraud doesn't exist, but the guardrails in place for preventing it are so high and the incentive for committing it are so low that I doubt there's much savings to be had here.
pinging @jeroboam, since I didn't see his comment until after posting this
It was a bad war, but what other choice did Israel have? Not retaliating or trying to get the hostages would have been politically untenable. Immediately entering negotiations for the return of the hostages would have had the effect of legitimizing hostage taking as a means of diplomacy. The only real option was to invade, hope that Hamas kept the hostages alive for leverage, and wait until Hamas was sufficiently weakened to be in a position to make a deal. And that's just a deal, not necessarily a good deal.
I think you're missing my point here. You can talk about the language used in the document, but I conceded in my initial post that people being forced to sit through bullshit training conducted by charlatans was one of the consequences of wokeness. What is missing is evidence that this nonsense results in any tangible differences to a significant number of ordinary people. Correct me if I'm wrong, but I don't get the impression that you came across this publication because your 7th grader brought home math homework that you found highly suspect and were directed to the PDF by school administration. Which is why I brought up the fact that no one I know IRL who is complaining about this can produce any worksheets, or textbooks, or anything like that that would convince a reasonable person that this is a widespread phenomenon. Instead all I see are media reports, or rumors, or material discovered online by people who were actively looking for it.
I believe if you look with regards to education you'll find a number of objectionable curriculum and policy changes in major school districts.
Well, that's my point. If the change is as big as you suggest, I shouldn't have to look for it. It should be obvious. I know a lot of parents and quite a few teachers, but I've yet to hear any of them talk about any specific instruction in their schools. It's always happening somewhere else. I don't doubt that some teachers in some places are teaching woke material, but if this were widespread I should be able to throw a dart at the map and find plenty of examples locally. But it's always someplace else.
I'm a few years older than you, but beyond the internet, I think the problem started with cell phones. First, because they enable much easier communication, and second, because they became status symbols. While ease of communication seems like a good thing, it has the unfortunate side effect of making it easier to flake. If I call you tonight and we make plans to do something right after work tomorrow, unless you change your mind within the next few hours, you're pretty much stuck. Obviously, if there's some kind of emergency you could call me at work or at the place where we're supposed to meet, but that's intrusive and inconvenient (especially if you have to find the phone number of a business without the convenience of the internet), reserved for situations where you truly can't make it. These days, if it's getting late in the day tomorrow and you feel too tired to do anything, you can always just send me a text cancelling. I'm always available, and you don't even have to talk to me directly.
I'm as guilty of this as anyone, but it also makes it much easier to be late for things. If I have the kind of appointment like a job interview or court appearance where it's imperative that I be on time, I'm almost never late unless I make a fundamental miscalculation or there are unforeseen circumstances. But if the engagement is merely social or recreational, I'm horrible at it, not because of unforeseen circumstances, but because of inertia. After all, if I say I'm going to meet friends to ski at 9 am and I'm running a half hour late, I'll just text them to start without me and I'll call to see where they're at when I'm ready. In the old days, they'd have to wait around for me in the parking lot, not knowing where I was, and they couldn't go on ahead of me because I'd have no way of finding them once I got there. Being late meant either getting them pissed off waiting or running the risk of being ditched for the day.
Whether or not this is a net negative is hard to say. People flaking is annoying but it's nowhere near as bad as people having medical emergencies and no way to call an ambulance. Hell, it's probably better than the old days when people would have to cancel for legitimate reasons but had no way of contacting you and just stood you up. It's better than being stuck at home waiting for a call, or needing to get in touch with someone who isn't home at the time. As much as people complain about people being slaves to their phones now, it was worse in the old days. If you were at home and your phone rang, you basically had to answer it. Sure, you could screen calls through your answering machine, but this was inconvenient, and the idea of doing this for every call, all the time, was absurd. So you basically had to answer the phone, and the person on the other end could be anybody, wanting to talk about anything.
To get back to the social aspect, say I'm having people over this Friday night and I'm calling friends to invite them. These days you'd send a text. The recipients can see the group text, check their schedules, and respond at their convenience. If they don't really want to commit but want to keep it as a contingency, they can wait a few days to see if anything better comes up before responding. In the old days, you'd call your friends, and they'd have to give an answer immediately. "I don't know" was an acceptable response, but one only given in the event that there was some legitimate contingency involved that prevented you from committing in the here and now but wasn't certain enough to entirely preclude your attendance. And giving such a response required you to take the additional step of calling the host back at a later date to give a firm answer.
Which brings me to my second point, about phones being status symbols. This, admittedly, isn't that much of a problem, but it ties into everything else. Cell phones were always status symbols, but originally they were status symbols of a different type. Owning a cell phone before about 1995 meant that you had a very important job where people always needed to be able to reach you and it was worth paying ridiculously high fees for this capability. Then the cost of the phones and the basic subscription came down enough that normal people could afford to have them, but the per-minute charges were expensive enough that most of these were only used for emergencies or other situations where they were the only option. Landlines still ruled the roost for everyday conversations.
Then, in the early 2000s, changes were made to the business model that made teenagers actually want to own them as opposed to having them so they could call their parents for a ride. First, plans became available that came with a certain number of minutes that could be used during the day, and unlimited minutes on nights and weekends. Eventually, unlimited talk became the standard. Now, they could be used for casual conversation without your parents getting a huge bill. Second, texting became available, quickly gaining market share for low-priority communications that weren't worth interrupting somebody over. If I called for the specific point of telling you that the Penguins' goaltending looked especially shitty tonight (and not as an entree to a longer conversation), you'd be annoyed. If I texted the same you wouldn't care. The ability to have short, inane conversations (in an era with a telephonic keypad) didn't appeal much to adults, but kids loved it.
And with more kids having cell phones, marketers realized there was room for improvement of the phones themselves. Progress in cell phone design was initially centered around making them more compact. Now it was about making them more stylish. This is where Apple really knocked it out of the park. The Blackberry had existed for years, and provided much of the same functionality as smartphones would. But they were only appealing to people who actually needed the functionality. Nobody bought a Blackberry as a status symbol, and people who needed them for work didn't seem to like using them (one friend of mine who bought one for work purposes was thrilled when her job started paying for a work phone because she could now carry a normal phone for personal use). The iPhone had improved functionality, for sure, but it was a status symbol more than anything.
This only gets truer as time goes on. The first iPhone was a huge leap forward, but subsequent iterations have been less revolutionary than the improvements to the flip phones before them. Every couple years we'd at least get a new, useful feature, like a camera, or a full keyboard. Smartphone improvements are basically limited to incremental improvements of technology that already existed in flip phones or the first generation of smartphones. Faster processor, better camera, waterproof, etc. But new iPhones don't really do anything that the originals didn't, and that statement is even truer when comparing the current generation to the previous. (The biggest selling point of the iPhone 16 is that it has native AI capability, which sounds good until you consider that any phone with access to the internet has AI capability, just not on the phone itself. I don't know who this is supposed to appeal to.)
Nonetheless, there are people who want this thing. Every time a new iPhone comes out, there's a line out the door at the Apple Store of people who can't even wait a couple of weeks. Contrast this to the 90s. Phones were appliances. My parents had the same wall phone hanging in the kitchen throughout my entire childhood and most of my adulthood. You only got a new phone if the old one broke or, rarely, if there was some game-changing feature like a cordless handset or touch-tone dialing that you wanted. The idea of getting a new phone every two years was like the idea of getting a new dryer every two years.
The importance of this to the current phenomenon relates back to the first reason. Even though phones seem more central to our lives now, they are actually less central than they were 30 years ago. Like I said, if the phone rang, chances are you answered it, even though you probably had no idea who it was or what the call was about. If you made plans over the phone, you were stuck with them, unless you went to great lengths. If an important call came and you weren't home, you were out of luck. Our entire lives revolved around telephones and having access to telephones, but nobody really noticed or cared. They were as exciting as vacuum cleaners. Now they're as sexy as ever even though the core functionality hasn't improved since the introduction of texting. Once the average person was liberated from the noose of the telephone, that should have been the end of it, and progress should have stopped. The smartphone's integration of communication equipment with a portable, but unimprovably limited, personal computer, should have been the last improvement anybody cared about. But here we are, 15 years later, and people are even more concerned now than they were then.
Each of those actions probably took about five minutes, maybe a few hours tops. And they could have been having a slow night. When you put someone under surveillance you don't get to watch them at your convenience.