This thread is for anyone working on personal projects to share their progress, and hold themselves somewhat accountable to a group of peers.
Post your project, your progress from last week, and what you hope to accomplish this week.
If you want to be pinged with a reminder asking about your project, let me know, and I'll harass you each week until you cancel the service.
I'm automating Hinge. Android emulator, pyautogui, PIL, GPT-4o. It's almost too easy.
The flow is:
Costs me about $0.04 to reject, $0.10 to message. I think I can get that down some. I only ran it for one batch, and it got a match faster than I normally do. Small sample size, but I am optimistic.
As to why #3 is so simple - I initially had a hand-written weighted average of all the things, but looking at the actual behavior I realized that:
Favorite kerfuffle: it messaged a woman, shown in a photo next to a giant 10ft novelty plant pot: "is that enormous, or are you tiny?" It... was certainly not the latter.
This raises some questions for me:
What's your goal here? Sex or love?
Side note: I fucking love that you did this for many reasons. Congrats!
3. Interesting, I think saving it for the first date might be a good call. I think it's a good story at any time, but probably more useful there.
Goal is love for sure. If I just wanted to get laid, the algorithm would not need the "is very smart" filter, and even the "not very fat" filter could be relaxed...
Thanks, I've wanted to do it for a while. I really thought it would be harder. I'm curious to try o1-preview as well, although it's 3-4x the cost. GPT-4o-mini was not adequate for sure.
Curious to see how Claude 3.5 Sonnet (new) performs. Feels like it has more emotional intelligence than ChatGPT. You might have invented a new AI benchmark.
How have the results been so far, though? Do you ever feel that LLMs are too boring, so it's kinda hard to keep a girl interested in texting you? Regardless, cool project.
How does it do for opening messages? I remember when I was dating and my lack of creativity really held me back there.
It's plenty good. The prompt basically just says not to be too enthusiastic, and to keep it short. I've been writing down my manual messages to prompt it with as examples, but am not even using that yet.
E.g., the message that got the match was responding to a picture of a woman standing in front of a field of sunflowers, captioned "My happy place": "Sunflowers might just be the happiest flower there is. What's your favorite outdoor place to relax?" A little extra, but it shows "I" looked at the picture, makes it about her, and is easy to answer/start a conversation with - not bad imo.
If messaging is your bottleneck, you can use basic chat interfaces for response ideas. I would not overestimate how much the specific message matters - I'm pretty sure it's more of a rule-out than a rule-in kind of thing. Just don't be a creep basically.
Just bottled this year's first batch of homemade sake. I got a bigger umeshu jar this year so I was able to make about 5L of the stuff. Left a good amount of the rice particulate in so it has that nigorizake taste, though much less sweet. We have a lot of sake lees from this batch, so I'm thinking of marinating some pork and probably some cucumbers and carrots. Any other ideas? (Paging @George_E_Hale -- but keep it on the down low!)
I've also got several batches of umeshu in the works, though they won't be done until next summer. But I do have one experimental batch that will be ready next month -- "Christmas umeshu" with cinnamon, cloves, vanilla, and nutmeg. I expect it will be overpowering and terrible by itself, but maybe okay in a mug of hot water. Looking forward to cracking it open in a few weeks.
Any amateur brewers, distillers, or infusers on The Motte?
I've only messed around with micro batches of turbo cider: UHT apple juice, champagne yeast, and a few slices of ginger root. I like it dry and strong, so it's as simple as waiting till it finishes and then chilling it to clear. I should test whether my yeast is still active; I've been thinking about making another bottle.
I made strawberry-infused vodka once, and that turned out better than I expected, if a little overly sweet and jammy-tasting. It would have been improved by cutting it back with plain vodka, but I'd used all the vodka for making the infusion.
I did look into home distilling but I don't drink enough to make it worthwhile. Unlike @yofuckreddit, the information I found made it look cheaper and easier than I expected. The method basically depends on judging when to cut the fractions (heads and tails from the hearts), and the more you spend on a still, the more distinctly and efficiently they can be separated instead of having them bleed and smear from one into the other. There's some fairly straightforward chemistry and engineering underlying the process. As I say, I lost interest as I don't drink that much and realised it would be quicker and easier to simply buy food-grade ethanol to use for infusions (£20/L at 95% according to the notes I took, roughly the same price as cheap vodka by alcohol volume). Still think it could be a fun project though. I saw some interesting videos* on steeping different-sized oak pieces for making whisky that indicated a good product could be made quickly using pieces with the correct surface area, which makes sense as beneath the mystique it's basically a wood infusion. On the other hand this is coming from someone who rates UHT turbo cider as perfectly adequate, so mileage may vary. Now where's that yeast...
*Found it: Final Tasting - How does surface area affect the whiskey aging process?
Yes, although these days I am mostly infusing. I tried my hand at brewing with the beer kits and while the beer was decent, the process of doing it wasn't for me. Too much chemistry and precision. I tend to cook (and craft) like an engineer, so I want things that I can tweak on the fly with ratios I can eyeball.
When it comes to infusing though, I have quite a few recipes I've been perfecting. I used to do Umeshu, but my supplier stopped selling ume and I can't find any others for a reasonable price that will ship to the midwest. Here is what I've infused in order from best to worst: Umeshu Roasted pineapple-cinnamon tequila Cranberry vodka Blueberry vodka Brown sugar oatmeal vodka Maple bacon bourbon Limoncello Rhubarb vodka Pompelmocello (limoncello but grapefruit) Roasted walnut/roasted pecan vodka/bourbon Jalapeno tequila: homegrown jalapenos were too spicy Peach vodka Pineapple vodka (if you know how much I dislike peaches, these being ranked lower says a lot) Granny smith vodka Roasted/unroasted murasaki imo (purple sweet potato) vodka/bourbon: absolutely vile, despite trying some different ratios
Got started late on the cranberry this year, so I'm currently waiting on a batch to see if 2 weeks is enough instead of my usual 4.
Thanks, pretty cool. How did you pull these off?
Also, I reformatted your list for easier reading:
Umeshu
Roasted pineapple-cinnamon tequila
Cranberry vodka
Blueberry vodka
Brown sugar oatmeal vodka
Maple bacon bourbon
Limoncello
Rhubarb vodka
Pompelmocello (limoncello but grapefruit)
Roasted walnut/roasted pecan vodka/bourbon
Jalapeno tequila: homegrown jalapenos were too spicy
Peach vodka
Pineapple vodka (if you know how much I dislike peaches, these being ranked lower says a lot)
Granny smith vodka
Roasted/unroasted murasaki imo (purple sweet potato) vodka/bourbon: absolutely vile, despite trying some different ratios
I’ve made mead before, and beer. It’s not a regular thing.
Is beer worth it? IIRC I tried some homebrew in college and it was very okay. It seems like you'd need to invest in some nice equipment and fancy ingredients to make something that would actually be better than even mediocre craft beers, and it also seems easy to screw up and make something undrinkable.
I've heard nothing but bad things about mead. What was your experience with it?
IME ‘ok’ is a good outcome from home brewing. Mead has tended to turn out better than beer, though.
My experience is that it's nearly impossible to make something worse than mainstream (European) store-bought beer. Whether they measure up to a craft beer is a matter of taste; mine is arguably quite primitive, and I don't actually like most craft beers. Screwing up is, in my experience, far from easy; the only place where it can plausibly happen is when you deviate from a recipe, but if you just follow the instructions you'll be fine.
If you have a big pot at home (10+ litres, but the bigger the better) and just want to try out to see if this is for you, you won't need a lot of equipment - there are starter kits in the $50 range with all the ingredients and basic tools. If it turns out you enjoy it, you can start buying extra equipment as needed. For me the gear is less about the quality of the end product, and more about making the production easier - one downside of this hobby is that it's pretty time consuming, especially on the day of brewing.
Due to local taxes, the ingredients for a 23-liter batch of beer plus the equipment to brew it cost a little less than 23 liters of beer does. If you feel like really cheating, there are companies that will "help" you brew and bottle "your own" untaxed beer or wine for a small fee.
Anyone into knife sharpening? Especially with DIY jigs/systems? Looking for something that can utilize an already sizable collection of full-size whetstones, and not the tiny ones that the Edge Pro Apex 4 uses.
How have you been doing @Southkraut?
Lots of time wasted on engine switching and reconsideration. Better to focus more on dissociation from the engine, even at a cost to performance. Said this before, but optimization always draws me towards tighter integration - I need to not do that. A cleaner separation with minimal engine specificity (is that a word?). Don't let the siren song of premature optimization lure me from the good path.
Is there a way to do C++ without header files? I realize it's incredibly petty, but having two files per class just keeps turning me off of Unreal.
Other than that, I keep comparing engines.
Unity:
Unreal:
Godot:
Redot:
Stride:
No idea where I'm going with this. Right now I'm trying to fiddle some dev time into my new routines. Not yet sure how. Maybe I'll just get used to the job and end up being less weary by the end of the workday so I can get something done in the evenings, or maybe I'll follow a friend's recommendation and just get up even earlier to do some coding in the mornings. Or maybe I'll find a way to take a few short breaks (legitimate breaks, not slacking off) during work to... I don't know, dictate code to my phone?
I wouldn't recommend it as a permanent process, but when I'm sketching things out I often define functions within class definitions in headers. You can almost get to a point where you only have header files, but there are a few caveats (circular dependencies, static member initialization, inlined code size) that prevent me from liking it for bigger projects. But it can be helpful if you haven't finalized interfaces since there is only one place they are defined.
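For what it's worth, here's a minimal sketch of that style (a made-up Widget class, not from any real project), including the static-member caveat:

```cpp
// widget.hpp -- hypothetical sketch: the whole class lives in one header.
// Member functions defined inside the class body are implicitly inline,
// so including this from several .cpp files doesn't violate the ODR.
#pragma once
#include <string>
#include <utility>

class Widget {
public:
    explicit Widget(std::string name) : name_(std::move(name)) {}

    const std::string& name() const { return name_; }
    void rename(std::string name) { name_ = std::move(name); }

    // One of the caveats mentioned above: a static data member still needs
    // 'inline' (C++17) here, or an out-of-line definition in exactly one .cpp.
    static inline int default_width = 640;

private:
    std::string name_;
};
```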
If you're creating an application or something like a dynamic library using someone else's API, technically you can put every single thing into one giant .C file with no header files and it'll work. If you're creating a library for others to use, either it needs to be "header-only" (i.e. it all gets recompiled by every user, which really only makes sense for pure-template code where the user does every instantiation) or it needs to have separation between declarations in headers (for the users' compiler) and definitions in source files (for the users' linker). Technically you can mark all your definitions as `inline` and put them in headers and just have one file per class even if you're not doing fancy template tricks, but people don't do this, just to avoid longer recompile times.

It's common to have more than one class declaration per header file, but for big projects it's just easier to keep everything organized if you have one class per header.
It's not uncommon to do "unity builds" (no relation to the game engine), where a bunch of source files get batched together so that (because of include guards) the header files only have to get parsed once per batch rather than once per file. This is just to reduce compile times and allow for more Inter-Procedural Optimization in a single compiler pass, AFAIK.
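If it helps to picture it, a unity-build batch file is literally just a .cpp that includes other .cpp files (the file names below are made up):

```cpp
// unity_batch_0.cpp -- hypothetical batch file for a "unity build".
// The build system compiles only this translation unit; the member sources
// get concatenated, so shared headers are parsed once (thanks to include
// guards) and the compiler can optimize across all of them in one pass.
#include "renderer.cpp"
#include "physics.cpp"
#include "audio.cpp"
```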
Complaining about two files per class isn't petty so much as weird, though. Partly I say this because I love having two files per class (the declarations are "what does this do", which is easier to read when removed from "what messy tricks does this use to do it" definitions), but mostly I say this because there's so much more in C++ to complain about. I'd have imagined a C# fan would have been most annoyed by having to manually manage heap allocations, with no garbage collection.
I thought that was the general consensus. People also think it's the hardest engine to use (which is why I played with Godot with my kids, despite C++ experience myself), but AFAIK it's had the most features for forever and people manage to wring out the most performance from it too.
It is. I posted late at night and was tired and got dumb and messed up my English to the point of stating the opposite of what I intended. Contend, contest, contradict... Nach Müd kommt Blöd ("after tired comes dumb").
Modern C++ is much less painful in this regard. The days of `new()` and `delete()` are long gone.

`find /usr/include/c++/14/ -type f | xargs grep '= new'` still shows me nearly a hundred uses in standard library headers, so "long gone" is a bit of an exaggeration. If libstdc++ authors are still at that point, imagine where your average coworkers are. ;-)

I admit that `shared_ptr` was nice, and the development of "like `auto_ptr`, but you can put it in containers" was a godsend. Pointing `grep` at my own favorite project I still see 6 invocations of `new` in non-deprecated code, versus about 800 invocations of `make_unique`.

But I'd still argue that even modern `unique_ptr` best-practices count as manual management and are thus more annoying than "the programming language will just figure it out". Even if you try to mimic garbage collection behavior with `shared_ptr` you still have to worry about leaking unreachable cycles of pointers that a garbage collector would have been able to detect. This is all a useful sort of annoying if you write any sort of interactive code where a big garbage collection sweep might drop a frame or add input latency or whatever, and even garbage collected code can leak memory because "I forgot to remove a reference" isn't a drastically different bug from "I forgot to delete an allocation", but heap management is still the thing I notice the most complaints about when users of other languages first move to C++.
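To put both halves of that in one place, here's a tiny sketch (nothing project-specific, just the textbook shapes of it):

```cpp
// make_unique for everyday ownership, plus the shared_ptr cycle that a
// garbage collector would catch but C++ won't.
#include <memory>

struct Node {
    std::shared_ptr<Node> next;
    std::weak_ptr<Node> prev;  // weak_ptr breaks the cycle; make this a
                               // shared_ptr and the pair below leaks
};

int main() {
    auto value = std::make_unique<int>(42);  // freed automatically at scope exit

    auto a = std::make_shared<Node>();
    auto b = std::make_shared<Node>();
    a->next = b;
    b->prev = a;  // fine: weak_ptr doesn't keep 'a' alive
    // If prev were a shared_ptr, a and b would keep each other alive after
    // both locals go out of scope: an unreachable cycle no destructor sees.
}
```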
Deer processing is going well- mild sausage is all stuffed and spicy sausage is just waiting on a new sausage stuffer (a minor hiccup was when I dropped the nozzle on my old one and had to order a new one). Chili grind is all made and the steaks are cut out and packaged.
I do this every year and it’s always an experience.
Did you just cut out the backstrap, or do the whole thing? What cuts went to the grinder?
Cutting out just the backstrap is illegal in my state- I took backstraps, fore and hind quarters, the neck, and a few odds and ends. Basically everything except the backstrap went into the grind pile.
How much of an ordeal is this? Is the grinder a pain to clean? Is it easy to get the sausages to taste right? Is it only cost-effective if you are able to obtain a lot of good meat at a low price? Is it fun, or just messy/tedious? Is it easy to get the meat-to-spice ratio correct?
Grinders are disassemblable for cleaning, for obvious reasons. Cleanup is mostly a wipe-spray-dishwasher if applicable thing. That being said, some mixtures are much messier than others- the liver in boudin, and eventually the rice, take flossing over anything hard to get to.
The meat to spice ratio can be a few grams off without messing anything up; taste is pretty simple to get right as long as it’s mixed thoroughly and evenly, although offal as an ingredient can be difficult to manage. Consistency is much trickier because the meat to fat ratio is much more tedious and particular. Sausage is almost definitionally much fattier and saltier than most meat, and exactly how fatty or salty is a major driver of what you wind up with.
Fortunately I was using venison as the meat, from which all the fat is removed as part of processing, allowing me to straightforwardly weigh beef suet or pork fat cap in. In the past, when I have made boudin, or used commercial pork, the correct fat percentage has been much more difficult.
Cutting the fat out of the venison was tedious and messy and took more time than the rest of it combined. I find it a satisfying hobby, if definitely ‘work’.
Boid simulation in Redot
Googling around for an answer on whether it's possible to set primitive-type uniforms with Redot's shader API (which is how the C++ project was handling its simulation), I ended up running into this repository. On one hand, that's a bit of a shame, not much more for me to do - on the other hand, with my relative lack of experience in Go/Re-dot, I'd be wasting a lot of time figuring out minutiae, so this saved me a lot of pain, and I still managed to learn a bunch of things:
Redot has "particle shaders" which grant you full control over individual particles in a system. It's what I was going to use originally, but when I was experimenting with them I noticed that there's no way to have an infinite lifetime on particles, and that the lifetime is inseparable from the emission rate, so setting a long one would just give me a slow drip of new "boids" showing up. I was quite prepared to say screw it, and just draw my own particles on a texture, with a compute shader, but I noticed that this project is actually using the particle system + shader combo. To my surprise, the particle lifetime was set to 1 second... and then it hit me - as long as the simulation is done in a separate compute shader, and the particle one is just their for setting up the position and orientation, it doesn't matter that a particle "dies" and another is created, the user won't even see that it happened.
It turns out that primitive-type uniforms - the original issue that caused me to find the project - aren't available in Redot, but you can have named properties in a buffer (like so), and that's a pretty elegant solution, imo.
Originally, starting the simulation up was a disappointment. The Readme, as well as the accompanying videos, talks about simulating 100K boids. My GPU could handle 32K at 60+FPS, or maybe 50K at 30FPS. By last week's goal of "go big, or go home", where "big" was supposed to mean millions, it felt like it was time to pack it up and never speak of the idea again. Kind of devastating, because the big idea that made me revisit the project was to optimize the simulation by sorting the particles / boids into a grid. This is already done; both the C++ project from last week and this Godot version that I found go into detail about it. The shader code has a flag for turning off the spatial sorting, and indeed without it the frame rate drops by half. A part of me thought "fair enough" - my hardware is pretty old, and those are roughly the numbers I ran into the first time I took a stab at it - but I felt like the sorting should be giving a much bigger boost.
So then I started playing around with the parameters. Since the boids are sorted on a grid, they only have to look up other boids from neighboring grid cells. Smaller grid cells, even fewer look-ups. That did speed things up quite a bit, and I could simulate 131K at 20FPS. Somehow this gave me a strong feeling that I wasn't at the hardware limit, and that there must be some issue in the code.
It's been a while since I took that GPU programming course, but roughly from what I remember: at some point it got a lot easier to squeeze more cores into hardware than to squeeze more speed out of a core, and with GPUs in particular it turns out you can squeeze a lot more processing units in if they are all performing the same instruction at the same time. If you want to run a linear algorithm on a GPU, it will blaze through millions of data points like it was nothing. Even when you go quadratic, things still run pretty smoothly, which is why it wasn't necessarily surprising that the spatial sorting increased performance, but not by that much. If you can eliminate data points from the list of what needs to be processed, that is in theory a plus, but the bane of all GPUs is thread divergence. If one group of cores goes one way at a particular if-block and another takes the else, what happens in practice is that you have to run the same code twice: first you run all the threads where the condition is met, and then all the threads where it is not. The more flow control you use, the more divergence, and the bigger the hit on performance.
Still, that should not have happened here. The boids are sorted, so on each pass the threads should be processing boids that are next to each other, and looking up the same neighboring grid cells. There might be some divergence, but things should be mostly staying in-sync, which again should mean much higher performance. What could possibl... oh, that motherfucker - he's not actually using the sorted indices. When a thread selects a boid to process, it just goes through the unsorted buffer containing the boid data (position, velocity, etc). Since the boids can move around there's no guarantee that they're next to each other at any given time. Sure, the sorting helps, because when you're looking up the neighbors, you don't have to check on all of them for each boid, but like I was saying above I was getting divergence up the wazoo.
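To make the indexing concrete, here's a rough CPU-side C++ sketch of the difference (the real code is a compute shader, and names like bin_reindex are just illustrative):

```cpp
#include <cstdint>
#include <vector>

struct Boid { float x, y, vx, vy; };

void step_boid(std::uint32_t thread_id,
               std::vector<Boid>& boids,
               const std::vector<std::uint32_t>& bin_reindex, // boid indices sorted by grid cell
               bool use_bins)
{
    // The buggy version stops here and walks the unsorted boid buffer,
    // so "adjacent" threads end up in unrelated grid cells and diverge.
    std::uint32_t my_index = thread_id;
    if (use_bins)
        my_index = bin_reindex[my_index];  // read boids in bin-sorted order instead

    Boid& me = boids[my_index];
    // ... look up the neighboring cells, accumulate separation/alignment/
    // cohesion from the boids in them, then update me.vx / me.vy ...
    (void)me;
}
```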
Well, one "if (use_bins) my_index = bin_reindex.data[my_index];" later, that 131K simulation was running at 100FPS. I can go as high as 262K at 40FPS. By 524K the shader program crashes; by the looks of the error my GPU doesn't have enough memory, though the message is not very clear.

Not a lot of code written, but that was quite fun. Now I want to see if I can reimplement my old simulation. "Follow player, collide with each other" should be a bit easier on the GPU, since it necessarily requires that the boids are separated and spread equal-ish across the grid. I'd also like to do something about the spaghetti GDScript. Let's see how it goes.
Day 19 of NaNoWriMo. I crossed the 30k mark last night. This has been a very bipolar experience: I find myself wildly vacillating between "this book fuckin slaps" and long dark nights of the soul. I swear I'm hitting every point on this graph several times a week. I'm on an upswing now: between the amount I wrote on the train this morning and the amount I wrote on my lunch break, I've completed nearly half of my daily quota today already. Can't wait to get home and finish today's quota.
Funnily enough, I'm actually feeling pretty good about the style of the book on a sentence-to-sentence, paragraph-to-paragraph level. It's only the story, pacing, etc. that I'm unsure about. But I feel emotionally invested enough in my characters that I'm legitimately feeling guilty about the fate that's soon to befall one of them, which must be a promising sign. No turning back now, just a week and a half left. The only thing I'm disappointed about is that I was looking forward to having a first draft by November 30th, but realistically at 50k words I'll probably only have 60-66% of a first draft. Planning to take December 1st off, then resume writing on the 2nd, maintaining the pace of 1,667 words/day until I have a complete draft. Hopefully that means I'll be done a week before Christmas.
Oh yeah writing is brutal for this. Just wait til you begin showing your work to a larger audience, the swings get even more violent.