The Wednesday Wellness threads are meant to encourage users to ask for and provide advice and motivation to improve their lives. It isn't intended as a 'containment thread' and any content which could go here could instead be posted in its own thread. You could post:
- Requests for advice and/or encouragement, on basically any topic and for any scale of problem.
- Updates to let us know how you are doing. This provides valuable feedback on past advice/encouragement and will hopefully make people feel a little more motivated to follow through. If you want to be reminded to post your update, see the post titled 'update reminders', below.
- Advice. This can be in response to a request for advice, or just something you think could be generally useful for many people here.
- Encouragement. Probably best directed at specific users, but if you feel like encouraging people in general, I don't think anyone will object. I probably don't need to say this, but to be clear: encouragement should have a generally positive tone and not shame people (if you feel shame might be an effective motivational tool, please discuss it so we can form a group consensus on how to use it, rather than just trying it).
Hmm, basically all the libraries I listed, except maybe PyTorch, haven't changed all that much since 2021, so GPT-4 should really still be very useful with all of them. What it will have trouble with is a library like Transformers by Hugging Face, which lets you automatically download and use pretrained deep learning models. But even to use a super-high-abstraction library like that one, you still need a bunch of "glue skills": knowing how to load a .png image from your computer into a format that the high-level functions can understand, and how to interpret and visualise the output of those high-level functions. GPT-4 would be amazing for all of that.
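A minimal sketch of the kind of "glue skill" meant here: getting a .png from disk into the plain array format that high-level vision functions typically consume. The filename `example.png` is hypothetical; the snippet writes a dummy image first so it is self-contained, assuming Pillow and NumPy are installed.

```python
import numpy as np
from PIL import Image

# Hypothetical file: create a small dummy image so there is something to load.
Image.fromarray(np.zeros((32, 32, 3), dtype=np.uint8)).save("example.png")

# The actual glue: load the .png, force RGB, and convert it to an
# (H, W, C) float array scaled to [0, 1], which most pipelines accept.
img = Image.open("example.png").convert("RGB")
arr = np.asarray(img, dtype=np.float32) / 255.0

print(arr.shape)  # (32, 32, 3)
```

From here a high-level library's preprocessing step (resizing, normalising, reordering to channels-first) can take over; the point is just that someone has to bridge "file on disk" and "array the model understands".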
I think Transformers has been around for a while. LangChain, on the other hand...
That said, you can use search-enabled GPT or Bing for this stuff too.
But I have noticed GPT will sometimes mess up when setting up ML models. Stuff like mixing up the ordering of dimensions: the other day I wound up with a toy transformer model trying to learn a string-reversal task using positional embeddings that were spread across batches instead of across tokens. And on other tasks it makes all sorts of other minor errors. The code usually compiles, but it doesn't always do what you thought you asked for.
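A toy NumPy illustration (hypothetical shapes, not the actual model) of why that batch/token mix-up is so easy to miss: broadcasting happily accepts the positional embedding on either axis, so both versions run without error, but only one attaches an embedding to each token position.

```python
import numpy as np

# Square on purpose: when batch == seq_len, the bug is shape-silent.
batch, seq_len, d_model = 4, 4, 8
x = np.zeros((batch, seq_len, d_model))

# One embedding row per position: pos_emb[t] is filled with the value t.
pos_emb = np.arange(seq_len, dtype=float)[:, None] * np.ones(d_model)

# Correct: broadcast over the batch axis, so each *token* gets its position.
correct = x + pos_emb[None, :, :]

# Buggy: broadcast over the token axis, so each *batch element* gets one
# "position" and every token in a sequence shares the same embedding --
# useless for an order-sensitive task like string reversal.
buggy = x + pos_emb[:, None, :]

print(correct.shape, buggy.shape)   # both (4, 4, 8): shapes give no warning
print(np.allclose(correct, buggy))  # False: the values differ
```

Both lines "do something"; only comparing values (or watching the loss refuse to drop) reveals the difference, which is exactly the kind of debugging the original comment describes.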
It certainly gives you experience doing critical thinking during code debugging, though, while also doing a lot to help you learn new libraries. The loop goes from
Read demos, example code, and textbooks -> Write hopefully passable code -> Debug code
to
Have GPT-4 write weird, buggy, or technically-correct-but-silly code -> Debug code while asking GPT-4 / googling / inferring / testing what the functions do.
Which I much prefer.