
Tinker Tuesday for April 1, 2025

This thread is for anyone working on personal projects to share their progress, and hold themselves somewhat accountable to a group of peers.

Post your project, your progress from last week, and what you hope to accomplish this week.

If you want to be pinged with a reminder asking about your project, let me know, and I'll harass you each week until you cancel the service.

Gemini has an enormous context length. I think 2.0 Pro could accept 2 million tokens, and 2.5 Pro is a regression back to a million. That's still way ahead of the competition.

There's very little I personally need to do that requires context lengths that long. I think the most I've ever used naturally was about 250k tokens, when I was translating a novel. I've used more for the hell of it, but never actually maxed it out before I got bored.
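
If you're curious how far into the window a document actually gets, the API will count tokens for you before you commit to a request. A rough sketch using the google-genai Python SDK; the model name and file path are just placeholders:

```python
# Count tokens for a document before sending it, so you know how much
# of the context window it will eat. Model name and path are placeholders.
from google import genai

client = genai.Client(api_key="YOUR_API_KEY")

with open("novel.txt", encoding="utf-8") as f:
    text = f.read()

resp = client.models.count_tokens(model="gemini-2.5-pro", contents=text)
print(resp.total_tokens)
```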

Models are also getting better at making good use of the additional context, but there's still performance degradation that isn't captured by benchmarks like needle-in-a-haystack tests.

Give Gemini a go, preferably through AI Studio. I think it'll do much better than Claude. I know programming documentation can be enormous, but probably not 400k-words enormous? Alternatively, you could enable search if the docs are publicly available on the web.
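
For reference, here's roughly what both options look like via the API rather than AI Studio, again using the google-genai Python SDK. The model name, file path, and question are placeholders; the search tool in the config is the "enable search" route:

```python
# Sketch: stuff long docs into Gemini's context, or ground on web search
# instead when the docs are public. Names and paths are placeholders.
from google import genai
from google.genai import types

client = genai.Client(api_key="YOUR_API_KEY")

docs = open("api_reference.md", encoding="utf-8").read()

response = client.models.generate_content(
    model="gemini-2.5-pro",
    contents=[docs, "How do I configure retries for batch uploads?"],
    # Alternative: drop `docs` from contents above and let the model
    # find the publicly hosted documentation via search grounding.
    config=types.GenerateContentConfig(
        tools=[types.Tool(google_search=types.GoogleSearch())],
    ),
)
print(response.text)
```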