

Doesn’t that conflict with their AGPL license?
How dare you point out how bad things are. Why can’t you be more positive. Now everyone can have a nice new Tesla. It’ll bring jobs and green energy and reduce pollution. Why do you always have to be so negative all the time. People have dreams of driving a nice big car and sitting in traffic for 48 hours a day. Why can’t you be more grateful?
/s
Thank you for all your insight!! This is really helpful
I need to mess with tabbyapi. Doesn’t help that there are like two tabbys, one is tabbyapi and the other is tabbyml. I’m guessing tool support is still in its infancy.
Maybe… But no, not in this scenario.
All I can say is, if all your friends jumped off a cliff, would you jump off as well?
I mean of course the VCs will say that. They aren’t the ones working. ¯\_(ツ)_/¯
Damn! Thank you so much. This is very helpful and a great starting point for me to mess about to make the most of my LLM setup. Appreciate it!!
Dude! That’s so dope. I would really like your insights into how you tuned the MoE setup. That would be a game changer, since you can swap unnecessary layers out of the GPU and still get the benefit of using a bigger model.
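For context, the blunt version I’m doing today is just whole-layer offload with llama-cpp-python, roughly like this (the model path, layer count, and context size are placeholders from my setup, so treat it as a sketch rather than your exact numbers):

```python
from llama_cpp import Llama

# Keep only part of the model on the GPU and spill the rest to system RAM.
# n_gpu_layers is the main knob: -1 puts everything on the GPU, lower numbers
# offload more layers to the CPU.
llm = Llama(
    model_path="models/mixtral-8x7b-instruct.Q4_K_M.gguf",  # placeholder path
    n_gpu_layers=20,  # tune this down until the model fits in VRAM
    n_ctx=8192,       # context window; bigger means a bigger KV cache in VRAM
)

out = llm("Explain mixture-of-experts routing in two sentences.", max_tokens=128)
print(out["choices"][0]["text"])
```

What I’d love to hear about is the finer-grained side, i.e. which parts of an MoE model are actually safe to park in system RAM without tanking throughput.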
Yeah, it’s a little hard to do inference in these limited-VRAM situations with larger contexts. That’s a massive pain.
You’re good. I’m trying to get larger context windows on my models, so I’m trying to figure that out and balance it against token throughput. I do appreciate your insights into the different use cases.
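To show what I mean by balancing it, here’s the kind of throwaway script I’ve been using to time generations at a few context sizes (paths and sizes are placeholders, and it’s nowhere near a rigorous benchmark):

```python
import time

from llama_cpp import Llama

PROMPT = "Summarize the plot of a heist movie in one paragraph."

# Rough throughput check: same model and prompt, different context window sizes.
for n_ctx in (4096, 8192, 16384):
    llm = Llama(
        model_path="models/llama-3-8b-instruct.Q4_K_M.gguf",  # placeholder path
        n_gpu_layers=-1,  # all layers on the GPU if they fit
        n_ctx=n_ctx,
        verbose=False,
    )
    start = time.time()
    out = llm(PROMPT, max_tokens=256)
    elapsed = time.time() - start
    tokens = out["usage"]["completion_tokens"]
    print(f"n_ctx={n_ctx}: {tokens / elapsed:.1f} tok/s")
    del llm  # drop the reference so the model can be freed before the next load
```

It at least makes the context-size versus tokens-per-second tradeoff visible before I commit to a setup.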
Have you tried larger 70b models? Or compared against larger MoE models?
Not to be that guy (he says as he becomes that guy), but the GPL is not a permissive license; BSD and MIT are. Though IMO the GPL is the better choice, and probably the best license.
Also what models and use cases did you run it for? And what was your context window?
And now I’m a wizard!
Lolol yes, it’s a weird straight circle indeed.
However, applying those fixes and then working out why they fixed it is a great way to learn how to troubleshoot and unb0rk your system.
Treat the response like you would one from an LLM: it needs to make sense to you, and you need to make sure they aren’t messing with you or giving you a fix that only works in their case. Usually the best fixes are the simple ones. And it seems like even with the longer ones you’re able to figure out your simple fixes, which is awesome!
They can’t consume human food. So caffeinated blood. Source: What We Do in the Shadows
Tbh education was failing students long before AI. It started as soon as a degree requirement became the barrier to entry for a job. Companies don’t care about what you learnt, just that you have a degree and a high GPA. Students are just playing the game appropriately. Learning stopped being the objective of education past a certain point.
Luke’s face is hilarious
I just map it to the ABXY layout that Steam suggests. Works great when I wanna keep using the trackpad or joystick at all times.
You can also host it yourself.
That means they are gonna rewrite everything? Damn. That’s heavy