Where is the link you mentioned?
Is there an LLM just for math or stats?
For local models such as Llama, how can we evaluate the overall effectiveness of smaller vs. larger models?
Mixtral is the best for local use on a regular/high-performance computer.
Running on device makes it free, and it's good enough for most tasks. Stuff like Gemini Nano is even multimodal on some devices.
Great
In the description
🤨
@tbleeker7987