@tbleeker7987

I like that you addressed price for the first one. This is definitely a consideration for me.

@rickyS-D76

Where is the link you mentioned?

@cylurian

Is there an LLM for just math or stats?

@Deltacasper

For local models such as Llama, how can we assess the effectiveness of smaller vs. larger models as a whole?

@Panthorus

Mixtral is the best for local use on a regular or high-performance computer.

@idcrafter-cgi

Running on-device makes it free, and it's good enough for most tasks.
Stuff like Gemini Nano is even multimodal on some devices.

@ChristopherG-k4n

In the description

@paulojose7568

🤨