A local LLM is preferred: users do not upload their private data to the internet.

The race for AI won’t be won by whoever offers the cheapest models, but by whoever is first to offer powerful local models that don’t require a constant internet connection. These models, small enough to run on personal devices, would eliminate the need for remote sessions like the one we’re using with ChatGPT today.