Chat With RTX: a local chatbot powered by NVIDIA!


NVIDIA, the chameleon brand, is offering a chatbot that runs entirely locally: Chat With RTX. The chatbot is fully customizable and runs on RTX 30 and RTX 40 series GPUs… but not all cards are compatible!

Chat With RTX: a “ChatGPT-like” available locally!

Broadly speaking, this software works with NVIDIA RTX 30 and RTX 40 graphics cards. However, a minimum of 8 GB of VRAM is required, so the RTX 3050 6 GB is not compatible, which is a shame. You’ll also need to check your graphics driver version: GeForce driver 535.11 or higher must be installed.
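The requirements above can be sketched as a simple check. This is a minimal illustration only: the function name and its inputs are hypothetical, not part of any NVIDIA tool or API.

```python
def meets_chat_with_rtx_requirements(gpu_series: int, vram_gb: float, driver: str) -> bool:
    """Check the stated minimums: an RTX 30 or 40 series card,
    at least 8 GB of VRAM, and GeForce driver 535.11 or newer.
    (Hypothetical helper for illustration.)"""
    if gpu_series not in (30, 40):
        return False
    if vram_gb < 8:
        return False
    # Compare driver versions numerically, e.g. "535.11" -> (535, 11)
    driver_version = tuple(int(part) for part in driver.split("."))
    return driver_version >= (535, 11)

# The RTX 3050 6 GB falls short on VRAM:
print(meets_chat_with_rtx_requirements(30, 6, "546.33"))   # False
print(meets_chat_with_rtx_requirements(40, 12, "535.11"))  # True
```

In practice you can read the card name, VRAM, and driver version from `nvidia-smi` on a machine with an NVIDIA GPU.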

Because the chatbot runs locally, you won’t need an Internet connection to use it, unlike ChatGPT for example. You’ll also benefit from a personalized experience based on the datasets you feed it. For generating responses, you can choose between the Mistral and Llama 2 LLMs, and you can even use YouTube videos as sources.

Going forward, NVIDIA intends to boost the performance of its tool, delivering faster, more accurate answers. This gain is expected with TensorRT-LLM v0.6.0.

To download it, click the link below:

This way to the NVIDIA chatbot!