Nvidia has introduced its own analogue of the chatbot ChatGPT. Its distinguishing feature is that it runs locally on a Windows computer; no internet access is required.
The principle of operation is simple: the user asks a question, and the AI answers or generates content on request. Within the program you can choose one of two large language models, Mistral or Llama 2. In addition, Chat with RTX can process videos from YouTube.
Journalists from The Verge have already tested the service and noted that Chat with RTX handles analysis tasks well. This can be useful for journalists and for anyone else who works with large amounts of information. For now, though, it is more an early demo than a finished product.
How to use
To run the chatbot, you need a GeForce RTX 30- or 40-series video card, at least 16 GB of RAM, and a substantial amount of disk space: the installation file alone weighs in at 35 GB, even though this is only a "technical demonstration."
If all conditions are met, you can download the neural network for free from the company's website.
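The requirements above can be summarized as a simple pre-flight check. This is a minimal sketch, not an official Nvidia tool: the function name, regex, and thresholds are illustrative assumptions based only on the figures quoted in this article.

```python
import re

# Stated minimums from the article (assumptions, not an official spec).
MIN_RAM_GB = 16
MIN_FREE_DISK_GB = 35          # the installer alone is 35 GB
SUPPORTED_SERIES = ("30", "40")  # GeForce RTX 30- and 40-series

def meets_requirements(gpu_name: str, ram_gb: float, free_disk_gb: float) -> bool:
    """Return True if a machine satisfies the published Chat with RTX minimums.

    gpu_name is matched loosely, e.g. "GeForce RTX 4070" -> model "4070",
    whose first two digits give the series ("40").
    """
    match = re.search(r"RTX\s*(\d{4})", gpu_name.upper())
    series_ok = bool(match) and match.group(1)[:2] in SUPPORTED_SERIES
    return series_ok and ram_gb >= MIN_RAM_GB and free_disk_gb >= MIN_FREE_DISK_GB

if __name__ == "__main__":
    # An RTX 40-series card with enough RAM and disk passes the check.
    print(meets_requirements("GeForce RTX 4070", ram_gb=16, free_disk_gb=40))
    # An RTX 20-series card fails, regardless of RAM and disk.
    print(meets_requirements("GeForce RTX 2080", ram_gb=32, free_disk_gb=100))
```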
Recall that in December 2023, Google introduced Gemini, its most capable artificial intelligence model to date, which, according to the company, "destroys" GPT-4. The chatbot will be gradually integrated into all of the company's products, from Pixel smartphones to the search engine and Chrome browser.
Even earlier, Elon Musk introduced Grok, a conversational chatbot inside Twitter (X) that is described as "like ChatGPT, only better." Unlike similar models, it leans into sarcasm and was designed to bring a bit of humor to its responses.