Wednesday, June 19, 2024
Windows news

NVIDIA Chat with RTX hands-on: This local AI chatbot already shows plenty of promise

NVIDIA does a lot of interesting things around AI, but its consumer-facing business is still predominantly focused on gaming. It’s now aiming to bring both categories together with the introduction of Chat with RTX, an AI chatbot that runs locally on your PC. The software leverages the Tensor Cores built into NVIDIA’s gaming GPUs (you’ll need an RTX 30- or 40-series card to use it) and uses large language models (LLMs) to provide useful insights into your own data.

The key difference is that unlike ChatGPT and Copilot, Chat with RTX runs entirely on your PC and doesn’t send any data to a cloud server. You feed it the relevant dataset, and it offers answers based on the information contained within. Another cool feature is that you can share YouTube links, and Chat with RTX will interpret the video’s content and answer questions, pulling the data from the video’s closed-captions file.
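The pattern described above — index local files, then answer a question from the most relevant one — is the retrieval step of a local chatbot like this. Here is a minimal, purely illustrative sketch of that step using simple TF-IDF scoring; real systems like Chat with RTX use LLM embeddings and NVIDIA's own tooling, and none of the names below are NVIDIA's API.

```python
# Illustrative sketch of local retrieval: given a question, pick the
# local document most likely to contain the answer. TF-IDF scoring is
# a stand-in for the embedding search a real local chatbot would use.
import math
from collections import Counter

def tokenize(text):
    return [w.lower().strip(".,!?") for w in text.split()]

def build_index(docs):
    """docs: mapping of filename -> text. Returns per-doc term counts and IDF weights."""
    counts = {name: Counter(tokenize(text)) for name, text in docs.items()}
    n = len(docs)
    df = Counter()
    for c in counts.values():
        df.update(c.keys())
    idf = {t: math.log(n / df[t]) + 1.0 for t in df}
    return counts, idf

def retrieve(query, counts, idf):
    """Return the document whose weighted term overlap with the query is highest."""
    q_terms = tokenize(query)
    def score(c):
        return sum(c[t] * idf.get(t, 0.0) for t in q_terms)
    return max(counts, key=lambda name: score(counts[name]))

# Hypothetical local dataset the user "feeds" the chatbot.
docs = {
    "gpu_notes.txt": "RTX GPUs include Tensor Cores that accelerate AI inference.",
    "recipe.txt": "Combine flour and water, then bake the bread at high heat.",
}
counts, idf = build_index(docs)
print(retrieve("which cores accelerate AI?", counts, idf))  # → gpu_notes.txt
```

In a full pipeline, the retrieved passage would then be passed to the local LLM as context so the answer stays grounded in your own data — the same idea behind answering questions from a YouTube video's caption file.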
