It runs locally and can pull answers straight from your own files.
NVIDIA has released Chat with RTX, a new AI chatbot that runs locally on your PC and can provide information based on your files.
This is a demo app that "lets you personalize a GPT large language model (LLM) connected to your own content – docs, notes, videos, or other data." It uses retrieval-augmented generation (RAG), TensorRT-LLM, and RTX acceleration to answer your questions quickly, and because everything runs locally, your data never leaves your machine.
Image credit: NVIDIA
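If you're not familiar with RAG, the idea is simple: before the model answers, the app retrieves the snippets of your content most relevant to the question and feeds them to the LLM as context. The dependency-free Python sketch below illustrates that retrieve-then-prompt loop with a toy bag-of-words scorer; the names and scoring are purely illustrative and have nothing to do with NVIDIA's actual TensorRT-LLM pipeline.

```python
# Toy illustration of the RAG idea: score local text chunks against a question,
# then prepend the best matches to the prompt sent to a (local) LLM.
# This is a conceptual sketch, not how Chat with RTX is implemented.

import math
from collections import Counter


def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def retrieve(question: str, chunks: list[str], top_k: int = 2) -> list[str]:
    """Return the top_k chunks most similar to the question."""
    q_vec = Counter(question.lower().split())
    return sorted(
        chunks,
        key=lambda c: cosine_similarity(q_vec, Counter(c.lower().split())),
        reverse=True,
    )[:top_k]


def build_prompt(question: str, chunks: list[str]) -> str:
    """Assemble the augmented prompt that would be handed to the local model."""
    context = "\n".join(f"- {c}" for c in retrieve(question, chunks))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"


if __name__ == "__main__":
    docs = [
        "Chat with RTX requires an RTX 30 or 40 Series GPU with at least 8GB of VRAM.",
        "The demo can index local docs, notes, and YouTube video transcripts.",
    ]
    print(build_prompt("What GPU do I need?", docs))
```

A production pipeline would use real embeddings and a vector index instead of word counts, but the flow is the same: retrieve, augment the prompt, then generate.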
Aside from your documents, the AI can also pull in the transcripts of YouTube videos and answer questions based on them, which is pretty handy if you'd rather not sit through a long presentation.
If you'd like to check it out, you'll need an NVIDIA GeForce RTX 30 or 40 Series GPU or an NVIDIA RTX Ampere or Ada Generation GPU with at least 8GB of VRAM. I haven't had a chance to try the bot myself, but The Verge's Tom Warren notes that Chat with RTX can be handy even though it is "a little rough around the edges" at the moment.
"It wasn’t perfect for searching YouTube videos, though. I tried to search through the transcript of a Verge YouTube video, and Chat with RTX downloaded the transcript for a completely different video. ... When it worked properly I was able to find references in videos within seconds."
Image credit: NVIDIA
Overall, the AI summarizes well and can be useful in a variety of scenarios. Just be warned that this is still a demo with plenty of issues.
Check it out for yourself here.