
NVIDIA to Present 18 Research Papers at SIGGRAPH 2023

Most of the research papers are dedicated to generative AI and neural graphics.

NVIDIA has announced its plans for SIGGRAPH 2023, the upcoming computer graphics conference set to take place on August 6-10, stating that it intends to present 18 research papers on generative AI and neural graphics.

Prepared in collaboration with over a dozen universities in the US, Europe, and Israel, the papers are set to introduce novel generative AI models that turn text into personalized images, rendering tools that can turn images into 3D models, AI-powered neural physics models capable of simulating complex 3D elements, and neural rendering models for generating real-time visual details.

According to NVIDIA, the research they will present during SIGGRAPH will "enable creators in art, architecture, graphic design, game development and film to more quickly produce high-quality visuals for storyboarding, previsualization, and even production" and "help developers and enterprises rapidly generate synthetic data to populate virtual worlds for robotics and autonomous vehicle training".

Furthermore, the company has also shared a sneak peek at some of the research papers it will present, with the list including:

  • A method that can simulate tens of thousands of hairs in high resolution and in real time using neural physics.
  • An AI system that can learn a range of tennis skills from 2D video recordings of real tennis matches and apply this motion to 3D characters.
  • Tech that can generate and render a photorealistic 3D head-and-shoulders model based on a single 2D portrait.
  • A compact model that can take a handful of concept images to allow users to combine multiple personalized elements — such as a specific teddy bear and teapot — into a single AI-generated visual.
  • Neural texture compression that delivers up to 16x more texture detail without consuming additional GPU memory (a rough conceptual sketch follows this list).
  • And much more!
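
The announcement does not go into implementation details, but as a rough intuition for what "neural texture" approaches do in general, here is a minimal, hypothetical sketch: a tiny coordinate MLP is overfit to a single texture so texel colors can be decoded from UV coordinates on demand rather than stored explicitly. This is not NVIDIA's actual compression method; the network sizes, training loop, and the choice of PyTorch are all illustrative assumptions.

```python
# Hypothetical sketch of a coordinate-based neural texture: a small MLP
# maps (u, v) texture coordinates to RGB, fit to one toy texture.
# Generic illustration only, not NVIDIA's compression pipeline.
import torch
import torch.nn as nn

# Toy "texture": a 256x256 RGB image with values in [0, 1].
H = W = 256
texture = torch.rand(H, W, 3)  # stand-in for a real texture

# Tiny coordinate network: (u, v) -> (r, g, b).
model = nn.Sequential(
    nn.Linear(2, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 3), nn.Sigmoid(),
)

# Dense grid of UV coordinates in [0, 1]^2, one per texel.
v, u = torch.meshgrid(
    torch.linspace(0, 1, H), torch.linspace(0, 1, W), indexing="ij"
)
coords = torch.stack([u, v], dim=-1).reshape(-1, 2)   # (H*W, 2)
targets = texture.reshape(-1, 3)                      # (H*W, 3)

# Overfit the MLP to the texture with a plain MSE objective.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(200):  # a few steps, just to show the loop shape
    pred = model(coords)
    loss = nn.functional.mse_loss(pred, targets)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# At "render" time, any UV coordinate can be decoded on the fly.
sample_uv = torch.tensor([[0.25, 0.75]])
print(model(sample_uv))  # approximate RGB at that texture coordinate
```

In this toy setup the texture is reconstructed from the network weights alone, which conveys the general trade-off such methods explore: storage shifts from raw texel data to a compact learned representation that is decoded at sampling time.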

Click here to read NVIDIA's full announcement.
