It has a trillion parameters.
At the ISC23 keynote, Intel announced Aurora genAI – a science-focused generative AI model with a trillion parameters, almost six times the size of the free, public version of ChatGPT. It will be trained on general text, code, scientific texts, and structured scientific data from biology, chemistry, materials science, physics, medicine, and other sources.
The model, developed in collaboration with Argonne National Laboratory and HPE, will support a range of scientific applications: designing molecules and materials, and synthesizing knowledge from millions of sources to suggest new experiments in systems biology, polymer chemistry, energy materials, climate science, and cosmology. It will also be used to accelerate the identification of biological processes related to cancer and other diseases and to suggest targets for drug design.
“The project aims to leverage the full potential of the Aurora supercomputer to produce a resource that can be used for downstream science at the Department of Energy labs and in collaboration with others,” said Rick Stevens, Argonne associate laboratory director.