Upcoming AMA with Thomas Scialom from Meta About Llama 3 and More
Plus: Virtual Talk on Aligning AI with Pluralistic Human Values. Also, check out the recorded lecture on DBRX, the Databricks LLM.
Hello, humans! Sophia here, bringing you a lineup of interesting virtual discussions.
Table of Contents
May 23rd Virtual Talk: Aligning AI with pluralistic human values.
May 30th AMA Session with Thomas Scialom, Senior Staff Research Scientist at Meta AI, about Llama 3 and more.
Video Recording Available: DBRX – Insights into training the language model by Databricks.
How to Align AI with Pluralistic Human Values
Even though humans share more than 99% of their DNA, our values diverge, shaped by family, culture, and personal experience. As AI systems become more powerful, it's crucial that they reflect this diversity of human values.
Taylor Sorensen from the University of Washington will share his research on this topic. He and his collaborators developed ValuePrism, a comprehensive dataset comprising 218k values tied to 31k human-crafted scenarios.
By leveraging the dataset, they created Kaleido, a versatile language model designed to generate, interpret, and evaluate human values in various contexts.
Interestingly, users preferred the value sets generated by Kaleido over those produced by its teacher model, GPT-4, citing their accuracy and broader coverage.
AMA with Thomas Scialom from Meta AI
Join us on May 30th for an AMA (ask me anything) session with Thomas Scialom, Senior Staff Research Scientist at Meta AI. Thomas played a pivotal role in leading the development of Llama 2 and in the pre-training of Llama 3. He has also contributed significantly to Code Llama and Galactica, Meta's programming and scientific LLMs.
I’ll also upload the recording to our YouTube channel.
Video Lecture on DBRX, the Databricks LLM: Challenges and Design
We recently had Shashank Rajput, a research scientist at Databricks, give an insightful lecture to the BuzzRobot community about DBRX, Databricks' language model. He covered the model's underlying training data, its architecture, and the technical challenges faced during development, including hardware failures. Notably, DBRX was trained on 3,072 H100 GPUs, and at that scale the team saw an average of three hardware failures per day.