OpenAI's Safety Team Exodus: Ilya Departs, Leike Speaks Out, Altman Responds - Zvi Analyzes Fallout

"The Cognitive Revolution" | AI Builders, Researchers, and Live Player Analysis - A podcast by Erik Torenberg, Nathan Labenz

Dive into the intricacies of AI ethics and safety as we dissect the recent resignations from OpenAI's safety team. In this stimulating conversation, we unpack the challenges of aligning leadership vision with safety culture, explore the legal implications of non-disparagement clauses, and discuss the future of AI alignment and superintelligence. Tune in for a thought-provoking analysis of the responsible path forward in AI development.

SPONSORS:
Oracle Cloud Infrastructure (OCI) is a single platform for your infrastructure, database, application development, and AI needs. OCI has four to eight times the bandwidth of other clouds, offers one consistent price, and nobody does data better than Oracle. If you want to do more and spend less, take a free test drive of OCI at https://oracle.com/cognitive

The Brave search API can be used to assemble a data set to train your AI models and help with retrieval augmentation at the time of inference, all while remaining affordable with developer-first pricing. Integrating the Brave search API into your workflow translates to more ethical data sourcing and more human-representative data sets. Try the Brave search API for free for up to 2,000 queries per month at https://bit.ly/BraveTCR

Head to Squad to access global engineering without the headache and at a fraction of the cost: head to https://choosesquad.com/ and mention "Turpentine" to skip the waitlist.

Omneky is an omnichannel creative generation platform that lets you launch hundreds of thousands of ad iterations that actually work, customized across all platforms, with the click of a button. Omneky combines generative AI and real-time advertising data. Mention "Cog Rev" for 10% off: https://www.omneky.com/

CHAPTERS:
(00:00:00) Introduction
(00:04:46) Compute resources
(00:08:15) The straw that broke the camel's back
(00:13:35) Sponsors: Oracle | Brave
(00:15:42) Dwarkesh interview with John Schulman
(00:19:14) What should we do?
(00:22:47) Strengthening the bill
(00:25:11) Non-Disparagement Clauses
(00:30:48) Sponsors: Squad | Omneky
(00:32:33) Safety measures
(00:39:05) SOFONs
(00:43:22) AI movie concept
(00:47:24) Forking Paths
(00:49:44) Simulation Hypothesis
(00:53:56) Doomer