
Zico Kolter, a Carnegie Mellon professor, leads OpenAI's safety committee, which has the power to halt AI releases as the company transitions to a for-profit entity.

Zico Kolter, a 42-year-old computer science professor at Carnegie Mellon University, leads OpenAI's Safety and Security Committee, a four-member panel with the power to halt AI releases deemed unsafe. Appointed over a year ago, Kolter saw his role become a key requirement for OpenAI's 2025 transition into a for-profit public benefit corporation, mandated by California and Delaware regulators. The agreements ensure that safety decisions take precedence over financial goals, granting Kolter full observation rights to the for-profit board's meetings and access to safety data. The committee addresses risks including AI misuse in weapons development, cyberattacks, and harm to mental health. While Kolter declined to confirm whether the panel has ever blocked a release, he emphasized the growing threat landscape of advanced AI. His leadership reflects heightened scrutiny as OpenAI navigates its shift from a nonprofit mission to a commercial enterprise.
