On Friday, a group of former OpenAI employees filed a proposed amicus brief supporting Elon Musk in his lawsuit against OpenAI. The suit seeks to block OpenAI's planned conversion from a nonprofit into a for-profit corporation.
The brief, filed on behalf of twelve former employees, including Steven Adler, Rosemary Campbell, and Neil Chowdhury, argues that if OpenAI's nonprofit ceded control of the company's business operations, it would fundamentally undermine OpenAI's mission.
Several of these former staff members have previously criticized OpenAI’s practices. Gretchen Krueger has called for the company to enhance its accountability and transparency. Daniel Kokotajlo and William Saunders have expressed concerns over OpenAI’s aggressive pursuit of AI dominance. Carrol Wainwright has stated that the company should not be trusted to act appropriately in the future.
OpenAI was founded as a nonprofit in 2015 and converted to a "capped-profit" structure in 2019, retaining a nonprofit parent that holds a controlling stake in its for-profit arm. The organization is now attempting to restructure once more, this time as a public benefit corporation.
Musk's lawsuit accuses the startup of abandoning its original nonprofit mission of ensuring that its AI research benefits all of humanity. Musk sought a preliminary injunction to halt OpenAI's conversion, but a federal judge denied the request; the case is expected to go to a jury trial in spring 2026.
The former employees' brief contends that OpenAI's current structure, with a nonprofit overseeing several subsidiaries, is central to its strategy and mission. Any change that dilutes the nonprofit's control, they argue, would contradict OpenAI's mission and breach the trust of employees, donors, and other stakeholders who supported the organization on the basis of those commitments.
The document emphasizes that OpenAI committed to several core principles in its charter, treating these commitments as binding internally. It argues that maintaining the nonprofit’s governance is crucial to ensuring that the benefits of artificial general intelligence (AGI) are prioritized for humanity, rather than for narrow financial interests.
According to the brief, OpenAI often used its unique structure as a means to attract talent, highlighting the nonprofit’s control as pivotal to fulfilling its mission. The brief mentions an all-hands meeting in late 2020 where OpenAI CEO Sam Altman purportedly underscored the significance of the nonprofit’s oversight in ensuring safety and broader societal benefits over immediate financial gains.
The former employees warn that if OpenAI converts to a for-profit, it could be incentivized to cut corners on safety and to concentrate the benefits of powerful AI among its shareholders. Such a move, they say, would also undermine the "merge and assist" clause in OpenAI's charter, which commits the organization to assist, rather than compete with, any similar project that achieves AGI first.
Beyond the former employees, several organizations, including nonprofits and labor groups like the California Teamsters, have urged California Attorney General Rob Bonta to prevent OpenAI’s transition, citing a failure to protect its charitable assets and mission. In December, the nonprofit Encode also expressed similar concerns in an amicus brief, highlighting risks to AI safety.
OpenAI has claimed that its conversion would preserve the nonprofit arm and provide it with resources for charitable initiatives in areas like healthcare and education. In exchange for giving up its controlling stake, the nonprofit reportedly stands to receive billions of dollars.
In a statement posted online, OpenAI said it intends to strengthen its nonprofit arm, not move away from it.
The stakes are high for OpenAI, which must complete its for-profit conversion soon or risk forfeiting some of the capital it has recently raised. OpenAI has been contacted for comment, and this article will be updated if the company responds.