We screened an AI safety documentary in Parliament
An important update for PauseCon Brussels, the departure of our Organising Director, and four new national chapters.
Date Change for PauseCon Brussels
We’re changing the dates for PauseCon Brussels to 21-23 February 2026.
This was a difficult decision, but it will allow for a much improved event. The previous dates fell in the run-up to Christmas, which meant several attendees and speakers were unable to make it. Our new dates in February have secured the attendance of AI expert Stuart Russell, who will give a talk and join a panel discussion with several European lawmakers.
We’ve already notified those of you who have signed up, and we apologise for any inconvenience this change may have caused. As a reminder: if you have already arranged transportation to Brussels for December and are unable to get a refund, PauseAI can reimburse you. Please email Ella if this applies to you.
Whilst the dates have changed, the focus remains the same. PauseCon Brussels will be a brilliant opportunity to meet people working towards a Pause, engage in workshops on policy, communications, and organising, and take part in a large demonstration.
As previously mentioned, free accommodation is available for attendees on a first-come, first-served basis. Applications remain open here. (If you think you have a valuable session to add to the PauseCon agenda, also reach out to Ella.)
New National Chapters
The global impact of the race to build superintelligent AI requires global opposition. That’s why we’ve continued to grow the size and number of our national chapters since PauseAI’s formation in 2023.
Volunteers in Canada, Serbia, Romania, and India have recently formed their own groups, taking us to a total of 15 worldwide.
If you’re interested in joining or helping out with your national chapter, our website can point you in the right direction.
Volunteers are in the process of launching additional chapters in the Philippines and Nigeria. If you’re wondering why there’s not yet a chapter in your country, there’s a good chance you’re not the only one! Many national chapters have formed from discussions in our Discord server.
FLI’s call for a ban on superintelligence reaches 100,000 signatures
Last month, the Future of Life Institute published their Statement on Superintelligence, initially signed by hundreds of AI experts, politicians, and public figures.
As you’ll recall, it calls for a prohibition on the development of superintelligence, at least until there is:
broad scientific consensus that it will be done safely and controllably, and
strong public buy-in.
The letter received sweeping media coverage and social media attention, and 108,738 people have now added their name. Given that its ask is closely aligned with PauseAI’s statement, we were delighted to see such strong public support, and many of us within PauseAI have signed.
Yoshua Bengio, Stephen Fry, Grimes, Prince Harry, Geoffrey Hinton, Steve Wozniak, Kate Bush, and… you? Sign the letter here.
PauseAI hosts SB 1047 documentary screening in Parliament
Last month, PauseAI volunteers and MPs attended a screening of Michaël Trazzi’s SB 1047 documentary in the Houses of Parliament.
With the UK AI Bill delayed, and potentially not coming into force until 2027 at the earliest, it’s useful to look at the dynamics at play during the passage of the Californian AI safety bill SB 1047, which was ultimately vetoed by Governor Gavin Newsom.
UK Director Joseph Miller spoke before the screening on the lessons we can learn from the battle that took place between AI lobbyists on one side, and the majority of the public and the Californian legislature on the other.
Organising Director Ella Hughes set to depart
After a year as PauseAI’s first full-time employee, we’re sad to announce that our Organising Director, Ella, is set to leave at the end of the year.
Having worked in the union space, Ella brought much-needed expertise to the organisation, and has improved and professionalised many aspects of PauseAI.
She’ll remain involved as a volunteer, and we wish her the best in her future employment!
We are now hiring a new Organising Director to succeed Ella. Apply here.
Other news
King Charles personally handed Nvidia CEO Jensen Huang a letter on the dangers of AI surpassing human capability.
Satirical AI company Replacement.AI comes out and says what real AI companies can’t - “humans are no longer necessary. So we’re getting rid of them.”
Sam Altman was served a subpoena live on stage in San Francisco in relation to a trial of activist group StopAI.
Microsoft AI CEO Mustafa Suleyman has gone public with his concerns that many in AI want to build superintelligence to replace humans, but has unfortunately declared his intention to build “humanist superintelligence”, perhaps without understanding the difficulty of the problem.
AI Safety Camp 11 is open for applications.
Anthony Aguirre is running a creative contest in a search for media that summarises the ideas of his essay, Keep the Future Human. There’s over $100,000 in prize money available to the winners!
What we’ve been reading/watching
ControlAI’s Andrea Miotti wrote a brilliant piece in TIME on the need for a global movement against superintelligence.
YouTube veteran Hank Green does the double, opening up his huge audience to the dangers of the race to superintelligence.
The other If Anyone Builds It guy, Eliezer Yudkowsky, on Chris Williamson’s podcast.
PauseAI Comms Director Tom Bibby on Trish Wood’s podcast.
Siliconversations video on FLI’s superintelligence letter.
Thanks for reading PauseAI’s November newsletter. See you next month!