
The BigMother Manifesto

Testimonials

Professor Leslie Smith

(University of Stirling, Associate Editor of the Journal of Artificial General Intelligence)

“Much of what passes for AI is simply neural networks, basically the same ideas as in the 1980s but upgraded with various bells and whistles like new activation functions, summarising levels, and feedback. Even the most sophisticated systems (like the GPT systems) lack novel ideas. Aaron’s work aims to push beyond this, to set an agenda for actual intelligent systems (so-called artificial general intelligence, AGI) that considers more than pattern recognition and synthetic language construction. This is quite different from what is pursued by companies, and most computing departments. The work is important, and may underlie the next steps forward in artificial intelligence.”

References

Professor Leslie Smith
(University of Stirling, Associate Editor of the Journal of Artificial General Intelligence)

Professor Pei Wang
(Temple University, Chief Executive Editor of the Journal of Artificial General Intelligence)

Abstract

The way in which AI (Artificial Intelligence), and in particular superintelligent AGI (Artificial General Intelligence), develops over the remainder of this century will most likely determine the subsequent quality of life of all mankind for all eternity. Due to the unique nature of superintelligence, we have one, and only one, chance to get it right. Imminent existential risks notwithstanding, this makes getting superintelligent AGI right the most important problem currently facing mankind. Accordingly, the goal of the BigMother project is to influence the current AGI trajectory (and thus the AGI endgame, and thus the fate of all mankind for all eternity) in order to achieve an endgame that is as close as possible to maximally-beneficent (and minimally-maleficent) for all mankind.

Latest draft

The BigMother Manifesto: A Roadmap to Well-founded Maximally-Aligned Maximally-Superintelligent AGI (Part 1)