Alan Hajek (2011)

This AI Prediction was made by Alan Hajek in 2011.


Predicted time for AGI / HLMI / transformative AI:

Types of advanced artificial intelligence: AGI (AI that can perform many tasks at a human level), HLMI (more advanced AI that surpasses human intelligence in specific areas), and transformative AI (AI that could significantly impact society and the world).




Alan Hajek’s opinion about the Intelligence Explosion:



Flycer’s explanation for better understanding:

Not provided



The future of humanity with AGI / HLMI / transformative AI:

[Transcription: Even if you shouldn’t expect the singularity, even if those arguments don’t go through, just a small probability of the singularity requires serious attention]


Flycer’s Secondary Explanation:

The singularity may or may not occur, and the arguments for and against it may or may not succeed. Hajek’s point is that this uncertainty does not license ignoring it: even a small probability of so consequential an event warrants serious consideration.




Alan Hajek is a philosopher and logician who has made significant contributions to decision theory and the philosophy of artificial intelligence. He is currently a professor of philosophy at the Australian National University in Canberra, Australia.

Hajek’s work on decision theory has focused on developing a framework for making rational decisions under uncertainty, which has important applications in fields such as economics and artificial intelligence. His work on AI has explored the ethical and societal implications of advanced AI systems, particularly with respect to issues of safety and control.

Hajek has published numerous articles and books on decision theory, probability theory, and the philosophy of science. He has received several awards for his research, including the Humboldt Research Award and the Royal Society of New South Wales Medal.


Keywords: Singularity, Arguments, Attention