This AI prediction was made by Julius Lukasiewicz in 1974.
Predicted time for AGI / HLMI / transformative AI:
Opinion about the Intelligence Explosion from Julius Lukasiewicz:
Flycer’s explanation for better understanding:
The future of humanity with AGI / HLMI / transformative AI:
The survival of man may depend on the early construction of an ultraintelligent machine — or the ultraintelligent machine may take over and render the human race redundant or develop another form of life. The prospect that a merely intelligent man could ever attempt to predict the impact of an ultraintelligent device is of course unlikely, but the temptation to speculate seems irresistible.
Flycer’s Secondary Explanation:
The construction of an ultraintelligent machine could determine the survival of mankind. A merely intelligent human is unlikely to be able to predict the consequences of such a machine; nevertheless, the temptation to speculate on the potential outcomes is irresistible.
Julius Lukasiewicz (1919–2007) was a Polish-Canadian aerospace engineer and professor at Carleton University. He is best known for his writing on the "ignorance explosion", the idea that knowledge accumulates far faster than any individual or society can absorb it, and for his reflections on the consequences of advanced technology for industrial civilization. He should not be confused with Jan Łukasiewicz (1878–1956), the Polish logician and philosopher famous for Polish notation and pioneering work in many-valued logic.
Keywords: survival, ultraintelligent machine, speculation