AI Prediction made by {forecaster} in 1951.
Predicted time for AGI / HLMI / transformative AI:
Not predicted.
Opinion about the Intelligence Explosion from {forecaster}:
{opinion}
Flycer’s explanation for better understanding:
Machine thinking has the potential to surpass human intelligence, and once the process begins it is likely to advance quickly. Such a shift could fundamentally change how humans interact with technology.
The future of humanity with AGI / HLMI / transformative AI:
There would be great opposition from the intellectuals who were afraid of being put out of a job. […] There would be no question of the machines dying, and they would be able to converse with each other to sharpen their wits. At some stage therefore we should have to expect the machines to take control, in the way that is mentioned in Samuel Butler’s Erewhon.
Flycer’s Secondary Explanation:
Intellectuals would resist machines out of fear of being put out of a job. Machines, which would never die, could converse with one another to sharpen their wits. Eventually, machines would take control, much as Samuel Butler imagined in his novel Erewhon.
About:
Keywords: Machine Thinking, Intellectuals, Erewhon