Rakesh Kapoor (2003)

This AI Prediction was made by Rakesh Kapoor in 2003.

Predicted time for AGI / HLMI / transformative AI:

There can be no dispute with both his propositions: machines with greater-than-human intelligence may well be built in the next 50 years.

Opinion about the Intelligence Explosion from Rakesh Kapoor:

n/a

The future of humanity with AGI / HLMI / transformative AI:

Unlike the learned Brahmins of the Panchtantra story, we hope that the learned scientists and philosophers of today will be more than smart and [skillful], and learn from the collected wisdom of human experience, so that they do not outsmart themselves, and us all.

Flycer’s Secondary Explanation:

Today's scientists and philosophers should learn from the collective wisdom of human experience, so that they avoid outsmarting themselves and the rest of us. The Panchtantra story serves as a reminder of this.

Source: https://kundoc.com/pdf-when-humans-outsmart-themselves.html

Keywords: Machines, Intelligence, Wisdom