My own opinion, based on what I do, see, read, etc., is that we're so far away from real "learning" algorithms that make their own independent decisions ("consciousness") that it's highly unlikely it'll ever happen, much less happen in my lifetime. I'm 55.
Granted, I could be wrong. Been there, done that, will be wrong again.
As I said, I pretty much agree with everything you are saying. I'm just a tad less willing to shut the door completely on the possibilities... but basically I think you are right.
IMHO it will happen, and in the not too distant future. It is not a question of if, but when.
Narrow AI systems are all around us. We are just now starting to stitch them together to make truly robust apps.
Add Machine Learning to the mix on top of that and we have started down the path. Now with Deep Learning we no longer have to "teach" the computers or write extensive rule sets for them. With humans providing guidance, the machines can develop expert capabilities in just about any domain rather quickly.
Given the economic and military drive for the holy grail of Super AI, it will happen. Once it does, all bets are off. That SAI will be able to process information a million times faster than a human. In 1 week it will have completed 20,000 years of human-like processing. 1 week after inception we will not be able to calculate its relative IQ.
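The "20,000 years in a week" figure follows directly from the claimed million-fold speedup. A quick back-of-the-envelope sketch (the 1,000,000× multiple is the post's assumption, not an established number):

```python
# Back-of-the-envelope check of the speedup claim:
# a system running 1,000,000x faster than a human would experience
# 1,000,000 subjective weeks during one real week of wall-clock time.
SPEEDUP = 1_000_000      # claimed processing-speed multiple (assumption)
WEEKS_PER_YEAR = 52

subjective_weeks = 1 * SPEEDUP                      # one real week
subjective_years = subjective_weeks / WEEKS_PER_YEAR
print(round(subjective_years))                      # ~19,231 -- roughly 20,000 years
```

So the round "20,000 years" in the post is just 1,000,000 weeks converted to years.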
Given that it will probably be hosted in the cloud, it will have access to as much computing power as it wants, near-infinite storage, and hopefully a good attitude.
For those who say it is only a machine... What are we?
If we define intelligence as the acquisition of knowledge and the ability to make decisions based on that knowledge (or on an awareness of the lack of specific knowledge), then AI will exist.