I do think we will figure out a revolutionary new way of searching and indexing. I also think it's silly to try to stop technological innovation. If nothing else, humans are amazing at learning, advancing, and figuring new shit out, and it's useless to try to stop that. Cloning may or may not be wrong, but even if it's illegal in the States, some crazy Italian or Zimbabwean or Japanese scientist is going to figure it out anyway, right?
I'm not so sure machines will pick up HAL-like traits of power-hunger or greed or anything else. I think it's entirely possible that AI machines will win the hearts of the world by lacking those traits. They'll be perfect, nice, good, productive, smart citizens of the world, and exactly THAT is why they'll become popular, spread, and eventually take over. Their logic will be unassailable: "You humans cause pain to each other, kill, lie... we don't. Why shouldn't we be in charge?" The world will be a better place when humans are gone.
Or will it? I don't think AI beings will have the capacity to create art. "Who needs art when you have world peace and prosperity?" the machines will ask. But I think a world without art is a shitty world indeed, so ultimately I think the world will be a much worse place.
Machines would have to consciously maintain the morality we program into them, though, because without human traits/creations such as conscience or religion or metaphysics, they're not going to have any reason to behave according to a certain code. Maybe they'll develop a utilitarian system of "whatever keeps society running smoothest is Right"?