To be a sentient being is to feel and perceive independently of anything akin to a program telling a machine what to do under a specific set of conditions. It is hard to say with certainty that an AI will never become sentient, separate from its code. AI systems, however, are becoming more capable of "learning" from user input. I both develop and write about customizing application programming interfaces for a living, and recent advances are promising in terms of building a smarter AI, but there are still no signs of HAL that I can see.

As for Alan Turing and his test: it was an important idea in his day, but that day was more than half a century ago, and I don't think the Turing test is in any way definitive in determining sentience. Technical advances of the past half century have eclipsed Turing's method. AI programs today are very capable of interacting with a human being, but that is still a different kettle of fish from possessing sentience.