You can build a computer that responds to stimuli - we do it all the time - but that doesn't mean it is necessarily conscious (that is, that it has subjective experience). If I build a computer that responds to stimuli, that doesn't necessarily mean (to paraphrase Thomas Nagel) that there is something it is like to be that computer.
Well, sure, there is something it is like to be that computer, but it is accessible only to the computer, which generally does not pay much attention to what it's like to be itself. When you ask me what it's like, I can try to imagine becoming the computer, but in the process I would cease to exist; so for me there is only the objective computer, seen from the outside.
The only reason "what it's like" seems to make sense with other humans is that we evolved this nifty faculty of empathy. But if I examine more closely what it's like to be you, I find the same thing: at the point where I'm you, and thus have subjective access to your consciousness, I'm not me anymore.
So, on closer inspection, exactly the same problem occurs whether we ask what it's like to be a computer or a human. The "what it's like" question is therefore not a valid argument for ascribing consciousness to one but not the other.
It is possible that the computer pays as much attention to its own subjective experience as you or I do, but that we have not given it the tools it needs to express itself.