If I'm not mistaken, Apple is developing and expanding its own Siri database, but Siri does rely on external services and portals (Google, Wolfram Alpha, Yelp, et al.) depending on how you phrase the question, or what type of question it is.
As far as I know, Apple will keep growing that database and improving Siri's responses and interpretation; I seem to remember reading that the intent is to develop it to the point where it no longer depends on external internet-based searches the way it does now.
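For what it's worth, here is a tiny, purely hypothetical sketch (in Python) of the kind of routing I mean: classify the question, then hand it off to whichever external service fits. The categories, keywords, and service mapping are my own placeholders, not anything Apple has documented.

```python
# Hypothetical sketch: routing a spoken question to a backend by type.
# The categories and service names are placeholders for the idea above,
# not a description of how Siri actually works.

import re


def classify_question(text: str) -> str:
    """Very rough keyword-based classifier standing in for real intent parsing."""
    text = text.lower()
    if re.search(r"\b(weather|outside|rain|temperature)\b", text):
        return "weather"
    if re.search(r"\b(restaurant|food|coffee|bar)\b", text):
        return "local_business"
    if re.search(r"\b(calculate|convert|how many)\b", text):
        return "computation"
    return "web_search"


# Placeholder mapping from question type to an external service,
# mirroring the Google / Wolfram Alpha / Yelp split mentioned above.
BACKENDS = {
    "weather": "weather service",
    "local_business": "Yelp",
    "computation": "Wolfram Alpha",
    "web_search": "Google",
}


def route(question: str) -> str:
    kind = classify_question(question)
    return f"'{question}' -> {kind} -> {BACKENDS[kind]}"


if __name__ == "__main__":
    for q in ["What is it like outside?",
              "Find me a good coffee shop nearby",
              "How many ounces are in a liter?"]:
        print(route(q))
```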
Comparing it to what Vlingo or other Android voice services do is a bit silly. You could certainly argue that Siri's canned voice responses are just "ear candy" on top of similar functionality for, I don't know, 60% of what it does, but Siri still provides a noticeably higher level of usability than what Google currently offers.
Simply asking it about your schedule, or a somewhat vague question like "what is it like outside?", and having it respond directly and in good detail is quite impressive.
It seems to me that they are aiming for something like IBM's Watson: advanced query interpretation with uncanny problem-solving capabilities. Granted, the device itself could never carry the hardware that would require (not in this decade, anyway), but perhaps Apple could use some serious cloud processing to ramp up Siri's functionality?
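To illustrate what I mean by cloud processing, here is a minimal, made-up sketch: the device just sends the transcribed query to a server and gets back a structured interpretation, so the heavy lifting never happens on the phone. The endpoint URL and response fields are invented for illustration only.

```python
# Hypothetical sketch of offloading interpretation to the cloud. The URL and
# JSON fields below are invented placeholders, not Apple's actual API.

import json
import urllib.request


def interpret_in_cloud(query: str) -> dict:
    """Send the raw query text to a (fictional) interpretation endpoint."""
    payload = json.dumps({"query": query}).encode("utf-8")
    request = urllib.request.Request(
        "https://example.com/interpret",  # placeholder endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        # Expecting something like {"intent": "weather", "slots": {...}}
        return json.load(response)


if __name__ == "__main__":
    try:
        result = interpret_in_cloud("What is it like outside?")
        print(result.get("intent"), result.get("slots"))
    except Exception as exc:
        # The placeholder endpoint is not real, so this branch is expected.
        print("No real backend configured:", exc)
```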
This is pure speculation on my part, and perhaps a misunderstanding of what Siri can actually do, but the potential seems profound.