Originally posted by: her34
one that would increase the a.i. compared to using cpu's as much as gpu's have increased graphics?
Yes, the classic example is Deep Blue. However, the thing to keep in mind is that the increase in graphics quality due to video cards has been as much on the software side as the hardware side, and this would be even more true for AI code.
Creating a framework to overlay AI hardware, a sort of Direct3D for AI, would be an extremely difficult and thankless task, because not only do you need manufacturer buy-in, you also need a big buy-in from developers, and that is the real sticking point.
See, getting them to conform to a standard like OpenGL or Direct3D is relatively easy because you can offer nicely packaged functionality: no matter what 3D application you are writing, the ability to easily draw a particular polygon on the screen is useful.
With AI though, you can't get nearly that specific. For example, almost every game uses some kind of pathfinding, but what kind of pathfinding is needed varies drastically with the game itself: how can the thing finding the path move? Can it jump over obstacles? Is it slow to turn? Does it need time to speed up or slow down? Does it have a limited time period in which it can move? Does it matter if other things "see" it? Does it have to avoid some things?
If the functions are too specific, they are useless for other purposes; if they are too generic, they are too hard to work with and to accelerate.
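To make that tradeoff concrete, here's a minimal sketch (my own illustration, not from the original post) of a "generic" A* pathfinder in Python. Notice that every game-specific question listed above — movement model, terrain costs, turn penalties, what to avoid — gets pushed out into the `neighbors`, `cost`, and `heuristic` callbacks, which means the library can't actually accelerate any of the interesting work. The grid setup at the bottom is a hypothetical example game world.

```python
import heapq
import itertools

def astar(start, goal, neighbors, cost, heuristic):
    # Generic A*: all game-specific rules live in the three callbacks,
    # so the "library" part is just heap bookkeeping.
    counter = itertools.count()  # tie-breaker so the heap never compares nodes
    frontier = [(heuristic(start), 0, next(counter), start, None)]
    came_from = {}               # node -> parent, filled when expanded
    best_g = {start: 0}          # cheapest known cost to reach each node
    while frontier:
        _, g, _, node, parent = heapq.heappop(frontier)
        if node in came_from:    # already expanded via a cheaper route
            continue
        came_from[node] = parent
        if node == goal:         # reconstruct path by walking parents back
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        for nb in neighbors(node):
            ng = g + cost(node, nb)
            if ng < best_g.get(nb, float("inf")):
                best_g[nb] = ng
                heapq.heappush(
                    frontier,
                    (ng + heuristic(nb), ng, next(counter), nb, node),
                )
    return None                  # goal unreachable

# Hypothetical game world: a 3x3 grid with one blocked cell.
WALLS = {(1, 1)}

def grid_neighbors(p):
    # 4-directional movement; a different game would rewrite all of this.
    x, y = p
    for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
        if 0 <= nx < 3 and 0 <= ny < 3 and (nx, ny) not in WALLS:
            yield (nx, ny)

path = astar(
    (0, 0), (2, 2), grid_neighbors,
    cost=lambda a, b: 1,                                  # uniform move cost
    heuristic=lambda p: abs(p[0] - 2) + abs(p[1] - 2),    # Manhattan distance
)
```

The point is that the reusable core is tiny; everything a hardware accelerator would want to speed up (neighbor generation, cost evaluation) is game code behind a callback, which is exactly why a "Direct3D for AI" is so hard to carve out.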
On a purely hardware level, I'd imagine the amount of synchronisation that would be necessary with the graphics hardware would be an ugly sticking point.