a.i processor
Don't know if this is new news or old news, but some company looks to be trying to create a dedicated A.I. expansion card, for those elements that are common to A.I. tasks anyway. Interesting idea, but I'm not sure it will really work in the long run. Link to an Ars Technica post:
http://arstechnica.com/news.ars/post/20060905-7665.html I personally think it's a company trying to jump on the Ageia bandwagon. Time will tell, I guess. |
Re: a.i processor
A lot of new stuff is coming. That one was new to me, but look at what we've got:
a physics card (and ATI is about to do a similar thing on their graphics cards), a network card that's supposed to reduce traffic and, for games, make lower pings possible, and now the AI card. Well, I'm all for new tech, but some of it I don't think will be that useful; the CPU can handle it. |
Re: a.i processor
A network card with lower pings sounds more like a marketing slogan.
|
Re: a.i processor
Apparently the net adapter uses a special protocol to ensure the best routing for packets, and it prioritises game packets over general net traffic. I'll have a look and see if I can find the website again.
|
Re: a.i processor
An AI coprocessor? Well, maybe just another CPU :D ... Most AI tasks can't be reduced to a small set of simplified commands like the shaders on GPUs; AI coders are used to having the 'normal' possibilities of a general-purpose CPU. And if such boards did become available, they'd certainly need extra support from the programmer's side; there's no way this makes NPCs automatically smarter.
greetings from finland |
Re: a.i processor
Judging by some of the comments in their FAQ and elsewhere, they don't even understand how A* works, so I wouldn't put much faith in this thing.

An AI coprocessor could be very beneficial in theory: branch-heavy graph-search algorithms could benefit greatly from specialized hardware. In Game Programming Gems (6, I think) there is an implementation of A* on the GPU which turned out significantly faster. A more generalized AI processor would be pretty cool, but I think it's going to be much harder to bring one to market. Unlike PhysX, which can happily do eye candy only, so that lesser machines can still experience the main parts of the game, I figure you wouldn't be able to rely on this AI thing and still give those without it the full experience.

The industry needs talented AI programmers with more time to do AI more than it needs an AI processor, especially with dual and soon quad core at our disposal. If I had to choose, I'd prefer a physics accelerator that let me do shitloads of hardware-accelerated collision testing, which is one of the main expenses of much AI work: TraceLine, TraceHull, etc.

I think PhysX screwed themselves with their ridiculous $300 price tag; I could see paying up to $100 at most. Likewise, if an AI chip hits the market, it will probably be overpriced as well, ensuring crappy market penetration, which in turn makes developers less likely to bother, which then gives gamers even less incentive to buy one.
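For reference, the A* algorithm mentioned above is easy to state in software; the expensive part is the branchy open-list bookkeeping that specialized hardware would supposedly target. Here's a minimal CPU-side sketch (the generic `neighbors`/`heuristic` callbacks and the grid demo are my own framing, not anything from the card's materials):

```python
import heapq
from itertools import count

def a_star(start, goal, neighbors, heuristic):
    """Textbook A*: neighbors(n) yields (next_node, step_cost) pairs."""
    tie = count()  # tiebreaker so the heap never compares node objects
    open_heap = [(heuristic(start), next(tie), 0, start, None)]
    came_from = {}                # node -> parent, doubles as closed set
    best_g = {start: 0}
    while open_heap:
        _, _, g, node, parent = heapq.heappop(open_heap)
        if node in came_from:     # already expanded with a better g
            continue
        came_from[node] = parent
        if node == goal:          # walk parents back to reconstruct path
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        for nxt, cost in neighbors(node):
            ng = g + cost
            if ng < best_g.get(nxt, float("inf")):
                best_g[nxt] = ng
                heapq.heappush(
                    open_heap, (ng + heuristic(nxt), next(tie), ng, nxt, node)
                )
    return None                   # goal unreachable

# Demo on a 5x5 open grid, 4-connected moves, Manhattan heuristic.
def grid_neighbors(p):
    x, y = p
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nx, ny = x + dx, y + dy
        if 0 <= nx < 5 and 0 <= ny < 5:
            yield (nx, ny), 1

path = a_star((0, 0), (4, 4), grid_neighbors,
              lambda p: abs(p[0] - 4) + abs(p[1] - 4))
```

On an open grid the shortest path from (0,0) to (4,4) takes 8 unit steps, so `path` holds 9 nodes. The per-node work here (heap pushes, dictionary probes, data-dependent branches) is exactly the irregular pattern that runs poorly on shader-style hardware.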
p.s. The network card you speak of is this joke http://www.killernic.com/KillerNic/ |
Re: a.i processor
A dual-core / multi-CPU system is better than this, and you're not restricted by their SDK. I am suspicious of a lot of their comments, and their "whitepaper" was just a rehash of their web page, with no technical specs or figures.
Sounds like a scheme to generate some money.... |
Re: a.i processor
I think the dual-core story is a bit hard to realize. It would mean the game engine has to know which pieces of code to execute where, and at the moment a game runs as a single thread on one CPU; your other core does nothing but run background/Windows tasks.
Unless the engine itself is written to split its work across threads, the second core goes to waste. I do know there are developments, but for now it's just a nice wet dream ;-) |
Powered by vBulletin® Version 3.8.2
Copyright ©2000 - 2025, Jelsoft Enterprises Ltd.