Real Physical Inputs for Bots
  (#1)
The_Hard_Mission_Guy
This user broke our rules and has been BANNED
 
 
Status: Offline
Posts: 181
Join Date: May 2006
Real Physical Inputs for Bots - 01-06-2006

Hello folks,
Some time ago I was searching for a bot template that would simulate real physical sensory input (distance sensors, radar, or even a real-time visual camera)
rather than those nasty object pointers.
Wouldn't it be better to give the bots realistic data about their environment, instead of making them scan the server's entity list to find the position of the enemy, ammo, health, etc.?
  
  (#2)
@$3.1415rin
Council Member, Author of JoeBOT
 
 
Status: Offline
Posts: 1,381
Join Date: Nov 2003
Location: Germany
Re: Real Physical Inputs for Bots - 01-06-2006

Well, you might simulate those, but then you have to code something to analyse that input and build a list of recognized objects again, and then you're back at your 'nasty object pointers'.


  
  (#3)
Cheeseh
[rcbot]
 
 
Status: Offline
Posts: 361
Join Date: Dec 2003
Location: China
Re: Real Physical Inputs for Bots - 01-06-2006

All that stuff would require functionality from the engine that doesn't exist in many games; you could simulate it, but that requires even more computation. A camera would be impossible: the number of pixels involved is too much to examine in real time in software, so it would need dedicated hardware. Distance sensors are pretty limited but can be done easily with tracelines (see PMB's first RACC bot), although these also chomp up some CPU.
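As a rough illustration of those traceline-style distance sensors, here is a minimal standalone C++ sketch (not taken from any actual SDK): the traceDistance callback is a placeholder for whatever traceline call the engine really provides, and everything else is plain C++ just to show the shape of the idea.

[code]
// Sketch of a ring of "distance sensors" built from engine tracelines.
// The traceDistance callback is a placeholder for the real engine call
// (e.g. a traceline in the HL SDK); everything else is plain C++.
#include <cmath>
#include <cstdio>
#include <functional>
#include <vector>

struct Vec3 { float x, y, z; };

// Returns the fraction [0..1] of the ray from start to end that is clear.
using TraceFn = std::function<float(const Vec3& start, const Vec3& end)>;

std::vector<float> SenseDistances(const Vec3& origin, float yaw,
                                  float range, int numSensors,
                                  const TraceFn& traceDistance)
{
    std::vector<float> distances;
    distances.reserve(numSensors);
    for (int i = 0; i < numSensors; ++i) {
        // Spread the sensors evenly around the bot's current yaw.
        float angle = yaw + (2.0f * 3.14159265f * i) / numSensors;
        Vec3 end = { origin.x + std::cos(angle) * range,
                     origin.y + std::sin(angle) * range,
                     origin.z };
        // Unobstructed fraction of the ray -> distance to the nearest obstacle.
        distances.push_back(traceDistance(origin, end) * range);
    }
    return distances;
}

int main()
{
    // Fake trace: pretend there is a wall at 25% of the range in every direction.
    TraceFn fakeTrace = [](const Vec3&, const Vec3&) { return 0.25f; };
    Vec3 botOrigin = { 0.0f, 0.0f, 0.0f };
    for (float d : SenseDistances(botOrigin, 0.0f, 400.0f, 8, fakeTrace))
        std::printf("sensor distance: %.1f\n", d);
    return 0;
}
[/code]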

If you want to mess around with that kind of stuff, try www.cyberbotics.com (which I believe you've seen) and get a Webots trial to experiment with real robots and simulate them using physics. You can get cameras to play with, but they are very small, about 100 by 100 pixels at the most (any more and it becomes too slow); the program limits practically everything. You can use GPS to get locations and "cheat" at finding objects and your current position, but GPS is not very practical in a real-life situation.

Bots in computer games (I'm talking about Half-Life) have this stuff available directly, like the full entity list and nav-mesh data. Visual information is the way to go for real AI, but it is implausible in a game such as Half-Life; you might get one bot at the most, with very limited visual capability.

It's easier to find visible objects by using engine functions (such as the PVS in Half-Life) plus a field-of-vision check to filter the visible objects out of the full set, and then use other techniques to decide which ones are interesting. But there's still the problem of how to simulate light data. A camera would be ideal: if you could translate world positions to screen positions and examine the camera image at those positions to find objects of interest, it would be a way of incorporating some real input, though it would still need some artificial input to work.
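To make the "filter by field of vision first, then check visibility" idea concrete, here is a small illustrative C++ sketch; the Entity struct and the isLineOfSightClear callback are invented stand-ins for the engine's entity list and traceline, not real engine API.

[code]
// Sketch of "filter by field of vision first, then decide what is interesting".
// The Entity struct and the isLineOfSightClear callback are stand-ins for
// engine data (e.g. the PVS-filtered entity list and a traceline in HL).
#include <cmath>
#include <functional>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3  Sub(const Vec3& a, const Vec3& b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static float Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  Normalize(const Vec3& v)
{
    float len = std::sqrt(Dot(v, v));
    return len > 0.0f ? Vec3{ v.x / len, v.y / len, v.z / len } : Vec3{ 0.0f, 0.0f, 0.0f };
}

struct Entity { int id; Vec3 origin; };

// viewDir must be a unit vector; fovDegrees is the full horizontal FOV.
std::vector<const Entity*> FilterVisible(
    const Vec3& eyePos, const Vec3& viewDir, float fovDegrees,
    const std::vector<Entity>& candidates,
    const std::function<bool(const Vec3&, const Vec3&)>& isLineOfSightClear)
{
    std::vector<const Entity*> visible;
    // Cosine of half the FOV: a larger dot product means inside the view cone.
    float cosHalfFov = std::cos(fovDegrees * 0.5f * 3.14159265f / 180.0f);
    for (const Entity& e : candidates) {
        Vec3 toEntity = Normalize(Sub(e.origin, eyePos));
        if (Dot(viewDir, toEntity) < cosHalfFov)
            continue;                          // outside the view cone: cheap reject
        if (!isLineOfSightClear(eyePos, e.origin))
            continue;                          // blocked by world geometry
        visible.push_back(&e);
    }
    return visible;
}
[/code]

The cheap dot-product cone test runs first, so the more expensive line-of-sight check is only done for entities that are actually in front of the bot.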

I think you can translate world to screen coordinates in the Half-Life client, but I don't know how to get colour information from a position on the screen!

Last edited by Cheeseh; 01-06-2006 at 16:19..
  
  (#4)
The_Hard_Mission_Guy
This user broke our rules and has been BANNED
 
 
Status: Offline
Posts: 181
Join Date: May 2006
Re: Real Physical Inputs for Bots - 01-06-2006

Yes, I have been messing around with Webots for about a year or so...
It's the best simulation software I have ever seen, but the problem is that it isn't much fun just watching the bots without any interaction from your side (unless you code a Webots robot that handles your mouse and keyboard input, but that's another story).

When I look at all those beautiful games out there, I just feel frustrated that it's so complicated to connect your AI code to the engine. Sometimes it's even easier to make a new game engine from scratch than to implement your code inside an existing one.

Right now I'm stuck in a catch-22.
On one hand, I'd love to code a bot for HL, Quake, or any other game like those, but it's too difficult to build a real bot that sees and hears in a similar way to every other human player.

On the other hand, I could code a bot for Webots in no time, but as I said, it won't be as much fun as playing Counter-Strike against my bot.
  
  (#5)
DrEvil
Member
 
 
Status: Offline
Posts: 142
Join Date: Jan 2004
Location: Los Angeles, CA
Re: Real Physical Inputs for Bots - 02-06-2006

Don't think of AI in terms of operating just like a human player; no professional game AI programmer does. Think of AI in terms of providing a fun experience for the player. If it takes some less-than-realistic code to do so, then fine; it doesn't really matter.

If you're making bots for games, going overboard trying to make them operate on the same principles humans do will only bloat and slow your code and cripple your development. Leave that to academic researchers and concentrate on making a fun bot. There isn't a professional game AI programmer in the world who makes their game AI follow human perception models that closely. It's all about simulating human perception systems as simply and efficiently as possible within the requirements of the game.

If more complicated perception systems are what you want to experiment with, that's fine, but more often than not the simpler methods produce pretty much the same results, while the more complex methods eat more resources for little gain.
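As one concrete example of keeping perception simple and cheap, "hearing" in many game AIs is just an event list rather than any actual audio processing. The sketch below is purely illustrative; none of these names come from Omni-bot or any SDK.

[code]
// Sketch of a deliberately simple "hearing" model: the game pushes sound
// events and each bot keeps only the ones within earshot. All names here
// are illustrative; none of them come from Omni-bot or any SDK.
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

struct SoundEvent {
    Vec3  origin;
    float loudness;   // effective radius in world units
    int   emitterId;  // which entity made the noise
};

struct HeardSound { int emitterId; float distance; };

class SimpleEars {
public:
    explicit SimpleEars(const Vec3& listenerPos) : pos_(listenerPos) {}

    // Called by game code whenever something makes a noise (gunshot, footstep...).
    void OnSound(const SoundEvent& s)
    {
        float dx = s.origin.x - pos_.x, dy = s.origin.y - pos_.y, dz = s.origin.z - pos_.z;
        float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
        if (dist <= s.loudness)               // within earshot: remember it
            heard_.push_back({ s.emitterId, dist });
    }

    // The bot's think function drains this list once per frame.
    std::vector<HeardSound> TakeHeardSounds()
    {
        std::vector<HeardSound> out;
        out.swap(heard_);
        return out;
    }

private:
    Vec3 pos_;
    std::vector<HeardSound> heard_;
};
[/code]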

If you're interested you can play with my bot framework in several games to work on bot coding itself, rather than worrying about interfacing with the game.

http://www.omni-bot.com

My bot framework was essentially designed for people like you, who might not want to toy with interfacing with a game so much as they want to get right to the bot coding. You can also override the existing perception systems and all that if you want to experiment with different stuff than what it uses by default. Currently the framework works with Quake 4, Enemy Territory, and HL2.

Ultimately it depends on your goals: whether they are more academic and research-oriented in nature, or whether you are actually trying to make a fun and efficient bot AI.

My 2c


Omni-bot AI framework
http://www.omni-bot.com

Foxbot - for Team Fortress Classic
http://www.foxbot.net


  
  (#6)
The_Hard_Mission_Guy
This user broke our rules and has been BANNED
 
 
Status: Offline
Posts: 181
Join Date: May 2006
Re: Real Physical Inputs for Bots - 02-06-2006

A very wise and well-aimed post, DrEvil; I must applaud!
Actually I'm more of the research type of AI programmer, but that doesn't mean I don't like games... I love FPS games.
I will definitely take a look at your Omni-bot framework and see what I can do with it... otherwise I'll have to go back to Webots, LOL.
  
  (#7)
Cheeseh
[rcbot]
 
 
Status: Offline
Posts: 361
Join Date: Dec 2003
Location: China
Re: Real Physical Inputs for Bots - 02-06-2006

I've also gone from fun-coder to researcher, and it's becoming difficult for me to devise good algorithms for bots to work with now, because I want to add some real intelligence into RCBot2, and in a particularly efficient way, which is difficult with the capability of current hardware (and I had already written half of it before I went into research mode, so now I feel like re-writing a lot of it :o).

I think someone needs to write an article on how to model human behaviour in a game situation with regard to sensory input like vision, hearing, etc., and how to break it into pieces (like in PMB's second RACC bot again, hehe), while also adding intelligent sensing and filtering of the sensors of interest to drive the bot's behaviour.
  
  (#8)
The_Hard_Mission_Guy
This user broke our rules and has been BANNED
 
 
Status: Offline
Posts: 181
Join Date: May 2006
Re: Real Physical Inputs for Bots - 02-06-2006

I wish I could interface my bot to HL, Quake, etc. as easily as to Webots!
But it seems, as DrEvil mentioned before, that the gaming industry in general isn't interested in such realistic, perception-based AI.

The standard game engines are designed in a way that forces the developer to choose waypointed navigation over obstacle avoidance algorithms, which are impossible without some kind of distance sensor...
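For what it's worth, once you do have distance-sensor readings, a basic obstacle-avoidance step can be quite small. The following is only an illustrative C++ sketch, with traceFraction standing in for the engine's traceline:

[code]
// Illustrative obstacle avoidance using three traceline "feelers", i.e. the
// kind of distance-sensor input discussed here. The traceFraction callback
// is a placeholder for the engine's traceline.
#include <cmath>
#include <functional>

struct Vec3 { float x, y, z; };

// Returns the fraction [0..1] of the ray that is clear; 1.0 means no obstacle.
using TraceFn = std::function<float(const Vec3&, const Vec3&)>;

// Returns a yaw adjustment in radians: turn toward the side with more free space.
float AvoidObstacles(const Vec3& origin, float yaw, float feelerLen,
                     const TraceFn& traceFraction)
{
    const float spread = 0.5f;                 // roughly 30 degrees to each side
    float angles[3] = { yaw - spread, yaw, yaw + spread };
    float clear[3];
    for (int i = 0; i < 3; ++i) {
        Vec3 end = { origin.x + std::cos(angles[i]) * feelerLen,
                     origin.y + std::sin(angles[i]) * feelerLen,
                     origin.z };
        clear[i] = traceFraction(origin, end);
    }
    // Nothing close ahead on any feeler: keep going straight.
    if (clear[0] > 0.9f && clear[1] > 0.9f && clear[2] > 0.9f)
        return 0.0f;
    // Otherwise steer toward whichever side feeler reports more room.
    return (clear[0] > clear[2]) ? -spread : spread;
}
[/code]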

It's good to hear that I'm finding like-minded (or at least partially like-minded) individuals when it comes to AI bot development.

Some time ago I went down some fairly radical lines of thinking in order to make machine perception as close (or even identical) to that of humans...
I planned to set up a LAN of average-performance PCs able to run HL, HL2, Quake, etc. and use them as interfaces to other high-performance computers running the AI:
for example, by connecting each machine's screen output to a TV card on the corresponding AI computer, and then feeding the AI computer's output back into the mouse and keyboard sockets of the game-running machine... and so on for every PC on the LAN.

I never did it, because it's a project that is a little bit too extreme for one person to achieve!
But I'm sure it would be the ultimate project for combining fun and research.
  
  (#9)
DrEvil
Member
 
 
Status: Offline
Posts: 142
Join Date: Jan 2004
Location: Los Angeles, CA
Re: Real Physical Inputs for Bots - 04-06-2006

Don't get me wrong, I think some of the academic side of AI research is very interesting, but it's just not very useful in most games. You won't find much in the way of genetic algorithms or neural nets in FPS AI, though those concepts are useful in specific circumstances, such as racing games, and in games that have a specific need for learning algorithms, such as Black & White's use of perceptrons and the like.

A while back I was thinking about how interesting it might be to attempt some more realistic perception models for an AI, just for fun mainly, as I knew it wouldn't really have a practical use in a game.

I had thought about attempting to give a bot AI a real set of eyes and rendering the scene from the bot's point of view as cheaply as possible, likely a flat-shaded rendering of the scene to save the cost of texturing and such. Ideally lighting information would be rendered as well, giving the eyes a much more accurate picture of what a human would see from the same location. The point of this process would be a relatively low-resolution rendering of what the bot would see; I figured a 320x240 buffer would probably do fine.

Various entities might then have lower-resolution versions of themselves rendered to the same buffer in specific colors. Perhaps the color would have an identifier for the entity encoded into its alpha channel, so that in the next step, when the image is examined by a post-process, all pixels belonging to a particular entity could be accumulated into an overall level of visibility for that entity.

I figured that with an accurately rendered frame from the bot's perspective, complete with depth buffer information, there might be some interesting perceptual things available. For example, it would truly mean that when a target is hiding behind alpha geometry, the bot won't be able to see as much. This is a weakness in many games: bots are commonly able to see and shoot you through foliage and alpha geometry even though the player has the disadvantage of truly not being able to see the AI.

Additionally, if the game were such that lighting didn't play a large part, the rendering could be done with only Z-write enabled and the targets then rendered with color write enabled. This would likely save tons of fill rate and better support multiple bots running. If one were to visualize the resulting rendering in this case, it would likely look simply black (or the clear color), with colored areas where targets are visible.
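If it helps to picture that post-process step, here is an illustrative C++ sketch of the pixel-accumulation part only. It assumes a small buffer has already been rendered with one entity identifier per pixel (0 meaning nothing), which is an assumption layered on the description above rather than working code from any engine.

[code]
// Sketch of the post-process step only: walk a small ID-coded buffer and
// accumulate, per entity, how many pixels it covers. The buffer itself
// (one entity id per pixel, 0 = nothing) is assumed to come from the cheap
// flat-shaded render described above.
#include <cstdint>
#include <map>
#include <vector>

struct VisibilityReport {
    std::map<uint32_t, int> pixelsPerEntity;  // entity id -> visible pixel count
    int totalPixels = 0;
};

VisibilityReport AccumulateVisibility(const std::vector<uint32_t>& idBuffer,
                                      int width, int height)
{
    VisibilityReport report;
    report.totalPixels = width * height;
    for (uint32_t id : idBuffer) {
        if (id != 0)                          // 0 means "no entity at this pixel"
            ++report.pixelsPerEntity[id];
    }
    return report;
}

// An entity covering a few hundred pixels of a 320x240 buffer is clearly
// visible; one covering three pixels is barely peeking out from behind cover.
float VisibleFraction(const VisibilityReport& r, uint32_t entityId)
{
    auto it = r.pixelsPerEntity.find(entityId);
    if (it == r.pixelsPerEntity.end() || r.totalPixels == 0)
        return 0.0f;
    return static_cast<float>(it->second) / static_cast<float>(r.totalPixels);
}
[/code]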

Ultimately, even if it got that far, it wouldn't be terribly useful. Doing a few ray casts to the target would essentially do the same thing at much less cost, even though it suffers from the problem that you can potentially see the AI while it cannot see you, because the perception checks are relatively coarse. Consider that this would be an enormous cost just to give the bot more human-like vision; it could probably be done in a much faster and more efficient way. This is why games don't go crazy with these things, though individuals doing proofs of concept or working in robotics might, because they aren't usually on such tight budgets of time and processing cost.
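For comparison, the cheaper ray-cast version might look like the sketch below; the isClear callback is a stand-in for the engine's traceline, and the sample points (head, chest, feet) are just illustrative.

[code]
// Sketch of the cheaper ray-cast alternative: test a handful of points on
// the target (head, chest, feet) and treat the fraction of unblocked rays
// as a coarse visibility estimate. isClear stands in for the engine's traceline.
#include <functional>
#include <vector>

struct Vec3 { float x, y, z; };

float CoarseVisibility(const Vec3& eyePos,
                       const std::vector<Vec3>& targetSamplePoints,
                       const std::function<bool(const Vec3&, const Vec3&)>& isClear)
{
    if (targetSamplePoints.empty())
        return 0.0f;
    int clearCount = 0;
    for (const Vec3& p : targetSamplePoints)
        if (isClear(eyePos, p))
            ++clearCount;
    // 0.0 = fully hidden, 1.0 = every sample point visible.
    return static_cast<float>(clearCount) / targetSamplePoints.size();
}
[/code]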

There are a ton of ways to make something operate with more human-like functionality, but in the end, for gaming applications it is almost always overkill. Real robots often have the processing power of an entire computer at their disposal. Games are often budgeted to <= 20% CPU time for AI, due to the requirement to maintain specific framerates on target hardware; not only that, but many AIs have to be functioning efficiently within that budget. That doesn't leave a lot of room for advanced processing. For a research-minded person, though, I'm sure there is plenty of stuff that can be experimented with. I'm not trying to diss anyone's desire to do interesting stuff like this, just throwing out some opinions.


Omni-bot AI framework
http://www.omni-bot.com

Foxbot - for Team Fortress Classic
http://www.foxbot.net


  
  (#10)
The_Hard_Mission_Guy
This user broke our rules and has been BANNED
 
 
Status: Offline
Posts: 181
Join Date: May 2006
Re: Real Physical Inputs for Bots - 04-06-2006

I must admit, DrEvil, that I have only a passing knowledge of how graphics are rendered... but you made an interesting point!
Whatever the programmer codes, he or she always ends up circling around one major issue: the limit of computational resources!

But with the current trend of moving the graphics rendering work from the CPU to ever more powerful graphics cards, coders are going to be able to free up the CPU to implement more kick-ass AI, hurray!
Until then, though, if we want better AI, we should distribute the whole processing load over networks. I think players of Internet-based games like World of Warcraft would definitely benefit from it, since the opponents, whether human or otherwise, are already running on external machines.
  