fread and large amounts of data
I'm writing a class to make BSP reading easier ( for me anyways ).
The problem is that on certain maps, reading the texinfo lump causes some serious bugs. What's weird is that it crashes when I try to free the memory that the BSP class used ( overloaded delete operator ). If I comment out loading texinfo it works fine. I know it's a problem with large amounts of data because of the following scenario:

1. Tried to load ns_mineshaft - Crash
2. Tried to load ns_siege007 - Worked

A lot of the numbers produced by my class so far show siege to be quite small compared to the other maps ( when they loaded ). Any suggestions? I already tried using "open" instead of "fopen" to see if that would help, but with no luck. I also do not want to resort to using the Windows file I/O functions. Any help is appreciated.

[ Added ] Some test data; results are at the bottom of each section.

===========
ns_origin
===========
BSP Version: 30
Models: 202
Vertexes: 27503
Planes: 11196
Leafs: 6303
Nodes: 11418
TexInfo: 14150
ClipNodes: 30181
Result: Crashed

=============
ns_siege007
=============
BSP Version: 30
Models: 104
Vertexes: 5211
Planes: 1140
Leafs: 1065
Nodes: 1728
TexInfo: 337
ClipNodes: 4825
Result: Worked

=========
ns_lost
=========
BSP Version: 30
Models: 116
Vertexes: 17227
Planes: 14198
Leafs: 3488
Nodes: 6490
TexInfo: 3538
ClipNodes: 19090
Result: Worked |
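For context, a v30 ( Half-Life / GoldSrc ) BSP starts with a version int followed by a directory of 15 lumps. A minimal sketch of reading a lump sized from that directory, rather than into a fixed-size array, looks like this ( struct layout and the LUMP_TEXINFO index follow the public HL SDK headers; `readLump` is an illustrative helper, not the poster's class ):

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// GoldSrc (BSP version 30) layout: a 32-bit version followed by a
// directory of 15 lumps, each an ( offset, length ) pair in bytes.
constexpr int HEADER_LUMPS = 15;
constexpr int LUMP_TEXINFO = 6;   // index per the public HL SDK headers

struct Lump {
    int32_t fileofs;   // byte offset of the lump from the file start
    int32_t filelen;   // byte length of the lump
};

struct BspHeader {
    int32_t version;   // 30 for Half-Life maps
    Lump    lumps[HEADER_LUMPS];
};

// Size the buffer from the header instead of a MAX_MAP_* constant, so a
// big lump can never overrun a fixed-size array and corrupt the heap.
std::vector<uint8_t> readLump(std::FILE* fp, const BspHeader& hdr, int lump) {
    std::vector<uint8_t> buf(static_cast<size_t>(hdr.lumps[lump].filelen));
    std::fseek(fp, hdr.lumps[lump].fileofs, SEEK_SET);
    if (!buf.empty())
        std::fread(buf.data(), 1, buf.size(), fp);
    return buf;
}
```

The crash-on-delete symptom described above is consistent with a lump overrunning a fixed buffer: the overflow corrupts the allocator's bookkeeping, and the damage only surfaces later when the memory is freed.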
Re: fread and large amounts of data
It may sound stupid, but are you sure you are allocating enough memory?
Otherwise, try using malloc(); I never had problems with that facility - meaning, the bugs were always on my side. |
Re: fread and large amounts of data
I'm almost positive...
Code:
I have no clue why it crashes during the memory freeing, though. I know from debugging that fread fails. |
Re: fread and large amounts of data
You do know that most of the MAX_MAP_STUFF #defines are arbitrary constants, right? Zoner used them in his compile tools, but they do not reflect the maximum capacity of a map. You may want to increase them if you intend to read really large maps, especially ones with lots of entities.
Here are the ones I use for my BSP reader. I don't read the entdata lump, but I'm pretty certain that the MAX_ for the entdata lump is arbitrary as well. Code:
// BSP map file constants |
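The constants snippet above was cut off; for reference, limits of this kind appear in publicly available Quake/ZHLT-style bspfile.h headers roughly as follows ( illustrative values only - they vary between tool versions, and the bspfile.h shipped with your tools is authoritative; the DEFAULT_MAX_MAP_MIPTEX figure matches the one quoted later in this thread ):

```cpp
// BSP map file constants (ZHLT-style; arbitrary compile-tool buffer
// sizes, not limits of the v30 format itself -- a valid map can exceed
// them, which is exactly why fixed-size readers crash).
#define MAX_MAP_PLANES          32767
#define MAX_MAP_VERTS           65535
#define MAX_MAP_CLIPNODES       32767
#define DEFAULT_MAX_MAP_MIPTEX  0x400000   // "4Mb of textures is enough"
```

Note that ns_origin's clipnode count ( 30181 ) already sits close to the 32767 limit, which shows how little headroom these constants leave on large maps.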
Re: fread and large amounts of data
[ Edit ]
Sorry, I skipped over the arbitrary parts ( bah - it's hard to read so early in the morning ). Changing MAX_MAP_MIPTEX to 32767 solved the crashes. Thank you for the help. |
Re: fread and large amounts of data
I ran into this problem also and had to do a custom build of ripent to fix it.
I was writing this up when I saw you guys just posted the solution. It is as you say: it is caused by too small a value of DEFAULT_MAX_MAP_MIPTEX in bspfile.h. Here is my post on VERC in the ZHLT section, Oct 10, 2003:

"ripent.exe has been crashing on a number of maps when I try to export the entities. Here is the problem. In LoadBSPImage(), this line:

g_texdatasize = CopyLump(LUMP_TEXTURES, g_dtexdata, 1, header);

crashes in CopyLump() because the map I am loading has a size of 4,363,020 that is being copied into memory that has been allocated with the following size:

bspfile.h
#define DEFAULT_MAX_MAP_MIPTEX 0x400000 // 4Mb of textures is enough especially considering the number of people playing the game still with voodoo1 and 2 class cards with limited local memory.

bspfile.cpp
int g_max_map_miptex = DEFAULT_MAX_MAP_MIPTEX;

Why isn't this memory allocated by what is needed in the bsp file instead of some hardcoded size? I bumped DEFAULT_MAX_MAP_MIPTEX up another meg and it has fixed the crash on all maps that were crashing on me. Feedback anyone?"

I never got a response. Actually, I had to double this value to be able to load all maps:

#define DEFAULT_MAX_MAP_MIPTEX 0x800000

A good test is cs_alpin. The texture area is huge on this map for some reason.

While you are at it, why don't you change over to dynamic allocation for this memory? That would be the correct way to go.

PM, good "educated" guess! |
Re: fread and large amounts of data
1 Attachment(s)
By dynamically allocating memory, you mean using the lump size instead of the constant value, right?
Anyways, PM's post helped out quite a bit and the BSP reading class is now finished. It took me about 10 minutes to drop it into the client dll to do a wireframe sort of effect using triapi. Thanks again everyone, I uploaded a pic of it in action for no good reason. |
Re: fread and large amounts of data
Quote:
sizes are used instead, which is bad. Code:
g_dtexdata = (byte*)AllocBlock(g_max_map_miptex);

Do it like this:

g_dtexdata = (byte*)AllocBlock(header->lumps[LUMP_TEXTURES].filelen);

It is the only sure way to know you can read any BSP without crashing. But it may not be worth the trouble, depending on what you are trying to do. bspFile.cpp needs to be rewritten, IMHO.

Cool hack. Thanks for the pic. Hope this helps... |
Re: fread and large amounts of data
I'll try that since it will save work down the line. Basically, my class loads BSPs like this...
Code:
Actually, the user must specify ( with a bitmask ) what parts of the BSP to load. That should take place after Open returns true. |
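The bitmask idea might look something like this ( flag names and the helper are hypothetical; the class's actual API isn't shown in the thread ):

```cpp
// Hypothetical load flags; callers OR together the lumps they want
// parsed after Open() succeeds.
enum BspLoadFlags : unsigned {
    BSP_LOAD_VERTEXES = 1u << 0,
    BSP_LOAD_PLANES   = 1u << 1,
    BSP_LOAD_TEXINFO  = 1u << 2,
    BSP_LOAD_MODELS   = 1u << 3,
    BSP_LOAD_ALL      = ~0u,
};

// True when the caller's mask requests the given lump.
inline bool wantsLump(unsigned mask, BspLoadFlags flag) {
    return (mask & flag) != 0;
}
```

A caller that only needs geometry for a wireframe effect could then pass BSP_LOAD_VERTEXES | BSP_LOAD_PLANES and skip the texinfo lump entirely.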
Re: fread and large amounts of data
Quote:
Check it out. http://austinbots.com/mapping/BSPEdit.jpg You can DL it here: http://austinbots.com/mapping/BSPEdit.zip Some of the source may be interesting to you. Get the full source and release build here: http://austinbots.com/mapping/BSPEdit+Source!.zip Oh Yea! |