
.:: Bots United ::. (http://forums.bots-united.com/index.php)
-   General Programming (http://forums.bots-united.com/forumdisplay.php?f=25)
-   -   fread and large amounts of data (http://forums.bots-united.com/showthread.php?t=365)

Lazy 15-01-2004 08:09

fread and large amounts of data
 
I'm writing a class to make BSP reading easier ( for me anyways ).
The problem is that on certain maps, reading the texinfo lump causes some serious bugs. What's weird is that it crashes when I try to free the memory the BSP class used ( overloaded delete operator ). If I comment out loading texinfo, it works fine.

I know it's a problem with large amounts of data because of the following scenario...

1. Tried to load ns_mineshaft - Crash
2. Tried to load ns_siege007 - Worked

A lot of the numbers produced by my class so far show siege to be quite small compared to the other maps ( when they loaded ).

Any suggestions?
I already tried using "open" instead of "fopen" to see if that would help, but with no luck. I also do not want to resort to using the Windows file I/O functions.

Any help is appreciated.

[ Added ]

Some test data, results are at the bottom of each section.

===========
ns_origin
===========
BSP Version: 30
Models: 202
Vertexes: 27503
Planes: 11196
Leafs: 6303
Nodes: 11418
TexInfo: 14150
ClipNodes: 30181
Result: Crashed
=============
ns_siege007
=============
BSP Version: 30
Models: 104
Vertexes: 5211
Planes: 1140
Leafs: 1065
Nodes: 1728
TexInfo: 337
ClipNodes: 4825
Result: Worked
=========
ns_lost
=========
BSP Version: 30
Models: 116
Vertexes: 17227
Planes: 14198
Leafs: 3488
Nodes: 6490
TexInfo: 3538
ClipNodes: 19090
Result: Worked

Pierre-Marie Baty 15-01-2004 11:49

Re: fread and large amounts of data
 
It may sound stupid, but are you sure you are allocating enough memory?

Otherwise, try using malloc(); I never had problems with that facility, meaning the bugs were always on my side.

Lazy 15-01-2004 11:53

Re: fread and large amounts of data
 
I'm almost positive...


Code:

 
bool CBSPReader::Allocate( void )
{
  // Note: a standard-conforming operator new throws std::bad_alloc on
  // failure rather than returning NULL, so std::nothrow (from <new>) is
  // needed for the checks below to ever fire.
  m_pHeader = new( std::nothrow ) dheader_t( );
  m_pModels = new( std::nothrow ) dmodel_t[ MAX_MAP_MODELS ];
  m_pVertexes = new( std::nothrow ) dvertex_t[ MAX_MAP_VERTS ];
  m_pPlanes = new( std::nothrow ) dplane_t[ MAX_MAP_PLANES ];
  m_pLeafs = new( std::nothrow ) dleaf_t[ MAX_MAP_LEAFS ];
  m_pNodes = new( std::nothrow ) dnode_t[ MAX_MAP_NODES ];
  m_pTexInfo = new( std::nothrow ) texinfo_t[ MAX_MAP_TEXINFO ];
  m_pClipNodes = new( std::nothrow ) dclipnode_t[ MAX_MAP_CLIPNODES ];
  if ( m_pHeader && m_pModels && m_pVertexes && m_pPlanes && m_pLeafs && m_pNodes
                && m_pTexInfo && m_pClipNodes )
  {
          return true;
  }
  return false;
}
bool CBSPReader::ReadLump( int iLump, void* ptr )
{
  if ( ptr != NULL )
  {
          fseek( m_pFile, m_pHeader->lumps[ iLump ].fileofs, SEEK_SET );
          // Check fread's return value: it returns the number of complete
          // elements read, so anything other than 1 here is a short read.
          if ( fread( ptr, m_pHeader->lumps[ iLump ].filelen, 1, m_pFile ) != 1 )
                return false;
          return true;
  }
  return false;
}

Those are two parts of the class. For now, everything is allocated when a BSP is loaded and freed on a call to delete. I tried using malloc as well, but it still came up with the same problem area in fread.

I have no clue why it crashes while freeing the memory, though; I do know from debugging that fread fails.

Pierre-Marie Baty 15-01-2004 12:27

Re: fread and large amounts of data
 
You do know that most of the MAX_MAP_STUFF #defines are arbitrary constants, right? Zoner used them in his compile tools, but they do not reflect the maximum capacity a map can have. You may want to increase them if you intend to read really large maps, especially some with lots of entities.

Here are the ones I use for my BSP reader. I don't read the entdata lump, but I'm pretty certain that the MAX_ for the entdata lump is arbitrary as well.
Code:

// BSP map file constants
#define MAX_MAP_HULLS 4 // hard limit
#define MAX_MAP_MODELS 400 // variable, but more would stress out the engine and network code
#define MAX_MAP_PLANES 32767 // more than this in a map and the engine will drop faces
#define MAX_MAP_VERTS 65535 // hard limit (data structures store them as unsigned shorts)
#define MAX_MAP_FACES 65535 // hard limit (data structures store them as unsigned shorts)
#define MAX_MAP_EDGES 256000 // arbitrary
#define MAX_MAP_SURFEDGES 512000 // arbitrary

I can almost bet your problem comes from here.

Lazy 15-01-2004 12:30

Re: fread and large amounts of data
 
[ Edit ]

Sorry, I skipped over the arbitrary parts ( bah, it's hard to read so early in the morning ).
Changing MAX_MAP_MIPTEX to 32767 solved the crashes. Thank you for the help.

Austin 15-01-2004 23:40

Re: fread and large amounts of data
 
I ran into this problem too and had to do a custom build of ripent to fix it.
I was writing this up when I saw you guys had just posted the solution.
It is as you say: it is caused by too small a value of DEFAULT_MAX_MAP_MIPTEX in bspfile.h.

Here is my post on VERC in the ZHLT section, from Oct 10, 2003.

"ripent.exe has been crashing on a number of maps when I try to export the entities.

Here is the problem:
LoadBSPImage()

This line:
g_texdatasize = CopyLump(LUMP_TEXTURES, g_dtexdata, 1, header);

crashes in CopyLump() because the map I am loading has a texture lump of 4,363,020 bytes,
which is being copied into memory that has been allocated with the following size:

bspfile.h
#define DEFAULT_MAX_MAP_MIPTEX 0x400000
// 4Mb of textures is enough especially considering the number of people playing the game
// still with voodoo1 and 2 class cards with limited local memory.

bspfile.cpp
int g_max_map_miptex = DEFAULT_MAX_MAP_MIPTEX;

Why isn't this memory allocated based on what the BSP file actually needs, instead of some hardcoded size?
I bumped DEFAULT_MAX_MAP_MIPTEX up another meg, and that fixed the crash on all the maps that were crashing on me.

Feedback anyone?"

I never got a response.


Actually, I had to double this value to be able to load all maps.
#define DEFAULT_MAX_MAP_MIPTEX 0x800000


A good test is cs_alpin. The texture data is huge on this map for some reason.

While you are at it, why don't you change over to dynamic allocation for this memory?
That would be the correct way to go.

PM, good "educated" guess!

Lazy 16-01-2004 01:28

Re: fread and large amounts of data
 
1 Attachment(s)
By dynamically allocating memory, you mean using the lump size instead of the constant value, right?

Anyway, PM's post helped out quite a bit, and the BSP reading class is now finished. It took me about 10 minutes to drop it into the client DLL to do a wireframe sort of effect using triapi.

Thanks again everyone, I uploaded a pic of it in action for no good reason.

Austin 16-01-2004 02:37

Re: fread and large amounts of data
 
Quote:

Originally Posted by Lazy
By dynamically allocating memory, you mean using the lump size instead of the constant value, right?

Once you load the file header you have all the sizes, but unfortunately they are ignored and the hardcoded sizes are used instead, which is bad.

Code:


void LoadBSPFile(const char* const filename)
{
  dheader_t* header;
  LoadFile(filename, (char**)&header);
  LoadBSPImage(header);
}
 
After the LoadFile(filename, (char**)&header);
Your header will have all of the ACTUAL sizes from the BSP file.
These sizes should be used instead of the hard coded ones.
 
void LoadBSPFile(const char* const filename)
{
  dheader_t* header;
  LoadFile(filename, (char**)&header);
  // alloc memory HERE based on the sizes from the header instead of hard coding them.
  // NOW we can safely call LoadBSPImage()
  LoadBSPImage(header);
}

So for example, instead of allocating like this:
g_dtexdata = (byte*)AllocBlock(g_max_map_miptex);

do it like this:
g_dtexdata = (byte*)AllocBlock(header->lumps[LUMP_TEXTURES].filelen);

It is the only sure way to know you can read any BSP without crashing, but it may not be worth the trouble depending on what you are trying to do.
bspfile.cpp needs to be rewritten, IMHO.

Cool hack. Thanks for the pic.
Hope this helps...

Lazy 16-01-2004 02:47

Re: fread and large amounts of data
 
I'll try that since it will save work down the line. Basically, my class loads BSPs like this...

Code:

 
bool CBSPReader::Open( const char* pszPath )
{
  m_pFile = fopen( pszPath, "rb" );
  if ( m_pFile != NULL )
  {
          if ( ReadHeader( ) == true )
          {
                return true;
          }
  }
  return false;
}
bool CBSPReader::ReadHeader( void )
{
  m_pHeader = new dheader_t( );
  if ( m_pHeader != NULL )
  {
          fread( m_pHeader, sizeof( dheader_t ), 1, m_pFile );
          return true;
  }
  return false;
}
bool CBSPReader::LoadData( unsigned int uiBits )
{
  if ( uiBits & LOAD_MODELS )
  {
          m_pModels = new dmodel_t[ MAX_MAP_MODELS ];
          if ( m_pModels == NULL )
                return false;
          ReadLump( LUMP_MODELS, m_pModels );
  }
  // ... and so on for the remaining lumps
}

Shouldn't be that hard to change since the header is loaded and verified before any other loading takes place.

Actually, the user must specify ( with a bitmask ) what parts of the BSP to load. That should take place after Open returns true.

Austin 07-06-2004 19:16

Re: fread and large amounts of data
 
Quote:

Originally Posted by Lazy
I'm writing a class to make BSP reading easier ( for me anyways ).
The problem is that on certain maps reading the texinfo lump will cause some serious bugs.

Well... it's been a while... But I finally got a week off and had time to do some things. High on the list of priorities was to write a slick program to edit entities.
Check it out.



http://austinbots.com/mapping/BSPEdit.jpg


You can DL it here:
http://austinbots.com/mapping/BSPEdit.zip

Some of the source may be interesting to you.
Get the full source and release build here:
http://austinbots.com/mapping/BSPEdit+Source!.zip

Oh Yea!



Powered by vBulletin® Version 3.8.2
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.