Hi Toby,

Try cfitsio v1r3060p1, which has a patch to fix a known problem in the large-file handling code. That is probably why you see different behavior from the RM.

However, even with this change, the unit test still fails on Windows. Unfortunately, I think the problem here is a fundamental one in cfitsio that is hard to fix: the long long handling relies on functions such as fseeko and ftello. These functions apparently have Windows equivalents, _fseeki64 and _ftelli64, but when I tried them they did not fix the problem the RM sees: fseek simply fails, and it does appear to be truncating the 64-bit offset. The unit test is one thing, but you may see some improvement if you use the patched version (v1r3060p1) in production code.

One thing to watch out for with the unit test is that it may not be an infinite loop. The test code creates an empty extension, then calls the cfitsio function that adds rows. Unfortunately, this process is extremely slow in cfitsio, but I wanted the unit test to do a meaningful test of the large-file handling. It just takes a while to run, and the RM may be more patient than you are ;)

James

Toby Burnett wrote:
> An update: a bit of test code shows that long long types are handled
> properly when passed to functions. I suspect that there is a compiler
> error involving the template function. Since it is an old compiler, I
> guess I had better update, but might this mean that the current win32
> builds are compromised?
>
> In fact the test_tip program gets into an apparent infinite loop after
> printing this line:
>
> Expected behavior: Opened large_file.fits, to test adding a large (>32
> bit) number of rows
>
> Looking at the RM, the test program does fail at that point, but there
> is more output which I don't seem to get.
>
> https://www.slac.stanford.edu/www-glast-dev/cgi/viewLog?sessionId=c49e356c2474ab5e6bc4cb64c1d75692&cpId=19636&package=tip&file=test_tip.exe.txt
>
> The previous version did not fail.
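For what it's worth, the portability issue above boils down to having a 64-bit-capable seek/tell pair on each platform. Here is a minimal sketch of such a wrapper, assuming MSVC's _fseeki64/_ftelli64 and POSIX fseeko/ftello (those are the functions named in the discussion; the seek64/tell64/check_seek names are illustrative, not cfitsio's actual code):

```cpp
// Illustrative portable 64-bit seek/tell wrapper. _fseeki64/_ftelli64
// (MSVC) and fseeko/ftello (POSIX) are the functions discussed above;
// the names seek64/tell64 are made up for this sketch.
#define _FILE_OFFSET_BITS 64  // make off_t 64-bit on 32-bit POSIX builds
#include <cstdio>

#if defined(_MSC_VER)
typedef __int64 offset64_t;
inline int seek64(std::FILE* f, offset64_t off, int whence) {
    return _fseeki64(f, off, whence);
}
inline offset64_t tell64(std::FILE* f) { return _ftelli64(f); }
#else
typedef long long offset64_t;
inline int seek64(std::FILE* f, offset64_t off, int whence) {
    return fseeko(f, off, whence);  // off_t is 64-bit via _FILE_OFFSET_BITS
}
inline offset64_t tell64(std::FILE* f) { return ftello(f); }
#endif

// Quick self-check: seek 5 GiB into a temporary file (no data is
// written, so no disk space is used) and confirm the offset survives
// the round trip instead of being truncated to 32 bits.
int check_seek() {
    std::FILE* f = std::tmpfile();
    if (!f) return -1;
    offset64_t big = 5LL * 1024 * 1024 * 1024;  // 5 GiB, > 32 bits
    if (seek64(f, big, SEEK_SET) != 0) { std::fclose(f); return -2; }
    offset64_t got = tell64(f);
    std::fclose(f);
    return got == big ? 0 : -3;
}
```

A plain fseek with a long offset would fail or truncate here on a 32-bit build, which matches the symptom the RM shows.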
>
> --Toby
>
> -----Original Message-----
> From: James Peachey [mailto:James.Peachey@nasa.gov]
> Sent: Sunday, February 01, 2009 6:06 PM
> To: Toby Burnett
> Subject: Re: stuck with tip problem
>
> What version of tip are you using? And what version of cfitsio?
>
> Toby Burnett wrote:
>> Hi James,
>>
>> I'm adding phi dependence to our exposure calculation, and am stuck
>> with strange behavior of tip. Specifically, in the healpix package
>> routine reproduced here:
>>
>> HealpixArrayIO::write(const HealpixArray & ha,
>>                       const std::string & outputFile,
>>                       const std::string & tablename, bool clobber)
>> {
>>     if (clobber)
>>     {
>>         int rc = std::remove(outputFile.c_str());
>>         if (rc == -1 && errno == EACCES)
>>             throw std::runtime_error(std::string(" Cannot remove file "
>>                 + outputFile));
>>     }
>>
>>     // now add a table to the file
>>     tip::IFileSvc::instance().appendTable(outputFile, tablename);
>>     tip::Table & table = *tip::IFileSvc::instance().editTable(
>>         outputFile, tablename);
>>
>>     // this is a work-around for a bug in tip v2r1p1
>>     std::stringstream ss;
>>     size_t size = ha[0].size(); // get individual size from first one
>>     ss << size << "E";
>>     std::string nbrbins = ss.str();
>>     table.appendField("COSBINS", ss.str());
>>     table.setNumRecords(ha.size());
>>
>>     // get iterators for the Table and the HealpixArray
>>     tip::Table::Iterator itor = table.begin();
>>     HealpixArray::const_iterator haitor = ha.begin();
>>
>>     // now just copy
>>     for ( ; haitor != ha.end(); ++haitor, ++itor)
>>     {
>>         size_t n = (*haitor).size(); // check individual size?
>>         (*itor)["COSBINS"].set(*haitor);
>>     }
>>
>>     // set the headers (TODO: do the comments, too)
>>     tip::Header & hdr = table.getHeader();
>>     setHealpixHeaderFields(ha, ha[0].nbins(), hdr);
>>
>>     hdr["THETABIN"].set(CosineBinner::thetaBinning());
>>     hdr["NBRBINS"].set(CosineBinner::nbins());
>>     hdr["COSMIN"].set(CosineBinner::cosmin());
>>     hdr["PHIBINS"].set(CosineBinner::nphibins());
>>
>>     // need to do this to ensure file is closed when pointer goes out
>>     // of scope
>>     return std::auto_ptr(&table);
>> }
>>
>> I'm now creating a table with many thousands of rows, each with a
>> single vector field. It has worked fine for years with a 40-bin
>> array, that is, a "40E" field.
>>
>> But when I increase that 40, even just to 80 (I want 640), the
>> resulting file seems to have the correct size, but fv does not see
>> that HDU. When I try to examine it with pyfits, I get this message
>> after a long time:
>>
>> In [2]: q = pyfits.open('test_livetimecube1.fits')
>> Warning: Required keywords missing when trying to read HDU #3.
>> There may be extra bytes after the last HDU or the file is corrupted.
>>
>> There are no errors during the writing, and I'm not able to trace the
>> code to a cfitsio call.
>>
>> This is exercised by the skymaps test program, by changing the "0" in
>>
>>     healpix::CosineBinner::setPhiBins(0); // non-zero to exercise phi bins
>>
>> to "1", say.
>>
>> I've updated, but not tagged, the packages healpix and skymaps, and
>> would appreciate any insight.
>>
>> --Toby
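For reference, the "40E" string built in the routine above is a FITS TFORM value: a repeat count followed by a type code, where E is a 32-bit IEEE float per the FITS standard, so a "40E" column occupies 40 × 4 = 160 bytes per row. A minimal sketch of that arithmetic (bytes_per_row is an illustrative helper, not part of tip or cfitsio):

```cpp
#include <cassert>
#include <cstdlib>
#include <string>

// Illustrative helper (not part of tip or cfitsio): parse a FITS TFORM
// value such as "40E" into its repeat count and compute the bytes per
// row for that column. 'E' is a 32-bit float in the FITS standard.
long bytes_per_row(const std::string& tform) {
    char* end = 0;
    long repeat = std::strtol(tform.c_str(), &end, 10);
    if (end == tform.c_str()) repeat = 1;  // a bare "E" means repeat 1
    assert(*end == 'E');                   // only handle 'E' columns here
    return repeat * 4;                     // 4 bytes per 32-bit float
}
```

So the 640-bin array Toby wants would give 2560 bytes per row, and even many thousands of rows stay far below any 32-bit size limit, which suggests the corruption comes from how the HDU is written rather than from raw file size.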