What is this error in PyTables?

Wapiti

I am using PyTables via pandas in Python. I am trying to load a file with pandas.read_hdf(), but I am getting this nasty error. I hope I have not lost my 1.1 GB of irreplaceable data. I did not see any errors during the saving process; everything seemed to be working fine.

Can someone explain what this error is saying?

Also, is there any way to recover?
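For reference, the call itself is nothing special; it is roughly of this form (the file path and key here are placeholders), and the full traceback it produces is below:

import pandas as pd

# Roughly the read that fails; "data.h5" and "results" stand in for my real
# file path and key.
df = pd.read_hdf("data.h5", key="results")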

HDF5ExtError: HDF5 error back trace

  File "H5Dio.c", line 174, in H5Dread
    can't read data
  File "H5Dio.c", line 449, in H5D_read
    can't read data
  File "H5Dchunk.c", line 1729, in H5D_chunk_read
    unable to read raw data chunk
  File "H5Dchunk.c", line 2755, in H5D_chunk_lock
    unable to read raw data chunk
  File "H5Fio.c", line 113, in H5F_block_read
    read through metadata accumulator failed
  File "H5Faccum.c", line 254, in H5F_accum_read
    driver read request failed
  File "H5FDint.c", line 142, in H5FD_read
    driver read request failed
  File "H5FDsec2.c", line 720, in H5FD_sec2_read
    addr overflow, addr = 1108161578, size=7512, eoa=1108155712

Jeff

A similar question is here.

Bottom line: your file is borked, and there is no way to recover from this. This is exactly what the docs warn against (using multiple threads/processes as writers); see the docs here.

HDF5 is NOT threadsafe/process safe for writers.
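As a concrete illustration, here is a minimal sketch of one safe pattern (the file name, key, and DataFrame contents are made up): funnel every write through a single writer process over a queue, so only one process ever opens the HDF5 file.

import multiprocessing as mp

import pandas as pd


def writer(queue, path):
    # Only this process opens the file; it appends whatever the workers send.
    with pd.HDFStore(path, mode="a") as store:
        while True:
            item = queue.get()
            if item is None:        # sentinel: all workers are done
                break
            key, df = item
            store.append(key, df)


def worker(queue, i):
    # Workers never touch the file; they just enqueue their results.
    df = pd.DataFrame({"worker": [i] * 3, "value": range(3)})
    queue.put(("results", df))


if __name__ == "__main__":
    q = mp.Queue()

    w = mp.Process(target=writer, args=(q, "data.h5"))
    w.start()

    jobs = [mp.Process(target=worker, args=(q, i)) for i in range(4)]
    for p in jobs:
        p.start()
    for p in jobs:
        p.join()

    q.put(None)   # tell the writer to shut down
    w.join()

The point is simply that HDF5 ever sees one writer; reading the finished file afterwards with pandas.read_hdf() is fine.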
