Thank you, Kanade.

Yes, the data file is 23 GB (a .dat file).

Yes, I can read smaller files. Since I run LES, the files are saved automatically. Fluent can read a 5 GB .dat.gz file, but if the .dat.gz file is larger than about 6 GB, I receive this error.

I am posting my HPC staff's reply here:

Lei, I have been unable to successfully read the 20+ GB .dat files. I made a very simple journal file that reads the .cas file and then one of the .dat files. No matter how I ran it (single-threaded, multi-threaded, etc.), it failed with something along the lines of:

Error: Invalid section id: �9
Error Object: #f
gzip: stdout: Broken pipe

When monitoring the jobs, I saw them use as much as 100 GB of RAM before exiting, but they were running on a node where I had requested 512 GB of RAM, so memory pressure was not an issue. I think at this point you may need to consult with your colleagues or with ANSYS to determine why these files cannot be read.
----------------------------------
The system info is

I have also submitted a request through the customer portal: 11067416291. Please help me with this issue; it has been bugging us for more than a year. My colleague's post is attached:
https://forum.ansys.com/discussion/15214/error-invalid-section-id-error-object/p1
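For reference, the "very simple journal file" my HPC staff described would be something along these lines. This is only a sketch; the file names here are placeholders, but the read commands are the standard Fluent TUI file-read commands:

```
; Minimal Fluent journal: load the case, then one of the large data files.
; "mycase.cas" and "mydata.dat.gz" are placeholder names for our actual files.
/file/read-case mycase.cas
/file/read-data mydata.dat.gz
/exit yes
```

The failure occurs during the read-data step, while gzip is decompressing the .dat.gz stream.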