
Maximum file's size

Posted: Wed Mar 22, 2017 9:45 am
by skarab
Hello guys,

I have some files that exceed 8 GB and I can't open them. Is there a size limit for .bin files?

Thanks

Re: Maximum file's size

Posted: Wed Mar 22, 2017 10:25 am
by daniel
There's no limit on the file size (at least as far as I'm aware), but there is one on the maximum number of points (I think it's 2 billion points per cloud, but you'll generally hit the machine's memory limit before that).
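To put that cap in perspective, here is a rough back-of-the-envelope estimate. The ~16 bytes per point (3 float coordinates plus one scalar field) is my own assumption, not CC's exact internal layout, which also varies with colors, normals, etc.:

```python
# Rough memory estimate for a point cloud held in RAM.
# Assumed layout (not CC's exact one): 3 x 4-byte float coordinates
# plus one 4-byte scalar field = 16 bytes per point.
BYTES_PER_POINT = 3 * 4 + 4

def cloud_memory_gb(n_points: int) -> float:
    """Approximate RAM needed to hold n_points in memory, in GB."""
    return n_points * BYTES_PER_POINT / 1024**3

# Even the bare minimum for ~2 billion points is already ~30 GB:
print(round(cloud_memory_gb(2_000_000_000), 1))  # ≈ 29.8
```

So on a typical machine the RAM runs out long before the 2-billion-point limit is reached.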

You can increase the virtual memory limit, but in that case you'll get horrible performance (which should already be poor anyway with so many points). At least it will let you apply some decimation in command-line mode, for instance...
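For reference, a command-line decimation call could be put together like this. This is only a sketch: the option names (-SILENT, -O, -SS, -SAVE_CLOUDS) follow CloudCompare's command-line mode as documented on its wiki, but the file name and the 0.01 spacing value are made-up example values — check the wiki page for the options available in your version:

```python
# Sketch of a CloudCompare command-line subsampling (decimation) call,
# built as an argument list. Paths and the spacing are example values.
cloud_file = "huge_cloud.bin"   # hypothetical input file
spacing = 0.01                  # example minimum distance between kept points

cmd = [
    "CloudCompare",
    "-SILENT",                       # no GUI, no blocking dialogs
    "-O", cloud_file,                # open the cloud
    "-SS", "SPATIAL", str(spacing),  # spatial subsampling
    "-SAVE_CLOUDS",                  # save the resulting cloud(s)
]
print(" ".join(cmd))
```

Running the resulting command on the big file produces a decimated copy that should open comfortably in the GUI afterwards.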

Re: Maximum file's size

Posted: Wed Mar 29, 2017 5:03 pm
by dope
Is the limit (2 billion) still valid for 2.9beta? Is there a way to overcome it? I tried to decimate the cloud using the command line, but the program still crashes. :/

Re: Maximum file's size

Posted: Wed Mar 29, 2017 5:22 pm
by daniel
Yep, no change on this side.

If the file is a LAS file, then you can split it before loading it with the 'Split' tab of the dedicated LAS loading dialog.
If it's ASCII, you can also try to split the file into multiple clouds with the ASCII/text loading dialog (at the bottom). But in this case you'll need a lot of memory, as CC will still try to load all the points in memory (just spread over several clouds).

Re: Maximum file's size

Posted: Thu Mar 30, 2017 3:44 pm
by skarab
Thanks for the answer. In the end, it turned out my files were simply corrupted.

But this raises a new problem: my files are unusually large. For 11,000 points I have a .bin file of 1.8 GB. If I convert it to .e57 and resave it as .bin, the file is only a few MB.
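For scale, a quick sanity check on those numbers (the ~20 bytes per point — coordinates plus color and a scalar field — is my own rough figure, not CC's exact .bin layout, but the order of magnitude holds):

```python
# Sanity check: expected .bin size for a small cloud vs. the observed 1.8 GB.
# Assumed ~20 bytes per point; the real CC .bin layout differs slightly.
n_points = 11_000
bytes_per_point = 20

expected_kb = n_points * bytes_per_point / 1024
observed_gb = 1.8

print(f"expected ≈ {expected_kb:.0f} KB, observed = {observed_gb} GB")
# The file is thousands of times larger than the points alone could explain,
# so something besides the points must be stored in it.
```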

Do you have any idea why?

Sorry for my bad English, I'm French...

Re: Maximum file's size

Posted: Thu Mar 30, 2017 4:42 pm
by daniel
Where do these files come from? (which version of CC?)

Re: Maximum file's size

Posted: Fri Mar 31, 2017 6:50 am
by skarab
daniel wrote:Where do these files come from? (which version of CC?)
These files were created from .fls scans and saved with CC 2.8.1. The original file, with no modification and around 6 million points, is around 1.7 GB. After a few operations (section, cut, clean...) and, of course, after deleting all the temp files, my final file is bigger than the source...

After a few manipulations I found that resampling reduces the file size, but it deletes the SF, so it's not a good option for me.

Re: Maximum file's size

Posted: Fri Mar 31, 2017 3:26 pm
by daniel
The original FLS file is 1.7 GB with only 6 M points? Or is that the first BIN file?

For BIN files at least, the problem is that the scan grid (the gridded structure) is kept in memory with the point cloud. And each time you split the cloud, the scan grid is duplicated. If this grid is huge, the memory usage can explode quickly.
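As a rough illustration of why this explodes (the grid resolution and the per-cell cost below are invented numbers, just to show the scaling):

```python
# Illustrative estimate of scan-grid duplication when splitting a cloud.
# A scan grid stores one entry per scanner cell (rows x cols), even for
# cells that contain no point. All figures here are assumed values.
rows, cols = 10_000, 20_000      # hypothetical grid resolution
bytes_per_cell = 4               # assumed per-cell index size
n_splits = 8                     # each split keeps a full copy of the grid

grid_mb = rows * cols * bytes_per_cell / 1024**2
print(f"one grid ≈ {grid_mb:.0f} MB, after {n_splits} splits ≈ {n_splits * grid_mb:.0f} MB")
```

So a single large grid of a few hundred MB, copied into every split, can easily dominate the size of the saved BIN file.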

I believe the latest versions (2.8.1 or 2.9.beta) let you delete this scan grid (it's actually only useful for computing normals). The idea is to remove it right from the start to avoid duplication (and make the BIN file much smaller).

And as far as I know, resampling should not remove the scalar fields (maybe they're just deactivated?).

Re: Maximum file's size

Posted: Mon Apr 03, 2017 9:48 am
by skarab
daniel wrote:The original FLS file is 1.7gb with 6 M. points only? Or is it the first BIN file?
My first .bin, built from multiple .fls files.
daniel wrote:I believe the latest versions (2.8.1 or 2.9.beta) let you delete this scan grid (it's only useful to compute normals actually). The idea is to remove it right from the start to avoid duplication (and make the BIN file much slower).
How can I delete it?

Re: Maximum file's size

Posted: Mon Apr 03, 2017 10:54 am
by daniel
Something like 'Edit > Scan grids > Delete'?