Huge file, need to reduce
(OP)
I have been tasked with modeling a porous surface for one of our components. The surface has to be fully modeled in order to generate an STL file for making the part as a laser-sintered component. Attached is a part file (NX5) of the basic porous surface module. In order to create the full surface, I need to array the base module in a 125 x 50 array. The attached jpg shows a 25 x 10 array. Creating this smaller array, which is 1/25 of the required size, results in a 32 megabyte file.
Attempting to create the full size array results in an "Out of memory" error.
So I am open to all suggestions. Anything from "You're doing it wrong." to recommendations for environment settings and hardware upgrades.
RE: Huge file, need to reduce
Also turn off the silhouette lines ... darn it, I cannot find where that setting went!!!
RE: Huge file, need to reduce
Visualization Preferences > Visual > Edge Display Settings,
then uncheck Silhouettes and Smooth Edges. Your model may look goofy, but it will be OK.
For some reason I cannot uncheck those two settings because they are greyed out ... I will eventually figure out how to un-grey them.
RE: Huge file, need to reduce
John R. Baker, P.E.
Product 'Evangelist'
NX Design
Siemens PLM Software Inc.
Cypress, CA
http://www.siemens.com/plm
http://www.plmworld.org/museum/
To an Engineer, the glass is twice as big as it needs to be.
RE: Huge file, need to reduce
In your jpeg you actually have 11 x 26 instances, which leads me to think that you are in fact already using the Instance Geometry function. You need to subtract 1 from the number of duplicates to get the correct total; other array commands work differently.
With your part, setting the shading facets to coarse will lighten the load as the part increases in size.
I did manage to up the ante somewhat by getting to 125 x 25 before I too ran out of grunt. The method used was to create an instance array of 5 (total) and then unite the bodies; then I added 4 copies of that body in the opposite direction and united them to create a single 5 x 5 body. I repeated the process once more in each direction so that I had 25 x 25 of the original. The file was by now over 93 MB, but removing parameters will pare that down to just on 60 MB. I went another factor of 5 and managed to create the instance array before it fell over performing the unites.
The strategy I used improves your chances by reducing the required number of booleans but sooner or later there are limits to most things and in this case you'll need a machine with a lot of memory.
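To put rough numbers on how much the strip-of-strips approach saves, here is a back-of-envelope sketch in Python (not NX code; the function names, and the assumption that each unite joins one pair of bodies, are mine):

```python
def flat_unites(rows, cols):
    """Unite all rows*cols instances into one body, one pairwise
    boolean at a time."""
    return rows * cols - 1

def hierarchical_unites(rows, cols, block=5):
    """Strip-of-strips approach: unite `block` copies into a strip
    (block - 1 booleans), then treat that strip as the new unit and
    repeat, alternating directions, until each direction is covered."""
    def levels(n):
        lv = 0
        while n > 1:
            n = -(-n // block)  # ceil division: one more strip level
            lv += 1
        return lv
    return (levels(rows) + levels(cols)) * (block - 1)

print(flat_unites(125, 50))          # 6249 pairwise unites, done flat
print(hierarchical_unites(25, 25))   # 16 unites for the 25 x 25 body
print(hierarchical_unites(125, 50))  # 24 unites for the full array
```

The boolean count alone understates the cost, since each later unite acts on an ever larger body, but it shows why the hierarchical method gets so much further than uniting instances one at a time.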
Cheers
Hudson
RE: Huge file, need to reduce
Hudson, good catch on the number of instances in the jpeg. After a decade or so of using "instance feature" where it asks for the total number of instances, I am continually getting burned by "instance geometry" where it asks for the number of copies. Something about old dogs and new tricks.
As for the base module, it uses spheres and tubes to create the geometry. Would a model using polyhedrons create less memory overhead?
RE: Huge file, need to reduce
It may do very slightly, but I suspect the memory used is split two or more ways, largely between the model definition and the display definition. When you turn the facet display down to coarse, you reduce the number of facets so that the spherical portions are already effectively polyhedral. Trade that off against a model definition containing a larger number of faces and edges, and the benefits may balance out.
If you were to try it and the base model came out smaller, that would be sufficient to indicate that the array should also benefit. In that case, if I were you, I'd create your polyhedron by some quick and nasty method as a dumb solid, then scale it for the different sizes and generate two dumb copies of the base model to help you compare. I say so because we saw such a file size reduction when you removed parameters that I think the dumb solid is the more practical indicator. As a smart solid there may be more features in the design, and they might tend to contribute to file size and become misleading.
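For a feel of why coarse facet display helps so much, here is a rough estimate of display facet count versus chordal tolerance (my own approximation, not NX's actual tessellator):

```python
import math

def sphere_facets(radius, chord_tol):
    """Rough count of facets needed to display a sphere within a given
    chordal tolerance.  A facet spanning angle theta deviates from the
    sphere by about radius*theta^2/8, so theta^2 ~ 8*tol/radius, and
    the sphere's 4*pi steradians split into ~4*pi/theta^2 patches."""
    theta_sq = 8.0 * chord_tol / radius
    return int(4.0 * math.pi / theta_sq)

print(sphere_facets(1.0, 0.1))    # coarse display: ~15 facets
print(sphere_facets(1.0, 0.001))  # fine display: ~1570 facets
```

A fixed polyhedron, by contrast, has the same handful of faces at any tolerance, which is exactly the trade-off described above: fewer display facets, but more permanent faces and edges in the model definition.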
Cheers
Hudson
RE: Huge file, need to reduce
If you are using a Windows XP 32-bit machine, you might want to edit your boot.ini and add the /3GB option.
Here is the boot.ini for my computer:
[boot loader]
timeout=10
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /noexecute=optin /fastdetect
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional(3GB)" /noexecute=optin /3GB /userva=2800 /fastdetect
In my setup, the original boot.ini was:
[boot loader]
timeout=10
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /noexecute=optin /fastdetect
and I added the bottom line as an option when booting up.
Be careful: if something is wrong in this file, you may not be able to boot up your machine.
Roark
Productive Design Services
www.productivedesign.com
RE: Huge file, need to reduce
Cheers
Hudson
RE: Huge file, need to reduce
Ultimately, someone will have to develop an algorithm to produce the porosity in the SLA machine itself, rather than from model geometry. Not me, of course, I just draw things.
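As a proof of concept for generating the porosity programmatically instead of modeling it, here is a minimal sketch (plain Python; the square-strut simplification, grid size, and all dimensions are my assumptions) that writes a small strut lattice straight to ASCII STL, with no CAD booleans at all:

```python
def box_triangles(x0, y0, z0, x1, y1, z1):
    """12 triangles covering an axis-aligned box (a square-section strut)."""
    # vertex index bits: x*4 + y*2 + z
    v = [(x, y, z) for x in (x0, x1) for y in (y0, y1) for z in (z0, z1)]
    faces = [(0, 1, 3, 2), (4, 6, 7, 5), (0, 4, 5, 1),
             (2, 3, 7, 6), (0, 2, 6, 4), (1, 5, 7, 3)]
    tris = []
    for a, b, c, d in faces:          # split each quad face in two
        tris.append((v[a], v[b], v[c]))
        tris.append((v[a], v[c], v[d]))
    return tris

def write_stl(path, tris):
    """Write triangles as ASCII STL; most readers recompute normals."""
    with open(path, "w") as f:
        f.write("solid lattice\n")
        for t in tris:
            f.write(" facet normal 0 0 0\n  outer loop\n")
            for p in t:
                f.write("   vertex %g %g %g\n" % p)
            f.write("  endloop\n endfacet\n")
        f.write("endsolid lattice\n")

# a 3 x 3 grid of vertical struts, 0.5 wide on a 2.0 pitch, 5.0 tall
tris = []
for i in range(3):
    for j in range(3):
        x, y = 2.0 * i, 2.0 * j
        tris += box_triangles(x, y, 0.0, x + 0.5, y + 0.5, 5.0)
write_stl("lattice.stl", tris)
print(len(tris))  # 9 struts x 12 triangles = 108
```

Scaled up to the full 125 x 50 array, the same loops just run with bigger ranges: the STL grows linearly in triangle count instead of choking a feature tree with thousands of booleans.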
Attached is a jpeg of the final current structure. Kind of like a bunch of tepee poles.