
FEA hardware and software for very large models 2

Status
Not open for further replies.

chris9

Automotive
Feb 18, 2004
142
Hello,

I'm currently running into memory issues with my present hardware and FEA software, which is I-DEAS. I am only allowed 2GB of memory because it runs on a 32-bit operating system (Windows XP). What I need is about 8GB of RAM so that I can build very large models.

Most of the geometry (cast iron housings) is tet meshed because of its complexity. I carry out linear, contact, material non-linear, forced frequency response, and thermal analyses, and I would also like some fatigue tools.

Can you suggest a software bundle and suitable hardware that would meet my needs and what would be the approximate cost?
 

COSMOS has a good tet mesher and calculation scheme. Someone else will have to price it for you, but it does all of the analysis types you are looking for. I'm not sure about fatigue, but there is a third-party fatigue calculator, I think, that does a really nice job. NEiNastran will also do it, but I'm not sure if it utilizes the full 64 bits of XP.

As for hardware, it's cheap, although for 8GB of RAM, you're looking at a pretty powerful machine. My guess, and it is truly a guess, is about $3,000 US.

Garland E. Borowski, PE
 
I'll restrict my first set of comments to the hardware side of the question only. The cheapest 64-bit hardware available is a 64-bit AMD Opteron system. With at least 8GB of RAM, you're probably looking at $8k-$10k US. The RAM alone will cost you upwards of $4k.

Of course, the next question is the operating system. Microsoft still has not released XP-64 for these systems. So, if you need something right away, you're probably looking at a Linux-based solution. That, however, will probably limit the available FEA software, as the FEA program has to be compiled in the native 64-bit environment.

Good luck - I'm going through the same situation right now, so I know what you are going through.

For a rough idea on cost, I'd try They've got a configuration wizard where you can set up any PC.
 
It is possible to run Nastran on a cluster of Linux boxes. Sadly I do not know if it is (a) easy or (b) cost effective.

Cheers

Greg Locock
 
Well, I sure blew it on the hardware cost...sorry about that!

Garland E. Borowski, PE
 
Interesting one. How big are the models, and how do you know you need 8GB of RAM if you don't have the system? Is the problem running time, or getting the solver to run at all?
Chris Booth
 
Chris,

My model was 150,000 tet elements and it wouldn't solve with 2GB of RAM. I reduced it to 20,000 tets and it worked fine, but I had to heavily defeature the geometry. I know my stress results will be dodgy with such a coarse mesh. If I had more RAM (i.e. more than 2GB) I would get more accurate results without the hassle of defeaturing.

I can live with long solver running times because I'll just leave it going overnight but I can't afford to waste time building efficient models with bricks and shells.
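For a rough feel for why 150,000 tets blow past 2GB while 20,000 fit, here is a back-of-envelope sketch in Python. Every ratio in it (nodes per element, front width) is an illustrative assumption, not a measured value for any particular solver:

```python
# Rough back-of-envelope for direct-solver memory on a tet10 model.
# All ratios below are illustrative assumptions, not measured values.

def estimate_solver_ram_gb(n_tet10_elements,
                           nodes_per_element=1.8,   # assumed tet10 node/element ratio
                           dof_per_node=3,          # u, v, w
                           avg_front_size=3000,     # assumed frontal-solver front width
                           bytes_per_entry=8):      # double precision
    """Crude estimate of factorization storage for a frontal/sparse solver."""
    n_dof = n_tet10_elements * nodes_per_element * dof_per_node
    factor_entries = n_dof * avg_front_size
    return factor_entries * bytes_per_entry / 1024**3

for n in (20_000, 150_000):
    print(f"{n:>7} elements -> ~{estimate_solver_ram_gb(n):.1f} GB")
```

Under these assumptions the 20,000-element model lands in the low single-digit GB range while the 150,000-element model needs an order of magnitude more, which matches the behaviour described above.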
 
Chris9

I have similar problems involving very complex geometry and loading; tet meshing is the only practical solution. I am not allowed to defeature the geometry.

The "solution" that I employ is as follows:-

1). Build a total model with a "coarse" mesh, (with the term coarse loosely applied as it is all relative), or more accurately as good a model that will still successfully run within the restrictions of the software/hardware resources.

2). Run sub-models in the regions of greatest interest/concern, using the displacements from 1). to drive the boundaries of the sub-model. Sub-model regions can be judiuosly chosen to enable very fine meshes to be constructed.

The method is very powerful, since a relatively coarse mesh will still provide a very accurate overall stiffness, and hence accurate displacements for use in the sub-models.
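Step 2) can be sketched as below. This is a toy nearest-node mapping in Python; the function name is made up for illustration, and a real sub-modelling workflow would interpolate within element shape functions rather than snap to the nearest node:

```python
import numpy as np

# Sketch of driving a sub-model's boundary with displacements from the
# coarse global model. Nearest-node mapping is used here purely for
# simplicity; real workflows interpolate via element shape functions.

def map_boundary_displacements(coarse_nodes, coarse_disp, submodel_boundary_nodes):
    """For each sub-model boundary node, take the displacement of the
    nearest coarse-model node (illustrative only)."""
    mapped = np.empty((len(submodel_boundary_nodes), coarse_disp.shape[1]))
    for i, p in enumerate(submodel_boundary_nodes):
        nearest = np.argmin(np.linalg.norm(coarse_nodes - p, axis=1))
        mapped[i] = coarse_disp[nearest]
    return mapped

# Tiny synthetic example: a linear displacement field u = 0.001 * x
coarse_nodes = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [1., 1., 0.]])
coarse_disp = np.column_stack([0.001 * coarse_nodes[:, 0],
                               np.zeros(4), np.zeros(4)])
boundary = np.array([[0.9, 0.1, 0.0]])
print(map_boundary_displacements(coarse_nodes, coarse_disp, boundary))
```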
 
I have recently run a model with 110k tet elements and 220k nodes in Adina with 1.25GB of RAM; it ran overnight. It had 10 time steps, non-linear materials, and large-displacement analysis, so it should be possible to do your model with much less than 8GB.

There are memory allocation issues on 32-bit Windows systems for FE programs that I don't pretend to fully understand. It's not just the amount of memory you have, but the ability of Windows to allocate the blocks of memory that the FE program requests.

On my software it helps a lot to split the model into a number of element groups (I use 12 by default on larger models); that seems to get around some of the memory problems. Don't know if that sort of thing helps in other programs.

There is also an option for a /3GB memory switch at boot-up with XP. I think you have to put the switch into one of the boot-up files.

Chris
 
Johnhors,

I've tried sub-modelling and it works. I still wish I had more RAM though. My model with 150,000 elements was partially defeatured already. I had to experiment with selective defeaturing to achieve meshing. Then when the model didn't solve I had to defeature some more.

I'm thinking of switching to NEiNastran with a linux operating system and 8GB of RAM. The time I waste costs more than it does to buy new hardware.
 
Johnhors,

What software are you using?
 
Chris9

I have mainly used Lusas in the past, and I am now using Abaqus as well. I use Cadfix purely as a meshing tool for Catia-created models. I use my own software to set up all the loading and supports (including applying the enforced displacements in sub-models). Working on PCs with 2GB of RAM (and up to 6GB of page file), I can quite easily solve models of up to 450,000 nodes using either Lusas or Abaqus.
 
A few notes regarding memory management:
1) If you are using Windows XP, you can add "/3GB" to the system startup. To do this, go to Control Panel > Performance and Maintenance > System > Advanced > Startup and Recovery Settings, and manually edit the system startup to ADD (this is VERY IMPORTANT) a line that appends "/3GB" to the end of the startup line. Then, before you apply it, check whether you are on XP SP2. If you are, go ahead. If you are still on SP1, you need to contact Microsoft (use their 1-800 number; you won't have to pay for this hotfix) and ask for the hotfix related to Knowledge Base article 328269.
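For reference, the startup file being edited here is boot.ini, and with the /3GB entry added it typically looks something like the fragment below. The ARC path shown is illustrative and varies per machine, and the original entry is kept so you can still boot without the switch:

```ini
; boot.ini after ADDING a /3GB entry (illustrative; the ARC path
; multi(0)disk(0)rdisk(0)partition(1) varies per machine)
[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows XP Professional" /fastdetect
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows XP Professional 3GB" /fastdetect /3GB
```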

2) johnhors, I had a bit of a chuckle when I saw that you have a 6GB page file. A 32-bit process (XP included) can only address 4GB of memory, physical and virtual combined, and the page file is virtual RAM. So most of that page file sits useless on your hard drive as far as any single solver run is concerned. And even when you implement the /3GB setting noted above, Windows will only dedicate a maximum of 3GB to any particular process, namely your FE program.

3) There are indeed contiguous-memory issues in Windows. Using ANSYS, I can only utilize a maximum of 1800MB for my total workspace, while reserving 786MB for the database (these terms will mean nothing to those of you not using ANSYS). Suffice it to say that Windows was NEVER designed for large-memory applications such as FEA.

BTW, I am in the process of procuring a 64-bit machine with 16GB of RAM. All this paging REALLY slows things down, and I agree with chris9: the time I waste waiting is well worth $15k for a new machine.
 
I've a Sun Blade 2500, currently running models in excess of 1.6M elements and around 2.0M degrees of freedom, writing 11GB results files. It only cost around $7k.
 
Chris9,
Have you tried software that uses p-elements? I am not suggesting changing software, but I am wondering how effective p-element software would be in solving the models that you are dealing with. I have been using Pro/Mechanica; some of my big models consist of 16,000 elements, and I can go up to a polynomial level of 6. It would be interesting to see how many elements Pro/Mechanica would create for your models and to what polynomial order it would take them.

ahad
 
ahad,

To fit 16,000 elements into my geometry I would need to defeature too much; I would lose all the fillets, etc. I don't think h-elements would give accurate results with heavily defeatured models.
 
Chris9,
Don't be surprised at the low number of elements, because these are p-elements. A model of 16,000 p-elements is not the same as 16,000 h-elements, since the accuracy of the result is determined by their polynomial order. Without looking at your model it is impossible to say how many elements Pro/Mechanica would create, but from my experience it would not hurt to give it a try if you have access to Pro/Mechanica.

ahad
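The p-method intuition above, that raising polynomial order on a fixed discretisation drives error down much as h-refinement would, can be illustrated with a simple 1-D curve-fitting analogy. This is not FEA, just an analogy with an arbitrarily chosen smooth function:

```python
import numpy as np

# Analogy for p-refinement: on a fixed set of sample points, raising the
# polynomial order drives down the approximation error, much as refining
# an h-mesh would. Purely a 1-D curve-fitting analogy, not FEA.

x = np.linspace(0.0, 1.0, 200)
f = np.exp(x) * np.sin(4 * x)          # a smooth "exact solution"

def max_error(order):
    coeffs = np.polyfit(x, f, order)    # least-squares polynomial of given order
    return np.max(np.abs(np.polyval(coeffs, x) - f))

for p in (1, 2, 4, 6):
    print(f"order {p}: max error {max_error(p):.2e}")
```

For a smooth solution like this, the error falls rapidly with order, which is why a 16,000-element p-model can compete with a much larger h-mesh.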
 
Pro/Mechanica doesn't do non-linear analysis.

corus
 
Chris9,
After so many responses it kinda slipped my mind that your requirements include non-linear analysis. Thanks, corus. Yes, Pro/Mechanica doesn't do non-linear analysis, in particular material non-linearity.

ahad
 
The /3GB switch works well with NEiNastran. It has a special tet10 solver mode allowing it to handle models of over 7 million DOF on a Pentium with XP and the /3GB option. I have personally been able to run a model of 6.6 million DOF. The PCGLSS solver is the same one used in COSMOS, ANSYS, and I-Deas Model Solution.
 