Approach for designing unbraced (steel) frames


3doorsdwn (Structural)
May 9, 2007
I have a frame where the lateral force resisting system consists entirely of moment connections. Rather than go through and figure all the K-values for the members, I thought that I might just set all the K-values equal to 1 and do an analysis & design considering the P-delta effects. Do you consider this feasible? After all, isn't that what the K-values attempt to compensate for (i.e. increased moments from P-delta)?

(As you’ve probably already guessed: the program I am using for analysis can do a p-delta analysis very easily, but the k-values have to be figured manually.)
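
For anyone who wants to see what the second-order amplification actually does, here is a minimal sketch of the classic iterative "fictitious lateral load" P-Delta calculation on a single cantilever column. All of the numbers are hypothetical, purely for illustration:

```python
# Minimal sketch: iterative "fictitious lateral load" P-Delta on a cantilever.
#   delta_(k+1) = (H + P*delta_k/L) / k,   k = 3EI/L^3
# All numbers are hypothetical.
E = 29000.0            # ksi
I = 533.0              # in^4
L = 15.0 * 12.0        # in
H = 10.0               # kips lateral
P = 300.0              # kips axial

k = 3.0 * E * I / L**3     # first-order lateral stiffness
delta = H / k              # first-order drift
delta_1st = delta

for _ in range(200):
    # overturning couple P*delta treated as an extra lateral load P*delta/L
    new = (H + P * delta / L) / k
    if abs(new - delta) < 1e-9:
        break
    delta = new

print(f"1st-order drift {delta_1st:.3f} in, 2nd-order {delta:.3f} in, "
      f"amplification {delta / delta_1st:.2f}")
print(f"base moment: {H * L:.0f} -> {H * L + P * delta:.0f} kip-in")
```

The iteration converges to the first-order drift divided by (1 − P/(kL)), which is the same amplification the B2 factor approximates. Whether amplified moments alone justify K = 1 is exactly what the rest of this thread debates.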


 

I would highly recommend that everyone involved in this discussion read the following Q&A. It verifies what StructuralEIT has been saying all along:

Question (04/10/2003): Is there any documentation regarding the assumption of an effective length factor of K = 1.0 if a P-delta analysis is performed?

Answer:

Your question is one with which engineers regularly struggle, and I hope I can provide a useful answer. The first, and perhaps most important, issue has to do with looking at and combining provisions from two different specifications. The AISC, CISC and other steel design specifications all accommodate stability issues differently, and it is imperative that the provisions within a single specification are used consistently. One approach taken in the AISC Specification is to use effective length for column axial capacity and a second-order moment magnification for checking bending capacity. Alternatively, the engineer could choose to do a second-order analysis, but it must be done correctly, and that is usually the place where misunderstandings occur. The Canadian code uses a different approach to stability and second-order effects (the notional load approach). Thus, their statement that the effective length factor may be taken as 1.0 when a second-order analysis is used is inappropriate for use with the AISC Specification at this time.

When carrying out a second-order analysis, one must be sure to address both system buckling and moment amplification. The second-order analysis procedures in a typical computer program will determine the magnitude of the moments in the displaced equilibrium configuration corresponding to the applied loads. They will not determine the buckling capacity of the frame. However, frame buckling capacity can be determined if the second-order analysis is taken incrementally to the limit. This requires more than just checking the box in the software program for a second-order analysis. It should also be noted that a second-order incremental approach can have some potential problems: if the frame to be analyzed has no lateral load and no sway under gravity load, a second-order analysis will yield the same results as a first-order analysis, thus saying nothing about column capacity or any second-order moments.

So, using a typical second-order analysis at ultimate load will permit the elimination of the code-specified moment amplification factor from the interaction equation, but will not permit the use of K = 1.0. A second-order analysis under ultimate load taken to the limit will determine the buckling capacity of the system, and that will negate the need to use a K-factor to determine that capacity.

A paper I published in the Proceedings of the 2000 North American Steel Construction Conference, “A Practical Look at Frame Analysis, Stability and Leaning Columns,” might be of help to you. A paper of the same title that should be published soon in the AISC Engineering Journal will provide some additional discussion and should also be of assistance.

Louis F. Geschwindner, P.E., Ph.D.
American Institute of Steel Construction
Chicago, IL
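
To see concretely what "taken incrementally to the limit" means, here is a toy sketch built on the same cantilever model as above: scale the loads up step by step and watch for the P-Delta iteration to stop converging. Numbers are hypothetical; note that this simplified equivalent-lateral-load model only captures the story P-Δ term, so it overestimates the exact cantilever buckling load (3EI/L² vs. π²EI/(2L)² ≈ 2.47EI/L²), which is precisely the P-δ issue discussed later in the thread:

```python
# Toy sketch: a second-order analysis "taken incrementally to the limit."
# Scale the loads by alpha and find where the P-Delta iteration stops
# converging; that load factor estimates the system buckling capacity.
E, I, L = 29000.0, 533.0, 15.0 * 12.0   # ksi, in^4, in (hypothetical)
H, P = 10.0, 300.0                      # kips, working-level loads
k = 3.0 * E * I / L**3                  # cantilever lateral stiffness

def converged_drift(alpha, tol=1e-8, max_iter=5000):
    """Drift at load level alpha*(H, P), or None if no stable equilibrium."""
    delta = alpha * H / k
    for _ in range(max_iter):
        new = alpha * (H + P * delta / L) / k
        if abs(new - delta) < tol:
            return new
        delta = new
    return None   # iteration diverged -> past the stability limit

alpha = 0.0
while converged_drift(alpha + 0.1) is not None:
    alpha += 0.1

print(f"critical load factor ~ {alpha:.1f}, P_cr ~ {alpha * P:.0f} kips")
print(f"(this story-stiffness model's limit is 3EI/L^2 = {3 * E * I / L**2:.0f} kips)")
```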


 
Well, yes, you have to follow the entire procedure when using the Direct Analysis Method (you have to do more than just a P-Delta analysis to be able to use K = 1.0).
 
I have spent quite a bit of time reading Appendix 7 (Direct Analysis Method) over the last few days. I have to say it seems VERY complicated. So much so that I don't believe you can even use it in any coherent fashion with any software currently available (including RAM).
Also, doing it by hand seems very cumbersome (though less so than using software).
 
StrlEIT, I've heard comments like that a lot recently, but I really don't understand them.

What is confusing about the DAM? Account for initial imperfections using notional loads and reduced member stiffnesses. Use a rigorous second order analysis to capture in-plane instabilities (hence Kx=1.0). Design members for applied forces.

The ELM IS what's complicated. It's only easy if done wrong, as many folks do: simply computing G and using the alignment charts while ignoring overall story effects. Try figuring out the correct Kx using all that crazy stuff on pages 16.1-244 through 16.1-248. That discussion on K is bigger than the entire Appendix 7!! For that matter, Chapter C dwarfs Appendix 7.

Appendix 7 can be reworded as "do it right using this bulldozer method" LOL. It's many times more intuitive than the ELM.
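
For what it's worth, the stiffness reductions in App. 7.3(3) amount to a few lines of arithmetic. A minimal sketch (the section properties and loads are made up; alpha = 1.0 is the LRFD value):

```python
# Sketch of the App. 7.3(3) stiffness reductions for the DAM:
#   EA* = 0.8 * EA
#   EI* = 0.8 * tau_b * EI
#   tau_b = 1.0                                 if alpha*Pr/Py <= 0.5
#         = 4*(alpha*Pr/Py)*(1 - alpha*Pr/Py)   otherwise
def tau_b(Pr, Py, alpha=1.0):
    """alpha = 1.0 for LRFD, 1.6 for ASD."""
    r = alpha * Pr / Py
    return 1.0 if r <= 0.5 else 4.0 * r * (1.0 - r)

# made-up member: E, I, A and the required axial strength Pr
E, I, A = 29000.0, 533.0, 15.6      # ksi, in^4, in^2
Fy, Pr = 50.0, 550.0                # ksi, kips
Py = Fy * A                         # axial yield load, 780 kips

print(f"tau_b = {tau_b(Pr, Py):.3f}")
print(f"EI* = {0.8 * tau_b(Pr, Py) * E * I:.4g} kip-in^2, EA* = {0.8 * E * A:.4g} kips")
```

App. 7.3(3) also permits keeping tau_b = 1.0 if an additional 0.001Yi notional load is applied instead.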
 
From Clansman: "I would highly recommend that everyone involved in this discussion read the following Q&A. It verifies what StructuralEIT has been saying all along:

Question (04/10/2003): Is there any documentation regarding the assumption of an effective length factor of K = 1.0 if a P-delta analysis is performed?

Answer: Your question is one with which engineers... "

That's old; it was issued prior to the 2005 Specification, which defined the Direct Analysis Method. Like haynewp typed, the DAM is more than just a second order analysis. Kx can be taken as unity when using the DAM.
 
271828-
This is purely my opinion, which may be misguided, so please correct me if I am going astray here.
Part of the lengthiness of the Direct Analysis Method is that you need to apply notional loads in both directions and account for the worst effects. This essentially means that you need to run the approximately 180 load combinations that RAM spits out with notional loads in X+ and Y+, then run those 180 load combinations with the notional loads in the opposite directions, all while using a reduced stiffness for the strength checks.
You will then have to remove the notional loads and use the nominal stiffness to check for serviceability.
Meanwhile, all of this needs to be done using a RIGOROUS second order analysis. I will have to check out RAM. Does anyone know if RAM's second order analysis qualifies as RIGOROUS?
It just seems like there are a whole lot of analyses and load combination checks to run, even with a very high-powered program such as RAM. If you don't have something like RAM, you are really left out in the cold, and not just for the Direct Analysis Method but even just for the load combinations. There are simply too many to do by hand.
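
One way to tame that bookkeeping is to generate the expanded combination list with a script and import it, rather than clicking through a GUI. A rough sketch of the multiplication involved (the base combos and the "N+X" naming are invented for illustration; a real RAM/RISA import file would use its own format):

```python
# Sketch: expanding strength combos with notional-load companions.
base_combos = ["1.4D", "1.2D + 1.6L", "1.2D + 1.0L + 1.0E", "0.9D + 1.0W"]
notional_dirs = ["+X", "-X", "+Y", "-Y"]

strength_combos = [f"{c} + N{d}" for c in base_combos for d in notional_dirs]

print(f"{len(base_combos)} base combos -> {len(strength_combos)} strength combos")
for combo in strength_combos[:4]:
    print("  " + combo)
# Serviceability checks then run separately: no notional loads, nominal
# (unreduced) stiffness. Whether notionals must accompany the lateral
# combos at all depends on the 2nd/1st-order drift ratio (see below).
```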
 
StrlEIT, I don't think you're astray. I thought you were typing more about the method, not how to implement it. Implementation such as you describe would be difficult.

I think it might be possible to make it easier. Define load cases called something like NotionalXDL, NotionalXLL, NotionalYDL, etc., and then combine using load combinations. I've never used RAM (always used RISA, ETABS, and SAP), so I don't know how to accomplish this in that system.

In the future, I think the DAM will be much easier to implement automatically than the ELM. Here's a good example: I was involved with developing an analysis/design program last year that did the DAM, and it was not difficult. We actually had the program automatically move the nodes H/500 rather than applying notional loads. RAM, CSi, etc. will surely build this, or some other automatic DAM feature, into their programs. We found this approach easier than figuring out the notional loads for arbitrary frame geometry.

The ELM is an unbelievable pain to implement correctly, like I typed before. Try making a program select the right Kx for every member of an arbitrary frame! That's possible, but really tough to do.

As for RAM's second-order analysis being rigorous, you just need to make sure that it correctly captures the P-Δ (almost certainly) and P-δ (maybe, maybe not) effects. This test is not as onerous as it sounds. App. 7.3(1) even allows the B1, B2 approach as part of the DAM, provided the reduced stiffnesses are used. When the new Stability DG comes out, there will be lots of examples and benchmark problems in there that you can use to verify your program's analysis procedures.
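
For reference, the story-stiffness form of that B1-B2 check scripts easily. A sketch of the B2 amplifier computed from a first-order drift (I believe this is the Eq. C2-6b story-stiffness form of Pe2 in the 2005 Spec; the variable names and numbers are mine):

```python
# Sketch: story B2 amplifier from a first-order drift.
def B2(sum_Pnt, sum_H, story_height, drift_1st, alpha=1.0, Rm=0.85):
    """sum_Pnt: total story gravity load (kips); sum_H: story shear (kips);
    drift_1st: first-order interstory drift under sum_H (in);
    Rm ~ 0.85 for moment frames; alpha = 1.0 (LRFD) or 1.6 (ASD)."""
    Pe_story = Rm * sum_H * story_height / drift_1st   # story buckling estimate
    return 1.0 / (1.0 - alpha * sum_Pnt / Pe_story)

# hypothetical story: 2400 kips gravity, 150 kip shear, 12.5 ft story height
print(f"B2 = {B2(2400.0, 150.0, 150.0, 0.45):.2f}")
```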
 
Many of the AISC speakers recently have implied that the Direct Method is, or will be, the easiest and most preferred method.

 
I don't think RAM System is currently set up to handle the DM. I don't have the program in front of me now, but I do not recall there being a way to segment columns along their height to capture the local P-δ. And I know RAM Advanse has not incorporated AISC 2005 because I just asked them, so I would guess that RAM System hasn't either, but I am not sure. I use RAM System all the time and I hope they will implement DM capability in the program.

I have used Direct Analysis in 2D using MASTAN and it was a very easy model to build. It is definitely a software-based method, and I do not think it would be hard to do in 3D using a program like Advanse or RISA. But I don't think anybody is going to use the method until there are some examples out there to benchmark the software against. Depending on the program, you may have to split the members into a larger number of pieces in order for it to estimate the local P-δ correctly.
 
haynewp,

Some time ago I went to the AISC P-delta verification problems/solutions found in the back of the Specification (in the appendix or commentary - can't remember which) and built those problems in RISA, using a somewhat standard "column" length in the range of 10 feet to 12 feet.

I did both 1 ft. segments and 2 ft segments and compared the results to the AISC given solutions. Both 1 ft and 2 ft segments appeared to be rather close to the given solution.

I didn't have time then to expand to 3 ft, 4 ft, etc. but in most cases, for ease of use, I use 1 ft segments anyway.

This, at least, gave me some assurance that RISA correctly handled the second order effects, both P-Δ and P-δ.
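
JAE's segment study is consistent with a simple eigenvalue check. Here's a small sketch (a plain pin-ended Euler column, the standard cubic beam element with a consistent geometric stiffness, hypothetical properties) showing how quickly subdividing converges; it's the same idea as benchmarking a program's P-δ:

```python
# Sketch: elastic buckling of a pin-ended column by finite elements,
# to see how many segments it takes to capture P-little-delta.
import numpy as np
from scipy.linalg import eigh

def elem_matrices(EI, L):
    """Cubic beam element (DOFs v1, th1, v2, th2): elastic and
    consistent geometric stiffness (for unit axial compression)."""
    ke = EI / L**3 * np.array([[ 12.,   6*L, -12.,   6*L],
                               [ 6*L, 4*L*L, -6*L, 2*L*L],
                               [-12.,  -6*L,  12.,  -6*L],
                               [ 6*L, 2*L*L, -6*L, 4*L*L]])
    kg = 1.0 / (30*L) * np.array([[ 36.,   3*L, -36.,   3*L],
                                  [ 3*L, 4*L*L, -3*L,  -L*L],
                                  [-36.,  -3*L,  36.,  -3*L],
                                  [ 3*L,  -L*L, -3*L, 4*L*L]])
    return ke, kg

def p_critical(EI, L_total, n_elem):
    Le = L_total / n_elem
    ndof = 2 * (n_elem + 1)
    K = np.zeros((ndof, ndof)); G = np.zeros((ndof, ndof))
    for e in range(n_elem):
        idx = np.arange(2 * e, 2 * e + 4)
        ke, kg = elem_matrices(EI, Le)
        K[np.ix_(idx, idx)] += ke
        G[np.ix_(idx, idx)] += kg
    free = [d for d in range(ndof) if d not in (0, ndof - 2)]  # pin the ends
    vals = eigh(K[np.ix_(free, free)], G[np.ix_(free, free)], eigvals_only=True)
    return vals[0]                     # lowest eigenvalue = buckling load

EI, L = 29000.0 * 533.0, 15.0 * 12.0   # kip-in^2, in (hypothetical)
exact = np.pi**2 * EI / L**2           # Euler load
for n in (1, 2, 3, 4):
    Pcr = p_critical(EI, L, n)
    print(f"{n} segment(s): Pcr = {Pcr:7.0f} kips ({100 * (Pcr / exact - 1):+5.2f}% vs exact)")
```

With this element, one segment overestimates the buckling load by roughly 22%, while two segments land within about 1% - consistent with haynewp's point about splitting members into enough pieces.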

 
JAE, I see them; they are in the commentary. Those are the same examples as the ones I have from another source.

I also remembered that for Direct Analysis, you don't want the program doing the other standard design checks using the reduced E you have input for the analysis. How were you able to implement that using RISA?
 
I'll have to go back and see what exactly I did, but I'm pretty sure I was just verifying that RISA was a program that "properly" did P-delta (both Δ and δ) and I wasn't trying to implement the Direct Method.

I just found that RISA does provide the proper analysis results indicated for the AISC example problems.

I have yet to get deep enough into the Direct Method to see how it works out in practice using RISA.

Perhaps we have to analyze with the program and manually design...I don't know. It may mean altering the E value, but I'd have to see whether that would mess up the design routines within RISA.

 
I think you'd probably want to leave E alone and use either a property modifier on A and Ix or perhaps bump up the loads instead.

This stuff isn't going to be easy with the current programs.
 
Scratch that last post of mine, guys. That wouldn't help for any program I know of. I was thinking that might work in SAP or ETABS, but it wouldn't.

The last sentence is ok, though!
 
I agree with 271828 - Direct Analysis is a straightforward brute-force concept that takes a lot of the guesswork out of complicated stability problems where the effective length method would be a shot in the dark at best. The effective length method is generally done incorrectly, as he pointed out.

Regarding the actual implementation of the notional loads within the load combinations: in the large majority of situations the notional load will only be applied with the gravity-only load combinations, therefore limiting its implementation to only a handful of load combinations. This is because the use of a notional load in conjunction with the lateral load cases is only required when the ratio of 2nd to 1st order drift exceeds 1.5. Generally this ratio is kept below 1.1 per the seismic provisions of ASCE 7/IBC 2006, and therefore the notional loads never come into play with the lateral loads.
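
As a back-of-envelope check on that 1.1 figure (my own reasoning, not from the spec): for an elastic story, the 2nd/1st-order drift ratio is roughly 1/(1 − θ), where θ is the ASCE 7 stability coefficient, so the θ limits map directly onto drift ratios:

```python
# Sketch: mapping the ASCE 7 stability coefficient (theta) to the
# 2nd/1st-order drift ratio via the elastic story amplifier 1/(1 - theta).
def drift_ratio(theta):
    return 1.0 / (1.0 - theta)

for theta in (0.05, 0.0909, 1.0 / 3.0):
    print(f"theta = {theta:.4f} -> 2nd/1st-order drift ratio ~ {drift_ratio(theta):.2f}")
# theta ~ 0.09 gives a ratio of ~1.10; the 1.5 trigger for notional loads
# with lateral combos corresponds to theta ~ 1/3.
```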

Regarding current software implementation: RAM is working on implementing the direct analysis method but I remain slightly dubious of the results. The current version does automatically create notional loads and combine them with combinations per choices made by the user. It does not yet, however, actually reduce the stiffnesses and run an analysis (so it is an incomplete implementation right now). RAM in its current form does not perform a rigorous P-delta either. Its current P-delta implementation is based on a one-shot geometric stiffness approach. This type of P-delta is fairly accurate for regular building structures with all columns tied to a rigid diaphragm (which is RAM's bread and butter and therefore generally OK), but it falls apart for short stories, multiple columns outside of the diaphragm, etc. Also, RAM only performs the P-delta considering one "P". The only way to TRULY capture P-Delta effects IMHO is to actually run each one of the load COMBINATIONS through an iterative 2nd order analysis.

SAP/ETABS in their most recent versions both implement the Direct Analysis process fairly well. They also (surprise) have good documentation on it. These programs automatically reduce the stiffness of the members, etc., for the analysis but not for the design portion. Both also work for the test problems in the AISC Commentary, provided the members are split up sufficiently to capture slenderness effects. This is generally not as many pieces as you might think - 2 pieces will actually work within 3% for all but the highest slenderness ratios, while 3-4 will be adequate in 99.9% of all cases. This is easier in SAP, where you can tell the program to mesh the columns internally during analysis - with ETABS you must manually split them prior to analysis.

I am not too familiar with RISA's implementation.

I think that before too long, as we work out the bugs, the direct analysis method will be the de-facto standard.

 
Excellent, WillisV. Thank you for the review of the software, especially CSi's. I don't have the latest version of SAP and it's good to know that they're including the DAM now.

I've had a soft spot for CSi for years because, compared to RAM, their products seem like the technologically superior but more poorly marketed of the two. Mac vs PC, HP vs TI, etc. It would be cool for them if this finally pays off and gives them an edge over RAM.

CSi's documentation and quirks will never stop being irritating, though! I doubt I'll ever like how their units operate.
 
The reduced EI and EA are only for strength checks, correct? You will still use the nominal EA and EI for serviceability checks, correct?
 
271828 - I agree that the CSI products are technically superior. However, I think that RAM products are infinitely more user friendly due to their excellent load application/bookkeeping interface as well as load combination generators. In order to keep that friendly a user interface, RAM has to sacrifice some of the technical aspects - conversely, CSI products are in some cases horribly user-unfriendly as they are trying to be everything to everyone. Each has its place.

StructuralEIT - of course. That is explicitly stated in the commentary to App. 7.
 
Sorry to throw a monkey wrench into this discussion, but...
Does anyone have any idea how the Direct Analysis Method will impact FMC (flexible moment connection) frames? As the leeward connections are considered pinned under lateral loads, the ratio of second order drift to first order drift will be higher than determined in a program. This can impact the requirement for notional loads.
Also, can anyone tell me how you determine the second order drift and the first order drift for the ratio without using the B1 and B2 factors from AISC? (I am asking more about how you determine this from a program.) If doing it by hand, I can check B1 and B2, but it seems counter-productive to rely on such an unrefined analysis when using such a powerful program. My fear is that the two may not be completely compatible. Maybe my crude B2/B1 check results in a ratio of 1.4, but the refined program produces 1.55. By hand I don't need the notional loads, but using the refinement of the program, I do.
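
On the second question, the usual program-based approach (as I understand it; this isn't from any particular manual) is to run the same lateral combination twice, once with second-order effects off and once on, and take the story-by-story ratio. A trivial sketch with made-up drifts:

```python
# Sketch: drift ratio from two runs of the same lateral combo
# (second-order effects OFF, then ON). Numbers are made up.
drift_1st = {"Story 1": 0.42, "Story 2": 0.55, "Story 3": 0.48}  # in
drift_2nd = {"Story 1": 0.51, "Story 2": 0.71, "Story 3": 0.55}  # in

ratios = {s: drift_2nd[s] / drift_1st[s] for s in drift_1st}
worst = max(ratios, key=ratios.get)
flag = ("notional loads required with lateral combos" if ratios[worst] > 1.5
        else "notional loads with gravity-only combos suffice")
print({s: round(r, 2) for s, r in ratios.items()})
print(f"governing: {worst} at {ratios[worst]:.2f} -> {flag}")
```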
 
Hat tip to WillisV for his comment on user documentation. After reading his post, I asked my boss to upgrade us a couple of weeks ago. If you don't have SAP V11.0.7 or later, the user documentation for 2005 AISC is not included >:-<. But now that they've finally come out with documentation on 2005 AISC, the implementation is more thorough than I had expected. SAP appears to tackle all the major DA requirements discussed here: automatic reduction factors for section EI and EA per code to account for residual stresses, notional loads, and P-Delta including P-δ effects. Because of the requirements for P-delta in both the DA and effective length methods, the 2005 code seems to be pushing certain specific analysis methods. This is a bit of an oversimplification, but with the DA method you basically have a tradeoff: you get to use K=1 instead of working with the alignment charts and the adjustments to them, but you have to take a (typically) 20% reduction to section EI in exchange.

Notional loads (for out-of-plumbness effects) have to be defined for each gravity case, a minor pain, but then SAP automatically generates the lateral loads based on a user-specified percentage of each gravity load, typically 0.002 or 0.003, and it will automatically create load combos using these notional loads. If you don't have SAP, you'd calculate the gravity load at each level, not counting loads from above, then generate your notional lateral loads in each direction from there.
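
The by-hand version of that last sentence is tiny. A sketch with invented story loads:

```python
# Sketch: story notional loads N_i = 0.002 * Y_i, where Y_i is the gravity
# load introduced AT level i (not the cumulative load from levels above).
story_gravity = {"Roof": 120.0, "Level 3": 185.0, "Level 2": 190.0}  # kips, invented

notional = {level: 0.002 * Y for level, Y in story_gravity.items()}
for level, N in notional.items():
    print(f"{level}: N = {N:.2f} kips, applied as +/-X and +/-Y lateral cases")
```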

SAP automates one other important function that I think other programs may struggle with: the combination of factored loads using nonlinear P-delta analysis combos which are automatically used for design. Since P-Delta is a nonlinear analysis, you cannot run individual nonlinear load cases and then later combine them with other loads by linear sum for design. In earlier versions of SAP, you could have used 'Analysis cases' to define these nonlinear factored P-Delta combos, but now all you do is either define linear factored load combos yourself or let the program create the combos automatically, then press a button to convert those factored load combos into nonlinear P-Delta cases used in analysis and design. That is a really nice new feature.
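
A one-line model shows why those nonlinear cases can't be superposed (the stiffness and loads here are invented): the amplification depends on the total axial load on the story, so nonlinear(A) + nonlinear(B) ≠ nonlinear(A+B):

```python
# Sketch: why P-Delta results can't be linearly combined. One-story model
# with lateral stiffness k; closed-form second-order drift = H / (k - P/h).
k, h = 8.0, 180.0                       # kip/in and in (made up)

def drift(H, P):
    """Second-order story drift; valid while P < k*h."""
    return H / (k - P / h)

# two factored cases run separately, then summed
d_A = drift(H=4.0, P=500.0)
d_B = drift(H=6.0, P=700.0)
# the combined case run as one nonlinear analysis
d_AB = drift(H=10.0, P=1200.0)

print(f"sum of separate nonlinear runs: {d_A + d_B:.2f} in")
print(f"single combined nonlinear run:  {d_AB:.2f} in  <- larger; combine loads first")
```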

WillisV made a statement upthread about the need to add up to 4 intermediate joints along the columns to properly handle local P-δ effects. Had I not spoken to a CSI support engineer about this very topic, my instinct would have been to agree with him. However, according to this CSI engineer, and he was adamant about this, you would typically not need to add any additional intermediate joints on the columns at all to properly account for local P-δ, though in some cases you might need to add 1 additional joint through the frame auto-mesh so as not to alter unbraced lengths. He insisted that due to some uniqueness in SAP's frame element formulation, you would NEVER need to add more than 1 intermediate joint to properly account for P-δ. I've been doing a little experimentation on this, and the support engineer so far seems to have a point, at least in the examples that I've run through. In SAP, to add the intermediate joint you would select all and then assign a frame auto-mesh to divide the selected elements into 2 segments internally. That way, the frames are divided internally but reported and designed using their true lengths.

CSI has made some good strides over the past few years, in my opinion, regarding user friendliness. The office I work in now has SAP and RISA, but I worked with RAM a few years back, and my opinion is that RAM was not that much more user friendly. They had good modeling tools and load distribution options, but they had problems too. For example, you used to have to redraw members each time in RAM in order to move them, because RAM didn't have a command to let you move them, and I recall that sloped beams didn't work at that time. Back then RAM was faster off the mark in adding the ASCE wind loads and incorporating them in the load combos, a very important advantage at the time, but CSI seems to have largely caught up with them in the auto lateral load area, and trumped them with auto wind loads on open frame structures and a nice implementation of IBC 2006 quake. And I haven't seen any program that comes anywhere close to what SAP now offers for automating the 2005 AISC: the auto reduction factors to EA and EI for the DA method, notional loads, and the specific code requirements regarding P-δ, which most other programs don't address. It seems the other programs have a long road ahead of them to catch up with what SAP offers now. I just don't see as wide a gap between the programs on user friendliness as WillisV suggested, but your mileage may vary.
 