Thanks for the help, but I'm still not really sure how to continue. I do have an approximate input torque, though I know there's some shock loading on the system. Our halfshafts are designed for 450 Nm nominal (fatigue) and 800 Nm shock under clutch drops, mis-shifts, etc. I'm taking this torque divided by two for the torque on each side gear, then dividing by the number of spider gears for the torque at each gear mesh. That seems to be the proper method for differentials.
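Just so we're on the same page, here's the torque split I'm describing, sketched out in Python. The spider-gear count and torque figures are from my post; whether dividing by two is actually correct for the side gears is exactly what I'm unsure about, so treat this as my assumption, not a verified method:

```python
# Torque split as described in the post: total input torque shared equally
# by the two side gears, each side gear's share carried by all spider-gear
# meshes in parallel. This reflects my assumed method, not a reference.

def mesh_torque(input_torque_nm: float, n_spider_gears: int) -> float:
    """Torque at a single side-gear/spider-gear mesh, in Nm."""
    per_side_gear = input_torque_nm / 2       # two side gears share the load
    return per_side_gear / n_spider_gears     # meshes act in parallel

# Nominal and shock cases from the halfshaft ratings above, assuming a
# two-spider-gear differential (hypothetical count):
nominal = mesh_torque(450.0, 2)   # Nm per mesh, nominal/fatigue case
shock = mesh_torque(800.0, 2)     # Nm per mesh, shock case
print(nominal, shock)
```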
I've run the calculations several times to make sure I haven't made any errors, and I've also tried a calculator I downloaded called MITCalc. The AGMA's published data doesn't seem to cover small gears very well - for example, the size factor isn't available for a 10-tooth bevel gear mating with a 16-tooth gear. So I'm not fully confident my calculations are accurate, but it's hard to believe they're off by such a large margin.
My reference design withstands roughly 10x the torque in a racing application, yet my calculations say it's only marginally handling my loads. In fact, those are gears from a late-80s Ford Escort re-worked into an LSD, and that application should still be putting close to 5x my torque through them. This simply cannot be right.
diamondjim, are you suggesting that I essentially redesign my gears to have the same factor of safety (~0.4) that the reference parts have? It's hard for me to believe that's a good method, but it's the only thing I can think to do. Are there straight bending strength calculations I can do that don't account for fatigue?
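To be clear about what I mean by a straight bending check: something like the basic Lewis tooth-bending formula with no AGMA fatigue or life factors. Here's a rough sketch with entirely placeholder numbers (face width, module, mean radius, and form factor are all made up, not my design) - and I realise Lewis is really a spur-gear formula, so at best a coarse screen applied at the mean section of a straight bevel tooth:

```python
# Rough static (non-fatigue) bending screen using the Lewis formula.
# Strictly a spur-gear formula, so only a coarse check for a straight
# bevel tooth at its mean section. All numbers below are placeholders.

def lewis_bending_stress(tangential_force_n: float,
                         face_width_mm: float,
                         module_mm: float,
                         lewis_form_factor: float) -> float:
    """Nominal tooth-root bending stress in MPa (no AGMA modifying factors)."""
    return tangential_force_n / (face_width_mm * module_mm * lewis_form_factor)

# Placeholder example: 200 Nm at a mesh acting at a 25 mm mean radius
# gives a tangential force Ft = 200 / 0.025 = 8000 N.
ft = 200.0 / 0.025
stress = lewis_bending_stress(ft,
                              face_width_mm=12.0,
                              module_mm=2.5,
                              lewis_form_factor=0.2)  # Y from a Lewis table
print(stress)  # MPa, compare against material yield for a static check
```

The form factor Y would need to come from a Lewis form-factor table for the actual tooth count and pressure angle; I've just plugged in 0.2 to make the sketch runnable.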