hhtguy
Electrical
- Jul 27, 2010
I'm currently working on a project that researches methods of saving fuel, mainly the practicality of using cars already on the road to help with logistics in a region. Suppose 40,000 lbs of cargo need to be shipped from point A to point B (100 miles). One can either ship it all in a single truck in one trip, or divide up the load and ship it in cars that are already planning to go from A to B (like a carpool). Since the cars are going from A to B in the first place, the fuel required is merely the additional fuel each car uses in carrying the cargo.
I know that a 10% increase in rolling friction (which depends on weight) results in roughly a 2% increase in fuel consumption for a typical car. Assuming the car gets 27 mpg and weighs 3200 lbs (typical), carrying 320 lbs (a 10% increase in weight, so roughly a 10% increase in rolling friction) results in about a 2% increase in fuel burn. At 27 mpg, 100 miles burns 3.7 gallons, and 2% of that is 0.074 gallons.
That 0.074 gallons is the additional fuel one car uses to carry 320 lbs for 100 miles.
40,000 / 320 = 125, so multiplying 0.074 gallons by 125 gives 9.25 gallons, which is the total additional fuel the 125 cars would use to carry the full 40,000 lbs.
On the other hand, a semi loaded to the maximum with 40,000 lbs gets around 6 mpg, so 100 miles uses 16.7 gallons of fuel.
So the first method is significantly more efficient, and if the cargo is less dense the difference becomes even more pronounced. I merely plugged in common values to arrive at this conclusion, and it assumes the cars do not have to deviate from their routes to reach the drop-off point.
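For anyone who wants to check the arithmetic, here is a quick Python sketch of the same back-of-envelope comparison. The 10% weight ≈ 2% extra fuel rule of thumb and the mpg figures are just the assumed values stated above, not measured data.

```python
# Back-of-envelope comparison: carpool vs. single semi (values assumed above).

CARGO_LB = 40_000          # total cargo to move from A to B
DISTANCE_MI = 100          # trip length

# Carpool scenario: each 3200 lb car carrying 320 lb (10% of its weight)
# is assumed to burn ~2% more fuel than its 27 mpg baseline.
CAR_MPG = 27
CAR_WEIGHT_LB = 3_200
LOAD_PER_CAR_LB = 0.10 * CAR_WEIGHT_LB           # 320 lb per car
FUEL_PENALTY = 0.02                              # assumed 2% extra burn per 10% weight

baseline_gal = DISTANCE_MI / CAR_MPG             # ~3.70 gal per car for the trip
extra_gal_per_car = FUEL_PENALTY * baseline_gal  # ~0.074 gal of additional fuel
cars_needed = CARGO_LB / LOAD_PER_CAR_LB         # 125 cars
carpool_total_gal = cars_needed * extra_gal_per_car

# Truck scenario: one semi at ~6 mpg carrying the full 40,000 lb.
TRUCK_MPG = 6
truck_total_gal = DISTANCE_MI / TRUCK_MPG

print(f"Carpool additional fuel: {carpool_total_gal:.2f} gal over {cars_needed:.0f} cars")
print(f"Semi fuel:               {truck_total_gal:.1f} gal")
```

This reproduces the numbers quoted above: roughly 9.3 gallons of additional fuel spread over 125 cars versus about 16.7 gallons for the semi.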
I would like to know if there is any good literature or other sources on the situation I've described, because I'd like to have solid references to back my results up. Thanks.