There are several ways of making light dimmers now, but essentially they all work by reducing the voltage that gets to the light bulb. Think of an incandescent light bulb as a pure resistor. With lower voltage, a resistive load draws less power (P = V^2/R for a resistive load), and hence uses less energy. Of course, you also get a reduced lumen output, but assuming you can live with less light most of the time, and only turn it up to full brightness when you absolutely need it, you will save energy.
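To make that relationship concrete, here is a minimal sketch in Python of how power falls off as the RMS voltage is reduced. The 120V supply and 144-ohm filament are assumed illustrative values (not from the article), chosen so the bulb draws 100W at full voltage, and the bulb is treated as a pure resistor as described above.

```python
# Power drawn by a purely resistive bulb at different RMS voltages, P = V^2 / R.
# SUPPLY_V and FILAMENT_OHMS are assumed example values giving 100 W at full voltage.

SUPPLY_V = 120.0        # assumed nominal RMS mains voltage
FILAMENT_OHMS = 144.0   # assumed filament resistance (100 W at 120 V)

def bulb_power(volts_rms: float, resistance_ohms: float = FILAMENT_OHMS) -> float:
    """Power in watts dissipated by a resistive load: P = V^2 / R."""
    return volts_rms ** 2 / resistance_ohms

for fraction in (1.0, 0.75, 0.5):
    v = SUPPLY_V * fraction
    print(f"{v:5.1f} V -> {bulb_power(v):5.1f} W")
# 120.0 V -> 100.0 W,  90.0 V -> 56.2 W,  60.0 V -> 25.0 W
```

Note how power drops with the square of the voltage: cutting the voltage in half cuts the power to a quarter.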
How much? That depends on how you use it. If you have a 100W bulb and you turn the dimmer down so that it is only drawing the equivalent of a 40W bulb, you are theoretically saving 60% of the maximum energy. If, however, you can't read what you need to because it is too dim, you have actually wasted the 40W it is still drawing for no good reason! But say you are watching TV and just want some low ambient light; then dimming that 100W bulb makes perfect sense. Having the dimmer allows you to turn it up to full only when you want to read.
All in all, however, dimmers do save energy if they are actually used to dim. That 100W light bulb burning 4 hours/day, 360 days/year uses 100 x 4 x 360 = 144,000 Wh, or 144 kWh/year, which at $0.05/kWh is about $7.20/year. If you dimmed it to a 40W equivalent for 80% of that time (3.2 hrs/day) and ran it at full power for the remaining 0.8 hrs/day, it would use (40 x 3.2 x 360) + (100 x 0.8 x 360) Wh, or about 75 kWh, which is $3.75/year.
So if your dimmer cost you $15.00, the savings of roughly $3.45/year give a payback of a little over 4 years, which is fairly reasonable, but not if you keep it turned up to full brightness all the time.
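Here is a small sketch of that arithmetic, using the same example figures as above (100W bulb, 4 hours/day, 360 days/year, $0.05/kWh, dimmed to a 40W level for 80% of the on-time, $15 dimmer):

```python
# Annual energy, cost, and dimmer payback, using the article's example numbers.

WATTS_FULL = 100.0        # bulb at full brightness
WATTS_DIMMED = 40.0       # equivalent draw when dimmed
HOURS_PER_DAY = 4.0
DAYS_PER_YEAR = 360
DOLLARS_PER_KWH = 0.05
DIMMED_FRACTION = 0.8     # fraction of on-time spent dimmed
DIMMER_COST = 15.00

def annual_kwh(watts: float, hours_per_day: float) -> float:
    """Energy per year in kWh for a load run hours_per_day, every day."""
    return watts * hours_per_day * DAYS_PER_YEAR / 1000.0

baseline_kwh = annual_kwh(WATTS_FULL, HOURS_PER_DAY)                     # 144 kWh
dimmed_kwh = (annual_kwh(WATTS_DIMMED, HOURS_PER_DAY * DIMMED_FRACTION)  # 3.2 h/day dimmed
              + annual_kwh(WATTS_FULL, HOURS_PER_DAY * (1 - DIMMED_FRACTION)))  # 0.8 h/day full

baseline_cost = baseline_kwh * DOLLARS_PER_KWH   # $7.20/year
dimmed_cost = dimmed_kwh * DOLLARS_PER_KWH       # about $3.75/year
savings = baseline_cost - dimmed_cost            # about $3.45/year
payback_years = DIMMER_COST / savings            # a bit over 4 years

print(f"Baseline: {baseline_kwh:.1f} kWh -> ${baseline_cost:.2f}/yr")
print(f"Dimmed:   {dimmed_kwh:.1f} kWh -> ${dimmed_cost:.2f}/yr")
print(f"Savings:  ${savings:.2f}/yr, payback in {payback_years:.1f} years")
```

Plugging in your own bulb wattage, hours of use, and electricity rate will tell you quickly whether a dimmer pays for itself in your situation.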
"Our virtues and our failings are inseparable, like force and matter. When they separate, man is no more."
Nikola Tesla