The Wikipedia article on the "metre" has an interesting discussion of the history of the standard and of its increasing precision over time; see the table under the "Timeline" heading:
1795 - defined as one ten-millionth of the distance from the pole to the equator - relative uncertainty ~ 10^-4
1799 - platinum bar - rel. uncertainty ~ 10^-5
1889 - Pt/Ir bar - rel. uncertainty ~ 10^-7
1960 - 1,650,763.73 wavelengths of light from a specified transition in krypton-86 - rel. uncertainty ~ 4x10^-9
1983 - length of the path travelled by light in a vacuum in 1/299,792,458 second - rel. uncertainty ~ 10^-10
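To get a feel for what those relative uncertainties mean in absolute terms, here's a quick back-of-the-envelope sketch (my own illustration, just multiplying each relative uncertainty by a one-metre length):

```python
# Rough illustration: absolute error implied by each relative uncertainty
# over a one-metre length (values taken from the timeline above).
standards = {
    "1795 (pole-to-equator fraction)": 1e-4,
    "1799 (platinum bar)":             1e-5,
    "1889 (Pt/Ir bar)":                1e-7,
    "1960 (krypton-86 wavelength)":    4e-9,
    "1983 (light travel time)":        1e-10,
}

for name, rel_unc in standards.items():
    abs_error_m = rel_unc * 1.0  # error over 1 m, in metres
    print(f"{name}: ~{abs_error_m * 1e3:.1e} mm  (~{abs_error_m * 1e9:.1e} nm)")
```

That's a jump from roughly a tenth of a millimetre per metre in 1795 to roughly a tenth of a nanometre per metre in 1983, about six orders of magnitude.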
One could argue that beyond the 1799 standard most of us would not care much, unless we were dealing with fairly high-precision machinery or electronic devices. Then again, the computer I'm typing on, my phone, and other electronics would probably not be possible without the dimensional precision implied by at least the 1960 standard. What the list really shows is that the increasing ability to measure time with very high precision and repeatability means you can improve other measurements by tying them to a time-based standard, so why not?
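As a sketch of that last point (my own illustration, not from the article): since 1983 the speed of light is fixed exactly by definition, so measuring a length reduces to measuring a time of flight, and the length inherits the clock's relative uncertainty.

```python
# Sketch: with c fixed exactly by definition (since 1983), a length is just
# c times a measured flight time, so its relative uncertainty is essentially
# that of the time measurement.
C = 299_792_458.0  # m/s, exact by definition

def length_from_flight_time(t_seconds: float, rel_time_uncertainty: float):
    """Return (length in metres, absolute length uncertainty in metres)."""
    length = C * t_seconds
    return length, length * rel_time_uncertainty

# Example: a ~1 m path takes about 3.34 ns; with a clock good to 1 part in
# 10^10, the length is known to about 0.1 nm.
L, dL = length_from_flight_time(1.0 / C, 1e-10)
print(f"length = {L} m, uncertainty = {dL:.1e} m")
```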