The charge and discharge rates of lithium-ion batteries are expressed as C-rates. By convention, 1C is the current that discharges the battery's rated capacity in one hour: a battery rated at 1 Ah will deliver 1 A of current for one hour when discharged at 1C. Note that the C-rate describes current relative to capacity, not runtime per ampere; the same 1 Ah battery discharged at 2C delivers 2 A, but only for about half an hour.
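The relationship between capacity, C-rate, current and runtime can be sketched with a couple of lines of arithmetic. This is a minimal illustration of the convention described above; the function names are my own, and the runtimes are idealised (real cells deliver less usable capacity at higher currents).

```python
def crate_current(capacity_ah: float, c_rate: float) -> float:
    """Current (in A) drawn when discharging a battery of the given
    capacity at the given C-rate."""
    return capacity_ah * c_rate

def nominal_runtime_hours(capacity_ah: float, current_a: float) -> float:
    """Idealised runtime: rated capacity divided by a constant current."""
    return capacity_ah / current_a

# A 1 Ah cell at 1C draws 1 A and lasts roughly one hour.
print(crate_current(1.0, 1.0))                                  # 1.0 (A)
print(nominal_runtime_hours(1.0, crate_current(1.0, 1.0)))      # 1.0 (h)

# The same cell at 0.5C draws 0.5 A and lasts roughly two hours.
print(nominal_runtime_hours(1.0, crate_current(1.0, 0.5)))      # 2.0 (h)
```

The key point the example encodes is that the C-rate scales with capacity: 1C means 1 A for a 1 Ah cell but 2 A for a 2 Ah cell.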
The amount of energy a battery can hold is measured with battery analyzers. These instruments can calibrate batteries and measure exactly how long a battery takes to reach its end-of-discharge voltage. Discharge behavior varies with battery chemistry, and you will notice a significant difference between lithium- and nickel-based batteries.
These figures are end-of-discharge voltages, the per-cell cutoff at which a battery is considered empty, not hourly voltage drops. For lead-acid batteries the cutoff is typically around 1.75 V per cell, for nickel-cadmium it is close to 1.0 V/cell, and for lithium-ion it is around 3.0 V/cell. The analyzers used to determine the C-rate also establish the nominal rate at which the batteries are discharged.
Because many devices have variable levels of power draw, these values should be taken with a grain of salt. A device whose battery consumption never spikes or dips can be measured accurately, but the effective C-rate of many electronics cannot. This is why manufacturers publish estimates based on test results, so users and potential customers know roughly how many hours of usage to expect from smartphones, tablets and similar products. Runtime also shifts over time as batteries lose capacity with age, shortening the time a battery can keep an attached device up and running.
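One common way manufacturers arrive at such estimates is to average a representative load profile. The sketch below assumes a hypothetical piecewise-constant profile of (hours, amps) segments and a made-up 3 Ah battery; it shows why a bursty load still reduces to a single average current for the runtime estimate.

```python
def consumed_ah(profile):
    """Charge drawn (Ah) over a piecewise-constant load profile given as
    (duration_hours, current_a) segments."""
    return sum(hours * amps for hours, amps in profile)

def estimated_runtime_hours(capacity_ah, profile):
    """Idealised runtime based on the profile's average current draw."""
    total_hours = sum(hours for hours, _ in profile)
    average_a = consumed_ah(profile) / total_hours
    return capacity_ah / average_a

# A phone-like load: 90% of the time idling at 0.1 A, 10% bursting at 1 A.
profile = [(0.9, 0.1), (0.1, 1.0)]
print(round(estimated_runtime_hours(3.0, profile), 2))  # 15.79 (hours)
```

Note that this averaging is exactly where the estimate can mislead: two devices with the same average current but different burst patterns can behave differently, since high-current bursts waste more of the battery's usable capacity.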