This post is the first in our series on estimating human errors in crude quality and measurement. Oil companies can easily underinvest in reducing human errors, not because the errors are unimportant, but because they are hard to measure. The examples in this series outline rough calculations that measurement supervisors can use to decide whether a specific source of human error warrants further investigation.
How much is at stake? The short answer: a lot. That is why oil companies throughout the supply chain spend so much on quality and measurement, from sophisticated on-line analyzers and automatic samplers at most custody-transfer points to large annual third-party lab budgets.
The centrifuge test for sediment and water (S&W) content is one of the most frequently run tests in the oil and gas industry. S&W is measured every time a shipment of crude changes hands so that the buyer pays only for the oil, not the water and sediment. Even small errors add up to significant money, so accuracy matters. For example, a 0.1% S&W measurement error can cost a producer (or receiver) about $10,000 per year on a single well¹. For a larger 100,000 bpd midstream operator, that is nearly $2,000,000 per year!
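The dollar figures above can be reproduced with a simple back-of-the-envelope calculation. The sketch below assumes a crude price of about $55/bbl and a single-well rate of about 500 bpd; these assumptions are not stated in the article and are chosen only so the numbers roughly match the examples given.

```python
# Rough annual cost of an S&W measurement error.
# Assumed inputs (not from the article): price ~$55/bbl, single well ~500 bpd.

def annual_loss(throughput_bpd: float, error_fraction: float, price_per_bbl: float) -> float:
    """Dollar value of oil mis-measured per year for a given S&W error."""
    return throughput_bpd * error_fraction * price_per_bbl * 365

# 0.1% error on a single ~500 bpd well
single_well = annual_loss(500, 0.001, 55)
# 0.1% error across a 100,000 bpd midstream operation
midstream = annual_loss(100_000, 0.001, 55)

print(f"Single well: ${single_well:,.0f}/yr")        # roughly $10,000/yr
print(f"100,000 bpd operator: ${midstream:,.0f}/yr") # roughly $2,000,000/yr
```

The same formula lets a supervisor plug in their own throughput and local crude price to see whether a suspected error source is large enough to justify investigation.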