1. n. [Formation Evaluation]
A technique for measuring the pore volume of a core sample by observing the change in pressure of helium introduced into the pore space. The rock sample is held in a core holder whose internal walls are elastomers, so that the only void space is the internal pore volume. With a suitable holder, the sample can also be held under a confining stress. Helium is held in a reference cell of known volume at a known pressure, typically 100 to 200 psi [689 to 1,379 kPa]. When the helium is introduced to the core sample, its pressure drops as the gas expands into the connected pore space. The effective pore volume is obtained from Boyle's Law using the pressures before and after introduction of the helium and the known reference volume.
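The Boyle's Law calculation can be sketched as follows, assuming ideal-gas behavior, isothermal expansion, and a sample initially at atmospheric pressure; the function name and variable names are illustrative, not part of any standard:

```python
def pore_volume(v_ref, p1, p2, p_atm=14.7):
    """Effective pore volume from a single-cell Boyle's Law expansion.

    Mass balance (absolute pressures, ideal gas, constant temperature):
        p1 * v_ref + p_atm * v_pore = p2 * (v_ref + v_pore)
    Solving for the pore volume:
        v_pore = v_ref * (p1 - p2) / (p2 - p_atm)

    v_ref -- reference cell volume (e.g. cm3)
    p1    -- reference cell pressure before expansion (psia)
    p2    -- equilibrium pressure after expansion (psia)
    p_atm -- initial pore pressure, taken as atmospheric (psia)
    """
    return v_ref * (p1 - p2) / (p2 - p_atm)

# Example: a 20 cm3 reference cell charged to 114.7 psia equilibrates
# at 94.7 psia after expansion into the sample's pore space.
print(pore_volume(20.0, 114.7, 94.7))  # 5.0 cm3 of connected pore volume
```

In practice the apparatus is calibrated with billets of known volume to account for dead volume in the valves and lines, which this sketch omits.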
See related terms: Boyle's Law Double Cell, buoyancy, core plug, liquid saturation method, mercury displacement method, porosimeter, routine core analysis, summation of fluids method