1. n. [Formation Evaluation]
A technique for measuring the fluid saturations in a core sample by heating the sample and measuring the volumes of water and oil driven off. The sample is crushed and weighed before being placed in the retort. It is then heated in stages or directly to 650°C [1200°F], during which the fluids are vaporized, collected, condensed and separated. Plateaus in the rise of the cumulative water volume with temperature are sometimes analyzed to indicate when free water, surface clay-bound water and interlayer clay-bound water have been driven off. The volumes of water and oil are measured directly, but corrections are needed to account for alteration of the oil. The volume of gas is also needed for accurate results. This is measured on a separate, adjacent sample by injecting mercury under pressure and measuring the volume of mercury absorbed. Before injection, the sample is weighed and its bulk volume determined by mercury displacement. The total pore volume is then the sum of the volumes of gas, oil and water. The saturation of each component is the ratio of its volume to the total pore volume.
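As a minimal sketch of the final calculation, assuming measured gas, oil and water volumes V_g, V_o and V_w (these symbols are introduced here for illustration and do not appear in the original entry):
\[
V_p = V_g + V_o + V_w, \qquad
S_g = \frac{V_g}{V_p}, \quad
S_o = \frac{V_o}{V_p}, \quad
S_w = \frac{V_w}{V_p}.
\]
By construction, the three saturations sum to one.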
See related terms: core plug, Dean-Stark extraction, routine core analysis