Drainage flow of a viscous compressible gas from a semi-sealed narrow conduit is a pore-scale model for studying the fundamental flow physics of fluid recovery from a porous reservoir without fluid injection. Thermal effects have been routinely neglected for these flows in the traditional petroleum engineering literature. However, since the motion is driven entirely by volumetric expansion, a temperature change always accompanies the density change. This thesis examines such thermal effects on the drainage flow.
The thermal drainage flow is first studied by simultaneously solving the linearized continuity, momentum, and energy equations for adiabatic walls. It is shown that even in the absence of an imposed temperature drop, gas expansion induces a transient temperature decrease inside the channel, which slows the drainage process relative to the isothermal model and Lighthill's model. For a given density drop, the gas drains out faster as the initial-to-final temperature ratio increases, and the transient density can undershoot the final equilibrium value.

A parametric study is then carried out to explore the influence of various thermal boundary conditions on the drainage flow. It is found that as the wall condition transitions from adiabatic to isothermal, the excess density changes from a plane-wave solution to a non-plane-wave solution and the drainage rate increases. It is also shown that when the exit is cooled and the wall is non-adiabatic, the total recovered fluid mass exceeds the prediction of the isothermal theory, which is determined by the initial-to-final density difference alone.

Finally, a full numerical simulation of the channel-reservoir system is conducted using the finite volume method. The Ghost-Cell Navier-Stokes Characteristic Boundary Condition technique is applied at the far end of the truncated reservoir, which is an open boundary. The results confirm the conclusions of the linear theory.