Reynolds stress

Reynolds stress

[′ren·əlz ‚stres]
(fluid mechanics)
The net transfer of momentum across a surface in a turbulent fluid because of fluctuations in fluid velocity. Also known as eddy stress.
McGraw-Hill Dictionary of Scientific & Technical Terms, 6E, Copyright © 2003 by The McGraw-Hill Companies, Inc.
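For reference, the Reynolds stresses arise from the Reynolds decomposition of the velocity into a mean and a fluctuating part; a standard way of writing them (notation assumed here, not taken from the entry above) is

$$u_i = \overline{u_i} + u_i', \qquad \tau_{ij}^{R} = -\rho\,\overline{u_i' u_j'}$$

where the overbar denotes time (or ensemble) averaging and the prime denotes the fluctuation about the mean.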
References in periodicals archive
The Reynolds stresses are either all modeled and solved directly, using the six-equation Reynolds stress models, or related to the RANS unknowns through the Boussinesq hypothesis, which introduces a new term called the turbulent viscosity.
As additional information beyond the first-order velocity magnitude, the time- and space-averaged Reynolds stresses are presented in Figure 11.
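As an illustrative sketch (not tied to the study quoted above), a time-averaged Reynolds stress can be estimated from sampled velocity components by subtracting the mean and averaging the product of the fluctuations; the function and variable names below are hypothetical:

    import numpy as np

    def reynolds_stress(u, v, rho=1.0):
        """Estimate the time-averaged Reynolds stress -rho*<u'v'> from
        velocity samples u(t), v(t) at a single point."""
        u = np.asarray(u, dtype=float)
        v = np.asarray(v, dtype=float)
        u_fluct = u - u.mean()   # u' = u - <u>
        v_fluct = v - v.mean()   # v' = v - <v>
        return -rho * np.mean(u_fluct * v_fluct)

    # Example with synthetic, partially correlated fluctuations
    rng = np.random.default_rng(0)
    u = 5.0 + rng.normal(size=10_000)
    v = 0.5 * (u - 5.0) + rng.normal(size=10_000)
    print(reynolds_stress(u, v))   # negative of the <u'v'> covariance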
Broadly speaking, two basic approaches can be used to model the Reynolds stresses in terms of mean flow quantities and to provide closure of the governing equations: (a) eddy viscosity models, and (b) Reynolds stress transport models.
Hence, a comprehensive set of high-quality data on the 3D turbulence structure, such as velocities, turbulence intensities, Reynolds stresses, and boundary shear stresses, was also obtained.
The exact transport equation for the Reynolds stresses, $\overline{u_i' u_j'}$, may be written as follows:
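The equation itself is not reproduced in the excerpt; for incompressible flow its standard form (notation assumed) is

$$\frac{\partial \overline{u_i' u_j'}}{\partial t} + \overline{u_k}\,\frac{\partial \overline{u_i' u_j'}}{\partial x_k} = P_{ij} + \Pi_{ij} - \varepsilon_{ij} + D_{ij}$$

where $P_{ij} = -\overline{u_i' u_k'}\,\partial \overline{u_j}/\partial x_k - \overline{u_j' u_k'}\,\partial \overline{u_i}/\partial x_k$ is the production term, $\Pi_{ij}$ the pressure-strain correlation, $\varepsilon_{ij}$ the dissipation, and $D_{ij}$ gathers the turbulent, pressure, and viscous diffusion terms.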
where $\overline{u_i' u_j'}$ are the second-moment statistical correlations called Reynolds stresses, which must be modeled in order to close the system of equations.
The Reynolds stresses $\overline{\rho u'' u''}$, $\overline{\rho v'' v''}$, $\overline{\rho w'' w''}$, and $\overline{\rho u'' v''}$ may be written as follows
Terms of the form $\overline{\rho u_i' u_j'}$ in equations of type (1) are called turbulent or Reynolds stresses and require additional differential equations.
These Reynolds stresses, $-\overline{\rho u_i u_j}$, must be modelled in order to close Equation 7.
The eddy viscosity hypothesis assumes that Reynolds stresses can be related to mean velocity gradients and eddy (turbulent) viscosity by the gradient diffusion hypothesis in a manner analogous to the relationship between stress and strain tensors in laminar Newtonian flow:
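The relation that follows the colon is not reproduced in the excerpt; the usual Boussinesq form (notation assumed) is

$$-\rho\,\overline{u_i' u_j'} = \mu_t\left(\frac{\partial \overline{u_i}}{\partial x_j} + \frac{\partial \overline{u_j}}{\partial x_i}\right) - \frac{2}{3}\,\rho k\,\delta_{ij}$$

where $\mu_t$ is the eddy (turbulent) viscosity and $k = \tfrac{1}{2}\overline{u_k' u_k'}$ is the turbulence kinetic energy.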
The new set of scalings was found to achieve a good collapse of the Reynolds stresses. Aubertine and Eaton (2006) found that the velocity scale of Elsberry et al. (2000) worked better than classical flat-plate scalings but failed to collapse the inner peak of the streamwise Reynolds stress for their non-equilibrium, adverse-pressure-gradient turbulent boundary layer.
A RANS simulation solves the time-averaged Navier-Stokes (N-S) equations and models the additional Reynolds stresses. This modeling approach can significantly reduce the grid-resolution requirement and can be run as a steady-state computation.
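For context, the incompressible RANS momentum equation (a standard form, not quoted from the article) shows where the Reynolds stresses enter:

$$\rho\left(\frac{\partial \overline{u_i}}{\partial t} + \overline{u_j}\,\frac{\partial \overline{u_i}}{\partial x_j}\right) = -\frac{\partial \overline{p}}{\partial x_i} + \frac{\partial}{\partial x_j}\left(\mu\,\frac{\partial \overline{u_i}}{\partial x_j} - \rho\,\overline{u_i' u_j'}\right)$$

and the last term, $-\rho\overline{u_i' u_j'}$, is the Reynolds stress that must be modeled to close the system.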