A system model for assessing scalar dissipation measurement accuracy in turbulent flows
Measurement Science and Technology
In this paper, a system model is developed to investigate the independent and coupled effects of resolution, noise and data-processing algorithms on the accuracy of scalar gradient and dissipation measurements in turbulent flows. Finite-resolution effects are simulated by spectral filtering, noise is modelled as an additive source in the model spectrum, and differencing stencils are analysed as digital filters. The effective resolution is proposed as an appropriate criterion for quantifying the resolution requirements of scalar gradient and dissipation measurements. Both the effective resolution and the noise-induced apparent dissipation are determined primarily by the system transfer function. The finite-resolution results, based on a model scalar energy spectrum, are shown to agree with non-reacting experimental data. The coupled resolution-noise results reveal three regions in the measured mean scalar dissipation rate: a noise-dominated region, a noise-resolution-correlated region and a resolution-dominated region. Different noise levels therefore lead to different resolution-error curves for the measured mean scalar dissipation rate. Based on these model results, experimental procedures and guidelines for improving scalar gradient and dissipation measurements are proposed. Finally, the proposed system approach can also be applied to other derived quantities involving complex transfer functions.
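As a rough illustration of the system-model idea described above (not the paper's implementation), the sketch below attenuates an assumed model scalar spectrum with a Gaussian blur and a second-order central-difference stencil, adds a white-noise floor after the blur, and reports the measured fraction of the mean scalar dissipation. The spectrum form, filter shapes, noise level and sweep values are all illustrative assumptions.

```python
import numpy as np

# Illustrative model scalar spectrum: -5/3 inertial-convective range with an
# exponential roll-off near an assumed small-scale cutoff (not the paper's exact form).
def scalar_spectrum(k, eta=1.0):
    return k**(-5.0 / 3.0) * np.exp(-5.0 * (k * eta)**(4.0 / 3.0))

# Transfer function of a Gaussian blur of width delta, standing in for finite
# probe/pixel resolution (an assumed filter shape).
def gaussian_blur_tf(k, delta):
    return np.exp(-(k * delta)**2 / 24.0)

# Normalized transfer function of a 2nd-order central-difference stencil with
# spacing h, treated as a digital filter: sin(kh)/(kh) relative to an ideal derivative.
def central_diff_tf(k, h):
    return np.sin(k * h) / (k * h)

def measured_dissipation_fraction(delta, h, noise_psd=0.0,
                                  kmin=1e-3, kmax=50.0, n=20000):
    """Ratio of measured to true mean scalar dissipation for a 1-D surrogate.

    The resolved part is the dissipation spectrum k^2 E(k) attenuated by the
    squared system transfer function; the noise part is a white-noise floor
    passed through the differencing stencil only (assumes noise enters after the blur).
    """
    k = np.linspace(kmin, kmax, n)
    dk = k[1] - k[0]
    E = scalar_spectrum(k)
    H = gaussian_blur_tf(k, delta) * central_diff_tf(k, h)

    chi_true = np.sum(k**2 * E) * dk
    chi_resolved = np.sum(H**2 * k**2 * E) * dk
    chi_noise = np.sum((k * central_diff_tf(k, h))**2 * noise_psd) * dk
    return (chi_resolved + chi_noise) / chi_true

# Sweep resolution at two noise levels: coarse resolution attenuates the measured
# dissipation, while noise lifts the curve as the resolution becomes fine.
for noise in (0.0, 1e-5):
    fracs = [measured_dissipation_fraction(d, h=d / 2, noise_psd=noise)
             for d in (0.25, 0.5, 1.0, 2.0, 4.0)]
    print(f"noise_psd={noise:g}:", np.round(fracs, 3))
```

In this sketch the measured-to-true dissipation ratio depends only on the combined (blur times stencil) transfer function and the noise floor, which is the sense in which the effective resolution and the apparent dissipation are set by the system transfer function.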