Wall shear problem
I am having trouble solving for wall shear stress in my simulation: my results are consistently off by a factor of 1.5 compared to the theoretical calculations. Originally I thought the issue was a coarse mesh; however, even after refining the mesh to 1.6 million elements, the results are unchanged. Below is a detailed setup of my experiment.
The domain is a rectangular volume, 10 cm long by 10 cm wide by 2 cm high.
I tried various mesh element sizes ranging from 1e-3 to 5e-4 m. In other runs I also applied a biased mesh along the height (shown in the picture below).
I defined the bottom as a no-slip wall and gave the side walls a specified shear of zero.
I set the velocity inlet to a uniform 0.04 m/s and used an outlet at which total pressure is measured. The working fluid is water and the boundary is aluminum. I ran with convergence criteria from 1e-3 to 1e-5 (I also tried double precision with residuals down to 1e-11), with negligible change in the results.
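For reference, here is a quick sanity check of the flow regime from the numbers above. The water properties are my assumption (roughly 20 °C); the velocity and length come from the setup described in the post.

```python
# Sanity check of the flow regime (a sketch; water properties at ~20 C are assumed).
rho = 998.2      # kg/m^3, density of water (assumed)
mu = 1.003e-3    # Pa*s, dynamic viscosity of water (assumed)
U = 0.04         # m/s, inlet velocity from the setup
L = 0.10         # m, length of the domain

# Reynolds number based on domain length
Re_L = rho * U * L / mu
print(f"Re_L = {Re_L:.0f}")  # far below the ~5e5 flat-plate transition, so laminar
```

So the flow should be comfortably laminar, which is consistent with comparing against a laminar theoretical curve.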
The wall shear was measured along a line at the bottom of the plate, running the full distance from the inlet to the outlet, centered between the two side walls (see picture below). I exported a graph with wall shear on the y-axis and distance along the Z axis on the x-axis, and compared it to the theoretical curve in Excel.
If you have any ideas why wall shear would be off by a factor of 1.5 in such a basic simulation, please let me know what I can do to make it more accurate. I also posted an image of the formula used for the theoretical data. The Excel chart shows the Fluent results in blue (1000 data points) and the theoretical values in orange, with wall shear on the y-axis and distance from the inlet on the x-axis.
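Since the formula image isn't reproduced here, this is a sketch of the theoretical curve I would expect for this case, assuming the comparison is against the Blasius laminar flat-plate solution, τ_w(x) = 0.332 ρ U² / √(Re_x). The fluid properties are assumed values for water at about 20 °C.

```python
import math

# Blasius laminar flat-plate wall shear (an assumption about which
# theoretical formula the post compares against).
rho = 998.2    # kg/m^3 (assumed water density)
mu = 1.003e-3  # Pa*s (assumed water viscosity)
U = 0.04       # m/s, free-stream / inlet velocity

def tau_wall(x):
    """Blasius wall shear stress at distance x (m) from the leading edge."""
    Re_x = rho * U * x / mu
    return 0.332 * rho * U**2 / math.sqrt(Re_x)

for x in (0.01, 0.05, 0.10):
    print(f"x = {x:.2f} m: tau_w = {tau_wall(x):.3e} Pa")
```

If the Fluent curve has the same shape but sits a constant factor of ~1.5 above or below this, the discrepancy is more likely a definition mismatch (e.g. which velocity or formula is used) than a mesh issue.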