Hello Sinan,

Generally, the heat transfer coefficient becomes negative when the surface temperature is lower than the fluid temperature. Convection follows Newton's law of cooling, so a negative temperature difference can lead to a negative heat transfer coefficient. The coefficient is calculated as the heat flux divided by the temperature difference (Twall - Treference), so if your Treference is greater than Twall you get a negative h. I recommend you check the temperature settings of the surface and the fluid, verify the reference values, and confirm whether your plate is being heated or cooled.
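To illustrate the sign issue, here is a minimal sketch of Newton's law of cooling rearranged for h. The function name and all numerical values are made up for illustration and are not from any solver:

```python
def heat_transfer_coefficient(q_wall, t_wall, t_ref):
    """Return h = q / (Twall - Tref) per Newton's law of cooling.

    For a positive wall heat flux, h comes out negative whenever
    the reference temperature exceeds the wall temperature.
    """
    return q_wall / (t_wall - t_ref)

# Heated plate: wall hotter than the reference fluid -> positive h
h_heated = heat_transfer_coefficient(q_wall=500.0, t_wall=350.0, t_ref=300.0)
print(h_heated)    # 10.0  W/(m^2 K)

# Reference set above the wall temperature -> negative h
h_negative = heat_transfer_coefficient(q_wall=500.0, t_wall=350.0, t_ref=400.0)
print(h_negative)  # -10.0 W/(m^2 K)
```

As the second call shows, the same positive heat flux yields a negative h purely because of the chosen reference temperature, which is why checking Treference is the first thing to do.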

Under convective heat transfer in a fluid with varying properties, and in boiling, the heat transfer coefficient may depend substantially on the heat flux and on ΔT. In these cases an increase in heat flux may give rise to hazardous phenomena such as burnout (at the transition heat flux) and deterioration of turbulent heat transfer in tubes.

I have attached video links to the best-practice simulation cases for heat transfer in tube bundles and a shell-and-tube heat exchanger so that you can familiarise yourself with the boundary conditions.

Forced Convective Heat Transfer in a 2D Staggered Tube Bundle — Simulation Example - YouTube

Heat Transfer in a Shell and Tube Heat Exchanger — Simulation Example - YouTube

I hope this solves your problem.

Thanks,

Chaitanya Natraj