Ansys Products

Fluent does not start up on Rocky 8.6: Failed to connect to…

    • steven.vandenbrande

      I am trying to run Fluent 19.1.0 (without GUI or graphics) on our new HPC cluster, which runs Rocky Linux 8.6. Fluent, however, fails to spawn processes; this already happens when running on a single node with a single process:



      @k28i14:vof_wice$ fluent 3ddp -ptrace -t1 -g -cnf=k28i14 -i simulation.jou
      /ANSYS/19.1/ansys_inc/v191/fluent/fluent19.1.0/bin/fluent -r19.1.0 3ddp -ptrace -t1 -g -cnf=k28i14 -i simulation.jou
      /ANSYS/19.1/ansys_inc/v191/fluent/fluent19.1.0/cortex/lnamd64/cortex.19.1.0 -f fluent -g -i simulation.jou (fluent "3ddp -pmpi-auto-selected  -host -r19.1.0 -t1 -mpi=ibmmpi -cnf=k28i14 -path/ANSYS/19.1/ansys_inc/v191/fluent -ptrace -ssh")
      /ANSYS/19.1/ansys_inc/v191/fluent/fluent19.1.0/bin/fluent -r19.1.0 3ddp -pmpi-auto-selected -host -t1 -mpi=ibmmpi -cnf=k28i14 -path/ANSYS/19.1/ansys_inc/v191/fluent -ptrace -ssh -cx k28i14.:34149:46055
      Starting /ANSYS/19.1/ansys_inc/v191/fluent/fluent19.1.0/lnamd64/3ddp_host/fluent.19.1.0 host -cx k28i14.:34149:46055 "(list (rpsetvar (QUOTE parallel/function) "fluent 3ddp -flux -node -r19.1.0 -t1 -pmpi-auto-selected -mpi=ibmmpi -cnf=k28i14 -ssh") (rpsetvar (QUOTE parallel/rhost) "") (rpsetvar (QUOTE parallel/ruser) "") (rpsetvar (QUOTE parallel/nprocs_string) "1") (rpsetvar (QUOTE parallel/auto-spawn?) #t) (rpsetvar (QUOTE parallel/trace-level) 1) (rpsetvar (QUOTE parallel/remote-shell) 1) (rpsetvar (QUOTE parallel/path) "/ANSYS/19.1/ansys_inc/v191/fluent") (rpsetvar (QUOTE parallel/hostsfile) "k28i14") )"

                    Welcome to ANSYS Fluent Release 19.1

                    Copyright 1987-2018 ANSYS, Inc. All Rights Reserved.
                    Unauthorized use, distribution or duplication is prohibited.
                    This product is subject to U.S. laws governing export and re-export.
                    For full Legal Notice, see documentation.

      Build Time: Apr 09 2018 13:46:05 EDT  Build Id: 10123

      Info: Your license enables 4-way parallel execution.
      For faster simulations, please start the application with the appropriate parallel options.

           This is an academic version of ANSYS FLUENT. Usage of this product
           license is limited to the terms and conditions specified in your ANSYS
           license form, additional terms section.
      Host spawning Node 0 on machine "k28i14" (unix).

      Process is being spawned with command:
              /ANSYS/19.1/ansys_inc/v191/fluent/bin/fluent 3ddp   -flux -node -r19.1.0 -t1 -pmpi-auto-selected -mpi=ibmmpi -cnf=k28i14  -ssh -mportv ::45139:0

      /ANSYS/19.1/ansys_inc/v191/fluent/fluent19.1.0/bin/fluent -r19.1.0 3ddp -flux -node -t1 -pmpi-auto-selected -mpi=ibmmpi -cnf=k28i14 -ssh -mportv ::45139:0
      Failed to connect to k28i14



      I have tried different values (default, intel, openmpi) for the -mpi option, but they all give the same result. Leaving out the -cnf option, Fluent does start, but unless I am mistaken, it will then not be possible to run on multiple nodes.
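      For anyone reproducing this: -cnf takes either a comma-separated host list or the path to a machines file. A minimal sketch of such a file, using the node name from this thread (the commented-out second host and its process count are hypothetical, only to show the multi-node form):

      ```shell
      # machines.txt -- one host per line; an optional :N sets the
      # number of processes to launch on that host
      k28i14:1
      # k28i15:16   <- hypothetical second node for a multi-node run

      # launched as in the trace above, but pointing -cnf at the file:
      # fluent 3ddp -t1 -g -cnf=machines.txt -i simulation.jou
      ```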

      I expect people will comment on this thread that there is something wrong with the ssh setup, but I can definitely confirm that a plain ssh connection works without any password (it could still be that there is a problem with launching a remote command over ssh, but I can't figure out which command Fluent executes over ssh). As further confirmation, everything works fine with the newer Fluent version 2021R2 (i.e. I can do multi-node Fluent runs with that version).
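      One way to narrow down the ssh angle is to check that a non-interactive remote command (which is what a spawner uses, as opposed to an interactive login) also works. This is only a diagnostic sketch, with the hostname taken from this thread:

      ```shell
      # 1. Non-interactive execution must succeed without any prompt;
      #    BatchMode=yes makes ssh fail instead of asking for a password.
      ssh -o BatchMode=yes k28i14 true && echo "remote exec OK"

      # 2. Non-interactive shells read different startup files than login
      #    shells, so verify PATH and any MPI/Ansys variables look sane there.
      ssh -o BatchMode=yes k28i14 'echo $PATH; env | grep -i -e mpi -e ansys'
      ```

      If step 1 fails while an interactive `ssh k28i14` works, the problem is usually a prompt or error emitted by a shell startup file in the non-interactive case.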

      On a side note, version 19.1.0 is working fine on another cluster that is running CentOS 7.

      Do you have advice on what I could further try to get this working?
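      Since the open question is which command the launcher actually runs over ssh, one option (a debugging sketch, not an official Ansys procedure) is to trace the exec calls of the whole process tree with strace and then look for the ssh invocation:

      ```shell
      # Follow child processes (-f), log only execve calls, keep long
      # argument strings (-s 512), and write the log to a file.
      strace -f -e trace=execve -s 512 -o fluent.strace \
          fluent 3ddp -t1 -g -cnf=k28i14 -i simulation.jou

      # Show the exact remote command the spawner attempted:
      grep ssh fluent.strace
      ```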

    • Nikhil Narale
      Ansys Employee

      Hello Steven, 

      It looks like you are trying to run Fluent 19.1 on an unsupported version of Linux. Kindly have a look at this document, which lists the supported platforms for Release 19.1: ansys-191-platform-support-by-application.pdf


