LS Dyna

How to point to an LS-DYNA license from a High Performance Computing (HPC) cluster

    • mmia1
      Subscriber

      Hi,

      I am a graduate student at Louisiana State University (LSU), Baton Rouge. I have an LS-DYNA network license installed on my office desktop. Since I have to run thousands of simulations for my PhD project, I am planning to use the LSU High Performance Computing (HPC) cluster, which already has LS-DYNA installed. However, my license is served from the desktop in my office. Could anyone please guide me on how to point the HPC nodes at the license on my office desktop so that I can run simulations on the HPC nodes? Thanks.

    • Reno Genest
      Ansys Employee

      Hello,

      On all the HPC nodes, set the following environment variables:

      LSTC_LICENSE=network

      LSTC_LICENSE_SERVER=hostname_or_ip_of_desktop(license server)
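
      For example, on a Linux cluster you could export these variables in your shell profile or at the top of your job script. A minimal sketch (the hostname below is a placeholder; use your office desktop's actual hostname or IP):

      export LSTC_LICENSE=network
      export LSTC_LICENSE_SERVER=mydesktop.lsu.edu   # placeholder; hostname or IP of the license server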

       

      You will find the complete LSTC license installation guide here:

      https://ftp.lstc.com/user/download-instructions/License-Manager/LSTC_LicenseManager-InstallationGuide.pdf

      username: user

      password: computer

       

      Reno.

       

      • mmia1
        Subscriber

         

        Hi,

        I was able to configure the license, and now the cluster nodes can access the LS-DYNA network license installed on my office desktop. However, could you please suggest which executable I should use for running LS-DYNA jobs? I have 1000 jobs to run on the cluster. I am currently using the ls-dyna_smp_d_r1000_x64_redhat59_ifort160 executable, but even with two nodes and 20 processors per node, it still took one hour to finish a single job. Should I use the ls-dyna_mpp_d_R101_winx64_ifort131_impi.exe version, which is MPP? Could you please guide me on what to do, or what modifications are needed in my batch script?
         
        Thanks again.

         

    • Reno Genest
      Ansys Employee

      Hello,

      SMP scales well only up to about 8 cores; beyond that there is almost no speedup. SMP is also shared-memory parallel, so it cannot span nodes at all. To use 20 cores across 2 nodes you need MPP, which decomposes the model across MPI processes.

      I recommend using Intel MPI (impi). You will have to install Intel MPI on your Windows cluster (if it is not already done); it comes with the Ansys installation. If you don't have Intel MPI, you can use MS-MPI (Microsoft MPI), which comes pre-installed on Windows.

      You can download the latest solver (R13) here:

      https://lstc.com/downloader/page.html

      username: user
      password: computer

      To run MPP, you have to launch the solver through the MPI runtime (the mpiexec.exe file). Here is the LS-DYNA command I use to run with Intel MPI (the location of your mpiexec will probably be a different folder; please change the command below accordingly):

      "C:\Program Files (x86)\IntelSWTools\mpi\2018.3.210\intel64\bin\mpiexec.exe" -np 20 "C:\LS-DYNA\Dev_MPP_Intel-MPI_Double\ls-dyna_mpp_d_Dev_89768-gb61c685104_winx64_ifort190_impi.exe" i=C:\LS-DYNA_Runs\test\Test\2Cubes_R12.k memory=20m 

      You can modify your PBS script accordingly; for example:
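
      If your cluster runs Linux with a PBS scheduler, a minimal script might look like the sketch below. This is only an illustration: the PBS resource syntax, module name, license hostname, solver path, and input file are all placeholders you will need to adapt to your cluster.

      #!/bin/bash
      #PBS -N lsdyna_mpp
      #PBS -l nodes=2:ppn=20
      #PBS -l walltime=04:00:00
      cd $PBS_O_WORKDIR

      # License server on the office desktop (placeholder hostname)
      export LSTC_LICENSE=network
      export LSTC_LICENSE_SERVER=mydesktop.lsu.edu

      # Load the cluster's Intel MPI (module name is an assumption; check 'module avail')
      module load intel-mpi

      # 2 nodes x 20 cores = 40 MPI ranks; solver path and input file are placeholders
      mpirun -np 40 /path/to/ls-dyna_mpp_d_R13_linux_impi i=model.k memory=20m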

      Let me know how it goes.

      Reno.

    • Reno Genest
      Ansys Employee

      Hello,

      Also, maybe the size of your model is too small to use 20 cores. We usually aim for at least 10-20k elements per core. So, if your model has 100,000 elements, the speedup should increase up to about 10 cores. Beyond that, each core has fewer than 10,000 elements and not enough work to do; the communication between cores becomes the bottleneck, and increasing the number of cores further can actually degrade performance.

      Have you tried running on the cluster with the same number of cores as on your desktop? How does the performance compare? For example, with 5 cores per job, you could run 4 jobs at the same time on one 20-core cluster node, which may speed up getting through all your jobs; see the sketch below. If the cluster gives poorer performance with the same solver and the same number of SMP cores, then maybe the CPUs on the cluster are simply not as fast as the ones in your desktop.
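
      As a sketch of that idea (assuming a Linux/PBS cluster and the SMP solver; the hostname, paths, and input names are placeholders), you could launch several SMP jobs concurrently from one batch script:

      #!/bin/bash
      #PBS -N lsdyna_smp_batch
      #PBS -l nodes=1:ppn=20
      #PBS -l walltime=12:00:00
      cd $PBS_O_WORKDIR

      export LSTC_LICENSE=network
      export LSTC_LICENSE_SERVER=mydesktop.lsu.edu   # placeholder hostname

      # Run 4 SMP jobs at once, 5 cores each (4 x 5 = 20 cores on one node).
      # Each job runs in its own directory so output files do not collide.
      for job in job1 job2 job3 job4; do
          (cd $job && /path/to/ls-dyna_smp_d_r1000_x64_redhat59_ifort160 i=input.k ncpu=5 memory=20m) &
      done
      wait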

      Reno.
