Error while compiling SALOME 9.14.0-native on Ubuntu 24.04

In your case, you need to find out why this command is failing:

/usr/bin/xmlpatterns /home/serge/salome/SALOME-9.14.0-native-UB24.04/SOURCES/ParaView/CMake/paraview_servermanager_convert_xml.xsl /home/serge/salome/SALOME-9.14.0-native-UB24.04/SOURCES/ParaView/Plugins/AcceleratedAlgorithms/AcceleratedAlgorithms.xml

You can run it from the command line to check its output message, if any.

You can also instrument the code by adding messages, as in file SALOME-9.14.0-native-UB24.04/SOURCES/ParaView/CMake/ParaViewClient.cmake (line 585):


  foreach (_paraview_gpd_xml IN LISTS xmls)
    message(STATUS "CHECK: ==================")
    message(STATUS "CHECK: executing command: ${xmlpatterns} ${_paraview_gpd_to_xml} ${_paraview_gpd_xml}")
    execute_process(
      COMMAND "${xmlpatterns}"
              "${_paraview_gpd_to_xml}"
              "${_paraview_gpd_xml}"
      OUTPUT_VARIABLE _paraview_gpd_output
      ERROR_VARIABLE  _paraview_gpd_error
      RESULT_VARIABLE _paraview_gpd_result)
    message(STATUS "CHECK: xmlpatterns          = ${xmlpatterns}")
    message(STATUS "CHECK: _paraview_gpd_to_xml = ${_paraview_gpd_to_xml}")
    message(STATUS "CHECK: _paraview_gpd_xml    = ${_paraview_gpd_xml}")
    message(STATUS "CHECK: _paraview_gpd_output = ${_paraview_gpd_output}")
    message(STATUS "CHECK: _paraview_gpd_error  = ${_paraview_gpd_error}")
    message(STATUS "CHECK: _paraview_gpd_result = ${_paraview_gpd_result}")

and share the output.
You can still disable this plugin, but this is not a suitable solution.

The result:
cd /home/serge/salome/SALOME-9.14.0-native-UB24.04/BUILD/ParaView/Plugins/AcceleratedAlgorithms/paraview_help && /usr/bin/cmake -Dxmlpatterns=/usr/bin/xmlpatterns -Doutput_dir=/home/serge/salome/SALOME-9.14.0-native-UB24.04/BUILD/ParaView/Plugins/AcceleratedAlgorithms/paraview_help -Doutput_file=/home/serge/salome/SALOME-9.14.0-native-UB24.04/BUILD/ParaView/Plugins/AcceleratedAlgorithms/paraview_help/AcceleratedAlgorithms_doc.xslt -Dxmls_file=/home/serge/salome/SALOME-9.14.0-native-UB24.04/BUILD/ParaView/Plugins/AcceleratedAlgorithms/CMakeFiles/AcceleratedAlgorithms_doc-xmls.txt -D_paraview_generate_proxy_documentation_run=ON -P /home/serge/salome/SALOME-9.14.0-native-UB24.04/SOURCES/ParaView/CMake/ParaViewClient.cmake
-- CHECK: ==================
-- CHECK: executing command: /usr/bin/xmlpatterns /home/serge/salome/SALOME-9.14.0-native-UB24.04/SOURCES/ParaView/CMake/paraview_servermanager_convert_xml.xsl /home/serge/salome/SALOME-9.14.0-native-UB24.04/SOURCES/ParaView/Plugins/AcceleratedAlgorithms/AcceleratedAlgorithms.xml
-- CHECK: xmlpatterns = /usr/bin/xmlpatterns
-- CHECK: _paraview_gpd_to_xml = /home/serge/salome/SALOME-9.14.0-native-UB24.04/SOURCES/ParaView/CMake/paraview_servermanager_convert_xml.xsl
-- CHECK: _paraview_gpd_xml = /home/serge/salome/SALOME-9.14.0-native-UB24.04/SOURCES/ParaView/Plugins/AcceleratedAlgorithms/AcceleratedAlgorithms.xml
-- CHECK: _paraview_gpd_output =
-- CHECK: _paraview_gpd_error = xmlpatterns: could not find a Qt installation of ''

-- CHECK: _paraview_gpd_result = 1
CMake Error at /home/serge/salome/SALOME-9.14.0-native-UB24.04/SOURCES/ParaView/CMake/ParaViewClient.cmake:600 (message):
  Failed to convert servermanager XML: xmlpatterns: could not find a Qt
  installation of ''

script.log (1.5 MB)

xmlpatterns doesn't work:
serge@xps:~/salome$ xmlpatterns
xmlpatterns: could not find a Qt installation of ''

Hard to guess. There is some inconsistency in your installation.
/usr/bin/xmlpatterns is nothing more than a link to /usr/bin/qtchooser,
and you can "force" qtchooser to target Qt 5 by setting the environment variable QT_SELECT.
You can try the following to check if it helps:

# following variable tells /usr/bin/qtchooser which version of qt to run
export QT_SELECT=qt5
xmlpatterns --help
rm -rf SALOME-9.14.0-native-UB24.04/BUILD/ParaView
SAT/sat -o "APPLICATION.properties.git_server='github'" compile SALOME-9.14.0-native -p ParaView --clean_all

If it fails, maybe someone else has some hints.

As an aside, the Qt versions supported by qtchooser can be listed with:

qtchooser -list-versions

In my case, I have:

/home/salome > qtchooser -list-versions
4
5
default
qt4-x86_64-linux-gnu
qt4
qt5-x86_64-linux-gnu
qt5
/home/salome > export QT_SELECT=qt5
/home/salome > xmlpatterns -version
xmlpatterns version 0.1 using Qt 5.15.13

Strange that I end up with two copies of xmlpatterns. I will fix this problem later. With QT_SELECT exported, the ParaView compilation succeeds.
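To track down the duplicate xmlpatterns, a small shell loop can resolve every match found on PATH (a sketch; substitute any command name for `xmlpatterns`):

```shell
# Resolve every copy of a command found on PATH, to spot duplicates
# such as the qtchooser shim vs. a real Qt 5 binary.
cmd=xmlpatterns
for p in $(which -a "$cmd" 2>/dev/null); do
  printf '%s -> %s\n' "$p" "$(readlink -f "$p")"
done
```

A result pointing at /usr/bin/qtchooser explains the QT_SELECT dependence.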

Now I have problems with the compilation of cgal :
script.log (5.6 KB)
CMakeCache.txt (21.1 KB)


Here you need to install: libmpfr-dev libgmp-dev zlib1g-dev zlib1g libboost-serialization-dev libboost-iostreams-dev libeigen3-dev libmetis-dev.
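On Ubuntu 24.04 that boils down to a single apt command (run with sudo or as root):

```shell
# Install the CGAL build prerequisites listed above (Ubuntu 24.04).
sudo apt-get install -y libmpfr-dev libgmp-dev zlib1g-dev zlib1g \
  libboost-serialization-dev libboost-iostreams-dev libeigen3-dev libmetis-dev
```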
And of course, run:

SAT/sat -o "APPLICATION.properties.git_server='github'" compile SALOME-9.14.0-native -p cgal --clean_all

If OK, then resume compilation with:

SAT/sat -o "APPLICATION.properties.git_server='github'" compile SALOME-9.14.0-native

Now it's C++ itself that crashes when compiling libigl:
CMakeCache.txt (66.2 KB)
script.log (118.8 KB)

Quite strange here, given that I have the same CMakeCache.txt contents. Here you can:

  • try again, since libigl compilation is quite time- and resource-consuming and a glitch may have occurred. Run:

    
      SAT/sat  -o "APPLICATION.properties.git_server='github'" compile SALOME-9.14.0-native -p libigl --clean_all
    
  • if it still fails, edit file SAT_SALOME/products/libigl.pyconf and, just after:

       patches: ['libigl-v2.5.0-p01_609_Boolean.patch', 'libigl-v2.5.0-p02_FindCGAL.patch', 'libigl-v2.5.0-p03-FindBLAS.patch', 'libigl-v2.5.0-p04-FindEmbree.patch']

    add the setting that limits the number of parallel make commands:

        nb_proc : 1

    Then rerun compilation for that product. If it resolves the issue, feel free to open a pull request in SAT_SALOME.

  • remove libigl and mcut (which depends upon libigl) from SAT_SALOME/applications/SALOME-9.14.0-native.pyconf. Libigl is used as one of the possible algorithms for boolean operations on meshes.
    Then resume compilation:

        SAT/sat  -o "APPLICATION.properties.git_server='github'" compile SALOME-9.14.0-native 
    
  • another solution, as a last resort, would be to use an already compiled version, which we could set up.

Personally, I would rerun, and if it still fails, reduce the number of processors.
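For reference, the edited section of libigl.pyconf might look like this (a sketch; the surrounding keys and the exact section name may differ in your copy of the file):

```
default :
{
    ...
    patches : ['libigl-v2.5.0-p01_609_Boolean.patch', 'libigl-v2.5.0-p02_FindCGAL.patch', 'libigl-v2.5.0-p03-FindBLAS.patch', 'libigl-v2.5.0-p04-FindEmbree.patch']
    nb_proc : 1
    ...
}
```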

I have just successfully built SALOME 9.14.0 :+1:
Thanks for your help !


Hey,

I encountered similar issues with embree during its compilation (it's the first thing failing during the overall compilation). I tried to build it individually by running:

./sat/sat -t -o "APPLICATION.properties.git_server='github'" compile SALOME-9.14.0-native -p embree

I think it may be related to my arm64 architecture, because it cannot work with ISPC.

Do you have any clue? I attached the output of the embree compilation.

embree.txt (3.0 KB)

SAT simply extracts an ISPC binary that was compiled on a different architecture, so you’ll need to compile ISPC yourself or use an ISPC binary which corresponds to the ARM architecture, if available. You might encounter other issues. If you find a working solution, feel free to fork SAT_SALOME and open a pull request.
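A quick sanity check is to compare the host architecture with the target architecture of the bundled binary (the ispc path below is a placeholder; point it at wherever SAT extracted the binary):

```shell
# Print the host architecture, then inspect a binary's target architecture;
# an x86-64 binary will not run on an arm64 host.
# /path/to/ispc is a hypothetical placeholder for the extracted ISPC binary.
uname -m
file /path/to/ispc
```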
See here:

Ok, thank you, that's what I did: I found an ISPC tarball for Linux on the ARM architecture. However, Embree is full of Intel-specific builds (ISA, SSE2, AVX) that have no ARM equivalents.
I disabled them, but it seems that the kernels in Embree strongly rely on these and have been built around SSE2.

It may be a really long work.

Newer releases of EMBREE may support ARM, but this needs verification. You could try a newer version to see if it works.
Alternatively, consider removing EMBREE and OSPRay to see if you can proceed. Be aware that dropping OSPRay will disable ray tracing support in ParaView.

Therefore, your options are to either try newer EMBREE versions or remove EMBREE and OSPRay.

To remove a set of products, consult the "rm_products" section in the application's Python configuration file for implementation details:

The following code snippet shows how the OSPRay dependency is managed during ParaView's construction:

Thank you Nabil. I proceeded through the recompilation and ARM update for ispc, embree, openVKL, rkCommon, and opsray. I’ll let you know if I end up with something compiling.


Hey Nabil, I was able to finish the SALOME compilation. I still have some issues, especially with the plugins BLSURFPLUGIN and HYBRIDPLUGIN. This creates issues for PARAVIS and SOLVERLAB. All the other modules compiled properly.

Let me know your opinion about the reliability of the tool in this configuration.

I'm able to launch ParaView, though. Do you think it could be an interesting addition to your repository? If so, I'll be glad to work on it and help you.

Thank you for your help.

Please share any errors you're encountering. BLSURFPLUGIN and HYBRIDPLUGIN are primarily interfaces for the 3DS MeshGems mesher, which anyhow requires a license. I'm not sure why they would be affecting PARAVIS or SOLVERLAB.

Note that the MeshGems binaries are in any case built for the x86-64 architecture and I doubt they run on your arm64 machine, so you can forget about these plugins, to which you can add HexoticPLUGIN (a 3D mesher belonging to the same suite).

MeshGems/bin/Linux_64> file mg-hybrid.exe
mg-hybrid.exe: ELF 64-bit LSB executable, x86-64, version 1 (SYSV), dynamically linked, interpreter /lib64/ld-linux-x86-64.so.2, for GNU/Linux 2.6.32, BuildID[sha1]=b30e6a876836117f952a88443d4b6d660a5a0317, stripped

In addition, note that once you compile SALOME, you can check at this link, how to validate your construction (step 9 onwards):

Yes, exactly, I saw that MeshGems is x86-only. It compiles but won't be usable so far, which is why the plugins are failing. The Hexotic plugin actually got installed. When I launch the tests from the Python command I get one error:

======================================================================
ERROR: test_fields (fields_test.TestFields.test_fields)
Quick test for Fields module
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/ludovic_jantzen/Documents/SALOME-9.14.0-native-UB24.04-SRC/INSTALL/FIELDS/bin/salome/fields_test.py", line 45, in test_fields
    import medcalc
  File "/home/ludovic_jantzen/Documents/SALOME-9.14.0-native-UB24.04-SRC/INSTALL/FIELDS/lib/python3.12/site-packages/salome/medcalc/__init__.py", line 54, in <module>
    from .medpresentation import MakeMeshView
  File "/home/ludovic_jantzen/Documents/SALOME-9.14.0-native-UB24.04-SRC/INSTALL/FIELDS/lib/python3.12/site-packages/salome/medcalc/medpresentation.py", line 24, in <module>
    import pvsimple as pvs
ModuleNotFoundError: No module named 'pvsimple'

----------------------------------------------------------------------
Ran 6 tests in 1.553s

FAILED (errors=1)

I presume that if you click on FIELDS, you get a crash?
Here, you need to check whether the ParaView, GUI and PARAVIS installations are OK.
For ParaView, check that ParaView/lib exists and is not empty.

In addition, you need to check that this file: PARAVIS/lib/python3.12/site-packages/salome/pvsimple.py exists.
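To see whether Python can import pvsimple at all, and from which file, something like this can be run inside a SALOME environment (a sketch; it prints the module's location, or a message if the module is missing):

```shell
# Ask Python where (if anywhere) pvsimple would be imported from.
python3 -c "import importlib.util as u; s = u.find_spec('pvsimple'); print(s.origin if s else 'pvsimple NOT found')"
```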

As a consistency check, compare the installation directory structure with that of an x86-64 Ubuntu 24.04 archive, using a tool like meld.
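If meld is not available, a recursive diff gives a first overview (the directory names below are hypothetical; use your two installation trees):

```shell
# List files present in one installation tree but not the other.
diff -qr INSTALL-arm64/ INSTALL-x86_64/ | grep '^Only in' | head
```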

By following these steps, you can effectively check and ensure the integrity of your installation.

I have a problem compiling MEDCOUPLING on Ubuntu 24.04. I've followed the instructions from the SAT wiki, installing SALOME-master-native (it is the SALOME 9.14.0 version, because of tag=V9_14_0-9-g426a3d4 in the local settings of SAT, as I understand it).

In SALOME-master-native-UB24.04/LOGS/MEDCOUPLING/make:
/usr/lib/x86_64-linux-gnu/openmpi/include/mpi.h:404:47: error: cast from 'void*' is not allowed
  404 | #define OMPI_PREDEFINED_GLOBAL(type, global) (static_cast<type> (static_cast<void *> (&(global))))

I have OpenMPI 4.1.6 and gcc/g++ 13.3.0.

I had installed all prerequisites and additional utilities from ub24-system.txt.
make.txt (130.8 KB)
ub24-system.txt