Hi everyone,
I am currently running the shhh.flows command on numerous datasets. To save time, and because my computer is not that powerful, I use an HPC cluster to run the program. The jobs run fine when the file.trim.flow is not too big, but when it is larger (around a million KB, i.e. ~1 GB), the job stops with what seems to be a segmentation fault:
segfault at 2aae4bc8b000 ip 0000000000da02fb sp 00007fffffffbf70 error 6 in mothur[400000+f0b000]
TERM environment variable not set.
/cm/local/apps/torque/var/spool/mom_priv/jobs/774025.SC: line 18: 36315 Segmentation fault mothur $HOME/batchfilebai.txt
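For context, here is a minimal sketch of the kind of Torque submission script I use (the resource requests and job name are illustrative placeholders, not my exact settings):

```shell
#!/bin/bash
#PBS -N mothur_shhh           # job name (placeholder)
#PBS -l nodes=1:ppn=8         # one node, 8 cores (placeholder values)
#PBS -l mem=16gb              # memory request (placeholder value)
#PBS -l walltime=24:00:00     # max run time (placeholder value)

# run from the directory the job was submitted from
cd $PBS_O_WORKDIR

# run mothur in batch mode on my batch file
mothur $HOME/batchfilebai.txt
```

I submit it with `qsub`, and the small inputs complete without any problem under the same script.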
I have talked with our HPC support; they don't see any problem with my jobs, so the problem seems to come from mothur.
(I'm using the latest official binary, v1.37.2.)
Any ideas?
Thank you!