Mothur stops during shhh.flows

Hi everyone,

I am currently using the shhh.flows command on several datasets. To save time, and because my computer is not very powerful, I use an HPC cluster to run the program. The jobs run fine when the .trim.flow file is not too big, but when it is larger (about a million kB, roughly 1 GB), the job stops with what appears to be a segmentation fault:
segfault at 2aae4bc8b000 ip 0000000000da02fb sp 00007fffffffbf70 error 6 in mothur[400000+f0b000]
TERM environment variable not set.
/cm/local/apps/torque/var/spool/mom_priv/jobs/774025.SC: line 18: 36315 Segmentation fault mothur $HOME/batchfilebai.txt

I have talked with our HPC support; they don't see any problem with my jobs, so the problem seems to come from mothur.
(I'm using the latest official binary, v1.37.2.)

Any idea?
Thank you!

Shhh.flows is very memory intensive, and the error you are getting indicates a memory issue. If you are unable to process your data with the shhh.flows command due to memory limitations, you can use the trim.seqs command with quality screening instead. Pat has an example of this in the 454 SOP analysis.
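For reference, a quality-screening trim.seqs call along the lines of the 454 SOP would look something like the sketch below. The file names are placeholders for your own data, and the exact thresholds (maxambig, maxhomop, window settings) should be tuned to your reads:

```text
# sketch of trim.seqs with sliding-window quality screening,
# in place of the trim.flows/shhh.flows denoising path;
# replace mydata.* with your own fasta/qual/oligos files
trim.seqs(fasta=mydata.fasta, oligos=mydata.oligos, qfile=mydata.qual, maxambig=0, maxhomop=8, qwindowaverage=35, qwindowsize=50, processors=2)
```

This trades the flowgram-based denoising of shhh.flows for a much lighter quality-window filter, which is why it avoids the memory problem.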

Thank you for your answer, I’ll use this alternative then!

Before giving up on the flowgram approach, you should make sure that the minflows and maxflows values you used in trim.flows are the same. When people select different values there, they tend to run into the type of problem you are describing.
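In other words, the trim.flows step feeding shhh.flows would look something like this sketch (file names are placeholders; pdiffs/bdiffs are just example mismatch allowances):

```text
# sketch: keep minflows and maxflows equal (450 is the value used in the 454 SOP)
# so every flowgram passed to shhh.flows has the same length
trim.flows(flow=mydata.flow, oligos=mydata.oligos, pdiffs=2, bdiffs=1, minflows=450, maxflows=450, processors=2)
shhh.flows(file=mydata.flow.files, processors=2)
```

With unequal minflows/maxflows, shhh.flows receives flowgrams of varying length, which is a common trigger for the kind of crash you saw.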