What is the maximum file size for the make.shared command? I used hcluster on a large file of sequences (9,700 KB), and now I need to group them and make shared files and trees. Every time I run make.shared, mothur crashes. How can I get around this? I cannot delete redundant sequences, because the point is to compare/contrast diversity across my samples.
I’m not following… Can you post the exact commands you are entering in order? To run make.shared you need a list and group file. Also, you can use mothur to build trees via clearcut using the unique sequences and then add the redundant sequences back in.
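For reference, this is roughly the sequence of commands I have in mind (the file names here are placeholders, not your actual files): make.shared needs the list file from your cluster/hcluster step plus the matching group file, and for the tree you can collapse to unique sequences, run clearcut, and then restore the redundant names with deunique.tree.

```
mothur > make.shared(list=final.an.list, group=final.groups, label=0.03)

mothur > unique.seqs(fasta=final.fasta)
mothur > clearcut(fasta=final.unique.fasta, DNA=T)
mothur > deunique.tree(tree=final.unique.tre, name=final.names)
```

Building the tree from the unique sequences keeps clearcut's workload down, and deunique.tree adds the redundant sequences back as zero-length branches so downstream calculators still see every read.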
I am also having this issue. I am running the Windows 64-bit version. When I enter the make.shared command, everything seems OK for a while, and then mothur crashes. The first time I tried to run it (using label=0.03), Windows told me I was running out of memory, and mothur flashed an error message and then closed; the second time (using label=0.05), it just shut down by itself with no message from mothur or from Windows. Is it possible that I don’t have enough memory on my machine? My laptop has 6 GB of RAM, and I have not had any issues until now.
Could you send your list and group file to firstname.lastname@example.org?
I’m pretty sure they are too big to email… the list file is ~5.9 GB and the group file is ~2.9 GB. The data set is huge (a full lane of Illumina sequences), which is why I thought it might just be too much for my machine to handle. I am running the make.shared command with the same files on an iMac with 8 GB of RAM, and it seems to run OK, although it has been computing for about 27 hours so far…