# Negative F value in 2-way PERMANOVA

I am running a PERMANOVA on an OTU table to examine whether 2 factors affect the composition of my OTUs.
Each factor significantly affects the bacterial OTU composition, and each gives a positive F value (as one would expect).
However, the 2-way interaction between the factors gives me a negative F value!
I understand that in PERMANOVA it is a pseudo-F, but still… what does a negative F value mean?

here is the output:
Two-way PERMANOVA
Permutation N: 9999

| Source      | Sum of sqrs | df  | Mean square | F       | p      |
|-------------|-------------|-----|-------------|---------|--------|
| Taxonomy    | 5.4349E08   | 2   | 2.7175E08   | 9.6671  | 0.0001 |
| Actino      | 1.4429E08   | 1   | 1.4429E08   | 5.1329  | 0.0001 |
| Interaction | -1.3142E08  | 2   | -6.5711E07  | -2.3376 | 0.0004 |
| Residual    | 2.8391E09   | 101 | 2.811E07    |         |        |
| Total       | 3.3955E09   | 106 |             |         |        |
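For what it's worth, the F column is just each effect's mean square divided by the residual mean square, so a negative interaction SS propagates straight through to a negative pseudo-F. A quick sanity check against the output above (mean squares copied from the table; ratios match the reported F values to rounding):

```python
# Mean squares taken from the PAST output above
ms_taxonomy    =  2.7175e8
ms_actino      =  1.4429e8
ms_interaction = -6.5711e7
ms_residual    =  2.811e7

# pseudo-F = MS_effect / MS_residual
for name, ms in [("Taxonomy", ms_taxonomy),
                 ("Actino", ms_actino),
                 ("Interaction", ms_interaction)]:
    print(name, round(ms / ms_residual, 4))
```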

any ideas?

PS: I wasn't sure where to post this question, so I apologize in advance if this is the wrong forum... :(

I googled "negative f permanova"; your duplicate questions on 2 other boards came up first, and this answer was third.

Well, I have obviously been looking for answers other than the one in that forum. By the way, I am not using R; I am using PAST to run the PERMANOVA.

As for my duplicated/triplicated question: is there a rule against posting the same question on different forums?

It's generally not a great idea to spam a bunch of fora. The link I provided applies to PERMANOVA regardless of what software you ran it in.

Most likely this happens when your data are strongly non-Euclidean: with semimetric dissimilarities such as Bray-Curtis, the underlying (Gower-centered) matrix can have negative eigenvalues, so partial sums of squares, and hence the pseudo-F, can come out negative.

You likely have an underlying gradient whose relationship to the communities is a step function rather than a gradual change. This is common in community data and makes PERMANOVA inappropriate for evaluating that relationship.
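To make the non-Euclidean point concrete, here is a minimal sketch (the 4-point distance matrix is made up for illustration, and this is not PAST's code): a set of distances that obeys the triangle inequality but cannot be embedded in Euclidean space. Its Gower-centered matrix, whose trace PERMANOVA partitions into sums of squares, has a negative eigenvalue, which is how a sum of squares obtained by subtraction can end up below zero.

```python
import numpy as np

def gower_center(D):
    """Gower-centered matrix G; PERMANOVA partitions trace(G) into sums of squares."""
    A = -0.5 * D**2
    n = D.shape[0]
    C = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return C @ A @ C

# A "star" metric: a hub at distance 1 from three leaves that are pairwise
# at distance 2. It satisfies the triangle inequality but is not
# Euclidean-embeddable (an equilateral triangle with side 2 has
# circumradius 2/sqrt(3) > 1, so no point can be at distance 1 from all three).
D = np.array([
    [0., 1., 1., 1.],
    [1., 0., 2., 2.],
    [1., 2., 0., 2.],
    [1., 2., 2., 0.],
])

G = gower_center(D)
eigvals = np.linalg.eigvalsh(G)
print(eigvals.min() < 0)  # a negative eigenvalue: negative partial SS are possible
```

With a Euclidean distance matrix, G is positive semidefinite and every sum of squares is non-negative; semimetric dissimilarities give no such guarantee.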

In that case (if PERMANOVA is not applicable), what would you use instead?

Have you plotted your communities in an NMS (non-metric multidimensional scaling) ordination? That will tell you whether your PERMANOVA issues are due to outliers.
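If you want a quick NMS without leaving Python, here is a minimal sketch using scikit-learn's `MDS` with `metric=False` on a precomputed Bray-Curtis matrix. The OTU counts below are made-up toy data; swap in your own table and inspect the ordination colored by your two factors:

```python
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
counts = rng.poisson(5, size=(12, 20)).astype(float)  # toy OTU table: 12 samples x 20 OTUs

# Bray-Curtis dissimilarity between all sample pairs
n = counts.shape[0]
D = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        num = np.abs(counts[i] - counts[j]).sum()
        den = (counts[i] + counts[j]).sum()
        D[i, j] = num / den

# Non-metric MDS (NMS/NMDS) on the precomputed dissimilarities
nmds = MDS(n_components=2, metric=False, dissimilarity="precomputed", random_state=0)
coords = nmds.fit_transform(D)
print(coords.shape)  # one (x, y) point per sample; plot and color by each factor
```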

How to assess communities that respond to non-linear gradients is a huge topic in macroecology; dive into that literature. I've used regression trees, machine learning, linear models, etc.