
We investigate the geometric limits of the Linear Propagation Assumption (LPA), the premise that a local update coherently propagates to its logical consequences. For negation and converse, we prove that guaranteeing direction-agnostic first-order propagation requires a tensor factorisation that separates entity-pair context from relation content. For composition, we identify a fundamental obstruction: composition reduces to conjunction, and any conjunction that is well-defined on linear features must be bilinear. Since bilinearity is incompatible with negation, supporting both operations forces the feature map to collapse. These results suggest that failures in knowledge editing, the reversal curse, and multi-hop reasoning may stem from common structural limitations inherent in the LPA.
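
To make the final obstruction concrete, the following is a minimal sketch of the collapse under illustrative assumptions not fixed above (the symbols $\varphi$, $B$, $c$, and $f_{\bot}$ are ours): each proposition $p$ carries a feature $u = \varphi(p) \in \mathbb{R}^{d}$, the feature set is closed under scaling, conjunction is a bilinear map $B$, negation acts affinely as $\varphi(\neg p) = c - \varphi(p)$, and the Boolean laws $p \wedge \neg p = \bot$ and $p \wedge \top = p$ hold at the feature level.

% Illustrative only: assumes affine negation and feature-level Boolean laws.
\begin{align*}
  B(u,\, c - u) &= B(u,c) - B(u,u) = f_{\bot}
    && \text{(law $p \wedge \neg p = \bot$, with $f_{\bot} = \varphi(\bot)$)}\\
  \lambda B(u,c) - \lambda^{2} B(u,u) &= f_{\bot} \quad \text{for all } \lambda
    && \text{(apply the same law to the scaled feature $\lambda u$)}\\
  \Longrightarrow\quad f_{\bot} = 0,\quad B(u,c) &= 0,\quad B(u,u) = 0
    && \text{(match powers of $\lambda$)}\\
  u = B\bigl(u, \varphi(\top)\bigr) &= B(u,\, c - f_{\bot}) = B(u,c) = 0
    && \text{(law $p \wedge \top = p$, with $\top = \neg\bot$)}
\end{align*}

so every feature is forced to zero, i.e.\ the feature map collapses. The affine form of negation and the feature-level laws are assumptions chosen for this illustration, not notation carried over from the statements above.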