Long-distance dependencies are notoriously difficult to analyze in a formally explicit way because they involve constituents that seem to have been extracted from their canonical position in an utterance. The most widespread solution is to posit a gap at the extraction site and to communicate information about that gap to its filler, as in "What_FILLER did you see _GAP?". This paper rejects the filler-gap solution and proposes a cognitive-functional alternative in which long-distance dependencies emerge spontaneously as a side effect of how grammatical constructions interact with each other to express different conceptualizations. The proposal is supported by a computational implementation in Fluid Construction Grammar that works for both parsing and production.