I'd say the very idea of using a corpus, instead of picking examples out of thin air and pronouncing them sound or unsound, has affected all parts of linguistics.
Speaking really generally: Chomsky spoke at my school years ago, and I thought he had an interesting insight into CL. He said few understand its potential to empirically and computationally verify the language models that paper-and-pencil linguists, like himself, were creating. So UG, for example, if it is a valid way to describe syntax, should be usable in a computer application to emulate human processing of language, and that application should be good enough to show that the model works. A lot of the community, in my view when I studied CL at a school where paper and pencil ruled the land, saw CL as way out there, a field unto itself. He believed really solid language models had to be reconstructed and tested, and that CL was where the rubber hit the road, not the place where people use heuristics to cheat at language cognition, as a lot of it has become for reasons of economic viability.
An important, if often overlooked or ignored, contribution of CL/NLP to theoretical linguistics is implemented demonstrations of insufficiency. More specifically, implementing "pencil and paper" theoretical models derived from a small set of examples and testing them on suitably large quantities of sentences can (and does!) reveal failures.
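To make that concrete, here is a minimal sketch of what such a test looks like in practice. The toy grammar and the sentences are invented purely for illustration (this is not any published model), and NLTK's chart parser is just a convenient stand-in for a real implementation: a small hand-written CFG of the kind one might derive from a handful of textbook examples is run over a batch of sentences, and the ones it cannot handle are reported.

```python
import nltk

# A toy "pencil and paper" grammar, written from a few textbook examples.
grammar = nltk.CFG.fromstring("""
    S  -> NP VP
    NP -> Det N
    VP -> V NP
    Det -> 'the' | 'a'
    N  -> 'dog' | 'cat'
    V  -> 'chased' | 'saw'
""")
parser = nltk.ChartParser(grammar)

# A batch of test sentences (invented here; in a real test this would be
# thousands of corpus sentences).
sentences = [
    "the dog chased a cat",
    "the cat saw the dog",
    "dogs chase cats",           # fails: bare plurals are outside the grammar
    "the dog that I saw slept",  # fails: relative clauses are outside the grammar
]

for sent in sentences:
    tokens = sent.split()
    try:
        trees = list(parser.parse(tokens))
    except ValueError:           # the grammar does not cover some word
        trees = []
    print(("OK   " if trees else "FAILS"), sent)
```

Even at this toy scale the point is visible: a grammar that looks perfectly adequate on its motivating examples falls over the moment it is confronted with data it was not written around.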
A particular example — not widely quoted in the theoretical community — is a paper available on the Rutgers Optimality Archive by Lauri Karttunen, entitled "The Insufficiency of Paper-and-Pencil Linguistics: the Case of Finnish Prosody", which demonstrates via computational implementation that published Optimality Theoretic analyses of the Finnish prosodic system fail unexpectedly on a certain class of inputs.
It might be argued that this isn't a contribution of CL/NLP per se, but such an implementation of OT is only made possible by heavy use of finite-state machines, a core tool of modern CL/NLP.
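For a flavour of that machinery, here is a toy, hand-rolled transducer sketch. To be clear about what is mine: Karttunen's implementation uses a full finite-state calculus (tools in the xfst tradition), and the phenomenon below is a deliberately simplified version of Finnish vowel harmony rather than the prosodic system discussed in the paper; the point is only to show what "encoding an analysis as a finite-state device" looks like.

```python
# A toy two-state finite-state transducer for Finnish-style vowel harmony:
# the archiphoneme 'A' in a suffix is realized as 'a' after back vowels
# (a, o, u) and as 'ä' after front vowels (ä, ö, y).  The states record
# which harmony class was seen last.  Alphabet and rule are simplified
# purely for illustration.

BACK, FRONT = "back", "front"
back_vowels, front_vowels = set("aou"), set("äöy")

def transduce(word, state=BACK):
    output = []
    for ch in word:
        if ch == "A":                    # archiphoneme: realize according to state
            output.append("a" if state == BACK else "ä")
        else:
            output.append(ch)
            if ch in back_vowels:
                state = BACK
            elif ch in front_vowels:
                state = FRONT
            # neutral vowels (e, i) and consonants leave the state unchanged
    return "".join(output)

print(transduce("talo" + "ssA"))   # -> talossa  (back harmony)
print(transduce("kylä" + "ssA"))   # -> kylässä  (front harmony)
```

Once an analysis is stated in these terms, it can be composed with other machines and run over arbitrary inputs, which is exactly what makes the exhaustive testing of an OT analysis feasible.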
I'll mention just one of the aspects I have seen myself.
When I spoke to Prof. Pollard, who, I suppose, can be considered a paper-and-pencil linguist, I believe he said that the reason he did HPSG, and not the more logic-driven approach he is pursuing now (CVG), was simply that he did not know logic at the time. The slides for his course on such a formalism are at http://www.coli.uni-saarland.de/courses/logical-grammar/page.php?id=materials. In any case, doing derivations for sentences in such a grammar is exactly like doing derivations in mathematical logic.
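To give a sense of what "derivations like in mathematical logic" means, here is a small sketch in the general categorial/type-logical style. This is not CVG or anything from Pollard's course; the lexicon and types are invented, and directionality and much else are ignored. The idea is just that words get logical types and a sentence is derived by function application, the way premises are discharged in a proof.

```python
from dataclasses import dataclass

# Categorial-style types: atomic categories and functions between them.
@dataclass(frozen=True)
class Atom:
    name: str
    def __str__(self): return self.name

@dataclass(frozen=True)
class Func:
    arg: object
    result: object
    def __str__(self): return f"({self.arg} -> {self.result})"

NP, S = Atom("NP"), Atom("S")

# A toy lexicon (invented): each word is assigned a logical type.
lexicon = {
    "Kim":    NP,
    "sleeps": Func(NP, S),            # intransitive verb: NP -> S
    "sees":   Func(NP, Func(NP, S)),  # transitive verb: NP -> NP -> S
}

def apply(fn, arg):
    """One derivation step: modus-ponens-like function application."""
    if isinstance(fn, Func) and fn.arg == arg:
        return fn.result
    raise TypeError(f"cannot apply {fn} to {arg}")

# Derive "Kim sees Kim" step by step, as one would in a proof.
vp = apply(lexicon["sees"], lexicon["Kim"])   # yields NP -> S
s  = apply(vp, lexicon["Kim"])                # yields S
print(s)
```

A derivation succeeds exactly when the types can be combined into S, so checking grammaticality becomes proof search, which is what makes this style of grammar so amenable to mathematical and computational treatment.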
Coming to my point now: from what I've seen, which isn't much, it's not NLP/CL that's at the cutting edge of linguistics; it's mathematics. But this does address the aspects you raised about computability and formal specification.
A note about Pollard's logical grammar: it addresses three aspects of sentences (phonology, syntax and semantics), all with exactly the same logic-based methods.
Most theoretical linguists understand the value of the computer in modelling language as a means of verification. However, most computational linguists are focused on statistical models, which are productive for real-world applications but run counter to the aims of theoretical linguistics. Statistics simply plays no role in human language.
The mismatch in approaches results in both fields largely ignoring one another.
Be that as it may, many theoretical models, such as LFG and HPSG, incorporate discoveries and approaches from computational linguistics. The fields are not entirely mutually exclusive, just largely so.