The worst notational abuse
I have lately been reading Programming Distributed Computing Systems: A Foundational Approach, by Carlos A. Varela, and have so far been fairly impressed. The presentation is clear, and the topics (the \(\pi\) calculus, actors, the join calculus, and mobile ambients) are things I probably should already know more about than I do. But it has reminded me of the worst abuse of notation in theoretical computer science that I am aware of.
In the \(\pi\) calculus, computation occurs by processes exchanging messages across channels. Creating a channel (roughly the equivalent, in that calculus, of introducing a new variable in a \(\lambda\) expression) is done with the syntax:
$$ (\nu c)P $$
where \(c\) is the new channel's name and \(P\) is the remainder of the process, in which \(c\) is bound.
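For instance, a minimal sketch (the exact syntax for output and input prefixes varies between presentations; I'm using the common \(\bar{c}\langle a\rangle\) for "send \(a\) on \(c\)" and \(c(x)\) for "receive on \(c\), binding \(x\)"): a process that creates a fresh channel \(c\) and uses it to pass a name \(a\) can take a reduction step

$$ (\nu c)\bigl(\bar{c}\langle a\rangle.P \mid c(x).Q\bigr) \rightarrow (\nu c)\bigl(P \mid Q\{a/x\}\bigr) $$

Here the sender and receiver synchronize on \(c\), the name \(a\) is substituted for \(x\) in \(Q\), and \(c\) stays private to the two processes under the \(\nu\) binder.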
What is the Greek letter \(\nu\)? Nu. New \(c\), geddit?
Sigh.