Church, who gets dual credit for the theorem, showed the result in the context of the lambda calculus. His calculus, as a model for logic, is broken: it proves a contradiction (this is the Kleene-Rosser paradox). The proof of the contradiction involves the Y-combinator, a lambda term that gives you arbitrary recursion, and hence infinite looping, and which can be used to build a term that switches back and forth between true and false.
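For concreteness, here's a minimal Haskell sketch of the Y-combinator (in the untyped calculus it's just λf.(λx.f (x x)) (λx.f (x x))). The `Rec` newtype is my own device for smuggling the self-application `x x` past Haskell's type checker; in a genuinely untyped calculus no wrapper is needed:

```haskell
-- The self-application (x x) inside Y is not simply typeable, so we
-- hide the recursive type behind a newtype to get it past the checker.
newtype Rec a = Roll { unroll :: Rec a -> a }

-- y f reduces to f (y f): arbitrary recursion from a single term.
y :: (a -> a) -> a
y f = (\x -> f (unroll x x)) (Roll (\x -> f (unroll x x)))

-- Factorial without any explicit recursion.
factorial :: Integer -> Integer
factorial = y (\rec n -> if n == 0 then 1 else n * rec (n - 1))

main :: IO ()
main = print (factorial 5)  -- 120

-- By contrast, (y not) never terminates: it flips between True and
-- False forever, which is the looping term behind the contradiction.
```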
The lambda calculus, as a logic, gets fixed by the introduction of types. Church's types give the simply typed lambda calculus, which forbids terms like the Y-combinator. It's possible, as first shown by Hindley and Milner, to write an algorithm that takes an arbitrary untyped lambda term and, if it can be simply typed, assigns it a type. This algorithm is present in a number of typed functional programming languages, including Standard ML, Haskell, OCaml and F#. Simply typed terms are always strongly normalising: they never go into infinite loops. As Milner put it, "well-typed programs cannot go wrong."
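To make that concrete, here is a rough sketch of the unification-based inference at the heart of Hindley-Milner, restricted to simple types (full Hindley-Milner adds let-polymorphism, and production implementations are far more elaborate; all the names below are mine). The occurs check in `unify` is precisely what rejects the self-application inside the Y-combinator:

```haskell
import qualified Data.Map as M

-- Untyped lambda terms and simple types.
data Term = Var String | Lam String Term | App Term Term
data Type = TVar Int | Arrow Type Type deriving (Eq, Show)

-- A substitution maps type variables to types.
type Subst = M.Map Int Type

apply :: Subst -> Type -> Type
apply s t@(TVar n)  = M.findWithDefault t n s
apply s (Arrow a b) = Arrow (apply s a) (apply s b)

-- compose s2 s1 applies s1 first, then s2.
compose :: Subst -> Subst -> Subst
compose s2 s1 = M.map (apply s2) s1 `M.union` s2

occurs :: Int -> Type -> Bool
occurs n (TVar m)    = n == m
occurs n (Arrow a b) = occurs n a || occurs n b

-- Unification: the occurs check is what rejects terms like \x -> x x.
unify :: Type -> Type -> Maybe Subst
unify (TVar n) t
  | t == TVar n = Just M.empty
  | occurs n t  = Nothing                    -- would need an infinite type
  | otherwise   = Just (M.singleton n t)
unify t (TVar n) = unify (TVar n) t
unify (Arrow a b) (Arrow c d) = do
  s1 <- unify a c
  s2 <- unify (apply s1 b) (apply s1 d)
  Just (compose s2 s1)

-- Algorithm-W-style inference; the Int is a supply of fresh type variables.
infer :: M.Map String Type -> Int -> Term -> Maybe (Subst, Type, Int)
infer env n (Var x) = do
  t <- M.lookup x env
  Just (M.empty, t, n)
infer env n (Lam x body) = do
  let a = TVar n
  (s, tb, n') <- infer (M.insert x a env) (n + 1) body
  Just (s, Arrow (apply s a) tb, n')
infer env n (App f arg) = do
  (s1, tf, n1) <- infer env n f
  (s2, ta, n2) <- infer (M.map (apply s1) env) n1 arg
  let r = TVar n2
  s3 <- unify (apply s2 tf) (Arrow ta r)
  Just (compose s3 (compose s2 s1), apply s3 r, n2 + 1)

typeOf :: Term -> Maybe Type
typeOf t = fmap (\(_, ty, _) -> ty) (infer M.empty 0 t)

main :: IO ()
main = do
  print (typeOf (Lam "x" (Var "x")))                 -- Just (Arrow (TVar 0) (TVar 0))
  print (typeOf (Lam "x" (App (Var "x") (Var "x")))) -- Nothing: occurs check fails
```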
The Hindley-Milner type inference algorithm is therefore possibly the most famous algorithm for showing that a class of programs halts, but, as lpetrich hints, there are infinitely many such classes.
I need to get round to writing a blog post about these connections between logic and computation, but I'm still trying to figure out how to pitch it.