The algorithm as defined is sound and complete in the usual depth-first, backtracking search sense. Clearly, the parser may enter an infinite loop (if non-branching rules are defined that may feed themselves, or if a grammar makes heavy use of empty categories). However, if the parser does terminate, one can be sure that it has found all solutions. The parser always terminates if the grammar consists solely of non-chain-rules that consume some input, and chain-rules that either branch or consume input. An example of grammars that adhere to this condition are the semi-lexicalized Tree Adjoining Grammars discussed above. Another example of such a grammar would be a constraint-based grammar in which each non-chain-rule is a lexical entry and each chain-rule branches.
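The sufficient termination condition can be stated as a simple check over the rules of a grammar. The following is a minimal sketch, not part of the parser itself; the rule representation (dictionaries with `chain`, `consumes`, and `branches` flags) is a hypothetical encoding chosen purely for illustration.

```python
def guaranteed_to_terminate(rules):
    """Sufficient condition from the text: the parser always terminates if
    every non-chain-rule consumes some input, and every chain-rule either
    branches or consumes input.

    `rules` is a list of dicts with (hypothetical) boolean keys:
      'chain'    -- is this a chain rule?
      'consumes' -- does the rule consume some input?
      'branches' -- does the rule have more than one daughter?
    """
    return all(
        (r["branches"] or r["consumes"]) if r["chain"] else r["consumes"]
        for r in rules
    )

# A grammar in which each non-chain-rule is a lexical entry (consumes input)
# and each chain-rule branches satisfies the condition:
safe = [
    {"chain": False, "consumes": True, "branches": False},  # lexical entry
    {"chain": True, "consumes": False, "branches": True},   # branching chain rule
]

# A unary chain rule that consumes nothing may feed itself, so the
# condition fails and termination is no longer guaranteed:
unsafe = safe + [{"chain": True, "consumes": False, "branches": False}]

print(guaranteed_to_terminate(safe))    # True
print(guaranteed_to_terminate(unsafe))  # False
```

Note that the condition is sufficient but not necessary: a grammar violating it may still happen to terminate on every input.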
A parser is called minimal iff it returns exactly one solution for each possible derivation. As it stands, the parser is not minimal in this sense: if the same word occurs more than once in the input string, we may find the same solution several times. The problem arises because a list is not an appropriate data structure for bags. For example, removing an element from a bag should not be nondeterministic, but it is if we use lists. It is straightforward to encode bags with a more appropriate data structure, such as the one found in some Prolog libraries, which we adopt, with slight modifications, in the next subsection. Readers who are willing to believe that such an encoding is possible can skip that subsection.
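The source of the spurious ambiguity can be made concrete. The sketch below, in Python rather than the paper's Prolog, contrasts the two representations: with a list, removing a word that occurs twice succeeds in two distinct ways (one per occurrence), so the parser finds each derivation once per choice; with a multiset encoding (here Python's `collections.Counter`, standing in for the Prolog library representation), removal yields a single result regardless of multiplicity.

```python
from collections import Counter

def list_removals(items, x):
    """All ways to remove one occurrence of x from a list.
    Nondeterministic: one result per occurrence of x."""
    return [items[:i] + items[i + 1:] for i, v in enumerate(items) if v == x]

def bag_remove(bag, x):
    """Remove one occurrence of x from a bag (multiset).
    Deterministic: at most one result, whatever the multiplicity of x."""
    if bag[x] == 0:
        return None          # x not in the bag: removal fails
    rest = bag.copy()
    rest[x] -= 1
    if rest[x] == 0:
        del rest[x]
    return rest

words = ["the", "dog", "the"]          # "the" occurs twice
print(len(list_removals(words, "the")))       # 2 -- two removal choices
print(bag_remove(Counter(words), "the"))      # one result: {'the': 1, 'dog': 1}
```

With the list encoding, the two removal choices are indistinguishable in the result, which is exactly how duplicate solutions enter the search space.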
The other subsections discuss ways to improve the efficiency of the head-corner parser. These subsections are somewhat technical and may also be skipped, if the reader is willing to accept that several modifications are possible that improve the efficiency of the parser defined so far.