I wonder if such a system could be used for "programming by example". I.e., you write a bunch of example behaviors by hand, and the system learns a program that reproduces them.
I was pondering that same thought here. Modeling the user's intent is one of the harder problems in end-user development - it would be great to have a more-or-less standard way to infer what abstraction the user had in mind when giving an example that ought to be generalized into a program.
Yesterday's Program Synthesis Demo [1] with MS TouchDevelop shows that the approach is viable. What I find lacking in such systems is a way to correct the inferred program in the inevitable cases where the learning system guesses wrong; a language for modeling the possible explanations of an example would let the user tweak the program, or give hints on how to correct it.
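To make the idea concrete, here's a minimal sketch (not TouchDevelop's actual algorithm; the candidate set and names are made up for illustration) of enumerative programming-by-example over a tiny space of string programs. Instead of committing to one guess, it returns every candidate consistent with the examples, which is exactly what you'd need to show the user the "possible explanations" and let them disambiguate:

```python
# Hypothetical PBE sketch: a fixed candidate space of string programs.
CANDIDATES = {
    "uppercase": str.upper,
    "lowercase": str.lower,
    "reverse": lambda s: s[::-1],
    "first-char": lambda s: s[:1],
    "strip": str.strip,
}

def consistent_programs(examples):
    """Return names of all candidates matching every (input, output) pair."""
    return [name for name, fn in CANDIDATES.items()
            if all(fn(inp) == out for inp, out in examples)]

# A single example is often ambiguous - several programs explain it:
print(consistent_programs([("a", "a")]))
# Adding a second example narrows the set of surviving explanations:
print(consistent_programs([("a", "a"), ("ab", "ab")]))
```

With a richer program space you'd rank the survivors instead of listing them all, but the point stands: keeping the whole consistent set around is what makes "the system guessed wrong, let me pick another explanation" possible.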