**Speaker: Barbara Hammer (Bielefeld University)**

In a nutshell, intelligent tutoring systems (ITSs) provide automated, personalised feedback to learners performing a learning task such as learning how to program. While ITSs have made great strides in well-defined domains such as mathematics, two challenges remain in open-ended domains where there is no single best solution to a task: how to design individualised feedback, and how to avoid time-consuming explicit generation of such feedback by experts.

Prototype-based machine learning technology offers promising ways to automate this process: such methods can structure a solution space based on the given data alone, and they make it possible to highlight or contrast exemplary prototype solutions given a learner's solution. This strategy relies on the core property of such models that they represent data in terms of typical representatives. In the talk, we will mainly focus on modern variants of so-called learning vector quantization (LVQ), owing to their strong learning-theoretical background and their exact mathematical derivation from explicit cost functions, which have resulted in first promising attempts at feedback generation in ITSs.
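To make the prototype idea concrete, the following is a minimal sketch of generalized LVQ (GLVQ), which optimizes the cost-function term (d+ − d−)/(d+ + d−) per sample, attracting the closest correct prototype and repelling the closest wrong one. All function names and hyperparameters here are illustrative choices, not the speaker's implementation.

```python
import numpy as np

def glvq_train(X, y, lr=0.05, epochs=100, seed=0):
    """Minimal GLVQ sketch: one prototype per class, initialised near the
    class means, trained by stochastic gradient descent on the GLVQ cost
    mu = (d+ - d-) / (d+ + d-), where d+ / d- are squared distances to the
    closest correct / wrong prototype."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    W = np.array([X[y == c].mean(axis=0) for c in classes])  # prototypes
    wy = classes.copy()                                      # prototype labels

    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            d = ((W - X[i]) ** 2).sum(axis=1)            # squared distances
            j = np.argmin(np.where(wy == y[i], d, np.inf))  # closest correct
            k = np.argmin(np.where(wy != y[i], d, np.inf))  # closest wrong
            dj, dk = d[j], d[k]
            denom = (dj + dk) ** 2 + 1e-12
            # gradient of mu: attract correct prototype, repel wrong one
            W[j] += lr * (dk / denom) * (X[i] - W[j])
            W[k] -= lr * (dj / denom) * (X[i] - W[k])
    return W, wy

def glvq_predict(X, W, wy):
    """Assign each sample the label of its nearest prototype."""
    d = ((X[:, None, :] - W[None]) ** 2).sum(axis=2)
    return wy[np.argmin(d, axis=1)]
```

The trained prototypes are themselves points in the data space, which is what makes them usable as exemplary solutions to show to a learner.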

The use of LVQ in ITSs faces two challenges:

1) Data are typically non-vectorial; e.g. structured data such as sequences or trees are present. Since classical LVQ models have been designed for Euclidean vectors only, the question is how to extend LVQ technology to non-vectorial data. We will present relational extensions of LVQ which, in a very generic way, enable its use for proximity data as provided by structure metrics such as alignment.
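The key trick behind relational LVQ can be stated in a few lines: a prototype is represented implicitly as a convex combination of data points, and its squared distance to every data point can be computed from the pairwise squared-dissimilarity matrix alone, with no vector embedding needed. The sketch below illustrates that identity under the assumption of a squared Euclidean dissimilarity matrix; the function name is mine, not from the talk.

```python
import numpy as np

def relational_distances(D, alpha):
    """Squared distances from every data point to a prototype that is
    expressed implicitly as a convex combination w = sum_j alpha_j * x_j
    (alpha >= 0, sum alpha = 1), computed from the pairwise
    squared-dissimilarity matrix D only:

        d(x_i, w)^2 = (D @ alpha)_i - 0.5 * alpha @ D @ alpha

    This is what lets LVQ operate on proximity data (e.g. alignment
    dissimilarities between sequences) without any explicit vectors.
    """
    return D @ alpha - 0.5 * (alpha @ D @ alpha)
```

For genuinely non-Euclidean structure metrics the same formula is applied to the given dissimilarity matrix, which is exactly what makes the relational extension generic.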

2) Structure metrics crucially depend on model parameters such as the scoring function, whose optimal choice is not obvious; yet the accuracy of such models hinges on choosing these metric parameters correctly. We will present recent results that allow structure-metric parameters to be adjusted autonomously, based only on the given data and learning task.
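In the vectorial setting this idea of learning metric parameters from the data is known as relevance learning (GRLVQ): the distance carries adaptive feature weights, and those weights are trained by the same cost-function gradient as the prototypes. The sketch below illustrates the principle with a diagonal relevance vector; adapting scoring parameters of an alignment metric follows the same gradient scheme but is not shown here. Names and learning rates are illustrative.

```python
import numpy as np

def grlvq_step(x, y_x, W, wy, lam, lr_w=0.05, lr_l=0.01):
    """One GRLVQ update on a labelled sample (x, y_x): adapt both the
    prototypes W and the diagonal relevance weights lam of the metric
        d_lam(x, w) = sum_i lam_i * (x_i - w_i)^2
    by gradient descent on the GLVQ cost mu = (d+ - d-) / (d+ + d-)."""
    d = (lam * (W - x) ** 2).sum(axis=1)
    j = np.argmin(np.where(wy == y_x, d, np.inf))   # closest correct prototype
    k = np.argmin(np.where(wy != y_x, d, np.inf))   # closest wrong prototype
    dj, dk = d[j], d[k]
    denom = (dj + dk) ** 2 + 1e-12
    # gradient of mu w.r.t. lam (computed before moving the prototypes)
    grad_lam = (dk * (x - W[j]) ** 2 - dj * (x - W[k]) ** 2) / denom
    # prototype updates, as in GLVQ but under the weighted metric
    W[j] += lr_w * (dk / denom) * lam * (x - W[j])
    W[k] -= lr_w * (dj / denom) * lam * (x - W[k])
    # relevance update, kept non-negative and normalised to sum 1
    lam = np.clip(lam - lr_l * grad_lam, 0.0, None)
    return W, wy, lam / lam.sum()
```

On data where only some features (or metric components) discriminate the classes, the relevance weights of the uninformative components shrink automatically, which is the behaviour one wants when tuning scoring parameters of a structure metric from data alone.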