LLM Teaching-Assistant Prototype Interface
I threw together a quick prototype to demonstrate how large multimodal AI models might supplement human TAs in delivering educational courses.
The idea driving this demonstration was that a relatively simple interface could let students direct easy questions to an AI assistant with minimal friction. In this respect, you could think of the AI model as providing “level 1 tech support”, so the TAs only need to step in for more difficult or nuanced questions. This would let the same number of TAs handle a larger number of students more effectively.
Another point worth highlighting is that, with the right interface, the learning experience can come to resemble having a TA looking over your shoulder every step of the way. It is not a stretch to imagine that people learn more quickly when they have 1-on-1 attention. An AI assistant may not be as effective as a human TA, but most people cannot afford a human TA actually watching over their shoulder throughout the learning process. Persistent, full-time AI accompaniment, however, is quite feasible.
Proof of Concept
In the video below, you can see a browser window running the prototype interface alongside an instance of the Eclipse IDE. The Eclipse window has been shared into the browser using the browser’s screen-sharing API, giving the AI model the ability to “see” what the user is working on. Simultaneously, a persistent voice connection runs between the user and the AI model.
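As a rough illustration, the setup described above can be sketched with the standard browser media APIs: `getDisplayMedia` to capture the shared Eclipse window and `getUserMedia` to open the microphone for the voice connection. Names like `startSession` and `sendToModel` are illustrative placeholders, not code from the actual prototype.

```javascript
// Constraints for the screen-capture request (e.g. the shared Eclipse window).
function displayConstraints() {
  return {
    video: { frameRate: 5 }, // a low frame rate is plenty for reading code
    audio: false,            // voice travels on a separate microphone track
  };
}

// Ask the browser for the shared window and the user's microphone, then hand
// both streams to whatever transport carries them to the AI model.
// (sendToModel is a hypothetical callback; in practice this might be a
// WebRTC peer connection or a WebSocket-based pipeline.)
async function startSession(sendToModel) {
  // getDisplayMedia prompts the user to pick a window or screen to share.
  const screen = await navigator.mediaDevices.getDisplayMedia(displayConstraints());
  // getUserMedia opens the audio source for the persistent voice connection.
  const voice = await navigator.mediaDevices.getUserMedia({ audio: true });
  sendToModel(screen);
  sendToModel(voice);
}
```

Keeping the screen track's frame rate low is a deliberate choice here: code on screen changes slowly, so a few frames per second is enough context for the model while keeping bandwidth modest.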
This lets me talk through what I’m seeing on my screen and get contextually aware assistance from the AI. In the video, I play the role of a first-year computer-science student, and we see that the AI is capable of helping me out.