---
title: "Per-Course and Per-Student Mentors on mentorAI"
slug: "per-course-and-per-student-mentors-on-mentorai"
author: "Jeremy Weaver"
date: "2025-09-04 15:53:52.084225"
category: "Premium"
topics: "Per-course AI mentors, Per-student AI assistants, Course-scoped AI chatbot, Personalized AI tutoring (higher ed), AI that cites course materials, Document retrieval with citations (education), MentorAI per-course configuration, Faculty prompt and pedagogy controls, Domain-scoped AI safety, Source Panel ranked citations, Drag-and-drop course corpus (AI), Visibility toggle for sources (AI), Retrieval-augmented generation for courses, University AI assistants (granular), Model choice per assistant (OpenAI, Gemini, Anthropic), Student history insights (optional), Academic integrity with AI citations, Transparent AI answers for students, Instructor-controlled AI behavior, Higher-ed AI personalization"
summary: "How mentorAI enables per-course and per-student assistants that answer with cited sources, follow instructor-defined pedagogy, and respect domain-specific safety—so campuses get precision, transparency, and control without the complexity."
banner: ""
thumbnail: ""
---

At ibl.ai we’ve learned a simple truth from working with universities: precision beats “one big chatbot.” Course contexts differ, student needs diverge, and faculty pedagogy is not one-size-fits-all. That’s why mentorAI lets you scope assistants **per course**—and, when useful, **per student within a course**—so the AI behaves and answers with the right granularity.

Faculty don’t want a Pre-Calc student getting Calc III answers, and they want control over how the assistant explains concepts. In mentorAI, that’s the default posture: **one mentor per course**, with the option to **instantiate per-student mentors** when you want personalized behavior.

---

# Why Granularity Matters

- **Rigor without spillover**.
  Each mentor can be trained only on that course’s materials, so answers don’t “leak” across levels or departments.
- **Faculty voice preserved**. Instructors set the mentor’s **prompt and pedagogy**—tone, level, examples—so explanations match how they teach.
- **Transparent answers**. With **Document Retrieval**, replies cite the exact lecture, slide, or page and display a ranked **Source Panel**; one click opens the original file for verification and deeper study.
- **Safety on top of safety**. A **custom moderation layer** lets you scope what’s “in bounds” for each mentor (e.g., refuse questions outside the course domain), layered over the base model’s alignment.

# How Per-Course Mentors Work

- **Create the mentor**. Give it a name and description; pick a language model (OpenAI, Gemini, Anthropic, etc.—your choice per mentor).
- **Add the corpus**. Instructors **drag and drop** approved files (slides, PDFs, readings) into the mentor’s dataset. Retrieval is then limited to these sources.
- **Set visibility**. For each file, use the **Visible toggle** to decide whether it appears in the Source Panel; hidden files can still inform answers without being shown, and you can flip visibility instantly—no retraining.
- **Teach the teacher**. Adjust the **prompt/pedagogy** settings to guide explanations, examples, or steps appropriate for your learners.

**Result**: When a student asks a question, the answer is grounded in that course’s materials and **cites** them, with sources ranked for inspection.

# When to Go Per-Student (Within a Course)

Sometimes you want finer control—an assistant that adapts to an individual’s progress or needs. In those cases, you can **spin up a mentor per student per course** (when you have the right context and approvals).

- **Personalized scaffolding**. Calibrate the prompt to a learner’s background or goals while keeping answers sourced to the same course files.
- **Optional insight for instructors**.
  If enabled, you can **track learner history** to spot common stumbling blocks and refine materials.

Privacy and governance remain institution-controlled; mentors use only the data and scope you approve.

# Faculty Experience: Simple First, Control When Needed

Faculty have told us they want **“factory defaults” that work out of the box**—and settings they can adjust when they have time. mentorAI starts simple (create → add files → go), but lets instructors dial in:

- **Prompt & pedagogy controls** (explainers, steps, tone).
- **Model choice per mentor** (use the LLM that fits cost/performance needs at any time).
- **Source visibility** (show or hide specific files without re-indexing).
- **Domain-specific safety** (constrain answers to the course topic).

# What Students See

- **Clear, concise answers** grounded in the course’s own materials.
- **Inline citations** and a **Source Panel** ranked by relevance; one click opens the original document to read more.
- **Consistent explanations** that match how their instructor teaches (because the mentor’s behavior is set by the course team).

# Where This Pays Off

- **Academic integrity**. Students can verify every claim against the course’s own readings.
- **Pedagogical alignment**. The assistant sounds like your course—not a generic chatbot.
- **Operational agility**. Faculty can hide or reveal sources on demand and adjust prompts without retraining.
- **Right-sized personalization**. Use per-student mentors where they add value; otherwise keep it simple with the per-course default.

---

# In Conclusion

If you want AI that mirrors your syllabus, cites your slides, and adapts at the **course** and **student** level—without sacrificing safety or faculty control—let’s talk. Visit **ibl.ai/contact** to see per-course and per-student mentors in action.
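---

*Appendix for technically minded readers.* The course-scoped retrieval pattern described above—a per-course corpus, sources ranked by relevance, and a visibility toggle that keeps a file out of the Source Panel without removing it from retrieval—can be illustrated with a minimal, generic sketch. Every name here (`CourseMentor`, `Source`, the keyword-overlap scoring) is a hypothetical stand-in, not mentorAI’s actual API; production systems use embedding-based retrieval rather than word overlap.

```python
# Generic sketch of per-course retrieval with ranked, citable sources.
# Hypothetical names throughout; this is NOT mentorAI's real implementation.
from dataclasses import dataclass, field

@dataclass
class Source:
    name: str             # e.g. "week3_slides.pdf"
    text: str             # extracted file content
    visible: bool = True  # the "Visible toggle": hidden files still inform answers

@dataclass
class CourseMentor:
    course: str
    sources: list = field(default_factory=list)

    def add_source(self, name, text, visible=True):
        self.sources.append(Source(name, text, visible))

    def retrieve(self, question, k=3):
        """Rank this course's sources by naive keyword overlap
        (a stand-in for real embedding-based retrieval)."""
        q = set(question.lower().split())
        ranked = sorted(
            self.sources,
            key=lambda s: len(q & set(s.text.lower().split())),
            reverse=True,
        )
        return ranked[:k]

    def answer(self, question):
        hits = self.retrieve(question)
        # Hidden files may ground the answer but never appear in the panel.
        panel = [s.name for s in hits if s.visible]
        return {"grounded_on": [s.name for s in hits], "source_panel": panel}

# Per-course scoping: this mentor can only ever cite Pre-Calc materials.
mentor = CourseMentor("Pre-Calc")
mentor.add_source("limits_slides.pdf", "limits of functions approach a value")
mentor.add_source("solutions_key.pdf", "worked limit solutions", visible=False)
result = mentor.answer("What is a limit of a function?")
```

Here the answer is grounded in both files, but the hidden solutions key never surfaces in the student-facing Source Panel—mirroring the instant show/hide behavior described above.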