How mentorAI Integrates with Open edX
mentorAI installs in Open edX as an LTI 1.3 Advantage tool. A single OIDC-based launch delivers a signed JWT that signs the user into the AI mentor with their exact course context and role, while the Deep Linking, Names & Roles, and Assignment & Grade services handle content selection, roster sync, and real-time score return to the Open edX gradebook. Instructors simply drop an LTI component (XBlock) into a unit in Studio and point it at mentorAI's launch URLs, and the AI activities appear as native course content, all secured by the LTI 1.3 implementation in the Sumac release.
mentorAI (an AI-driven tutor/mentor platform) connects to Open edX using the LTI 1.3 Advantage standard. Open edX (Sumac release and later) is fully LTI Advantage certified, meaning it supports the core LTI 1.3 launch flow and the three Advantage services (Deep Linking, Assignment and Grade Services, and Names and Role Provisioning). In practice, an instructor adds an LTI component (an XBlock) in Studio and configures it with mentorAI's tool endpoints. Students then click the mentorAI link to launch the tool with single sign-on, and data (user identity, course context, scores, roster) is exchanged securely between the two systems. The result is a seamless experience: mentorAI content appears as part of the Open edX course, students never have to log in again, and instructor-configured content and scores sync automatically.
- Configuration: In Studio’s Course Outline, the instructor adds an *LTI Component* (XBlock) and sets “LTI Version = 1.3”. They enter mentorAI’s Tool Launch URL (redirect URL) and OIDC Login URL (login endpoint), as provided by mentorAI. They also paste mentorAI’s public key into the settings; this lets Open edX verify the signature on incoming LTI messages. (Studio then displays generated values – Client ID, Deployment ID, Keyset URL, etc. – which are copied into mentorAI’s tool configuration.)
- OIDC Launch Flow: When a user clicks the mentorAI link, Open edX initiates the LTI 1.3 launch with an OpenID Connect handshake. The platform (LMS) sends a login-initiation request to mentorAI’s OIDC login URL; mentorAI answers with an authentication request, and Open edX responds by posting a signed ID Token (JWT) to mentorAI’s launch URL. This JWT includes claims for the user’s identity, role (Instructor or Learner), course context, resource link ID, and any custom parameters. mentorAI verifies the JWT signature against Open edX’s public keyset, establishes the user’s identity, and opens the mentor interface inside the course.
- Security and Data: Because the launch relies on OIDC and signed JWTs, it is secure end to end. The JWT payload transports user and course information without extra login prompts, and because mentorAI verifies the JWT signature it knows exactly which student or instructor has launched it, and under which course and unit. In the other direction, Open edX’s LTI component notes that the Tool Public Key is required to “check if the messages and launch requests received have the signature from the tool”, so messages signed by mentorAI (such as deep-linking responses) are verified as well. A minimal token-verification sketch follows this list.
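To make the launch payload concrete, here is a minimal sketch of how a tool such as mentorAI could verify the platform-issued ID token and read the standard LTI claims, assuming Python with the PyJWT library. The issuer, client ID, and keyset URL are placeholders standing in for the values Studio displays; this is not mentorAI's actual code.

```python
# Minimal sketch (not mentorAI's actual code): verifying an LTI 1.3 launch JWT on the tool side.
import jwt  # PyJWT
from jwt import PyJWKClient

PLATFORM_ISSUER = "https://lms.example.edu"                 # placeholder: the Open edX LMS issuer
PLATFORM_KEYSET_URL = "https://lms.example.edu/lti/keyset"  # placeholder: use the Keyset URL Studio displays
CLIENT_ID = "client-id-from-studio"                         # placeholder: the Client ID Studio generates

def decode_launch(id_token: str) -> dict:
    """Verify the platform's signature and return the launch claims."""
    signing_key = PyJWKClient(PLATFORM_KEYSET_URL).get_signing_key_from_jwt(id_token)
    return jwt.decode(
        id_token,
        signing_key.key,
        algorithms=["RS256"],
        audience=CLIENT_ID,
        issuer=PLATFORM_ISSUER,
    )

# Claims a verified launch carries (names from the LTI 1.3 specification):
#   claims["sub"]                                                     -> stable user id
#   claims["https://purl.imsglobal.org/spec/lti/claim/roles"]         -> Instructor / Learner role URIs
#   claims["https://purl.imsglobal.org/spec/lti/claim/context"]       -> course id, label, title
#   claims["https://purl.imsglobal.org/spec/lti/claim/resource_link"] -> the specific unit launched
#   claims["https://purl.imsglobal.org/spec/lti/claim/custom"]        -> any custom parameters
```

If verification fails (wrong signature, audience, or issuer), `jwt.decode` raises an exception and the launch is rejected rather than trusted.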
Deep Linking (Content Selection)
- Enable Deep Linking: In the LTI component settings, the instructor toggles Deep Linking to True and enters mentorAI’s *Deep Link Launch URL* (often the same as the normal launch URL).
- Configure in Studio: The LTI block in Studio now shows a “Configure tool link” button. Clicking this sends the instructor to mentorAI’s deep-link interface.
- Select Content: Within mentorAI, the instructor picks or creates the desired content (for example, an AI-generated quiz, study guide, or interactive tutorial). Once they finish, mentorAI returns a configured resource link to Open edX as a signed deep-linking response message (sketched after this list).
- Result in Course: Studio inserts the selected mentorAI content as an LTI resource in the course. Students will see this new activity in the unit. When launched, it goes directly into that specific mentorAI content (instead of a blank tool), thanks to the deep-link configuration.
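For illustration, the sketch below shows roughly what that deep-linking response looks like on the wire: a JWT signed with the tool's private key (the counterpart of the public key pasted into Studio) whose content-items claim describes the selected resource. The claim names come from the IMS Deep Linking 2.0 specification; the titles, URLs, key path, and IDs are placeholders, not mentorAI's real values.

```python
# Illustrative sketch: a minimal LtiDeepLinkingResponse built on the tool side.
# The key path, URLs, and IDs are placeholders.
import time
import jwt  # PyJWT

TOOL_PRIVATE_KEY = open("tool_private_key.pem").read()  # placeholder path; pairs with the public key in Studio

def build_deep_linking_response(deployment_id: str, client_id: str, platform_issuer: str) -> str:
    content_items = [{
        "type": "ltiResourceLink",
        "title": "AI-generated study guide",
        "url": "https://mentor.example.com/lti/launch?content=study-guide-42",  # placeholder launch URL
    }]
    now = int(time.time())
    claims = {
        "iss": client_id,        # tool identifies itself by its client ID
        "aud": platform_issuer,  # message is addressed to the platform
        "iat": now,
        "exp": now + 300,
        "nonce": "replace-with-a-random-nonce",
        "https://purl.imsglobal.org/spec/lti/claim/message_type": "LtiDeepLinkingResponse",
        "https://purl.imsglobal.org/spec/lti/claim/version": "1.3.0",
        "https://purl.imsglobal.org/spec/lti/claim/deployment_id": deployment_id,
        "https://purl.imsglobal.org/spec/lti-dl/claim/content_items": content_items,
    }
    # The signed JWT is auto-submitted back to the platform's deep-linking return URL,
    # where Open edX verifies it with the Tool Public Key and stores the resource link.
    return jwt.encode(claims, TOOL_PRIVATE_KEY, algorithm="RS256")
```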
Names & Roles Provisioning (Roster Sync)
- Enable NRPS: In the same LTI component settings, enable LTI Names and Roles Provisioning (NRPS).
- Roster Retrieval: With NRPS on, mentorAI can call the LTI *Names and Role Provisioning* service endpoint, which returns the roster of enrolled users. For each user, mentorAI receives limited profile details (full name, email, username) plus their role and enrollment status; a fetch sketch follows this list.
- Use Cases: Having the roster allows mentorAI to personalize interactions (e.g. addressing students by name) and to manage permissions (e.g. know which learners should have access to the course’s mentorAI). It also lets mentorAI verify which students are active in the course. (*Note: By default Open edX only returns NRPS data for courses with up to 1000 users for performance; an administrator can raise this limit if needed.*)
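As a rough sketch of that roster call (assuming Python and the requests library), the tool sends an authenticated GET to the membership endpoint advertised in the launch JWT's NRPS claim. The endpoint URL and bearer token below are placeholders; in practice the tool first obtains an OAuth 2.0 client-credentials token scoped to the NRPS service.

```python
# Illustrative sketch: fetching the course roster via NRPS.
# The endpoint URL and bearer token are placeholders; the real values come from the
# launch JWT's NRPS claim and an OAuth 2.0 client-credentials grant.
import requests

def fetch_roster(memberships_url: str, access_token: str) -> list[dict]:
    response = requests.get(
        memberships_url,
        headers={
            "Authorization": f"Bearer {access_token}",
            "Accept": "application/vnd.ims.lti-nrps.v2.membershipcontainer+json",
        },
        timeout=30,
    )
    response.raise_for_status()
    members = response.json().get("members", [])
    # Each member entry carries name, email, role URIs, and enrollment status.
    return [
        {
            "name": m.get("name"),
            "email": m.get("email"),
            "roles": m.get("roles", []),
            "status": m.get("status"),
        }
        for m in members
    ]
```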
Assignment and Grade Services (Score Return)
- Enable AGS: In the LTI component, set the *LTI Assignment and Grades Service* mode (e.g. Programmatic for multiple grades).
- Passing Grades: When students complete tasks or quizzes in mentorAI, the tool can use the LTI Assignment and Grade Services (AGS) API to post scores back to Open edX, for example submitting a numerical score or percentage to the assignment’s line item (see the sketch after this list).
- Gradebook Integration: Returned scores automatically populate the Open edX gradebook under the LTI component’s entry. Instructors see the mentorAI results alongside other assignments.
- Standards-Compliant: Because Open edX is LTI-Advantage certified, this grade passback works reliably. (Alternatively, a deep custom integration could use Open edX’s own grading APIs, but LTI AGS is the standard method.)
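In rough form, returning a grade is a POST of a score object to the line item's scores endpoint, as sketched below. The line-item URL, user ID, and access token are placeholders; the real endpoint comes from the AGS claim in the launch JWT and the user ID from its `sub` claim.

```python
# Illustrative sketch: posting a score back to Open edX via AGS.
# The line-item URL, user id, and bearer token are placeholders.
from datetime import datetime, timezone

import requests

def post_score(lineitem_url: str, access_token: str, user_id: str,
               score: float, max_score: float) -> None:
    payload = {
        "userId": user_id,                # the "sub" claim from the launch JWT
        "scoreGiven": score,
        "scoreMaximum": max_score,
        "activityProgress": "Completed",
        "gradingProgress": "FullyGraded",
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    response = requests.post(
        # AGS scores sub-endpoint; assumes a line-item URL without query parameters.
        f"{lineitem_url}/scores",
        json=payload,
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/vnd.ims.lis.v1.score+json",
        },
        timeout=30,
    )
    response.raise_for_status()
```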
Embedding mentorAI in Courses
- LTI Consumer XBlock: The integration uses Open edX’s built-in LTI Consumer XBlock. This XBlock “implements the consumer side of the LTI specification enabling integration of third-party LTI tools”.
- Course Authoring: Instructors simply add the LTI XBlock to a unit (just like adding a problem or video). Once configured, it appears in the course outline. Students click it to launch mentorAI.
- Studio View: Instructors see the LTI block’s settings and can edit or reconfigure it. Deep-linked content appears as links (with the title from mentorAI). The Studio interface lets them manage all mentorAI LTI blocks without coding.
- No Extra Plugins Needed: No special client-side app is required; the LTI XBlock handles everything. (If desired, an advanced team could build a custom mentorAI XBlock, but it is not needed since the standard LTI workflow suffices.)
Streamlined User Experience
- Single Sign-On: Students and instructors experience one-click access. Clicking a mentorAI activity logs them in automatically via the LTI handshake (no separate username/password).
- Role-Specific UI: mentorAI knows the user’s role from the launch data. Instructors get access to course management and content-creation features, while students see the personalized mentor/tutor interface; the tool can also tailor questions or hints based on the user’s identity and course progress. (A small role-check sketch follows this list.)
- Instructor Workflow: Instructors set up mentorAI content through Studio (deep linking or parameters), then mentorAI appears as part of the normal course flow. There is no need for manual roster exports or score entry, as those sync automatically.
- Scores and Feedback: Any grades or feedback from mentorAI flow back to the LMS gradebook in real time, so instructors can monitor student performance in one place. Students see their mentorAI quiz scores right in Open edX as well, completing the feedback loop.
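As a small illustration of that role-specific behavior, a tool can branch on the standard roles claim delivered in the launch JWT. The role URIs below are the standard LTI membership values; the view names are hypothetical.

```python
# Illustrative sketch: branching the tool UI on the launch roles claim.
# The view names are hypothetical; the role URIs are the standard LTI membership values.
INSTRUCTOR_ROLE = "http://purl.imsglobal.org/vocab/lis/v2/membership#Instructor"
LEARNER_ROLE = "http://purl.imsglobal.org/vocab/lis/v2/membership#Learner"

def select_view(claims: dict) -> str:
    roles = claims.get("https://purl.imsglobal.org/spec/lti/claim/roles", [])
    if INSTRUCTOR_ROLE in roles:
        return "instructor-dashboard"  # course management and content creation
    if LEARNER_ROLE in roles:
        return "student-mentor"        # personalized tutor interface
    return "guest"
```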
Related Articles
How mentorAI Integrates with OpenAI: A Guide to Model Options and Deployment Flexibility
MentorAI’s guide walks campuses through plugging in any GPT model, whether via a self-managed key or a private Azure cluster, while keeping data FERPA-safe. Its middleware routes prompts, logs and meters token spend, and unlocks embeddings, Whisper, and DALL·E upgrades without changing course code.
Students as Agent Builders: How Role-Based Access (RBAC) Makes It Possible
How ibl.ai’s role-based access control (RBAC) enables students to safely design and build real AI agents—mirroring industry-grade systems—while institutions retain full governance, security, and faculty oversight.
AI Equity as Infrastructure: Why Equitable Access to Institutional AI Must Be Treated as a Campus Utility — Not a Privilege
Why AI must be treated as shared campus infrastructure—closing the equity gap between students who can afford premium tools and those who can’t, and showing how ibl.ai enables affordable, governed AI access for all.
Pilot Fatigue and the Cost of Hesitation: Why Campuses Are Stuck in Endless Proof-of-Concept Cycles
Why higher education’s cautious pilot culture has become a roadblock to innovation—and how usage-based, scalable AI frameworks like ibl.ai’s help institutions escape “demo purgatory” and move confidently to production.