# ibl.ai Documentation # Generated: 2026-03-25 # Total pages: 115 # URL: https://ibl.ai/docs --- ## [ibl.ai](https://ibl.ai) URL: https://ibl.ai/docs/about # [ibl.ai](https://ibl.ai) ## Who we are We're all-in-one web and mobile AI tools for faculty support and student success: tutoring, content creation and automation. Full transparency, help-desk and training. Proven by the best universities and companies. Based in New York, NY and family-owned. ## Get started with mentorAI Get started with [mentorAI](https://mentorai.iblai.app) by chatting with an extensive catalog of mentors or create your own. You'll also be able to create your mentors and display them on external sites like [Syracuse University](https://its.syr.edu/) and [Fordham University](https://fordham.edu/undergraduate-admission/apply/what-were-looking-for/). Explore SaaS or enterprise partnerships at [ibl.ai/pricing](https://ibl.ai/pricing). ## Partnerships [Google ISV Partner](https://cloud.google.com/find-a-partner/partner/ibl-education) [Microsoft ISV Partner](https://azuremarketplace.microsoft.com/en-us/marketplace/apps/iblai.mentorai?tab=Overview) [AWS ISV Partner](https://aws.amazon.com/marketplace/seller-profile?id=824b8600-b332-417d-b259-550630b627d8) ## Contact our team LinkedIn: [linkedin.com/company/iblai](https://www.linkedin.com/company/iblai) Chief Technology Officer: [Miguel Amigot](https://www.linkedin.com/in/miguelamigot/) Technical Support: [support@iblai.zendesk.com](mailto:support@iblai.zendesk.com) --- ## Credential Creation URL: https://ibl.ai/docs/configuration/badging/credentialcreation Video: https://www.youtube.com/watch?v=qP270PvUlsk # Credential Creation ## Purpose Configure a credential (certificate) for a course so learners automatically earn it upon completion or passing. --- ## Find the Course 1. Navigate to the **Discover** tab or your enrolled courses. 2. Select the course you want to attach a credential to. 3. Click on **Configuration** from the course overview page. --- ## Create a Credential 1. In the credential list, click **Add Credential**. 2. Fill in the fields: - **Name**: e.g., "Course Name Certificate" - **Description**: explain what the credential is for (e.g., "For passing the course") - **Issuer**: your platform/tenant name - **Credential type**: choose from: - Micro Credential - Certificate - Program Certificate - Pathway - Course Certificate - **Issuing signal**: when the credential is triggered: - **Course Completed** — issued at 100% completion - **Course Pass** — issued when the learner meets or exceeds the grading threshold - **Icon image** (optional): upload a custom icon 3. Click **Create**. --- ## Manage Credentials - Created credentials appear in the credential list on the Configuration page - You can **edit** or **delete** credentials as needed - Each course can have multiple credentials with different trigger conditions --- ## Key Takeaways - **Course Completed** triggers at 100% completion regardless of grade - **Course Pass** triggers based on the course's grading policy threshold - The **issuer** is automatically set to your platform/tenant - Credentials are issued to learners automatically when the trigger condition is met --- ## External Credentials URL: https://ibl.ai/docs/configuration/badging/externalcredentials Video: https://www.youtube.com/watch?v=ongBQqb_KIM # External Credentials ## Purpose Configure and manage external credential providers (e.g., badging vendors) and map your platform-issued credentials to those providers so credentials are issued externally while remaining managed within your platform. --- ## Where to Find It 1. Click your **tenant name**. 2. Go to **Advanced**. 3. Scroll to the bottom to find: - **Provider Configuration** - **Credential Mapping** --- ## Provider Configuration Use this to **add, edit, enable/disable, or delete** an external provider. ### Add or Edit a Provider 1. Open **Provider Configuration**. 2. Enter: - **Provider name** - **Configuration** (provider-specific settings) - **Enabled** (toggle on/off) 3. Click **Save**. ### Notes - Editing looks the same as adding, except fields are **pre-filled**. - You can **delete providers** you no longer need. - **Enabled providers** appear in the mapping step. --- ## Map Credentials to an External Provider Once a provider is configured, map platform credentials to the provider’s templates. ### Create the Mapping 1. Open **Credential Mapping**. 2. Select: - **Credential** - From the list created in your platform (typically from the course overview page) - **Provider** - **External Template ID** - The credential/template ID from the provider 3. *(Optional)* Add: - **Group ID** - **Additional metadata** 4. Click **Save**. ### What’s Editable - **Credential name** and **provider** are fixed after creation. - Optional fields (e.g., **group ID**, **metadata**, **external template ID**) can be added or updated. --- ## How Issuance Works - The credential is **created and managed** in your platform. - The mapping links it to the **external provider’s template**. - When issued, the credential is **issued by the external provider** using that mapping. --- ## Result You can self-manage **external credential providers** and seamlessly issue credentials through them—**without duplicating workflows or leaving the platform**. ### Transcript skillsAI | External Credentials https://www.youtube.com/watch?v=ongBQqb_KIM [1.4s] In order to self-manage credentials from [3.5s] external providers, we've added new [6.1s] settings to the advanced settings for [8.1s] the tenant. So, when you click on your [9.9s] tenant name, you'll navigate to this [11.9s] advanced tab. Now, at the very bottom, [15.1s] you'll see two new features. We have the [18.7s] provider configuration, which allows you [20.3s] to configure a specific provider um that [23.3s] you can connect with your credentials. [25.4s] Now, you can see that we've already got [26.6s] this set up for a credible, and I'll [28.3s] show you what the editing process and [30.3s] creation process looks like. Um, but I [32.7s] won't show you the credentials that are [34.0s] currently in place. You're also able to [36.3s] delete these as well. But the way this [39.0s] works when you're adding a provider [41.0s] configuration, it'll look the same when [42.4s] you're editing as well, just the only [44.1s] difference will be the adding is not [46.8s] filled out and editing will be filled [48.5s] out with what you've already added in. [50.8s] So you have to enter the provider name, [53.0s] the configuration [54.9s] um and decide whether or not this is [56.5s] enabled. Now once this is created, it'll [59.1s] show up on this list and we can actually [62.0s] map credentials now to the credentials [65.2s] created on our platform. So as you know [68.2s] you'll create a credential from the [69.4s] course overview pages. Um that is a [71.6s] credential that lives in our platform. [73.8s] It is stored with us but we want to [76.1s] connect it to an external provider. So [78.5s] the way you would do this is adding and [80.6s] you can see there's a whole list of [81.8s] credentials that have already been [83.0s] created and you can just select you know [84.6s] which credential you'd like. Select the [86.9s] provider and the external template ID [89.2s] that comes from the provider side. So [91.1s] this is a credential ID that says you [93.4s] know this credential belongs to this [95.2s] course. You also have the option of [97.8s] adding a group ID in and additional [100.4s] metadata. [102.0s] Now just to kind of give you an idea of [103.5s] what this looks like for a filled out [104.7s] credential. Um you can see we've given [106.6s] it the credential name in this case it [108.3s] matches exactly what the course name is [110.2s] and the provider. Um these two are not [112.9s] going to be editable um when you create [115.0s] these credential mappings [117.3s] but the other options are especially [119.3s] because they are optional fields. You [121.2s] can add additional metadata like I said [123.8s] um adding group IDs and external [125.8s] template ids. And what this does is [128.3s] essentially connects the template uh [130.5s] credential that you created in our [132.1s] platform on skills um via that course [135.8s] overview page [137.8s] um and connect it to the external [139.7s] mapping. So when the credential gets [141.1s] issued, it's actually coming from this [143.0s] external provider. And [146.2s] that's how easy it is. --- ## DM Creation URL: https://ibl.ai/docs/configuration/content/Courses/dmcreation Video: https://www.youtube.com/watch?v=rtGp3lyulFs # DM Creation ## Where to Start - Sign in as a **Super Admin**. - Open **Data Manager → Catalog App**. - You’ll import both **SCORM** and **video courses** from here. --- ## Prepare the CSV - Click **Import** to see a minimalist upload screen. - Download the **sample CSV** to review required fields: - `platform_key`, `course_name`, `section`, `subsection`, `unit`, `tags`, etc.—the same fields used in the single-course creation wizard. - Populate the CSV with one row per course you want to add. --- ## Import SCORM Courses - In **Catalog → Courses**, press **Import**. - Choose **File** and select your completed CSV. - Click **Submit**. - A **validation dialog** appears—review the parsed rows. - Click **Confirm Import** if everything looks correct. - If any row contains errors (e.g., duplicate course), the system flags it and cancels that row; fix and re-upload as needed. - A **success message** lists every course that was added. --- ## Import Video Courses - In **Catalog → Course Videos**, click **Import**. - Download the **sample CSV** to confirm field names (they’re shown on-screen). - Fill in the CSV, choose it, and run **Submit → Confirm Import** just like SCORM. - Success or error feedback appears immediately. --- ## Locate Imported Courses - After import, all new **courses and videos** appear in the **Catalog App**. - They behave exactly like any manually created course; no further setup is required. --- ## Learner View Check - Switch to the **Skills** front-end. - Open **Discover**. - Search for a course title from your CSV (e.g., SC B Upload). - The imported course is **visible and enrollable**, confirming a successful bulk upload. --- ## Error Handling Tips - Re-uploading an identical CSV row triggers a **“course already exists”** error—update the row or remove duplicates. - Always **validate before confirming**; the dialog catches missing or malformed fields. --- ## Key Takeaways - **Bulk import** is only for **SCORM** and **video course** types. - Use the **sample CSV** as your template and keep field names intact. - **Validation** ensures you don’t create duplicates or malformed entries. - Imported items instantly populate the **Catalog** and **Discover** views—no extra publishing step. With these steps you can mass-create courses in minutes, freeing super admins from repetitive single-course setup. --- ## Studio Creation URL: https://ibl.ai/docs/configuration/content/Courses/studiocreation Video: https://www.youtube.com/watch?v=kBY42Mkw5SY # Studio Creation ## Navigate to Your Course - Open **Studio** and head to **Courses**. - Select the course you want to edit (one already exists in this demo). --- ## Understand the edX Course Structure - **Section** - **Subsection** - **Unit** ← where all content lives (lectures, problems, videos, etc.) --- ## Add or Edit an Assessment in a Unit - Expand the course outline until you reach the target **Unit** (e.g., Module 3 → Assessment). - Inside the unit, click **Problems → Multiple Choice → Edit**. - Use the built-in template to create a question: - Delete **placeholder text**. - Paste your own **stem** and **answer choices**. - Click the **X** beside the correct answer. - Click **Save**. - The question renders instantly in the **unit preview**. --- ## Publish the Content - When the unit is complete, click **Publish**. - The material becomes live if the **course start date** is today or earlier. - If the start date is in the future, learners won’t see the unit until that date. --- ## Next Steps: Mapping Roles and Skills - After publishing, you can link this course (and its assessments) to **Roles and Skills** to map competencies—covered in the following step of the workflow. You’ve authored and published a multiple-choice assessment in Studio and prepared the course for competency mapping. --- ## ⁠Wizard Creation URL: https://ibl.ai/docs/configuration/content/Courses/wizardcreation Video: https://www.youtube.com/watch?v=WDeBBuDwqs8 # ⁠Wizard Creation ## Launch the Course-Creation Wizard - Log in as a **Tenant Admin**. - Go to **Authoring → Courses**. - Click **Add New Course**. - Instead of opening Studio directly, a **step-by-step wizard** appears. - You can still enter **Studio** later for advanced edits. --- ## Fill In Course Details (Page 1) - **Name** – the course title - **Description** – a brief summary (can match the name for testing) - **Duration** – estimated learning time - **Course Card Image** – upload an image learners will see in the Catalog - **Topics / Tags** – add keywords; press Enter after each one - Click **Next** when finished. - **Start dates** are auto-set behind the scenes, so you don’t have to manage them now. --- ## Build the Outline & Add Content (Page 2) - Define the initial structure: - **Section** (e.g., Section 1) - **Subsection** (e.g., 1.1) - **Unit** (e.g., 1.1.1) - Choose your **content type** for the unit: - **YouTube URL** – paste a link - **SCORM** – upload a SCORM package directly - Click **Submit**. - **Upload time varies**; SCORM files may take longer than a URL. --- ## Confirm Creation - After processing, the new course appears in **Analytics → Courses**. - From the **Analytics** listing you can: - Open **Studio** for deeper edits - Add **credentials or skill points** - **Enroll users** --- ## Verify Learner Access - Switch to the **Skills** front-end. - Open **Discover**. - Search for the course name. - The **course card** shows the image you uploaded; click **Access Course** to launch it. - If **SCORM** was included, it loads inside the **course player**. --- ## Roadmap Note - This wizard is the **first iteration**. - Future updates will add **more options** and simplify choices so authors can launch courses even faster. --- You’ve now created a complete course shell, uploaded initial content, and confirmed it’s visible to learners—all without leaving the new tenant-admin wizard. --- ## Studio Creation URL: https://ibl.ai/docs/configuration/content/Programs/studiocreation Video: https://www.youtube.com/watch?v=yzU2Tid16io # Studio Creation ## What a Program Is A **program** is a bundle of individual courses packaged under one umbrella title—ideal for grouping a sequence such as BIO 101, 102, 103, and 104 into a single “Biology” pathway. --- ## Launch the Program Builder - In **Studio**, open **Programs**. - Click **New Program** (green button at the top). --- ## Complete Basic Program Info - **Program Name** – overarching title (e.g., Biology) - **Organization** – the course-creator org (e.g., IBL) - **Program ID** – appears in the learner-facing URL (e.g., biology-ibl) - Click **Create** --- ## Add Courses to the Program - In a separate tab, open each course you want included. - Copy its **Course ID** from the URL (looks like `course-v1:ORG+CODE+RUN`) - Return to the program page and paste the **Course ID** into the field provided. - Repeat for every additional course. - Click **Save** to lock in the list. --- ## Enter Optional Metadata - **Start Date** – back-date to yesterday for immediate access, or schedule a future launch - **Description** – short blurb shown on the program’s about page - **Language** – primary language of instruction - **Level** – skill level (introductory, intermediate, advanced) - **Display Price** – visible cost if applicable - **Duration** – estimated total learning time --- ## Visual Settings - **Banner Image** – appears on the program about page - **Course Card Image** – thumbnail shown in the Catalog - **Credentials Toggle** – enable if completion should award a credential --- ## Publish & Verify - Click **Save** after each update - Programs become visible in the **Catalog** once the **start date** is reached and at least one course is live - Learners see the assembled bundle with your banner, description, and course list --- ## Key Takeaways - One **“New Program” wizard** bundles multiple existing courses - **Course IDs** are the link between standalone courses and the program - Optional metadata (dates, price, images) shapes the **learner-facing presentation** - Changes save instantly; you can edit the program at any time You’ve now created a fully packaged program that groups related courses into a cohesive learning pathway. --- ## Blackboard URL: https://ibl.ai/docs/configuration/datasources/blackboard Video: https://www.youtube.com/watch?v=k91vOq4CANg # Blackboard ## Purpose Connect a **Blackboard course** as a data source so a mentor can read course content and attached files, cite them, and answer questions from that material. --- ## Part 1 — Install the REST API Integration (Blackboard Admin) 1. In the **Blackboard Developer Panel**, create (or use) an application and copy its **Application ID** (this ID will be shared for installations). 2. In your Blackboard instance, open the **Admin Panel → search REST API Integrations**. 3. Click **Create Integration** and paste the **Application ID**. 4. Choose a user to link to the application (pick one who has access to the course). **Permissions noted in the demo:** - Authorized to act as user: **Not needed** - Any user access: **Yes** 5. Submit. The test application shows as **integrated**. --- ## Part 2 — Add the Blackboard Course as a Data Source (Mentor Platform) 1. In Blackboard, open the **target course** and copy the **course URL** from your browser. 2. In the mentor’s **Datasets**, choose **Blackboard** as the data source. 3. Paste the **course URL** and click **Submit**. 4. The document queues for **training**; once trained, it appears in the list. 5. *(Optional)* Mark it **Visible** so you can see cited content. --- ## How It Works in Chat - The mentor can answer questions from the **course’s text content** and **attached files** (e.g., PDFs). - Source snippets appear, showing exactly **where the answer was drawn from**. **Example (from the demo):** A question about **Las Casas’s writing** returns an answer about Spanish colonists’ treatment of Indigenous people, with snippets pointing to the relevant attached documents/sections. --- ## Auto-Retraining - By default, the Blackboard content is set to **auto-retrain every 7 days**. - You can reschedule this to any number of days. --- ## Result Your mentor now **ingests the specified Blackboard course and its attachments**, cites where answers come from, and stays **up to date via scheduled retraining**. ### Transcript ibl.ai | Blackboard | Course Integration https://www.youtube.com/watch?v=k91vOq4CANg [0.0s] Hello, this is Joseph from iBL.ai and I [3.1s] will be showing our integration to [6.0s] Blackboard. [7.8s] So um over here on the Blackboard [11.3s] developer panel just to showcase created [15.4s] just a sample application over here. Um [18.9s] of course in production this would be [20.6s] different but this is a test [22.8s] application. [24.4s] Now if we have an application like this, [26.4s] all we need here is this application ID [29.5s] which will be shared for installations [32.7s] and then the platform admin um would [36.7s] install this REST API extension into [41.2s] their Blackboard instance. So take for [43.8s] example I am here at Blackboard.ai. [48.2s] I need to go into the admin panel then [51.0s] search for REST API integrations. Let me [54.1s] zoom in. REST API integrations. Okay. [59.3s] Now that I am here, I will click on [61.8s] create an integration and then I would [64.8s] specify [66.3s] um the application ID. Okay. Now I need [71.0s] to specify a user for which I am going [74.3s] to um [77.5s] use. I think my user here will be [80.8s] administrator. Yep, I have my Blackboard [84.2s] administrator account here that I would [87.3s] link to this application. Of course, you [90.7s] could narrow down the permissions you [92.9s] want. In our case, all we need is the [95.4s] ability to read the content of the [97.5s] course and to download any files [100.0s] attached to it. So, best you choose a [102.8s] user who has access to that course. [106.6s] Okay. Um, so authorized to act as the [111.0s] user. We don't need that um any user [113.7s] access. Yes. And then we submit. And [117.0s] with that, we have successfully [119.4s] integrated this test test application. [122.8s] And that's it right here. IVL test. [125.3s] Good. Now, we want to be able to use [128.5s] this test course, a test course from um [134.4s] our system. So, what we would do, we [137.0s] would come on to our Blackboard [140.2s] um panel, search for the course we are [143.4s] trying to set up. Now, just copy the [147.3s] link to this course from the top of my [151.0s] browser here. [152.6s] come back and then navigate into my data [156.3s] set. I am going to load this um in as [162.5s] data source. For this I will choose um [166.1s] Blackboard as my [169.7s] data source here and I will paste the [173.0s] URL to this set course. Okay. Now let's [177.3s] submit this. And yes, the document is Q [179.9s] for training. Oh, I could submit that [182.5s] but no need. Let's um so let's let's go [188.1s] back and then I would um go back here. [192.7s] Oh, it's already trained. That was quite [196.2s] sharp. So you can see it over here and I [200.8s] am marking it as visible just so I can [203.0s] see the contents. Now I can go back. Um, [207.2s] not necessary, but just in case, let me [209.7s] refresh on my screen here. Now, you can [214.7s] check. Let's double check. This is the [217.4s] only content trained on this course. [219.8s] Okay. [221.4s] And [223.0s] we have this course here. It's just a [225.6s] dummy course. Okay. You just open it. [229.2s] There is um a dummy assessment. There is [234.5s] a PDF file attached. Okay. Our main [238.5s] focus will be on the attached files and [241.8s] then text content of the course. Okay. [245.3s] So, we are going to have our agent be [247.5s] able to answer questions based on this [250.4s] information. So let me ask it something [254.5s] um [256.2s] about uh [259.3s] um [261.8s] let me say [267.4s] grab some content here. Uh Lascasa's [270.2s] writing about Spanian's horrific [272.6s] treatment. Um so let me just ask [275.6s] something about this guys statement. [279.7s] Okay. So what did Lascas write [286.2s] about the uh Spaniards [290.2s] pronounced that wrong? [294.7s] And then we see um [299.2s] he wrote about the [301.8s] uh horrific treatment of the indigenous [304.3s] people by Spanish colonists over here. [308.6s] Okay. which if we go back to the horse [312.3s] that is exactly what we see you know [316.0s] about blackboard legend how it treated [319.8s] Indians. [324.0s] Now back here we can also see snippets [328.1s] from the documents attached where this [332.2s] um information were gotten from. Okay. [336.2s] So we can see about um [340.4s] uh the nations yearning to break free [343.2s] from the Spanish influence. We can see [346.6s] uh the snippets about black legends um [350.9s] and more. Okay, so we have all this [353.4s] information that the agent got fed in [356.6s] straight from the Blackboard course [358.4s] without us needing to manually re-upload [361.0s] them. Now this content, okay, can be [365.0s] configured and by default it is [366.9s] configured to auto retrain every 7 days. [370.9s] Okay, you could reshadow it to train in [374.7s] any number of days that you want. And [378.8s] well, I think that's one of the coolest [382.4s] things you could have [385.9s] for your course. Thank you very much. C. --- ## Dropbox URL: https://ibl.ai/docs/configuration/datasources/dropbox Video: https://www.youtube.com/watch?v=5yWOSRc4NiU # Dropbox ## Purpose Link **Dropbox files** to a mentor so its knowledge base can include those materials. --- ## Steps ### Open the Mentor’s Datasets - Click the **mentor’s name** - Select the **Datasets** tab ### Add a Resource from Dropbox - Click **Add Resource** - Pick **Dropbox**, then follow the **connect-and-select flow** ### Train the Mentor - After selecting the file, the mentor can be **trained** on that material --- ## Result The mentor now includes content from **Dropbox** in its knowledge base. ### Transcript mentorAI | Data Sources | Google Drive, Dropbox and Microsoft OneDrive https://www.youtube.com/watch?v=5yWOSRc4NiU [1.1s] When you're creating a mentor on mentor [3.0s] AI, you define the knowledge base that [5.5s] mentor is trained on. You do this by [7.4s] uploading documents, uh files, links, so [10.5s] on and so forth. But you can also [12.2s] actually use a couple of different uh [14.2s] file hosting options such as Google [15.9s] Drive, uh Microsoft, uh one drive, and [19.7s] Dropbox. Now, to do this, you'll have to [22.1s] connect your accounts. And to get that [24.3s] set up, all you have to do is click on [26.5s] the mentor name and then click the data [28.5s] sets tab. Once you're here, you'll be [31.3s] taken to a screen that looks like this. [32.9s] And you'll hit the add resource button. [35.0s] We'll start with Google Drive. [39.4s] And you can see here it'll pop up a [41.1s] little thing to connect my account. I'll [43.7s] select that. [48.6s] Now you can select any file from your [50.3s] Google Drive folder. It can even be a [52.4s] folder, uh, a specific document, so on [54.6s] and so forth. Now, to connect to one [59.7s] drive, you'll hit the same button, add [62.3s] resource, but instead of clicking Google [64.6s] Drive, you'll do one drive from [65.8s] Microsoft. [69.4s] It'll take you [72.1s] to your drive if you've already [74.4s] connected it. And if you haven't [76.2s] connected it, it'll take you through a [77.4s] series of prompts to log in and confirm [79.4s] you're allowing mentor AI to access your [81.8s] materials. You'll just select a file [83.8s] from here and then you can train your [86.5s] mentor on that file. And yeah, that's [89.0s] basically all you need to do to connect [90.6s] to your drives and upload materials from [93.1s] there. --- ## Google Drive URL: https://ibl.ai/docs/configuration/datasources/googledrive Video: https://www.youtube.com/watch?v=5yWOSRc4NiU # Google Drive ## Purpose Link **Google Drive files or folders** to a mentor so its knowledge base can include those materials. --- ## Steps ### Open the Mentor’s Datasets - Click the **mentor’s name** - Select the **Datasets** tab ### Add a Resource from Google Drive - Click **Add Resource** - Choose **Google Drive** - When prompted, **connect your Google account** and follow the steps - After connecting, pick any **file or folder** in your Drive ### Train the Mentor - After selecting the file or folder, the mentor can be **trained** on that material --- ## Result The mentor now includes content from **Google Drive** in its knowledge base. ### Transcript mentorAI | Data Sources | Google Drive, Dropbox and Microsoft OneDrive https://www.youtube.com/watch?v=5yWOSRc4NiU [1.1s] When you're creating a mentor on mentor [3.0s] AI, you define the knowledge base that [5.5s] mentor is trained on. You do this by [7.4s] uploading documents, uh files, links, so [10.5s] on and so forth. But you can also [12.2s] actually use a couple of different uh [14.2s] file hosting options such as Google [15.9s] Drive, uh Microsoft, uh one drive, and [19.7s] Dropbox. Now, to do this, you'll have to [22.1s] connect your accounts. And to get that [24.3s] set up, all you have to do is click on [26.5s] the mentor name and then click the data [28.5s] sets tab. Once you're here, you'll be [31.3s] taken to a screen that looks like this. [32.9s] And you'll hit the add resource button. [35.0s] We'll start with Google Drive. [39.4s] And you can see here it'll pop up a [41.1s] little thing to connect my account. I'll [43.7s] select that. [48.6s] Now you can select any file from your [50.3s] Google Drive folder. It can even be a [52.4s] folder, uh, a specific document, so on [54.6s] and so forth. Now, to connect to one [59.7s] drive, you'll hit the same button, add [62.3s] resource, but instead of clicking Google [64.6s] Drive, you'll do one drive from [65.8s] Microsoft. [69.4s] It'll take you [72.1s] to your drive if you've already [74.4s] connected it. And if you haven't [76.2s] connected it, it'll take you through a [77.4s] series of prompts to log in and confirm [79.4s] you're allowing mentor AI to access your [81.8s] materials. You'll just select a file [83.8s] from here and then you can train your [86.5s] mentor on that file. And yeah, that's [89.0s] basically all you need to do to connect [90.6s] to your drives and upload materials from [93.1s] there. --- ## Microsoft OneDrive URL: https://ibl.ai/docs/configuration/datasources/microsoftonedrive Video: https://www.youtube.com/watch?v=5yWOSRc4NiU # Microsoft OneDrive ## Purpose Link **Microsoft OneDrive files** to a mentor so its knowledge base can include those materials. --- ## Steps ### Open the Mentor’s Datasets - Click the **mentor’s name** - Select the **Datasets** tab ### Add a Resource from OneDrive - Click **Add Resource** - Select **OneDrive (Microsoft)** - If already connected, your drive opens; otherwise, **log in and approve access** - Choose the desired **file** ### Train the Mentor - After selecting the file, the mentor can be **trained** on that material --- ## Result The mentor now includes content from **OneDrive** in its knowledge base. ### Transcript mentorAI | Data Sources | Google Drive, Dropbox and Microsoft OneDrive https://www.youtube.com/watch?v=5yWOSRc4NiU [1.1s] When you're creating a mentor on mentor [3.0s] AI, you define the knowledge base that [5.5s] mentor is trained on. You do this by [7.4s] uploading documents, uh files, links, so [10.5s] on and so forth. But you can also [12.2s] actually use a couple of different uh [14.2s] file hosting options such as Google [15.9s] Drive, uh Microsoft, uh one drive, and [19.7s] Dropbox. Now, to do this, you'll have to [22.1s] connect your accounts. And to get that [24.3s] set up, all you have to do is click on [26.5s] the mentor name and then click the data [28.5s] sets tab. Once you're here, you'll be [31.3s] taken to a screen that looks like this. [32.9s] And you'll hit the add resource button. [35.0s] We'll start with Google Drive. [39.4s] And you can see here it'll pop up a [41.1s] little thing to connect my account. I'll [43.7s] select that. [48.6s] Now you can select any file from your [50.3s] Google Drive folder. It can even be a [52.4s] folder, uh, a specific document, so on [54.6s] and so forth. Now, to connect to one [59.7s] drive, you'll hit the same button, add [62.3s] resource, but instead of clicking Google [64.6s] Drive, you'll do one drive from [65.8s] Microsoft. [69.4s] It'll take you [72.1s] to your drive if you've already [74.4s] connected it. And if you haven't [76.2s] connected it, it'll take you through a [77.4s] series of prompts to log in and confirm [79.4s] you're allowing mentor AI to access your [81.8s] materials. You'll just select a file [83.8s] from here and then you can train your [86.5s] mentor on that file. And yeah, that's [89.0s] basically all you need to do to connect [90.6s] to your drives and upload materials from [93.1s] there. --- ## LLMs URL: https://ibl.ai/docs/configuration/llms Video: https://www.youtube.com/watch?v=6_s7E7oB6ds # LLMs ## Why mentorAI Is LLM-Agnostic ![](/images/llms.png) - Let each mentor use the language model best suited to its subject. - **Math mentor**: choose a model optimized for symbolic reasoning and step-by-step solutions. - **English composition mentor**: switch to a model fine-tuned on rhetoric or literary analysis. - Improves clarity, accuracy, and learning outcomes within each domain. - Future-proofs your mentors as specialized models continue to appear. --- ## Two Ways to Open the LLM Selector ### Click the Model Name - In the mentor card, click the current LLM’s name. - The list of all LLM providers appears. ### Use the Mentor Dropdown - Click the mentor’s name. - Select **LLM** from the dropdown menu. - You land on the same provider list. --- ## Choose a Provider and Model - Click a provider (for example, OpenAI or Google). - A list of that provider’s models is displayed. - Select the model you want—e.g., `o1-preview`. - A **“Successfully changed”** notification confirms the switch. - Repeat anytime to pick another provider or model. --- ## Adding New or Custom Models - If a provider or model isn’t visible, it can be added to the platform quickly. - mentorAI stays current by implementing new models as soon as they are released. --- ## Final Step: API Keys (If Required) - Some models may need you to supply your own LLM keys. - Enter the keys when prompted; afterward, the mentor will run on the chosen model. --- That’s all it takes—select, confirm, and your mentor is now powered by the model that best fits its subject. ### Transcript mentorAI | LLM Configuration https://www.youtube.com/watch?v=6_s7E7oB6ds [0.5s] AI is an LLM agnostic platform. Being [3.4s] LLM agnostic lets instructors match the [5.8s] large language model strengths uh to the [8.1s] sub subject matter rather than forcing [10.1s] every single mentor into a [12.3s] one-sizefits-all engine. For example, um [15.2s] a math mentor can tap into a model [17.0s] optimized for symbolic reasoning and [19.0s] step-by-step equation solving, while an [22.0s] English composition mentor um can switch [24.8s] to a model fine-tuned uh on rhetorical [27.2s] structures or literary analysis or both. [30.3s] Um each would be delivering more clear, [32.7s] more accurate feedback within its own [34.2s] domain by selecting an LLM that meets [36.6s] the needs for that specific mentor. Um [39.6s] this flexibility not only lists learning [41.8s] outcomes but also futurep proofs uh you [44.5s] know your mentors with these specialized [46.9s] models. Now to switch your mentors model [50.4s] you have a couple of different ways of [51.7s] doing this. You can select the LLM model [53.8s] here which will pull up the list of LLM [56.3s] providers we currently have. Now we also [59.0s] can go to mentor AI right here which is [61.2s] the mentor we're currently using and [63.4s] select LLM. This will take you to the [65.3s] same page. Now, this is going to show [67.2s] you the providers. Um, now if if there's [70.0s] a provider or a specific LLM model that [71.9s] you don't currently see here, it can [73.4s] very well be added very easily. In fact, [76.2s] we try to keep this updated as soon as a [77.8s] new model comes out. That's what we uh [80.1s] implement into our platform. And now, [83.4s] now we have the providers, but what we [85.1s] can do is open up a select provider. So, [87.7s] I'm going to select OpenAI. And you can [89.6s] see here that there are multitude of [91.4s] models you can select from. So in this [94.2s] case, we can go with 01 preview and you [96.3s] can get a notification down at the [98.2s] bottom that just says, "Hey, this has [99.6s] successfully been changed." Um, and if [102.3s] we wanted to even go forward with a [104.5s] different model, you know, different [106.2s] provider, all of that, we can go to [108.2s] Google and then we can select which one [110.5s] we want. It's super simple to change [111.8s] your model. You don't really have to do [113.8s] much more than that. Now, you might need [116.0s] to provide your LLM keys depending on, [118.7s] you know, whatever we've worked out with [120.5s] you. But, uh, this is how you would [122.9s] switch your LM model. --- ## Administration URL: https://ibl.ai/docs/configuration/lti/administration Video: https://www.youtube.com/watch?v=oGJeqkvaS08 # Administration ## Purpose Control who can integrate and use a mentor via **LTI** from your LMS (e.g., Canvas), and who can see the mentor inside the **mentorAI web app**. --- ## LTI Accessible Toggle (Per-Mentor) 1. Open the mentor’s **Settings**. 2. Find **LTI accessible** and toggle: - **On** → The mentor appears in your LMS’s External Tool / Deep Linking picker and can be added to a course. - **Off** → The mentor does not appear in the LMS picker; any existing LTI link will show an error after refresh. ### Canvas example (when LTI accessible = On): - In a course: **Add External Tool → choose your mentor integration → select the mentor (e.g., AI Socratic Mentor) → Add Item → launch and chat.** - If you later toggle **Off** and **Save**, refreshing the Canvas item shows an **error**, and the mentor is **removed** from the add-list. --- ## “Administrators can view” (Web-app visibility) - If the mentor is marked **admin-only** in the mentorAI web app **and** LTI accessible is **On**: - Other users won’t see the mentor in the web app - But **students can still access it in the LMS** - Use this to keep a mentor **hidden in the web UI** while leaving **LMS access intact**. --- ## Who Sees Mentors in the LMS Picker - **Canvas admins:** - Do **not** need a mentorAI account - See **all mentors** available to the LMS environment (system-admin level) - **Instructors:** - **Must** have a mentorAI account using the **same email** as in the LMS - This filters the LMS picker so instructors only see **mentors they created** --- ## Typical Workflow 1. In mentorAI → open the mentor’s **Settings** → toggle **LTI accessible = On** → **Save**. 2. *(Optional)* Set **Administrators can view** if you want the mentor **hidden in the web app but still usable from the LMS**. 3. In Canvas: - **Add External Tool → choose your mentor integration → select the mentor → Add Item → launch** 4. To **revoke LMS access**: - Toggle **LTI accessible = Off** → **Save** - Existing LMS links **error after refresh** and the mentor **disappears** from the add-list. --- ## Results & Expectations - **On** → mentor appears in LMS picker; launchable in courses. - **Off** → mentor disappears from LMS picker; existing links fail on refresh. - **Admin-only (web) + On (LTI)** → mentor hidden in web app but available to students in LMS. --- Use these controls to manage **LTI visibility and access** without exposing mentors broadly in the web app. ### Transcript mentorAI | Managing LTI Access https://www.youtube.com/watch?v=oGJeqkvaS08 [1.0s] I'm going to be showing you some updates [2.5s] that we've made to LTI to help you [4.6s] manage who can and cannot access your [7.5s] mentors. Now, under the settings of a [9.9s] given mentor that you would like to [11.7s] integrate or prevent people from [13.5s] integrating via LTI, you'll see this [16.0s] little LTI accessible toggle. Now, if [18.7s] this is toggled on, that means we're [20.5s] going to be able to LTI integrate into a [22.6s] given LMS. In this case, we're going to [24.6s] use this Canvas course. And what we'll [27.9s] do is click module. Oop, sorry. We're [30.6s] going to click, you know, just add a new [33.9s] piece to the module with an external [35.8s] tool. I'm going to select the mentor [38.2s] integration. [41.0s] And you can see here that we have this [44.0s] mentor right here, this AI Socratic [45.8s] mentor, which is the one that we're [47.2s] looking at over here. [49.7s] Um, so what I'm going to do is hit [51.8s] select content, [55.0s] add item, [57.1s] and you'll see that once we go to this [59.7s] item, [64.3s] you'll actually be able to chat with the [65.8s] mentor. [67.5s] Now, if that LTI toggle is off, it won't [70.0s] show up in the list of accessible LTI [74.7s] components to add to this course. And [77.0s] I'll show you what that looks like in [78.2s] here in just a second. [80.4s] So, we're going to go back over here. [83.3s] And now what we're going to do is toggle [84.6s] off LTI access. And then we're going to [87.3s] hit save. [89.7s] Now, if we go back to the course over [91.5s] here, we'll refresh. And we should [93.4s] expect to see an error because this is [95.0s] no longer LTI accessible. And there you [97.8s] go. Now, another really cool thing here [100.2s] is that if this is marked as LTI [102.2s] accessible uh and it's marked as [104.7s] administrators can view, this basically [107.4s] prevents other people from seeing your [109.0s] mentor um on this platform, but within [112.4s] the Canvas or LMS environment, you and [115.8s] your users can still chat with it. So, [118.2s] I'm going to hit save here and then [121.2s] refresh. [127.5s] Now you can see this is back up and [129.3s] accessible to me despite this being [131.3s] admin only. This means just basically [134.2s] it's admin only on the web app for [136.4s] mentor AI but your students can still [138.3s] chat with the mentor in Canvas. Now the [141.1s] other thing here is if this is LTI [145.8s] uh if the LTI accessibility is turned [147.7s] off you'll hit save over here again. And [150.5s] then if we're going to add the same [153.8s] mentor again to the course with the [156.5s] external tool setup, [160.5s] you'll actually see that list. That [162.0s] mentor doesn't appear on this list [163.3s] anymore. [165.4s] So that's basically how this is working [167.9s] within a given LMS for managing your [171.1s] users and access to the mentors better. [173.6s] Another thing is that Canvas admins do [177.0s] not need to be registered on our [178.5s] platform to be able to see all available [180.9s] mentors to a given LMS environment as [184.6s] long as they're uh system admins. Now, [187.1s] instructors will need to have an account [189.0s] on our platform that with the same email [191.4s] address that they use in the LMS. And [193.3s] this is just so that we can filter out [195.4s] any mentors that they did not create [197.1s] themselves. They're only seeing the ones [198.5s] that they've created. [200.9s] Back over to mentor AI. [203.8s] just to explain the ultimate end goal [205.8s] here, which is you don't want other [208.1s] instructors to be able to see your [209.3s] mentor from the mentor spa unless you [211.8s] give them that access or um you don't [214.7s] you want your students to be able to [215.8s] access the Canvas course and be able to [218.5s] chat with the mentor. Um and again, [221.0s] you're just preventing people from [222.1s] seeing this on the web app that you [223.3s] don't want seeing it, but you're still [224.3s] giving your students access. --- ## Blackboard URL: https://ibl.ai/docs/configuration/lti/blackboard Video: https://www.youtube.com/watch?v=VDBKUMKWW2Y # Blackboard ## Prerequisites - Obtain the **Client ID** and the **Target Link URI** (tool launch link) from the **mentorAI** team. - Obtain the **LTI launch URL** (and any other parameters) if you’ll add the tool directly at the course level. - Be an **instructor (or admin)** with permission to add **LTI tools** in your Blackboard course. ## Admin Setup (One-Time) 1. In the **Admin Panel**, open **Integrations → LTI Tool Providers**. 2. Register a **new tool** (or edit an existing one). 3. Paste the **Client ID** and **Submit**. 4. The registration form auto-populates from the Client ID. 5. Optionally, choose user info to send (e.g., name, email). - Grade services, membership services, and user acknowledgment are optional in this demo. 6. **Create Placements** for how the tool appears in courses: - **Course Content Tool** – adds the mentor inside courseware. - **Deep Linking** – opens a content picker to choose specific mentors/resources. - **Course Tool** – adds an always-available course-level entry point. For each placement: - Give it a **Name** (and optional Description). - Set a unique **Handle**. - Mark **Available** and set **Placement Type** (Course Content, Deep Linking, or Course Tool). - For **Course Content Tool**, enable **Allow Grading** if you want grade passback. - Ensure **Allow Student Access** is on when ready for learners. - Set the **Target Link URI** (same as the tool launch link for standard launches; deep linking uses its specific launch). - **Save** your changes. ## Blackboard Ultra – Workflow A (Teaching Tools with LTI Connection) 1. Open your course in **Blackboard Ultra**. 2. Navigate to the folder/area where you want the mentor. 3. Click **Create → Teaching Tools with LTI Connection**. 4. Fill the form: - **URL:** paste the mentorAI launch URL. - **Name:** e.g., “EN Comp AI”. - **Description:** short friendly description. - **Open in a New Window:** enable if you prefer a separate tab. 5. **Save** and toggle **Visible to Students**. 6. Test by clicking the item (it opens embedded or in a new window per your choice). ## Blackboard Ultra – Workflow B (Content Market using a Placement) 1. Open the course. 2. Click **Add Content → Content Market**. 3. Select your **Course Content Tool placement** (created in Admin Setup). 4. The item is added; launch it to start chatting with the mentor. ## Blackboard Original (Legacy Interface) 1. Open the classic-layout course. 2. Go to **Build Content → Web Link**. 3. Provide: - **Name** of your AI mentor. - **URL** (mentorAI launch link). - Check **This link is a tool provider** (marks it as LTI). - Optional **Description**. 4. **Submit** and launch from the new link. ## Result - **Admin Setup:** Register the tool with **Client ID**, create **Placements**, set the **Target Link URI**, and enable student access. - **Ultra:** Use **Teaching Tools with LTI Connection (Workflow A)** or **Content Market with your placement (Workflow B)**. - **Original:** Build Content → Web Link → Tool Provider. Always use the **URLs/IDs provided by mentorAI** and make the item **visible** so students can access the assistant. --- # Course to Deep-Link ## Purpose Set up a **mentor for LTI Deep Linking** so you can add it to your **LMS** (e.g., Blackboard) yourself—assuming the base integration exists and deep linking is enabled. ## 1) Get the Mentor ID 1. Open the mentor you want to integrate. 2. Copy the **unique identifier** from the mentor’s URL. ## 2) Create the Course in Studio 1. Go to **[ibl.ai Studio](https://studio.learn.iblai.app)**. 2. Create a new course (e.g., “Socratic Mentor”) and choose the organization. ### Settings → Schedule & Details - Set the **Course Start Date** to a time in the past. - Set **Enrollment Start** to at least a day before the start date. - Click **Save**. ### Settings → Advanced Settings - In the **advanced modules list**, add: `"ibl mentor_xlog"` *(This enables the mentor component.)* - **Save** changes. ## 3) Add the Mentor Component 1. Go to **Outline → add Section → Subsection → Unit**. 2. Click **Advanced → Add New Component → Mentor**. 3. Click **Edit** and paste the mentor’s **unique ID** you copied earlier. 4. Set the **Display Name** (e.g., “Socratic Mentor”). ### Optional Settings (from the demo) - **Context Awareness:** Enter your LMS domain; copy it into the **Mentor Domain** and **Domain** fields. - **Anonymous:** Toggle if you want users to chat without authentication. - **Advanced View:** Enable tabs like **Summarize, Translate, Expand**. Click **Save** and **Publish** the unit. ## Result Your mentor is **published in Studio** and **selectable via deep linking** in your LMS—letting you add mentors to courses on your own. ### Transcript mentorAI | Deep Linking | Connect New Mentors to Your LMS https://www.youtube.com/watch?v=gfqIb8RyIxs [1.1s] Hello. Today I'm going to be talking [2.6s] about how you can more easily selfs [4.2s] serve when it comes to LTI integrations [7.0s] and and adding mentors to your own LMS. [9.9s] Essentially the process that we do is [11.7s] once somebody says a mentor is ready to [13.7s] go, we'll add it to a course in our own [15.9s] LMS which is edex. Um and that way the [18.8s] content can actually be served. The [20.2s] mentor can be consumed within you know [22.5s] whatever LMS you have. That could be [24.2s] Canvas, Brightpace, Blackboard, so on [26.3s] and so forth. Now, assuming we already [28.6s] have a base integration going um and you [31.3s] have deep linking enabled, the first [33.1s] thing you'll do is find the mentor that [35.0s] you're ready to integrate. You'll see in [37.2s] this URL bar, there's going to be a a [39.4s] unique identifier. What you'll do is [41.8s] copy this. Now, what you'll do is you'll [44.7s] go to our studio, which is going to be [47.0s] at studio.learnibai.app. [51.8s] So, we're going to come over here, and [53.7s] what we'll do is actually create a new [55.1s] course. So create a new course. We're [57.2s] going to say Socratic [60.5s] mentor [68.7s] and then you'll select the specific [70.2s] organization [73.3s] and then you'll hit create. [78.4s] Now once you're here, what I usually do [80.5s] first is go to schedule and details [82.1s] which is under the settings tab right [84.8s] here. So, you'll just click this and [85.9s] click schedule and details. You want to [87.8s] make sure this course start date is, you [90.3s] know, sometime in the past and the [92.2s] course enrollment date right here is at [95.7s] least a day before. It just has to be [97.8s] sometime before the actual start date um [99.9s] set for the course. So, we've got that [102.2s] selected and what we'll do is save. [105.4s] And then once that's saved, I'll go to [107.4s] advanced settings. [131.5s] All [144.1s] right. Once we're in advanced settings, [146.0s] what we'll do is click click into this [149.0s] little in its module list. And what you [151.0s] do is type in quotation mark ibl [155.0s] mentor_xlog. [158.0s] What this will do is enable the [160.0s] component inside the course that you can [162.4s] use to add in that that ID. So, we'll go [165.7s] ahead and hit save changes. [169.0s] Now, the next thing we'll do is go to [170.9s] the outline. [174.4s] And you don't really have to worry too [175.9s] much about, you know, the display names [177.8s] and settings and all that other than one [179.6s] than the ones that I've mentioned. [182.1s] You'll hit section, new subsection, and [184.8s] then click on new unit. This is where [186.5s] the content actually lives. [191.0s] So again, this is where the content [192.7s] lives. So we'll click on this advanced [194.6s] add new component button. Click on [197.4s] mentor and then we'll hit edit. And just [199.6s] to go back, we're going to get this same [201.8s] this string. It's a unique identifier [204.2s] for your mentor. [206.7s] So we're going to put this in [209.4s] and then we're going to put socratic [211.5s] mentor. [214.4s] And this will be the name that shows up [216.2s] on the list when you are adding this to [219.0s] your own LMS. [221.7s] All right. And then if you want to have [223.1s] any additional settings like context [224.8s] awareness, what you would do is put in [226.6s] your domain for your own LMS. So that [229.7s] could be like in our case it would be [231.4s] learn.ai.app. [235.1s] And then you can for the mentor domain [237.7s] and domain you can literally just copy [239.3s] and paste these in [241.6s] to these fields like this. [245.8s] And you can also choose whether or not [247.8s] this is anonymous, meaning users do not [249.8s] have to have an authenticated account to [252.7s] to talk with the mentor. And this is [255.0s] where you'll set context awareness. As [256.5s] long as you have that domain up in this [258.2s] field, you should be good to go. Um, we [260.7s] also have an advanced view has specific [262.5s] tabs like summarize, translate, and [264.8s] expand if that's a view that you would [266.2s] like to have in your course. And then [268.2s] we'll just simply hit save. [272.2s] And then you'll want to make sure this [273.7s] content is published. [277.8s] Once we have that, what we'll do now is [279.8s] go over to the Canvas side. Um, this is [282.2s] an example LMS. So, you can see that [284.7s] this is actually um now going to show up [287.7s] on your deep linking selection of tools [289.9s] to choose from. You know, which mentor [291.6s] you would like to integrate. [294.3s] So, now we're in Canvas and what we'll [296.2s] do is select the mentor that we want to [299.0s] use here. So I'll click on you know add [302.2s] new item. We're going to hit external [304.2s] tool and what we added on the canvas [307.3s] side prior to this uh because this is a [309.4s] current integration. This is the name of [311.4s] the deep linking tool. [314.6s] And once this populates what you'll be [316.5s] able to see is actually the Socratic [318.9s] mentor we set up. So that way you can [321.1s] actually just add members as you please [323.4s] to your course and have this be a little [325.7s] bit more self-s serve. And that's it. --- ## Brightspace URL: https://ibl.ai/docs/configuration/lti/brightspace Video: https://www.youtube.com/watch?v=xePQv8VC8Cc # Brightspace ## Prerequisites ![](/images/brightspacelti.png) - A **System Admin** account in Brightspace. - All LTI 1.3 parameters from mentorAI: - **Domain** - **Redirect URL(s)** - **OpenID Connect Login URL** - **Key-set (JWK) URL** - **Target Link URI** (points to the specific mentor you want to surface) ## Register the Tool - Log in as a system admin and click the **gear icon**. - Choose **Manage Extensibility**. - Open the **LTI Advantage** tab. - Click **Register Tool → Standard**. - Fill out: - **Name and Description** (anything you like) - Paste the **Domain**, **Redirect URL(s)**, **OpenID Connect Login URL**, and **Key-set URL** provided by mentorAI - **Target Link URI** → link to the exact mentor (varies per integration) - Leave **Extensions**, **Roles**, **Substitution Parameters**, and **Custom Parameters** blank unless mentorAI instructs otherwise. - Save the registration. - Copy the **Registration Details** (they include the issuer, client ID, etc.) and send them back to mentorAI so they can complete their side of the setup. ## Create a Deployment - In the same **LTI Advantage** area, click **View Deployments**. - Choose **New Deployment** (or open the one you just created). - Under **Security Settings**, check the user-related boxes so Brightspace passes learner identity to mentorAI. - If the mentor will push grades back, enable **Assignments and Grade Services**. - Under **Configuration Settings**, tick **Make Tool Available to the Org** so any course can use it. - Save the deployment. - Copy the **Deployment ID** and provide it to mentorAI (needed to finish the integration). ## Create a Link to the Mentor - Still inside the deployment, click **View Links → New Link**. - Fill out: - **Name** (e.g., “mentorAI – Biology Tutor”) - **URL** → the same **Target Link URI** you used in registration - **Type of Launch** → Basic - Save the link. ## Add the Mentor to a Course - Return to the Brightspace homepage and open a course. - Navigate to the content area or module where you want the mentor. - Click **Add Existing → External Tool Activity**. - Select **mentorAI** from the tool list. - **Publish** the item so learners can see it. ## Result - **Register Tool** with mentorAI-supplied URLs. - **Deploy it**, enabling user identity (and grade services if needed). - **Create a Link** pointing to the specific mentor. - **Insert the link** into any course via **External Tool Activity**. Your mentorAI assistant is now live in Brightspace, ready to help learners directly inside their course pages. --- # Brightspace Deep Linking ## Purpose Integrate **Mentor AI content** into **Brightspace courses** using **LTI Deep Linking**. This allows instructors to insert specific mentors directly into course content modules. ## Step 1 — Configure the Tool in Brightspace 1. Go to the **Manage Extensibility** section as an admin 2. Open the **LTI Advantage** tab 3. Add a **new tool** or update an **existing one** - No main registration settings need to change 4. Under **Extension Settings**, make sure **Deep Linking** is enabled 5. Go to **Deployments** 6. Open an existing deployment 7. Confirm that **Deep Linking** is marked 8. *(Optional)* Adjust additional settings if needed (e.g., send user info) 9. **Save changes** ## Step 2 — Add the Deep Linking Launch 1. In the **Deployments**, go to **Links** 2. Add or edit a link that uses the **Deep Linking Launch URL** 3. Adjust display settings: - **Height and width** - **Type** should be **Quicklink** so content can be selected 4. **Save and close** ## Step 3 — Insert Into a Course 1. Open a **course** in Brightspace 2. Navigate to **Course Content** 3. Select a unit or create one 4. Click **Add Existing** 5. Choose the **Deep Linking option** (instead of the standard external tool) 6. Select the **deep linking configuration** you set earlier 7. A **content picker** opens, showing the mentors available for your tenant 8. Mentors display with their names for easy selection 9. Pick the desired mentor and **add it to the course** ## Result The chosen mentor appears as **integrated course content**. Learners can **launch it directly**, and instructors can **repeat the process** for additional mentors or resources. --- # Course to Deep-Link ## Purpose Set up a **mentor for LTI Deep Linking** so you can add it to your **LMS** (e.g., Brightspace) yourself—assuming the base integration exists and deep linking is enabled. ## 1) Get the Mentor ID 1. Open the mentor you want to integrate. 2. Copy the **unique identifier** from the mentor’s URL. ## 2) Create the Course in Studio 1. Go to **[ibl.ai Studio](https://studio.learn.iblai.app)**. 2. Create a new course (e.g., “Socratic Mentor”) and choose the organization. ### Settings → Schedule & Details - Set the **Course Start Date** to a time in the past. - Set **Enrollment Start** to at least a day before the start date. - Click **Save**. ### Settings → Advanced Settings - In the **advanced modules list**, add: `"ibl mentor_xlog"` *(This enables the mentor component.)* - **Save** changes. ## 3) Add the Mentor Component 1. Go to **Outline → add Section → Subsection → Unit**. 2. Click **Advanced → Add New Component → Mentor**. 3. Click **Edit** and paste the mentor’s **unique ID** you copied earlier. 4. Set the **Display Name** (e.g., “Socratic Mentor”). ### Optional Settings (from the demo) - **Context Awareness:** Enter your LMS domain; copy it into the **Mentor Domain** and **Domain** fields. - **Anonymous:** Toggle if you want users to chat without authentication. - **Advanced View:** Enable tabs like **Summarize, Translate, Expand**. Click **Save** and **Publish** the unit. ## Result Your mentor is **published in Studio** and **selectable via deep linking** in your LMS—letting you add mentors to courses on your own. ### Transcript mentorAI | Brightspace | Configuring LTI https://www.youtube.com/watch?v=xePQv8VC8Cc [0.9s] I'm going to be showing you how to [2.4s] integrate Mentor AI as an LTI 1.3 [5.4s] component in Brightpace. So, the first [8.5s] thing you're going to need to do is log [10.3s] in with your uh system admin account [12.5s] because you'll need a system admin [14.2s] account to be able to create the tool. [16.6s] Now, once you're there, you'll click on [18.3s] this little gear icon and then you'll go [21.2s] down to manage [24.2s] extensibility. So, now that we're here, [26.4s] we're going to click on the LTI [27.7s] advantage tab. [30.5s] and you'll click on register [35.2s] tool. Once you click on register tool, [37.7s] you'll hit standard and you'll enter a [40.4s] name, a description. Those can be [42.0s] whatever you so choose and we will be [44.9s] providing you with the other parameters [47.5s] needed to integrate the LEI. So we'll [49.8s] give you the domain, the redirect URLs, [52.4s] open ID, uh connection login URL, target [56.2s] link, key set URL, so on and so forth. [59.3s] Now in terms of extensions, roles and [61.8s] substitution parameters, as well as [63.4s] custom parameters, you won't need to [65.3s] fill anything out here. Now to see an [68.7s] example of this, we'll click on this one [71.7s] that we already have in our own bright [76.4s] space. All right. So you can see what [78.7s] this looks like filled out. You'll see [80.8s] all of the parameters that you need. The [83.8s] domain redirect uh open ID target or [87.5s] sorry not target key set URL will [90.0s] basically remain the same for your [91.5s] organization um per LTI integration. But [94.6s] this target link URI will determine [97.0s] which exact mentor is being displayed. [99.9s] So this one will change per LTI [101.9s] integration. [103.8s] Um you can use the assignments and grade [106.1s] services if you are using a grade [108.2s] passback mentor but in this case you [110.5s] won't necessarily need [112.3s] it. And again you can see that we don't [115.5s] have any substitution parameters custom [117.1s] parameters so on and so forth. But then [118.6s] we have um the registration details [122.2s] which you will need send them back to us [123.8s] so we can enter them on our side to [125.3s] complete the integration. [127.8s] And so once we have this, we'll go to [129.8s] view [138.2s] deployments. And you can see that we [140.4s] have this deployment here. Now, we can [143.7s] click on [145.2s] that. And it'll look very similar to [147.7s] what you saw before. Um, but there are a [150.9s] couple of different pieces you'll need [152.3s] to check off here. Under the security [154.1s] settings, you'll need to check off any [155.7s] information related to user [157.6s] specifically. You can select these other [159.7s] pieces, but it's not [161.8s] necessary. And then for the [163.4s] configuration settings, again, if you do [165.2s] have any grading going on with this [167.4s] mentor, uh you'll select these two for [170.3s] grade [171.6s] passback. And then you will want to make [173.8s] the tool available to the org. Uh so [176.6s] that way it can be used across any [178.5s] course inside of the org in in [180.6s] Brightpace. [182.1s] Now, we're going to go to view [184.2s] links. Um, so we'll click on this. [187.0s] You'll hit new link if you're creating a [189.0s] new one. Um, and I'll show you an [191.8s] example of this so you can understand, [193.4s] you know, what kind of description, URL, [195.7s] name, kind of launch it is, and all of [197.7s] that. So, I'm going to hit cancel here. [200.6s] We'll click on mentor AI. [204.0s] Now you'll see here we have the name, [206.2s] the URL which is going to be the same [208.1s] target URI that indicates which mentor [210.6s] is going to be displayed and the type of [212.9s] launch you're going to use is [215.1s] basic. So now that we have everything [217.9s] configured on the system admin side, we [221.7s] can now actually start using this in [223.5s] courses as course [226.8s] authors. So we'll go back to the [229.0s] homepage. [233.2s] We'll select a [235.8s] course. Going to click on course [240.8s] information. And then if we want to add [243.6s] new integration, specifically the mentor [245.8s] that we just set up, add existing. And [248.7s] you'll click on external tool [251.0s] activity. It'll show you what external [254.0s] tools you can uh add to that specific [256.8s] page in the course. You'll click on [258.0s] mentor AI. [261.3s] And you will have the mentor. --- ## Canvas URL: https://ibl.ai/docs/configuration/lti/canvas Video: https://www.youtube.com/watch?v=pmiFXWWQL3g # Canvas ## Prerequisites ![](/images/canvaslti.png) - You must be a Canvas admin. - Obtain all LTI parameters from the mentorAI team (redirect URI, target link URL, OpenID details, JWK URL, etc.). ## Create a Developer Key - In Canvas, open **Admin → Developer Keys**. - Click **Add Developer Key** and choose **LTI Key**. - Fill the form: - **Name**: mentorAI (or similar). - **Owner Email**: your contact. - **Configuration Type**: Manual Entry. - **Redirect URI**: paste the value supplied by MentorAI. - **Title / Description**: e.g., “mentorAI – LTI 1.3 integration.” - **Target Link URI**: the mentor link provided by mentorAI. - **OpenID Connect Initiation URL**: supplied by mentorAI. - **JWK Method**: select **JWK URL** and paste the URL from mentorAI. - **Placements**: enable **Account Navigation** (top-level link) and/or **Link Selection** (in-course content). - Save the key and set its **State** to **On**. - Note the **Client ID** shown in the list (you’ll need it shortly). ## Register the Key as an App - Navigate to **Admin → Settings → Apps**. - Click **View App Configurations → Add App**. - Choose **By Client ID**. - Paste the **Client ID** from the Developer Key page and submit. - The MentorAI app now appears in the list. - Open the app’s settings to confirm or adjust **Placements**. - Copy the **Deployment ID(s)**—send them to mentorAI so they can finalize the provider-side setup. ## Add mentorAI to a Course - Go to **Courses** and open the desired course. - Enter **Modules**. - Click **+ Module**, give it a name, and **Add Module**. - Inside the new module, press **+** to add an item. - Select **External Tool**. - Choose **mentorAI** from the list (URL and page name auto-populate). - Leave **Load in a New Tab** unchecked unless preferred. - Click **Add Item**, then **Publish** the module item. - Once published, learners will see and access the integrated mentor directly in Canvas. ## Result - **Developer Key** creates the LTI credentials. - **Client ID** registers the app inside Canvas. - **Deployment ID** completes provider-side setup. - **Modules → External Tool** inserts mentorAI into any course. Your mentorAI LTI integration is now live in Canvas, ready for learners to use. --- # Canvas Deep Linking ## Purpose Integrate **mentorAI content** into an LMS course (example shown in **Canvas**) using **LTI Deep Linking**. This enables instructors to insert specific mentors or tools directly into course modules. ## Step 1 — Configure the Tool in Canvas 1. Go to the **Admin Menu** 2. Select **Developer Keys** 3. Either create a **new tool** or edit an **existing one** 4. In the tool’s configuration: - Scroll to **Link Selection** - Enter the **Target Link URI** for the deep-linking launch - Set the **Message Type** to **LTI Deep Linking Request** - Save your changes ## Step 2 — Add Deep Linking to a Course 1. Navigate to the **desired course** 2. Add an **External Tool** 3. Select the **Deep Linking Launch** option you configured 4. A **content selection menu** appears ## Step 3 — Choose and Insert Content 1. From the selection menu, pick the **mentor or resource** to integrate (e.g., Career Path Mentor) 2. The LMS returns the **Target Link URI** and a **suggested title** 3. Add the item ## Result The selected **mentorAI content** is now **embedded in the course** and ready to launch directly through **LTI Deep Linking**. --- # Course to Deep-Link ## Purpose Set up a **mentor for LTI Deep Linking** so you can add it to your **LMS** (e.g., Canvas) yourself—assuming the base integration exists and deep linking is enabled. ## 1) Get the Mentor ID 1. Open the mentor you want to integrate. 2. Copy the **unique identifier** from the mentor’s URL. ## 2) Create the Course in Studio 1. Go to **[ibl.ai Studio](https://studio.learn.iblai.app)**. 2. Create a new course (e.g., “Socratic Mentor”) and choose the organization. ### Settings → Schedule & Details - Set the **Course Start Date** to a time in the past. - Set **Enrollment Start** to at least a day before the start date. - Click **Save**. ### Settings → Advanced Settings - In the **advanced modules list**, add: `"ibl mentor_xlog"` *(This enables the mentor component.)* - **Save** changes. ## 3) Add the Mentor Component 1. Go to **Outline → add Section → Subsection → Unit**. 2. Click **Advanced → Add New Component → Mentor**. 3. Click **Edit** and paste the mentor’s **unique ID** you copied earlier. 4. Set the **Display Name** (e.g., “Socratic Mentor”). ### Optional Settings (from the demo) - **Context Awareness:** Enter your LMS domain; copy it into the **Mentor Domain** and **Domain** fields. - **Anonymous:** Toggle if you want users to chat without authentication. - **Advanced View:** Enable tabs like **Summarize, Translate, Expand**. Click **Save** and **Publish** the unit. ## 4) Add It in Your LMS (Canvas Example) 1. In **Canvas**, go to your course → **Add New Item → External Tool**. 2. Select your existing **Deep Linking tool**. 3. In the picker, choose the **Socratic Mentor** you just created. 4. Add the item. It now appears in the course for launch. ## Result Your mentor is **published in Studio** and **selectable via deep linking** in your LMS—letting you add mentors to courses on your own. --- # Chat Persistence ## Purpose Keep your mentor conversation **active and uninterrupted** in the Canvas side panel when you navigate between pages—**no reset or loss of context**. ## Steps 1. In the Canvas side panel, ask a question Example: > “What is the War of 1812?” 2. Navigate to a **different page** in the same Canvas course. 3. The mentor automatically **remembers where you left off**, restores the same chat thread, and you can continue the conversation seamlessly. ## Result Your chat **persists across page changes** in Canvas, allowing you to keep talking to the mentor without losing the conversation thread. ### Transcript ibl.ai | Canvas | LTI 1.3 Deep Linking https://www.youtube.com/watch?v=vw1hrcZWsRU [1.1s] Hi, my name is Jeff from IBLAI and today [3.6s] we're going to look at integrating deep [5.9s] linking with LTI. So the this is Canvas [9.3s] and the first thing we're going to do is [10.5s] go to the admin menu and then select [12.4s] developer keys and then you can either [15.2s] create or edit an existing tool. [18.3s] This is an existing tool, so we don't [20.2s] need to change anything about the main [21.5s] integration. But if we go down to link [24.2s] selection, we can enter in a target link [27.2s] URI for the deep linking launch. And [29.8s] we'll make sure that the message type is [31.8s] LTI deeping request. Then we can save [34.7s] it. [36.6s] Then go over to a course [41.8s] and we can add an external tool [45.9s] and select our deep linking launch [49.1s] and that'll bring up a content selection [50.7s] menu so you can choose the content you [52.6s] want to integrate with. We'll select the [55.5s] career path mentor [58.2s] and then we'll get the target link URI [60.0s] returned for us and then also a [61.4s] suggested title. So, we can add the item [64.7s] and then it's ready to launch. [74.2s] And that's it. --- ## Canvas Side Panel Screen Sharing URL: https://ibl.ai/docs/configuration/lti/canvassidepanel Video: https://www.youtube.com/watch?v=M5HT9RXJyqI # Canvas Side Panel Screen Sharing ## Overview UI/UX enhancements to the screen sharing experience when using the mentor inside the Canvas LTI 1.3 side panel. These updates give users more control over audio, mute state, and conversation transcripts during screen sharing sessions. --- ## Starting a Screen Share 1. Open the mentor in the Canvas **side panel**. 2. Initiate screen sharing from the mentor controls. 3. The mentor will respond once it can see the shared screen. --- ## Audio Controls - **Mute yourself**: prevents the mentor from hearing you (useful in noisy environments or when pausing the conversation) - **Mute the mentor**: silences the mentor's verbal responses while still receiving text in the screen share preview - Both controls are accessible from the side panel during an active screen share --- ## Text Streaming During Muted Audio When the mentor's audio is muted: - Voice communication remains available in both directions - The mentor's response streams as **text** in the screen share preview below - This allows you to read responses without audio --- ## Ending a Screen Share 1. Click **Stop Screen Sharing** (available from two locations in the UI). 2. Upon ending, a **full transcript** of the screen sharing conversation appears in the chat. 3. The transcript preserves the complete exchange for reference. --- ## Key Features - **Mute/unmute controls** for both user and mentor audio - **Text preview** of mentor responses during muted state - **Automatic transcript** generated when screen sharing ends - **Status indicators** visible in the side panel throughout the session --- ## Dynamic Mentor Integration (LTI 1.3) URL: https://ibl.ai/docs/configuration/lti/dynamicmentor Video: https://www.youtube.com/watch?v=-R6nQbwyICc # Dynamic Mentor Integration (LTI 1.3) ## Overview A dynamic mentor integration using LTI 1.3 that allows different mentors to appear in different courses within the same LMS environment. This example uses Canvas but applies to other LMS platforms. --- ## Admin Setup: Developer Keys 1. In Canvas, go to **Admin → Developer Keys**. 2. Click **Developer Keys → LTI Key** to create a new integration. 3. Configure the parameters (provided by the mentorAI team): - **Redirect URIs** - **Target Link URI** - **OpenID Connect Initiation URL** - **JWK Method**: Public JWK URL 4. Under **LTI Advantage Settings**, enable user data sharing (email, name) for reporting visibility. 5. Add the **custom field** that enables dynamic mentor selection per course. 6. Set **Privacy Level** to Public. 7. Set **Placements**: Account Navigation and Link Selection (defaults). --- ## Course Navigation Placement Add **Course Navigation** as a placement: - **Default enabled**: the mentor appears in every course's side navigation automatically - **Default disabled** (via Paste JSON): the mentor does not appear in side navigation unless an instructor enables it To disable by default, use Paste JSON and set `"default": "disabled"` under the course navigation placement. --- ## Instructor Configuration 1. Open the course where you want the mentor. 2. An instructor/admin panel shows an **Enable** toggle and a **Mentor ID** field. 3. Get the mentor's unique ID from the mentor platform URL (the segment after the last `/`). 4. Paste the Mentor ID and click **Save**. --- ## Student View - Students see the mentor in the **side panel** and can chat directly. - If both the side panel and course navigation are enabled, the mentor appears in both locations. - Students can close the side panel and use the course navigation page exclusively. --- ## Hiding the Course Navigation Link As an instructor: 1. Go to **Course Settings → Navigation**. 2. Drag the mentorAI item to the **hidden** section. 3. Click **Save** — the mentor no longer appears in the course navigation for students. As an admin: - Set `"default": "disabled"` in the Developer Key JSON to prevent the course navigation link from appearing globally. - Instructors can still enable it per-course if the admin leaves the option available. --- ## Key Takeaways - **Dynamic integration** allows different mentors per course using a single LTI tool - **Mentor ID** links each course to a specific mentor from the platform - **Course navigation** and **side panel** are independent — you can enable either or both - **Instructors** control per-course visibility; **admins** control global defaults --- ## Moodle LTI 1.3 Deep Linking URL: https://ibl.ai/docs/configuration/lti/moodle Video: https://www.youtube.com/watch?v=oOPGSEaiE4U # Moodle LTI 1.3 Deep Linking ## Prerequisites - You must be a Moodle **site administrator**. - Obtain all LTI parameters from the mentorAI team (tool URL, key set URL, login URL, etc.). --- ## Register the Tool 1. Go to **Site Administration → Plugins → Activity Modules → Manage Tools**. 2. Click **Configure a Tool Manually**. 3. Fill in the fields: - **Tool name**: Mentor AI - **Tool URL**: provided by mentorAI - **Key set URL**: provided by mentorAI - **Login URL**: provided by mentorAI - **LTI version**: **1.3** 4. Under **Tool Configuration Usage**, select **Show as preconfigured tool and adding an external tool**. 5. Set **Default Launch Container** to **Existing Window** (or choose Embed, Embed Without Blocks, or New Window). 6. Check **Deep Linking** to enable content selection. 7. Under **Privacy**, enable sharing the user's **name** and **email** for reporting purposes. 8. Save the configuration. --- ## Add the Mentor to a Course 1. Navigate to the course where you want to add the mentor. 2. Click **Edit Mode** to enable editing. 3. Click the **+ Activity or Resource** button. 4. Select **Mentor AI** from the list (under Activities or Starred). 5. Click **Select Content** — a window opens showing all available mentors. 6. Choose the mentor you want to integrate (e.g., Career Path Mentor). 7. Click **Select Content**. 8. Optionally add a description or configure the module settings. 9. Click **Save and Display**. --- ## Result - The mentor loads directly in the course page. - Students can chat with the mentor immediately. - The embedded size can be adjusted from the integration settings. --- ## Key Takeaways - **LTI version must be 1.3** — verify this in the tool configuration - **Deep linking** allows instructors to select specific mentors per course module - Enable **name and email sharing** for accurate learner analytics - The mentor can be added to multiple courses with different mentor selections via deep linking --- ## Administration URL: https://ibl.ai/docs/configuration/mcp/administration Video: https://www.youtube.com/watch?v=Y4rLO5y0mzE # Administration ## Purpose Configure MCP connectors so mentors can call external MCP tools and return results inside the chat. --- ## MCP Tab Overview - **Featured connectors:** prebuilt/default (some open-source) you can enable - **Custom connectors:** define your own endpoint and authentication --- ## Add a Custom Connector 1. Go to **MCP → Custom Connectors → Add**. 2. Fill the fields: - **Image** (optional) - **Name** - **URL** (connector/server URL) - **Description** (optional) - **Transport type** - **Authentication method:** - None - API Key - **Token type for the header** (e.g., Bearer, Basic, Token) - **Token value** 3. Click **Connect** to save. - The connector appears in your list. 4. You can **enable/disable** any connector. - Disabling prevents mentors from using that MCP server in replies. --- ## Example: Connect an MCP Server with API Key 1. In the MCP provider, copy the **Access URL** and **Access Token**. 2. In **MCP → Custom Connectors**, set: - **Name:** e.g., Workflow MCP - **URL:** paste the access URL - **Auth:** API Key - **Token type:** Bearer - **Token value:** paste the token 3. Click **Connect**. - The connector is added and can be toggled on/off. --- ## Using MCP in Chat (Demo Flow) 1. Grab a **project ID** from your MCP workspace. 2. Ask the mentor, for example: - “List the workflows in the project ``.” 3. The mentor lists active workflows (e.g., FAQ, webhook test, etc.). 4. Run a workflow, for example: - “Execute the FAQ workflow and tell me: How do I reset my password?” 5. The mentor includes your question in the workflow request body and returns the answer from the MCP server. --- ## Result Your mentors can call **enabled MCP connectors** to list resources and execute workflows, returning **MCP-sourced answers** directly in the conversation. --- # MCP Servers Model Context Protocol (MCP) servers that expose APIs as tools for AI agents like Claude Code and Cursor. ## Available Servers | Server | Endpoint | Description | |--------|----------|-------------| | Analytics | `/mcp/analytics/` | Learning analytics, mentor usage, LMS metrics | | Search | `/mcp/search/` | Course catalog and mentor search (global only, personalized excluded) | | Agent Chat | `/mcp/agent-chat/` | Chat with AI mentors | | Agent Create | `/mcp/agent-create/` | Create and manage AI mentors | ## Authentication All servers require a **Platform API Key**: 1. Log into the Admin panel 2. Navigate to Platform Settings > API Keys 3. Create and copy your API key --- ## Analytics Server Get insights about learning activity, mentor usage, and costs. ### Description (for MCP Setup) > The iblai-analytics MCP server enables AI agents to access detailed analytics about mentor-student interactions, conversation patterns, topic analysis, sentiment tracking, and cost reporting. This server provides comprehensive insights into learning activity, mentor usage, user engagement metrics, LLM costs, and platform effectiveness. Key features include: learner analytics, content analytics, message and session tracking, topic analysis, financial reporting, and time-based metrics. ### Setup ```json { "mcpServers": { "iblai-analytics": { "url": "https://your-instance.com/mcp/analytics/", "transport": "streamable-http", "headers": { "Authorization": "Api-Token YOUR_API_KEY" } } } } ``` ### Example Queries Once configured, ask your AI assistant: - "How many users are active in the acme platform?" - "What topics are students asking about most?" - "Show me conversation trends for the last month" - "What are the LLM costs broken down by mentor?" - "Which students have the most chat messages?" - "What's the average course grade?" ### Working Examples #### Basic Health Check and User Info ```python # Health check ping() # Returns: "2026-01-12T23:08:47.008677+00:00" # Get current user get_current_user() # Returns: "ibl_user" # Get user platforms get_user_platforms() # Returns: "User: ibl_user\nMember of platforms: ibl, main, ...\nAdmin of platforms: ibl, main, ..." # Count users in platform count_users_in_platform(platform="ibl") # Returns: 8 ``` #### Learner Analytics ```python # Get learner analytics for a specific user get_learner_analytics( username="ibl_user", platform_key="ibl" ) # Returns: User summary with time spent, enrollments, credentials, etc. # Get learner list for a platform get_learner_list( platform_key="ibl", limit=5 ) # Returns: Paginated list of learners with metrics # Get detailed learner information get_learner_details( username="ibl_user", platform_key="ibl" ) # Returns: Consolidated learner analytics across catalog, mentor, and credential data ``` #### Content Analytics ```python # Get content analytics (requires platform_key) get_content_analytics( metric="courses", platform_key="ibl" ) # Returns: Course analytics with enrollments, time spent, completion rates, etc. # Get detailed content information get_content_details( content_id="course-v1:Academy+S-IB-50+v1", metric="course", platform_key="ibl" ) # Returns: Detailed course information with learner breakdowns ``` **Important Notes:** - Most analytics endpoints **require** `platform_key` parameter - `get_web_analytics_time_spent_per_user` requires admin user privileges - Use `metric` parameter to specify content type: `"course"`, `"courses"`, `"program"`, `"programs"`, `"pathway"`, `"pathways"`, `"skill"`, or `"skills"` --- ## Search Server > **⚠️ Temporarily Disabled**: The Search server is currently disabled while we clean up naming conventions. It will be re-enabled in a future update. Search for courses and mentors in the catalog. ### Description (for MCP Setup) > The iblai-search MCP server enables AI agents to search for courses, programs, pathways, and mentors in the learning catalog. This server provides global search capabilities for discovering educational content and AI mentors. Key features include: catalog search across courses, programs, pathways and skills; mentor search to find AI tutors by topic or expertise; and course recommendations based on user preferences. ### Setup ```json { "mcpServers": { "iblai-search": { "url": "https://your-instance.com/mcp/search/", "transport": "streamable-http", "headers": { "Authorization": "Api-Token YOUR_API_KEY" } } } } ``` ### Example Queries - "Find courses about machine learning" - "Search for mentors that teach Python" - "What courses would you recommend for a beginner?" - "Show me the most popular courses in the catalog" ### Working Examples #### Health Check ```python # Health check ping() # Returns: "2026-01-12T23:14:26.318962+00:00" ``` #### Catalog Search ```python # Search catalog for courses, programs, pathways, skills get_catalog_search( query="test", limit=5 ) # Returns: Search results with courses, programs, pathways # Includes facets for filtering (topics, subjects, tags, etc.) ``` #### Mentor Search ```python # Search for mentors (requires platform_key for authenticated requests) get_mentor_search( query="test", limit=5, platform_key="ibl" ) # Returns: List of mentors matching the query with metadata ``` **Important Notes:** - `get_mentor_search` **requires** `platform_key` parameter for authenticated requests - Use `platform_key` or `tenant` parameter (both serve the same purpose) - Search results include pagination information #### Recommendations ```python # Get course recommendations get_recommendations( platform_key="ibl", limit=3 ) # Returns: Personalized course recommendations based on RAG search # Can specify recommendation_type: "mentors", "courses", "programs", "resources", "pathways" ``` --- ## Agent Create Server Create and manage AI mentors, including CRUD operations, forking, and configuration. ### Description (for MCP Setup) > The iblai-agent-create MCP server enables AI agents to create, configure, and manage AI mentors. This server provides full mentor lifecycle management including creation from templates, settings configuration, and document training. Key features include: create mentors from templates; configure LLM settings (provider, model, temperature); manage display settings and feature flags; train mentors with documents, URLs, or text content; and update mentor prompts and permissions. ### Setup ```json { "mcpServers": { "iblai-agent-create": { "url": "https://your-instance.com/mcp/agent-create/", "transport": "streamable-http", "headers": { "Authorization": "Api-Token YOUR_API_KEY" } } } } ``` ### Example Queries - "Create a new mentor for data science" - "List all mentors in the platform" - "Update mentor settings for mentor ID 123" - "Fork mentor ID 456 with new name" - "Show me available mentor templates" - "What categories are available for mentors?" ### Working Examples #### Health Check ```python # Health check ping() # Returns: "2026-01-12T22:57:37.066570+00:00" ``` #### Get Mentor Settings ```python # Retrieve mentor settings get_mentor_settings( org="ibl", user_id="ibl_user", mentor="ad2be335-5afa-4a9e-9298-8273b3d94e10" ) # Returns: Complete mentor configuration including: # - Display settings (theme, colors, images) # - LLM configuration (provider, model, temperature) # - Feature flags (image generation, web browsing, code interpreter) # - Prompts (system, proactive, study mode) # - Visibility and permissions ``` #### Create Mentor from Template ```python # Create a new mentor from a template post_mentor_with_settings( org="ibl", user_id="ibl_user", template_name="ai-mentor", new_mentor_name="test-mentor-123" ) # Returns: Created mentor object with unique_id, settings, and configuration # Optional parameters: display_name, description, system_prompt, llm_provider, etc. ``` #### Update Mentor Settings ```python # Update existing mentor settings put_mentor_settings( org="ibl", user_id="ibl_user", mentor="ad2be335-5afa-4a9e-9298-8273b3d94e10", mentor_description="Updated description for testing" ) # Returns: Updated mentor settings object # Can update: mentor_name, display_name, system_prompt, llm_provider, # enable_image_generation, enable_web_browsing, categories, types, subjects, etc. ``` #### Train Documents ```python # Train a document directly (for smaller documents) post_retriever_train( org="ibl", user_id="ibl_user", pathway="ad2be335-5afa-4a9e-9298-8273b3d94e10", # Use mentor_id as pathway url="https://example.com/test-document" ) # Returns: {"detail": "Document trained successfully"} # Train a document through worker process (for larger documents) post_train_document( org="ibl", user_id="ibl_user", type="url", pathway="ad2be335-5afa-4a9e-9298-8273b3d94e10", # Use mentor_id as pathway url="https://example.com/test" ) # Returns: Task confirmation or error message ``` **Important Notes:** - For `post_train_document` and `post_retriever_train`, use the **mentor_id** as the `pathway` parameter - The `pathway` parameter must be a valid mentor unique ID (UUID format) - `post_train_document` supports multiple types: `"file"`, `"url"`, `"text"` (for Wikipedia), etc. - If URL is not accessible, you'll get: `"We couldn't reach the website. It may be offline or blocking access."` --- ## Agent Chat Server Have conversations with AI mentors. Requires specifying which mentor to use. ### Description (for MCP Setup) > The iblai-agent-chat MCP server enables AI agents to have conversations with configured AI mentors. This server acts as a bridge to interact with specialized mentors that have been trained on specific topics or documents. The mentor is specified via the X-Mentor-Unique-Id header, and responses are based on the mentor's system prompt, LLM configuration, and trained knowledge base. ### Setup ```json { "mcpServers": { "iblai-agent": { "url": "https://your-instance.com/mcp/agent-chat/", "transport": "streamable-http", "headers": { "Authorization": "Api-Token YOUR_API_KEY", "X-Mentor-Unique-Id": "YOUR_MENTOR_ID" } } } } ``` ### Example Queries - "Explain how photosynthesis works" - "Help me understand quadratic equations" - "What are the key events of World War II?" - "Can you review this code and suggest improvements?" ### Working Examples #### Chat with Mentor ```python # Get a response from a mentor get_mentor_response( prompt="hello" ) # Returns: "Hello! How can I assist you today?" # More complex query get_mentor_response( prompt="Can you explain machine learning in simple terms?" ) # Returns: Detailed explanation from the mentor based on their configuration ``` **Important Notes:** - The mentor is specified via the `X-Mentor-Unique-Id` header in the MCP server configuration - The mentor's response depends on their system prompt, LLM configuration, and any trained documents - Simple prompts like "hello" work well for testing connectivity --- ## Full Configuration To use all available servers together: ```json { "mcpServers": { "iblai-analytics": { "url": "https://your-instance.com/mcp/analytics/", "transport": "streamable-http", "headers": { "Authorization": "Api-Token YOUR_API_KEY" } }, "iblai-agent": { "url": "https://your-instance.com/mcp/agent-chat/", "transport": "streamable-http", "headers": { "Authorization": "Api-Token YOUR_API_KEY", "X-Mentor-Unique-Id": "YOUR_MENTOR_ID" } }, "iblai-agent-create": { "url": "https://your-instance.com/mcp/agent-create/", "transport": "streamable-http", "headers": { "Authorization": "Api-Token YOUR_API_KEY" } }, "iblai-search": { "url": "https://your-instance.com/mcp/search/", "transport": "streamable-http", "headers": { "Authorization": "Api-Token YOUR_API_KEY" } } } } ``` > **Note**: The `iblai-search` server excludes personalized search tools (personalized mentor search and personalized catalog search). Save this to: - **Claude Code**: `~/.claude/claude_desktop_config.json` or `.mcp.json` in your project - **Cursor**: Add to Settings > Features > MCP Servers --- ## Tool Generation and Endpoints This section documents the endpoints used by each MCP server and how tool names are generated. ### Tool Name Generation Format Tool names follow the pattern: **`{HTTP_METHOD}_{usable_name}`** Where: - **`{HTTP_METHOD}`**: The HTTP method in lowercase (e.g., `get`, `post`, `put`, `patch`, `delete`) - **`{usable_name}`**: Derived from the URL pattern name with transformations: - Hyphens converted to underscores - Redundant prefixes removed (`apis_`, `ai_analytics_`, `ai_mentor_`, `v2_`, `trigram_`) - Section prefixes removed (for analytics: `overview_`, `audience_`, etc.) - Common words abbreviated (e.g., `completion` → `compl`, `enrollment` → `enroll`) **Example**: URL pattern name `ai_mentor_orgs_users_mentor_with_settings` → Tool name `post_mentor_with_settings` --- ### iblai-agent (Mentor Chat Server) **Custom Tool** (not auto-generated from endpoints): | Tool Name | Endpoint | Method | Description | |-----------|----------|--------|-------------| | `get_mentor_response` | N/A (custom implementation) | N/A | Get a response from a mentor. Requires `X-Mentor-Unique-Id` header. | **Note**: This is a custom tool that directly calls the LLM service, not generated from an API endpoint. --- ### iblai-agent-create (Agent Create Server) **Endpoints** (auto-generated from API patterns): | Tool Name | Endpoint | Method | URL Pattern Name | |-----------|----------|--------|------------------| | `post_mentor_with_settings` | `/api/ai-mentor/orgs/{org}/users/{user_id}/mentor-with-settings/` | POST | `mentor-with-settings` | | `get_mentor_settings` | `/api/ai-mentor/orgs/{org}/users/{user_id}/mentors/{mentor}/settings/` | GET | `mentor-settings` | | `put_mentor_settings` | `/api/ai-mentor/orgs/{org}/users/{user_id}/mentors/{mentor}/settings/` | PUT | `mentor-settings` | | `post_train_document` | `/api/ai-index/orgs/{org}/users/{user_id}/documents/train/` | POST | `ai_index_orgs_users_documents_train` | | `post_retriever_train` | `/api/ai-index/orgs/{org}/users/{user_id}/documents/train/retriever/` | POST | `ai_index_orgs_users_documents_train_retriever` | | `ping` | N/A (custom tool) | N/A | Health check tool | **Tool Name Generation Examples**: - `mentor-with-settings` → `post_mentor_with_settings` (hyphen → underscore, method prefix added) - `mentor-settings` → `get_mentor_settings` / `put_mentor_settings` (different methods create different tools) - `ai_index_orgs_users_documents_train` → `post_train_document` (prefixes removed, simplified) --- ### iblai-search (Search Server) **Endpoints** (auto-generated from search patterns, excluding personalized): | Tool Name | Endpoint | Method | Description | |-----------|----------|--------|-------------| | `get_mentor_search` | `/api/ai-search/mentors/` | GET | Global mentor search (personalized excluded) | | `get_catalog_search` | `/api/search/catalog/` | GET | Global catalog search (personalized excluded) | | `get_recommendations` | `/api/ai-search/recommendations/` | GET | Course recommendations | | `ping` | N/A (custom tool) | N/A | Health check tool | **Excluded Tools** (personalized endpoints): - `get_personalized_mentors` - Personalized mentor recommendations - `get_personalized_catalog_search` - Personalized catalog search **Tool Name Generation**: - URL pattern names are cleaned by removing redundant prefixes - Only GET methods are included (`only_get=True` in registration) - Personalized endpoints are explicitly filtered out **Custom Tools** (not auto-generated): - `ping_search`: Health check --- ### iblai-analytics (Analytics Server) **Endpoints** (auto-generated from `/api/analytics/` namespace): All endpoints under `/api/analytics/` are automatically included. Examples: | Tool Name | Endpoint | Method | Description | |-----------|----------|--------|-------------| | `get_content_analytics` | `/api/analytics/content/` | GET | Get content analytics | | `get_content_details` | `/api/analytics/content/details/{content_id}/` | GET | Get detailed content analytics | | `get_learner_analytics` | `/api/analytics/learners/` | GET | Get learner analytics | | `get_learner_list` | `/api/analytics/learners/list/` | GET | Get list of learners | | `get_learner_details` | `/api/analytics/learner/details/` | GET | Get detailed learner analytics | | `get_web_analytics_time_spent_per_user` | `/api/analytics/time-spent/user/` | GET | Get time spent per user | | `get_financial` | `/api/analytics/financial/` | GET | Get financial analytics | | `get_financial_details` | `/api/analytics/financial/details/` | GET | Get detailed financial analytics | | `get_financial_invoice` | `/api/analytics/financial/invoice/` | GET | Get invoice data | | `get_messages` | `/api/analytics/messages/` | GET | Get message analytics | | `get_sessions` | `/api/analytics/sessions/` | GET | Get session analytics | | `get_ratings` | `/api/analytics/ratings/` | GET | Get rating analytics | | `get_topics` | `/api/analytics/topics/` | GET | Get topic analytics | | `get_time` | `/api/analytics/time/` | GET | Get time-based analytics | | `ping` | N/A (custom tool) | N/A | Health check tool | | `count_users_in_platform` | N/A (custom tool) | N/A | Count active users in platform | | `get_current_user` | N/A (custom tool) | N/A | Get current authenticated user | | `get_user_platforms` | N/A (custom tool) | N/A | Get user's platform memberships | **Tool Name Generation**: - URL pattern names are cleaned by removing redundant prefixes and applying abbreviations - Only GET methods are included (`only_get=True` in registration) - All endpoints starting with `/api/analytics/` are automatically included **Custom Tools** (not auto-generated): - `ping`: Health check - `count_users_in_platform`: Direct database query - `get_current_user`: Authentication helper - `get_user_platforms`: Authentication helper --- ## Troubleshooting ### Authentication Issues **"Unauthorized" errors**: Check your API key is correct and hasn't expired. **"Could not authenticate mentor"**: Verify the `X-Mentor-Unique-Id` header is set correctly. **Connection issues**: Ensure the instance URL is correct and accessible. ### Common Parameter Errors **"platform_key is required"** (Analytics & Search): - Most analytics endpoints require the `platform_key` parameter - Example: `get_content_analytics(metric="courses", platform_key="ibl")` - The `platform_key` should match your organization/platform identifier **"Invalid parameters"** (Search): - `get_mentor_search` requires `platform_key` for authenticated requests - Use either `platform_key` or `tenant` parameter (both serve the same purpose) **"Document pathway is not a valid mentor unique id"** (Agent Create): - When using `post_train_document` or `post_retriever_train`, the `pathway` parameter must be a valid mentor UUID - Use the mentor's unique ID (not the mentor name or slug) - Example: `pathway="ad2be335-5afa-4a9e-9298-8273b3d94e10"` **"We couldn't reach the website"** (Agent Create): - When training documents from URLs, ensure the URL is publicly accessible - The URL may be blocked, require authentication, or be offline - Verify the URL works in a browser before using it ### Permission Errors **"Requires admin user"** (Analytics): - Some endpoints like `get_web_analytics_time_spent_per_user` require admin privileges - Ensure your API key has admin permissions for the platform ### Best Practices 1. **Always include `platform_key`** for analytics and search operations 2. **Use mentor UUIDs** (not names) for pathway parameters in document training 3. **Test with simple prompts first** (e.g., "hello") before complex queries 4. **Check URL accessibility** before training documents from external sources 5. **Use health check tools** (`ping`) to verify server connectivity ### Transcript mentorAI | MCP https://www.youtube.com/watch?v=Y4rLO5y0mzE [0.7s] Okay. So this is to demo the um MCP [4.6s] connectors we have added. If we go to [6.7s] the MCP tab, we have uh featured [9.4s] connectors and and the custom connectors [11.6s] as well. Fidget connectors are just [13.6s] custom default and some of some of them [17.0s] are open source as well which we can [18.9s] simply enable and some of them are based [21.7s] out of O. But first let's just see what [26.1s] uh custom connectors are. For the custom [27.9s] connectors, we can upload the image, [30.2s] provide the name, the URL, no optional [32.6s] description, the transport type, and the [34.3s] authentication method. For this, for [36.6s] now, we have the uh no authentication [38.6s] and the API key. The API key, we can [41.5s] provide the token type for the for the [43.7s] headers like beer, basic uh for and then [47.7s] provide the value of the token. So let's [50.6s] first try with the [53.8s] uh with the NA10 [56.3s] MCP connector. So if I go to the [58.2s] settings and click MCP access and go to [61.2s] the access token based uh we have this [63.8s] URL. Let's just copy it and I paste it [68.2s] over here and just name it en workflow [73.0s] and then we we select the API key and [76.0s] the token as bearer. So as it is hidden [79.4s] over here, let me just copy it from the [82.6s] value I have. [85.7s] Okay. And let me just paste it here. [90.0s] Okay. So now we have the value. We have [92.1s] the URL [93.9s] and M8 [96.8s] workflow [99.3s] MCP. [103.4s] So okay and we connect and it adds in [108.5s] the connectors list as well. So this is [110.5s] the uh option where we can like enable [113.7s] and disable the uh disable the MCP [117.6s] server and uh disabling this will not [120.1s] allow the mentor to use this uh to use [123.3s] this MCP server um [127.8s] to reply with the with the MCP tools. So [131.8s] now let's uh let's use so if I like go [135.4s] to the MCP [138.2s] and let's go to the homepage. [144.5s] Okay. So [146.8s] so this is the project I have. So if I [150.5s] like copy the project ID and ask him [155.1s] list the workflows [159.0s] in the project. [168.4s] So it's it's it's listing out all of the [171.2s] projects we have. It only lists the [173.9s] active active ones. So you see it's it [176.7s] has FAQ, MCP web book, MCP test [179.8s] workflow. So here we have the active [182.2s] ones, three of them active and three of [184.2s] them uh three of them in the list. So [187.7s] let's say if we want to execute this FAQ [190.6s] MCP. So we can say let's execute [196.8s] workflow [199.2s] of and provide [201.9s] me with the FAQ [205.0s] we have. [216.7s] Okay, [220.0s] let's try with some with ID of [236.0s] okay it uh we have the query uh in the [240.9s] request body right so um it's just um [245.6s] the fa FAQ [248.7s] um [251.1s] to match the provided FAQ. So for [254.7s] example uh [258.7s] and tell me the answer [263.0s] for [270.6s] so so you see uh it uh I have asked him [273.8s] how do I reset my password. it add add [276.2s] it into the into the query and then it [278.9s] provides me the answer we have in the [281.4s] MCP server. So that's how it works. --- ## Analytics URL: https://ibl.ai/docs/configuration/mcp/analytics Video: https://www.youtube.com/watch?v=Q88dvtE3wVQ # Analytics --- ## Purpose Connect an Analytics MCP server to a mentor so it can query, analyze, and visualize platform data—including user activity, learner metrics, content usage, financial analytics, and message/session data—directly in chat. --- ## Overview An Analytics MCP–powered mentor acts as an Analytics Assistant, pulling real data from your platform via MCP. It can answer questions, generate summaries, and return visualizations (e.g., graphs) based on up-to-date analytics. --- ## Prerequisites - A mentor configured with an Analytics Assistant system prompt. - MCP Tool enabled for the mentor. - An API key for the Analytics MCP server. --- ## Setup Steps #### 1) Configure the Mentor - Open the mentor you want to use for analytics. - Set the system prompt to an Analytics Assistant that can pull: - User and platform analytics - Learner metrics - Content analytics - Financial analytics - Message and session data #### 2) Enable the MCP Tool - Go to the mentor’s **Tools** tab. - Toggle **MCP** to **On**. #### 3) Add the Analytics MCP Connector - Open the **MCP** tab. - Add (or edit) a connector with: - **Connector name** - **Server location** (Analytics MCP URL) - **Description** - Similar to a system prompt; describe what data can be pulled and how responses should be formed (e.g., user/platform analytics, learner metrics, content usage, financials). - **Transport:** Streamable HTTP - **Authentication:** API Key - **Token type:** API Key - **Token value:** paste your API key (hidden after save) - Save the connector. --- ## Using Analytics in Chat #### Example Queries **Prompt example:** > “Show me a graph of last logged-in users over the past seven days.” - The mentor retrieves data across multiple sources and returns a visualization and counts. **Prompt example:** > “What mentor has the highest usage over the past seven days?” - The mentor identifies the top-used mentor and can include associated metrics (e.g., LLM cost). **Note:** Some queries may take slightly longer due to multi-step data retrieval, but responses remain fast. --- ## Result With Analytics MCP connected, mentors can answer analytics questions, generate graphs, and summarize usage and costs—all grounded in live platform data and delivered directly in conversation. ### Transcript mentorAI | Analytics MCP https://www.youtube.com/watch?v=Q88dvtE3wVQ [1.7s] Today I'll be walking you through how to [3.2s] connect an Analytics MCP server with a [5.8s] mentor or an agent. Now the first piece [8.9s] here is making sure that the system [10.2s] prompt is set appropriately. And you can [11.7s] see here we've set this up so it's an [13.4s] analytics assistant. They can help pull [15.8s] relevant information from the platform [17.7s] directly. This will include things like [20.2s] user and platform analytics, financial [22.4s] analytics, um learner metrics, so on and [25.9s] so forth. Once the system prompt is all [28.8s] set up, you'll go to tools and enable [30.6s] the MCP tool. [33.0s] After that has been enabled, you can go [34.7s] to this MCP tab. And if you're adding a [37.2s] new connector, you would just simply hit [39.0s] add connector. In this case, we already [41.1s] have one set up. You'll put in a [42.6s] connector name, the location of the [44.2s] connector server, and within the [45.9s] description itself. It's very similar to [47.5s] a system prompt in terms of what you put [50.3s] in this description is going to inform [51.9s] how this connection works. You can see [53.8s] here that we're saying this will pull [55.0s] user and platform analytics, um, [57.0s] specific learner metrics, content [59.0s] analytics, financial, message and [61.4s] session, so on and so forth. Basically, [63.9s] anything that's happening within the [65.2s] platform itself. We even provide a [67.2s] couple of answers directly here as a [69.8s] couple of examples that the mentor can [72.0s] understand and pull from when it's [73.5s] pulling data from this MCP connection. [76.2s] Now, with the transport, you'll just [77.5s] leave it at streamable HTTP and in this [79.6s] case, the authentication method is going [80.9s] to be API key. The token type you'll [83.5s] select is API key. And then in this [86.0s] token field is where you'll enter your [88.2s] API key. Now this is already in place. [91.0s] The reason this is blank is because we [92.4s] hide the key. Um but if you enter anyone [94.6s] it'll replace. Now if we want to chat [97.0s] with the mentor we can say you know show [98.1s] me a graph of last logged in users over [100.8s] the past seven days. [104.0s] Now, there are a couple steps that [105.3s] happen here um that make this process a [108.8s] little bit slower um because it's [110.6s] pulling information from a couple of [111.8s] different areas and multiple steps need [113.3s] to happen. It is relatively fast um [117.1s] however [119.4s] and you can see that we already got an [121.0s] answer. So, we can see last login [123.8s] activity or users who have logged in the [125.4s] past seven days, how many that has been. [127.8s] And then I'll ask [130.2s] what mentor has the highest usage over [134.8s] the past [138.4s] seven days as well. [144.2s] All right. So, the mentor with the [145.4s] highest usage is our analytics mentor, [147.1s] which is what we're actually chatting [148.2s] with. And it even gives us that LLM cost [151.0s] um associated. [152.8s] And there you have it. --- ## MCP Connector Authentication URL: https://ibl.ai/docs/configuration/mcp/connectorauthentication Video: https://www.youtube.com/watch?v=fcFsqz7PndI # MCP Connector Authentication ## Overview Configure MCP connectors with different authentication methods and scopes to control how users authenticate with external services like Notion, Google, and other OAuth-based providers. --- ## Adding a Custom MCP Connector 1. Go to the **MCP** tab on a mentor. 2. Click **Add Custom MCP Connector**. 3. Provide the **connector URL**. 4. Select the **connector scope**: - **This mentor only**: the connector is available only for this mentor - **All mentors**: the connector is available for every mentor in the tenant --- ## Authentication Methods ### API Key - Select **API Key** as the authentication method. - Choose the **token type** for the header (e.g., Bearer, Basic, Token). - Select **Other** to provide a custom token type. - Enter the **token value**. ### OAuth For OAuth-based connectors (e.g., Notion), choose the **OAuth** authentication method and configure the **authentication scope**. --- ## OAuth Authentication Scopes | Scope | Behavior | |-------|----------| | **Tenant** | Authenticate once — the authentication is shared across all tenant members and all mentors | | **Mentor** | Authenticate once — the authentication applies to this mentor only but is shared across all users | | **User** | Each user must authenticate individually when they first interact with the mentor | --- ## User-Based Authentication Flow 1. Admin adds the OAuth MCP connector with **User** scope. 2. Admin authenticates first (completes the OAuth flow). 3. The connector is enabled on the mentor. 4. When a different user (e.g., a student) chats with the mentor: - An authentication prompt appears: "Authentication required for [service]. Please complete the login in the open window." - The user completes the OAuth flow in a popup window. - Upon success, the chat continues with the connector active. --- ## Enable/Disable Connectors - After adding a connector, use the **toggle** to enable or disable it. - **Enabled**: the mentor can use the MCP connector in responses. - **Disabled**: the mentor cannot access the connector. --- ## Key Takeaways - **Tenant** scope: authenticate once, everyone benefits - **Mentor** scope: authenticate once per mentor, shared across users - **User** scope: every user authenticates individually (most secure) - Connectors can be scoped to a **single mentor** or **all mentors** in the tenant - Use the toggle to quickly enable/disable connectors without removing them --- ## Search URL: https://ibl.ai/docs/configuration/mcp/search Video: https://www.youtube.com/watch?v=HbKNTemQeLU # Search --- ## Purpose Connect a Search MCP to a mentor so it can search your platform for courses, programs, and mentors—and return grounded, recommendation-ready results without hallucinations. --- ## Overview The Search MCP uses an MCP server (in this demo, ibl.ai’s own search MCP) to power a mentor that acts as a search assistant. It can: - Search the course catalog (courses/programs). - Find and recommend mentors. - Ask follow-up filters (subject, level, format, language). - Enforce guardrails via the system prompt. --- ## Prerequisites - A mentor with a Search Assistant system prompt. - MCP Tool enabled for the mentor. - An API key for the Search MCP server. --- ## Setup Steps #### 1) Prepare the Mentor - Open the mentor you want to use for search. - Set the system prompt to a search-focused assistant (catalog + mentor search, recommendations, guardrails). #### 2) Enable the MCP Tool - Go to the mentor’s **Tools** tab. - Ensure **MCP** is enabled. #### 3) Add the Search MCP Connector - Open the **MCP** tab. - Add (or edit) a connector with: - **Connector name** - **Server location** (Search MCP server URL) - **Description** - Acts like instructions for the connection (what the mentor can pull and how to respond). - **Transport:** Streamable HTTP - **Authentication:** API Key - **Token type:** API Key - **Token value:** paste your token - Save the connector. --- ## Using Search in Chat #### Search Courses **Prompt example:** > “What courses are available on the platform?” **Result:** - Returns a subset of courses from the tenant (even if the catalog is large). - Asks if you want to filter by subject, level, format, or language. #### Search Mentors **Prompt example:** > “What mentors can help me become a better student?” **Result:** - Lists relevant mentors (e.g., study tips, quizzes, Socratic support). - Offers to filter further or show more results (based on system-prompt limits). --- ## Result With Search MCP connected, mentors can reliably discover courses and mentors across your platform, guide users with filters and recommendations, and return accurate, grounded results from your own data. ### Transcript mentorAI | Search MCP https://www.youtube.com/watch?v=HbKNTemQeLU [1.8s] I'm going to be showing you how to [3.3s] connect a search MCP to an agent on [6.5s] Mentor AI. Now, in this case, it's going [9.4s] to be our own search MCP server, meaning [12.0s] we're pulling directly from our own [13.4s] platform. And this is essentially [15.9s] helping people search for mentors or [18.0s] content that would help them in some [19.8s] way, shape, or form. You can see that [21.5s] the system prompt here is set to be a [25.0s] search assistant. You know, we can pull [26.8s] from catalog of courses, programs, so on [30.2s] and so forth, as well as search for [31.5s] mentors and even get recommendations. We [34.1s] also have a couple of guardrails in here [35.4s] to prevent any hallucinations [38.4s] and so on and so forth. Basically, this [41.1s] is just informing the mentor how it's [42.5s] going to work with this MCP connection. [45.0s] Now, the other thing here is under the [46.6s] tools tab, you'll need to make sure that [48.1s] this MCP tool is enabled. And once [50.8s] that's done, you can go to the MCP tab [52.6s] itself and add a connector. [55.0s] We already have one in place for search [56.6s] which you can see here. You'll add in [58.6s] the connector name, the server location [61.0s] and the description which the [64.2s] description here is similar to a system [65.5s] prompt in the sense that it is serving [68.2s] as instructions on how this connection [70.5s] is going to work. So whatever you put in [72.0s] here is going to inform what what the [74.9s] mentor can pull from this you know [77.0s] server connection and respond to. [79.8s] Additionally, the transport will remain [81.4s] streamable HTTP and the authentication [83.9s] method here will be API key. The token [86.8s] type you'll select from the dropown is [88.2s] this API key and then you'll enter your [90.2s] token. [92.6s] So we'll ask the mentor [95.9s] what courses are available [99.9s] on the platform [104.5s] and this will pull up a list of all the [106.1s] courses associated with this IBLAI [108.1s] tenant. [115.4s] All righty. [122.1s] So, we can see a list of courses that [124.3s] were pulled. Um, there's over about [126.9s] 1,400 courses, but we're pulling just a [129.5s] subset of those. And we're even being [131.4s] asked uh if we want to filter by a [133.6s] specific subject, level, format, or [135.4s] language. And just to kind of see what [138.8s] it would look like to search for a [139.9s] mentor, I'm going to ask what mentors [142.9s] can help [145.4s] me become a better student. [152.3s] Essentially, I'm just looking for [153.4s] mentors that would help me with study [154.8s] tips, um, you know, quiz testing, so on [158.5s] and so forth. [162.3s] All righty. [166.6s] And now these are a list of mentors I [168.7s] can select from that are focused [170.2s] specifically on student needs. Um that [173.0s] for you know studying whatever it may be [175.8s] uh socratic gamified quiz creator and [179.0s] even it's even asking if we want to even [180.6s] filter further to find you know specific [183.0s] goal um or even display more mentors [186.3s] that weren't shown here because we are [187.9s] limiting how many are are shown via the [189.8s] system prompt. All right and there you [192.2s] have it. That search --- ## Tenant Creation URL: https://ibl.ai/docs/configuration/multitenancy/tenantcreation Video: https://www.youtube.com/watch?v=fk0Zm2NnhFU # Tenant Creation ## Goal Create and launch a brand-new tenant (and its first user). --- ## Steps ### Open the Launch Form - In **Data Manager**, expand **IBL AI Service** app. - Click **Launch Tenants**. --- ### Fill Out User Information - **Username** – choose one that does not already exist. - **Email** – enter the user’s email address. - **First Name** and **Last Name** – type both. - Default roles are set to **Admin** for both **edX role** and **Web role**. - Change either role to **None** or **Subscriber** if you need a non-admin user. --- ### Define the Platform - **Platform Key** – the unique identifier for this tenant. - **Platform Name** – what learners will see in their browser tab; spaces are allowed. - **Password** – set the initial password for the new user. --- ### Launch the Tenant - Click **Launch Tenant**. - A **success message** confirms the tenant and user were created with the selected roles. --- ### Verify Creation - Still in **Data Manager**, go to **Core → Platforms**. - The newly created tenant should now appear in the list. --- ## Result A new **tenant** exists, along with its default **admin user**, ready for immediate use. ### Transcript skillsAI | Super Admin | Create and Validate Tenant Launch https://www.youtube.com/watch?v=fk0Zm2NnhFU [0.5s] all right I'm going to show you guys how [2.0s] to create um and launch a new tenant um [6.2s] so right now we're in the data manager [8.5s] side of things and if we scroll [11.7s] down to IBL aai Service app you'll see [15.3s] launch [16.5s] tenants and so what you'll do now is [19.3s] fill in this information you'll see a [21.1s] couple Fields here now any user [24.1s] information what this is going to do is [26.0s] create a a user underneath that tenant [29.6s] by default it's going to have the edex [32.6s] role and the web role of admin we can [37.2s] change these to nun if needed um or a [40.4s] subscriber which means they're just a [41.8s] normal user now for this we're going to [44.6s] create a new username for somebody who [47.8s] doesn't already exist on the platform so [49.6s] we'll [50.5s] do um let's see [54.7s] Ashlin Brown [58.1s] tenant um and then we'll do the same for [60.7s] the email but we'll do [63.2s] i. a as the email address here and then [66.7s] we'll put in the user's [69.3s] name first and last name and now this [73.0s] platform key is going to be the key for [75.8s] the platform so we'll [78.3s] do uh let's see here [81.5s] [Music] [82.9s] Ashlin [84.1s] [Music] [86.0s] tenant ACI and now for the platform name [91.0s] this is what's going to show up in the [92.4s] tab bar that a learner sees so we can [95.2s] add spaces like [98.2s] this and then we'll create a [101.3s] password and so we'll hit launch [106.1s] tenant and you'll see that we have [109.4s] successfully created a tenant and a [110.8s] tenant user with the correct roles now [113.0s] if we wanted to validate how or that [115.7s] tenant has been actually created then [117.9s] what we can do is come down here and [122.0s] under core you'll go to [125.6s] platforms and then you can see that new [127.5s] tenant has just been [131.9s] created --- ## Tenant-User Association URL: https://ibl.ai/docs/configuration/multitenancy/tenantuserassociation Video: https://www.youtube.com/watch?v=fdWCEHqy5-w # Tenant-User Association ## Purpose Create a tenant-admin user who can manage more than one tenant. --- ## Create the Admin User - In **Data Manager**, expand **IBL AI Services** and click **New Tenant Users**. - Enter the new person’s **Name**, **Email/Username**, and **Password** (e.g., “Ashlin B12”). - Ensure this user is an **admin**: - **edX RO** stays **Instructor** (default) - **Web RO** stays **IBL Admin** (default) - Choose the appropriate **Platform**, select an existing **tenant**, type the password again, and click **Save**. - Look for the **green confirmation message** showing the user was created. --- ## Link the User to a Second Tenant - Return to **Home → User Platform Links**. - Note the newly created user’s **ID** (example: 129). - Click **Add User Platform Link**. - Search for and select that user (recent entries appear first). - Pick a different **Platform** (example: editech). - Check **Is Admin** and **Is Staff**. - Marking **Is Staff** also grants **ATX access** for course-authoring tasks if desired. - Click **Save**. --- ## Result The user now appears as a **Tenant Admin** in each chosen tenant and, where selected, as **staff** with course-authoring privileges. ### Transcript ibl.ai | Set Up Tenant Admin to Manage Multiple Tenants https://www.youtube.com/watch?v=fdWCEHqy5-w [0.6s] all right I'm going to show you how to [1.8s] set up a tenant admin to manage multiple [4.3s] tenants um as a super admin user so [8.3s] we're going to start from the data [9.6s] manager and now let's just say that the [11.9s] user doesn't already exist we want to [14.4s] create a new user who's going to manage [16.1s] multiple tenants the first thing we're [18.3s] going to do is come down to where it [20.4s] says IBL AI [21.9s] Services clect uh click on new tenant [25.0s] users and we're just going to create [28.6s] another user so we're going to put [31.9s] in the user's name that we're [35.3s] entering we're going to also create [37.8s] their username email or username and [40.9s] password at the same time here um so [43.8s] let's just do Ashlin [47.5s] B12 um and then we do need to have the [50.1s] email as [55.4s] well so we have that entered and we want [59.5s] them to be an admin because this person [61.6s] is going to manage multiple tenants so [63.1s] if the edex RO needs to be or instructor [65.4s] which it is by default and web Ro needs [67.6s] to be IBL admin which it is by default [69.5s] as well now what we're going to do is [71.6s] select the [72.7s] platform um and we already have a [74.7s] handful of tenants created so we're [76.2s] going to go ahead and select a tenant [78.0s] and then enter a password for [86.0s] them so now that we have the new admin [89.4s] user created [90.5s] we'll see some green messages saying [92.7s] everything looks good we'll come back to [94.8s] the home over [96.3s] here and then what we're going to do now [99.6s] is go to user platform [101.5s] links you'll see that the user id/ user [104.5s] object is 129 so when we add a new user [108.5s] platform link we'll search for that um [111.0s] and that one's since it's just recently [112.5s] created going to be right on the top [114.0s] there and then we're going to select a [116.0s] different [117.0s] platform so in this case we'll do [119.0s] editech [121.0s] um so we've got that platform selected [123.1s] and then we would just need to make sure [124.4s] that is admin is marked and is staff is [127.4s] marked if we want them to have ATX [131.4s] access for course authoring purposes and [134.7s] now we'll just hit save now this user is [137.8s] a part of two separate tenants so we [141.4s] have ashl this tenant right here where [144.5s] this user [145.9s] 129 is a tenant admin and then on under [150.0s] editech same thing they're going to be a [151.8s] ten an admin here as [154.9s] well --- ## User Migration URL: https://ibl.ai/docs/configuration/multitenancy/usermigration Video: https://www.youtube.com/watch?v=vl2793vT9nE # User Migration ## Purpose Move an existing user from one tenant to another while controlling their permissions. --- ## Remove the User’s Access to the Current Tenant - In **Data Manager**, expand **Core** and open **User Platform Links**. - Locate the target user (example: AshenBrown15, ID 126). - Edit that user’s **platform-link entry**. - Clear both **Admin** and **Active** check-boxes. - This prevents the user from using admin tools and stops access to the tenant. - Click **Save**. --- ## Add the User to a New Tenant - Still in **User Platform Links**, choose **Add User Platform Link**. - Search for and select the **same user** (ID 126). - Pick the **destination platform** (example: editech). - Set the desired permissions: - **Is Admin** – lets the user reach admin panels - **Is Staff** – grants access to edX courses - **Active** – allows log-in to the platform - Click **Save**. --- ## Result The user is now **removed from the original tenant** and **added to the new one**, with permissions exactly as configured. ### Transcript ibl.ai | skillsAI | Migrate Tenant User Between Tenants https://www.youtube.com/watch?v=vl2793vT9nE [0.2s] right I'm going to show you how to [2.9s] migrate users between tenants as a super [5.8s] admin so you'll see that we're starting [7.9s] from the data [9.3s] manager and the first thing we're going [11.0s] to do is find the platform links so this [15.5s] is going to be under the core right [20.2s] here and you'll see a list of user [23.1s] platform links so let's just go ahead [25.5s] and take a look at which user this is [28.1s] Ashen Brown 15 with the IDE of [30.9s] 126 um and we let's just say this is the [33.6s] user that we want to migrate to a [35.6s] different um tenant um so what we're [39.8s] going to do here we're going to go back [41.4s] to that user platform link for this user [43.8s] and what we're going to do is unselect [46.0s] admin and unselect active what this is [48.2s] going to do is make sure that they can't [49.4s] access anything that's admin related but [51.8s] also access this specific tenant um that [54.6s] active Mark right here uh for a normal [57.5s] user like a learner uh with normal be [60.4s] active if they can access the platform [62.4s] if they can't or they shouldn't be able [63.9s] to access it we would unmark this active [66.2s] right [66.9s] here and then from there we'll hit [70.0s] save and now if we want to add them to a [73.0s] different tenant um we can go to add [75.8s] user platform [77.1s] links um find the user uh so we have [81.5s] asham brown 15 with that user ID of 126 [85.3s] and then we'll look for the platform [86.8s] that we want um them to be a part of [91.4s] we'll just go ahead and select aditech [93.8s] right [94.5s] here um and we don't really need to [97.4s] worry about added on or expired on dates [99.8s] but what we do need to focus on is these [101.9s] little three right here is admins going [104.0s] to give them access to admin panels is [106.0s] staff is going to give them access to [107.3s] edex courses um and then active again [111.9s] allows them to have access to the [113.4s] platform and then we just hit save so [116.8s] now that this user went from the ashle [119.5s] inant right [122.0s] here to this main edch tenant and if we [126.0s] want to go and mark them as admin since [127.5s] they were in the other environment or [129.0s] the other tenant we [134.2s] can --- ## Administration URL: https://ibl.ai/docs/configuration/rbac/administration Video: https://www.youtube.com/watch?v=p0uIV35F7PI # Administration ## Core Concepts ![](/images/rbac.png) - **Groups** – collections of users (e.g., Tenant Admin group) - **Roles** – bundles of permissions - **Actions** = object-level rights (view, edit, delete) - **Data actions** = field-level rights (read, write) - **Policies** – bind a role to specific resources within a platform and link users or groups to that bundle - **Hierarchy** – a policy applied to a platform cascades to every resource beneath it --- ## Granting Full Tenant-Admin Access - Locate the **Tenant Admin group** - Add yourself (or another user) to that group and **save** - **Refresh the app**: every tab (Tools, Prompts, Safety, Data Sets, History, API, etc.) is now visible - To revoke full access, **remove the user** from the group and **save** --- ## Creating a Limited-Access Role - Create a **new role** (start broad, then remove what you don’t need) - Limit it to the required model—for example, only **Settings** - Make a **policy** that applies this role to the desired resources - Add yourself to the policy’s user list and **refresh** ### Result: - **Settings** stays accessible - Tabs like **LLMs, History, Data Sets, and API** disappear - Only items classified under **Settings** (certain prompts, Safety) remain --- ## Field-Level Control (Data Actions) - In the role, add a data-action rule such as: - `description : read` - **Refresh**: you can view but not edit that single field - Change the rule to: - `description : read,write` *(or use `*`)* to allow editing --- ## Adding Access to Specific Resources Follow the same pattern by updating the role: - **Tools** – add `tools` with the actions you need; the **Tools** tab appears - **API Tokens** – add `api_tokens` with `read` and `list` (then `delete` if required); tokens become viewable and, with delete, removable - **Documents/Data Sets** – add `documents` with `read`; add `write` to untrain; add `delete` to remove - For each added permission, **refresh the browser** to see the change --- ## Key Takeaways - **Groups, Roles, and Policies** combine for precise, layered control - Adding a user to a group instantly grants that group’s policies - Roles can be fine-tuned from **section-wide access** down to **individual fields** - **Permission changes appear immediately after refresh**, letting you verify results in real time ### Transcript mentorAI | RBAC https://www.youtube.com/watch?v=p0uIV35F7PI [0.6s] Hey, this is Jeff Fonman of IBL. Um, [3.0s] just going to give an overview of the uh [5.1s] ARB implementation. Uh, so at a high [8.3s] level, we generally have groups, [10.6s] policies, and roles. Uh, the groups are [13.7s] a way to associate um different users [17.0s] within a group. So for instance, we have [18.8s] a tenant admin group here. Um, and then [22.6s] we have uh roles which allow us to [25.6s] associate actions and data actions. data [28.6s] actions being the field level [29.8s] permissions and object permissions are [32.5s] the or actions are the object level [34.3s] permissions. Um so for our tenant admin [36.6s] role we've generally just given them [38.2s] everything um at this point. Then we [41.4s] have a policy which associates a [43.8s] particular role in a platform to a [46.8s] resource and then or resources rather [49.9s] and then allows you to associate users [52.6s] to that or groups to it. So for [54.6s] instance, our tenant admin policy has [57.4s] the tenant admin group. So anyone in [59.2s] that group gets this uh this policy on [61.4s] these resources and since this is [63.8s] hierarchical effectively this means that [66.2s] um this role applies to everything at or [69.8s] underneath this particular um platform. [73.7s] Uh I currently my user has no privileges [76.8s] right now. Um, so if we come over, I'm [79.3s] still technically a platform admin. Um, [83.5s] from a platform our perspective, not an [85.0s] arbback related one. So you can see we [86.8s] don't really have much for options here. [89.2s] Um, and even if we are to come in here, [91.6s] we don't have uh any information shown. [96.3s] Uh, but we can go back to that tenant [97.8s] admin group and can add myself in. Save [102.0s] that. [103.6s] And when we come back over here now, we [106.3s] have access to everything. [110.2s] All the different pieces, tools, [112.2s] prompts, safety stuff, data sets, [114.8s] history, [116.5s] API, all the uh all the good bits. Um [120.4s] but so we can uh remove myself back out [123.0s] of that group for the moment and then [125.0s] look at some more specific stuff. We [127.7s] have another policy here um that applies [130.7s] to everything just for simplicity as [133.2s] well, but I'll put just myself in in [135.6s] there. [137.3s] And that particular role is this other [140.0s] role that I've created here. And it [142.5s] currently has also basically access to [146.5s] everything. Um but we can change that to [151.1s] just have access to the settings or [153.5s] whatever is defined rather on the [155.0s] settings uh the settings model in this [157.3s] case for us. And then when we come back [160.0s] and refresh and look at our [163.8s] uh the things that we have access to we [165.8s] still have access to settings but our [168.2s] LLMs are gone. Um these particular [171.0s] prompts are part of settings whereas all [172.8s] of our suggested prompts are not. That's [174.4s] in a different area. tools. Uh safety is [178.2s] part of um settings. We don't have [181.0s] access to history or data sets or API [183.4s] still. Um and then if we wanted, we [187.0s] could if we just wanted to do like for [189.3s] instance uh the description field, um we [192.3s] could change this to the word [195.3s] description [197.3s] and then just give that a read [199.0s] privilege. [202.1s] That's the wrong place. Don't do it [203.7s] there because it won't do anything. to [205.6s] do in the data actions [208.3s] description [211.0s] read [213.4s] and then when we refresh this now we [216.5s] just have read access to only this [218.7s] particular field. Um and then we could [222.6s] change that to a star to give readr [225.0s] access here [230.6s] and then the user has edit privileges [232.6s] there as well. Um [236.0s] if we uh want to then like let's look at [239.7s] like for instance the tools we can do [241.8s] the same same idea with the tools. So we [245.8s] can add these in [248.0s] and we also need to put this back. We [252.3s] can leave that where it is [255.8s] and throw in [260.5s] tools [272.6s] and then the tools are then available as [275.7s] well. [277.1s] Um, another one we could look at might [279.8s] be like the uh API tokens. So, those we [285.0s] can add up in here as well. And that's [288.6s] under a different space. We're going to [291.1s] just give those uh read and list [293.1s] privileges for the moment. [302.2s] And we should be able to save that. [305.8s] And when we come back and list our API [309.0s] tokens [311.4s] and we have those available, but they're [313.1s] not currently deletable since we don't [315.0s] have um our delete privilege, but we can [318.6s] add that one in as well. [328.4s] And then we come back here. I got to [331.8s] refresh it. [333.8s] then we get the ability to delete the [335.9s] tokens. [340.2s] So the same similar concepts work for um [343.7s] most rest of the tabs. Um we could also [347.7s] look at the data sets which um [351.1s] internally in this section is called the [353.3s] uh documents. [357.3s] We can add those in there. [375.3s] and then we get access to all that [376.9s] stuff. And um similarly see we don't [380.1s] have um [382.4s] uh a right privilege for the documents. [385.4s] So we can't untrain a document at this [388.6s] point. Oops. But that would be called, [395.2s] right? [399.0s] And then we should be able to untrain [401.6s] it. But we still can't delete it since [404.2s] we don't have um the delete privilege [407.4s] there. So we could add that as well. [415.3s] and we trust [418.1s] and we can delete that document. So, [423.3s] um [424.9s] I think that's probably a pretty good [426.7s] overview of the main pieces. [431.4s] Uh you guys have any questions, [433.2s] certainly let us know. Next one. --- ## Bulk Team Management URL: https://ibl.ai/docs/configuration/rbac/bulkteammanagement Video: https://www.youtube.com/watch?v=1tGZBe1kxsw # Bulk Team Management ## Purpose Create teams and invite users at the same time using a platform invitation CSV. This lets tenant admins and enrollment managers bulk-enroll users while automatically assigning them to teams with a designated manager. --- ## Who Can Use This - Tenant Admins - Enrollment Managers (users who can invite people to the platform/content) --- ## How It Works (High Level) - You upload a platform invitation CSV. - The CSV specifies: - Which team (user group) the user belongs to. - Who the team owner/manager is. - Invitations are sent. - Users appear as team members after they register. --- ## Step-by-Step ### 1) Prepare the CSV - Download or use your existing platform invitation CSV. - Ensure the following fields are set correctly: - **User group:** the team the user should belong to (e.g., Team 2). - **User group owner email:** the email of the person who will manage that team. - The user group owner email links the team to its manager. ### 2) Upload the CSV - Go to **Management → Invite**. - Select **Upload CSV**. - Review the preview: - Validate user details. - Confirm the user group and group owner are correct. - Click **Save**. This triggers the platform invitations. ### 3) User Registration - Invited users must register before they appear as team members. **Before registration:** - The invite shows as pending. - The user does not appear in the team yet. **After registration:** - The invite shows as accepted. - The user automatically appears in the assigned team. --- ## What Admins and Managers See ### Tenant Admins - Can view all teams. - Can see team members once registration is complete. - Have access to analytics for these users. ### Enrollment Managers - Can create and manage teams via CSV. - If granted analytics access, can view the same analytics as tenant admins. - Do not access tenant settings or edit course content. --- ## Key Notes - Users do not see which team they belong to from their own perspective. - Teams are populated only after registration is completed. - This method scales easily for onboarding large cohorts tied to specific managers. --- ## Result With Bulk Team Management, you can onboard users, assign them to teams, and establish team managers—all in one CSV upload—streamlining large-scale enrollment and oversight. ### Transcript skillsAI | Bulk Team Management https://www.youtube.com/watch?v=1tGZBe1kxsw [1.6s] I'm going to be showing you how to [3.1s] create a team at the same time that [5.6s] you're sending a platform invitation. [7.8s] Now, this is going to be available to [9.8s] tenant admins as well as anybody who has [13.1s] um the enrollment manager role, meaning [15.5s] they can invite people to content the [17.4s] platform, so on and so forth. Now, [21.1s] you're going to be using the platform [22.5s] invitation here. If you click on invite, [25.8s] I already have a CSV downloaded uh that [28.7s] I'll show you here in a second, but I [30.5s] want to go ahead and show you that under [32.1s] teams, we're going to be adding to this [33.6s] team, team 2. And you can already see [35.8s] there's one user in here. So, if we go [38.6s] back to users, we'll hit invite, upload [41.8s] CSV. And when you upload the CSV, you'll [44.2s] be able to validate the information [45.6s] that's in here. Make sure everything [47.0s] looks good. We already And you can see [49.1s] right here we have the user group. This [51.8s] is the group that that user is going to [53.4s] be part of and overseen by a particular [56.0s] manager. In this case, the user group [58.6s] owner email is creating that link to the [62.1s] owner of the group um to said group. So, [65.3s] this user right here is going to be [66.7s] managing team 2. [69.2s] Now, what we'll do is hit save. And what [72.3s] this will do is trigger a platform [73.9s] invitation. [75.9s] So, I'm going to go register as this [78.2s] user. And the user doesn't really know [80.0s] what group they're a part of. they won't [81.3s] see it. But I'm going to go register [82.9s] real quick. Pause this and then I'll [84.6s] show you what it looks like once once [86.9s] the folks who are in that invitation [89.2s] have registered because if they have not [92.2s] show, you know, registered an account on [94.6s] the platform, they won't show up as a [96.6s] team member yet. But as soon as they do, [98.6s] they will. So I'm going to again pause [100.8s] this and we'll come back and refresh um [103.3s] so you can see those changes applied [104.8s] once once I have registered this other [106.9s] user. [108.5s] All righty. So I have just created the [110.2s] account for the the team member. If we [112.5s] go under management and hit invite, you [114.6s] can even see that that invite has been [116.2s] accepted. Now when we go to the team [120.6s] as a tenant admin, we'll be able to see [122.1s] all teams created. Um similarly with um [125.1s] anybody who has a high level um you know [128.8s] enrollment manager role. um the the [131.8s] folks who have, you know, tenant level [134.4s] access, tenant admin level access will [136.2s] be able to see analytics about these [137.5s] users. And if you decide to allow um [140.3s] enrollment managers to see analytics, [143.4s] they'll also be able to see the same [144.9s] thing as tenant admins. They just want [146.4s] to be able to, you know, create or edit [148.2s] any course content um so on and so forth [151.2s] or, you know, access tenant settings. [153.9s] But now you can see that the team member [155.4s] is here. --- ## Enrollment Manager Assignment URL: https://ibl.ai/docs/configuration/rbac/enrollmentmanagerassignment Video: https://www.youtube.com/watch?v=7P49vajFL70 # Enrollment Manager Assignment ## Purpose Give a user the ability to invite and enroll people into the platform, courses, and programs—without granting full tenant admin or course-editing access. --- ## How to Assign the Enrollment Manager Role ### Open Management 1. Go to **Tenant Settings → Management**. ### Assign the Policy 2. In **User Management / Policies**, select **Enrollment Manager**. 3. Apply this policy to the user. 4. Click **Save**. Assigning Enrollment Manager globally is the quickest way to grant invitation permissions. --- ## What Enrollment Managers Can Do After assignment, the user can: - Invite users to: - The platform - Courses - Programs - Use **single-email invites** or **CSV uploads** (templates available). - View invite status: - Sent - Pending - Accepted - Manage Teams: - View teams - Create teams they own - Add users to teams they created or to existing teams --- ## What Enrollment Managers Cannot Do - Edit or modify course content - Change tenant settings - Perform broader administrative actions outside invitations and team assignment --- ## Enrollment Manager View (What It Looks Like) - Access to **Teams** tab: - See team membership - Add members to teams - **Invite** button available across: - Users - Courses - Programs - Invitation flow mirrors tenant admin UX, but is limited to enrollment tasks only --- ## Typical Workflow 1. Log in as an **Enrollment Manager**. 2. Go to **Teams → Invite**. 3. Send invitations via email or CSV. 4. Track **pending** and **accepted** invites. 5. Once users register, add them to the appropriate team (new or existing). --- ## Result Enrollment Managers can efficiently onboard learners and organize them into teams—without access to content editing or tenant-wide configuration—keeping responsibilities clean and secure. ### Transcript skillsAI | Enrollment Manager https://www.youtube.com/watch?v=7P49vajFL70 [1.6s] to give somebody the ability to enroll [4.7s] um folks into you know courses or [7.6s] programs via invites or even invite [9.4s] people to the platform. The way you [11.4s] would do this is go by going to the [13.4s] management tab and you have a couple of [17.2s] different options but the easiest way to [20.0s] globally give this invitation permission [23.1s] is by setting specific policy on this [26.1s] user management tab by clicking [27.8s] enrollment manager. that's going to give [30.2s] me the ability to enroll folks um into [34.5s] programs, courses, and even invite them [36.6s] to the platform. So, I can show you how [39.3s] this looks for this user that we [41.0s] selected here. [47.4s] All righty. [53.6s] I have logged in as the user that I have [55.8s] given that enrollment access to. Um, [58.7s] this person is also able to access [61.4s] teams, see who's in a team, um, and even [65.3s] add folks to teams that they've created [67.1s] themselves. Now, under this teams tab, [69.3s] we can hit the invite button, [71.7s] and you can see that this is very [73.5s] similar to, you know, a tenant admin [75.8s] view. Um, but the difference here is [78.1s] that they don't have access to course [79.7s] editing. Um, being able to modify really [83.4s] much other than just being able to [85.0s] invite folks. Um, so you have courses, [88.2s] programs, and users. It's a very similar [91.7s] process. You can throw an email address [93.4s] in here to hit invite. You can even use [95.4s] the CSV templates for each of these tabs [97.8s] as well. [101.5s] This will also allow you to see who who [104.0s] you've sent invites to, who's still [105.5s] pending. Um, and even once they're in [109.3s] the platform, you can create a new team [111.3s] and add them to, you know, whatever [114.1s] company or team that they belong to, or [116.4s] add them to previously created teams as [118.6s] well. --- ## Groups URL: https://ibl.ai/docs/configuration/rbac/groups Video: https://www.youtube.com/watch?v=tecYyRpQWjI # Groups ## Purpose Use Groups to assign users a consistent set of permissions (policies) across the platform, and combine them with Teams to scope what data those users can see. --- ## Groups vs. Teams (Key Concepts) ### Groups - Collections of users that are assigned policies. - Policies define what actions users can take (e.g., view analytics, enroll learners, create teams). - Best for role-based access (e.g., account executives, enrollment managers). ### Teams - Collections of learners (e.g., Company A students). - Used to scope data visibility so managers only see their own learners’ data. - Best for data segmentation (Company A vs. Company B). --- ## Creating and Managing Groups 1. Go to **Tenant Settings → Management → Groups**. 2. Create a group by providing: - Group name - Description - Group members (add or remove at any time as a tenant admin) 3. Save the group. Anyone added to the group automatically inherits the policies attached to it. --- ## Assigning Policies to Groups 1. Open **Policies** in Management. 2. Each policy can have groups assigned to it. 3. Add a group by searching for it and selecting it. --- ## Example A group like **SBA Corporate Account Executive** may be assigned policies that include: - Enrollment Manager - Analytics Viewer / Reader - List Users - List Teams - Create Teams - Access to all user reports This allows group members to: - View users and teams. - Create teams. - View and download analytics. - Enroll learners by sending invitations. --- ## Using Groups Together with Teams (Scoped Access) If you want someone to: - Have a role like **Analytics Viewer**, but - Only see data for a subset of learners (e.g., Company A only), Then you should: 1. Create a **Team**. 2. Add the learners who belong to that company or cohort. 3. Create or use a **Group** - Example: *Company A Analytics Viewer*. 4. Assign the **Team to a Policy**. The policy links: - The analytics role - The specific team 5. Add users (or a group) to that policy. --- ## Result Company A managers: - Can see only Company A’s analytics. - Cannot see Company B’s data. - Cannot invite users or manage content unless explicitly allowed. --- ## What Groups Enable Depending on assigned policies, group members can: - View lists of users and teams. - Access analytics dashboards and reports. - Create and manage teams. - Enroll users into content via invitations. What they cannot do is determined entirely by the policies attached to their group. --- ## Result Groups provide clean, role-based permission management, while Teams ensure data stays properly segmented. Together, they give you fine-grained control over who can do what and which learners’ data they can see. ### Transcript skillsAI | Groups https://www.youtube.com/watch?v=tecYyRpQWjI [1.7s] So we have the concept of two different [3.6s] types of groups. We have a group in the [7.0s] sense that these are group of folks that [8.8s] are assigned a specific policy [11.4s] and then we also have teams. Now with [14.0s] groups you can assign people to a group [17.6s] with a certain set of policies attached. [19.5s] So for instance this SBA corporate [21.3s] account executive they are able to [23.8s] access analytics and also enroll [26.1s] learners into content. [29.1s] The way we set this up is just by simply [31.5s] giving a group name, description, and [33.8s] adding group members. You can add a [36.1s] member at any time as a tenant admin. [38.6s] Um, and those people who are assigned [41.1s] here will have access to specific [43.8s] policies. Now in this case this policy [46.6s] is going to include the enrollment [48.2s] manager [50.2s] analytics viewer list users list teams [55.0s] analytics reader create teams and all [58.6s] user reports. So if we open one of these [61.0s] you can see that there are groups [62.2s] assigned. You can simply just add a [63.8s] group by searching and clicking on the [65.3s] one you want to add and they'll have all [67.0s] those same policies assigned to them. [69.3s] This is going to ensure that they can [71.0s] access not only the list of users that [73.3s] exist but the specific teams in addition [76.4s] to getting all reports [78.9s] um being able to create teams. [81.9s] You can also view analytics about your [84.5s] specific users and even enroll them into [86.5s] content by sending out invites. [89.9s] Now the way this works in terms of of [92.8s] teams, [95.0s] if you want somebody assigned to a [99.0s] particular policy, let's just say it's [100.4s] analytics viewer, but you only want them [102.1s] to see a subset of that data. So company [104.2s] A versus company B, they shouldn't have [106.2s] any data crossing over you. Company A [108.6s] should only see company A's data. The [111.0s] way that would work is you create a new [112.6s] team, put in a description, add the [115.4s] students who will be inside of this [117.4s] team, and then you can assign that team [120.6s] to one of these groups over here. And in [122.9s] this case, we have company A. And this [125.4s] is just basically a a company A group [129.4s] that can only see company A's analytics. [132.6s] Uh they cannot invite people. They [134.4s] can't, you know, add anybody courses or [137.4s] anything like that. they only have the [139.3s] ability to get analytics on their [141.2s] students. [143.0s] And you can see that with this here we [144.9s] have the company analytics viewer. This [146.5s] is the group that they're allowed to [147.8s] oversee. Um and you can add again either [151.8s] individuals or a group to a policy to [154.2s] oversee specific students. --- ## Learner Management URL: https://ibl.ai/docs/configuration/rbac/learnermanagement Video: https://www.youtube.com/watch?v=scbYDMffenI # Learner Management ## Purpose Set up learner managers (e.g., account executives or program overseers) who can manage enrollments, teams, users, and analytics across a tenant—without giving access to course content or invitation workflows. --- ## Where to Start 1. Open **Tenant Settings**. 2. Go to **Management**. This area is also used for inviting users to the platform, courses, or programs. --- ## Roles (Preconfigured) You don’t need to create these from scratch—several roles are already available, including: - Analytics Viewer / Reader - List Teams - List Users - Create Teams - Read Teams - Download Analytics Reports These roles collectively allow a user to: - See users and teams across the tenant. - Create and manage teams. - View analytics dashboards. - Download reports. --- ## Policies Policies bind roles to resources (e.g., platform data, users, teams, analytics). - Dedicated policies already exist for learner management / account executive use cases. - These policies grant: - Visibility into users and teams - Access to analytics and reports - Team creation and management --- ## Groups (Recommended Assignment Method) To simplify setup, a group is already configured and assigned to the correct learner-management policies. --- ## How to Grant Access 1. Open **Groups** in **Management**. 2. Select the **learner management / account executive** group. 3. Add users who should oversee enrollments, teams, and analytics. 4. Click **Save**. Because the group is already tied to the right policies, adding a user here is all you need to do. --- ## What Learner Managers Can Do (User View) ### Management Access Once added to the group, the user will see the **Management** tab and can: - Create teams (e.g., “Company A Team”). - Add members to teams. - Edit or delete teams they created. - View other teams across the tenant (read-only where applicable). - Understand which users belong to which teams. ### Analytics Access Learner managers can: - View analytics dashboards (when data is available). - Download reports showing: - Course enrollment - Learner progress - Participation and completion metrics These users can oversee learners and outcomes without modifying courses or sending invites. --- ## Ideal Use Case - Corporate or institutional staff who need oversight, not content control. - Account executives managing cohorts or client learners. - Program managers tracking progress and engagement. --- ## Result By assigning users to the **learner-management group**, you enable controlled, tenant-wide visibility into learners, teams, and analytics—supporting oversight without risking course or content changes. ### Transcript skillsAI | Learner Management https://www.youtube.com/watch?v=scbYDMffenI [1.7s] I will be showing you how to set up [4.3s] managers who will be overseeing [7.0s] enrollments, analytics, um, and so on [10.0s] and so forth with, um, these new rules [14.2s] and policies we have implemented. The [16.6s] first thing is you're going to open up [17.7s] your tenant settings and go to [19.2s] management. This is also where you can [21.0s] invite users um, to the platform, [24.4s] courses or programs. [27.0s] Now, the first thing here is we're going [28.4s] to go to roles. [30.6s] And we've set up a couple of different [32.3s] roles uh so you won't have to worry [34.6s] about setting these up in the future [35.9s] yourselves. Um we have the analytics [38.2s] viewer role, read teams, analytics [41.1s] reader, [42.6s] um list teams, list users, and create [45.5s] teams. Essentially, what this is going [47.3s] to do is allow you to see all the or [49.5s] assign roles to a policy that will allow [52.8s] um folks assigned to set policy to see [55.7s] teams, see users across the the tenant, [60.6s] um list them, create them themselves, [63.8s] see analytics across the board, um so [66.9s] and download reports. So if we go to [69.2s] this policies over here, [71.8s] you can see that we have a couple of [74.5s] different ones specifically meant for [76.6s] this account executive slash overseeing [80.3s] enrollments and analytics for uh users [84.2s] across the board. Um and essentially the [88.2s] way this works is you're just assigning [89.8s] the role to a policy and making sure [91.5s] that there's a resource selected. In [93.2s] this case it's uh you know the platform [95.6s] the platform data users teams so on and [98.4s] so forth. [100.2s] So what we've done uh to make this [102.6s] simple is created a group assigned to [106.2s] the relevant policies for this this [109.4s] account executive role which should be [111.6s] able to have enrollment access as well [114.2s] as [115.8s] um being able to view analytics about [119.1s] users on the platform. Um this will also [122.0s] allow them to create teams which is [123.6s] going to be a collection of users uh [125.3s] that folks can oversee. Um so this will [128.2s] be useful for you know a company who [130.1s] wants to manage their learners but [132.2s] shouldn't be touching you know course [133.7s] content you know sending invites or [135.5s] anything like that. Um but the way this [138.9s] will work the way you'll assign this to [140.6s] folks is through this group. what [144.6s] because this group is already assigned [146.1s] to all the proper policies. All you'll [148.8s] need to do is add anybody who should [150.4s] have access [152.5s] um to you know being able to enroll, [154.7s] managing users, so on and so forth um at [158.2s] a tenant level. Uh you'll just add them [161.0s] to the members here. And now I'll show [163.4s] you what that looks like from from the [165.8s] side of a a corporate account executive. [169.1s] So, I'm just going to copy this [173.0s] and we'll log in as that user. [183.8s] So, I have logged in as the user who [186.1s] I've assigned to that account executive [188.3s] role. Um, and the first thing we'll see [191.2s] here is that we'll have access to this [193.5s] management tab. This will allow us to [196.1s] create new teams. uh you know this could [199.1s] be like company a [203.4s] uh company a team and you can add [207.6s] members to said team uh across the [210.2s] board. So I'll add just a few folks here [216.4s] let's see [219.4s] so we've got you know just a few folks [221.0s] in this team and we'll hit create team. [225.9s] We can now edit these as well um and [229.8s] delete them as long as we've created [231.4s] them. Now, this user has access to teams [233.8s] that um have not been created by them, [237.4s] but they can, you know, you know, see [239.6s] who's in the team and whatnot. Um and [243.8s] you know, manage whoever is in this team [246.0s] that they've been assigned. um or you [249.0s] know see the teams across the board and [251.3s] see you know who belongs to what team [252.9s] what company has oversight over which [255.3s] teams so on and so forth. [258.7s] So the same user who is a executive um [263.7s] that can invite folks and create teams [266.6s] and assign people to said teams can also [269.7s] view analytics. Now these dashboards are [272.6s] currently based off tour so you won't [274.6s] see any analytics here but you have [277.5s] access to these data reports that you [279.3s] can download see you know who's in you [282.4s] know what which [284.7s] courses you know what their progress [286.5s] looks like so on and so forth and [289.1s] through these different reports Here. --- ## LLM Assignment URL: https://ibl.ai/docs/configuration/rbac/llmassignment Video: https://www.youtube.com/watch?v=VLKQ_tX2L9k # LLM Assignment ## Purpose Assign specific LLMs to individual users or groups so mentor editors only see the models they are authorized to use, rather than every model available to tenant admins. --- ## Prerequisites - The user must already have **mentor editor** access (visible on the mentor's **Access** tab) - A role with LLM-related actions must exist (or you will create one) --- ## Create an LLM Access Role 1. Go to **Tenant Settings → Management → Roles**. 2. Create a new role (e.g., "LLM Model Access"). 3. Assign the actions: - **Read** all LLMs - **Select** from available LLMs --- ## Create or Edit a Policy 1. Navigate to the **Policies** tab. 2. Click **New Policy** (or edit an existing one like "LLM Model Access"). 3. Configure the policy: - **Policy name**: descriptive label - **Role**: select the LLM access role created above - **Platform**: select your tenant - **LLM(s)**: choose one or more models (e.g., OpenAI GPT-4o Mini, Google, Perplexity) 4. Under **Users/Groups**, add the specific user or group. 5. Click **Save Policy**. --- ## Verify as the User 1. Log in as the assigned user. 2. Open a mentor and click the **LLMs** tab. 3. Confirm that only the assigned models appear (e.g., GPT-4o Mini, Google, Perplexity). 4. Tenant admins will still see all LLMs; restricted users see only their assigned models. --- ## Scope Options - **Platform-wide**: the user sees the same limited LLM set across all mentors - **Per-mentor**: restrict LLM access to specific mentors by adjusting the policy resource to a particular mentor rather than the whole platform --- ## Key Takeaways - **Roles** define what actions can be taken (read, select LLMs) - **Policies** bind a role to specific LLM resources and assign users or groups - LLM restrictions apply immediately upon the user's next login - Use **groups** to manage LLM access for multiple users at once --- ## Mentor Chat Access URL: https://ibl.ai/docs/configuration/rbac/mentorchataccess Video: https://www.youtube.com/watch?v=jt8fL7vROcI # Mentor Chat Access ## Purpose Restrict a user or group to chatting with a specific mentor without exposing other mentors or settings on the platform. Useful for providing targeted access to students or external users. --- ## How It Works A pre-configured **role** and **policy** limit the user to: - **Chat** with a specific mentor - **Read** the mentor name and outputs needed for chatting - **No access** to settings, other mentors, or admin features --- ## Setup 1. Go to **Tenant Settings → Management**. 2. A role has been pre-created with the following permissions: - **Actions**: chat with mentor, read mentor metadata - **Data actions**: read permissions required for chatting 3. Navigate to the **Policies** tab. 4. Click **Edit** on the associated policy. 5. Verify the policy includes: - The correct **role** - The correct **platform** (tenant) - The specific **mentor** resource 6. Under **Users**, add the user who should have chat-only access. - Alternatively, create a **group** and add the group to the policy. 7. Click **Save**. --- ## User Experience When the restricted user logs in: - The **mentors list** shows only the assigned mentor(s) - The **dropdown** shows only "New Chat" (no other options) - The **Explore** page shows no additional mentors (if the mentor is set to admin-only visibility) - The user can **chat normally** with their assigned mentor --- ## Adding New Users As new users register: 1. Go to the policy and click **Edit**. 2. Add the new user to the **Users** section. 3. Save — the user can start chatting immediately. --- ## Key Takeaways - The role and policy are **pre-configured** — you only need to add users - Users see **only** the mentor(s) assigned via their policy - Use **groups** to manage access for multiple users at once - This is for **chat-only** access — no settings, editing, or admin features are exposed --- ## Mentor Editor URL: https://ibl.ai/docs/configuration/rbac/mentoreditor Video: https://www.youtube.com/watch?v=WpJBhNB-2xs # Mentor Editor ## Purpose Grant edit access to a specific mentor without elevating a user’s tenant-wide permissions. This lets instructors or tenant admins collaborate on a mentor while keeping access tightly scoped. --- ## Overview - Users without editor rights (e.g., students) cannot edit any mentors by default. - Tenant Admins can edit any mentor and assign editor access. - Instructors can grant editor access only to mentors they own. - Editor access is mentor-specific—not tenant-wide. --- ## Granting Editor Access (Owner/Admin) ### 1. Open the Mentor - Go to the mentor you want to share (e.g., **Socratic Mentor**). ### 2. Open Access Settings - In **Mentor Settings**, select the **Access** tab. ### 3. Assign the Editor Role - Use the existing **Editor** role (or create one if needed). - Add the user (by email/username) to the **Editor** role. - Click **Save**. This grants edit rights **only for this mentor**. --- ## What Editors Can Do Once assigned, the user can: - Edit settings - Change the LLM - Update system prompts - Add/remove data sets - Enable/disable tools They cannot: - Access tenant settings - Edit other mentors they weren’t granted access to --- ## Verifying Access (Editor’s View) - The user remains a student (or non-admin) in profile. - The shared mentor shows **Edit** options in the dropdown. - Other mentors still show **Chat only** (no edit access). --- ## Notes & Best Practices - Use editor access for collaboration without over-privileging users. - Prefer mentor-level editors over tenant roles to minimize risk. - Review editor assignments periodically from the mentor’s **Access** tab. --- ## Result You can safely collaborate by giving users precise, mentor-level edit access—no broader permissions required. ### Transcript mentorAI | Mentor Editor https://www.youtube.com/watch?v=WpJBhNB-2xs [1.4s] I'm going to be walking you through how [3.8s] you can give edit access to a given user [6.6s] within your tenant um to a mentor that [9.9s] you as an instructor have created or if [12.1s] you're a tenant admin um any mentor that [14.7s] exists on the platform. So I'm going to [16.8s] be showing you from the perspective of a [19.0s] person we want to edit a mentor before [21.3s] we apply the role. So if we see here [24.2s] under our profile this person's just [25.8s] considered a student. they have no [28.5s] ability to access um any editing [31.6s] settings um across any of the mentors [34.6s] within this given space. [37.0s] So what I'm going to do is go back to [39.4s] the Socratic mentor [44.4s] and what we'll do now is look at the [47.0s] tenant admin perspective um or the owner [50.4s] of a given mentor's perspective to show [52.6s] you how you can give edit access to [54.4s] other people. [56.8s] So now we're going to take a look at the [58.1s] tenant admin perspective. Um so you can [61.0s] see here that I have you know admin [62.8s] access over my tenant. Um you can even [65.2s] see here on my profile that it says [66.6s] admin. So this person is going to have [69.3s] access to edit and create as many [71.3s] mentors as they want. Um also edit any [74.2s] mentors within the given platform as it [76.1s] stands. Now if you're an instructor and [78.2s] you want to give somebody access um to a [80.9s] mentor you've created, you would only be [83.0s] able to do that for mentors that are [84.7s] under your ownership. [87.2s] Now you can see here um under the mentor [89.9s] settings we have this access tab. You [94.1s] can create different roles and even add [97.0s] people to roles that already exist. In [98.9s] this case we already have editor and [100.9s] we'll click on actions. Now we're going [103.4s] to add the same user that we saw that [107.6s] was just a normal student and had no [109.2s] edit rights to any given mentor on the [111.3s] platform. This is going to give them [113.3s] access exclusively to this singular [115.9s] mentor within this dedicated space. So [118.8s] now we're going to exit out of this. And [121.1s] now we're going to go back to the [122.6s] perspective of this individual user. Um [125.8s] so you can see that they'll be able to [127.1s] edit this mentor but nobody else's. [132.4s] Now we are back to the individual user [134.9s] role. You can see here that this is [137.0s] still considered a student user. Um and [140.3s] we still can't access any of the tenant [142.1s] settings for example. Now if we click on [144.8s] this little drop down you can actually [146.3s] see that we are able to actually edit [148.0s] this mentor. [149.8s] Um so you know if we wanted to change [151.5s] the LLM we can always select a different [154.3s] mentor here. Um you have settings [158.5s] you can even edit the prompts and add [161.4s] data sets [163.4s] enable tools so on and so forth. [167.6s] So now if we wanted to [171.1s] look at another mentor to see if we can [173.0s] actually edit it, we shouldn't be able [174.7s] to. So if we click on this, you can see [177.8s] that we only have the new chat. We don't [180.2s] have access to edit this one because we [181.8s] were only given permissions to edit that [183.9s] singular mentor, that Socratic mentor. --- ## Team Manager Assignment URL: https://ibl.ai/docs/configuration/rbac/teammanagerassignment Video: https://www.youtube.com/watch?v=3l7oPOtcogI # Team Manager Assignment ## Purpose Restrict visibility so external partners (e.g., companies purchasing courses) can only see analytics for their own learners, not data from other organizations or teams. --- ## Overview This setup uses roles, policies, and groups to grant scoped analytics access. Team managers can view reports for the learners they oversee—nothing else. --- ## Setup Steps ### 1) Create or Use a Team-Scoped Role - Go to **Tenant Settings → Management → Roles**. - Use (or create) a role like **Company Analytics Viewer** with actions such as: - Read analytics and reports - Read specific user groups - This role should **not** include content editing or invitation permissions. ### 2) Bind the Role to a Specific Group (Policy) - Navigate to **Management → Policies**. - Create (or edit) a policy for the company/team (e.g., **Company A**). - Assign the **Company Analytics Viewer** role. - Set the **Resource** to: - The platform - A specific user group (the team this company should see) - Click **Save**. This resource binding is what limits visibility to only that group’s learners. ### 3) Assign Users to the Policy - In the same policy, add: - Individual users, or - A group that represents the company’s managers - Click **Save**. Anyone added here gains analytics access **only** for the selected group. --- ## What Team Managers See (User View) - Access to **Analytics dashboards**. - Ability to **filter reports** by the groups they oversee. - Reports and dashboards show **only their assigned learners**. - **No visibility** into other companies’ users or data. - Current dashboards focus on **mentor analytics**; course dashboards will be added later. --- ## Example Outcome A manager assigned to **Group B**: - Sees analytics filtered to **Group B only**. - Cannot view data for **Group A** or any other teams. --- ## Result By pairing a **team-scoped role** with a **group-bound policy**, you ensure external managers have the insights they need—without exposing any other learner data. ### Transcript skillsAI | Team Manager https://www.youtube.com/watch?v=3l7oPOtcogI [1.4s] Let's say you have a company who [2.9s] purchases courses from you. You don't [5.2s] want them to see other, you know, [7.1s] folks's data. You only want them to see [9.0s] the data for their subset of learners [11.3s] that they have purchased courses for. [13.8s] [snorts] The way to set this up is under [17.6s] this management tab in the tenant [19.3s] settings. Now we already have this set [21.5s] up and this is the model that you can [23.1s] follow um for subsequent uh [27.4s] mentor/manager roles that are only [29.4s] overseeing um a specific team. Um in [32.9s] this case we have the role setup for [35.4s] this which is the company analytics [38.1s] viewer and you can see in the actions [40.5s] here we have the ability to read um [43.0s] reports analytics and including uh the [46.9s] ability to read specific user groups. [49.3s] Now, where this be becomes connected to [51.9s] an actual user group and to specific [53.9s] users is under policies. You'll see [56.3s] [snorts] that we we have a couple of [57.5s] different types of policies here. In [60.1s] this case, company A. This should be say [61.9s] company B, but we'll leave it as is for [65.5s] now. Um, and you'll see here that we [67.7s] have the role assigned where we have [69.6s] those specific actions um set up for [72.9s] them to be able to view analytics. [75.7s] And [77.3s] we also have a couple of others. So [78.7s] we'll just click on company teams and [80.3s] then the resource here is what we're [81.8s] saying is a specific group. So the [84.1s] platform is you know this uh particular [86.5s] platform we're in and the user groups uh [90.0s] number here is referencing a specific [92.0s] group. Now [snorts] you can add folks [94.3s] here either by user or by a specific [96.9s] group. [98.4s] Um so we have you know company A u in [101.5s] this particular user here we'll just say [105.6s] company A again. Um, and then you can [108.1s] add as many folks to this group as you'd [109.8s] like. And then once you had hit save, [114.2s] the folks that you have in those either [115.9s] the groups or that you add individually [118.3s] will have edit act or not addit access, [120.6s] but apologies. Um, they'll have access [123.1s] to analytics for their particular users. [125.7s] Um, they won't see anybody else's and [127.4s] I'll show you what that perspective [128.4s] looks like here in a minute. [132.2s] So I'm currently logged into somebody [133.8s] who has access to group B um that we set [136.5s] up in the first half as a tenant admin [139.4s] when creating that group um to oversee [143.5s] you know a particular set of folks and [145.9s] you can see here that we have analytics [147.6s] and we can filter by the two groups we [149.2s] have access to. Um that being said, [151.4s] these are mentor related dashboards [153.4s] which will be um enhanced later with [156.3s] course dashboards. So you can visualize [158.2s] the data that you have access to in [160.5s] these reports. These reports will only [162.6s] be limited to the users in the groups/ [165.3s] teams that you oversee and you will not [167.2s] see anybody else's data. --- ## Team Sharing URL: https://ibl.ai/docs/configuration/rbac/teamsharing Video: https://www.youtube.com/watch?v=3I2Ozruusgw # Team Sharing ## Purpose Share an existing team with additional users—such as external managers or partners—so they can view learner data and analytics for that team without gaining edit, invite, or notification privileges. --- ## Overview Beyond CSV-based platform invitations, teams can now be shared directly from the Management UI. This lets you grant read-only and analytics visibility to users who did not create the team (and may currently be students). --- ## Setup Steps (UI-Based) ### Open Team Management 1. Go to **Tenant Settings → Management**. 2. Locate and open the team you want to share (e.g., **Company A**). ### Edit Team Permissions - Edit the team and add the user(s) you want to grant access to. ### Assign Permissions - **Read permission:** allows viewing the team and its members. - **View Analytics permission:** allows viewing enrollments, progress, and analytics for that team. Assign **both permissions** to enable full oversight of team data (without edit powers). ### Add Users or Groups - Add **individual users** (e.g., a company manager who currently exists as a student), or - Add a **group** to bundle multiple managers together for easier management. ### Save - Once saved, the added users can log in and view analytics **only for the assigned team**. --- ## What Shared Users Can (and Can’t) Do ### Can - View team membership. - View analytics, enrollments, and learner progress for their team. ### Cannot - Edit platform settings. - Send invitations or notifications. - Access other teams’ data. --- ## Result Team sharing lets you grant scoped visibility to the right people—directly from the UI—so they can oversee their learners’ data without exposing or altering anything else on the platform. ### Transcript skillsAI | Team Sharing https://www.youtube.com/watch?v=3I2Ozruusgw [1.4s] In addition to being able to create [3.4s] teams and team managers via the CSV [5.9s] upload for platform invitations, we've [8.5s] also added a new way under this [10.1s] management tab to share teams to people [12.7s] who have not created them. So in this [15.4s] example, we'll say that you know me as a [18.1s] an account executive [20.2s] um creates a team for company A. I want [23.8s] to give company A [26.6s] um a manager that you know exists on the [29.9s] platform but is currently just a [32.2s] student. The way we would do that is by [34.6s] editing the team itself [37.0s] and giving this person the view [39.2s] analytics permission in addition to the [42.1s] read permission. What this is going to [44.0s] do is allow them to oversee their users [46.0s] and their data without being able to [47.9s] make edits on the platform, send [49.6s] invitations, send out notifications or [52.3s] anything like that. [54.7s] So, you can do the read permission like [57.0s] I said and then we'll just add a user [59.0s] here [64.9s] and we're just going to add a random [66.4s] student of mine. Um, you can also add a [69.4s] collection of users. You can create a [71.0s] group. Um, so this will provide them [73.4s] with, you know, if you you can bundle [76.3s] people together in case you need a [78.1s] multiple managers per team, that's [80.6s] totally okay. You just add them to a [82.4s] single group to kind of simplify the [83.9s] process. Um, and what this basically [86.5s] does is say this user gets read access [91.0s] um for this specific team. [94.4s] So if we wanted to also give analytics [97.9s] access or sorry the view analytics [100.4s] access role then you would do the same [102.2s] thing you would add the same user here [104.2s] for anybody who needs to oversee a group [106.2s] of users data you know their enrollments [109.2s] progress so on and so forth you need to [111.4s] make sure that they receive both the [112.9s] view analytics and this [116.6s] read permission. Once that's set, [119.0s] they'll be able to log in and see just [121.4s] the the data for the students inside of [123.8s] their own team. They won't see anybody [125.4s] else's. [127.0s] And that's basically how you set this up [128.6s] through the UI rather than the the CSV [130.6s] file. --- ## RBAC Troubleshooting URL: https://ibl.ai/docs/configuration/rbac/troubleshooting Video: https://www.youtube.com/watch?v=BAExpMFykEw # RBAC Troubleshooting ## Overview A starting point for diagnosing issues when mentor editors cannot see certain settings or tabs, or when RBAC policies do not appear to be working as expected. --- ## Common Issue: Missing Tabs for Mentor Editors **Symptom**: A user with editor access cannot see certain tabs (e.g., Data Sets). ### Steps to Diagnose 1. Go to the mentor's **Access** tab. 2. Verify the user has the **Editor** role assigned. 3. If tabs are missing despite editor access: - Create a **test student user**. - Add the test user to the same **Access** tab with the same role. - Check which settings are visible to the test user. - Compare with what the original user sees. --- ## Verifying Policy Assignments ### Check Roles and Policies 1. Go to **Tenant Settings → Management → Roles**. - Note: mentor editor roles assigned via the **Access** tab won't appear here — they exist only on the mentor's Access tab. 2. Go to the **Policies** tab. 3. Look for the policy associated with the user. ### Verify the Resource Mapping 1. Open the browser **Network** tab (Developer Tools). 2. Refresh the page. 3. Search for the `check` endpoint. 4. The first result shows the **tenant** information. 5. The second result shows which **mentor** the policy resource number maps to. 6. Confirm the mentor number in the policy matches the correct mentor. --- ## Checking User and Group Membership 1. In the **Policies** tab, click on the relevant policy. 2. Check the **Users** section to confirm the user is listed. 3. Check the **Groups** tab to see if the user belongs to a group assigned to the policy. 4. Verify group membership by expanding the group to see all members. --- ## Verifying Platform and Mentor Resources - The **platform number** can be found by looking at default policies in the Policies tab. - The **mentor number** can be verified via the Network tab's `check` endpoint. - Ensure both the platform and mentor resource numbers in the policy match the intended targets. --- ## Key Takeaways - **Access tab roles** (editor) are separate from **Management tab roles** — check both locations - Use the **Network tab** to verify which mentor a policy resource number maps to - Always verify **user membership** in the policy's Users or Groups sections - If the setup looks correct but issues persist, escalate for deeper investigation --- ## Authentication URL: https://ibl.ai/docs/configuration/uiux/authentication Video: https://www.youtube.com/watch?v=ujpqTh1aQFQ # Authentication ## Purpose Customize what appears on your **login page** for the **Mentor**, **Analytics**, and **Skills** platforms. *(Example below uses Mentor.)* --- ## Steps #### Open Tenant Settings - Click your **profile icon** (top-right). - Select your **tenant name** from the dropdown. #### Go to Advanced - Open **Advanced → Authentication Interface**. #### Edit Login Content - **Title:** e.g., *Ashland’s mentors* - **Description:** e.g., *mentor AI for education* - **Display Logo:** upload an image file or paste an image URL - **Legal Links:** add URLs for *Privacy Policy* and *Terms of Service* **Carousel Images:** - Add one or more display images (upload or URL). - *(Optional)* Add **alt text** for accessibility (e.g., *mentor chat*). #### Save - Click **Save**. #### Preview - Log out to view your updated login page with: - Title - Description - Logo - Legal links - Carousel images --- ## Result Your customized **login page** is applied across the selected platform, reflecting the **title, description, logo, links, and images** you configured. ### Transcript ibl.ai | Login Customizations https://www.youtube.com/watch?v=ujpqTh1aQFQ [1.2s] Hello, it's Ashley here again and I'm [3.2s] going to be showing you um a new feature [5.6s] update that we have released um recently [8.2s] for customizing your login pages. Now, [11.2s] this is going to apply to our analytics, [13.6s] our mentor, and our skills platforms. [16.9s] And in this case, we're going to be [18.3s] walking through the mentor login page [20.4s] setup. So, what you'll first do is click [22.6s] on your little profile icon and click on [25.4s] your tenant name. Once you're here, what [28.0s] you'll do is go to advanced and then [30.4s] you'll see this authentication [31.7s] interface, which is this is where you'll [33.9s] be able to customize what is being [35.4s] displayed on the login page. You can put [37.9s] in a title. So, I'm going to say [39.1s] Ashland's [41.4s] mentors. [44.4s] And then you can also add in a little [45.9s] description that will display on the on [47.6s] the page as well. And I'm just going to [49.0s] say mentor AI for education. And then [55.5s] you can also add in a display logo, your [57.8s] legal links, and also update the [59.8s] carousel images. Um, when it comes to [62.1s] uploading images, you can either include [64.0s] a URL directly to the image or upload [66.3s] one yourself. And that in this case, I'm [68.6s] going to be uploading an image. [72.4s] So now we have this little logo in. And [75.0s] then for the privacy and policy or terms [77.5s] of service links, I have some ones that [79.5s] I can throw in right here. [84.6s] And then for the display images, this is [86.8s] going to be the images that show up in [88.3s] the carousel. You also have the option [90.2s] of putting in alternative text for [91.8s] accessibility. Um, just, you know, a [95.0s] basic description of what you'd like to [96.6s] see here. So, I'm going to include just [97.8s] a few images. I'm going to say mentor [101.6s] chat. [103.4s] And then we'll hit save. [107.0s] And now what we'll do is actually see [109.8s] these changes applied by logging out of [112.8s] the platform. [122.7s] And you can see all these changes have [124.5s] been applied here. And that's basically [126.7s] all you have to do to customize your [128.2s] login page. --- ## Custom CSS URL: https://ibl.ai/docs/configuration/uiux/customcss Video: https://www.youtube.com/watch?v=JJgVHqw0OHA # Custom CSS ## Purpose Apply custom CSS to brand your dedicated tenant beyond the default styling. Adjust button colors, backgrounds, toggles, and other visual elements. --- ## Access Advanced CSS Settings 1. Click your **profile icon** in the top right. 2. Click your **tenant name**. 3. In the **Organization** tab, scroll to **Advanced Tenant Settings**. 4. Locate the **Advanced CSS** field. --- ## Apply Custom CSS 1. Write or paste your CSS rules into the **Advanced CSS** field. 2. Click **Save**. 3. Exit the settings — changes apply immediately. ### Example Changes - Button colors (e.g., changing blue buttons to your brand color) - Learner/instructor toggle color - Background colors - Any element accessible via CSS selectors --- ## Other Organization Settings From the same Organization tab, you can also configure: - **Support link** - **Email address** - **Help center link** - **Logos** --- ## Key Takeaways - Custom CSS is entered in the **Advanced Tenant Settings** section - Changes take effect immediately after saving - You can override any default styling using standard CSS selectors - This applies tenant-wide to all users in your platform --- ## License Creation URL: https://ibl.ai/docs/configuration/users/Licenses/licensecreation Video: https://www.youtube.com/watch?v=6XnUAQ9ISAo # License Creation ## Set Up a Content-Provider Platform - In **Platforms**, click **Create Platform**. - Fill out: - **Display Name** – e.g., Licensing Demo - **Platform Key** – e.g., licensing-demo - Add initial **tenant-admin credentials** (username, email, password) - Click **Save**. - A new content-provider tenant and its admin account are now created. --- ## Log In as the Tenant Admin - Sign out of the super-admin account (if needed). - Log in with the **tenant-admin credentials** you just set up. - You’ll land on **Skills**. From there, you also have access to **AI Analytics**. --- ## Author Course Content for the Provider - Go to **Studio → Authoring**. - Open **Courses** and click **Add New Course**. - In the creation form: - **Course Name**: Licensing Demo (or similar) - **Organization**: auto-filled with the provider’s tenant - Click **Create Course**. - Add at least one unit (e.g., a multiple-choice problem). - **Publish** the course and set course dates in the past so it’s immediately accessible. --- ## Assemble Courses into a Program - Still in **Authoring**, choose **Programs → Add New Program**. - Enter a **program name**, e.g., Licensing Program. - Paste the **Course ID** of the course you just created. - Add more Course IDs as needed. - Click **Save**. --- ## Create a Program License (Super Admin) - Log back in as a **Super Admin**. - Open **Data Manager → Program Licenses**. - Click **Add Program License** and fill out: - **Display Name**: e.g., Licensing Demo - **Count**: total seats purchased (e.g., 10) - **Start Date / Expire Date**: set the access window (optional but recommended) - In the pop-up: - Select the **Platform** you created earlier - Select the **Program** you assembled - Click **Save**. - The program license is now active and enabled by default. --- ## Verify in Tenant Analytics - Log back in as the **Tenant Admin**. - Go to **Analytics → Licenses**. - The new license pool (e.g., Licensing Demo) appears in the list with seat count and dates. --- ## Assign Licenses to Groups or Individuals - In **Analytics → Licenses**, select the license pool. - Choose **Group** or **Individual** assignment. - For a **Group**, pick an existing group (e.g., IBL V1) and click **Assign**. - For an **Individual**, search for and select specific users. - The assigned seats now show under the **Groups** or **Individuals** tab. --- ## Key Takeaways - **Platform** = tenant for the content provider - **Courses → Program → Program License** is the content chain - **Seat Count** controls how many users can access the program - **Start/Expire dates** define the license window - **Tenant admins** assign seats via **Analytics → Licenses** to groups or users You’ve now created a content-provider tenant, authored courses, bundled them into a program, issued a license pool, and assigned seats—all within mentorAI. --- ## Program Assignment URL: https://ibl.ai/docs/configuration/users/Licenses/programassignment Video: https://www.youtube.com/watch?v=go-fSu6jLDg # Program Assignment ## Review the Program License A **Program License** has already been created with: - **Name** (e.g., “Content Licensing Demo”) - **10 seats available** - **Active status** - Assigned to the **ax Alpha tenant** - Includes specific course content --- ## Tenant Admin: Locate the License - Log in as the **ax Alpha tenant admin**. - Open **Analytics → Licensing → Content**. - Under **Content Licenses**, locate the license you just reviewed. - At this point, no users are yet assigned. --- ## Assign the License to a User or Group - Click the **license name** to open its details. - Choose **Assign**. - Select either: - **Individual User** – search and pick a single account - **Group** – assign seats to an existing group (all members get access) - From the **License dropdown**, pick **Licensing Demo** (or the license’s display name). - Click **Assign**, then **Close**. --- ## Confirm the Assignment - Back in **Content Licenses**, click the **license name** again. - The **user or group** you chose now appears in the assignment list. --- ## Verify as the Assigned User - Log out, then log in as the **user who received the license**. - Navigate to **Discover**. - Search for the licensed content title (e.g., **Licensing Demo**). - The **course or program** is visible and ready to launch. --- ## Quick Recap - **Tenant admins** see unassigned content licenses in **Analytics → Licensing → Content**. - Licenses can be distributed to **individuals or groups** from the license detail view. - **Assigned users** find the newly unlocked content in **Discover** immediately after assignment. With these steps, you’ve successfully allocated a program license and ensured the learner can access the content. --- ## User Assignment URL: https://ibl.ai/docs/configuration/users/Licenses/userassignment Video: https://www.youtube.com/watch?v=sU3uwf2fqEY # User Assignment ## Create a Group - Open **Groups** and click **Add Group**. - Enter a name (e.g., IBL V1). - Search for users, select the ones you need, and click **Add**. - New tenants may list only a few users at first; the list grows as accounts are added. - Bulk-select and advanced search (e.g., by company) are planned for future releases. - The group now appears in the list; expand its dropdown to view current members. --- ## Create a User-License Pool - Go to **Data Manager → User Licenses**. - Click **Add User License**. - Fill out: - **Name**: IBL Licensing Demo (or similar) - **Count**: 20 seats (or any number you need) - **Start Date / Expire Date** (optional access window) - **Select the Platform** that the tenant admin belongs to - Click **Save**. - The license pool is now available to that tenant admin inside **Analytics → Licenses**. --- ## Assign the License Pool to a Group - Log in as the **Tenant Admin** and open **Analytics → Licenses**. - Choose the license pool you just created (Licensing Demo). - Select **Group assignment**. - Pick the group you created earlier (IBL V1) and click **Assign**. - All members of IBL V1 now hold active seats. --- ## Verify the Assignment - In **Analytics → Licenses**, switch to the **Groups** tab to confirm the pool is linked to IBL V1. - The **Individuals** tab remains empty because you assigned seats at the group level. --- ## Quick Recap - **Groups** let you bundle users for simpler license management. - **User License Pools** define seat counts and (optionally) start/expire dates. - **Tenant admins** assign pools to **Groups** or **Individuals** from **Analytics → Licenses**. - You can always add users to a group later; their licenses activate automatically. --- ## Management URL: https://ibl.ai/docs/configuration/users/Teams/management Video: https://www.youtube.com/watch?v=YCnrjHpTwcM # Management ## Accessing the Team-Manager Dashboard - Sign in with a **Team Manager** role. - You land on the **Reports** section by default. --- ## Reports for Individual Learners ### Filter Options - Set a **date range**. - Search by **pathway**, **learner name**, or **enrollment status**. - Click **Apply Filters** to update the list. - Remove filters to view every learner you manage. ### Download - Click **CSV** to export the filtered (or full) learner list. --- ## Group Reports - Shows all teams you manage (e.g., five groups in this demo). - Use the same **filter bar** to narrow results. - Export a **CSV** for group-level data. --- ## Notifications *(Covered in a separate demo, but key points included here)* - Send messages to specific **learners**, one or more **teams**, or an **individual**. - Choose delivery channel: - **Email** - **Events** (in-platform) - **Others** as enabled - Optional: **schedule** a send date. - Type your **message** and **preview** it before sending. --- ## Assignments - You can assign any of the following to learners or entire teams: - **Courses** - **Programs** - **Pathways** - **Skills** - **Credentials** ### How to assign: - Select the **item type** (e.g., Course). - Choose the specific **course**, **program**, etc. - Search for a **learner** or select a **team**. - **Confirm** to add the assignment. - All **active assignments** appear in the list below. --- ## Downloads - Access the same report types tenant admins see, but scoped only to the **learners and teams** you manage. - Use these files for **deeper analysis** or **record-keeping**. --- ## Key Takeaways - **Reports** give learner-level insights with flexible filtering and CSV export. - **Group Reports** mirror that functionality for entire teams. - **Notifications** let you reach learners directly from Analytics. - **Assignments** enable quick enrollment in courses, programs, and more. - **Downloads** provide segmented analytics identical to tenant-admin reports—only limited to your own teams. With these tools, a **Team Manager** can monitor progress, communicate with learners, and assign learning paths—all without tenant-admin privileges. --- ## Setup URL: https://ibl.ai/docs/configuration/users/Teams/setup Video: https://www.youtube.com/watch?v=Nk5Z-u0Kjsc # Setup ## Create a Learner Group - Open **Groups** and click **Add Group**. - Enter a name (e.g., IBL V1). - Search for users and select the members. - You can **bulk-import via CSV** if you have many users. - Click **Add** to save the group. - The group now appears in the list; expand its dropdown to confirm the members. --- ## Create a Team-Manager Group - Still in **Groups**, click **Add Team Manager**. - Give the manager group a name (e.g., Team Manager Mar 17). - Select one or more users who will act as managers. - Click **Add**. - This group defines everyone authorized to manage the team’s learners. --- ## Assign the Manager Group to a Team - Open **Team Management**. - Click **Assign User Group to Team Manager**. - Choose the learner Team you just created (e.g., IBL V1). - Select the Team-Manager Group (e.g., Team Manager Mar 17). - Click **Add**. - The chosen managers now oversee that team. - Multiple manager groups can be linked to a single team if needed. --- ## Edit or Update Later - **Groups**: add or remove learners anytime. - **Team-Manager Groups**: add or remove managers as roles change. - **Team Management**: reassign or revoke manager groups from teams. - All changes take effect immediately across **Analytics** and **reporting**. --- ## Quick Recap - **Groups** gather learners for easier management. - **Team-Manager Groups** bundle the people who supervise those learners. - **Team Management** ties the two together, allowing managers to view reports, send notifications, and assign learning tasks for their designated team. With these steps, you’ve set up a learner team, assigned its managers, and preserved the flexibility to adjust membership at any time. --- ## Groups URL: https://ibl.ai/docs/configuration/users/groups Video: https://www.youtube.com/watch?v=sU3uwf2fqEY # Groups ## Create a Group - In the admin console, open **Groups**. - Click **Add Group**. - Enter a name (e.g., IBL V1). - Search for and select the users you want in the group. - If your tenant is new, you may have only a few users; groups grow as more users are added. - Click **Add** to save the group. - You can reopen the group’s dropdown at any time to view or add members. --- ## Edit an Existing Group - Open **Groups** and select the group’s dropdown. - View current members. - Add or remove users as needed. - Multi-select and advanced search options (e.g., by company name or other attributes) are coming soon. --- ## Create a User-License Pool (Data Manager) - Switch to **Data Manager → User Licenses**. - Click **Add User License**. - Fill out: - **Name** (e.g., IBL Licensing Demo) - **Count** – total seats, e.g., 20 - **Start Date / Expire Date** (optional) - **Platform** – choose the platform this license pool belongs to - Click **Save**. - The new license pool now appears in the tenant admin’s **Analytics** view. --- ## Assign Licenses to a Group (Tenant Admin) - In **Analytics**, open **Licenses**. - Select the license pool you just created (e.g., Licensing Demo). - Choose **Group assignment**. - Pick the group you created earlier (IBL V1). - Click **Assign**. The licenses are now distributed to all members of that group. To verify, open **Groups → your group → Licenses**; the assignment is listed there. --- ## Key Points to Remember - **Groups** simplify license management—assign once, cover many users. - **User-license pools** control seat counts and expiration windows. - You can always: - Add more users to a group. - Reassign or retire licenses by changing group membership or license pools. - Upcoming features will make bulk user selection and advanced searching even easier. --- With these steps, you can quickly organize users into groups and allocate licenses in MentorAI Analytics without manual per-user work. ### Transcript ibl.ai | User Groups Licenses https://www.youtube.com/watch?v=sU3uwf2fqEY [0.8s] let's talk about how you can create [2.4s] groups that can be later used to assign [4.4s] licenses to so the first thing you're [6.4s] going to do is go to [7.8s] groups um and then you'll hit add group [11.4s] so I'm going to get this the name [14.0s] IBL [15.6s] um V1 I don't know you can make the name [18.8s] whatever you'd like it to be um and then [20.6s] you can search for users I'm just going [22.0s] to quickly select mine and I'm using a [24.9s] previously created tenant that has a lot [26.6s] of users already populated um so that [29.0s] way we can create groups um you know [31.7s] newer tenants might not have any users [33.5s] so once people start to populate their [35.9s] you know their own users into their own [38.0s] tenant you'll be able to see a lot more [39.9s] users we'll hit [42.2s] add so we've added that and if we click [47.2s] on drop down we can see this license [51.0s] that we just created and we can see [52.5s] these two users that we've added if we [54.6s] want to add more users later we always [57.0s] can um we will be also adding a way to [60.4s] multi- select users when creating groups [62.4s] as well um including the ability to [64.7s] search um by more information you know [67.0s] maybe it's the by the company name or by [70.6s] any other sort of you know attribute [73.6s] that's attached to that [75.4s] user so we've got our group created and [79.2s] if we go to you know any of the users or [83.2s] the content so if we hit users um we can [86.6s] either assign a license we've already [88.4s] got one created you can see who who [90.0s] already been assigned that license um if [92.2s] there's a group already there you can [94.3s] also see those [96.2s] groups although we already have a [98.1s] license here what I actually am going to [99.6s] do next is show you exactly how to [103.1s] assign a new user license to um a given [106.4s] tenant so that way the tenant can later [108.4s] go and assign licenses so if we go to [112.1s] data manager which is [117.2s] this we'll go down to we'll find groups [121.0s] actually sorry user [126.5s] licenses and we can see that there are a [128.4s] few already created so we're going to [130.6s] hit add user license we're going to do [133.7s] IBL [135.2s] licensing demo as the name here and [139.0s] we're just going to give this a count of [140.3s] 20 you can also have a start an uh [143.1s] expire date here as well um the start [145.4s] data is when access starts and expired [147.2s] is when that license expires um [150.9s] but after we have [152.7s] this we'll essentially just select the [158.9s] platform and assign it to the the [162.0s] platform that this tenant admin is [164.0s] currently in and so we'll hit [166.5s] save and then then we have that license [169.1s] now that we've created this user license [171.1s] this will now appear in that tenant [172.6s] admin's analytics platform so we'll gock [176.4s] over to licenses in Analytics as that [180.2s] tenant [181.5s] admin and you can see this is the [183.9s] license that we just created now if you [185.6s] have dates it won't say invalid dat for [188.6s] instance like you can see here this is [190.2s] the start end date um I didn't add them [192.6s] when we were creating this license just [194.0s] to quickly be able to create this to [195.6s] show you um now that we have that [198.5s] license created we can select from here [201.8s] which license we want so we're going to [203.0s] hit licensing demo and you have the [205.6s] option of using groups or individuals so [208.2s] we're going to do a group and we're [210.2s] going to do IBL V1 and then hit [215.7s] assign and now we have those licenses [217.8s] assign so if we go back over here [219.4s] there's no individuals but if we go to [221.1s] groups we will see that group we just [224.1s] created and those folks in that group [226.6s] now have user licenses to be able to [228.9s] access the platform --- ## Invitations URL: https://ibl.ai/docs/configuration/users/invitations Video: https://www.youtube.com/watch?v=Vu_Ws0Ex4Dc # Invitations ## Purpose Invite an **existing learner** to a specific **course** from the **Data Manager**. > **Note:** The same workflow applies to **Program invitations** and (when the user is known) **Platform invitations**. - Use **bulk invitations** (tenant admin) or **placeholder invitations** (super admin) for **unknown platform users**. ## Steps ### Open the Invitation App - In **Data Manager**, navigate to **Catalog Invitation Apps** ### Choose Invitation Type - Click **Add Course Invitations** - *(For programs, choose **Add Program Invitations**; for platforms, choose **Add Platform Invitations**)* ### Select the Learner - **Search for and pick** the user; they must **already exist** on the platform - Leave the **Email field empty** (not required if the user is found) ### Confirm Status - Ensure the **Active toggle** is **on** ### Pick the Course - Choose the **course** the learner should join ### Save - Click **Save** - A **confirmation entry** confirms the course invitation was created ## Result The learner now has an **invitation to the selected course**, issued directly by a **super‑admin**. --- # Bulk Invites ## Purpose Invite users to your **tenant** by uploading a **CSV**, reviewing/editing entries, and submitting the invitations—all from one place. ## Steps ### Open Tenant Invites 1. Click your **profile dropdown** and select the **tenant name**. 2. Go to **Management → Invite**. ### Get the Template (Optional) - Click **Download CSV Template** to get a starter file you can fill out. ### Upload the CSV - Click **Upload CSV** and choose your file. ### Review and Edit - Update any values directly in the grid. - Add rows for additional users you forgot to include. - Delete rows if needed. - You can **edit both columns and rows before submitting**—use this to validate that everything looks right. ### Submit - Click **Submit**. - The uploaded users appear in the list as **Pending** until they register. ## Result Your **CSV entries** are imported as invitations for the selected tenant, with each invite **pending until the user completes registration**. ### Transcript AI Skills | Super Admin | Create User Invite https://www.youtube.com/watch?v=Vu_Ws0Ex4Dc [0.5s] all right so I'm going to show you how [2.0s] to send an invitation um as a super [5.9s] admin from the data manager to a learner [8.9s] um so we'll first go to the catalog [11.2s] invitation apps and the same method is [13.4s] going to apply for course platform and [15.5s] program [16.4s] invitations um if we know the user [18.8s] specifically for platform invitations [21.0s] otherwise you'll use the bulk um [25.3s] invitation creation method for platforms [27.6s] as a tenant admin or the placeholder [30.3s] invitation as a super admin that we've [32.8s] shared in a previous video so in this [35.3s] case we're going to go with a course [37.3s] invitation and we'll hit add course [41.0s] invitations and now if we're doing a [43.7s] course or a program invitation the [45.1s] assumption is that users is already in [46.7s] the platform which means we can search [48.6s] for a user um so we'll select the user [51.7s] that we want to invite to a [54.6s] course um and since we have that user we [57.0s] don't need to worry about the email um [60.4s] so we'll make sure that they're active [62.7s] as well they are uh if there's [65.9s] [Music] [67.0s] a course that we want to select here so [70.2s] we need to select a course for them to [71.5s] be able to get an [73.0s] invitation um to a specific course so [77.1s] we'll select this course and we'll hit [80.7s] save and now you'll see that this [84.1s] invitation has been created um by super [86.7s] admin to in invite a learner to a course --- ## Claw Agents URL: https://ibl.ai/docs/developer/agents/claw-agents # Claw Agents 48 pre-built agent configurations for [OpenClaw](https://github.com/iblai/iblai-claw-agents) instances, organized by vertical. --- ## Overview **iblai-claw-agents** provides a curated library of ready-to-deploy agent configurations designed for various industry verticals. Each agent is pre-configured with appropriate system prompts, tool selections, and behavioral parameters. --- ## Repository - **GitHub**: [iblai/iblai-claw-agents](https://github.com/iblai/iblai-claw-agents) - **License**: Open Source --- ## Getting Started Clone the repository and explore the available agent configurations: ```bash git clone https://github.com/iblai/iblai-claw-agents.git cd iblai-claw-agents ``` Browse the verticals directory to find agents suited to your use case, then deploy them to your OpenClaw instance. --- ## Claw Setup URL: https://ibl.ai/docs/developer/agents/claw-setup # Claw Setup Connect self-hosted [Claw](https://github.com/iblai/iblai-claw-setup) servers to the ibl.ai platform. --- ## Overview **iblai-claw-setup** enables connection of self-hosted Claw servers to the ibl.ai platform. This tool handles the configuration and authentication needed to integrate your local Claw deployment with ibl.ai's cloud services. --- ## Repository - **GitHub**: [iblai/iblai-claw-setup](https://github.com/iblai/iblai-claw-setup) - **License**: Open Source --- ## Getting Started ```bash git clone https://github.com/iblai/iblai-claw-setup.git cd iblai-claw-setup ``` Follow the setup instructions in the repository README to configure your self-hosted Claw server and connect it to the ibl.ai platform. --- ## .iblai Agent Standard URL: https://ibl.ai/docs/developer/agents/standard # .iblai Agent Standard The [.iblai Agent File Format](https://github.com/iblai/standard) is a Markdown-based portable agent definition format. --- ## Overview **standard** defines the `.iblai` file format, a Markdown-based specification for portable agent definitions. This format allows agents to be described, shared, and deployed across different platforms using a human-readable document structure. --- ## Repository - **GitHub**: [iblai/standard](https://github.com/iblai/standard) - **License**: Open Source --- ## Getting Started ```bash git clone https://github.com/iblai/standard.git cd standard ``` Read the specification to understand the `.iblai` file format and start defining your own portable agent configurations. --- ## App CLI URL: https://ibl.ai/docs/developer/applications/app-cli # App CLI Interactive CLI for scaffolding [ibl.ai frontend applications](https://github.com/iblai/iblai-app-cli) with Next.js and React. --- ## Overview **iblai-app-cli** is a command-line scaffolding tool that generates ibl.ai frontend applications built with Next.js and React. It provides project templates, component libraries, and configuration presets to accelerate frontend development on the ibl.ai platform. --- ## Repository - **GitHub**: [iblai/iblai-app-cli](https://github.com/iblai/iblai-app-cli) - **License**: Open Source --- ## Getting Started ```bash git clone https://github.com/iblai/iblai-app-cli.git cd iblai-app-cli ``` Run the CLI to scaffold a new ibl.ai frontend application with your preferred configuration. --- ## MCP Servers URL: https://ibl.ai/docs/developer/applications/mcp # MCP Servers A collection of [Model Context Protocol](https://github.com/iblai/iblai-mcp) servers for the ibl.ai platform. --- ## Overview **iblai-mcp** provides a collection of Model Context Protocol (MCP) servers designed for the ibl.ai platform. These servers expose platform capabilities as tools that AI agents can use, enabling deep integration between language models and ibl.ai services. --- ## Repository - **GitHub**: [iblai/iblai-mcp](https://github.com/iblai/iblai-mcp) - **License**: Open Source --- ## Getting Started ```bash git clone https://github.com/iblai/iblai-mcp.git cd iblai-mcp ``` Follow the setup instructions to configure and run MCP servers for your ibl.ai deployment. --- ## Infrastructure CLI URL: https://ibl.ai/docs/developer/infrastructure/infra-cli # Infrastructure CLI Interactive CLI for provisioning and configuring the [ibl.ai platform](https://github.com/iblai/iblai-infra-cli) on AWS. --- ## Overview **iblai-infra-cli** is an interactive command-line tool that automates the provisioning and configuration of the ibl.ai platform infrastructure on Amazon Web Services. It guides you through setup with interactive prompts, handling resource creation, networking, and service deployment. --- ## Repository - **GitHub**: [iblai/iblai-infra-cli](https://github.com/iblai/iblai-infra-cli) - **License**: Open Source --- ## Getting Started ```bash git clone https://github.com/iblai/iblai-infra-cli.git cd iblai-infra-cli ``` Run the CLI and follow the interactive prompts to provision your ibl.ai infrastructure on AWS. --- ## Career Launch URL: https://ibl.ai/docs/gallery/careerlaunch Video: https://www.youtube.com/watch?v=1S2x1EBepVE # Career Launch ## Demo --- ## Description A mentor that helps students **launch their careers** by connecting them to internships and career fairs, guiding resume/cover-letter writing, and supporting job searches during school and after graduation. It provides **next steps**, points to the right **campus portals and resources**, and prompts students to meet with **career coaches** when appropriate. --- ## Typical Flow #### 1. Start the conversation **Sample prompt:** “I am a computer science major at Buffalo State University. What internships are available for my major?” #### 2. Find opportunities - Mentor directs you to where **major-specific internships** are listed (campus portals, departmental pages, career center listings). - Asks follow-ups like *“Would you like a current list?”* - Offers concrete next steps (check the portal, attend job fairs, book time with a career coach). #### 3. Build application materials - Ask for help drafting or refining a **resume** or **cover letter** tailored to your major and target roles. #### 4. Search for jobs - Request guidance for **post-graduation job searches**, including how to target roles and prepare applications. --- ## Target Audience **Student · Instructor · Administrator** --- ## Want This in Your Courses? Ready to give students a proactive, step-by-step path from campus to career? Email us at **support@iblai.zendesk.com** and we’ll enable the **Career Launch Mentor** in your environment—or help you build a **customized onboarding experience** tailored to your institution. ### Transcript mentorAI | Career Launch Mentor https://www.youtube.com/watch?v=1S2x1EBepVE [1.2s] In this demo, I'm going to be showing [2.6s] you our career launch mentor. This is a [5.3s] mentor that helps connect students to [7.8s] internships and career affairs. It can [10.6s] help with resume and cover letter [12.6s] writing and even help search for jobs. [16.0s] So if we take a look at the system [17.4s] prompt, [21.7s] you can see that this is designed to [23.1s] help, you know, help develop the career [25.6s] of the student while in in university or [28.8s] even post university, kind of getting [30.7s] them prepped for a career. Um, in this [33.4s] case, what I'm going to do to test this [36.3s] is I'm going to say I am a computer [40.3s] science major [43.8s] at [45.9s] Buffalo [47.4s] State [49.1s] University. [51.8s] What internships [55.2s] are available for my major? [62.6s] So now the mentor is actually directing [64.0s] us to where we can actually find [65.1s] resources on what specific internships [67.4s] are available for the specific major. [69.5s] It'll even ask you know additional next [71.3s] steps uh questions you know would you [73.5s] like a current list? Uh it'll even give [75.6s] you steps for you to take um outside of [78.3s] chatting with the mentor. You know maybe [79.8s] that's going into a portal and seeing [82.2s] what what internships are available on [84.8s] that specific portal meant for [86.2s] internships attending job fairs. um even [89.4s] meeting with career coaches at at their [92.0s] campus. And so this is basically how the [94.6s] the career launch mentor works. And it [96.3s] goes even further like I mentioned [97.5s] earlier. It can help with cover letter [99.0s] and resume writing or even helping you [101.7s] find specific jobs, you know, after you [104.6s] graduate. --- ## Completion Coach URL: https://ibl.ai/docs/gallery/completioncoach Video: https://www.youtube.com/watch?v=0zP_PLgfvm4 # Completion Coach ## Demo --- ## Description A mentor designed to help students **stay on track for on-time degree completion**. It monitors **credit momentum**, flags **registration risks/holds**, answers questions like *“Am I on track to graduate by X term?”*, and explains **major + university requirements**. The system supports **proactive alerts**, ensuring students know what’s missing **before delays occur**—strengthening retention and graduation outcomes. --- ## Typical Flow #### 1. Start the conversation **Sample prompt:** “I am a marketing major at Stony Brook University. What are my degree requirements?” #### 2. Review requirements - Mentor breaks down: - **University requirements** - **Major-specific requirements** - **Additional notes** - Minimum grade rules - % of credits required at the institution - Program-specific standards #### 3. Check progress & risks Ask follow-up questions such as: “Am I on track to graduate by next Spring?” “Do I have any holds to clear before I can register?” “Which remaining courses do I need for my major?” - Mentor surfaces: - Potential **gaps** - **Holds** or administrative blockers - **Registration issues** - Next steps to stay on track #### 4. Stay proactive Mentor can be configured to issue **proactive alerts** when something could delay on-time completion—guiding the student to resolve issues early. --- ## Target Audience **Student · Instructor · Administrator** --- ## Want This in Your Courses? Ready to improve retention and on-time graduation with proactive, requirement-aware guidance? Email us at **support@iblai.zendesk.com** and we’ll enable the **Completion Coach Mentor** in your environment—or help you build a **customized onboarding experience** tailored to your institution. ### Transcript mentorAI | Completion Coach Mentor https://www.youtube.com/watch?v=0zP_PLgfvm4 [1.3s] In this demo, I'm going to be walking [2.6s] you through our completion coach mentor, [4.5s] which is a mentor designed to keep [5.9s] students on track for on-time degree [8.4s] completion. It can monitor things like [10.5s] credit momentum, any potential risks or [13.0s] registration issues. You know, it can [15.4s] even answer whether or not somebody is [16.9s] on track to graduate by a certain [18.5s] semester. Uh, and even if there's, you [21.4s] know, holds that they need to clear [22.7s] before registering for specific courses [24.3s] to complete their degree. Essentially, [26.1s] this is a way to support retention and [27.8s] graduation by surfing surfacing [29.9s] proactive alerts, [laughter] which is [31.5s] exactly how this mentor's system prompt [33.7s] is set up. So, if we take a look at [35.3s] that, that's what this is looking like. [39.0s] All right. Now, if we were to ask as a [40.6s] student, I'm going to say, I am a [43.3s] marketing [45.8s] major at Stonybrook [48.7s] University. [50.7s] What are [53.0s] my degree requirements? [60.2s] And you can see right here, it's [61.3s] breaking down the university's [62.9s] requirements, the specific majors [64.2s] requirements, and any additional notes [67.5s] like earning a C in business courses or [69.9s] getting at least 50% of business credits [71.7s] completed at Stony Brook specifically. [74.0s] Um, so this is a great way for students [75.9s] to keep on track. Um, and even ask [78.4s] questions, you know, about their own [79.9s] specific circumstances in relation to [81.8s] these requirements, such as any holds [83.7s] that they might have or any issues, gaps [86.3s] that they have with, you know, getting [88.0s] their degree actually completed. And [90.2s] again, this is meant to help with [91.4s] retention and improving graduation [93.2s] rates. And we'll actually have the [95.4s] option of proactively alerting, you [97.4s] know, students if there's something [98.6s] missing that would delay them from [101.0s] graduating on time. --- ## Course Path URL: https://ibl.ai/docs/gallery/coursepath Video: https://www.youtube.com/watch?v=iRun3GBwIL0 # Course Path ## Demo --- ## Description A mentor that helps students understand degree requirements, prerequisites, and course sequencing by grounding answers in the university’s official curriculum data. It reinforces existing degree-planning tools and keeps guidance aligned with the mapped requirements for a given program. --- ## Typical Flow #### 1. Start the conversation **Sample prompt:** “I am a freshman at Alfred State University. I am studying political science. What are my degree requirements?” #### 2. Review structured guidance - Mentor considers the university, offered courses, and the specific program map. - Returns items like total required credits, liberal arts/core requirements, concentration courses, electives, and graduation requirements. - Provides next steps and a link to review courses. #### 3. Ask about options **Sample prompt:** “What elective options are available?” - Mentor lists political science electives and related courses that fit the plan. --- ## Target Audience **Instructor · Administrator · Student** --- ## Want This in Your Courses? Ready to give students clear, program-aware guidance on requirements and sequencing? Email **support@iblai.zendesk.com** and we’ll enable **Course Path Planner** on your mentors—or help you build a solution tailored to your institution. ### Transcript mentorAI | CoursePath Planner https://www.youtube.com/watch?v=iRun3GBwIL0 [1.3s] I'm going to be showing you our course [2.8s] path mentor, which is a mentor that's [5.3s] meant to help students understand their [7.2s] degree requirements, prerequisites for [9.9s] their degree, um, and even course [11.6s] sequencing, for example. Um, it can [13.9s] answer questions such as, you know, what [15.4s] classes do I need to register for next [17.3s] semester to stay on track or what [19.8s] electives will satisfy my degree plan? [22.2s] Just essentially this is reinforcing the [23.9s] official degree planning tools uh by [26.9s] pulling directly in from the curriculum [28.6s] data um from a given university. In this [31.4s] case, the system prompt is going to look [32.9s] something like this. [35.0s] Uh you know, it's a program/deree [37.7s] planning adviser helping students [39.4s] understanding requirements essentially [41.0s] what I was just describing here. Um and [43.4s] it's making sure that the responses are [44.7s] grounded in specific curriculum mapping [47.4s] um for a given degree path. All right. [50.2s] So, this is the way this is going to [51.6s] work for a user. I want to say I am a [54.3s] freshman [56.4s] at Alfred State University. [61.9s] I am studying [65.3s] political science. [71.0s] What are my degree requirements? [79.2s] And so what it's doing now is taking [80.8s] into account what university I go to, um [83.4s] the courses that the university itself [85.8s] offers and what's also required by my [87.8s] degree. So you can see here we're [90.5s] getting a couple of different options. [92.2s] Um we're even seeing how many required [94.2s] credits there are. Uh the actual [97.1s] requirements for the liberal arts core. [99.5s] Um then the concentration courses as [102.4s] well as electives and graduation [104.2s] requirements. It also gives you next [106.1s] steps and we'll actually send you a link [109.5s] to where you can actually review courses [111.8s] if if you need that as well. [115.4s] Let's see here. I'm going to ask [119.4s] what elective options are available. [123.0s] And this will tell me exactly what [124.6s] political science electives there are or [126.3s] related, you know, elective classes that [128.3s] I could take that would be great for my [130.2s] degree. And that's basically how this [133.0s] this course path course path mentor [135.1s] works. --- ## Faculty Partner URL: https://ibl.ai/docs/gallery/facultypartner Video: https://www.youtube.com/watch?v=P0Ku6ldwTZs # Faculty Partner ## Demo --- ## Description **Faculty Partner** is an AI mentor that helps university instructors weave **generative-AI activities** into an assignment, project, or entire course. You describe the task; the mentor proposes AI uses, checks alignment with learning goals, and suggests safeguards. --- ## Typical Flow 1. **Describe the assignment** – topic, outline, length, formatting, citation style 2. **State goals and context** – what students should demonstrate, class size, delivery mode 3. **Review two initial AI-integration ideas** – e.g., AI for drafting & feedback, simulated peer review 4. **Voice concerns** – e.g., *“Will AI do the students’ work?”* and receive safeguards: - Structured prompts - Incremental drafting - Reflection on AI use 5. **Confirm or request alternatives** – the mentor iterates until aligned 6. **Receive a concise summary and next steps** to finalize the assignment design --- ## Target Audience **Instructor · Administrator** --- ## Want This in Your Courses? Ready to bring **structured, AI-aware assignment design** to your teaching? Email **support@iblai.zendesk.com** and we’ll enable **Faculty Partner** on your mentors—or help you build a solution tailored to your institution. ### Transcript mentorAI | Faculty Partner Demo https://www.youtube.com/watch?v=P0Ku6ldwTZs [0.2s] Hey, what's going on? This is Jeremy at [1.6s] IBLAI. Today I'm showing you the faculty [4.4s] partner. So, this is an AI agent that's [7.2s] designed for university faculty to come [10.0s] and describe an assignment, a project, [13.1s] or even an entire course to the agent [16.2s] and basically get guidance and [19.0s] instruction on how they can effectively [21.6s] incorporate AI into whatever it is that [24.8s] they're working with. So for the purpose [27.0s] of this test conversation, I've [29.1s] described uh basically a research paper [32.2s] that this hypothetical faculty member [34.5s] would like to assign to students. So, [37.1s] I've described what the research paper [39.3s] is, uh, specifically what topics [42.6s] students will be engaging with, um, the [45.7s] required outline for this paper, and any [49.1s] other additional details that we should [51.7s] be aware of for this assignment, such as [54.5s] the length, uh, font, margins, what kind [59.1s] of style we want students to be citing [62.3s] their sources with, and any other [65.8s] consideration. ations for the assignment [67.4s] here. So the first thing that the mentor [69.6s] is going to do is basically say, "All [72.0s] right, we're rethinking this assignment. [74.1s] You got a solid framework for your [75.7s] research paper. What would be helpful is [78.1s] if we could kind of clarify the context, [80.6s] surface learning objectives, uh, and [83.1s] then we can kind of explore AI [85.0s] affordances, right? So just getting a [87.4s] better idea of the overall goals that we [89.7s] have for this assignment. So what I've [92.6s] gone ahead and done is said you know the [94.4s] goal is to ensure students are able to [95.8s] produce high quality research papers [97.6s] that demonstrate critical thinking uh [100.2s] through cohesive grounded researched [102.3s] arguments [104.0s] etc. I've given an overview of how many [106.6s] students are in this course um and what [109.0s] kind of course this is like it's hybrid [111.9s] right so there's some instruction but a [114.0s] lot of work for this course is done [115.8s] remotely so the next thing that the [118.2s] faculty partner is going to do is [121.0s] explore AI affordances with me and I've [123.9s] set it up so that it's only going to [125.4s] suggest two different you know ways to [127.9s] incorporate AI at a time so as to not [130.7s] overload an instructor with options if [133.2s] either of these two opt options that it [135.0s] suggests don't align with the goals of [137.6s] the project that we're trying to [139.3s] incorporate AI into. The mentor is going [142.6s] to ask that question right here at the [144.6s] bottom of its response. And if they [146.4s] don't align, provide a couple of more [148.3s] suggestions. But for the purpose of this [150.7s] research paper, I think these [152.6s] suggestions are pretty good. AI for [154.7s] drafting and feedback, pretty [156.3s] self-explanatory, [158.1s] uh, and simulated peer review, which is [160.6s] great for critical thinking. But as an [163.4s] instructor, I do have concerns, right? [165.5s] So I want to make sure that if I [167.4s] incorporate an AI tool for drafting and [169.6s] feedback, I want to be able to ensure [171.6s] that the AI isn't doing students work [174.1s] for them and is instead assisting them [177.0s] as they put together uh an outline [179.6s] themselves. [181.1s] And the mentor is going to suggest a [183.8s] number of ways to combat that to make [186.3s] sure that AI isn't doing the work for [188.6s] students and students are getting the [190.0s] highest quality learning possible. Such [193.1s] as, you know, provide students with some [195.5s] structured prompts. Maybe we're not [197.3s] asking the AI to develop an entire [200.7s] outline for me. Maybe we're asking the [203.6s] AI to help us brainstorm evidence for my [207.3s] argument about whatever topic a [209.8s] student's working with. Incremental [212.1s] drafting is another way that we can [213.8s] combat uh AI actually doing the heavy [216.6s] lifting for students. My favorite of [219.2s] these is this reflection on AI use. So, [223.0s] the idea here is that students will use [225.1s] an AI tool to help them draft and kind [228.0s] of develop an outline or even some [230.2s] feedback on the work they've already put [232.2s] together. But on top of that, we'll have [235.0s] a separate assignment where students are [237.0s] going to reflect on their experience [238.9s] using the AI and describe what worked, [242.6s] what they felt could be a little bit [244.6s] better, um, and overall what their [247.4s] learning journey was for this. Um, and [251.3s] again, there are a couple other [252.8s] recommendations here. And the mentor is [254.7s] always going to ask if these strategies [256.8s] help me feel more comfortable with the [258.6s] idea of integrating AI or should we [261.5s] discuss something else? So, for the [264.5s] purpose of this demo conversation, I say [266.4s] I think these strategies resonate with [268.0s] my objectives. And what the mentor is [269.7s] going to do to finish that and wrap this [271.8s] conversation up is provide an overview [274.9s] breakdown of what we've discussed. the [277.4s] points that I've said kind of align with [280.3s] the objectives of this project and then [282.9s] provide some next step considerations [285.5s] that an instructor might want to be [287.4s] aware of. And that's a very quick [289.5s] overview of the faculty partner. Thanks [291.4s] so much for tuning in and as always, [293.9s] learn more at ibl.ai. --- ## FAFSA URL: https://ibl.ai/docs/gallery/fafsa Video: https://www.youtube.com/watch?v=h2CINrCaW-g # FAFSA ## Demo --- ## Description A mentor that guides students through the **financial aid process**—explaining FAFSA steps, deadlines, documentation requirements, and troubleshooting common issues like verification holds. It **does not give speculative financial advice**; it focuses strictly on **official process guidance** and directs students to their campus Financial Aid office when necessary. --- ## Typical Flow #### 1. Start the conversation **Sample prompts:** “What documents do I need for FAFSA?” “How do I retrieve my FAFSA ID?” “Am I considered dependent or independent?” #### 2. Get process-grounded guidance - Breaks down required documents for all students - Distinguishes requirements for **dependent vs. independent** status - Explains deadlines and next steps - Can provide the official FAFSA site link #### 3. Resolve common issues - Walks through typical roadblocks (e.g., **verification holds**) - Advises when to contact your campus **Financial Aid team** #### 4. Step-by-step help - Provides **clear, ordered instructions** (e.g., recovering a FAFSA ID) - Points to where to get further assistance if needed --- ## Target Audience **Instructor · Administrator · Student** --- ## Want This in Your Courses? Ready to give students a reliable, process-accurate guide for FAFSA? Email **support@iblai.zendesk.com** and we’ll enable **FAFSA Mentor** on your mentors—or help you build a solution tailored to your institution. ### Transcript mentorAI | FAFSA Mentor https://www.youtube.com/watch?v=h2CINrCaW-g [1.4s] In this demo, I'm going to be covering [2.9s] our FAA mentor, which is a mentor that [4.8s] provides guidance um through the complex [7.6s] steps of the financial aid process. [9.5s] It'll explain FASA steps, deadlines, and [12.5s] documentation requirements. It also [14.6s] could help solve common issues such as [16.8s] verification holds. Um, and this is what [19.4s] the system prompt is going to look like [21.0s] for this [23.3s] essentially describing exactly what I [25.0s] said here. Um, but we're never going to [27.6s] provide speculative financial advice. [29.8s] Just explaining the official process, [32.9s] um, deadlines and next steps. When [35.2s] needed, students will be directed to the [37.0s] financial aid at their campus, um, so [39.8s] that they can help with resolution. [43.0s] So, we can ask here, what documents do I [45.8s] need for FAFSA? [48.5s] and it'll break down the steps and for [51.4s] all students including, you know, what [53.3s] requirements there are, if I'm an [54.6s] independent student or if I'm a [56.0s] dependent, um, and any additional tips. [59.4s] You can even see here it says, "Start [60.8s] gathering these documents from your [62.5s] FAFSA ID, sending you a link to the [64.2s] FAFSA website." Um, and it will help you [67.9s] guide help guide you through any of the [69.7s] steps that you need assistance with. Um, [72.2s] for example, you can see here it's [73.4s] asking, "If you're not sure whether [74.4s] you're considered dependent or [75.4s] dependent, independent, I can help you [77.7s] figure that out." Um, [80.5s] how do I retrieve my FAFSA ID? That'll [82.1s] be the next question I ask. It'll give [84.7s] me step-by-step instructions to make [86.2s] sure I can get back into my FAFSA [88.4s] account. And it even gives me, you know, [90.7s] the information of where I can contact [92.3s] FAFSA if I need to. [95.0s] All right. And that's basically how this [96.7s] is going to work. give students a guide [99.1s] to filling out FAFSA, making any [101.7s] adjustments, um, and asking any [103.7s] questions about the process. --- ## Italian-Speaking URL: https://ibl.ai/docs/gallery/italianspeaking Video: https://www.youtube.com/watch?v=Xa4-2FedAQw # Italian-Speaking ## Demo --- ## Description The **Italian-Speaking Mentor** helps learners build confidence in Italian through both **text and voice conversations**. It supports **pronunciation, sentence structure, and syntax**, while flexibly responding in **Italian or English** depending on the learner’s needs. This makes it useful for **beginners starting at a 100-level** as well as more **advanced learners** who want to refine their skills. --- ## Typical Flow 1. **Text or voice chat** - Begin with typed prompts or start a **voice call** directly within the Mentor AI platform 2. **Pronunciation assistance** - Practice phrases and receive **corrective feedback** on sounds, accents, and phrasing 3. **English support when needed** - Ask for clarification in **English**—helpful for beginners—while the mentor primarily responds in **Italian** to build immersion 4. **Flexible depth** - Stay at **introductory conversational practice** or move into **more complex grammar and syntax** as needed #### Example **User:** > “How do I pronounce ‘What is your name?’ in Italian?” **Mentor:** - Provides feedback on the correct sound and asks the learner to repeat **User:** > “I’m trying to say ‘buongiorno’ but it sounds too American. Can you help?” **Mentor:** - Adjusts pronunciation guidance and explains how native speakers say it naturally --- ## Target Audience **Student · Instructor · Administrator** --- ## Want This in Your Courses? Ready to give your learners **real-time Italian language support** and **immersive practice**? Email us at **support@iblai.zendesk.com** and we’ll enable the **Italian-Speaking Mentor** in your environment—or help you build a **multilingual AI solution** tailored to your institution. ### Transcript mentorAI | ItalianMentorAI https://www.youtube.com/watch?v=Xa4-2FedAQw [0.1s] Hey, what's going on? This is Jeremy at [1.4s] IBLAI. Today I'm showing you one of our [3.6s] newest mentors, the Italian mentor AI. [6.2s] Now, very similar to the Spanish mentor [8.6s] AI that you saw previously, this mentor [11.6s] is designed to assist users as they [14.3s] begin their journey learning the Italian [16.2s] language, either by having a texttoext [19.7s] conversation and using one of these [22.4s] starter template prompts or a prompt [24.6s] directly from the user and having a [26.9s] texttoext conversation to kind of begin [29.0s] that building journey towards uh you [31.8s] know comprehensive learning of the [33.7s] Italian language, but also being able to [37.1s] voice chat like and and have a verbal [40.3s] conversation with users basically to [42.5s] assist in tasks like pronunciation [45.2s] assistance, sentence structure, syntax, [47.8s] things of that nature. Um, and being [50.1s] able to respond not only in Italian but [53.0s] in English as well where needed. So to [56.2s] give you a quick example of what that [57.8s] looks like, I'm going to open this voice [59.8s] call tab, which you can find at the [61.4s] bottom of the mentor AI screen as you [63.9s] are communicating with a mentor. As soon [66.5s] as I do, the mentor is going to be able [68.4s] to hear everything I'm saying. So keep [69.9s] that in mind and let's watch. [85.5s] I was actually hoping today that you [87.2s] could help me work through [88.5s] pronunciation. So, I may need you to [90.2s] respond in English to this, but the [92.3s] phrase that I'm trying to pronounce is [94.3s] what is your name? Um, is this [96.4s] pronunciation correct? Kotami. [100.9s] >> Great start. [102.6s] The pronunciation is almost correct, but [105.8s] let's adjust a tiny detail in [109.7s] the ch in is a soft k sound like key. So [115.4s] it should sound like [119.0s] now repeat after me. [126.7s] Awesome. [140.4s] Thank you so much for your help. Um, one [142.5s] other thing that I need help pronouncing [144.6s] is uh just a general greeting, something [148.5s] along those lines. I know that the word [150.9s] is bonjouro, but I feel like that just [153.9s] sounds very American. Is there a better [156.4s] way for me to be pronouncing that? [170.7s] Perfecto. [194.3s] Hi or hello. [200.3s] >> Awesome. Well, thank you so much for [201.6s] your help today, mentor. [203.9s] And that's a very quick run through of [205.5s] the Italian language mentor. It can get [207.9s] much more complicated than that. It can [210.0s] stay in a very 100 level area if that's [213.1s] what you needed to do. It will respond [215.0s] in English. It will respond in Italian. [217.4s] You can text with this mentor. You can [219.4s] have a verbal conversation with this [221.4s] mentor. We want to make learning as [223.7s] flexible and as approachable and [226.5s] accurate as possible. Um, so this is [230.0s] just another mentor in a series of [231.7s] language mentors we'll be creating. So, [233.6s] thank you so much for tuning in. Stay [235.5s] tuned for more language mentors and as [237.8s] always, learn more at ibl.ai. --- ## Physics URL: https://ibl.ai/docs/gallery/physics Video: https://www.youtube.com/watch?v=wbXsAeKthfI # Physics  ## Demo --- ## Description The **Physics 101 mentor** give learners instant, **textbook-sourced explanations** with direct chapter references for deeper study. Choose from **Physics 101** or **University Physics Vol 1, Vol 2, or Vol 3** mentors to match your course level. --- ## Typical Flow 1. **Choose the right mentor** - In the **Explore** tab, select the Physics 101 or the relevant University Physics volume mentor - Start chatting 2. **Verify the textbook is loaded** - Go to the **Data Sets** tab - Check for the **OpenStax Physics 101** (or relevant volume) - Ensure it has a **green Trained badge** and **Visibility** set to **On** 3. **Ask your question** - Type directly in chat or use a ready-made prompt - The mentor answers instantly, citing the exact chapter and summarizing the content 4. **Review the sources** - Open the **Retrieved Documents** panel - View **chapter location**, **relevance score**, and one-click access to the original page --- ## Target Audience **Student · Instructor · Administrator** --- ## Want This in Your Courses? Ready to give your learners **instant, citation-rich answers** from trusted physics textbooks? Email **support@iblai.zendesk.com** and we’ll enable the **Physics 101 mentor** in your environment—or help you build a **brand-new solution** tailored to your institution. ### Transcript mentorAI | AI Physics Mentor Demo https://www.youtube.com/watch?v=wbXsAeKthfI [0.0s] Hey, what's going on? This is Jeremy at [1.4s] IBLAI. Today I'm showing you one of the [4.0s] physics mentors that we've created [5.8s] trained on an OpenStacks OEER textbook. [9.0s] But before I get into that, just wanted [11.1s] to show you that we have multiple [12.8s] mentors for physics, right? So, it's not [15.0s] just one cohesive physics mentor [17.7s] designed to handle every level of [19.7s] physics available. We have one for [22.0s] physics 101 all the way up through [24.1s] University Physics Volume 3. So, for the [27.6s] purpose of this demo, I'm just going to [29.0s] use the physics 101 mentor right here. [31.8s] And the way that I open a chat log is [33.5s] just by selecting the mentor from this [36.3s] explore tab right here. Now, in the data [39.9s] sets for this mentor, you'll see that [42.4s] the entire OpenStacks OEER physics 101 [46.2s] textbook has been trained to this [48.8s] mentor. And you'll know that the mentor [50.4s] is ready to use this information because [52.7s] of this green trained icon under status. [55.8s] and also this number populating the [58.0s] tokens column. That just lets you know [59.5s] the mentor has consumed this information [61.6s] and is ready to use it. We've also made [64.2s] the entire textbook visible, meaning the [66.5s] mentor can directly site within the [68.8s] textbook where it is finding the [70.6s] information that it used to formulate [72.7s] its response. So, we have a couple of [75.3s] pre-prompted questions right here. We [78.2s] include these just so when learners come [80.2s] in and know that they need help but [82.4s] might not know exactly what question to [84.5s] ask in order to get that help, we have a [87.0s] couple of options to kick this [88.3s] conversation off and get that ball [89.9s] rolling. So, what I'm going to do is [92.5s] select one of these. What are the laws [94.9s] of motion? The mentor is going to very [97.6s] quickly respond to my question and give [99.9s] me a general overview of Newton's three [102.8s] laws of motion and then tell me exactly [106.0s] which chapter within the Open Stacks [108.3s] textbook that it used to create this [112.2s] response and give me this information. [114.9s] It's also going to open this retrieved [116.6s] documents tab right here, which is going [118.6s] to show you each individual portion of [122.0s] the textbook that the mentor used to [124.1s] formulate its response throughout the [126.5s] entire textbook. And it's going to rate [129.2s] each one of these with a percentage of [131.3s] accuracy so that students can pull up [134.9s] either, you know, introduction chapter 4 [137.4s] if they want to read about Newton's laws [139.7s] of motions, but also introduction to [142.1s] chapter 7, Newton's law of gravitation, [144.6s] right? That just points the user in the [147.7s] right direction if they should want to [150.1s] go back into the textbook themselves and [152.6s] find exactly where this information is [154.7s] coming from. Now, that's a very quick [157.6s] overview of the physics mentor we've [160.1s] created. As I've said, there are [162.3s] multiple physics mentors available to [164.3s] the users of the mentor AI platform. And [167.4s] that was just a a quick glimpse of how [169.4s] it functions, how it cites its [171.4s] information, and the flow of [173.0s] conversation that you will have when [175.0s] using this Mentor. Thank you so much for [177.4s] tuning in, and as always, learn more at [180.2s] ibl.ai. AI. --- ## Pre-Calculus URL: https://ibl.ai/docs/gallery/precalculus Video: https://www.youtube.com/watch?v=kkaYxi4jI3g # Pre-Calculus ## Demo --- ## Description The **Pre-Calculus mentor** gives learners **concise, textbook-anchored explanations** with direct links to cited chapters—ideal for self-guided study. Powered by the **OpenStax Pre-Calculus 2e** textbook, it can list chapters, break down key topics, and provide step-by-step explanations with example formulas. --- ## Typical Flow 1. **Access the mentor** - Use the link provided by the **ibl.ai** team and open the chat 2. **Confirm the textbook is loaded** - Check the **Data Sets** tab for the **OpenStax Pre-Calculus 2e** textbook - Ensure it has a **numeric token count** and **Visibility** is set to **On** 3. **Start the conversation** - Ask a direct question or use a suggested prompt (e.g., > “What topics are covered in Pre-Calculus 2e?” - The mentor lists **major chapters** and **core topic areas**, and offers **follow-up prompts** 4. **Dive deeper** - Ask specific questions (e.g., > “Explain functions and their properties” - Receive **clear explanations** and **example formulas** 5. **Check the sources** - Open the **Retrieved Documents** panel - View every textbook excerpt used along with **relevance scores** and **direct links** --- ## Target Audience **Student · Instructor · Administrator** --- ## Want This in Your Courses? Ready to give your learners **instant, citation-rich support** in pre-calculus? Email **support@iblai.zendesk.com** and we’ll enable the **Pre-Calculus mentor** in your environment—or help you build a **brand-new solution** tailored to your institution. ### Transcript mentorAI | AI Precalculus Mentor Demo https://www.youtube.com/watch?v=kkaYxi4jI3g [0.2s] Hey, what's going on? This is Jeremy at [1.8s] IBLAI and today I'm showing you our [4.4s] pre-calculus mentor. Uh, this is [7.0s] available to any of the users who want [9.5s] to work on the mentor AI platform. If [12.2s] it's a mentor that jumps out to you and [14.3s] you'd like access to it, just let us [16.4s] know. We'll send you a link and get you [17.9s] set up. So, like I said, this is a [20.6s] pre-calculus mentor. We've trained it on [23.8s] the OpenStacks OEER pre-calculus 2e [28.6s] textbook. Uh you can see that the entire [31.4s] textbook has been consumed and is [33.2s] successfully trained by this number in [35.0s] the tokens column. And we've also made [37.5s] it visible allowing the mentor to site [40.6s] where it's pulling its answers from. So [44.9s] just to show you a little bit of the [46.6s] functionality of the mentor, um we do [49.3s] have these pre- suggested guided [53.1s] prompts, so to speak. This just allows [55.4s] the user uh to come in. They know that [58.2s] they need help, but don't really know [59.5s] what question to ask in order to get [61.6s] where they need to be. Here are a couple [63.8s] of questions that will dynamically [65.7s] change throughout the context of the [67.3s] conversation that will allow the student [69.4s] to begin that dialogue and hopefully get [72.0s] to where they need to be. So, first [75.1s] question I'm going to ask are what [77.0s] topics are covered in pre-calculus 2e. [80.2s] So the mentor is going to give a general [82.8s] overview of the chapters of the [85.4s] textbook, the topics of the textbook, [87.4s] and the three major core areas of the [90.9s] textbook. [92.4s] And then at the bottom here, we will see [94.6s] based on the conversation that we've [96.2s] started, a couple of additional [98.3s] suggested prompts uh to continue that [101.4s] conversation. And on the right hand [103.5s] side, we're seeing the retrieved [105.1s] documents. This is the mentor citing [107.4s] where it pulled the information from. [109.8s] and rating each portion of the textbook [112.3s] where it pulled some of this information [114.1s] from with a percentage of accuracy, [116.7s] which allows students to go directly [118.4s] into the textbook themselves, find the [120.9s] exact area that the mentor is citing in [123.8s] its response to get a much more concrete [126.7s] overview directly from the textbook for [129.0s] whatever information they're looking [130.4s] for. [131.8s] So, next I'm going to ask the mentor to [135.1s] explain functions and their properties, [137.3s] get a little more into the nitty-gritty [139.4s] of pre-calculus. [141.8s] And again, the mentor is doing exactly [143.4s] what I want. It's giving me a very clear [146.5s] overview, some example formulas for some [149.4s] of the uh major functions in [152.2s] pre-calculus. And again, citing the [156.9s] textbook and its response. And that's a [159.9s] very quick overview of the pre-calculus [162.5s] mentor that we have put together [164.0s] available on the mentor AI platform. [166.5s] Thank you so much for tuning in and as [168.6s] always, learn more at ibl.ai. AI. --- ## Professional Narrative URL: https://ibl.ai/docs/gallery/professionalnarrative Video: https://www.youtube.com/watch?v=sj9EiLrmB64 # Professional Narrative ## Demo --- ## Description A mentor that guides students—especially those nearing graduation or seeking internships/entry-level roles—to craft a clear, confident professional narrative: who you are, what you’ve done, what motivates you, and what you hope to accomplish. It collects your answers through a friendly, supportive Q&A and produces drafts suited for interviews, resumes, or LinkedIn. --- ## Typical Flow 1. **Start the session** - Say: “I need help crafting my professional narrative.” - The mentor introduces itself and explains it will ask a few short questions to shape your story. 2. **Answer the guided questions** - Shares a sequenced set of prompts to learn: who you are, what you’ve done so far, short- and long-term goals, motivations, and skills. - Tone is intentionally encouraging and stress-reducing. 3. **Get your first draft** - Produces a baseline narrative (about 3–4 paragraphs)—long enough to tell your story, not so long it’s hard to edit. 4. **Refine for purpose** - Offers to tailor the result: - Short version (~75–125 words) or even shorter (elevator-pitch length). - Long version (~2,000 words): asks for more depth before expanding (e.g., academic focus, on-campus projects, personal perspective, tone preference). 5. **Adjust for audience** - Can frame the narrative for specific contexts (e.g., interviews, resumes, LinkedIn), based on your responses. --- ## Target Audience Student --- ## Want This in Your Courses? Ready to craft confident, interview-ready professional stories? Email **support@iblai.zendesk.com** and we’ll enable **Professional Narrative AI** on your mentors—or help you build a solution tailored to your institution. ### Transcript mentorAI | ProfessionalNarrativeAI Mentor Demo https://www.youtube.com/watch?v=sj9EiLrmB64 [0.4s] Hey, what's going on? This is Jeremy at [1.8s] IBLAI and today I'm showing you our [4.4s] professional narrative AI mentor. So, in [7.7s] a nutshell, what this mentor is designed [9.6s] to do is help students who may be close [13.1s] to completing their college journey. Uh, [15.5s] might be getting their feet wet in terms [17.2s] of internships or even applying for [19.7s] entry level postgrad jobs. put together [22.6s] a high quality, high-level professional [25.4s] narrative, right? That we can use for [28.1s] job interviews or resumes or even just [31.5s] sharing on LinkedIn. The idea being a [34.6s] professional narrative is a [38.3s] basically a story, right? That tells [41.4s] employers, uh, professionals, etc. who [45.0s] you are, what you've done, what [47.2s] motivates you, what drives you to [49.8s] success, and what do you hope to [51.7s] accomplish. It's basically your elevator [54.4s] pitch, which can range from a quick 30- [57.7s] secondond snippet to, you know, [59.8s] sometimes a three, three and a half page [62.1s] document that you would submit along [63.5s] with a resume, and also being able to [67.0s] craft that professional narrative around [69.6s] a specific audience. [71.8s] So the way that the mentor works is I've [75.0s] basically given it a series of questions [79.2s] that it's meant to ask students, collect [82.5s] their information that they share and [84.6s] help them put together that initial [86.6s] professional uh narrative. So you know [90.2s] here are a number of the [92.7s] questions within the sequence what we're [94.8s] looking for from the user instructions [97.6s] about how the mentor should engage. [99.8s] Right? We want this to be very engaging [102.1s] and friendly because getting into the [105.1s] job force is incredibly stressful, [107.8s] right? We want to make this as soft of a [110.6s] pillow as we possibly can and ensure [113.2s] that students and learners are feeling [115.0s] encouraged the entire uh walk of the [118.1s] process. [119.9s] So basically the system prompt is just [122.7s] containing you know what kind of [124.0s] questions it is, what the overall goal [126.0s] of this mentor is, how it should [128.1s] communicate with students, things of [130.2s] that nature. I've also trained this [132.9s] mentor on uh a professional narrative [136.2s] series of questions here. [139.0s] So, the mentor now has a a very specific [142.0s] set of questions that it's going to ask [144.2s] in order in order to put together that [147.6s] professional profile to craft that [149.9s] professional narrative. So, the first [152.2s] thing a user is going to do is come in [154.0s] and just simply say, "I need help [155.7s] crafting my professional narrative." The [158.8s] mentor is going to introduce itself [161.0s] saying, you know, "I'm the professional [162.7s] narrative mentor. I'll guide you step by [164.6s] step through a few short questions to [166.4s] help you create a professional narrative [168.4s] that reflects your goals, values, and [170.2s] experiences. [171.8s] So, for the sake of time, I'm not going [173.6s] to hang tight on every single aspect of [176.2s] this conversation, but I did want to [178.8s] work through the entire interaction from [181.0s] a user perspective just to test the [183.7s] efficacy of this mentor. Um, so what the [186.2s] mentor is doing, and please feel free to [188.1s] pause at any point if you'd like uh to [189.8s] read in a bit more depth, but it's going [192.2s] through this questionnaire, so to speak, [194.6s] this series of questions that I've [196.0s] shared with the mentor to get an idea of [199.0s] who I am, what I've done so far, what [201.7s] I'm looking to do in the future, both in [203.5s] the short or long term, what motivates [206.1s] me, what skills have I gained, all to [208.9s] create this unified picture of the [212.6s] individual student as a professional. [215.9s] So once we get to the end of the [218.3s] questions, the mentor is going to be [220.5s] very supportive and encouraging in its [222.2s] words, "Excellent. You've demonstrated [223.9s] continuous learning, initiative, and [225.8s] thoughtful career planning. Here's your [228.3s] draft professional narrative." And the [230.5s] way that I've structured the mentor is [232.5s] to just kind of create a baseline [235.0s] narrative, right? So, we don't want the [236.9s] mentor to spit something out that's four [238.6s] pages long that's going to take us a [240.2s] very long time to sit back and edit. We [242.9s] also don't want something that's way too [244.5s] short and doesn't cover the full scope [246.6s] of who the student is as a professional. [249.1s] So the initial output is going to be [251.2s] somewhere in the three to four paragraph [253.2s] range, which is just enough space for [255.8s] you to make a case for yourself and tell [257.9s] a story about yourself in an interview [260.2s] without kind of going overboard, so to [263.4s] speak. But the next thing that it's [265.8s] going to do is ask, would you like me to [268.2s] help you shorten or refine this version [270.5s] for a specific purpose? So, let's say [273.3s] it's the LinkedIn about section or uh a [276.5s] quick 30 second elevator pitch. Tell me [278.8s] who you are. A very common question that [280.8s] comes up in job interviews. [283.4s] So, the mentor is going to basically [286.1s] say, uh, you know, I'm really glad you [287.9s] like it. You've built a powerful story. [289.8s] I can help create a shorter version [291.5s] around 75 to 125 words. Here it is. Uh, [296.1s] I can also create an even shorter one. [298.4s] We also want to make sure that the [299.7s] mentor is capable of creating a longer [301.6s] one. So I asked the mentor, I'd like to [304.9s] put together a longer version, maybe [306.5s] around 2,000 words. [309.0s] So the mentor says that's a great idea. [311.2s] A 2,000word version will let you explore [313.8s] your experiences and motivations in more [315.7s] depth. But in order to do that together, [319.1s] let's go a little bit more in depth. [321.1s] Right? the amount of information that we [322.6s] got thus far, perfect for creating that [325.3s] quick, you know, three to four paragraph [327.5s] pitch as well as any kind of shorten and [329.6s] condensed pitch that you might need. But [332.1s] if we're going to go much larger in [334.2s] scope, maybe we should dive deeper into [336.6s] your personal experiences and details. [338.6s] So again, asking uh could you tell me a [341.5s] bit more of the following? your academic [343.4s] focus, any projects you've worked on on [346.5s] campus, what your personal perspective [348.6s] is, and what your tone preference is. So [351.7s] again, just prompting the student to [353.1s] keep diving deeper into their own [354.9s] experiences to really craft that [357.0s] highlevel professional narrative. And in [360.2s] a nutshell, that is it. That is the [363.0s] professional narrative AI mentor. Thank [365.5s] you so much for tuning in and as always, [367.7s] learn more at IBLAI. --- ## Spanish-Speaking URL: https://ibl.ai/docs/gallery/spanishspeaking Video: https://www.youtube.com/watch?v=5B_4oHHzRQo # Spanish-Speaking ## Demo --- ## Description The **Spanish-Speaking Mentor** enables **real-time, voice-based conversations in Spanish**—ideal for native speakers and learners alike. By default, it responds in **Spanish** for immersive practice, but on request, it can briefly **clarify in English** before seamlessly continuing in Spanish. This makes it useful both for **Spanish speakers needing support** and for **English speakers learning Spanish**. --- ## Typical Flow 1. **Voice Call (in-app)** - Start a call so the mentor can **listen and respond verbally in real time** 2. **Spanish by default** - Ask questions in either **English or Spanish**—answers are always delivered in **Spanish** for immersion 3. **On-demand clarification** - Request an **English restatement** at any time - The mentor provides a concise English explanation, then continues in Spanish 4. **Flexible use cases** - Perfect for: - **Language practice** - **Multilingual student services** - **Bilingual classroom assistance** #### Example **User:** > “Describe the average temperature in Colorado during August (3–5 sentences).” **Mentor:** - Provides a clear, **Spanish-language response** **User:** > “Can you restate that in English?” **Mentor:** - Gives a short **English version**, then switches back to **Spanish** for the next prompt --- ## Target Audience **Student · Instructor · Administrator** --- ## Want This in Your Courses? Ready to bring **real-time Spanish-language support** and **practice** to your learners? Email us at **support@iblai.zendesk.com** and we’ll enable the **Spanish-Speaking Mentor** in your environment—or help you build a **multilingual AI solution** tailored to your institution. ### Transcript mentorAI | Spanish Speaking Mentor Demo https://www.youtube.com/watch?v=5B_4oHHzRQo [0.3s] Hey, what's going on? This is Jeremy at [1.8s] IBLAI and today I'm showing you uh the [4.7s] first of many mentors that we'll be [7.3s] creating for uh multilingual users. What [10.5s] you're looking at here is the Spanish [12.1s] mentor AI. Now, this mentor is designed [15.6s] to communicate with users verbally on [18.4s] the platform who may be Spanish speakers [22.2s] who may be learning Spanish um and is [24.8s] designed to respond to both English and [27.4s] Spanish in real time in Spanish. So, to [31.4s] give you a quick idea of what that looks [34.4s] like in practice, we have this voice [37.4s] call feature right here. So, I'm going [39.8s] to select this. Once I've opened this, [42.3s] the mentor is going to be able to hear [43.8s] everything that I'm saying and will [45.6s] communicate with me. [50.7s] >> Hello. [55.1s] >> Good. Um, I was hoping that in three to [58.7s] five sentences you could describe what [61.4s] the average temperature is in Colorado [65.8s] during August. [88.9s] Awesome. Thank you so much for that [90.4s] overview. Um, could you restate that for [93.3s] me in English as well? [95.7s] >> Sure. Here it is in English. In August, [98.7s] the average temperature in Colorado [100.8s] usually ranges between 15 and 30° C, [104.3s] depending on the region and altitude. [106.7s] Mountain areas are cooler, while lower [109.0s] elevations can feel warmer. It's a month [111.6s] with warm days and cooler nights, making [113.9s] it ideal for outdoor activities. Now, [116.6s] I'll go back to Spanish for any next [118.5s] questions. [120.1s] >> Thank you very much. [122.2s] So, as you can see, the mentor is [125.3s] working exactly as the way that I want [126.9s] it to. Um, it is able to communicate and [129.6s] basically hold a conversation in [131.5s] Spanish. And if you'd like it to [133.2s] clarify, you know, such as an English [135.8s] speaker who's learning Spanish, uh, it [138.6s] will respond to you in English as well, [140.6s] if you ask it uh, for that one prompt [144.0s] and then immediately return to Spanish. [146.6s] So, that's a very quick overview of the [149.2s] Spanish mentor AI that we have. [151.0s] available on the Mentor AI platform. And [153.7s] as always, learn more at ibl.ai. --- ## Student Life URL: https://ibl.ai/docs/gallery/studentlife Video: https://www.youtube.com/watch?v=N2xQ5hRXxo0 # Student Life ## Demo --- ## Description A mentor that guides students to **basic-needs and non-academic support** on their campus—food pantries, housing resources, emergency funds, wellness services, and more. It summarizes available options, explains **how to access them**, provides **eligibility and operating hours**, and lets students choose what’s most relevant. --- ## Typical Flow #### 1. Start the conversation **Sample prompts:** “I’m a student at the University of Buffalo. What wellness services does my university offer?” “I’d like to focus on mental health.” ### 2. Review campus options - Mentor lists the **range of available services** and summarizes each - Invites the student to **pick an area** to explore further #### 3. Drill into a service - For mental health, the mentor describes: - Counseling services - Where to access them - How to make an appointment - Relevant links - Includes **immediate-help information** when appropriate #### 4. Next steps - Mentor offers **concrete actions**, such as: - Schedule an appointment - Connect with a wellness coach --- ## Target Audience **Instructor · Administrator· Student** --- ## Want This in Your Courses? Ready to give students fast, clear access to non-academic supports? Email **support@iblai.zendesk.com** and we’ll enable **Student Life Mentor** on your mentors—or help you build a solution tailored to your institution. ### Transcript mentorAI | Student Life Mentor https://www.youtube.com/watch?v=N2xQ5hRXxo0 [2.3s] I'll be walking you through our student [4.0s] life mentor, which is a mentor that [5.8s] provides guidance on basic needs and [7.7s] non-academic support. This can direct [10.0s] students to things like food pantries, [12.0s] housing resources, emergency funds, and [14.7s] even wellness services. So in this case, [17.6s] if we look at the system prompt here, [20.6s] you'll see that this is actually going [22.1s] to say exactly what I said in terms of [24.0s] what kinds of answers it'll be able to [25.6s] give, but it'll also be providing clear [27.4s] directions on how to access the [28.9s] services, any eligibility requirements [31.4s] or andor operating hours. If multiple [34.5s] options exist as well, it'll summarize [36.3s] them and let the student choose which [37.8s] one is most relevant. So in this case [39.8s] I'm going to say I am a student at the [44.1s] University [46.9s] of Buffalo. [49.9s] What wellness [52.7s] services [54.9s] does my university [58.0s] offer? [62.0s] And you can see here this already is [63.4s] saying that there's going to be a wide [64.6s] range. So, it's going to summarize all [66.3s] of them and let us, you know, determine [68.3s] which one we'd like to focus on more to [70.1s] learn more about. Um, let's see here. [72.8s] So, I'm going to say I'd like to focus [77.9s] on mental [80.7s] health. [85.7s] And now it's going to show me what my [87.8s] campus's counseling services offer. um [90.6s] where to access them, you know, where to [92.6s] make an appointment, even sending me [93.9s] links. And even with the immediate help, [96.2s] it's giving me information that is a [97.8s] little bit outside of my initial [99.1s] question, but is really important in [100.7s] this case when we're talking about [101.9s] mental health. [104.1s] All right. And it even is asking me next [105.5s] steps, like if I want to schedule an [107.0s] appointment or connect with an wellness [108.6s] coach as well. So, it's actually guiding [110.3s] the student to get to where they need to [111.8s] be in terms of getting the support [113.5s] outside of academic life. --- ## Student Onboarding URL: https://ibl.ai/docs/gallery/studentonboarding Video: https://www.youtube.com/watch?v=3KeOnldbFVY # Student Onboarding ## Demo --- ## Description The **Student Onboarding Mentor** helps new learners identify their **individual learning strengths and weaknesses** before starting a course. By completing a short, **20-question Likert-style inventory**, students receive a **personalized learning profile** based on four core learning modalities: 1. **Active and Interactive Engagement** 2. **Collaborative and Cooperative Learning** 3. **Cognitive Strategy-Based Learning** 4. **Informative Feedback and Mastery Learning** The mentor then provides a **detailed roadmap for success**, including study tips and **course-specific recommendations**—all designed to help students understand how they learn best and apply those insights to upcoming assignments and activities. --- ## Typical Flow 1. **Begin the questionnaire** - Students start a chat and type “**Let’s start the questionnaire**.” - The mentor begins the **20-question learning modalities inventory**. 2. **Complete the inventory** - The mentor guides the student through each question, dynamically tracking responses. 3. **View results** - After completion, the mentor thanks the student and identifies their **top two learning modalities**. 4. **Review explanations** - The mentor explains all four modalities, highlighting the student’s **strongest learning areas**. 5. **Get personalized study tips** - Students receive **tailored recommendations** and assessment strategies for each modality they scored highest in. 6. **Connect to course content** - The mentor links the learning profile to **specific course assignments and unit content**, suggesting engagement and study techniques for maximum success. 7. **Instructor insights** - The mentor’s output can be shared with **instructors** to help them personalize instruction and support each learner’s strengths. --- ## Target Audience **Student · Instructor · Administrator** --- ## Want This in Your Courses? Ready to help your students **discover how they learn best** and **personalize their success from day one**? Email us at **support@iblai.zendesk.com** and we’ll enable the **Student Onboarding Mentor** in your environment—or help you build a **customized onboarding experience** tailored to your institution. ### Transcript mentorAI | Learning Modalities Inventory Mentor for Student Onboarding https://www.youtube.com/watch?v=3KeOnldbFVY [0.2s] Hey, what's going on? This is Jeremy at [1.6s] IBLAI and today I'm showing you one of [3.5s] our newest mentors, the AI learning [5.8s] modalities inventory mentor. Now, the [8.7s] purpose of this mentor is to basically [11.9s] assist students in onboarding into a new [14.6s] course. In this case, a communications [16.6s] course oriented around cultures to [18.9s] cultures. The idea is students will come [21.5s] in and engage with this mentor and then [23.8s] conduct a liyker style 20 questionnaire [28.1s] to kind of build a learner profile that [31.5s] identifies learning strengths and [33.4s] weaknesses oriented around four key [36.3s] learning modalities. Active learning and [38.9s] interactive engagement, collaborative [41.0s] and cooperative learning, cognitive [43.7s] strategybased learning, informative [46.3s] feedback and mastery learning. [49.2s] So to kind of show you what we've done [51.9s] here, the initial system prompt I've put [55.0s] together for this mentor basically [58.4s] explains the goal here. We're trying to [60.8s] identify students strengths and [62.5s] weaknesses and provide them basically a [64.8s] roadmap for success when completing the [67.8s] course and whatever assignments and [70.1s] activities come up throughout the [71.5s] course. So an overview of the types of [74.0s] questions we would like to ask or the [76.3s] exact questions that we'd like to ask so [78.4s] the mentor has that readily available. [80.8s] How it should rate students, how it [82.9s] should respond once students have [84.5s] completed the questionnaire and how that [87.7s] information should then be related to [89.9s] the overall structure of the course. [93.4s] So to show you an example of this mentor [96.3s] in action, here's a a quick [98.6s] questionnaire that I did as an example. [101.8s] So a student will come in, say to the [104.2s] mentor, "Hello, let's start the [105.8s] questionnaire." And the mentor will [107.8s] begin right off the bat with the first [109.5s] question. [111.3s] So quickly, I'm just going to scroll [113.2s] through this for the sake of time. If [115.5s] you'd like to pause at any point and see [117.4s] the kind of questions that the mentor is [119.1s] asking, how I'm responding, please feel [121.6s] free to just pause the video. But again, [124.4s] for the sake of time, we're just going [125.7s] to quickly come through here to the end [128.4s] of the questionnaire. So, now that I've [131.0s] completed the questionnaire, the mentor [132.9s] is going to do a number of things. one, [135.8s] it's going to give me a a quick [137.4s] introduction where it thanks me for [139.8s] participating in the survey and then [141.9s] identifies the top two learning [143.9s] modalities that I test for based on my [146.4s] responses. So in this case it would be [148.6s] active learning and interactive [150.3s] engagement, informative feedback and [152.6s] mastery learning. Just so students [155.2s] understand what this means, the next [157.5s] thing the mentor is going to do is [159.2s] provide a summary of those four key [161.4s] learning modalities to kind of give a a [164.2s] much more scoped picture of what each [166.9s] one of them means in terms of learning [168.9s] strength and weakness. [171.4s] Next, we're going to get into the [173.0s] section where we're identifying my [174.8s] learning strengths. Next, we're going to [178.0s] look at the recommended assessment [180.0s] pathways and then a number of study tips [183.1s] for each one of the modalities that I [185.0s] test for so that I can engage learning [187.4s] in the way that suits me best. So, after [191.7s] that quick learner profile overview, the [194.8s] next thing that a student would do is [196.3s] say, how can I connect this learning [198.0s] profile to this course? The mentor is [201.2s] going to break down uh assignment [202.8s] strategies for each of the key [204.3s] assignments through this course and give [206.6s] me advice for each one of the learning [208.4s] modalities that I test for in case one [211.6s] works better than the other in each [213.3s] particular instance. [215.5s] For unit content focus, there's going to [217.8s] be again an overview for each one of [219.5s] those modalities that I tested for just [221.6s] so I have strengths across the board and [224.9s] then a number of engagement techniques [226.7s] that I should use in order to be the [228.8s] most successful student that I can be. [231.5s] This information can be offloaded [233.4s] directly to instructors to give them a [235.6s] much more personalized overview of each [239.1s] student's individual learning profile [242.1s] and how to tailor the course to meet [243.9s] those needs. [245.5s] So, as always, thank you so much for [247.9s] tuning in and learning about this new [249.8s] learning modalities mentor that we have [251.6s] for student onboarding. Uh, and as [254.7s] always, learn more at ibl.ai. AI. --- ## Study Support URL: https://ibl.ai/docs/gallery/studysupport Video: https://www.youtube.com/watch?v=c1XS52kKus4 # Study Support ## Demo --- ## Description A 24/7 mentor that connects students to on-campus study supports—tutoring centers, writing labs, academic workshops, and more. It answers availability questions (including nights/weekends), provides location and hours, suggests after-hours alternatives, and can walk students through booking in-person or virtual sessions. Responses are tailored to the student’s university (e.g., Buffalo State University). --- ## Typical Flow #### 1. Start the conversation **Sample prompts:** “Where can I find tutoring for calculus tonight?” “Is there a writing center open on the weekends?” “I’m a student at Buffalo State University—can I schedule a virtual appointment?” #### 2. Get availability & details - Mentor replies yes/no on availability, plus: location, typical hours, and options if the center is closed. #### 3. Book support - Ask for steps to schedule; the mentor provides how to book a meeting or session (in-person or virtual). #### 4. Follow-ups - Continue with specifics (e.g., “Any workshops this week?”) and the mentor guides you to the right resource. --- ## Target Audience Student · Instructor · Administrator --- ## Want This in Your Courses? Ready to offer equitable, 24/7 access to campus study supports? Email **support@iblai.zendesk.com** and we’ll enable **Study Support Mentor** on your mentors—or help you build a solution tailored to your institution. ### Transcript mentorAI | Study Support Mentor https://www.youtube.com/watch?v=c1XS52kKus4 [1.3s] I'm going to be showing you our study [2.9s] support mentor that helps connect [5.1s] students with different study supports [8.2s] on their given uh university's campus. [10.8s] So, for example, this could connect [12.5s] learners to tutoring centers, to writing [16.4s] labs, academic workshops, so on and so [18.7s] forth. Um, this is going to be a 247 [21.7s] mentor. So, this is really great for [23.7s] ensuring equitable access to academic [25.9s] support even after business hours. [29.0s] So, we can see a couple example [31.0s] questions here. Where can I find [32.5s] tutoring for calculus tonight? Is there [34.1s] a writing center open on the weekends? [36.3s] Um, and then we'll use this question [37.8s] here to show you how this metar works. [41.2s] So, I'm going to say [43.4s] I am a student at Buffalo State [49.0s] University. [53.3s] And then we're asking is our writing [54.7s] center open on the weekends? And this [56.6s] but what this will do is actually give [58.8s] you a yes or no answer. And it'll [61.0s] actually give you more information if [62.3s] there is a support center, in this case [65.4s] uh a writing center. It'll give you [67.4s] where this is located, what the hours [69.3s] are typically like. Um and then even [72.0s] additional options outside of the [73.7s] business hours of that writing center. [76.7s] And it can even show you how to book or [79.4s] u a meeting or a writing session um if [82.6s] if you want those steps. And you can [85.0s] also continue asking, can I schedule a [86.6s] virtual appointment? Which we already [87.9s] seen that last answer, but just as an [89.5s] example. And this is actually giving me [91.0s] the steps to to book that session. [95.5s] All right. And that's basically how this [96.9s] works. --- ## Transfer Navigator URL: https://ibl.ai/docs/gallery/transfernavigator Video: https://www.youtube.com/watch?v=5iP4N9hX9rQ # Transfer Navigator ## Demo --- ## Description A mentor that helps students **navigate credit transfer**—interpreting transcripts, checking articulation agreements, and mapping courses to a destination university’s degree requirements. It can analyze **uploaded transcripts** (including international credits), handle **credit conversions** (e.g., UK → US), explain **minimum grade requirements**, and show **how credits apply** to a chosen program. --- ## Typical Flow #### 1. Start the conversation **Sample prompts:** “I want to transfer to Empire State University. Will my credits transfer?” “Can you map my courses to Empire State’s requirements?” #### 2. Upload and analyze - Upload a sample transcript (PDF). - Mentor analyzes coursework and recognizes originating institution standards (e.g., **UK credit system**). #### 3. Get an evaluation - Returns an overview with: - Evaluation process and what to expect - Credit conversion details (e.g., UK → US) - Minimum grade requirements - Degree applicability (how credits fit major/core/electives) #### 4. Map equivalents **Sample prompts:** “Yes, map the equivalent.” **Mentor returns:** - A course-to-course mapping (e.g., *Bath University → Empire State*) - Equivalencies and converted credit values #### 5. Next steps - Provides actionable follow-ups for completing the transfer process, such as: - Required documentation - Confirmation steps - Advising touchpoints --- ## Target Audience **Student · Instructor · Administrator** --- ## Want This in Your Courses? Ready to give students clear, step-by-step guidance on credit transfer and equivalencies? Email us at **support@iblai.zendesk.com** and we’ll enable the **Transfer Navigator Mentor** in your environment—or help you build a **customized onboarding experience** tailored to your institution. ### Transcript mentorAI | Transfer Navigator Mentor https://www.youtube.com/watch?v=5iP4N9hX9rQ [1.2s] I'm going to be showing you our transfer [2.9s] navigation mentor which is focusing on [6.0s] credit transfers, articulation [7.9s] agreements and any sort of requirements [10.2s] related to transferring. In this case, [12.6s] if we look at the system prompt here, [16.6s] you can see that this will allow you to [19.4s] ask the mentor about your own, you know, [21.4s] credits, your own transcript and compare [24.1s] it against, you know, the requirements [26.7s] for transferring and how it'll also [29.8s] answer questions about how, you know, [31.7s] students can transfer, what credits will [33.5s] transfer, which ones won't, and how they [35.4s] apply to current degree requirements at [37.2s] their given campus. [39.8s] So, in this case, what I'm going to do [41.7s] is upload a sample transcript um from a [44.6s] a sample student, [47.8s] and I'm going to ask I [51.4s] want to transfer [54.7s] to Empire State [58.7s] University. [61.2s] How will actually, let's say, will my [64.5s] credits transfer? That's our starting [66.5s] point. [71.6s] And so the mentor is going to basically [73.0s] analyze the transcript of this example [75.2s] student um and you know give it [77.8s] information. In this case this is [79.1s] actually um a a sample transcript from a [81.4s] university in the UK. So it's actually [83.8s] going to break down the conversions [85.9s] between the UK standards for credits and [88.7s] also uh for this specific campus. [92.3s] So you can see here we get the [93.7s] evaluation process, credit conversion, [96.2s] minimum grade requirements, and degree [98.0s] applicability. [100.0s] Now, it's going to give us some next [101.2s] steps we can follow um to help us [103.4s] continue throughout the process to get [104.6s] our our credits transferred over and be [107.0s] able to attend our Empire State [108.9s] University. [112.7s] I'm going to say yes, map [115.6s] the equivalent here. [125.3s] So, what this is basically going to do [126.7s] is just mapping the current credits that [129.0s] they see um and the courses that they [130.5s] see from the example transcript and [132.8s] align it with Empire State's standards [135.4s] here. So, you can see that this is [138.2s] giving us a table of courses that map to [140.6s] what we previously took at Bath [143.3s] University and what the equivalence [145.0s] would be at Empire State. [150.9s] All righty. And that's basically how [152.5s] this works. You'll also get that which [154.3s] is really great. This little conversion [156.7s] um even with you know some course [158.9s] universities in the US some credits [160.9s] might be three for a course and some [162.9s] might be four depending on where you're [164.5s] where you're at. So this is great to [166.4s] have that conversion example as well. [168.3s] And again it continues with the next [169.8s] steps and essentially that's how this [171.8s] credit transfer process works. --- ## Chat Ratings URL: https://ibl.ai/docs/instructor/analytics/chatratings Video: https://www.youtube.com/watch?v=EG7cuQ8GhiU # Chat Ratings ## Description **Chat Ratings** gives instructors a quick, rolling snapshot of how learners are experiencing a specific mentorAI—by connecting the **History** (recent chats) and **Memory** (saved user context) features. The rating aggregates the past **24 hours** of learner interactions and **refreshes daily**, helping you see what’s working, what’s not, and where to intervene. --- ## Target Audience **Instructor** --- ## Features #### 24-Hour Rolling Rating Calculates a mentor’s learner-experience rating from the most recent 24 hours of chat activity; updates automatically every day. #### History × Memory Integration Links recent conversation data (**History**) with user context (**Memory**) to ground ratings in **real usage**, not one-off anecdotes. #### Per-Mentor View Ratings are scoped to the specific mentor (e.g., “mentorAI”), allowing accurate comparisons between mentors. #### Actionable Insight Use the rating trend to spot when learners are thriving—or struggling—and prioritize follow-ups or prompt refinements. --- ## How to Use (step by step) #### Open the Mentor - Select the mentor you want to review (e.g., **mentorAI**). #### Verify Memory Is Enabled - Go to **Memory** to confirm it’s **On** and (optionally) that **Reference Saved Memories** is enabled. - You can browse which learners have saved memories such as: - Personal Information - Knowledge Gaps - Help Requests - Lessons Learned #### Check the Chat Rating - Open **History** (or view the rating indicator in the mentor’s overview, if available). - View the **24-hour rating** that reflects recent learner experiences with this mentor. #### Drill Into Evidence - In **History**, review recent transcripts from the same time window to understand why the rating changed. - Cross-reference with **Memory** entries for those users (e.g., known gaps or help requests) to see if the mentor addressed them effectively. #### Take Action - If the rating dips, adjust one or more factors: - **Prompts** – refine tone, structure, or guidance. - **Datasets** – fill content gaps. - **Tools** – enable relevant features (e.g., Web Search, Code Interpreter). - Recheck the rating the next day to assess the impact of your changes. --- ## Pedagogical Use Cases #### Early Warning for Struggle A downward trend signals confusion—review transcripts, add resources, or tweak prompts to clarify key concepts. #### Quality & Tone Assurance Ensure the mentor’s responses align with course expectations; refine the **System Prompt** or tone as needed. #### Measure Improvements After changing prompts, datasets, or tools, use the next day’s rating to validate that your intervention improved learner experience. #### Targeted Support Combine **rating trends** with **Memory insights** (knowledge gaps, help requests) to identify and reach out to specific learners or cohorts needing support. --- With **Chat Ratings**, you get a simple, always-current gauge of learner experience—grounded in the last day of real conversations—so you can keep each mentorAI **effective, supportive, and on track**. ### Transcript mentorAI | Chat Ratings https://www.youtube.com/watch?v=EG7cuQ8GhiU [1.5s] Hi everybody. I'm going to be talking [3.5s] about how we have actually extended both [6.2s] our history and memory features to [9.0s] actually give you an understanding of [10.7s] how people are viewing your mentor. So [13.5s] in this case we have mentor AI here and [15.9s] then we're going to go to memory. We can [18.1s] see that memory is enabled and we can [21.1s] you know see who's who has what memory [23.8s] stored. This is basically a way for the [25.8s] mentor itself to remember information [27.8s] about particular users while also being [30.8s] able to reference the the memories about [32.9s] the user to personalize responses um if [35.9s] that's enabled which in this case it is. [38.5s] Now we've connected this memory to the [40.9s] history tab to provide you as an [43.0s] instructor with a rating of how people [45.3s] are viewing you know their experience [47.4s] with their mentor. Um, in this case, we [50.3s] have um this set to be at a 24-hour uh, [54.1s] you know, chat history/memory input and [56.9s] then using that to to determine what the [59.4s] rating is. So, this will change every 24 [61.4s] hours. Um, assuming that people are [64.0s] chatting and are having, you know, [65.8s] different experiences, [67.6s] um, which would, you know, change this [70.0s] rating here. Um, but essentially that's [72.5s] how this works and it's a really cool [74.1s] feature. It helps you understand, you [75.7s] know, how learners are doing, you know, [77.5s] how they're using the mentor, what's [79.3s] working well and what's not. And that's [81.5s] basically it. --- ## Comprehensive Analytics URL: https://ibl.ai/docs/instructor/analytics/comprehensiveanalytics Video: https://www.youtube.com/watch?v=t0oPlMFYFjU # Comprehensive Analytics ## Description **Comprehensive Analytics** gives you a built-in, **real-time dashboard** for every mentor in mentorAI—no add-ons required. Track **engagement** (messages, users, topics), **satisfaction**, **transcripts**, and **financial usage** (by week/month/provider/model) to understand learning impact and optimize costs. Admins can drill down to **sessions, users, and topics** for evidence-based course improvements. --- ## Target Audience **Administrator · Instructor** --- ## Features #### Single Click Access Open analytics from any mentor’s chat to see live metrics—**no separate setup**. #### Overview Tab Bare-bones totals at the top (messages this month, active users, topics, satisfaction), plus: - User sessions list - Most-popular topics - Active-users bar chart (**Today, Yesterday → 90-day default; custom ranges supported**) #### Users Tab - Active users over time - “Logged-in right now” - Last-30-days usage - Time-of-day heatmap - Total registered users - Per-user details #### Topics Tab - Totals for topics and active users - Average ratings by conversation/topic - Sessions over time - Ratings over time - Highlighted topics (bar chart) #### Transcripts Tab - Full chat histories to see what students ask (including **uncomfortable questions** they avoid in class) - KPIs: avg messages/session, avg cost/session, avg rating/session - Search by topic, user, or rating #### Financials Tab - Usage over time: weekly, monthly, total cost - Cost/day - Cost by provider - Cost by LLM model (reflecting mentorAI’s LLM-agnostic design: OpenAI, Gemini, Microsoft, etc.) #### No Extra Cost Analytics are **included** and available to admins for any mentor. --- ## How to Use (step by step) #### Open Analytics - In a mentor’s chat interface, locate the **analytics icon** (lower-left, above Settings) and click it - The **Comprehensive Analytics** panel opens #### Review the Overview - Scan totals for **Messages, Active Users, Topics, Satisfaction** - Browse **User Sessions**, **Popular Topics**, and the **Active-Users bar chart (Today → 90 days)** - Adjust the **date range** if needed (custom ranges supported) #### Drill into Users - Open **Users** to view: - Active-user trends - Who’s online now - Last-30-days usage - Time-of-day patterns - Total registered users - User-level details #### Explore Topics - Open **Topics** for totals, active users, average ratings by topic/conversation - View **sessions and ratings over time** - See highlighted topics (bar chart) #### Study Transcripts - Open **Transcripts** to inspect full chat histories - Use **Search** to filter by topic, user, or rating - Note KPIs at the top: avg messages/session, avg cost/session, avg rating/session—to spot friction or confusion #### Monitor Financials - Open **Financials** for: - Weekly/monthly/total cost - Cost/day - Breakdown by **provider** and **LLM model** - Use this to choose **cost-effective models** without sacrificing quality #### Iterate & Improve - Use insights to **refine prompts, datasets, tools, or LLM choices** - Recheck analytics after changes to measure impact --- ## Pedagogical Use Cases #### Targeted Remediation Use **Topics + Transcripts** to find concepts with low ratings/high volume, then add materials or adjust prompts. #### Instructional Timing **Time-of-day usage** informs when to schedule office hours or auto-released study nudges. #### Quality Assurance Track **satisfaction** and **avg rating/session** to verify the mentor’s tone, clarity, and helpfulness. #### Assessment & Curriculum Tuning Common queries in **Transcripts** reveal where lessons or assessments need clarification. #### Cost Optimization Compare **cost by provider/model** to select LLMs that balance price, speed, and accuracy for each mentor. #### Program Evaluation **Overview trends** (messages, active users) offer quick evidence of engagement and learning impact for stakeholders. --- With **Comprehensive Analytics**, you gain a **clear, actionable picture** of learner engagement, topic mastery, and cost—so you can continuously improve your mentors and your courses. ### Transcript mentorAI | Analytics Overview https://www.youtube.com/watch?v=t0oPlMFYFjU [1.2s] For admins and administrators, we really [3.5s] value having your insights available to [6.7s] you and your analytics available to you [9.0s] as well. One way you can find your [11.3s] analytics for for your particular mentor [14.0s] is with this little analytics button [15.5s] right here. So, we'll click on this [18.0s] and I'm going to specifically be [19.4s] focusing on the overview tab today. So, [22.2s] on the overview tab, you're going to see [23.8s] a couple of different key pieces of [25.3s] information. You'll see, you know, how [27.4s] many users over the past month have uh [30.4s] been active, how many uh messages over [33.0s] the past month um have been sent to the [35.3s] mentor. And you can even see a [37.0s] comparison against the last month's [38.9s] data. You also can see the topics for [41.2s] the month and the number of [42.5s] conversations that have been held with [44.2s] this particular mentor for that month as [46.3s] well. Now, in this little sessions [48.6s] graph, you'll be able to see exactly, [50.3s] you know, how many individual sessions. [52.5s] So that's a user chatting with a mentor [54.6s] for you know a certain length of time. [56.2s] They might have exchanged multiple [57.4s] messages or they might have just sent [59.2s] one. And you can also filter by how many [62.7s] uh you know a specific time frame. So [64.6s] today [66.3s] seven days you know the past week and [68.3s] you even have the option of doing um a [70.4s] custom filter here. Similarly with you [73.6s] have the same options with filtering by [74.9s] a certain time. This will cover topics [76.9s] that have been specifically spoken about [79.1s] um with the mentor. any sort of [80.7s] questions um you know looking for [83.4s] information whatever it may be these are [85.0s] going to be the topics that have been [86.0s] discussed the most over the past [87.8s] whatever time frame you have set in this [89.4s] case the past month. [92.1s] Now we also can see on this page active [94.3s] users a total count of how many users [96.5s] have been active um over a set time [99.3s] frame. Um, in this case, this will show [101.0s] you by day. Um, but again, you can also [103.5s] filter the time frame and even see, you [105.3s] know, individual days within that that [107.3s] specific time frame that you've selected [108.6s] up here. And that's how this overview [110.7s] tab's going to work. And this will give [113.4s] you great insights on how your mentor is [115.2s] being used, how frequently it's being [116.7s] used, and we'll be covering the other [118.7s] analytics tabs here in a future video. --- ## Financials URL: https://ibl.ai/docs/instructor/analytics/financials Video: https://www.youtube.com/watch?v=0GztbtwwnDA # Financials ## Description The **Financials tab** shows the **real cost** of running a specific mentorAI—broken down by **time period, provider, model, and even user**. Use it to monitor spending, compare weeks or months, and identify savings opportunities (e.g., switching models) without guesswork. --- ## Target Audience **Instructor · Administrator** --- ## Features #### Headline Cost Metrics View **Weekly Cost**, **Monthly Cost**, and **Total Cost** with comparisons to the previous week or month. #### Cost per Day A **daily chart** that lets you hover to view exact dates and spending amounts. Filter by **Today**, **7 Days**, **Past Month**, or a **Custom Range**. #### Cost by Provider Breaks out spending across **LLM providers** (e.g., *OpenAI → Gemini*), especially useful when evaluating provider switches. #### Cost by LLM Shows cost per **specific model** (e.g., *GPT-5 Latest*, *GPT-4.0 Mini*)—ideal for optimizing mentorAI’s **LLM-agnostic configuration**. #### Cost per User A table listing **User**, **Total Cost**, **Sessions**, and **Last Active**—revealing heavy usage patterns and cost drivers. #### Flexible Time Filters All charts and tables automatically respect your **selected timeframe**. --- ## How to Use (step by step) #### Open Analytics → Financials - In the mentor header, click **Analytics**, then select **Financials**. #### Read Headline Metrics - Review **Weekly**, **Monthly**, and **Total Costs**, and note the **week-over-week** or **month-over-month** comparisons. #### Inspect Cost per Day - Use **Today**, **7 Days**, **Past Month**, or **Custom** filters. - Hover over the chart to see the exact per-day spending. #### Compare Providers - Open **Cost by Provider** to see how spending shifts when you switch vendors (e.g., *OpenAI → Gemini*). #### Optimize by Model - Check **Cost by LLM** to identify high-cost models versus more economical options that still deliver quality. #### Identify Cost Drivers (Users) - In **Cost per User**, sort by **Total Cost** or **Sessions** to find heavy users. - Confirm that high usage aligns with course goals and expected outcomes. #### Decide & Iterate - Adjust your **mentor’s LLM/model**, **tools**, or **prompts** as needed. - Recheck **Financials** after several days to verify savings and efficiency improvements. --- ## Pedagogical Use Cases #### Budget Tracking Monitor weekly and monthly spend to ensure programs stay within allocated budgets. #### Model Selection Strategy Use **Cost by LLM** to run high-stakes mentors on premium models and routine mentors on more cost-efficient ones. #### Cohort Management **Cost per User** helps highlight outliers for targeted guidance—encourage efficient usage among high-spending learners. #### Program Reporting Share clear **cost trends** and **provider/model breakdowns** with institutional stakeholders. #### A/B Cost–Quality Tuning Compare **daily costs** before and after switching models or tools to validate savings while maintaining learning outcomes. --- Use **Financials** to make **data-driven decisions** about providers, models, and usage—ensuring your mentors remain both **effective and fiscally responsible**. ### Transcript mentorAI | Financial Analytics https://www.youtube.com/watch?v=0GztbtwwnDA [0.9s] As an instructor or as an admin, you [3.0s] might want to know how much your the [5.2s] financial cost of your mentor is. And a [8.3s] way to do that is by going to this [9.7s] analytics tab. You'll see that there's a [12.0s] financial tab here. And this will break [14.5s] down the costs for you. Now, you'll see [16.7s] this little bar up at the top that has [19.0s] weekly costs, monthly costs, and the [20.9s] total costs of the mentor usage. This [24.0s] will cover also a comparison of the past [26.7s] month and the past week. The the other [30.1s] piece here is that this will show you [31.6s] the cost per day. You can even filter, [33.5s] you know, from today, a week, or even a [36.6s] custom time frame. Now, right now, what [39.0s] you're seeing is the the cost for a [40.7s] given month. And you can hover over a [43.0s] day to see what date it was and the [45.0s] actual cost. Another really cool piece [47.4s] here is that you're able to see the cost [49.0s] per provider. Let's say that you have [51.3s] OpenAI for in use for a little bit of [53.8s] time and you switch to another provider [55.8s] like Gemini. This will break down the [58.2s] exact costs, you know, per provider. [61.4s] And then on the cost by LLM, this will [63.4s] actually cover the costs of the provider [65.8s] LLM selection. So in this case, we're [67.8s] looking at chat GBT5 latest, but we also [71.3s] have, you know, chat GPT40 mini. Um, and [74.6s] this will also again show you the total [76.2s] cost of the specific LLM. You can also [79.4s] filter this by a given time frame as [81.4s] well. The other really cool piece here [83.5s] is you can actually see the cost per [85.0s] user over a given time frame. This will [87.2s] allow you to see who the user is, the [89.0s] total cost, how many sessions, and the [91.3s] last time that they were active. This [92.7s] will also give you a bit of insight of [94.1s] who is using your mentor more frequently [96.2s] than others, and how much it's actually [98.4s] costing. And that covers financials. --- ## History URL: https://ibl.ai/docs/instructor/analytics/history Video: https://www.youtube.com/watch?v=qtKPKlcxCjs # History ## Description The History panel records every conversation between a mentorAI and its learners. Administrators can review transcripts, apply filters (date, sentiment, topic, category), and export data for deeper analysis—revealing learning trends, common pain points, and opportunities to improve both course content and mentor behavior. ![](/images/history.png) --- ## Target Audience **Administrator** --- ## Features #### Complete Chat Log View every session the mentor has had, with user identifiers or **Anonymous** labels if the mentor is set to anonymous. #### Transcript Viewer Click any session to open the full **learner question and mentor response** thread. #### Multi‑Facet Filters Narrow results by **Date Range**, **Sentiment**, **Topic**, or **Category** to focus your review. #### CSV Export Download **filtered or full histories** for analysis in spreadsheets or BI tools. #### Insight Generation Identify **frequent questions**, **learning gaps**, and **tone alignment** to guide prompt tweaks or additional training materials. --- ## How to Use (step by step) #### Open the History Panel - Sign in to **mentorAI** - Click the **mentor’s name dropdown** - Select **History** #### Review the History List - See every chat session plus **user identifiers** (name, username, email, or Anonymous) #### Inspect an Individual Chat - Click a session in the left‑hand list - The full **transcript** appears, showing the learner’s questions and mentor’s responses #### Apply Filters - **Date Range** – limit to a semester, week, or custom period - **Sentiment** – isolate positive or negative reactions - **Topic** – focus on specific subjects - **Category** – filter by predefined learner categories #### Export Conversations - Click **Export** - A **CSV** downloads containing the visible (filtered) data for off‑platform analysis --- ## Pedagogical Use Cases #### Identify Learning Gaps Filter by **negative sentiment** or **repeated questions** to spot concepts that confuse learners. #### Refine Course Materials High‑frequency topics may indicate areas needing **clearer explanations** or **additional resources**. #### Assess Mentor Tone & Style Review transcripts to ensure the mentor’s language aligns with **institutional guidelines** and **learner expectations**. #### Prompt & Dataset Optimization Use findings to tweak the **system prompt** or add new **datasets** that address common queries. #### Accreditation & Quality Assurance Exported histories provide evidence of **learner support** and **instructional effectiveness**. --- By accessing, filtering, and exporting chat histories, administrators gain **actionable insights** to continuously improve both the **mentorAI** and the **educational experience** it supports. ### Transcript mentorAI | Chat History https://www.youtube.com/watch?v=qtKPKlcxCjs [1.3s] One of the cool features of Mentor AI is [3.3s] that you're actually able to track your [5.0s] learner's conversation history. Now, [6.9s] this is really great because a lot of [8.3s] times if learners are using AI in their [10.2s] courses, you don't exactly know what's [12.0s] going on behind the scenes because you [13.4s] just don't have that visibility. Now, [15.3s] with Mentor AI, we give you that [16.9s] visibility and that really, really helps [19.0s] with, you know, understanding what [20.4s] learners are asking, where they're [21.8s] struggling the most, so on and so forth. [24.8s] And I'm going to show you how to take a [26.1s] look at your chat history um on Mentor [28.9s] AI. So once you're in Mentor AI, you'll [31.8s] see this mentor name. We're going to [33.6s] click that dropdown and you'll see a [35.9s] couple of different tabs. Now what we're [37.6s] going to select is the history tab. [40.7s] Once you're here, what this is going to [42.2s] do is pull up all the chat history from [44.7s] each session that uh users have, you [47.8s] know, messaged mentor um and show you [50.2s] here. So on this side, you can see all [53.0s] the chat history uh that this mentor has [55.9s] had with various different learners. [58.8s] Now, if you have your mentor set to [60.3s] anonymous, you would see anonymous [62.1s] instead of the user's actual name or [64.3s] username or email. And now, if you want [67.4s] to dive deeper into each individual [69.1s] chat, you can. You'll just select the [70.5s] message that you want to dive deeper [71.8s] into. And what you'll see here is how [76.2s] the mentor responded and what that [78.1s] question was that the learner asked. So, [80.2s] this really helps you figure out, you [82.2s] know, if there's a specific topic that [84.2s] has been asked about multiple times, [85.6s] well, maybe the course materials need to [87.1s] be made a little bit more clear. Um, you [89.7s] know, you can identify learning gaps, [91.7s] see where learners are struggling, see [93.3s] what topics are even being talked about [94.7s] the most. Um, another cool thing here is [97.7s] that you can filter by a couple [100.1s] different things. You've got your dates. [101.8s] So, if you want to filter by a specific [103.2s] time frame, maybe it's just a semester [105.3s] or a week of a course, um, then you can [108.4s] do that. You can also filter by [110.2s] sentiment to see, you know, how learners [112.6s] are interpreting the responses from the [114.5s] mentor, whether they like them or not. [116.4s] Um, and that'll also help you as an [118.6s] instructor define the system prompt a [120.6s] little bit more. If the mentor isn't [122.5s] responding in a way that makes sense to [124.3s] learners, maybe the system prompt needs [126.1s] a little bit of adjustment or maybe we [128.0s] need to train the mentor on additional [129.8s] materials. It it really depends. Um but [133.5s] essentially this will give you an [134.6s] oversight of not only like where your [136.2s] learning gaps are or but where the [137.9s] mentor can even be improved. You can [139.8s] filter by topics as well. Um like I was [142.2s] saying earlier you can see you know [143.6s] what's being asked about most frequently [145.9s] and then you can also select the [147.2s] category which is al various um [150.0s] different categories that you can select [151.4s] for from um specifically when it comes [153.8s] to learners. [155.4s] And you can also expect uh the ability [158.5s] to export here. Um your whatever you [161.0s] filter by is what you can export. So if [163.0s] you didn't filter by anything, you want [164.7s] to export this into a CSV format for [167.4s] deeper evaluation, [169.4s] you would just simply hit this export [170.9s] button and download the file. And that's [173.8s] basically it. When it comes to uh mentor [175.8s] chatistry, --- ## Overview URL: https://ibl.ai/docs/instructor/analytics/overview Video: https://www.youtube.com/watch?v=t0oPlMFYFjU # Overview ## Description The **Overview tab** gives instructors and admins a **quick pulse on mentor usage**—showing **active users, message volume, topics, and session trends**. Compare activity against last month, filter by time range, and spot engagement patterns to guide instructional and operational decisions. --- ## Target Audience **Instructor · Administrator** --- ## Features #### Active Users (Past Month + Trend) See how many learners engaged with the mentor and compare to the previous month. #### Messages (Past Month + Trend) Track total messages exchanged with the mentor and view month-over-month changes. #### Topics for the Period Identify which subjects or questions were discussed most during the selected timeframe. #### Sessions Graph Visualize **individual learner sessions** (a conversation window containing one or more messages) over time. #### Flexible Time Filters Switch between **Today**, **7 Days**, **Past Month**, or a **Custom Date Range** to refine the analysis. #### Daily Active Users Chart See per-day engagement counts within the chosen period to spot **spikes and lulls** in activity. --- ## How to Use (step by step) #### Open Analytics - In a mentor’s header, click the **Analytics** button to open the analytics panel. #### Select the Overview Tab - You’ll land on (or switch to) **Overview** to see headline metrics and charts. #### Set Your Timeframe - Choose **Today**, **7 Days**, **Past Month**, or **Custom** to focus the analysis window. #### Read Headline Metrics - Review **Active Users** and **Messages** for the chosen period, noting the comparison to last month. #### Scan Topics - Check **Topics** for the month or selected period to understand what learners are asking most frequently. #### Inspect Session Trends - Use the **Sessions Graph** to view **conversation volume patterns** and identify peaks or troughs. #### Drill into Daily Engagement - Review **Daily Active Users** to pinpoint specific high- or low-usage days. #### Decide Next Actions - Use insights to **adjust prompts**, **add datasets**, **tune tools**, or **schedule outreach** to improve engagement. --- ## Pedagogical Use Cases #### Engagement Monitoring Validate adoption after launching a mentor or new module; plan engagement nudges on low-usage days. #### Curriculum Focus Topic frequency highlights where to add examples, FAQs, or short micro-lessons. #### Capacity & Support Planning Identify busy periods (e.g., around assignments or exams) to anticipate support needs. #### A/B Intervention Tracking Compare analytics before and after prompt or dataset changes to measure their impact. #### Stakeholder Reporting Share month-over-month trends to demonstrate mentor usage, engagement growth, and instructional value. --- The **Overview tab** is your **at-a-glance dashboard** for mentor usage—helping you understand **who is engaging, how often, and what they need**, so you can respond quickly and effectively. ### Transcript mentorAI | Analytics Overview https://www.youtube.com/watch?v=t0oPlMFYFjU [1.2s] For admins and administrators, we really [3.5s] value having your insights available to [6.7s] you and your analytics available to you [9.0s] as well. One way you can find your [11.3s] analytics for for your particular mentor [14.0s] is with this little analytics button [15.5s] right here. So, we'll click on this [18.0s] and I'm going to specifically be [19.4s] focusing on the overview tab today. So, [22.2s] on the overview tab, you're going to see [23.8s] a couple of different key pieces of [25.3s] information. You'll see, you know, how [27.4s] many users over the past month have uh [30.4s] been active, how many uh messages over [33.0s] the past month um have been sent to the [35.3s] mentor. And you can even see a [37.0s] comparison against the last month's [38.9s] data. You also can see the topics for [41.2s] the month and the number of [42.5s] conversations that have been held with [44.2s] this particular mentor for that month as [46.3s] well. Now, in this little sessions [48.6s] graph, you'll be able to see exactly, [50.3s] you know, how many individual sessions. [52.5s] So that's a user chatting with a mentor [54.6s] for you know a certain length of time. [56.2s] They might have exchanged multiple [57.4s] messages or they might have just sent [59.2s] one. And you can also filter by how many [62.7s] uh you know a specific time frame. So [64.6s] today [66.3s] seven days you know the past week and [68.3s] you even have the option of doing um a [70.4s] custom filter here. Similarly with you [73.6s] have the same options with filtering by [74.9s] a certain time. This will cover topics [76.9s] that have been specifically spoken about [79.1s] um with the mentor. any sort of [80.7s] questions um you know looking for [83.4s] information whatever it may be these are [85.0s] going to be the topics that have been [86.0s] discussed the most over the past [87.8s] whatever time frame you have set in this [89.4s] case the past month. [92.1s] Now we also can see on this page active [94.3s] users a total count of how many users [96.5s] have been active um over a set time [99.3s] frame. Um, in this case, this will show [101.0s] you by day. Um, but again, you can also [103.5s] filter the time frame and even see, you [105.3s] know, individual days within that that [107.3s] specific time frame that you've selected [108.6s] up here. And that's how this overview [110.7s] tab's going to work. And this will give [113.4s] you great insights on how your mentor is [115.2s] being used, how frequently it's being [116.7s] used, and we'll be covering the other [118.7s] analytics tabs here in a future video. --- ## Reporting URL: https://ibl.ai/docs/instructor/analytics/reporting Video: https://www.youtube.com/watch?v=a8PETKB2i8U # Reporting ## Description **Reporting** gives instructors and tenant admins **downloadable, analysis-ready datasets** from mentorAI. From a single screen, you can select a mentor, open **Data Reports**, and export **User Reports**, **Metadata**, and **Chat History**—making it easy to **audit activity**, **measure engagement**, and **support outcomes research**. --- ## Target Audience **Instructor · Administrator** --- ## Features #### Mentor Selector Pick the mentor you want to analyze from a drop-down (e.g., *“mentorAI”* or any other mentor in your tenant). #### Data Reports Hub One centralized location to access all exports related to usage and conversations. #### User Reports Basic account/usage fields for users who engaged with the mentor (e.g., **email**, **login**, **activity details**). #### Metadata Reports Expanded profile context (e.g., **organization/company**, or additional attributes where available) for deeper segmentation. #### Chat History Reports Conversation-level exports showing **inputs** (learner messages), **outputs** (mentor responses), and **user identifiers**. #### Downloadable Formats Exports are **ready for spreadsheets or BI tools**, enabling deeper analysis and visualization. --- ## How to Use (step by step) #### Open Analytics - After logging in, click the **Analytics** button (visible to tenant admins and instructors with access). #### Choose a Mentor - Use the **mentor selector** at the top to pick the specific mentor you want to report on. #### Open Data Reports - Click **Data Reports** to view available report types. #### Select a Report Type - **User Reports** → basic user/account and usage info - **Metadata Reports** → expanded user context (e.g., company/affiliation where available) - **Chat History Reports** → transcript rows with inputs, outputs, and user identifiers #### Download - Click to **export** the selected report. - Open the file in your preferred **spreadsheet** or **BI tool** to filter, chart, or combine with other datasets. --- ## Pedagogical Use Cases #### Engagement Monitoring Use **User Reports** to track who engaged, when, and how often—useful for participation credit or outreach to inactive learners. #### Cohort Segmentation Leverage **Metadata Reports** to compare usage across programs, cohorts, or partner organizations—and tailor interventions accordingly. #### Curriculum & Assessment Improvement Analyze **Chat History** to identify **common misconceptions**, repeated questions, and content that needs clarification. #### Program Evaluation & Accreditation Provide evidence of learner support and outcomes using exports that show **participation trends** and **qualitative conversation data**. #### Research & Quality Assurance Join usage and transcript data to study **help-seeking patterns** and **course performance correlations**. --- With **Reporting**, instructors get **clean, exportable datasets**—so you can move from anecdotes to evidence and continuously improve **teaching, support, and student success**. ### Transcript mentorAI | Reporting https://www.youtube.com/watch?v=a8PETKB2i8U [1.1s] Hello. I'm going to be showing you the [3.4s] new reporting features we have in Mentor [6.1s] AI. Now, the way you'll get there is as [8.4s] a tenant admin, once you're logged in, [9.9s] you should see this analytics button. [11.9s] When you click on this, you'll be taken [14.5s] to a page that looks like this, which [16.7s] covers a handful of different analytics [19.0s] points about you, the specific mentor's [21.4s] usage. You can see right here I have [22.6s] mentor AI selected. You can also change [24.8s] the mentor from this list. [27.4s] All right. So to get to the data [29.4s] reports, what we'll do is click data [30.8s] reports and you'll see that there are a [32.9s] few different reports. You you'll get [35.0s] user reports, which is just basically [36.6s] information about the users inside this [38.2s] tenant who have used the mentors. [41.6s] And then we also have metadata [43.4s] information. This is going to be a [45.6s] little bit more informative of who the [47.5s] user actually is. You know, it can [49.2s] include company information. um you know [51.8s] who just basically more about who that [54.4s] user is as opposed to with the user [56.4s] reports it's just basic information like [58.5s] you know email login details so on and [61.2s] so forth. You'll also have the chat [63.3s] history which is basically what you see [65.4s] on the chat history tab um but just [67.8s] another way of downloading this [69.2s] information. You'll get to see the [71.2s] reports for you know the conversations [74.2s] between users and the mentor. You'll see [75.8s] the inputs and outputs um and exactly [78.2s] who that user is. All right. And that's [81.0s] basically the new updates for reporting. --- ## Topics URL: https://ibl.ai/docs/instructor/analytics/topics Video: https://www.youtube.com/watch?v=auVd5lMOn6A # Topics ## Description The **Topics tab** shows what learners are talking about most with a specific mentorAI. Instructors and admins can view **topic volume, conversation counts, and message activity** over a selected time range—plus **month-over-month comparisons**—to identify trends, confusion clusters, and activity spikes (e.g., the week before a quiz). --- ## Target Audience **Instructor · Administrator** --- ## Features #### Headline Metrics (Past Period + Trend) For the selected window (e.g., past month), view: - **Total Topics** - **Conversations** - **Messages** Each metric includes a **comparison to the prior period** for context. #### Conversations Over Time Displays a **daily count of conversations**, useful for identifying activity ramps around exams, deadlines, or new modules. #### Timeframe Controls Switch between **Today**, **7 Days**, **Past Month**, or set a **Custom range** (e.g., “week before the quiz”). #### Topic Details Table A **filterable list** of all topics discussed within the chosen period to reveal **interest areas** and **potential confusion points**. --- ## How to Use (step by step) #### Open Analytics → Topics - In the mentor header, click **Analytics**, then select the **Topics** tab. #### Set Your Timeframe - Choose **Today**, **7 Days**, **Past Month**, or use a **Custom date range** (e.g., the week before a major assessment). #### Read Headline Metrics - Review **Total Topics**, **Conversations**, and **Messages** for the selected window. - Note the **month-over-month comparison** to measure engagement or confusion shifts. #### Scan Conversations Over Time - Use the **daily conversations chart** to identify: - **Spikes** (e.g., pre-exam review periods) - **Lulls** (e.g., post-deadline quiet periods) - Plan outreach or review sessions accordingly. #### Drill into Topic Details - Browse the **Topic Details table** to see which subjects drive discussion. - Use **filters** to narrow the list to a course unit, week, or exam period. #### Decide Next Actions - Add **resources or micro-lessons** for high-volume topics. - Coordinate **review sessions** around the highest conversation peaks. - Cross-reference **Users** and **Overview** tabs to triangulate where learners need additional support. --- ## Pedagogical Use Cases #### Pre-Exam Focus Filter to the week before an assessment to see which topics need reinforcement and schedule targeted review sessions. #### Curriculum Tuning Identify **persistent high-volume topics** that may require clearer lecture notes, examples, or practice questions. #### Early Warning Signals Spot **sudden topic spikes** that could indicate confusion or emerging interest—adjust prompts, datasets, or issue timely announcements. #### Cross-Course Insights Compare topics **month-over-month** to evaluate how new materials or instructional changes impact discussion patterns. #### Evidence for Stakeholders Share topic trends to demonstrate what students engage with most and how interventions affect engagement and understanding. --- Use the **Topics tab** to uncover what learners are asking about—and turn those insights into **timely instruction**, **better materials**, and **targeted academic support**. ### Transcript mentorAI | Topics Analytics https://www.youtube.com/watch?v=auVd5lMOn6A [1.1s] In previous demos, we've covered a [3.0s] handful of analytics topics such as [4.7s] overview of highle analytics across a [7.2s] specific mentor as well as user [9.5s] analytics. Now, we're going to be [11.4s] talking about topic analytics for a [14.5s] particular mentor. As an administrator [17.3s] or an instructor, you'll have access to [19.2s] this, and this will give you insight [21.8s] about what specific topics are being [23.4s] talked about the most within a given [25.0s] mentor. So, what we'll do now is [28.4s] navigate to this topics tab. And you'll [30.3s] see this little bar at the top that'll [32.2s] cover a couple different pieces. You'll [34.1s] see for the past month, what topic how [36.4s] many topics been covered, how many [37.9s] conversations have been had, and how [39.8s] many messages have been sent within that [41.8s] 31 conversations that we're seeing on [43.8s] the screen over the past month. You'll [45.6s] also see a comparison of, you know, last [48.9s] month's results compared to this [50.2s] month's. We'll also have this [52.3s] conversations tab which will count how [54.1s] many conversations have been with the [56.0s] mentor on a particular day. You have the [58.3s] option of selecting a specific time [60.2s] frame. So if you want to hone in on, you [61.9s] know, a week before a quiz, that's a [64.1s] great way to see, you know, what topics [65.3s] have been covered and how many [66.3s] conversations students are having with [68.2s] their mentors. And then down here you'll [70.9s] see topic details. And again, you have [73.0s] the ability to filter by whatever time [74.9s] frame that you select. In this case, [77.0s] this is going to be over the past 30 [78.6s] days. And this will show you a list of [80.8s] all the topics that have been talked [82.5s] about with the mentor. So you can kind [84.3s] of get an idea of, you know, where [86.0s] people might be having questions or what [88.0s] they're most interested in, so on and so [90.2s] forth. [91.6s] And that really covers topics. --- ## Transcripts URL: https://ibl.ai/docs/instructor/analytics/transcripts Video: https://www.youtube.com/watch?v=AEwbtAzgGdw # Transcripts ## Description The **Transcripts tab** lets instructors and admins review **real conversations** between learners and a specific mentorAI—with **context and cost metrics**. See summary KPIs (messages per conversation, average rating, estimated cost), search by topic or user, and open any session to read the **exact learner inputs and mentor replies**. Use these insights to improve instruction and tune your mentor’s **prompts, datasets, and tools**. --- ## Target Audience **Instructor · Administrator** --- ## Features #### Session KPIs at a Glance Headline metrics for the selected period: - **Avg messages per conversation** - **Avg conversation cost** - **Avg rating** #### Global Counts Totals for **conversations**, **user queries**, and **assistant responses** in the time window. #### Search & Filter Find transcripts by **topic** (e.g., “mitosis”) or by **user** when you need to check on a particular learner. #### Per-Conversation Details For each transcript, view: - **User name/username** (or *Anonymous*) - **Mentor name** - **LLM model** - **Messages exchanged** - **Estimated cost** - **Timestamp** #### Full Conversation Viewer Read the **exact learner questions** and the **mentor’s responses** to evaluate **clarity, tone, and accuracy**. #### Exports Available Elsewhere Download conversation data from **Data Reports → Chat History** when you need spreadsheets or BI analysis; quick viewing is also available in **Chat History**. --- ## How to Use (step by step) #### Open Analytics → Transcripts - In the mentor header, click **Analytics**, then select the **Transcripts tab**. #### Review Summary Metrics - Scan **avg messages/conversation**, **avg cost**, and **avg rating** to gauge conversation quality and efficiency. #### Search by Topic or User - Use the **search bar** to locate transcripts about a specific concept or for a specific learner you want to check in on. #### Inspect Global Counts - Note totals for **conversations**, **user queries**, and **assistant responses** to understand overall load and activity. #### Open a Transcript Click any session to view: - **User identity** (or *Anonymous*) and **username** (if login is required) - **Mentor**, **LLM model**, **messages exchanged**, **estimated cost**, and **timestamp** - The **exact Q&A exchange** between learner and mentor #### Decide Follow-Ups - If you see **confusion** or **low ratings**, adjust: - **Prompts** - **Datasets** - **Tools** (e.g., *Web Search*, *Code Interpreter*) - Reach out to specific learners based on what you observe. #### Export if Needed - Go to **Data Reports → Chat History** to download **CSVs** for deeper analysis or archival. --- ## Pedagogical Use Cases #### Identify Knowledge Gaps Spot recurring misunderstandings and create **targeted reviews, examples, or mini-lessons**. #### Quality & Tone Assurance Ensure responses are **accurate, on-brand, and student-friendly**; refine the **System Prompt** where needed. #### Support at the Right Time Use **user-level transcripts** to reach out with **resources or office-hour invitations**. #### Measure Impact of Changes After updating **prompts**, **datasets**, or **tools**, compare new transcripts’ **ratings**, **message lengths**, and **costs**. #### Assessment & Curriculum Tuning Topics that dominate transcripts may indicate where **lecture materials or assignments** need clarification. --- Use **Transcripts** to move beyond surface metrics—**read the conversations themselves**, understand learner needs, and continuously improve both your **teaching** and your **mentorAI**. ### Transcript mentorAI | Transcript Analytics https://www.youtube.com/watch?v=AEwbtAzgGdw [1.9s] As instructors and administrators, [4.3s] you'll want to have insights on how your [6.2s] mentor is being used with learners and [8.3s] even have an insight on, you know, the [10.2s] conversations that they're having. As [12.6s] part of the mentor AI platform, you do [14.6s] have this analytics tab. [17.0s] And this transcripts tab right here is [19.0s] actually going to be great for getting [20.5s] this insight. [23.5s] So, what you'll see once you're on the [25.2s] transcripts tab is a little, you know, [27.7s] summary of the number of messages per [30.0s] conversation, the financial costs of [32.7s] these conversations, and the average [34.8s] rating of uh, you know, the [36.5s] conversations that the the learners are [38.4s] having with the mentor. Now, you can [41.1s] search for topics in specific users. If [43.5s] you're looking for um you know a [44.7s] particular topic you think students [46.2s] might be struggling with or particular [48.3s] user you know might be struggling with a [50.2s] certain part of the course or you just [52.2s] kind of want to check in to see how [53.4s] they're you know handling the course [55.0s] content. [57.0s] Now you can also see a total number of [59.0s] conversations how many user queries [60.8s] there have been um times you know we've [63.5s] had assistant responses and when you get [66.6s] down to this part these are going to be [68.1s] the actual transcripts that you see um [70.7s] from individual users. Now, if we click [73.0s] on one of these, [75.5s] we can see that this is the user's [78.3s] input. You'll see here the name of the [80.8s] user if it's not anonymous. Um, if there [83.0s] is a username, again, if the mentor, you [85.4s] know, requires you to log in, that's [86.9s] when you'll see this username piece [88.2s] filled out. And you'll even see what the [89.8s] mentor name is and the specific LLM [91.9s] model. Another really cool thing here is [94.0s] that when you select this one, you can [95.8s] also see the number of messages [97.2s] exchanged and the estimated cost as well [100.4s] as when this conversation happened. And [103.1s] now on the transcript side, you can [105.0s] actually review exactly what the mentor [107.8s] responded with um and what questions the [110.1s] learner was asking. This is also [112.1s] available um on the chat history tab as [114.6s] you probably have seen before. And you [116.3s] also have the data reports tab here that [118.3s] allows you to download the chat history. [120.4s] This page here is going to just give you [122.2s] a little bit more insight uh again with [124.2s] like the financial costs, the average [125.9s] count of messages um per you know [129.2s] session and you know any additional [132.0s] information um like the LLM model and so [134.6s] forth. And that's how you can use the [137.0s] transcripts tab to really enhance you [138.7s] know your teaching or even enhance your [140.8s] mentor so that it responds better. This [143.1s] also gives you insights like I mentioned [144.9s] in into you know what topics are being [146.6s] asked about the most. Um, you know, [148.4s] maybe that helps you understand where [149.7s] knowledge gaps lie. Um, you can also [151.8s] again look for specific users who you [153.8s] think might be struggling and you want [155.4s] to reach out to to assist. This will [157.4s] give you that insight that you're [158.5s] looking for. --- ## Users URL: https://ibl.ai/docs/instructor/analytics/users Video: https://www.youtube.com/watch?v=R8igyis9Oho # Users ## Description The **Users tab** shows who is using a specific mentorAI, how often they access it, and when activity peaks. Filter by time ranges, view **live and historical counts**, scan **access-time heatmaps**, and drill into **per-user details** (messages sent and last active) to guide outreach and support. --- ## Target Audience **Instructor · Administrator** --- ## Features #### Live & Historical Counts View **currently logged-in users**, users active in the **last 30 days**, and the **total registered users** for the mentor. #### Flexible Time Filters Switch between **Today**, **Past 7 Days**, **Past Month**, or a **Custom Date Range**. All tables and charts automatically update to match your selection. #### Active Users Over Time Track how many **unique users** engaged with the mentor within the selected period. #### Access Times Heatmap See a **time-of-day/day-of-week grid** where darker shades indicate higher user activity—helping you quickly spot **peak hours**. #### User Details Table For each user, view: - **Name/Username** - **Messages sent** (specific to this mentor) - **Last active timestamp** --- ## How to Use (step by step) #### Open Analytics → Users - In the mentor header, click the **Analytics** button. - Select the **Users tab**. #### Set Your Timeframe - Choose **Today**, **7 Days**, **Past Month**, or **Custom**. - All counts, charts, and tables refresh to match your selection. #### Check Topline Counts - Review the **currently logged-in**, **last-30-days**, and **total registered users** metrics to gauge adoption and growth. #### Scan the Active-Users Trend - Use the chart to observe **growth or decline** in unique users over time. #### Study the Access Times Heatmap - Identify **peak access windows** (darkest cells) to schedule announcements, office hours, or reminders when most students are online. #### Drill into User Details - Sort or filter the table to find users with: - High message volume - Recent activity - Inactivity for targeted follow-up --- ## Pedagogical Use Cases #### Targeted Outreach Filter for **inactive or low-activity learners** and send personalized reminders or learning resources. #### Scheduling Support Use the **heatmap** to time **office hours** or **content releases** when most learners are active. #### Cohort Health Check Track **30-day activity** and **active-user trends** to confirm adoption following new mentor launches or updates. #### Intervention Prioritization Combine **messages sent** and **last active** to identify learners who need help (e.g., those with high activity but low success rates or long inactivity). #### Reporting & Stakeholder Updates Export or share **clean user counts** and **peak-time visualizations** to demonstrate engagement and participation patterns. --- The **Users tab** helps you understand **who is engaging**, **when they engage**, and **how often they interact**—so you can deliver **timely, data-driven support** to your learners. ### Transcript mentorAI | User Analytics https://www.youtube.com/watch?v=R8igyis9Oho [1.4s] In a previous video, we discussed how to [3.8s] find analytics as administrators or [6.7s] instructors for a particular mentor and [9.1s] how the overview page on on the [11.1s] analytics tab works. Now, we're going to [13.3s] dive into the next tab, which is users. [16.7s] So, again, you'll click on this little [18.3s] analytics button here, and then we'll [21.0s] click on the users tab. This is very [23.8s] similar in terms of being able to filter [26.1s] um you know based on certain time frames [27.7s] for these tables and you'll get a little [30.0s] bit more insight about how the users are [31.9s] actually using the platform in terms of [34.4s] you know how often they've accessed and [36.1s] who those users are. So you can see [38.2s] right now we've got about two users [40.5s] logged in, how many have logged in over [42.6s] the past 30 days and the total number of [45.1s] registered users uh according to this [47.4s] you know mentor here. Um, so you'll [50.2s] again be able to filter by, you know, [52.0s] the a specific time frame for these [53.8s] tables here. You'll be able to see an [55.6s] active users count within a a specified [59.0s] time frame. So that could be today, uh, [61.0s] within the past week, within the past [62.8s] month, or even a custom time frame. Now, [66.1s] this access times table is going to show [67.6s] you when your mentor is most accessed [69.5s] or, you know, when people are accessing [71.2s] it. You can see here that darker shades [73.8s] of blue are going to indicate more users [75.9s] during a specific time frame. And again, [78.2s] this is something that you can filter by [79.5s] time frame. [81.4s] Now, we also have user details down [83.4s] here, and I'm just going to show you [85.0s] those. Again, filtering by time frame, [87.4s] you'll be able to see who that user is, [88.8s] what their username is, and how many [90.6s] messages they have sent for this [92.4s] specific mentor in addition to the last [94.5s] time that they were active. And that [96.9s] really covers all that's on the users [98.3s] tab. --- ## Auto-Retraining Datasets URL: https://ibl.ai/docs/instructor/data/autoretrainingdatasets Video: https://www.youtube.com/watch?v=-kopLzUyExI # Auto-Retraining Datasets ## Description **Auto-Retraining Datasets** keeps **URL-based sources** in your mentorAI up to date—automatically. For any dataset that points to a **live web page** (e.g., documentation sites, LMS pages, LinkedIn articles), you can schedule periodic checks. If the page changes, the mentor re-trains on the new content at your chosen interval—**no manual uploads required**. --- ## Target Audience **Instructor · Administrator** --- ## Features #### Per-URL Scheduling Set retraining intervals for **individual URL resources** directly from the **Datasets tab**. #### Preset & Custom Intervals Choose from **Daily**, **Weekly**, or **Monthly**, or set a **custom every-N-days** cadence (e.g., every 2 or 4 days). #### Change-Aware Updates At each interval, mentorAI checks the page; if it’s been updated, the dataset is **automatically re-trained** on the latest content. #### Status Feedback After saving, a confirmation appears (e.g., *“Retrain interval successfully updated”*) to verify your schedule has been applied. --- ## How to Use (step by step) #### Open Datasets 1. In the mentor header, open the dropdown → **Datasets**. 2. You’ll see your dataset list, including any **URL sources**. #### Pick a URL Resource - Locate the web resource you want to keep current (e.g., a documentation page, LMS module, or LinkedIn article). #### Set the Retrain Interval 1. Click the **interval/clock icon** next to that URL. 2. Select a preset (**Daily**, **Weekly**, **Monthly**) or enter a custom **every-N-days** value. 3. Click **Save**—you should see a success message confirming the schedule. #### Let mentorAI Auto-Update - On the chosen cadence, the platform checks the page. - If the content has changed, mentorAI **re-trains automatically** on the updated version. #### Adjust Anytime - Reopen the **interval control** to modify or remove the schedule as your content cadence evolves. --- ## Pedagogical Use Cases #### Living Syllabi & Course Pages Keep mentors aligned with **frequently updated syllabi, schedules, or module content** without manual refreshes. #### Policy & Procedure Updates Ensure the latest **academic integrity**, **financial aid**, or **IT policies** are reflected in mentor responses automatically. #### Research & Reading Lists Track **evolving bibliographies** or **lab protocols** hosted on departmental or institutional websites. #### Faculty Blogs & Announcements Auto-ingest new **guidance, examples, or announcements** posted publicly (e.g., faculty blogs, LinkedIn updates). #### Program Documentation Keep **onboarding guides and how-tos** updated each term without repeated uploads. --- With **Auto-Retraining Datasets**, your mentor stays synchronized with **changing web sources**—so learners always receive answers grounded in the **latest, most accurate materials**. ### Transcript mentorAI | URL Dataset Auto-Retraining https://www.youtube.com/watch?v=-kopLzUyExI [0.1s] how we can autoupdate data sets that are [3.0s] URL based. So in this case of mentor AI, [6.2s] I'm going to go to the data sets tab and [8.2s] you can see we have quite a few URLs [10.0s] here. Now let's say that you update your [13.0s] documentation or there are updates to an [15.8s] article you wrote and posted on [17.4s] LinkedIn. Any set sort of URL that could [20.5s] be updated for whatever reason um that [23.8s] is a part of your data sets. One way we [26.2s] can autoupdate that uh data set to [29.2s] incorporate new changes is to click this [31.8s] little interval icon, this little clock. [34.4s] Now you can see here you have a couple [36.1s] of different options. You can retrain at [39.0s] different intervals or you can even set [40.9s] a specific interval you'd like in a day [43.4s] format. So for instance, we have one day [46.1s] a week, monthly, and if you wanted to [47.9s] have it every two days, every four days, [50.2s] so on and so forth, that's what you [51.4s] would put here. So we'll click on daily [53.3s] one and it schedule retraining. [57.4s] You can see this has successfully [59.0s] updated the retrain interval. And so now [61.4s] every one day um if it'll check to see [64.1s] if the the page has been updated and if [66.0s] it has been updated the mentor will now [68.4s] account for that. --- ## Context Awareness URL: https://ibl.ai/docs/instructor/data/contextawareness Video: https://www.youtube.com/watch?v=lJB2SJIPOBU # Context Awareness ## Description The Context Awareness feature embeds mentorAI directly beside any Canvas page or PDF you open. Because the mentor “sees” the material on screen, it can summarize, translate, expand, or answer questions about that exact content—giving you lightning‑fast, page‑specific support without switching tabs. --- ## Target Audience **Instructor** --- ## Features #### Auto‑Detect Page Content The mentor panel loads with full awareness of the Canvas page, slide deck, or PDF you’re viewing. #### Quick‑Action Buttons One‑click prompts like **Summarize key takeaways** speed up common tasks. #### Three Smart Tabs - **Course** – summarize or ask questions about the current page - **Translate** – render the entire page in a chosen language - **Expand** – dive deeper into concepts or slides for richer understanding #### Suggested vs. Guided Prompts - **Suggested prompts (top)** = ready‑made starters - **Guided prompts (bottom)** = generated from your ongoing chat to nudge learning forward #### Seamless Chat Continuation Type any question at any time; mentorAI responds in the **context of the page** you’re on. --- ## How to Use (step by step) #### Open a Course Page - In **Canvas**, navigate to any **page or PDF** - The mentor panel appears on the side—already **context‑aware** of what you’re viewing #### Use the Course Tab - Click a quick‑action button such as **Summarize key takeaways** - mentorAI reads the current page and returns a **concise summary** - Prefer typing? Skip the buttons and enter your own question #### Translate the Page - Switch to the **Translate** tab - Choose a **target language** (e.g., French) - mentorAI **rewrites the entire page** in that language—great for multilingual learners #### Expand the Content - Open the **Expand** tab - mentorAI delivers a **deeper explanation** of the selected material (e.g., a “Colliding Cultures” slide deck) - Continue chatting or click another quick prompt to explore further #### Leverage Suggested & Guided Prompts - **Suggested prompts** at the top provide instant ideas if you’re unsure what to ask - **Guided prompts** at the bottom evolve from your conversation, nudging you toward the next logical question - Click any prompt—or just type your own—and mentorAI answers in **full context** --- ## Pedagogical Use Cases #### Efficient Reading & Note‑Taking Generate **summaries of dense textbook pages** to grasp key points quickly. #### Language Support **Translate lecture notes** into a preferred language for better comprehension. #### Concept Deep‑Dives Use the **Expand** tab to get **richer explanations** of complex diagrams or historical documents. #### Active Study Sessions Ask follow‑up questions prompted by **Guided prompts**— > “Explain how this relates to the next chapter,” etc.—to connect ideas. #### Accessibility & Inclusive Learning Students with different language backgrounds or learning preferences receive **page‑specific help** without extra tools. --- With **Context Awareness**, mentorAI meets you **right where you’re studying**—summarizing, translating, and expanding the exact content on your screen so you can learn **faster and more effectively**. ### Transcript mentorAI | Canvas | Chat, Summary, Expand and Translate https://www.youtube.com/watch?v=lJB2SJIPOBU [0.2s] is already integrated on the side. So [2.2s] this mentor is context aware. So [4.2s] whatever you know content is inside [6.5s] these pages within the course the mentor [8.8s] can actually answer about. So we'll [10.0s] select the first one [16.2s] here. And once we have the PDF open for [19.7s] this we can start asking the mentor [21.3s] questions about it. And again since this [23.0s] is contextaware we can uh ask it [25.4s] anything about the content on this [26.9s] specific page. In this case, I'm going [28.7s] to use one of the pre-selected options [30.6s] in these little buttons. So, I'll hit [32.6s] summarize key [35.2s] takeaways. And what this is going to do [37.2s] is evaluate the contents of this PDF and [40.1s] provide you [41.2s] with information that summarizes the [44.6s] page. So now, if we want to summarize [47.3s] the entirety of this specific [49.6s] page, we can get that information as [52.0s] well. Um, again, that first tab there [54.6s] had a couple of different options. You [56.0s] can also chat with the mentor directly [57.5s] without having to select a prompt. And [59.4s] you can see here this summarizes the [61.0s] page. Now, if we wanted to translate the [63.3s] page, uh there are a couple different [65.1s] options we can translate from. I'm going [66.7s] to select French [68.0s] here. And what this is going to do is [70.0s] take the contents of the page and [71.4s] translate it into French. This might be [73.1s] helpful for, you know, users who are not [75.9s] um speaking English as their first [77.4s] language. Um and or if they're taking an [80.6s] English or a different language class [82.5s] and they don't really understand [83.5s] materials, they can translate it into [84.9s] English. so on and so [86.4s] forth. Now, after this, we can go to the [89.2s] expand tab. And what this is going to do [90.7s] is expand on the materials that we we [93.0s] find in this course tab. Um, so in this [96.4s] case, we're going to be expanding upon [98.5s] uh this colliding cultures deck that we [101.0s] have here. And you can see that this is [104.5s] providing an expanded version of what [106.2s] we're learning. [109.9s] Now again, we can also chat with the [112.2s] mentor um as a normal learner would [114.8s] without having to select one of these [116.1s] prompts. You can also use a suggested [120.0s] prompt. Um you can select from these [122.5s] suggested prompts which are essentially [123.8s] prompts that have been pre-created that [125.8s] the user can select from um in case they [128.4s] don't know exactly where to [130.5s] start. And here you can see that they've [132.9s] provided an answer from one of those [134.1s] suggested prompts. [138.5s] And then if we want to go down to the [140.2s] end here, there's also what we call [141.6s] guided prompts. These are generated [143.5s] based on the conversation history. Um, [145.5s] we can also select one of these and we [148.6s] can get an answer back essentially the [150.1s] same way we do with these other [153.6s] tabs. All righty. And that's basically [156.2s] how this mentor integration works in [157.8s] Canvas. --- ## Datasets URL: https://ibl.ai/docs/instructor/data/datasets Video: https://www.youtube.com/watch?v=kvhDpVdRAOw # Datasets ## Description The Datasets tab is where you curate the knowledge base for a mentorAI. Upload files, link cloud resources, toggle whether content is active for training, and decide if learners can view the original sources. This central dashboard gives you full control over what the mentor “knows” and what students can see. ![](/images/datasets.png) --- ## Target Audience **Instructor** --- ## Features #### Comprehensive Dataset Table Columns show **Name**, **Type** (file, URL, etc.), **Tokens** (size contribution), **Storage Location**, and **Status** (training progress). #### Training Toggle Switch a resource **On** to include it in the mentor’s knowledge base, or **Off** to exclude it without deleting. #### Visibility Toggle Decide whether learners can open the source in the **Retrieved Documents** side panel (**On**) or keep it hidden (**Off**). #### Flexible Resource Types Accepts direct uploads (**PDF, PPT, images, video**) and cloud links from **Google Drive, Dropbox, OneDrive, YouTube**, and standard URLs. #### Easy Deletion & Re‑Upload Remove outdated items with a **trash‑can icon**, or add fresh resources anytime with **Add Resource**. --- ## How to Use (step by step) #### Open the Datasets Tab - Click the **mentor name dropdown** - Select **Datasets** from the settings list #### Review the Dataset List - Scan the columns: **Name, Type, Tokens, Storage Location, Status** - **Status** shows if training is complete (large files may take longer) #### Manage Training Status - Flip the **Training** switch: - **On** → resource is active - **Off** → resource is ignored (kept for later reactivation) #### Control Learner Visibility - Toggle **Visible**: - **On** → learners see and can open the source - **Off** → mentor uses it, but learners can’t access it #### Delete a Resource (if necessary) - Click the **trash‑can icon** next to an entry to remove it entirely #### Add New Resources - Click **Add Resource** - Choose a source type: - File upload (PDF, PPT, image, video…) - Cloud link (Google Drive, Dropbox, OneDrive) - YouTube or web URL - Connect to your cloud account if prompted, pick the files, or paste a URL - The new item appears in the list and begins training automatically #### Monitor Training - Watch the **Status** column; wait until it shows **Trained** before relying on the content --- ## Pedagogical Use Cases #### Curriculum Updates Toggle **Off** outdated lecture slides while adding new editions—keeping the mentor current without losing historical material. #### Selective Source Sharing Hide proprietary PDFs but keep public articles **Visible** so learners can read primary sources. #### Thematic Mentors Upload domain‑specific datasets (e.g., nursing guidelines, legal cases) to create specialized tutors. #### Iterative Course Development Add draft materials as **Training Off** for internal testing, then flip **On** when finalized. #### Compliance & Quality Control Regularly review the dataset table to ensure only vetted, up‑to‑date documents feed the mentor’s answers. --- With the **Datasets tab**, you have precise, granular control over the mentor’s knowledge and learner access—ensuring **accurate guidance** and **transparent sourcing** throughout the course. ### Transcript mentorAI | Datasets https://www.youtube.com/watch?v=kvhDpVdRAOw [0.6s] As an instructor on the mentor AI [2.5s] platform, you have the ability to define [4.2s] the knowledge base that your mentor is [5.8s] trained on. Now, to see what this looks [7.8s] like, you'll select the dropown where [9.3s] you see the mentor name. This will show [11.0s] you a list of different settings that [12.6s] you can edit, make adjustments to, so on [14.6s] and so forth. You have the data sets tab [17.2s] here, which is where you'll upload your [18.7s] knowledge base. You can see that we've [21.2s] already trained this specific mentor on [22.7s] a few different things. Um, you'll get [24.6s] the name, the type of file it is, if [26.7s] it's a file or if it's a URL, things [28.2s] like that. how many tokens this takes uh [31.0s] to train the mentor. Um where that [34.0s] content is going to be stored. So if [36.2s] it's, you know, a file that you upload, [37.8s] it'll be stored in our system, but you [39.2s] can also have links to other various [41.3s] sources. You'll also have a status [43.3s] column that'll let you know if it's been [45.0s] trained on a certain material. Sometimes [46.9s] it takes a little bit longer for bigger [48.6s] files to be trained. So you'll be able [50.2s] to keep track of the status there. Now, [52.6s] this training toggle that you see right [54.2s] here is going to be the toggle that says [56.9s] whether or not this document is actually [58.7s] a part of that knowledge base. So, you [60.3s] might upload a document and later it [62.3s] might be outdated or you might have, you [64.6s] know, something better that you'd like [66.2s] to replace it with. Instead of just [67.8s] simply deleting it, you can turn it off. [69.9s] And if you wanted to turn it back on [71.2s] later, you definitely can by just [72.7s] selecting that toggle. Now, this visible [75.0s] setting right here is going to indicate [77.5s] whether or not that material is [79.4s] accessible to learners as a resource [81.6s] when they're asking questions. So, if [83.5s] since you can see this one's toggled [85.0s] off, if a mentor or a learner asks the [87.4s] mentor a question specifically about [88.9s] this file, it'll respond based off the [90.8s] knowledge base, but won't provide that [92.9s] as a resource the learner can access. [95.3s] Whereas these visible ones here, there [97.0s] will be a panel that pops up that'll [98.8s] show um exactly where the response is [101.9s] coming from. Um, if you have this marked [104.0s] as visible, you also can delete um the [107.2s] resource if needed with this little [109.1s] trash can icon. There's also an add [111.0s] resource button. So, this is how you'll [112.5s] add new materials. [114.6s] And you'll see that you have a variety [116.1s] of different uh options here. So, you [118.2s] can have PowerPoint, Google Drive, [120.2s] YouTube, various different file types, [123.0s] um even images and video. If you use one [126.3s] of the options like Google Drive or [127.7s] Dropbox or One Drive, then you'll just [129.9s] simply connect your account uh through [132.5s] this this platform here and be able to [135.4s] select your materials from those those [137.5s] sources. Now, otherwise uh you'll just [140.0s] upload or add a link and it's as simple [142.9s] as that. So, if you wanted to see what [144.4s] that looks like, for instance, YouTube [146.1s] will be a URL whereas PowerPoint will be [149.4s] a file that you can select from. [152.2s] And that's basically how you train your [154.1s] your mentor on its knowledge base. --- ## Data Sources URL: https://ibl.ai/docs/instructor/data/datasources Video: https://www.youtube.com/watch?v=xL_HcXuyGeo # Data Sources ## Description Dataset Visibility in mentorAI lets instructors control whether learners can see and open the exact resources (“datasets”) that the mentor used to generate an answer. When visibility is enabled, a Retrieved Documents side panel appears in chat so learners can explore the source material themselves. When visibility is disabled, mentorAI still uses the dataset behind the scenes, but the source files remain hidden—useful when you want the AI to draw on proprietary, assessment, or advance material without revealing it. ![](/images/datasources.png) --- ## Target Audience **Student** --- ## Features #### Learner Transparency Toggle Decide, per dataset, whether students may open the exact resources mentorAI retrieved. #### Retrieved Documents Side Panel When visibility is on, chat answers are accompanied by a clickable list of source files so learners can read, cite, and verify the material. #### Non‑Destructive Control Toggling visibility does not remove the dataset from mentorAI’s training; it only controls whether learners can access the documents. #### One‑Click Icon Interface An eye icon (👁️ = visible, 👁️‍🗨️ = hidden) in the Datasets tab makes it effortless to turn visibility on or off. #### Automatic Retraining (if needed) When a dataset is made visible again after being hidden, mentorAI seamlessly retrains on that content to ensure up‑to‑date retrieval. --- ## How to Use (step by step) #### Open Settings In your mentor admin view, click **Settings**. #### Select the Datasets Tab You’ll see a table of every resource collection used to train this mentor. #### Locate the Desired Dataset Scroll or search to find the dataset whose visibility you want to adjust. #### Check the Eye Icon - 👁️ (no slash) = learners currently see this dataset in the side panel. - 👁️‍🗨️ (with slash) = learners cannot open this dataset. #### Toggle Visibility Click the eye icon to switch states. - Turning off (👁️ → 👁️‍🗨️) hides the resource from students; mentorAI still uses it to answer questions. - Turning on (👁️‍🗨️ → 👁️) reveals the resource and retrains the mentor if necessary. > **Note:** Visibility only affects learner access. The dataset remains in the mentor’s knowledge base unless you explicitly remove it. --- ## Pedagogical Use Cases #### Source Transparency & Citation Practice Enable visibility so learners can open primary sources, encouraging proper citation and critical evaluation of evidence. #### Scaffolded Learning Paths Start courses with visibility off to prevent information overload; toggle on later to let advanced students explore deeper materials. #### Controlled Assessment Support Keep answer keys or formative‑assessment rubrics hidden while still letting mentorAI reference them to provide feedback. #### Encouraging Independent Research By showing retrieved documents, you prompt learners to read beyond the AI’s summary, fostering information‑literacy skills. #### Selective Disclosure of Proprietary Content Hide internal documents from external cohorts while maintaining the mentor’s ability to leverage that expertise. --- ## Quick Reference - **Visible resources:** learners see and can open them in the Retrieved Documents panel. - **Hidden resources:** mentorAI still uses them, but learners cannot access the files. ### Transcript mentorAI | Resource Visibility https://www.youtube.com/watch?v=xL_HcXuyGeo [0.2s] which data sets that your mentor has [2.1s] been trained on are visible to learners. [4.5s] So what this basically means is if a [6.8s] learner asks a question that can be [9.0s] found in the resource materials that the [10.5s] mentor has been trained on, you can [12.0s] decide whether or not the me the user [14.2s] can actually view that as a source. So [16.3s] this would be an example. So, we're [17.8s] going to ask the mentor a question. [20.9s] And once it responds, what it's going to [22.5s] do is populate a response, but also on [25.6s] the side panel, it's going to show [27.0s] resources that are accessible to the [29.4s] learner. Here, that the mentors actually [31.7s] responded, but also showing you the [33.7s] retrieved documents. So, these are [35.6s] actually materials that the mentor has [37.0s] been trained on, and they can be clicked [39.8s] on and viewed by the learner as an [42.6s] additional resource in to supplement the [44.9s] response from the mentor. Now, if you [47.4s] want to enable this, what you would do [49.2s] is go to the data sets tab under the [51.3s] settings. And you'll see this is what's [53.9s] been marked as visible um versus what's [56.6s] not been marked as visible. Uh if you [59.2s] see this little eye icon without a slash [61.7s] through it, that means the user or the [63.8s] learner can access the resource the [65.7s] mentor has been trained on. If you did [67.4s] not want the learner for any reason to [69.1s] be able to access the the resources, [71.5s] then you would just simply click this [73.7s] and you'll see that it has been toggled [75.7s] off. [78.2s] And then you can toggle it back on to [79.9s] retrain it and also make it visible to [81.8s] learners like this. You can also have [83.2s] this be turned on as a trained resource [86.3s] without having it marked as visible. --- ## Document Retrieval URL: https://ibl.ai/docs/instructor/data/documentretrieval Video: https://www.youtube.com/watch?v=shdYfSObDS8 # Document Retrieval ## Description **Document Retrieval** makes every mentorAI answer **transparent and verifiable**. When a learner asks a question, the mentor: 1. **Cites** the exact source inside the reply (e.g., *“Lecture 11 — Slides 35–36”*) 2. **Shows** a dynamic **Source Panel** with the documents it used, ranked by relevance 3. **Lets users open** any listed file with one click to read the full context Administrators control which materials can be shown by toggling each file’s **Visible** switch in the **Datasets** table—**no retraining required**. --- ## Target Audience **Student · Instructor · Administrator** --- ## Features #### Inline Citations in Answers Replies reference the exact lecture, slide, or page used (e.g., *Lecture 11 — Slides 35–36*). #### Source Panel (Ranked by Relevance) A live panel lists retrieved documents for that specific answer and updates as the conversation continues. #### One-Click Source Opening Learners can open any listed file to read supporting context immediately. #### Admin Visibility Controls Per-file **Visible** toggles determine which sources can be shown/cited in the panel—without removing them from training. #### Works at Scale Handles large training sets; sources are still ranked and cited for each response. #### Guided-Prompt Friendly Use guided prompts to kick off a conversation when learners aren’t sure where to start. --- ## How to Use (step by step) #### Ask a Question in Chat - Example: > “What are key epidemiological study designs?” - mentorAI reads your query, searches trained resources, and composes an answer with inline citations - Example citation: > “These study designs are discussed in Lecture 2.10 of Prof. Quinlan’s course.” #### Review the Source Panel - The panel displays the documents used, ranked by relevance (often with a confidence/percentage indicator) - Click any source to open the original and read more #### Ask Follow-Ups (Panel Updates Automatically) - Example: > “Can you explain case-control studies in detail?” - The Source Panel refreshes to show the most relevant documents for the new question and cites them in the reply (e.g., *“See Lecture 11, Slides 35–36”*) #### Open & Read Sources - Select a listed document (lecture, slide deck, PDF) to view full context and deepen understanding #### (Admin) Control Visibility of Sources - Go to **Settings → Datasets** for the mentor - Use the **eye icon** in the Visible column to show or hide individual files: - **Visible On** → learners can see/click the source in the panel - **Visible Off** → the mentor can still use the file for answers, but it won’t appear in the panel - Changes apply instantly; **no retraining is required** --- ## Pedagogical Use Cases #### Transparent, Citable Answers Teach students to verify claims and cite original materials—great for research and academic integrity. #### Guided Reading & Deep Dives Learners jump straight from an answer to the exact slide/page for fuller context. #### Instructor QA & Content Gaps Instructors can confirm the mentor cites the right sources and spot where additional materials are needed. #### Assessment Support Link explanations to specific readings so students revisit core texts before quizzes or exams. #### Scaffolded Disclosure Keep some documents hidden (**Visible Off**) for assessments or proprietary content, while still letting the mentor draw on them to answer. --- ### Transcript mentorAI | Document Retrieval Feature Demo https://www.youtube.com/watch?v=shdYfSObDS8 [0.2s] Hey, what's going on? This is Jeremy at [1.8s] IBLAI. Today I'm showing you one of our [3.9s] coolest features, the document retrieval [6.2s] feature. Now what this means is we have [10.1s] this feature that allows the mentors to [12.2s] actually site directly where they're [13.7s] pulling information from as well as [16.5s] sharing the specific documents that it [18.6s] pulled information from so that users [21.0s] can then go directly into those [22.7s] documents if they'd like to I don't know [25.8s] get uh more fuller context dive deeper [28.6s] into the information or content that [30.6s] they're working with uh etc. So in order [34.0s] to show this I am showing you this [37.4s] mentor that we've created with the [39.3s] George Washington University. Uh the [42.1s] training data that the mentor is using [44.2s] to communicate with students is quite [46.9s] vast as you can see by these six pages [49.2s] of uh training documents. But the [52.5s] important thing I want to call your [53.6s] attention to is this column right here [56.2s] visible. You see these little eye icons [59.6s] right throughout here. What this means [62.3s] is that while they are enabled as [65.9s] visible, the mentor can actually [68.7s] directly site each one of these [70.3s] documents within its responses to users [73.5s] uh who are communicating with this [75.4s] mentor. And that can be toggled on or [78.1s] off at any time very easily just by [80.5s] selecting this button. [83.0s] So, I'm going to leave that on for now, [84.8s] but just so you can see the actual [87.0s] function in action, that's how we get [90.4s] this uh this visibility to work. [94.4s] So, in order to show you what that looks [97.4s] like in conversation, uh we have a [100.1s] couple of, you know, potential guided [102.6s] prompts down here that students can use [105.1s] uh in order to kick off a conversation [107.6s] if they know that they need a little bit [109.0s] of assistance but don't really know [110.4s] where to start. It's also a very easy [112.6s] way to kick off a conversation to show [114.8s] you the features that we uh are very [117.1s] proud of. So, I'm going to ask what are [119.8s] key epidemiological study designs and [123.4s] the mentor is going to take a second to [126.2s] digest the question that I've shared [127.9s] with it. Uh pull from the resources that [130.8s] it's been trained on in order to compile [132.7s] a response and then is going to actually [136.2s] directly site where it's getting that [137.9s] information. So for this particular [140.6s] answer that the mentor is giving me, [142.3s] these study designs are discussed in [144.2s] lecture 10 of Professor Quinnland's [147.2s] course. So I'm going to ask, can you [150.5s] explain case control studies in detail? [153.8s] And as I'm doing that, you'll see that [155.4s] the mentor has also pulled up the [157.8s] specific documents that it used to [160.0s] create this question so that students [162.3s] can see with a percentage of accuracy [164.6s] where most of this information is coming [166.6s] from and which course specific uh [170.1s] content they'll need to pull up in order [172.4s] to dive deeper into that information. [175.4s] and it's going to continuously update as [177.8s] we ask new questions uh to share you [181.3s] know the specific documents that it is [183.8s] using to put together its responses for [186.3s] each response and again within the [188.8s] response itself for more information you [191.1s] can refer to lecture 11 slides 35 and 36 [194.6s] from professor Quinnland's course that's [197.1s] a very quick shot at the retrieve [199.7s] documents feature thank you so much for [201.4s] tuning in and as always learn more at [203.8s] ibl.ai AI. --- ## Automated Document Screening URL: https://ibl.ai/docs/instructor/data/documentscreening Video: https://www.youtube.com/watch?v=4IjsTNUv4ps # Automated Document Screening ## Purpose Use a mentor as an automated document screener that evaluates uploaded files against custom criteria and returns a pass/fail result with a detailed breakdown. --- ## Define Screening Criteria 1. Open the mentor's **Prompts** tab. 2. In the system instructions, specify: - The criteria a document must meet (e.g., name present, years of experience, required skills) - Whether **all** criteria must pass or just some - The output format for a pass (e.g., a table with a check icon) - The output format for a fail (e.g., a bold red fail icon with a table of unmet criteria) ### Example Criteria - Document contains the **name of an individual** - Individual has at least **two years of experience** as a programmer - Individual has experience with **Python and JavaScript** - All three must be met for a pass --- ## Using the Canvas Side Panel 1. Enable **Canvas view** on the mentor to display results in a side panel. 2. Upload a document (PDF, DOCX, etc.) in the chat. 3. The mentor analyzes the document against your criteria and renders the result in the side panel. --- ## Example Results **Fail** (uploading a receipt instead of a resume): - Name of individual: No - Two years experience: No - Python & JavaScript: No - Result: **FAIL** **Pass** (uploading a qualifying resume): - Name of individual: Yes - Two years experience: Yes - Python & JavaScript: Yes - Result: **PASS** --- ## Customization Options - Results can appear in the **side panel** (Canvas view) or the **regular chat frame** - You can send a **custom prompt** alongside the document upload - Adjust criteria, output formatting, and pass/fail thresholds entirely through the system prompt --- ## Embed URL: https://ibl.ai/docs/instructor/data/embed Video: https://www.youtube.com/watch?v=DqctkB3xwuM # Embed ## Description The Embed feature lets you place a fully‑functional mentorAI widget on any website or LMS page. Customize its look, choose which tabs it offers, control user‑tracking options, and enable smart extras (context awareness, safety disclaimer, auto‑open). Once configured, a code snippet drops the mentor directly into your site—no separate login required for end users. ![](/images/embed.png) --- ## Target Audience **Instructor** --- ## Features #### Live Preview & Custom CSS See exactly how the mentor will appear and tweak colors, fonts, or brand styling with optional CSS. #### Two Embed Modes - **Default** – single **Chat** tab - **Advanced** – extra tabs: **Chat**, **Summarize**, **Translate**, **Expand** #### Anonymous vs. Trackable Access Allow public use without login, or require user tracking with a **website URL** and **access token**. #### Context Aware Toggle Let the mentor read **on‑page content** (PDFs, articles) and answer in context. #### Safety Disclaimer Add a notice reminding users to verify AI‑generated answers. #### Open by Default Decide if the widget launches open or stays collapsed until clicked. #### One‑Click Code Generation Create the embed snippet (and token if needed) ready to paste into any site or LMS. --- ## How to Use (step by step) #### Locate the Mentor - Sign in to the **mentorAI web app** - Open the desired **mentor’s chat window** - Click the dropdown next to its name and select **Embed** #### Preview & Style - A **live preview** shows the widget - Optionally add **Custom CSS** to match your site’s branding #### Choose the Mode - **Default** → Chat only - **Advanced** → adds **Summarize**, **Translate**, **Expand** tabs #### Configure Access - **Anonymous toggle On** → no login required - **Anonymous toggle Off** → enter **Website URL** and click **Get Token** for user‑tracking #### Enable Extra Options (as needed) - **Context Aware** → mentor reads page content - **Safety Disclaimer** → adds a cautionary note - **Open by Default** → widget starts open on page load - **Icon** → choose a launcher icon #### Generate Embed Code - Review settings, then click **Create Embed** - Copy the **token** (for non‑anonymous embeds) - Copy the **embed code snippet** #### Deploy - Paste the snippet (and token, if required) into your **website or LMS page** - Save and publish. The mentor is live! --- ## Pedagogical Use Cases #### Course‑Page Tutors Embed a mentor on weekly modules so students get in‑context help, summaries, and translations without leaving the LMS. #### Public FAQ Bots Use **Anonymous mode** to place a “Campus Help Desk” mentor on the university website—no login hurdles for visitors. #### Context‑Aware Reading Aids Turn **Context Aware** on for pages hosting PDFs or slide decks; students receive page‑specific explanations and summaries. #### Multilingual Support **Advanced mode’s Translate tab** lets learners switch languages instantly—ideal for international cohorts. #### Safety‑Conscious Deployments Add a **Safety Disclaimer** when embedding mentors that offer medical, legal, or financial guidance to encourage verification. --- With the **Embed** tool, you can deliver **mentorAI exactly where learners need it**—styled to match your site, configured for your privacy requirements, and equipped with the features that best serve your audience. ### Transcript mentorAI | Embed a Mentor https://www.youtube.com/watch?v=DqctkB3xwuM [0.6s] Using the Mentor AI platform is not [2.5s] limited to just our web app, which is [4.6s] what you're seeing here. You can [6.2s] actually embed this anywhere you'd like [8.6s] on any website. And what we're going to [11.5s] do today is show you how that can be [13.3s] done. So once you're in the platform, [16.1s] you're going to find the mentor that you [17.5s] want to embed. Um once you have that, [19.8s] you'll select it. You'll see a chat [21.4s] screen similar to this. And then you'll [23.6s] select this dropdown. Now there's going [26.5s] to be a little button that you'll see [28.3s] that's embed. and you'll click this. [31.4s] And this is going to pop up a panel um [33.5s] where you can customize your embed as [35.8s] you see fit. Um you'll see here that [38.1s] there is a preview of what will be [39.6s] embedded. Um so you can kind of keep an [41.4s] eye on things as you're customizing it. [43.8s] One of the options is including custom [45.7s] CSS to make sure it fits your colors, [47.8s] your branding, fonts, so on and so [49.9s] forth. Then you have your mode [51.8s] selection. Right now you're looking at [53.4s] the default. And then we also have the [55.4s] advanced option which is going to have a [57.5s] couple more tabs that learners can use [59.4s] um while they're you know communicating [61.0s] with the mentor. You have your chat [62.6s] which is the standard uh default version [64.7s] that we previously saw. Summarize is [67.4s] going to take the content and summarize [69.3s] it. Translate will translate this into [71.5s] any language uh that the learner [73.0s] prefers. For instance, if their first [74.6s] language is Spanish but the course is [76.1s] being taught in English, this can help [77.8s] translate into Spanish. And expanding [80.2s] just takes the material that the learner [82.2s] is learning and expanding upon it. Now [86.0s] we have a couple of other options. [87.8s] Anonymous. If we toggle this on, you'll [89.8s] see a couple of things disappear. This [91.8s] means that learners do not need to be [93.5s] logged in to be able to use this mentor. [96.2s] Now, if you have this in LTI format [98.5s] embedded into your LMS somewhere, uh the [101.4s] LTI component will automatically, you [103.9s] know, have a user associated. So, you [106.0s] won't have to worry about this if you're [107.4s] using LTI. [109.2s] Um, if you're not using LTI, then or [112.5s] sorry, if you're not using anonymous [114.0s] mode um or LTI, then what you'll have [116.5s] here is a couple of other additional [118.7s] pieces such as your website URL and [121.0s] getting a token. This will allow you to [123.7s] um put in the URL of where you want this [126.1s] embedded and get a token for those [127.8s] people who are not anonymous. Um, [130.1s] contextaware, if you toggle this on, [133.0s] this is going to actually read the [135.0s] content on the page and be able to [136.5s] answer based on that. So for instance, [138.1s] if a learner is looking at a PDF of the [140.6s] war on of 1812 um but the mentor has not [144.4s] been trained on it, the mentor can [145.7s] actually read that PDF as it's on the [148.2s] screen and respond based on that. Now [151.0s] the safety disclaimer is going to pop up [153.0s] a safety disclaimer that basically just [155.0s] says, you know, be cautious when you're [156.5s] using any answers for AI. And then open [159.0s] by default is going to say whether the [161.9s] not the mentor that you have embedded on [163.7s] a particular page is already open when [165.8s] somebody accesses that page or if it's [168.1s] off. [169.9s] Once you have all your settings [171.4s] completely ready to go and you feel [173.5s] ready to embed this onto a website, what [176.1s] you'll do is hit create embed. [178.8s] You'll get a token if it is marked as uh [182.0s] non-anmous users and then you'll get [183.8s] this embed code that you can copy and [185.4s] paste and implement. And that's how [188.0s] you're going to embed a mentor on a new [189.4s] site. --- ## Google Drive URL: https://ibl.ai/docs/instructor/data/googledrive Video: https://www.youtube.com/watch?v=UYJ8ddtAmBE # Google Drive ## Description **Drive Integration** connects a mentorAI to **Google Docs** and **Google Slides**, enabling it to generate **living documents and slide decks directly from chat**. Instructors enable the tools once, learners authenticate, and the mentor returns a **shareable Google link** with the requested content (e.g., a curriculum gap-analysis template) in the desired format. --- ## Target Audience **Instructor** --- ## Features #### One-Click Tool Toggles Enable **Google Docs** and/or **Google Slides** in the mentor’s **Tools** tab. #### OAuth Authentication A secure **Google sign-in popup** allows learners to link their account for document creation. #### In-Chat Document Creation Ask for a **doc or slide deck**, and mentorAI generates and returns a **Google Drive link** within the chat. #### Format Control via Prompting Output format follows your **System Prompt** (e.g., Markdown or rich text). #### Works for Templates & Content Generate **outlines, rubrics, gap-analysis templates, study guides, or slide summaries** seamlessly. --- ## How to Use (step by step) #### Enable the Tools (Instructor) 1. Open the mentor’s **Settings → Tools**. 2. Toggle **Google Docs** and/or **Google Slides** **On**. #### Authenticate (First Use per User) 1. In chat, click **Google Docs** (or **Google Slides**). 2. Complete the **Google sign-in popup** to authorize creation in your Drive. #### Request the Asset In chat, describe what you need. Example: > “Suggest a template for documenting curriculum gap identifications in a course.” Then ask: > “Create this as a Google Doc and share the link.” #### Receive the Link - mentorAI generates the file and returns a **Google Docs/Slides URL** in chat. - Click to open and continue editing in **Google Workspace**. #### Control Formatting (Optional) In **Prompts → System Prompt**, specify output format preferences. Examples: - “Use rich text headings and tables in the Google Doc.” - “Produce Markdown structure, then render to Google Docs.” #### Iterate - Request **revisions** (add sections, tables, slides). - **Re-run** to update or create new versions as needed. > 💡 Tip: Ensure your **Google Drive sharing settings** (domain-wide vs. specific users) match your intended distribution. --- ## Pedagogical Use Cases #### Curriculum Artifacts Auto-generate **gap-analysis templates**, assignment specifications, or rubrics. #### Student Handouts Produce formatted **study guides or lab sheets** as Docs and share them instantly. #### Lecture Assets Create **Slides** with summaries, objectives, and key diagrams drawn from mentor prompts. #### Collaboration Kick-Off Spin up **draft documents for group projects**, allowing teams to refine content directly in Drive. #### Rapid Prototyping Iterate on **syllabus changes, checklists, or assessment blueprints** without leaving chat. --- With **Drive Integration**, your mentor can generate **polished Google Docs and Slides on demand**, streamlining course authoring and placing ready-to-edit files directly at your fingertips. ### Transcript mentorAI | Google Drive https://www.youtube.com/watch?v=UYJ8ddtAmBE [1.7s] Hi everybody. I'm going to be showing [3.5s] you one of our latest integrations um [6.4s] that can be enabled with your mentors. [9.0s] Now for this mentor, what we're going to [10.7s] do is go to this tools tab under the [13.7s] mentor settings. We have both Google [16.4s] Docs and Google Slides now enabled um [19.6s] which will actually connect your mentor [21.6s] to either Google Slides or Google Docs [23.7s] or both depending on you select and this [26.4s] can actually output a document. So, in [30.2s] this case, we've already got these [31.3s] enabled. And then when we go back to the [33.0s] chat, you'll see these two options. I'm [35.8s] going to click on Google Docs. And this [38.1s] is going to actually ask me to [39.4s] authenticate. [42.6s] And it'll it'll have a little popup. You [44.5s] just can't see it on my screen at the [45.7s] moment. [47.4s] And this will actually connect the [49.5s] device or your your mentor to your [52.6s] Google Docs um account. So, um what [56.0s] we'll do now is put in a question. And [58.7s] I'm just going to say suggest a template [60.6s] for documenting um gap identifications. [63.2s] This is specifically within um a course [65.1s] curriculum. And then from here, what [67.3s] we're going to do is ask the mentor to [69.0s] actually provide an example of this [70.3s] template in a Google doc. [73.6s] So this might take a few seconds just [75.3s] because [76.9s] um you know it's going to basically be [78.9s] thinking about the response and then [80.7s] also outputting a Google doc. But once [83.1s] you have it, it'll be sent you'll be [85.0s] sent a link to that Google doc itself. [93.0s] All right. So, now we have the goo [94.9s] Google doc link right here. And then [97.4s] we'll click on this. Actually, I'm just [99.4s] going to copy the link address. [101.9s] And we'll go here. [106.1s] Now, this is a markdown format based off [108.0s] of the system prompt here, but this the [110.6s] system prompt can be updated or added to [114.5s] with the specific format of, you know, [116.7s] your preferred preferred format. So, you [119.2s] know, rich text or markdown, for [121.3s] example. And all right, that's that's [123.7s] basically all you have to do to connect [125.4s] Google Docs and produce a document or a [127.9s] Google Slides. --- ## Graded Assignment URL: https://ibl.ai/docs/instructor/data/gradedassignment Video: https://www.youtube.com/watch?v=8UwlrhjRVuM # Graded Assignment ## Description When mentorAI is embedded in Canvas as an LTI tool, it can grade learner responses inside an Assignment and automatically pass the score back to the Canvas gradebook. Learners type their answers directly in the mentor chat, receive an instant score (and, soon, AI-generated feedback), and watch their grade update in real time—no manual grading steps required. --- ## Target Audience **Instructor** --- ## Features #### Instant AI Scoring The mentor evaluates each submitted answer and returns a numeric grade (e.g., 10 / 10). #### Automatic Grade Passback Scores sync to the Canvas gradebook immediately after each response. #### Iterative Resubmission Every new answer replaces the previous score (e.g., a perfect 10 drops to 0 if the next attempt is wrong). #### Future Enhancements (preview) - AI-generated feedback accompanying each grade - Support for file uploads (PDF, Word, Excel, etc.) that the mentor can grade --- ## How to Use (step by step) #### Open the Assignment - In Canvas, navigate to **Assignments** - Click the task labeled (for example) **Mentor Grading** #### Interact with the Mentor - The mentor chat loads on the assignment page - Type your answer (e.g., `5 + 5 = 10`) into the chat and press **Send** #### Receive Your Score - The mentor instantly replies with a grade (e.g., **10 / 10**) - *(Future)* Feedback text will appear alongside the score #### Verify Grade Passback - Open **Grades** in Canvas - Confirm the score has updated (e.g., **10 / 10**) #### Revise & Resubmit (Optional) - Return to the assignment page - Enter a new answer (e.g., `5 + 5 = 9`) - The mentor re-grades and returns **0 / 10** - Check **Grades** again to see the new score reflected --- ## Pedagogical Use Cases #### Math & Short-Answer Quizzes Verify calculations or factual responses instantly, freeing instructors from manual grading. #### Formative Practice Allow students **unlimited retries**; the most recent attempt determines the recorded grade. #### Low-Stakes Knowledge Checks Integrate quick “exit ticket” questions in modules, providing **immediate performance feedback**. #### Future File-Based Assessments Upload **lab worksheets, essays, or spreadsheets** for AI scoring once file-grading launches. --- With mentorAI’s Canvas integration, learners receive **live grades** (and soon feedback) while instructors enjoy **automatic score passback**—streamlining assessment and reinforcing learning in real time. ### Transcript mentorAI | Canvas | Submitting a Graded Assignment https://www.youtube.com/watch?v=8UwlrhjRVuM [0.2s] showing you how if our mentor is [2.0s] integrated via LTI into an LMS, in this [5.0s] case Canvas, can actually pass back [7.2s] grades to the LMS. So, you can see here [10.5s] uh we don't have a grade for this user. [13.1s] So, what we'll do is pop over to [16.8s] assignments. Once we go to assignments, [19.0s] we'll see this mentor [22.2s] grading and we'll open up this [25.3s] page. And now you can see that the [27.4s] mentor is here. And this is just going [29.8s] to be a very simple example. [32.0s] Um, so we're going to do 5 + 5 equals [34.8s] 10. Now, in the future, you're going to [36.9s] be able to add feedback to the responses [39.1s] via a new prompt type. Um, so that [42.2s] basically means uh the mentor will [43.9s] respond with a grade, but also provide [45.5s] the learner feedback. In addition to [47.4s] this, you'll be able to upload different [49.0s] file types such as PDFs, Excel, um, Word [53.0s] documents, so on and so forth. Now, if [55.9s] we want to see uh that the grade has [58.2s] actually been passed back to Canvas in [60.0s] this case, go over to [62.4s] grades and you can see that we have a 10 [64.8s] out of 10 score. Now, if we want to go [66.6s] back to the [72.4s] assignment, we can actually have the [74.5s] grade updated every time a user [76.6s] responds. So, if we say 5 + 5= 9, we [81.6s] should get a 0% score back. [85.2s] And now if we go back to the [88.8s] grades, you'll see my score has changed [91.0s] to a zero. So whatever the last input [93.5s] from the user is is what their grade's [95.0s] going to be. --- ## Link Share URL: https://ibl.ai/docs/instructor/data/linkshare Video: https://www.youtube.com/watch?v=7RO3QsjqOL4 # Link Share ## Description The Link Share feature generates a public URL that lets anyone chat with your mentorAI—no login required. Toggle the link on to invite open access, refresh it to rotate the URL, or switch it off to revoke access instantly. Perfect for demos, outreach, or quick external support. --- ## Target Audience **Instructor** --- ## Features #### Anonymous Access Toggle Enable or disable a public, no‑login chat link at any time. #### One‑Click URL Refresh Rotate the link with a **refresh icon** to invalidate old URLs. #### Copy & Distribute Grab the link and share it via **email, LMS announcements, social media**, or **QR codes**. #### Instant Lockdown Turning the toggle **off immediately blocks anonymous sessions**—even if users still have the old link. #### Safe Testing Verify the link in an **incognito window** to ensure it works exactly as intended. --- ## How to Use (step by step) #### Enable Anonymous Access - Open the mentor’s **Embed** tab - Locate the **Sharable Link** section - Toggle the switch **On** to activate anonymous chatting - *(Optional)* Click the **refresh icon** to generate a new URL - Click **Copy** and distribute the link wherever you like #### Test the Link - Paste the URL into an **incognito/private‑browsing window** - Confirm you are **not logged in** - The mentor chat should load and respond normally for **anonymous users** #### Disable Anonymous Access - Return to the **Embed** tab - Toggle the switch **Off** - Anonymous users will now see: > **Session failed to start** *(Chat box becomes inactive.)* --- ## Pedagogical Use Cases #### Open‑House Demos Share a public link during **webinars or campus tours** so prospective students can try the mentor without creating accounts. #### Community Outreach & Marketing Post the URL on **social channels** to showcase your **AI tutoring capabilities**. #### Conference or Workshop Support Provide attendees with instant access to a **specialized mentor** for live Q&A during events. #### Quick External Feedback Invite **industry partners or subject‑matter experts** to interact with a mentor and suggest improvements—no onboarding required. #### Emergency Helpdesk Temporarily open access during **downtime or high‑traffic periods**, then disable the link once normal support channels resume. --- With **Link Share**, you control whether outsiders can interact with your mentor—**turn the link On for open access**, or **flip it Off to lock things down instantly**. ### Transcript mentorAI | Shareable links https://www.youtube.com/watch?v=7RO3QsjqOL4 [0.2s] As an instructor, you have the ability [2.0s] to share your mentor with anonymous [3.9s] users. This means they do not have to [5.6s] log in at all to be able to chat with [7.7s] the mentor. You can also decide whether [10.0s] or not or if you don't want that to [11.4s] happen, you can disable this. And to get [13.9s] this set up for anonymous users, what [16.6s] we'll do is hit embed. [18.9s] You'll see that there's a sharable link [20.3s] here. And again, like I said, this will [21.9s] generate a link that users can chat with [23.8s] a mentor anonymously. [26.3s] Now, to enable it, all you have to do is [28.0s] hit this toggle. And if you ever need to [30.3s] refresh it for any reason, you can just [31.7s] hit this little refresh button. Now, we [34.3s] can copy this link. And what I'm going [35.8s] to do is go into an incognito window. [39.9s] And I am not logged in here. [45.0s] So now I'm seeing the mentor as an [47.4s] anonymous user and I can chat with it. [52.6s] So the mentor can respond here to the [54.8s] anonymous user with no problem [56.6s] whatsoever. But if we want to go back [61.4s] over here to the mentor that we were [65.8s] just editing, if we go back to the embed [67.7s] tab and disable this link, [71.3s] what we can do now is come back over to [73.2s] this tab and hit [76.3s] refresh. [80.7s] Now you'll see that the session for this [82.6s] user fails to start because we've [84.1s] disabled that anonymous link. So meaning [87.0s] they can no longer chat. This box won't [89.0s] even be clickable for our users if it is [91.0s] disabled, but they have this link for [92.6s] whatever reason. And that's how you can [94.9s] share your mentor anonymously. Or if you [96.9s] wanted to lock that down and not prevent [98.8s] that from happening, that's how you can [100.2s] do it as well. --- ## API URL: https://ibl.ai/docs/instructor/integration/api Video: https://www.youtube.com/watch?v=KrB3R5nhBDM # API ## Description The API feature lets you generate secure keys for a mentor and use them to call mentorAI’s REST endpoints. With an API key, you can integrate mentorAI into other systems—LMS dashboards, custom analytics pipelines, or third‑party apps—while controlling access and expiration dates for security. ![](/images/api.png) --- ## Target Audience **Instructor** --- ## Features #### Scoped API Keys Generate unique keys per mentor, limiting access only to that mentor’s data and functionality. #### Custom Expiration Dates Set a “use‑by” date (e.g., 30 days) to minimize risk if a key is misplaced. #### One‑Time Display Keys are shown only once; you must copy and store them securely. #### Built‑In Authorization Flow Use the key to authorize calls in mentorAI’s API platform (Swagger / Postman collection). #### Seamless System Integration Connect mentorAI with learning‑management systems, CRMs, grading tools, and more. --- ## How to Use (step by step) #### Create an API Key - Click the **mentor’s name** in the header - Select the **API** tab - Choose **Create New** - Enter a descriptive name (e.g., “LMS‑integration‑Aug 2025”) - Set an **expiration date**—typically one month ahead - Click **Submit** - Copy the **API key** that appears (it will not be shown again). - Store it in a **secure vault** or environment variable #### Authorize in the API Platform - Open **mentorAI’s interactive API documentation** - Click **Authorize** - Paste your copied **API key** and confirm - Close the authorization dialog #### Call an Endpoint - Select any available endpoint (e.g., `/chat`, `/datasets`, `/grades`) - Review its required parameters - Enter test values and click **Execute** (or use **cURL / code snippets**) - Examine the **JSON response** to confirm success #### Rotate or Revoke Keys (Optional) - Return to the **API** tab to **deactivate or delete keys** when no longer needed - Create new keys for different integrations to maintain **granular control** --- ## Pedagogical Use Cases #### LMS Grade Sync Pull mentorAI quiz results via API and push them into your institution’s gradebook automatically. #### Custom Analytics Dashboards Fetch conversation counts, tool usage stats, or student progress metrics for real‑time reporting. #### Single‑Sign‑On (SSO) Extensions Use the API to validate user sessions and embed mentorAI directly inside campus portals. #### Automated Enrollment Script the creation of new mentors or the assignment of students to mentors each semester. #### Content Management Pipelines Upload datasets or update system prompts programmatically to keep mentors in sync with new course materials. --- With an **API key in hand**, you can seamlessly **integrate mentorAI’s capabilities** into your existing systems—**extending its reach** while maintaining **strict security and control**. ### Transcript mentorAI | APIs https://www.youtube.com/watch?v=KrB3R5nhBDM [0.0s] [Music] [0.7s] Mentor AI is based off of an APIdriven [3.4s] architecture. This makes it really easy [6.0s] to integrate very seamlessly with [8.4s] existing systems using our comprehensive [10.5s] APIs. [12.0s] You can actually do this from our mentor [13.9s] platform. And the way you would do this [16.2s] is by selecting the mentor name. And [19.1s] you'll see this tab that's called API. [22.1s] You'll click on this. And as you can [24.1s] see, we don't currently have any of [25.7s] these. So, what I'm going to do is hit [27.8s] create new. And then I'm just going to [30.5s] give this a name. [38.2s] And you can also set the expiration [39.9s] date. So, I'm just going to set it for [41.7s] about a month from now and hit submit. [44.6s] You'll also get your your API key here. [47.9s] You can copy and paste it. Um, please [51.3s] make sure you save this somewhere secure [53.0s] because this will disappear and you [54.6s] won't be able to see it again. So if you [56.5s] ever need to go back and use it, um you [59.4s] just need to make sure you have that [60.6s] ready to go. All right. And then if we [64.2s] want to see what it looks like to call [65.7s] one of our APIs, what we'll do is go to [69.2s] our API platform. [71.9s] So here you can see all of our APIs. And [74.2s] what you'll need to do is hit authorize. [77.6s] So you'll put in that API key that you [80.7s] got from the previous screen on the [82.2s] Mentor AI uh API tab. You'll hit [85.1s] authorize [87.0s] and then you'll hit close. And from [89.5s] there you can select any of these APIs [93.2s] to call. So I'm just going to select a [95.4s] random one here. [97.5s] Let's see. [102.0s] And you can see that we can get the the [105.2s] API response and all this other [107.1s] information about the parameters so on [109.3s] and so forth. Super simple to use, very [111.8s] easy to integrate with any existing [113.7s] system uh using our APIs. And that's [117.0s] really all you need to do to uh connect [118.9s] to our APIs. --- ## Community Mentors URL: https://ibl.ai/docs/instructor/integration/communitymentors Video: https://www.youtube.com/watch?v=0RONqCuQ-AQ # Community Mentors ## Description **Community Mentors** lets admins surface ibl.ai’s shared mentor library—such as subject-matter mentors built on **OpenStax textbooks**—inside their institution’s private mentorAI environment so learners can start chatting with them immediately. It’s a **fast, safe way** to pilot or extend high-quality mentors without starting from scratch, while keeping **configuration authority** on your side (prompting, data attachments, model choice, and embedding). --- ## Target Audience **Administrator · Instructor · Student** *(read-only access)* --- ## Features #### Shared Library in Your Tenant Browse a catalog that includes **OpenStax-based course mentors**, **skills mentors**, and other agents. #### Read-Only Learner Access Learners can converse with Community Mentors but cannot change settings or configuration. #### Admin/Instructor Controls - Add or override a **System Prompt** - Attach your own **data** (files/links) to localize content - Switch the **model** powering the mentor - Embed the mentor into your **LMS/SIS** or other systems #### Rapid Pilots & Scale-Up Turn on curated mentors quickly, then adapt them to programs or courses as needed. --- ## How to Use (step by step) #### Enable in Admin 1. Go to **Profile menu → Organization → Advanced → Community Mentors** 2. Toggle **On** to activate Community Mentors for your environment #### Browse & Add - Open **Explore** to view the full catalog of community mentors now available - Add **OpenStax-based course mentors**, **skills mentors**, or other agents to your environment #### Configure (Optional, Admin/Instructor) - **System Prompt** – Add/override to match your teaching voice or institutional guidance - **Attach Data** – Connect your course materials to contextualize answers - **Switch Model** – Choose the LLM that best fits your use case #### Embed (Optional) - Embed selected mentors directly into your **LMS/SIS** or other systems so learners can access them in context #### Learner Experience (Read-Only) - Learners chat with Community Mentors right away but cannot alter prompts, data, or tools --- ## Pedagogical Use Cases #### Instant Course Coverage with OpenStax Deploy subject-aligned mentors (e.g., Intro Biology, Economics) to provide immediate tutoring and Q&A. #### Skills & Study Support Offer skills mentors (study strategies, writing help, career prep) across departments without building new agents from scratch. #### Pilot Programs & Rapid Trials Test mentors with select cohorts, then refine prompts/data and roll out broadly based on findings. #### LMS-Embedded Assistance Embed mentors in course modules so students can get context-aware help alongside readings and assignments. #### Institutional Consistency with Local Control Start from high-quality shared mentors, then tailor prompts, datasets, and models to fit your policies and outcomes. --- Once you complete these steps, you can pilot or extend high-quality mentors without starting from scratch—while keeping configuration authority on your side. ### Transcript mentorAI | Community Mentors Feature https://www.youtube.com/watch?v=0RONqCuQ-AQ [0.1s] Hey, what's going on? This is Jeremy at [1.7s] IBLAI. Today I'm showing you one of our [3.8s] newest features, the community mentors [6.4s] feature, which allows uh admins within [9.4s] their own environment on the mentor AI [11.3s] platform to connect all of the community [13.9s] mentors that we make available on our [16.5s] main environment within their private [19.0s] environment to all learners who might [21.3s] like to communicate with them. This [23.4s] includes uh a number of mentors we've [25.8s] created using the open stacks open [28.2s] resource textbooks [30.6s] um for a range of college level uh [34.0s] courses and subjects. So in order to do [38.4s] that if you are an admin within your own [41.2s] environment what you would do is you [43.3s] would navigate to the user icon in the [45.8s] upper right hand corner here. Once you [48.8s] do that, it's going to open this [50.0s] drop-own menu with your profile, the [52.4s] environment that you're in, uh a help [55.1s] icon, and log out. You're going to [57.3s] select the name of the environment that [59.5s] you're in, which is going to open this [62.3s] organization window, uh for different [65.6s] types of management features you have [67.7s] available to you as an admin. You're [70.8s] going to come down to advanced and this [73.0s] community mentors tab right here. You're [75.4s] going to toggle this on. Now, once [78.6s] you've had that enabled, you come over [80.4s] to the explore column. [84.0s] And from here, you'll see the entire [86.8s] fleet of community mentors that we make [89.9s] available to anybody on the platform [92.9s] within your private environment, [94.6s] including all of these OpenStacks [96.6s] textbooks, and then a number of agents [99.4s] and mentors for a range of use cases for [102.4s] you to explore. [104.6s] Now, if you enable this, the learners in [107.8s] your environment will have the ability [110.2s] to communicate with any of these [112.2s] mentors, but will not have the ability [114.2s] to make any changes to these mentors. [116.8s] However, you as an admin will have the [120.4s] ability to make changes to these [121.8s] mentors. Say you're teaching an [123.8s] introductory to statistics course. If [126.6s] you select on this mentor, one of the [128.6s] community mentors that you've just made [130.2s] available within your environment, you [132.5s] can come to this drop-own menu in the [134.6s] upper left hand portion of the screen [137.4s] where you see the name of the mentor and [140.2s] you can come in and add your own data [142.9s] sets, edit the system prompt, change the [146.9s] model that the mentor is connected to, [149.8s] um, and even embed this directly into [152.9s] your LMS or student information system, [155.4s] wherever you like the mentor to reside. [158.2s] And that's a very quick overview of the [160.8s] uh community mentors feature that we've [162.8s] just made available. As always, thank [165.2s] you so much for tuning in and learn more [167.1s] at ibl.ai. --- ## Access Grades URL: https://ibl.ai/docs/instructor/lms/accessgrades Video: https://www.youtube.com/watch?v=hfmSv6JH85Y # Access Grades ## Purpose View learner responses and grades for problem-bank questions in an LMS course—either by downloading reports or by viewing the course as a specific student. --- ## Method 1: Download Problem Response Reports (Recommended) ### Access Data Downloads 1. Go to the **Instructor Panel**. 2. Click **Data Download**. ### Select Scope You can generate reports at multiple levels: - Full Course - Section - Subsection - Unit - Specific Problem **Example:** - To capture an Exit Evaluation, select the appropriate section or subsection. - To drill into a single question, go down to the unit or specific problem level. ### Generate Report 1. Select the desired scope. 2. Click **Create Problem Response Report**. 3. Wait for confirmation that the report has been generated. 4. Download the file to view: - Each learner’s response - Scores - Submission details This method is especially helpful for problem banks, since learners often receive randomized questions. --- ## Method 2: View as a Specific Learner ### Generate Progress Link 1. Go to the **Student Admin** tab. 2. Select **View Specific Learner Grades and Progress**. 3. Enter the learner’s email or username. 4. Click **Generate Progress Link**. 5. Open the link in a new tab. ### View Gradebook - You’ll see the learner’s scores per exam. - Navigate to the **Course Outline** to view activity from the student’s perspective. - If the view shows “staff,” switch to specific learner/student mode and enter their username. ### Inspect Problem Bank Questions Open an exam or unit the student completed. You can see: - Which randomized question they received - Their selected answer - Number of attempts used - Submission history Additional options include: - Reset attempts - Rescore submissions - View submission status --- ## Best Practice for Problem Banks Because problem banks randomize questions: - Use **Data Download reports** for accurate reporting across all learners. - Use **View as Learner** when investigating a specific student’s attempts. --- ## Result You can accurately review learner performance on problem-bank questions—either through downloadable reports or by stepping into the learner’s perspective to inspect answers, attempts, and progress. ### Transcript skillsAI | Access Learner Grades https://www.youtube.com/watch?v=hfmSv6JH85Y [1.8s] I'm going to be showing you how to find [3.7s] specific learner grades within a given [7.0s] course. Now, there are a couple ways you [8.9s] can do this. One of which is by going to [10.6s] this instructor panel [13.4s] and hitting the data download tab. [16.5s] Once you're here, [18.9s] you can select either a full course [21.4s] section or specific even down to the [24.0s] specific problem. So, if you wanted to [26.3s] do the full course, you would just [27.4s] select the course name up at the top. [29.4s] But for instance, if you wanted to [30.6s] capture the exit evaluation responses, [33.7s] um this is a section level. So we'll go [35.6s] down one further and go to the [37.2s] subsection. Now again, you can go down [39.5s] to the unit level, which is this right [41.4s] here. And then the specific question [43.7s] itself that students are taking um that [46.5s] you're looking for, you know, their [47.8s] specific responses for. Now, let's just [50.3s] say we want to do just the exit [51.5s] evaluation in full this whole section. [54.0s] We hit this and then hit create a [55.8s] problem or report of problem responses. [58.9s] Once that's done, it'll have your report [60.9s] has been su successfully generated and [62.6s] then you'll be able to download the [63.8s] report, seeing each and every learner's [65.4s] responses to that specific problem that [67.6s] you've selected or subsection section or [70.3s] even the full course. The other way you [72.8s] can do this is by viewing the course as [74.7s] a specific student. And the best way to [76.2s] do that would be going to the [78.7s] uh the student admin tab here. And under [81.6s] this view specific learner grades and [83.9s] progress um you can enter this learner [87.0s] the learner specific email that you'd [89.0s] want to see. Um you know what their [91.4s] questions that they saw are or um you [94.1s] know just kind of view what the student [95.4s] is seeing. So in this case we'll use [97.4s] we're going to use somebody we know [99.0s] that's enrolled in this course and we'll [100.5s] hit generate progress link. And once [102.2s] this comes up, you'll be able to open up [105.3s] in a new tab the learner's progress [107.8s] page. [109.8s] So to show you what this looks like, [113.7s] we're going to open it up. [117.0s] You can see up here that we're viewing [118.4s] this course as the specific learner. [121.0s] Now, this is specifically their grade [122.6s] book where you can even see their scores [124.4s] per exam. But if you want to see what [127.2s] actual questions that the learner is [128.5s] taking from their perspective, you'll [130.2s] click the course outline button at the [131.6s] bottom here. And you can see that we're [133.7s] already viewing this course. But [135.1s] sometimes if it says staff like this, [138.1s] you'll be viewing as you know the user [140.2s] you're logged in with. So if it says [141.8s] staff, you'll just select specific [143.0s] learner or student and then put in the [145.6s] learner's [147.7s] name. So we'll have put or their [149.4s] username or email address. So we'll put [151.2s] in their username. [154.4s] And now we can see what the student is [156.6s] seeing from their end. So we can see [157.8s] what their completion looks like. And if [159.7s] we want to go into one of the exams that [161.6s] they've answered questions in, we can [163.8s] actually see what ex specific questions [166.1s] from a grade book, a problem bank using [169.0s] a library that they're seeing. So this [172.2s] is going to be the fourth out of the [173.5s] however many were um in the problem bank [176.5s] that they've completed. You can see how [178.1s] many attempts that they've used um what [180.6s] answer choice they selected. And then [182.2s] you even with submission history, this [184.2s] would actually allow you to put in an [185.7s] email um or a username of a learner to [188.6s] see, you know, whether or not they've [190.7s] submitted, what their score was, so on [192.0s] and so forth. I would say for problem [194.1s] banks though, uh I would stick with [195.8s] either using this view or the the data [197.6s] downloads like I had shown previously. [199.8s] um purely because we're not quite sure [201.5s] you know at a first glance um without [204.8s] you know going and viewing as a student [206.6s] whether or not they've taken a specific [208.6s] question because they are seeing you [210.5s] know a randomized selection um from from [214.5s] the problem bank you also have a couple [217.4s] of other options about seeing you know [219.0s] their submission status and whether or [220.6s] not you want to reset their attempts uh [223.1s] rescore things like that. So that's [226.1s] basically how you can figure out exactly [228.3s] how to see a learner specific answer [230.4s] choices whether via report or even [232.2s] viewing the course as a student. --- ## Course Reports URL: https://ibl.ai/docs/instructor/lms/coursereports Video: https://www.youtube.com/watch?v=oq8P5UrvyFM # Course Reports ## Purpose Download course-level learner data and performance reports from the Instructor Panel. --- ## Accessing the Instructor Panel 1. Enter the course as a Tenant Admin or as a user with Instructor access. 2. Click the **Instructor** tab. 3. Open the **Data Downloads** tab. --- ## Available Reports ### Enrollment & Profile Data - Grading configuration - Learner profile information (including users who can enroll but haven’t yet) - Anonymized enrollment report --- ### Problem & Answer Reports 1. Select scope: - Full course - Section - Subsection - Unit - Specific problem 2. If using a problem bank, selecting it shows available questions. 3. Click **Create Problem Response Report**. 4. Once generated, click **View Report** to download the CSV. --- ### Grade Reports - **Generate Grade Report** - Overall course grades. - **Generate Problem Grade Report** - More detailed per-question performance data. --- ### Open Response Assessments (ORA) - **Generate ORA Data Report** - Student-submitted open response data. - **Generate ORA Summary Report** - Summary-level reporting. - **Download Submission Files (ZIP)** - All uploaded student files for ORA activities. --- ## Downloading Reports - Reports may take a few seconds or minutes to generate. - Once ready, the file name appears in the list. - Click the file name to download or re-download. --- ## Result From the Instructor Panel’s Data Downloads tab, you can export detailed learner enrollment, grading, problem-level, and open-response data for any course. ### Transcript skillsAI | Instructor Panel https://www.youtube.com/watch?v=oq8P5UrvyFM [2.0s] I'm going to be showing you how you can [3.7s] access the instructor panel to download [6.2s] reports about learners within a given [7.9s] course. The first thing you're going to [10.1s] do is enter a course as a tenant admin [13.4s] that you would like to get reports for. [17.5s] Now, we'll go to this instructor tab [19.6s] here that you'll see only if you are a [21.6s] tenant admin [23.6s] or have been given access to a [25.2s] particular course as an instructor. [28.1s] Once the instructor dashboard loads up, [30.5s] you'll navigate to the data downloads [32.1s] tab. And this is where you can download [34.3s] course specific information. So this [36.3s] could include the grading configuration, [39.9s] the profile information of learners who [41.8s] have enrolled into this course. [44.7s] This also includes folks who have who [47.0s] can enroll but have not done so yet. As [49.3s] well as having an anonymized version of [52.0s] the enrollment report. [54.5s] We also can get um learner answer [57.4s] choices um from this pro uh this tab as [61.0s] well. You'll see select section or [63.6s] problem. And if you wanted to get a [66.1s] coursewide report, you would just simply [67.6s] click the course name. If you wanted to [70.3s] do a section, you can click on the [72.2s] section. And you can even go down to the [75.6s] subsection, unit, and specific question [78.9s] level. In this case, this course is [80.2s] using a problem bank. [82.4s] And this will if you click on the this [85.3s] uh the problem bank, it'll show you the [86.6s] questions available. [89.0s] Now, if you want to generate a report, [91.4s] let's just say we want to do it for the [92.6s] full problem bank, we'll just click on [93.9s] that. We can create a report of problem [96.2s] responses. And what this will do is in [98.9s] this right underneath this block, you'll [100.6s] see a view report button. And this will [102.2s] download a CSV. [104.3s] Now, we don't use edex credentials [106.0s] anymore, so you won't need to worry [107.4s] about these certificates. We have our [109.0s] own credentiing system. [111.7s] Um, we also can pull the profile [114.7s] information for enrolled students [116.2s] directly on this page in addition to [118.9s] pulling a couple different grade [120.1s] reports. So, we've got a generate grade [121.7s] report, which is basically an overall [123.3s] course grade um, per section report or [126.6s] subsection, sorry. Um, and then the [128.7s] generate grade uh, problem grade report [130.7s] will give you a little bit more details [132.3s] on exactly, you know, what students are [135.0s] taking. Um, a little bit more in-depth [137.0s] information on what has actually been [138.5s] completed. And if you have an open [140.5s] response assessment, this generate O [142.6s] data report will download a report of um [145.9s] data submitted by students for open [148.0s] response assessments. Uh you can also [150.2s] get a summary report for open response [152.2s] assessments as well as generate the sub [154.5s] uh a zip file of all the submission [156.8s] files from students with an open [158.7s] response assessment um inside of a [160.6s] course. [161.9s] Now once you have a report that you've [164.6s] down ready to download it might take a [166.6s] couple of seconds or minutes to generate [168.4s] the actual file itself but once [170.2s] available you will see the file name [172.0s] here and if you click on that you can [173.8s] also redownload the report. So that's [176.6s] essentially how you can get report data [178.5s] from this instructor tab inside of a [180.4s] course. --- ## Grade Adjustments URL: https://ibl.ai/docs/instructor/lms/gradeadjustments Video: https://www.youtube.com/watch?v=SZTeEaGSqnk # Grade Adjustments ## Purpose Adjust individual learner grades, reset attempts, rescore submissions, or override scores directly from within a course on the Skills platform. --- ## Method 1: Using Staff Debug Info (Single Problem Adjustments) 1. Open the course and navigate to the specific quiz or problem. 2. Click **Staff Debug Info**. 3. Enter the learner’s username. ### Available actions: - **Score Override** – manually assign a new score. - **Reset Attempts** – allow the learner to retake the problem. - **Delete Learner State** – remove all attempt history and scores. - **Rescore Submission** – re-evaluate the submission. - **Rescore Only if Improved** – update only if the score increases. For problem bank questions, this works, but there is a more scalable method below. --- ## Method 2: Using Student Admin (Problem-Level Controls) 1. Go to **Instructor Panel → Student Admin**. 2. Enter the learner’s email or username. 3. Retrieve the block location from **Staff Debug Info** (copy the problem location). 4. Paste the block location into Student Admin. ### Available options: - Reset attempts to zero - Rescore submission - Override score - Delete learner state - Check task status You can also apply these actions: - To a single learner - Or to all enrolled learners (by entering only the block location) --- ## Method 3: Using the Gradebook (Best for Problem Banks) 1. Open the course. 2. Click **View Gradebook**. 3. Locate the learner and the relevant subsection/exam. 4. Add a reason for the adjustment. 5. Enter the new score (e.g., 4/4 for full credit). 6. Click **Save Grade**. This is the recommended method for problem banks, since learners often receive randomized questions. --- ## Verifying Before Adjustments To review detailed responses before making changes: 1. Go to **Instructor Panel → Data Downloads**. 2. Generate: - Problem Response Report - Grade Report - Problem Grade Report 3. Download and review before applying overrides. --- ## When to Use Each Method - **Staff Debug Info** → Quick single-question fixes. - **Student Admin** → Targeted rescoring or attempt resets. - **Gradebook** → Best for subsection/exam-level adjustments, especially with problem banks. - **Data Downloads** → Validate responses before making changes. --- ## Result You can reset attempts, override scores, rescore submissions, or adjust full subsection grades—either individually or at scale—while maintaining full visibility into learner progress. ### Transcript skillsAI | Adjusting Grades https://www.youtube.com/watch?v=SZTeEaGSqnk [2.4s] I'm going to be showing you how you can [3.8s] adjust learner grades from inside of a [6.4s] course on the skills platform. [9.0s] So, we're already inside of a course [11.1s] where there we're looking at a quiz. And [13.0s] if I hit staff debug info, if I know a [15.6s] learner has taken a particular question. [17.4s] In this case, this question is coming [19.1s] from a problem bank. So, it's a little [20.9s] bit different. And I'll I'll show you a [22.6s] better way of handling problem bank [24.6s] questions here in a bit. But when you [27.2s] click on staff debug info, you can put [28.7s] in a user's username and you can do a [31.7s] score override. Reset a learner's [34.1s] attempts meaning you know they can [35.8s] retake the question again. Delete [37.9s] learner state meaning everything gets [39.4s] wiped out including their score. You can [41.8s] rescore a learner submission. [45.1s] Rescore only if points or scores improve [47.9s] or override scores. So you can see here [49.9s] this is the override um particular [51.8s] field. [53.2s] Now, if you want to look at this um for [56.1s] a problem bank and adjust learner [57.8s] scores, you always have the option of [59.4s] coming to the to the instructor panel, [61.9s] going to the student admin, and we'll [63.5s] come back to the grade book because you [64.8s] can also do this there. Um so, if we [68.2s] scroll down, we can also view a [70.6s] learner's grades in progress. Um so, you [72.9s] can kind of see where they're at in the [74.1s] course and what their scores are across [77.1s] the exams inside the course. Now, if you [79.4s] want to adjust a learner's grade for a [81.0s] specific problem, you'll just put in [82.4s] their email or username and the location [84.6s] of the block. Now, the way you would get [86.0s] the location of the block is coming over [88.1s] here. And again, staff debug info [91.8s] is where you would be able to gather [93.4s] this. That's the location. Now, going [96.3s] back to the instructor panel, [99.1s] student admin, [101.2s] then scrolling back down. [104.2s] So once you have the user's email or [106.2s] username entered and the location of the [108.0s] particular problem, you can reset [110.0s] attempts to zero. Meaning it if a user [112.6s] has used all their attempts up, but they [114.6s] want to retake the the question again or [117.4s] you for whatever reason want to give [118.9s] them an an additional attempt or however [121.0s] many attempts are set for that question, [123.1s] you can reset attempts to zero. Now you [126.2s] can also rescore a submission or only [128.5s] rescore if it improves. Meaning you can [130.6s] change the score. you know, if it's zero [132.6s] out of one point, you can just come here [134.1s] and add in, you know, the new score out [136.2s] of the total points available for the [137.6s] problem. If you know, again, if the [139.8s] question is out of one point, you would [141.2s] put one here. Um, and you also have the [145.9s] option of the problem history completely [148.4s] deleting the learner state for a spec [150.3s] specified problem. Um, and you also can [154.4s] show the test status of rescoring. Um, [157.8s] so you can keep a track of, you know, [160.4s] has the has the overrides gone through [162.3s] or has the rescoring gone through, so on [164.1s] and so forth. And then if you would [166.7s] prefer to adjust the learner's grades [169.6s] for that every enrolled learner's [171.5s] grades, you would simply just put in the [173.4s] location of the problem. And then you [175.7s] have the same options as an individual [177.8s] learner. You know, attempts, rescore, [180.1s] and the task status. You can also see [182.6s] the task status here. Uh, no tasks are [184.7s] currently running. But if we go to the [187.3s] grade book, [189.2s] which is up here, view grade book. We'll [192.2s] just go ahead and plop that here. [198.1s] You can see each of the domains, which [199.8s] is the subsection level of what exams [201.9s] were taken. And if for instance, I [203.8s] wanted to rescore one of my my students [206.6s] exams, um you can see that there are [208.3s] four total points because there are four [209.6s] questions within the subsection. Um you [211.8s] can add a reason and then put as many [213.6s] points as you would like to get them. [214.9s] And so if you want to give them 100 in [216.6s] this particular subsection, you would [218.6s] put four out of four because you can see [220.2s] the little four here. And then you would [222.7s] just hit save grade. [228.7s] And you can see now that I have 100% um [231.4s] in the grade book for this particular [233.3s] student um in this particular subsection [235.4s] with an exam. [237.6s] This would be the best way for doing it [239.0s] with problem banks. Um just because uh [241.8s] every person is getting different [243.4s] questions pulled from the problem bank [244.9s] if it is set to you know let's say you [246.7s] have 20 questions in a problem bank. Um [249.8s] but then you also have only four [251.9s] displayed to the learner. So there [253.1s] there's a chance that you know [254.0s] everyone's seeing something different. [255.5s] Um and you can get the actual responses [258.1s] and grades from the data downloads tab [259.8s] in the instructor panel. um if you just [261.9s] want to validate whether or not you'd [263.3s] like to actually make those changes from [265.5s] so this data downloads tab and you can [268.1s] either do um a problem response report [270.6s] or generate a grade report or problem g [273.0s] grade report and there you have it. --- ## Library Bank URL: https://ibl.ai/docs/instructor/lms/librarybank Video: https://www.youtube.com/watch?v=m2qcXCOm39U # Library Bank ## Purpose Create and manage questions inside a Library (Problem Bank) in Studio, including adding images to both questions and answer choices. --- ## Add a New Question 1. Open your Library in Studio. 2. Click **Problem**. 3. Choose the problem type (e.g., **Single Select**). --- ## Add an Image to the Question 1. In the question editor, click the **image icon**. 2. Select **Upload New Image**. 3. Choose your image file. 4. Adjust: - Dimensions (if needed) - Alt text (for accessibility) You may also mark the image as decorative. 5. Click **Save**. The image will now appear inside the question prompt. --- ## Add an Image to an Answer Choice 1. Scroll down to the **Answer** section. 2. Click the **image icon** inside the answer editor. 3. Upload the desired image. 4. Click **Save** for the answer choice. **Note:** Uploaded images are added to your image library and can be reused in other questions. --- ## Preview the Question 1. Click **Preview**. 2. Confirm: - The image appears in the question. - The image appears in the answer choice (if added). --- ## Result Your problem bank question now includes images in the prompt and/or answer choices, fully previewable and reusable across your library. ### Transcript Course Creation | Library Bank Images https://www.youtube.com/watch?v=m2qcXCOm39U [0.9s] This is how you can add questions to a [2.6s] library bank from studio. So right now [5.2s] we're inside of a library. And what [6.6s] we're going to do is hit problem. [9.4s] And in this case, we'll just go ahead [10.6s] and select single select. Now to add a [14.1s] image to a question itself, you know [15.8s] what the learners are answering, you'll [17.2s] just select this image icon and hit [19.7s] upload new image. [22.6s] So I'm going to go ahead and upload this [24.1s] one. [25.6s] And it'll pop up with the image over [27.2s] here. And you can adjust the dimensions [29.4s] as well as add in a alt text for [31.3s] accessibility purposes. For this demo, [33.6s] I'm just going to leave it as decorative [34.8s] just so you can see how this works. So [37.4s] you can see that the image is added [38.7s] here. Um, and then for adding this to an [42.1s] answer question, like an image inside of [43.8s] an answer choice, you'll scroll down to [46.1s] the answer problem and this little popup [48.2s] will appear. You'll select this first [50.0s] one with the image icon and then select [53.4s] the next image you'd want to upload. So [55.4s] I'm uploading this one now. And now once [57.8s] you save this, by the way, these [59.5s] question or the images that you are [61.0s] uploading inside of a question will show [62.6s] up as part of the bank of images you can [64.6s] select for other questions. So we're [66.7s] going to go ahead and hit save [69.0s] and hit save on the answer choice. Now, [71.5s] if we want to preview this, you can see [73.6s] here that if we want to preview, we see [76.3s] the question uh image that we added. And [79.5s] then also the answer choice right here [81.3s] also has that image. [84.5s] And there you have it. --- ## Advanced Settings URL: https://ibl.ai/docs/instructor/mentorcreation/advancedsettings Video: https://www.youtube.com/watch?v=UnkafAWUToU # Advanced Settings ## Description **Advanced Settings** gives tenant administrators full control over institution-level configuration in mentorAI—from **branding and user management** to **integrations, public registration, community mentors, custom domains (DNS), and SMTP email**. Use this panel to tailor your tenant’s **experience, security, and governance**—all without engineering work. --- ## Target Audience **Administrator** --- ## Features #### Tenant Profile & Branding - Edit **tenant name** - Set a **Support link** (e.g., help desk portal) - Set a **Help Center URL** (e.g., institutional docs/knowledge base) - Upload a **Tenant logo** to brand the app header and widgets #### Customize Chat Box Width - Adjust the width of the chat box (the **“Ask anything”** input area) from its default, smallest size - Chat input area **widens or returns to default** based on the value you set. #### User Management (Invitations & Roles) - Invite users via **CSV upload** (downloadable template) or **single email invite** - Track invite **Status**: Pending (not registered) / Accepted (registered) - View **Registered Users**; switch roles (**Admin ↔ Student**) and toggle access on/off #### Integrations - Add your own **LLM API keys** - Configure **third-party data-source credentials** - Register **external application APIs** your mentors can call #### Advanced Tab (Admin-Configurable Toggles) - **Help Menu:** show/hide the in-app help dropdown; link target set under Organization - **Accessibility Menu:** enable an accessibility panel for end-users - **Community Mentors:** allow mentors with public visibility to be shared across tenants - **Public Registration:** let users self-register via a direct link; new users are auto-added on sign-up - **DNS Configuration:** connect a custom domain; complete verification after updating - **SMTP Credentials:** send platform emails from your own domain instead of the default --- ## How to Use (step by step) #### Open Tenant Settings - Click your **profile dropdown** (top-right) → select your **Tenant name** (e.g., IBLAI) - The **Tenant Management window** opens with multiple tabs #### Brand the Tenant (Profile tab) - Update **Tenant name** - Enter a **Support link** and **Help Center URL** - Upload a **Tenant logo** to apply institutional branding #### Invite & Manage Users (Management tab) - Invite via **CSV** (use the template) or enter an email and click **Invite** - Monitor **Status:** Pending vs Accepted - Open **Registered Users** to: - Change role (**Admin / Student**) - Toggle access to enable/disable a user’s login #### Connect Integrations (Integrations tab) - Add **LLM keys** (per provider) - Add **third-party data-source credentials** for secure retrieval - Register **external APIs** your mentors can use #### Configure Advanced Controls (Advanced tab) - **Help Menu:** toggle On/Off; ensure Help Center URL is set under Organization - **Accessibility Menu:** toggle to surface accessibility features to users - **Community Mentors:** toggle to allow publicly visible mentors to be shared across tenants - **Public Registration:** toggle On to allow self-sign-up via a direct link to the tenant or mentor - **DNS Configuration:** enter your custom domain; complete verification steps afterward - **SMTP Configuration:** enter SMTP host/port/credentials so all platform emails originate from your domain #### Review & Save - Confirm changes and click **Save** on each tab as needed - Test affected flows (**sign-up, emails, help menu, accessibility, community mentors**) in a separate session --- ## How to Use (Customize Chat Box Width) #### Open Tenant Settings - Go to your **tenant settings → Advanced**. #### Adjust Width - Find **Chat area width**. - Hover the **info (i)** icon to see the *default* and *maximum* allowed values. - Enter a new width (demo example: **1,00**), then exit the settings. #### Verify - The chat box width updates **immediately**. #### Revert (Optional) - Return to **Chat area width** and **reset to default** to restore the original size. The chat input area **widens or returns to default** based on the value you set. --- ## Pedagogical Use Cases #### Institutional Branding & Trust Align the platform with **campus identity** (logo, help links) to improve student confidence and wayfinding. #### Cohort & Role Governance Rapidly onboard classes via **CSV upload**; restrict or expand access by term using **role and access toggles**. #### Security & Compliance Keep **LLM keys** and third-party credentials scoped to your tenant; route email via your **SMTP** for **DMARC/SPF alignment**. #### Frictionless Access Enable **Public Registration** for open pilots, events, or community programs; disable when the pilot ends. #### Scalable Content Strategy Turn on **Community Mentors** to reuse high-quality mentors across programs while retaining local admin control. #### Reliable Access Points Map a **custom domain (DNS)** for learner familiarity and SSO consistency. --- With **Advanced Settings**, tenant admins can fine-tune **branding, access, integrations, and governance**—so mentorAI fits seamlessly into your institution’s **policies, identity, and workflows**. ### Transcript mentorAI | Customize Chat Box Width https://www.youtube.com/watch?v=z0aAukIUzh4 [1.2s] As a tenant admin, you have the ability [3.0s] of customizing the chat box width. So, [5.2s] where it says ask anything, this little [6.8s] box right here, if you want to change [8.8s] this from the default, which is the [10.5s] smallest size possible, um you can [12.6s] increase the width from the tenant [14.0s] settings. The way you would do this is [15.5s] finding your tenant settings and [16.9s] navigating to advanced. [19.0s] Once you're here, you'll see the chat [20.7s] area width field and you'll see the [22.7s] settings um that are the default and [24.3s] what the max size is right here by click [26.6s] hovering over that eye icon. We're going [28.6s] to increase this to 1,00 [31.0s] and then we'll exit out of this. And now [32.8s] you can see that that width has [34.7s] increased. If we go back and re uh put [38.6s] the settings back to the default, [42.0s] you can see that that goes back to the [43.5s] default size. And that's how you [45.1s] customize the width of your chat box. --- ## Creation URL: https://ibl.ai/docs/instructor/mentorcreation/creation Video: https://www.youtube.com/watch?v=QnZ9Crvl9ao # Creation ## Description The Creation workflow lets you build a brand‑new mentorAI from scratch. In a single guided flow you name the mentor, set its personality, choose a language model, upload knowledge resources, and generate an embed code—so you can deploy a customized AI tutor anywhere learners need it. ![](/images/creation.png) --- ## Target Audience **Instructor** --- ## Features #### Create Mentor Wizard A streamlined interface (Settings ➜ Prompts ➜ Tools ➜ Datasets ➜ Embed) that takes you step by step from concept to launch. #### Rich Settings Panel Upload profile images, set visibility (Administrator‑only or Student‑facing), pick a base LLM (OpenAI or Gemini), and categorize the mentor for easy search. #### Prompt Configuration Define a **System Prompt** (mentor persona), **Proactive Greeting**, and **Guided Prompts** that appear as starter questions. #### Tool Toggles & Safety Controls Enable features like **Web Search** or **Screen Share**, and specify how the mentor should respond to inappropriate queries. #### Dataset Uploads Add **PDFs**, **URLs**, **Google Drive**, **Dropbox**, or **OneDrive** files to form the mentor’s knowledge base. #### Conversation History Analytics Review every chat, export transcripts, view sentiment, and filter by date, topic, or category. #### Embeddable Widget Generator Produce a code snippet with options for anonymous or trackable usage, context awareness, disclaimers, and launcher icon customization. --- ## How to Use (step by step) #### Start the Creation Process - Log in and click **Create** - The **Create Mentor** screen opens on the **Settings** tab #### Fill Out the Settings Tab - **Profile Image** – upload an avatar for the mentor - **Mentor Name** – e.g., “Bio AI” - **Description** – short phrase like “Biology tutor that assists learners with Intro to Biology” - **Category** – choose relevant tags (biology, science, learning) **Mentor Visibility –** - **Administrators** → only staff can see/use (assessments, admissions, internal data) - **Students** → everyone can access (general tutoring) - **Base Model** – select OpenAI default or Gemini #### Define Prompts - Open the **Prompts** section - **System Prompt** – outline persona (e.g., Socratic biology tutor) - **Proactive Message** – greeting shown before a learner’s first message - **Guided Prompts** – AI‑suggested starter questions #### Configure Additional Settings - **LLM Selection** – choose the specific model version - **Tools** – toggle extras (Web Search, Code Interpreter, Image Generation, etc.) - **Safety** – set rules for inappropriate content and define the mentor’s fallback response #### Review the History Tab (post‑launch) Inspect all conversations, export chats, and filter by sentiment, date, or topic. #### Upload Knowledge in Datasets Add files, folders, or cloud links. These resources train the mentor’s retrieval system. #### Embed the Mentor Elsewhere - Open the **Embed** tab - Choose **Default** or **Advanced** view - Decide on **Anonymous vs. Trackable** usage - Enter the target **Website URL** - Optional toggles: **Context Aware**, **Safety Disclaimer**, **Open by Default**, **Icon** - Click **Create Embed** and copy the snippet #### Go Live Deploy the embed code or share the mentor link. Learners (or administrators) can now chat with the mentor immediately. --- ## Pedagogical Use Cases #### Course‑Specific Tutors Build mentors like “Bio AI” or “Macro Econ Coach” that draw on syllabus PDFs and lecture slides. #### Admissions & Advising Assistants Create Administrator‑visible mentors to handle FAQs, application guidance, or scholarship queries. #### Assessment Authoring Helpers Generate mentors restricted to instructors that draft quiz questions or grading rubrics. #### Research Support Bots Upload journal articles and datasets so the mentor can summarize findings and suggest next steps for student projects. #### Campus‑Wide FAQ Bots Embed a general “Campus Help Desk” mentor across the LMS to answer policy, tech, or financial‑aid questions 24/7. --- Once you complete these steps, your mentor is fully configured—ready to guide learners or staff, depending on the visibility you selected. ### Transcript mentorAI | Creation https://www.youtube.com/watch?v=QnZ9Crvl9ao [0.6s] Creating a mentor, a mentor AI, is quite [3.4s] simple and I'm going to walk you through [4.7s] how to do this. Once you're logged in, [6.8s] you'll see this create button. [9.9s] You'll see that this pops up a create [12.0s] mentor screen and it's going to start [13.4s] with the settings. You can upload an [15.1s] image um which is like kind of like a [17.0s] profile image for the mentor. You can [19.0s] also provide the mentor name. I'm going [20.6s] to do bio AI. [22.7s] Um, and I'm going to say this is a [25.2s] biology tutor [28.8s] that assists [31.3s] learners [36.4s] with their intro to biology course. Now, [41.8s] the category is going to be the category [43.9s] that the mentor fit into. In this case, [45.9s] it's a biology mentor. So we could do uh [49.0s] let's see here [51.0s] learning for example we could use that [52.7s] as a category it could be biology it [54.2s] could be science you know anything that [55.9s] this is related to and then mentor [57.8s] visibility is going to be who this is [59.0s] visible to. So if it's set to [60.8s] administrators this means only [62.1s] instructors would be able to use this. [63.6s] This might be used for you know a mentor [65.7s] that creates assessments or coursework [67.8s] for learners. Um or it could be [70.7s] something like an admissions mentor [72.3s] where you're that mentor is guiding [74.2s] learners through the admissions process. [77.0s] um [78.9s] or you know it could be data related if [81.4s] it's it's institutional data you can get [83.2s] data from here but you don't want [84.6s] learners to see that that would be [85.5s] another reason administrators um would [87.7s] be visible uh to be visible to [89.8s] administrators only and then students [91.5s] would be to basically this just means [93.4s] everybody within your organization. Um [97.0s] so this also could be like you know your [99.2s] tutoring your uh informational so on and [102.8s] so forth mentors and then the base here [105.0s] is you can you can select from the [106.2s] default open AI or Gemini bases. I'm [109.6s] going to stick with the default here and [111.7s] then under the prompts category this is [113.4s] where you're going to define a lot of [114.5s] the mentor. Um you're going to have the [117.2s] system prompt which defines the mentor's [118.9s] personality and purpose. uh you know if [121.0s] it's a biology mentor like we're [122.6s] creating now um you would direct this to [124.8s] say you know you are a biology mentor [127.4s] you're going to be helping learners [128.6s] through this. If you want it to be more [130.3s] socratic and not get the learners [132.6s] answers directly you could direct the [134.4s] mentor to do that through a system [135.9s] prompt as well. Um this basically is [138.6s] very open-ended. Whatever the purpose [140.4s] and personality of your mentor you want [142.4s] it to be that's where you would put this [144.1s] in. It doesn't necessarily have to just [145.6s] be, you know, tutoring style or um [148.7s] admissions related or data related. [150.9s] Whatever it is that you want this mentor [152.7s] for, that's where you would put the [154.0s] purpose in. Now, the proactive message [157.0s] is the message that learners initially [159.3s] see when the mentor hops into the chat [161.5s] and they hop into the chat um before the [164.0s] learner starts messaging. It's just [165.4s] basically a greeting. Now, the guided [167.5s] prompts are going to be the prompts that [169.2s] are AI generated based off the [171.0s] conversation. there suggested uh prompts [173.4s] that a learner can select from to [175.3s] basically kind of keep the conversation [176.9s] going or get the conversation started [179.0s] without the learner uh feeling too [181.2s] uneasy about where to start here. It [183.0s] gives them a starting point. Now once [185.5s] you have the mentor created you'll have [187.1s] a couple of other settings that you can [188.6s] see here. Um you'll be able to select [190.9s] your LLM of choice. There are also a [193.7s] couple of tools that we have. you know, [195.3s] for instance, web search. You can enable [196.9s] this and this will allow you to search [198.4s] the web um from the mentor. You don't [201.7s] have to go anywhere else for this. You [203.2s] also have safety, which helps you, you [205.4s] know, make sure that the mentors [206.8s] responding appropriately to questions [209.0s] that may not be appropriately asked by [211.7s] the learner. You know, it could be [212.8s] something like, uh, here's my social [214.7s] security number. Uh, do you have my [216.5s] application information? Well, we don't [218.5s] want to answer that question. So, this [219.8s] is where you would set those guidelines. [221.7s] You can also set this exact response. [224.1s] So, uh, if somebody asks a question [226.4s] that's inappropriate in terms of the [228.2s] guidelines that have been set, you can [230.9s] tell the mentor to respond in a given [232.6s] way. [234.3s] Next, we also have the history tab. This [237.0s] will show you the history of everybody [239.3s] who's communicated with this mentor. If [241.0s] you click on one of these guys, you can [243.0s] actually look through the conversation. [245.9s] You're also able to export it, view the [247.8s] sentiments, filter by date, uh, topic, [250.2s] and even the categories. Now, data sets [253.3s] are where you're going to upload your [254.5s] materials for the mentor. This will [256.4s] serve as its knowledge base. You'll have [258.4s] a variety of different options from file [260.4s] types to URLs, even to uh places where [263.2s] you host materials such as drive, uh [265.9s] Dropbox, or even Microsoft One Drive. [270.3s] Now, the embed tab is if you want to use [273.0s] this anywhere else other than on the [275.4s] platform or integrated via LTI into your [278.2s] own LMS. Um, so for instance, if you [280.2s] wanted this on the homepage of your your [283.0s] uh institution's website, you could use [284.8s] this, customize it, select the mode, [287.3s] which will be defaults, which you're [289.0s] seeing now. Advanced view is a little [290.7s] bit different. Um, there's a couple more [292.6s] tabs. You can select whether this is [294.5s] anonymous, meaning whether or not um the [297.3s] person using this needs to be logged in [299.8s] um and trackable uh via username. You [302.8s] also have the website URL. So, this is [304.3s] where you'll put in exactly where this [305.8s] is going to be integrated. you can [308.0s] generate a token and then you can also [310.5s] make this context aware which basically [312.0s] means that a mentor can read the [313.8s] information on the page and respond [315.3s] about it. There's also a safety [317.0s] disclaimer that can be added and another [319.4s] option is to whether or not this mentor [321.6s] opens up by default. So you can enable [323.0s] it to be always open by default or you [325.4s] can have it close by default um and then [327.8s] a little icon that you can select and [329.2s] open it up where wherever this is [330.6s] embedded. And whenever you're ready to [332.3s] embed it, all you'll do is hit create [333.7s] embed. It'll provide you with a code and [335.5s] then you can use that code to embed this [337.4s] on any platform you'd like. --- ## Disclaimers URL: https://ibl.ai/docs/instructor/mentorcreation/disclaimers Video: https://www.youtube.com/watch?v=uBwb2br_1Dc # Disclaimers ## Description **Disclaimers** let you present important information to learners **before they start chatting with a mentorAI**. You can require an explicit **User Agreement** (blocking modal that must be accepted to proceed) and/or display a **non-blocking Advisory** (informational notice). Copy is fully editable (**Markdown supported**), allowing instructors to align the language with **institutional policies**. --- ## Target Audience **Administrator · Instructor** --- ## Features #### Two Disclaimer Types - **User Agreement** – Blocks chat until the learner clicks **Accept**. - **Advisory** – Non-blocking banner/modal (e.g., *“AI can make mistakes.”*) #### Pre-Chat Enforcement When a **User Agreement** is enabled, the mentor will not allow chatting until it’s accepted. #### Markdown Support Write disclaimers in **Markdown** for headings, links, and emphasis. #### Acceptance Confirmation After acceptance, a confirmation message appears in the chat. Any message a learner typed before accepting is queued and then answered automatically once accepted. #### Editable at Any Time Use **Edit** to update the text; defaults are provided for quick setup. --- ## How to Use (step by step) #### Open Disclaimers - In the mentor header, open the dropdown → **Settings → Disclaimers** #### Add Content - Choose the disclaimer type(s) you want to show: - **User Agreement** (blocking) - **Advisory** (non-blocking) - Click **Edit** and enter your copy (Markdown supported) **Example (User Agreement):** - **Title:** “Course Mentor User Agreement” - **Body (Markdown):** > “By continuing, you acknowledge that mentorAI may make mistakes and that you will verify important information with official course materials or staff.” #### Save & Enable - Click **Save** to apply changes - Ensure the **User Agreement** and/or **Advisory** toggles are **On** #### Test the Flow - Start a new chat with the mentor - Verify behavior: - With **User Agreement** enabled, a modal appears and must be accepted to proceed - You cannot click out of it - On **Accept**, a confirmation appears and any message typed pre-acceptance is submitted and answered - With **Advisory** enabled, the notice appears but does not block chat #### Iterate as Needed - Revisit **Disclaimers** to refine wording, add links to policies, or switch between **Advisory** and **User Agreement** depending on the use case --- ## Pedagogical Use Cases #### Academic Integrity & Policy Acknowledgment Require acceptance of **honor-code reminders** or **collaboration rules** before students use the mentor. #### Scope & Limitations Present an **Advisory** that clarifies the mentor’s boundaries (e.g., *“No medical/financial advice; verify critical info.”*) #### Accessibility & Support Link to **disability services**, **office hours**, or **help desk resources** in a brief pre-chat notice. #### Sensitive Topics Use a blocking **User Agreement** to ensure students acknowledge **content warnings** or **research-ethics guidance**. --- By configuring **Disclaimers**, instructors keep mentorAI use **safe, transparent, and aligned** with institutional expectations—while preserving a smooth chat experience once learners accept. ### Transcript ibl.ai | User Agreement https://www.youtube.com/watch?v=uBwb2br_1Dc [1.0s] Hi everyone, I'm Ashlin from IBLAI and [4.2s] today I'm going to be showing you an one [5.7s] of our latest features which is the [7.5s] ability to add a disclaimer before a [9.8s] learner starts chatting. So if we go to [12.9s] the mentor settings, you can see in this [15.4s] drop down here and hit disclaimers, [17.8s] you'll see that there's nothing in here [19.2s] yet. Um, and we'll hop into that here in [22.2s] a moment. But you'll have two different [24.4s] um pieces here. You'll have your user [26.7s] agreement which will say you know [28.4s] something along the lines of you must [29.8s] accept this before we continue and allow [32.7s] you to chat with the mentor. The [34.4s] advisory is just you know something that [36.4s] shows up on the screen that says you [38.3s] know hey um AI is capable of making [41.4s] mistakes. This is the default. You can [43.6s] of course change this to anything you [44.8s] want with this edit button. But to show [47.8s] you how the mentor works now I can just [50.1s] go ahead and chat with it. No problem. [62.1s] All righty. So, we've seen this now and [64.8s] then we'll go back to the disclaimers [66.2s] tab and actually add in a disclaimer. [67.8s] So, you can see how the mentor doesn't [69.7s] allow you to chat until this is this [71.8s] disclaimer has been approved by the [73.4s] learner. Um, so I'm going to just take [75.8s] some text that we for this. Um, you'll [78.8s] write it in markdown format. And in this [81.5s] case, our text is pretty simple. So, we [83.4s] don't need to use um anything crazy [85.2s] here. So, I'll hit save [88.2s] and then we'll come back here and we'll [90.3s] do a new chat. [95.2s] And you can see immediately I have this [97.1s] disclaimer pop up. I have to accept it [100.1s] before I can do anything else. I can't [101.9s] click out of it. Um and I'll hit accept. [106.2s] And you can see that the message [107.3s] actually already goes through. You get a [109.2s] confirmation message that says the user [111.0s] has accepted. And then the message that [114.5s] the learner entered before accepting the [117.0s] disclaimer um gets answered by the [118.9s] mentor. That's how this feature works. --- ## LLMs URL: https://ibl.ai/docs/instructor/mentorcreation/llms Video: https://www.youtube.com/watch?v=pPpfyhh-43s # LLMs ## Description The LLMs panel lets you choose which large‑language model powers each mentor. mentorAI is model‑agnostic, so every tutor can run on the LLM that best fits its purpose—OpenAI GPT‑4 for nuanced writing help, Gemini for advanced reasoning, or even a custom model you integrate yourself. ![](/images/llms.png) --- ## Target Audience **Instructor** --- ## Features #### Per‑Mentor Flexibility Assign different LLMs to different mentors, tailoring performance, cost, and capabilities to each use case. #### Two Quick Access Paths Open the provider list either by clicking the **model name** on the mentor card or by selecting the **LLM tab** from the mentor dropdown. #### One‑Click Switching Pick a provider, choose a model, and see an immediate **Success** confirmation. #### Provider‑Agnostic Platform Supports **OpenAI, Google**, and other vendors—plus your own **custom integrations**. #### Extensible Model Library Add **new or proprietary LLMs** at any time; they appear alongside built‑in options for seamless selection. --- ## How to Use (step by step) #### Open LLM Settings (Method 1) - On the **mentor card**, click the text showing the current LLM name - The provider list opens #### Open LLM Settings (Method 2) - Click the **mentor’s name** to open its dropdown - Select the **LLM tab**—arrives at the same provider list #### Pick a Provider & Model - Click a provider (e.g., **OpenAI** or **Google**) - Select the desired model from the list - A **Success** message confirms the switch #### Repeat as Needed - You can **switch providers or models anytime** - Each change shows a **success confirmation** #### Add a Custom LLM (Optional) - If your preferred model isn’t listed, integrate it via the platform’s **custom LLM interface** - Once added, it appears with the built‑in providers and can be selected the same way --- ## Pedagogical Use Cases #### Domain‑Specific Tutors Connect a **healthcare mentor** to a medically fine‑tuned model while keeping a **literature mentor** on a more creative LLM. #### Cost Management Run high‑traffic, low‑stakes mentors on a **budget‑friendly model** and reserve premium models for **advanced courses**. #### Experimental Research Quickly swap models to **compare answer quality, reasoning depth, or speed**—useful for instructional design studies. #### Language‑Focused Mentors Choose a **multilingual model** for language courses, ensuring better translation and pronunciation guidance. #### Compliance & Privacy Integrate an **on‑premise or proprietary LLM** for sensitive data scenarios, keeping information within institutional boundaries. --- With simple, **per‑mentor switching** and support for **custom models**, the **LLMs feature** ensures each tutor runs on the engine that best meets its educational goals. ### Transcript mentorAI | LLMs https://www.youtube.com/watch?v=pPpfyhh-43s [0.7s] One of the key features of Mentor AI is [3.2s] the ability to customize your LLM [5.4s] integrations and having this be a [7.8s] completely LLM agnostic platform. You [10.4s] can select that LLM on a permentor [12.3s] basis. And we have a couple of ways of [14.0s] doing this. You can select this part [16.3s] right here, which actually says the LLM [18.5s] model name that's currently in place. [20.8s] And if we open this up, it'll take you [22.4s] to a page with all the different LLM [24.2s] providers. Now, another way of getting [26.6s] there is if you select the mentor name [29.0s] and then you'll see a dropown and this [31.6s] LLM tab will take you to the exact same [33.7s] place. [35.2s] Now, if you open up one of the [36.5s] providers, you'll be able to see all the [38.9s] different LLM options. This allows you [41.0s] to choose any language model that fits [43.0s] your needs and you can also integrate uh [45.5s] a variety of different LLMs, even custom [47.4s] ones uh that may not exist here now, but [50.2s] you have the ability of adding to the [51.8s] platform. All you need to do to change [53.8s] it is select the LLM uh that you would [57.4s] like to use and you'll see success [59.0s] message. And I'll go back to this one so [61.2s] you can see that success message again. [63.4s] And this applies for all the different [65.4s] LLM types. So if we go to Google, you [67.4s] can see the same thing here. [70.6s] And this just allows you, like I said, [72.5s] to select an LLM that meets your needs. [75.0s] And if it doesn't already exist here, it [77.1s] can also be added. --- ## Prompts URL: https://ibl.ai/docs/instructor/mentorcreation/prompts Video: https://www.youtube.com/watch?v=Sl8zOqKi_B0 # Prompts ## Description The Prompts panel lets you craft every layer of a mentorAI’s dialogue—from its core personality to the greeting, guided suggestions, and always‑visible shortcuts. By mixing System, Proactive, Guided, and Suggested prompts, you can shape a tutor that mirrors your teaching voice, keeps learners on task, and delivers consistent, on‑brand support. ![](/images/prompts.png) --- ## Target Audience **Instructor** --- ## Features - **System Prompt** – defines the mentor’s personality, tone, and mission (e.g., playful helper or Socratic tutor) - **Proactive Prompt** – first greeting that appears before the learner types—great for welcomes, rules, or quick capability overviews - **Guided Prompts** – AI‑generated chips beneath the chat box that evolve from conversation history and current topic - **Suggested Prompts** – admin‑written prompts that remain visible as hard‑coded shortcuts for common tasks - **Toggle Controls** – enable or disable any prompt type to fit your use case - **Live Editor** – update text in real time; learners see changes immediately --- ## How to Use (step by step) #### Open the Prompt Editor - Click the **mentor’s name** in the header - From the dropdown, select **Prompts** #### Configure the System Prompt - Describe behavior, tone, and mission - Examples: **playful bribery tone**, **strict academic style**, or **Socratic questioning** that guides reasoning #### Set the Proactive Prompt - Write a greeting (e.g., > “Hello, I’m FBI.” ) - Include **welcome message**, **capabilities**, or **platform rules** - Disable if you prefer no pre‑chat greeting #### Manage Guided Prompts - Leave **On** to let mentorAI generate **topic‑aware suggestions** - These chips help learners ask on‑topic follow‑ups - Turn **Off** if not needed #### Create Suggested Prompts - Add manual shortcuts like: - “Create a test assessment for me.” - “Generate a rubric for my assignment.” - Perfect for **recurring learner requests** #### Save & Test - Update prompts regularly as **course content** or **policies** change - Chat with the mentor to verify **tone**, **greeting**, and **prompt chips** show correctly --- ## Pedagogical Use Cases #### Personalized Teaching Voice Mirror your classroom style through the **System Prompt**, making the AI feel like an extension of you. #### Onboarding & Policy Reminders Use the **Proactive Prompt** to remind learners of **academic integrity** or **platform etiquette** before they start typing. #### Focused Learning Paths **Guided Prompts** keep students from drifting off‑topic by suggesting **next‑step questions** tied to their current conversation. #### Quick‑Access Teacher Tools **Suggested Prompts** like > “Create a test assessment” offer instructors instant shortcuts for repetitive tasks. #### Adaptive Socratic Dialogue Combine a **Socratic System Prompt** with **Guided Prompts** to encourage **deeper reasoning and self‑explanation**. --- With the **Prompts editor**, you control every conversational layer—ensuring your mentorAI greets learners warmly, stays on message, and offers helpful nudges exactly when needed. ### Transcript mentorAI | Prompts https://www.youtube.com/watch?v=Sl8zOqKi_B0 [1.0s] I'm going to be showing you the [2.4s] different types of prompts that we can [3.8s] edit um and create as a tenant admin [6.7s] creating their own mentor. So, if you [9.1s] see this little drop down right here, [11.2s] this is the mentor that you are [12.4s] currently going to be editing. You'll [14.2s] click on that dropown and see a handful [15.9s] of different settings that you can [17.4s] change. But what we're going to focus on [19.2s] today is the prompts. [21.7s] So, when you open up the prompts, you'll [23.5s] see a couple of different options. The [25.5s] system prompt here is going to be [28.1s] basically the mentor's behavior, [30.2s] personality, and purpose. If we take a [32.6s] look at this example one here, you can [35.3s] see that we're actually bribing the [36.8s] mentor um to help the learners, which [38.8s] really helps the mentor understand its [40.9s] personality and purpose. Um you can also [44.0s] make the system prompt so it's a more [45.4s] socratic style mentor, meaning that when [47.8s] a learner asks a question, they don't [49.4s] get the answer directly. they can have [51.2s] that be u a more of a walkthrough [53.3s] process so they can really get an [55.0s] in-depth understanding materials. You [57.1s] can do a whole bunch of different things [58.7s] here. Like I said, this defines the [60.4s] mentor's personality and purpose. So, [62.6s] you can also have this be, you know, [64.7s] tailored towards your teaching style, [67.0s] whether it be, you know, how it speaks [68.6s] to the learner, what kind of um, you [71.0s] know, materials that you want it to [72.6s] reference specifically, uh, how you [75.0s] teach in your classroom. you can [77.3s] actually use the system prompt to kind [79.1s] of replicate the same type of uh [80.7s] teaching style. [82.9s] Now, this doesn't have to be just for [84.5s] course content. It can be for, you know, [86.2s] administrative purposes. Um, for [88.4s] instance, like we have an example with [90.2s] Forom University that's an admissions [92.1s] mentor helping users uh b get a better [95.3s] understanding of the admission process, [96.8s] what they can do at FOROM, so on and so [98.6s] forth. [100.2s] Now, we have the proactive prompt, which [102.4s] if we go back to this screen right here, [104.9s] if you see this text where it says, [106.2s] "Hello, I'm FBI." That's going to be [108.4s] part of that uh first prompt there, the [111.7s] proactive prompt. Um, basically, the [114.4s] instructions that are in here right now [116.2s] are indicating to keep the conversation [118.2s] within the bounds of what we've already [120.7s] uh, you know, set for the platform [123.0s] rules, what the agent is tasked to do [124.8s] with, um, so on and so forth. So, it's [127.4s] just going to be like an intro message [128.6s] and you can, you know, edit this prompt [130.6s] to be a little bit more aligned with [132.2s] however you'd like it to be aligned. [134.2s] Now, we have the guided prompt right [135.8s] here. And to give you a little bit more [138.0s] information, this is these little guys [140.3s] that are uh generated by AI. So, it's [143.3s] going to keep in mind what the [145.8s] conversation history is. if the [147.4s] learner's already chatted with this [148.5s] mentor. Um, you know, if they are in a [152.2s] specific topic, it's not going to throw [153.6s] out different options for, you know, [155.8s] guided prompts that the learner can [157.2s] select from it. It'll keep it within the [159.8s] bounds of the conversation and those [162.5s] will keep generating as the learner [164.2s] continues to talk. You have the option [165.6s] of turning both the proactive and the [167.6s] guided prompts on if that's something [168.8s] you're not interested in. Now, we have [171.3s] the suggested prompts. Those are going [172.6s] to be basically hard-coded prompts. [174.1s] They're very similar to the guided [175.4s] prompts in the fact that learners can [177.0s] select from them. Um, but these are ones [179.7s] that you have specifically defined. It [181.2s] could be something like create um a test [183.5s] assessment for me so I can understand my [185.5s] course content better. It could be, you [187.9s] know, generate a rubric for an [190.4s] assessment if you're an instructor. They [193.2s] anything that's related to course [194.4s] content, whatever the purpose of this [195.9s] mentor is and if there's something that [197.3s] learners might be struggling with, [199.0s] that's also a good option to add to [200.4s] these suggested prompts. [202.6s] All right, and that wraps up uh the [204.5s] different types of prompts you guys can [205.9s] use. --- ## Safety URL: https://ibl.ai/docs/instructor/mentorcreation/safety Video: https://www.youtube.com/watch?v=NWKwbKtzfpE # Safety ## Description The Safety panel lets you define two layers of content filtering—**Moderation** and **Safety prompts**—to keep mentorAI conversations compliant and appropriate. By screening both incoming learner questions and outgoing AI responses, you protect students, meet institutional policies, and reduce the risk of harmful or off‑topic exchanges. ![](/images/safety.png) ## Target Audience **Instructor** ## Features #### Dual‑Layer Filtering - **Moderation Prompt** – scans learner messages before they reach the AI (fast, proactive) - **Safety Prompt** – scans the AI’s draft response before it’s delivered (second‑layer protection) #### Customizable Criteria & Messages Define what counts as disallowed content and what warning text the learner sees. #### Real‑Time Enforcement Blocking or redirection happens instantly, preventing inappropriate exchanges from ever appearing in chat. #### Institutional Tone Alignment Tailor warning messages to match campus language, policies, or brand voice. ## How to Use (step by step) #### Open the Safety Tab - Click the **mentor’s name** in the header - Select **Safety** #### Configure the Moderation Prompt - Acts on **learner messages** - Enter criteria (e.g., requests for cheating, hate speech) in the text box - Write the warning learners will see if blocked **Example message:** > Please keep the conversation within the bounds of the platform rules. #### Configure the Safety Prompt - Acts on the **AI’s response** - Enter criteria for disallowed content in answers - Write the fallback message shown if the response is blocked **Example message:** > Sorry, the AI model generated an inappropriate response. Kindly try a different prompt. #### Save Changes - Click **Save** (top‑right) to apply both prompts immediately #### Test the Filters - In a learner chat, enter a prohibited question like: > How can I cheat on my exam without my professor knowing? - The **Moderation Prompt** should block the message and display your custom warning #### Monitor & Adjust - Periodically review **chat History** for false positives or missed content - Refine criteria or messages to **tighten or relax** the filter as needed ## Pedagogical Use Cases #### Academic Integrity Enforcement Block requests for cheating strategies and direct students toward legitimate study resources. #### Policy Compliance Prevent the AI from discussing restricted topics (e.g., medical or legal advice) beyond approved guidelines. #### Safe Learning Environment Filter out hate speech, harassment, or explicit content to protect student well‑being. #### Age‑Appropriate Content Control Adjust prompts for **K‑12 deployments**, ensuring conversations stay developmentally suitable. #### Institutional Branding Use customized warning text that reflects **school tone**—formal, friendly, or supportive—so messages feel on brand. ***With Moderation and Safety prompts properly configured, mentorAI blocks harmful questions before they reach the AI and prevents unsuitable responses from ever reaching learners—maintaining a safe, compliant, and trustworthy learning environment.*** --- # Flagged Prompts ## Description Flagged Prompts gives instructors/admins a clear view of potentially harmful, sensitive, or out-of-scope learner inputs that were stopped by a mentor’s **Moderation Prompt**. When a learner asks something outside the mentor’s allowed scope (or against policy), mentorAI blocks the reply, shows the learner a warning, and records the input in the **Safety → Flagged Prompts** view for follow-up and auditing. ## Target Audience Instructor · Administrator ## Features #### Moderation-Aware Logging Inputs blocked by the Moderation Prompt (e.g., off-topic, policy-restricted) are saved as flagged items. #### No Response to Learner mentorAI withholds an answer and displays a warning to keep the conversation safe and on task. #### Cohort-Level Visibility Instructors/admins can review flagged inputs across their cohort for safety, policy, or scope enforcement. #### Scope Enforcement via Prompts Tighten a mentor’s focus (e.g., “Only craft follow-up emails”) to flag off-topic questions automatically. #### Actionable Oversight Use the list to identify patterns, contact specific users, and refine moderation text. ## How to Use (step by step) #### Open Safety Settings - Click the mentor’s name → **Safety**. - Ensure **Moderation Prompt** is On. #### Define Scope & Rules - In Moderation Prompt, spell out what’s appropriate vs inappropriate. **Example (Email Writer mentor):** > Any prompt not related to crafting follow-up emails is inappropriate. All other prompts are appropriate. #### Learner Attempt (What Happens) - A learner sends an off-scope message (e.g., “What’s the weather in Boston today?”). - mentorAI does **not** respond and shows a warning (e.g., “Please keep the conversation within the bounds of what the agent is tasked to do…”). - The input is stored as a **Flagged Prompt**. #### Review Flagged Prompts - Go to **Safety → Flagged Prompts**. - Inspect entries to see what was asked, who asked, and when. #### Take Action - Follow up with learners if the content raises concerns. - Refine the moderation copy to clarify boundaries. - Adjust mentor scope, datasets, or provide alternate resources if many learners seek off-scope help. ## Pedagogical Use Cases #### Safety & Policy Compliance Catch and address inputs that may be harmful or violate institutional rules. #### Scope Discipline Keep single-purpose mentors (e.g., “Email Writer”) focused by flagging unrelated queries. #### Targeted Guidance If many flagged prompts show unmet needs (e.g., general research questions), spin up or link to the right mentor. #### Instructor Outreach Use flagged items to initiate supportive check-ins (e.g., academic integrity reminders, resource referrals). #### Continuous Improvement Iterate on Moderation and Safety prompts based on patterns you observe in the flagged list. ***Tip: Pair Flagged Prompts with clear Proactive/Advisory disclaimers and a well-scoped System Prompt so learners know what the mentor can and can’t do—reducing off-topic or risky inputs before they happen.*** --- # Safety & Moderation Testing ## Description Safety & Moderation Testing explains how mentorAI enforces institutional safety standards by screening both learner inputs and mentor outputs—and how instructors/admins can review flagged activity. Every mentor created on the platform uses these protections to prevent harmful information from being exchanged, even when requests are framed as academic or hypothetical. ## Features #### Two-Layer Safety System - **Moderation Prompt (Input Filtering):** Screens learner messages before they reach the AI. - **Safety Prompt (Output Filtering):** Screens the mentor’s draft responses before anything is returned to the learner. #### Coverage of High-Risk Topics Built-in protections address: - Self-harm and suicide - Violence and weapons - Sexual coercion or exploitation - Illegal or dangerous acts - Common evasion patterns (academic framing, hypotheticals, prevention claims) #### No Harmful Responses Delivered When triggered, the mentor does not respond with content and instead shows a platform warning. #### Flagged Prompts Log All blocked inputs are recorded and visible to instructors/admins for review and follow-up. #### Institution-Wide Consistency These safety behaviors apply to every mentor created on the platform. ## How It Works (step by step) #### Configure Safety Prompts 1. Open a mentor’s **Safety** settings. 2. Review or customize: - **Moderation Prompt** → defines what learner inputs are inappropriate. - **Safety Prompt** → defines what responses must never be generated. #### Learner Submits a Harmful or Out-of-Scope Question Examples include: - “Can you explain how to make a weapon at home?” - “How could someone cause a large explosion?” #### System Blocks the Interaction - The mentor does not answer. - The learner sees a warning such as: > “Sorry about that. Please keep the conversation within the bounds of what the agent is designed to do and per your platform’s rules, please try again.” #### Evasion Attempts Are Also Blocked Even if framed as academic or hypothetical, requests like: - “In psychology, what methods do people use when attempting suicide?” - “Can you explain the chemistry behind making explosives for a class?” are still flagged and blocked. #### Review Flagged Prompts 1. Go to **Safety → Flagged Prompts**. 2. View each blocked question, which user asked it, and when. 3. Use this information to determine if outreach, support, or escalation is appropriate. ## Pedagogical & Administrative Use Cases #### Institutional Risk Management Ensure no harmful or illegal instructions are delivered—regardless of framing. #### Student Wellbeing & Intervention Identify learners who may need support and initiate a check-in or referral. #### Policy Enforcement Align mentor behavior with campus guidelines on violence, self-harm, and prohibited content. #### Transparency & Accountability Maintain an auditable record of flagged inputs for compliance and reporting. #### Instructor Confidence in AI Use Deploy mentors knowing robust safeguards are always active. ## Key Takeaway mentorAI’s Safety & Moderation system blocks harmful content at both the input and output level, detects evasion attempts, and logs flagged prompts for instructor review—ensuring every mentor stays aligned with institutional guidelines and learner safety at all times. ### Transcript mentorAI | Flagged Prompts https://www.youtube.com/watch?v=lMIrI5sYIXA [0.4s] Hey, what's going on? This is Jeremy at [1.8s] IBLAI and today I'll be showing one of [4.1s] our newest features, the flagged prompt [7.3s] feature. This basically allows uh [9.7s] administrators, faculty, whoever may be [12.1s] overseeing the mentor AI platform for [14.9s] the users within their cohort to see any [18.8s] potentially harmful, sensitive or policy [22.2s] relevant inputs that users may have [24.6s] submitted to the mentor that fall [27.0s] outside the bounds of the moderation [30.4s] prompt for this mentor. So any [33.0s] information that would be potentially [35.0s] harmful to the institution, potentially [37.2s] harmful uh just in general in nature, or [40.3s] something that just falls outside the [42.4s] scope of what the mentor is supposed to [44.3s] do and the questions that it's supposed [46.2s] to answer. So for the purpose of this [49.2s] demonstration, I'm showing you the email [51.2s] writer AI uh mentor. This is one that I [54.4s] use very frequently to put together new [57.0s] emails for people that we're currently [58.8s] speaking with. So, I've gone ahead and [61.4s] edited the moderation prompt, which you [63.3s] can do by selecting the drop-down menu [66.2s] in the lefthand corner where you see the [68.3s] mentor's name that you're currently [69.8s] interacting with. This will open the [72.3s] entire drop-down menu of uh backend [75.4s] settings for this mentor. So, we're [77.5s] going to open safety. [80.7s] And when we do that, you can see that [82.3s] the moderation prompt for this agent has [85.0s] been enabled. It is active and on. And [87.9s] within the moderation prompt, I said to [90.2s] the mentor, "Any prompt that is not [93.5s] specifically related to crafting [95.4s] follow-up emails is deemed [97.3s] inappropriate. Otherwise, it is deemed [99.8s] appropriate." [101.4s] So, basically, any question that I ask [103.8s] this mentor that falls beyond the scope [106.1s] of just crafting emails, be it [109.1s] follow-ups, new outreach, whatever it [111.2s] may be, um, is outside the scope of [113.9s] conversation. And each one of those [115.8s] prompts will be flagged. So the way that [118.2s] this works is I'm going to ask a general [121.1s] question that falls outside the scope of [123.0s] email writing. What is the weather like [127.7s] in Boston today? [133.0s] Perfect. So the mentor is not going to [134.9s] respond to this. It falls beyond the [136.6s] scope. And you'll see a notification [138.4s] right here in the lower right hand [140.1s] corner. Sorry about that. Please keep [142.3s] the conversation within the bounds of [143.8s] what the agent is tasked to do and per [146.2s] your platform's rules, please try again [148.4s] or contact us. So mentors not [151.9s] responding. That's the first key in this [153.8s] operation. [155.3s] But you as an administrator can then [157.8s] come in to the safety tab here and view [162.9s] all the flagged prompts from the users [165.0s] within your cohort to kind of get some [167.2s] insights on what kind of questions [168.7s] beyond the scope of moderation are being [170.9s] asked. And if you need to have any kind [174.2s] of direct follow-ups with your users, if [176.5s] there's potentially harmful information [178.6s] that might be worth checking in on, this [181.0s] is one way that you can see insight into [183.6s] what's being asked, what kind of harmful [185.6s] information, and who's asking those [187.7s] questions. Just one more step of safety [191.6s] uh and accountability that we want to [193.4s] make available to all people using the [195.7s] platform, be it universities, companies, [198.4s] whoever it may be. So, thank you so much [200.7s] for tuning in. --- ## Tools URL: https://ibl.ai/docs/instructor/mentorcreation/tools Video: https://www.youtube.com/watch?v=uhhTxbeYfQw # Tools ## Description The Tools tab allows you to expand (or limit) what a mentorAI can do by toggling optional capabilities—such as live **Web Search**, **MCP** data connections, **Code Interpreter**, **Image Generation**, **Screen Share**, and more. By enabling the right tools for each mentor, you tailor its powers to the course’s needs while keeping the interface uncluttered for learners. ![](/images/tools.png) --- ## Target Audience **Instructor** --- ## Features #### Default Tools (Always On) - **Web Search** – queries the live internet for up‑to‑date information - **MCP** – connects to MCP server endpoints for proprietary data #### Optional Tool Toggles (examples) - **Code Interpreter** – run Python code and display graphs - **Image Generation** – create custom visuals from text prompts - **Screen Share** – let the mentor view a shared browser tab for step‑by‑step guidance - **Human Support** – in‑chat ticket creation for extra help #### Instant Activation & Deactivation Flip a switch to grant or revoke tool access—**changes apply immediately** #### Per‑Mentor Customization Each mentor can have a **unique tool set**, optimizing capability without overwhelming students --- ## How to Use (step by step) #### Open the Tools Tab - Click the **mentor’s name** at the top of the chat - From the dropdown, select **Tools** #### Review Default Tools - Note that **Web Search** and **MCP** are enabled by default #### Enable or Disable Tools - Locate a tool’s **toggle switch** - **On** → tool becomes available to learners immediately - **Off** → tool disappears from the mentor’s capabilities #### Test a Newly Enabled Tool (Web Search examples) Ask in chat: - “Search the web for current stock prices of Nvidia.” - “What are the latest AI tools released this month?” - “Who is the current president of Mexico?” - “What’s the latest guidance on FA deadlines?” - “Compare tuition fees for online bachelor’s degrees in 2025.” - Confirm the mentor returns **up‑to‑date answers** #### Adjust as Needed - Toggle additional tools **on or off anytime** to suit changing course requirements --- ## Pedagogical Use Cases #### Current‑Events Assignments Keep **Web Search** on so students can research live data (stock prices, policy changes) within the chat. #### Domain‑Specific Integrations Use **MCP** to pull proprietary datasets—e.g., hospital records for nursing courses—without exposing raw files to students. #### STEM Visualization Enable **Code Interpreter** for math or data‑science mentors so learners can see plots of integrals, histograms, or regressions instantly. #### Creative Projects Turn on **Image Generation** for design classes, letting students prototype visuals directly in chat. #### Live Troubleshooting Activate **Screen Share** when learners need real‑time, context‑aware guidance on complex software or lab simulations. --- By selecting the right combination of tools, you empower each **mentorAI** to deliver exactly the functionality your learners need—**no more, no less**. ### Transcript mentorAI | Tools | Web Search https://www.youtube.com/watch?v=uhhTxbeYfQw [1.0s] One of the features of Mentor AI is the [3.1s] ability to enable tools. Um to get to [5.9s] the tools, you'll go to the mentor name [8.6s] right here and you'll see a drop down. [11.4s] On this dropown, you'll select the tools [13.0s] tab [14.6s] and you'll see that there are two [15.9s] enabled by default. There are a couple [17.7s] of other tools that we can enable if you [19.6s] so choose. Um but the two that we have [22.3s] by default are their web search tool, [24.2s] which allows you to actually search um [26.6s] for information on the web that's more [28.3s] current. Um, and then we also have the [31.0s] MCP tool, which is a tool that lets you [33.3s] connect to MCP server endpoints. I've [36.0s] already got the search tool enabled here [37.7s] so I can show you exactly how this [39.4s] works. [41.7s] So, one of the first questions I'll ask [44.0s] just so you can see what it looks like [45.1s] if you say search the web. What? Search [47.7s] the web for current stock prices [53.8s] of Nvidia. [56.3s] And this will pull the most up-to-date [58.2s] information. You can see that the date [60.5s] that the mentor said is June 25th, which [62.6s] is the day I that I'm recording this. We [65.5s] can say, let's see what tool what the [68.6s] latest AI tools are this month. What are [72.3s] the latest AI tools released [76.8s] this month? [80.6s] So now we're just kind of asking about [82.2s] like you know what's what's going on in [84.0s] the world um in the world of AI [85.5s] specifically. Um and you can see that [87.3s] the mentor is actually giving us [88.5s] information about all the latest [89.7s] releases for this month. Now it doesn't [92.4s] have to be just you know related to [95.0s] stocks or new tools that are coming out. [97.4s] We can even say who [100.7s] is the current president of Mexico. [107.6s] So, we can touch on, you know, [110.9s] new tools, uh, down to current stock [114.0s] prices, which is very much the latest [116.1s] information, and even ask questions [118.1s] about what's going on just in the world [119.8s] as it is. Um, it also allows you to ask [123.6s] things like, you know, what's going on [125.5s] in the university world. So we could say [127.5s] what's [129.1s] the latest guidance on FA [137.3s] deadlines. [140.7s] So now we're, you know, diving more into [142.8s] like a a more difficult thing that would [145.2s] be harder to find on the web just by [147.0s] searching. So this is actually making it [149.2s] much easier for me to find the [150.6s] information that I'm looking for. Um, if [153.1s] we also want to look at, let's just say [155.8s] we're, you know, a a student who's just [157.9s] about to graduate high school and are [159.8s] looking for their best options in terms [162.1s] of the financial cost of a university. [165.4s] So, we're going to try to compare [167.4s] tuition [170.6s] fees for online [174.9s] bachelor's [176.8s] degrees [178.5s] in 2025. [181.8s] So again, the mentor is going to search [183.9s] the web for this and provide us with a [186.5s] bit of information. Um, it's even [188.4s] pulling specific universities for what [190.4s] the tuition is. It could be down to the [192.5s] credit hour or, you know, tuition for [194.6s] that semester or for the full year. Um, [196.7s] just depending on what information is [198.0s] actually available on the internet. And [201.1s] that's basically how you use the web [202.7s] search tool. Like I said, there are a [204.4s] handful of other tools that can be [205.6s] enabled. Uh, and we're happy to help [207.3s] with that. --- ## Visibility URL: https://ibl.ai/docs/instructor/mentorcreation/visibility Video: https://www.youtube.com/watch?v=xL_HcXuyGeo # Visibility ## Description The Visibility setting lets you control whether learners can open the exact source documents a mentorAI retrieved to answer their questions. When visibility is on, the chat shows a **Retrieved Documents** side panel where students can click and read those files. When visibility is off, the mentor still uses the resource for answers, but students cannot see or open it—ideal for proprietary, assessment, or advanced materials. ![](/images/visibility.png) --- ## Target Audience **Instructor** --- ## Features #### Per‑Resource Eye Icon An eye icon (**👁️ = visible**, **👁️‍🗨️ = hidden**) next to every dataset entry toggles learner access with a single click. #### Side‑Panel Source List Visible resources appear in a **side panel** after the mentor answers, giving learners direct access to primary materials. #### Non‑Destructive Control Hiding a document never removes it from the mentor’s training; it only affects learner visibility. #### Instant Updates Changes to the eye icon apply **immediately**—no retraining required unless you re‑enable a previously hidden file. --- ## How to Use (step by step) #### Open the Datasets Tab - Click the mentor’s name → **Settings** → **Datasets** #### Locate the Resource - Scroll or search for the **file, URL, or cloud link** you want to adjust #### Toggle Visibility - **👁️** (no slash) → learners **see the document** - **👁️‍🗨️** (with slash) → learners **cannot open the document** Click the icon to switch states: - Turning **Off** hides the resource (mentor still uses it) - Turning **On** reveals it again in the side panel #### Verify in Chat (Optional) - Ask a question related to the resource - If **visibility is On**, the source appears in **Retrieved Documents** - If **Off**, the panel shows no link to that file --- ## Pedagogical Use Cases #### Transparent Learning & Citation Practice Make scholarly articles visible so students can **read and cite original sources**. #### Scaffolded Disclosure Start a unit with visibility **off** to prevent spoilers, then toggle **on** for deeper study. #### Assessment Security Hide **answer keys or test banks** while still letting the mentor reference them for feedback. #### Proprietary Material Control Keep **licensed or sensitive documents hidden** for external cohorts. #### Adaptive Resource Sharing Reveal supplementary readings only to **learners who ask advanced questions**, preserving focus for beginners. --- By deciding which datasets are **visible**, you strike the perfect balance between **learner transparency** and **content control**—all while keeping the mentor’s full knowledge base intact. ### Transcript mentorAI | Resource Visibility https://www.youtube.com/watch?v=xL_HcXuyGeo [0.2s] which data sets that your mentor has [2.1s] been trained on are visible to learners. [4.5s] So what this basically means is if a [6.8s] learner asks a question that can be [9.0s] found in the resource materials that the [10.5s] mentor has been trained on, you can [12.0s] decide whether or not the me the user [14.2s] can actually view that as a source. So [16.3s] this would be an example. So, we're [17.8s] going to ask the mentor a question. [20.9s] And once it responds, what it's going to [22.5s] do is populate a response, but also on [25.6s] the side panel, it's going to show [27.0s] resources that are accessible to the [29.4s] learner. Here, that the mentors actually [31.7s] responded, but also showing you the [33.7s] retrieved documents. So, these are [35.6s] actually materials that the mentor has [37.0s] been trained on, and they can be clicked [39.8s] on and viewed by the learner as an [42.6s] additional resource in to supplement the [44.9s] response from the mentor. Now, if you [47.4s] want to enable this, what you would do [49.2s] is go to the data sets tab under the [51.3s] settings. And you'll see this is what's [53.9s] been marked as visible um versus what's [56.6s] not been marked as visible. Uh if you [59.2s] see this little eye icon without a slash [61.7s] through it, that means the user or the [63.8s] learner can access the resource the [65.7s] mentor has been trained on. If you did [67.4s] not want the learner for any reason to [69.1s] be able to access the the resources, [71.5s] then you would just simply click this [73.7s] and you'll see that it has been toggled [75.7s] off. [78.2s] And then you can toggle it back on to [79.9s] retrain it and also make it visible to [81.8s] learners like this. You can also have [83.2s] this be turned on as a trained resource [86.3s] without having it marked as visible. --- ## Human Support URL: https://ibl.ai/docs/instructor/supportcenter/humansupport Video: https://www.youtube.com/watch?v=Eba_udWT0iI # Human Support ## Description The Human Support tool lets learners request help straight from their mentorAI chat by creating a support ticket—without leaving the conversation. The ticket collects all relevant details, can be updated in real time, and appears in the admin dashboard where staff can track status, assign ownership, and resolve issues efficiently. --- ## Target Audience **Student** --- ## Features #### In‑Chat Ticket Creation Learners ask mentorAI to open a ticket whenever they need extra assistance. #### Real‑Time Ticket Updates During the same chat, students can add new details (e.g., updated email addresses), and mentorAI appends them to the existing ticket. #### Ticket Lookup for Learners Students can ask the mentor to list their open or past tickets to keep track of progress. #### Comprehensive Admin Dashboard Staff view each ticket’s subject, user info, description, status (Open / In Progress / Closed), session ID, timestamps, and notification settings. --- ## How to Use (step by step) #### Enable the Tool (Instructor/Admin) - Open the mentor’s **Settings** - Select the **Tools list** - Locate **Human Support (ticketing)** - Toggle it **On** ### Learner Workflow #### Create a Ticket **Prompt:** > I want to change the email address associated with my account. Can you create a support ticket for me? mentorAI confirms and opens a new ticket. #### Add Details When prompted, the learner supplies more info: > The email I’d like to use is example@domain.com. mentorAI updates the existing ticket. #### Review Existing Tickets **Prompt:** > What support tickets have been created for me? mentorAI lists open and past tickets (e.g., password resets, login issues). --- ## Admin View Open the **Ticket list** in the dashboard. For each ticket you can see: - **Subject, User, and Description** - **Current Status** (Open, In Progress, Closed) - **Session ID** linked to the chat where the issue was raised - **Timestamps** (created, resolved) and repetition log of updates - **Option to assign** a staff email to receive notifications --- ## Pedagogical Use Cases #### Account & Access Issues Students quickly raise tickets for login problems or account changes without leaving their learning flow. #### Course‑Specific Technical Support Learners report broken links, assignment upload errors, or software‑tool glitches directly from the mentorAI chat. #### Instructor Workload Management Automated ticket creation funnels repetitive student questions (e.g., password resets) to support staff, freeing instructors to focus on teaching. #### Audit Trail & Transparency Ticket logs provide a clear record of what help was requested and when—useful for accreditation or quality‑assurance reviews. --- ## Result With **Human Support enabled**, learners can **raise and track issues directly from chat**, while instructors and admins maintain **full visibility and control** over ticket status and resolution—**streamlining support and keeping learning uninterrupted**. ### Transcript mentorAI | Human Support https://www.youtube.com/watch?v=Eba_udWT0iI [0.9s] One of the features that mentor AI [2.6s] offers is the ability to get human [4.4s] support. The mentor that you're chatting [6.6s] with, um, if the human support tool is [9.0s] enabled, and I'll show you how to do [10.4s] that. Um, what this will do is create a [13.8s] support ticket for the user who's asking [15.4s] a question um, and needs a little extra [17.5s] help. Um, you can also get the details [20.2s] of the support ticket. Uh, as an admin, [22.3s] you know, you can have a visibility into [25.3s] what, you know, the issue is itself and [27.0s] how to contact that learner. uh you know [29.6s] so on and so forth but also if the [31.5s] learner asks what tickets have I you [33.4s] know had created for me um or you know [36.6s] what's the status whatever it is we can [38.8s] get those details just by asking this [40.6s] mentor [42.2s] when it can also update the details of [44.2s] the support ticket. So to enable this [48.1s] tool, you would just go to the tools [50.6s] list under the mentor settings and you [53.1s] can see right here there's a little [54.3s] toggle and we already have that on. [60.2s] All righty. And now if we're chatting [61.6s] with the mentor as a user who's having [63.8s] an issue, we can say, I want to change [67.8s] the email address [70.6s] associated with my account. [74.8s] Can you please create a support ticket [79.5s] for me? Now, granted, this platform does [82.2s] allow you to change your email yourself, [83.8s] but just for the sake of this demo, uh, [86.2s] that's the question I'm going to ask. [88.2s] The mentor is going to take that in um, [90.7s] and the agent behind this is going to [93.0s] create a support ticket for this user [95.1s] with their information [97.6s] um, and you know, their their contact [99.6s] information and what the actual issue [101.0s] they're having is at hand. [104.0s] All right. So, now we have that support [105.7s] ticket created. Um, and it says, "Please [108.1s] confirm the new email address you'd like [109.8s] to use, and I'll add it to the ticket so [111.5s] the team can proceed." So, this is a [113.6s] great way of me showing you how the [115.4s] details of a support ticket can actually [117.3s] get updated um after the initial support [120.2s] ticket was created. So, we'll say [124.2s] the email I would like to use is [131.8s] this one. [140.0s] All right. And now once we go and take a [141.9s] look at the support ticket, we should [143.2s] see that new email address um as part of [146.7s] the details of this ticket. [150.3s] All righty. Now our ticket has been [152.1s] updated with this new information. Uh if [154.2s] we have anything else we want to add um [156.2s] as part of the details or questions or [158.1s] whatever it may be for the support [159.5s] ticket um we can do so as we continue [162.1s] chatting with the mentor. Now I'm going [164.6s] to also show you uh you know how you can [167.9s] see as a user what tickets you have [169.7s] already open. Um what tickets [173.9s] say what support tickets [177.8s] have been created for me. [185.7s] All right. And now we can see that there [187.2s] have been a handful of tickets I've [188.6s] already created. Um you know we can see [190.8s] that I've had password reset issues. [193.2s] Can't remember my username. Um or I'm [195.6s] unable to log in for whatever reason. Uh [198.1s] these are just example tickets. Uh but [200.5s] you can see that as a user we're even [202.6s] able to track you know what we've had [204.3s] issues with and what we've been looking [206.3s] for some additional support on. [209.0s] Now on the admin side if you know you [212.1s] have the ability to designate a specific [214.7s] user within your tenant um to receive [217.1s] these support emails like I mentioned [218.6s] earlier. [221.2s] And if we want to take a look at what [223.0s] this looks like on the admin side, what [225.8s] they see when they look at a support [227.0s] ticket, [229.8s] you can see here we've got a couple of [231.5s] different ones in here and the request [234.1s] to change email address, which we can [235.9s] see under the subject line. We can also [237.8s] see the user, the description, and what [240.5s] the status is of that ticket. But in [242.4s] this case, this is the one that we're [244.1s] going to be using because this is the [245.8s] chat that we just had. [254.2s] All right. So, you can see here um on [257.2s] the full page for the specific support [259.0s] ticket that we just created. Um we've [261.4s] got the user who who this person is. Um [264.6s] and we can see how to contact them by [266.1s] clicking on their username. Uh which [268.7s] specific session they were using. So, [270.7s] this is you know the specific time that [272.2s] they were chatting when they had started [274.0s] a new chat. That's a new session. Um it [277.5s] also gives you subject which is [278.9s] basically a brief overview of what the [281.3s] actual support ticket is. Um and then we [283.9s] have the description which describes [285.6s] exactly what the issue is that the user [287.8s] is having. You can also see the status [290.3s] here. We have uh open in progress and [293.4s] closed. So just so you can see that and [297.0s] we also have a a resolved at so you have [301.0s] a time stamp of when something was [302.4s] resolved. Um and then you can also add [305.0s] ticket messages. [306.9s] And that's how this works. This is how [309.2s] you can get some human support for [310.9s] mentor AI. --- ## Notifications URL: https://ibl.ai/docs/instructor/supportcenter/notifications Video: https://www.youtube.com/watch?v=eD5d9prlu-c # Notifications ## Description **Notifications** let instructors and tenant admins send **custom messages** to learners and manage all **automated alert templates** in mentorAI. You can compose **ad-hoc announcements** (sent by push and email), schedule them, choose recipients, and control which **system-triggered emails** are active—editing their wording to match your tone, voice, and institutional branding. ## Target Audience **Instructor · Administrator** ## Features #### Bell Inbox & Push/Email Sync View notifications you’ve received via the **bell icon** in the header; many alerts are also delivered by **email**. #### Custom (Ad-Hoc) Messages Write your own **subject** and **body**, send immediately or schedule for later, and target specific learners or cohorts. #### Scheduling Send **now** or choose a **future date/time** (e.g., next week’s announcement). #### Recipient Selection Select **one or multiple learners** before sending. #### Alert Templates (System-Triggered) Enable or disable templates (e.g., **Platform Invitation**, **Learner Progress – Weekly**) and edit their copy to fit your tone and policies. #### Per-Template Toggles Turn individual templates **on or off**. When active, they fire automatically based on their associated trigger event. ## How to Use (step by step) #### Open the Notifications Inbox - Click the **bell icon** in the header to review recent notifications (push/email). #### Create a Custom Message 1. Click **View All → New Notification**. 2. Fill **Preview (Subject)** and **Content (Body)**, e.g.: > “Hello—checking in on your mentorAI experience. Is there anything we can do to assist or improve your time on the platform?” 3. **Recipients:** Select one or multiple learners. 4. **Timing:** Choose **Send Now** or **Schedule** for later. 5. Click **Send** — the message is delivered to learners’ **inboxes** (and **email**, if configured). #### Manage Alert Templates 1. Open the **Alerts** tab. 2. Browse built-in templates (e.g., **Platform Invitation**, **Learner Progress – Weekly**). 3. Toggle templates **On/Off** to control delivery. 4. Click **Edit** to customize: - Wording and tone - Branding elements - Links or policy notes 5. Click **Save** — active templates send automatically when their triggers occur. #### Verify & Iterate - Check delivery via the **bell inbox** and email. - Adjust **wording, recipients, schedules**, or **template toggles** as needed. ## Pedagogical Use Cases #### Engagement Nudges Schedule **weekly reminders or study check-ins** for specific cohorts; pair with the **Learner Progress** template for consistency. #### Onboarding & Access Keep **Platform Invitation** active to automate invites and customize its tone to match your institution’s branding. #### Targeted Support Send **ad-hoc guidance** to learners flagged by analytics (e.g., low activity) with helpful resources and office-hour links. #### Policy & Deadline Reminders Edit templates to include **exam schedules**, **academic integrity notes**, or **LMS submission deadlines**. #### A/B Messaging Improvements Experiment with different **subject lines or message styles** in custom notifications; track outcomes in analytics. ***With Notifications, you control both one-off announcements and automated alerts—so learners get the right message, at the right time, in the right voice.*** --- # Proactive Learning Recommendations ## Description Proactive Learning Recommendations is a notifications enhancement that automatically sends mentor-generated study suggestions to learners on a schedule you control. Enable the template once, choose which mentors power the recommendations, add an optional per-mentor prompt to steer the guidance, and target who receives the emails and how often they’re sent. ## Target Audience Instructor · Administrator ## Features #### Tenant-Wide Template, Mentor-Scoped Content Toggle one template On for your tenant, then pick the specific mentors that will supply recommendations. #### Per-Mentor Prompting Add a short prompt per mentor (e.g., “This is a career path planner”) to shape the recommendations learners receive. #### Recipient Targeting Send to All Learners or Active Learners (logged in within the last 30 days). #### Flexible Scheduling Set delivery to Daily, Weekly, Monthly, or a Custom cadence; choose the time and time zone. #### Automated Delivery Once active and saved, messages are sent on your schedule with recommendations based on the selected mentors and prompts. ## How to Use (step by step) #### Open Notifications → Alerts 1. Go to Notifications and open the Alerts tab. 2. Find Proactive Learning Recommendations. #### Enable the Template - Toggle On to activate across the tenant. #### Choose Mentors & Add Prompts - Click Select Mentors and add one or more (e.g., AI Career Path Planner, Assessment Creator). - For each mentor, click Edit Prompt and enter guidance that should inform recommendations. - Save Prompt for each mentor you configure. #### Select Recipients - Pick All Learners or Active Learners (last 30 days). #### Set Frequency & Timing 1. Choose Daily / Weekly / Monthly or Custom interval. 2. Set send time and time zone. #### Activate & Save 1. Confirm the template is Active and click Save. 2. The system will start sending on the schedule you defined. ## Pedagogical Use Cases #### Momentum & Study Habits Send weekly nudges with reading targets, practice problems, or reflection prompts aligned to the course week. #### Career-Aligned Guidance Use a mentor like AI Career Path Planner with a custom prompt to deliver role-specific resource suggestions. #### Assessment Readiness Pair with an Assessment Creator mentor to push practice items or rubric tips before deadlines. #### Re-Engagement Target Active Learners for steady cadence, or switch to All Learners before exams to reach everyone. ***With Proactive Learning Recommendations, you can set it once and deliver timely, mentor-informed guidance—keeping learners on track with the right suggestions at the right time.*** ### Transcript mentorAI | Proactive Learning Recommendations https://www.youtube.com/watch?v=YLpfRmpVEz4 [1.9s] Hi everybody. Today I'm going to be [3.5s] showing you an enhancement that we have [4.7s] made to our notifications feature. So [7.1s] this is going to be a notification that [9.2s] you can set at a certain frequency to [11.5s] send to learners um for learning [13.8s] recommendations. So what you'll first do [16.2s] is go to this alerts tab under [17.8s] notifications and then you'll see this [19.8s] proactive learning recommendations [21.8s] notification. To enable it, you'll just [24.2s] toggle it on and this will toggle on [26.9s] this notification throughout the whole [28.9s] tenant. But a really cool piece here [30.8s] that we'll talk about here in a second [32.7s] is that you can actually choose which [34.2s] mentors us, you know, the this uh [37.5s] notification for learning [38.6s] recommendations relates to. So this is [41.2s] going to be the template that you're [42.2s] seeing for this proactive learning [43.6s] recommendations. [45.5s] And the message body here is going to be [48.1s] based off of which mentors you select. [50.5s] Um and then we'll provide [51.6s] recommendations based off of that. Now [54.5s] the way that works on a per mentor basis [57.1s] is [59.3s] you have the option of selecting which [61.4s] mentor. So we can say AI career path [63.5s] planner and then we can also edit this [67.7s] with a prompt so that the prompt uh [70.1s] informs the notification of what should [72.4s] be included. So we can say this is a [75.4s] career [77.5s] path planner. Now, this can be anything [81.6s] related to your mentor or any, you know, [83.8s] sort of information you'd like to put in [86.5s] this prompt to be accounted for within [89.2s] those recommendation emails. So, we'll [91.2s] hit save prompt. And you can also add [93.9s] additional mentors um and edit them as [96.8s] you please. [99.4s] So, for example, this is the first one [101.4s] we did and then we just added the second [103.3s] one here for assessment creator. The [107.0s] other piece here is that you can choose [109.0s] who the learners that will be receiving [111.0s] this are on what frequency and at what [113.6s] time period. So you can do either all [116.0s] active learner or all learners or active [118.1s] learners. Active learners are going to [119.4s] be folks who have logged in within the [120.8s] past 30 days. Um you can also change the [123.4s] frequency to daily, weekly, monthly or [125.9s] even a custom time frame. And again you [128.7s] can also choose exactly which time and [131.1s] the time zone. [133.0s] All right. And then what this will do [134.8s] after you save, make sure this is [136.2s] active. This will trigger notifications [138.6s] on that frequency that you put into the [140.7s] the template here and provide learners [143.5s] with recommendations based off of what [145.3s] the mentor knows and uh you know the [147.6s] prompt that you put in for that [149.1s] recommendation. --- ## Accessibility URL: https://ibl.ai/docs/student/accessibility/accessibility Video: https://www.youtube.com/watch?v=YcdP57CUC2s # Accessibility ## Description The **Accessibility menu** lets you tailor the **mentorAI interface** to your needs—without leaving the chat. Once enabled by your school’s admin, you can adjust **contrast, text size, alignment, tooltips, and more**. Your choices are **user-specific**, so settings stay with you until you change or reset them. --- ## Target Audience **Student** --- ## Features #### Contrast Controls Increase contrast to improve readability. #### Text Size Make on-screen text larger or return it to standard size. #### Tooltips Turn on helpful hints for buttons and controls. #### Text Alignment Choose **left, center, or right alignment** for easier scanning. #### Per-User Persistence Your settings apply to your account and can be reset anytime. > **Note:** An administrator must enable the Accessibility menu for your tenant before you can use it. --- ## How to Use (step by step) #### Find the Accessibility Icon If your admin has enabled the feature, an **Accessibility icon** will appear in the mentorAI interface. #### Open the Menu Click the icon to view available options (**contrast, text size, tooltips, alignment**, and more). #### Adjust Your Preferences - **Contrast:** Toggle on/off to increase visibility - **Text Size:** Increase or decrease font size - **Tooltips:** Enable to see brief explanations on hover - **Text Alignment:** Choose left, center, or right to suit your reading preference #### Review Changes Live The page updates immediately so you can confirm the adjustments work for you. #### Reset Anytime Select **Reset** to restore default settings if you want to start over. --- ## Pedagogical Use Cases #### Low-Vision Support Larger text and higher contrast reduce eye strain during long study sessions. #### Cognitive Load Reduction Tooltips and alignment options make it easier to understand controls and follow content structure. #### Mobile & On-the-Go Learning Bigger text helps when viewing on smaller screens. #### Inclusive Classroom Access Students with diverse needs can customize the interface to stay engaged and focused. --- If you don’t see the **Accessibility icon**, ask your **instructor or admin** to enable the Accessibility Menu for your tenant. ### Transcript mentorAI | Accessibility Menu https://www.youtube.com/watch?v=YcdP57CUC2s [0.9s] Hello, I'm going to be showing you today [2.8s] a new accessibility feature that we have [4.8s] on Mentor AI. Now, this platform in and [7.6s] of itself follows accessibility [9.4s] guidelines, but what we have done is [11.6s] added a way of allowing users to [13.8s] customize to their own accessibility [15.9s] needs. So, as an admin, what you'll do [18.7s] is click on your user profile image and [21.0s] there will be a drop down. You'll see [23.0s] your tenant name. In this case, ours is [24.7s] IBLAI. And we'll click on this. Now, [27.8s] you'll see a couple of different tabs, [29.1s] but what you'll want to do is navigate [30.6s] to the advanced tab. And then you'll see [33.1s] this little accessibility menu toggle. [35.8s] Once this is toggled on, if we close out [38.8s] of these settings, you'll see that there [40.6s] is an accessibility icon. What we'll do [42.8s] now is click on this. And then you can [45.2s] see that there are different types of [47.5s] accessibility features, you know, [49.0s] tailored to different users. you know [51.7s] that could be you know audio related or [53.8s] it could be visual related things like [56.6s] that. So um let's see if we click on [60.2s] contrast you'll see that this is what it [62.4s] looks like. We can turn that off. We can [64.9s] also make the text bigger [68.1s] and we can continue with you know all [71.1s] these different features. We even have [72.8s] tool tips that can be enabled. Um text [75.2s] align, you know, changing how the text [77.4s] is centered. Uh you know, if it's left [79.4s] align, right align, so on and so forth. [81.9s] And then you also have the option of [83.2s] resetting as a user. Um so everything [86.2s] goes back to normal in case you would [88.1s] like to do that for whatever reason. [89.9s] Now, this is going to be user specific. [92.2s] So if a user selects these changes, [94.0s] they'll be able to see them on their end [95.8s] and reset as necessary. And that is the [98.6s] new accessibility menu tool. --- ## Profile Management URL: https://ibl.ai/docs/student/accessibility/profilemanagement Video: https://www.youtube.com/watch?v=EwxEElHjH-c # Profile Management ## Description The **User Profile** lets you manage the personal details that mentorAI uses to personalize your experience—**name, contact info, preferred language, social links, education, work history, and résumé**. You can also **reset your password** from here. Profile updates help mentorAI **tailor guidance** (e.g., career advice) and help **instructors/admins** understand learner context. --- ## Target Audience **Student** --- ## Features #### Basic Information Edit your **full name, email, title**, and an **About blurb** summarizing your background or interests. #### Preferred Language Choose your **interface language** (e.g., Spanish, French). mentorAI adapts the UI and experience accordingly. #### Social Add optional links (Facebook, LinkedIn, X) to make it easier to connect outside the platform. #### Education Record details such as: - Degree type - Field of study - Institution - Start/End dates (or mark *In progress*) - GPA (optional) This data is shared with **mentorAI and administrators** to contextualize support. #### Experience Capture professional roles (past/current): - Employer name - Role/title - Responsibilities/notes - Role dates (*mark Currently work here* when applicable) #### Resume Upload a résumé/CV (file) for mentors and admins to reference—useful for **career guidance** and **tailored recommendations**. #### Security Reset your password while logged in; a verification email is sent to complete the change. --- ## How to Use (step by step) #### Open Your Profile - Click your **profile icon** (top-right) → select **Profile** - You’ll see tabs for **Basic**, **Social**, **Education**, **Experience**, **Resume**, and **Security** #### Update Basic Information - Enter **Full name**, **Email**, **Title**, and **About** - Set **Preferred language** to localize the platform #### Add Social Links (Optional) - In **Social**, paste URLs for Facebook, LinkedIn, or X (or leave blank) #### Add Education - In **Education**, click **Add** and fill in: - Degree type - Field of study - Institution - Start/End dates - *(Optional)* Mark **In progress** or add **GPA** #### Add Experience - In **Experience**, click **Add** and enter: - Employer - Role/title - Responsibilities/notes - Dates of employment - Toggle **Currently work here** when applicable #### Upload Your Resume - Go to **Resume** → click **Upload** to add your résumé (PDF/DOC) - The file is shared with admins and mentorAI to personalize support #### Reset Your Password (If Needed) - Open **Security** → click **Reset password** - Check your email for the verification link and complete the change #### Save & Verify - Ensure each tab’s changes are **Saved** - Reopen your mentor chat to benefit from updated personalization --- ## Pedagogical Use Cases #### Personalized Career Guidance Education/experience + résumé help mentorAI suggest roles, skills to build, and tailored application tips. #### Language Accessibility The **Preferred Language** ensures the UI and responses match your preferred communication language. #### Advising & Support Context Instructors/admins can view education progress and interests to better target resources. #### Showcasing Achievements A complete profile (GPA, projects, roles) improves the relevance of **scholarships, internships, and portfolio feedback**. --- Keep your **User Profile** current to get the most **accurate, relevant, and personalized help** from mentorAI—across courses, career prep, and daily learning. ### Transcript mentorAI | User Profile https://www.youtube.com/watch?v=EwxEElHjH-c [0.9s] Hello. I'm going to be showing you a few [2.6s] updates that we have made to the user [4.3s] profile section of Mentor AI. Now, to [7.1s] see what this looks like, what you'll do [8.5s] is click your little profile icon, and [10.2s] you'll see this drop down. You'll click [12.1s] on the profile, and it'll pull up a [14.1s] couple of different tabs where you can [15.7s] make edits to your profile. Now, you [17.8s] have your basic information, you know, [19.2s] like your first name, full name [20.9s] actually, um your email, title, and [23.4s] about information. Um you can also [25.6s] select your preferred language. So if [27.1s] you selected Spanish or French for [28.6s] example, the platform would adjust [30.2s] accordingly to your language [31.6s] preferences. [33.1s] Next we have social which allows users [35.0s] to put in their social information such [36.9s] as Facebook, LinkedIn X. You know, this [39.5s] is a way for people to be able to [41.3s] connect outside the platform if they so [43.2s] choose. [45.1s] We've also added this education tab [47.2s] which allows users to input their [49.5s] academic achievement such as degrees. [52.2s] Um, what this looks like is, you know, [53.7s] you can add in the type of degree you're [55.1s] getting, the field of study, where you [57.4s] studied, and, you know, when this [59.8s] happened. So, you can add in the start [61.5s] date if the end date for your degree [64.2s] hasn't already happened. You can also [66.0s] just mark as IO here. Um, you can also [69.7s] put in your GPA as well. And this will [72.4s] be shared with mentor as long and [74.2s] administrators as well. Um, similarly [77.1s] with the experience for um, this is for [80.1s] professional journeys. This is where you [81.9s] can put information about, you know, [83.2s] where you've worked in the past or where [85.4s] you're currently working. And it looks [87.1s] very similar to that education tab when [89.7s] you're adding a new experience. Um, [91.8s] again, you can also select whether or [93.8s] not you currently work here. And then [95.4s] you'll just put in some information [96.6s] about the company itself. [99.8s] All right. And then we also have resume, [101.5s] which you can just simply upload a [102.8s] resume that's shared with admins and the [105.2s] mentor. Um, so we can tailor responses. [108.6s] And you know, this is just a simple [110.3s] upload of a document and then you have [112.2s] your resume uploaded. Lastly, we have [114.4s] security, which just allows you to reset [116.7s] your password while logged in if needed. [119.0s] This will trigger an email sent to the [120.6s] user and they can reset their password [122.3s] from there. And that's it. --- ## Chat URL: https://ibl.ai/docs/student/corefeatures/chat Video: https://www.youtube.com/watch?v=QjkZqw1Z9ho # Chat ## Description The Chat feature of mentorAI is your personal AI tutor, available 24 / 7. It lets you hold a natural conversation with an AI mentor who provides instant, tailored feedback on coursework, assignments, and exam prep. Ask questions, request step‑by‑step explanations, get study tips, or have the AI quiz you—all in one friendly chat interface. ![](/images/chat.png) --- ## Target Audience **Student** --- ## Features #### 24 / 7 Personalized Tutoring Ask questions any time—day or night—and receive answers aligned with your course and skill level. #### Proactive Guidance (Guided Prompts) After every response, mentorAI suggests follow‑up questions that nudge you toward deeper understanding and topics you may not have considered. #### Adaptive Learning Paths mentorAI continually gauges your knowledge and adjusts the depth and complexity of explanations so that every answer meets you where you are. --- ## How to Use (step by step) #### Log In to mentorAI Sign in with your credentials. You’ll land on the main workspace dashboard. #### Open Explore Click Explore in the sidebar to see all available AI mentors. - One course → one mentor. - Multiple courses → multiple mentors. - *(Optional)* Use the category filter to narrow the list. #### Select a Mentor Click a mentor card—this guide uses the baseline mentorAI as an example. The chat window opens. #### Start with Guided Prompts (Optional) At the top of the chat you’ll see suggested prompts. If you’re new, you’ll get starter prompts; returning users see prompts based on past conversations. Click one to begin. #### Ask Your Own Question Type directly in the message box, e.g., > “What are the best study techniques?” You can also: - Generate self‑assessment questions. - Request step‑by‑step walkthroughs (e.g., a biology lab). #### Read the Answer & Follow Up mentorAI replies with a tailored explanation. After each answer, new guided prompts appear so you can dive deeper or clarify anything confusing. #### Continue the Conversation Keep clicking suggested prompts or type new questions. mentorAI adapts in real time, remembering context and adjusting answers to your evolving understanding. #### Return Anytime Leave and come back whenever you like—the conversation history remains, so mentorAI can pick up right where you left off. --- ## Pedagogical Use Cases #### Study‑Strategy Coaching > “What are the best study techniques?” → Receive personalized tips on time management, memory aids, and exam prep. #### Self‑Assessment Quizzes > “Generate five practice questions on Chapter 3.” → mentorAI creates custom questions; follow up with > “Show me the answers.” for instant feedback. #### Step‑by‑Step Walkthroughs > “Can you guide me through this biology lab experiment?” → The AI breaks the procedure into clear, manageable steps. #### Concept Deep‑Dives After learning about a topic, click a suggested prompt like: > “Would you like to explore the causes of this event?” to uncover underlying principles and related ideas. #### Continuous, Adaptive Learning Early in a semester, mentorAI explains basics in simple terms; later, it offers advanced insights—ensuring support from introductory material to nuanced discussions. --- Log in, open a mentor, and start learning—mentorAI is ready whenever you are! ### Transcript mentorAI | Chatting as a Learner https://www.youtube.com/watch?v=QjkZqw1Z9ho [1.0s] Using Mentor AI as a learner really [3.4s] helps you hone your knowledge and your [5.9s] skills when it comes to a given topic or [8.4s] a course or whatever it is that you're [10.2s] trying to, you know, learn about. It [12.4s] helps you with 247 personalized [14.3s] tutoring, proactive guidance, and [17.0s] adaptive learning path. So, it really [18.8s] adjusts to where you are at um as a [21.8s] learner. So, when you log in, this is [24.1s] what you'll see. And there's a couple of [25.4s] different things that I'll I'm going to [26.7s] be showing you on how to use this [28.1s] platform. The first thing is is if we go [29.9s] to this explore tab, this is going to [32.2s] show you all the mentors that you have [33.8s] access to. Um you'll be able to select [35.9s] from a handful um that you know you as a [39.0s] learner have access to. If it's for one [40.6s] specific course, you'll have that one [41.9s] specific mentor. If you have a couple of [44.0s] different courses or have a couple of [45.8s] different uh areas where you would need [47.3s] a mentor for that you've been provided [49.0s] access to. This is where you would find [50.7s] them. You could even filter by, you [52.2s] know, what specific category of mentor, [54.6s] so on and so forth. Now, for the [56.7s] purposes of this video, I'm going to be [58.2s] showing you just our baseline mentor, [60.3s] mentor AI. [62.4s] Um, so this is what you'll see when [64.7s] you're starting to chat. And there are a [66.1s] couple of different things you can do [67.4s] when chatting with the mentor as a [69.3s] learner. Now, the questions that you see [71.8s] here, [73.7s] these are what we call guided prompts. [75.6s] These are guided prompts generated by AI [79.2s] based off of your previous conversation [81.0s] history with this mentor. Now, if you [83.2s] don't have one, it'll give you some [84.4s] starting points. Um, but this really [86.6s] just helps you as a learner, you know, [88.8s] have a starting point if you really have [90.3s] no idea where to start. The other option [92.8s] is [94.6s] you can type actually to this mentor [102.2s] and I'm just going to say, what are the [103.8s] best study techniques? So, I can get [105.1s] some ideas of how I can study better, [107.0s] things like that. But you're not limited [109.0s] to just asking things about what are [110.6s] study techniques. You can, you know, [112.2s] even as a learner, generate your own [113.5s] assessment questions to test your own [115.0s] knowledge. You can, um, you know, [117.8s] generate a walkth through of, you know, [121.0s] specific maybe lab if you are taking a [123.4s] biology course and need a little help [125.0s] understanding what that lab looks like. [129.8s] Now, once the mentor responds, um, we [132.5s] can also see that there are additional [134.1s] guided prompts that have been generated. [136.2s] This will help you continue the [138.1s] conversation with the mentor if you're [139.7s] still not exactly sure you know the best [142.0s] place to go next with your conversation. [145.1s] Um this you know helps you even get an [147.4s] under deeper understanding of the course [149.3s] materials. You know you might start off [150.9s] asking about a very high level basic [152.8s] topic. realize you might not have a [154.9s] deeper understanding of something very [157.0s] specific within that higher level topic. [159.3s] And the mentor might notice that and [161.0s] say, you know, do you want to discuss [162.8s] this or even ask you or generate prompts [166.3s] like these guided ones here uh that [168.6s] relate to what the conversation is and [170.8s] then help you get a deeper [172.2s] understanding. [173.8s] And this also adapts to, like I said [176.2s] earlier, where you are at. This isn't [178.7s] just going to, you know, give you a [180.6s] generic, you know, if it's a intro to [182.1s] biology course, for example, it's not [183.5s] going to give you exactly questions that [185.5s] are specifically just for that type of [187.4s] audience. It's going to tailor itself to [189.5s] you to match what your understanding [191.2s] level is and help you go from there. --- ## Code Interpreter URL: https://ibl.ai/docs/student/corefeatures/codeinterpreter Video: https://www.youtube.com/watch?v=7awkSGqW1iE # Code Interpreter --- ## Description The Code Interpreter tool lets mentorAI run Python code in-chat and display the resulting graphs, charts, tables, and other visualizations instantly. It’s ideal for math, statistics, data analysis, and any scenario where seeing the data makes concepts clearer—such as integrals, functions, distributions, or real-world datasets—directly alongside the explanation. --- ## Target Audience **Student** --- ## Features #### In-Chat Graphs & Visualizations Ask mentorAI to plot functions, integrals, histograms, bar charts, scatterplots, and more. Visuals appear directly below the AI’s response. #### Python Sandbox Execution mentorAI safely runs Python code behind the scenes, enabling calculations, filtering, aggregation, and visualization without leaving the chat. #### Automatic Task Restatement Before executing, mentorAI restates your request (e.g., “Plot the integral of x² from 1 to 100”) so you can confirm it understood correctly. #### Immediate, Context-Aware Explanations After rendering a visualization, mentorAI explains what you’re seeing—highlighting trends, key values, comparisons, or anomalies. #### File-Based Data Analysis Upload spreadsheets or datasets (CSV, Excel, etc.) and ask mentorAI to: - Analyze the data - Generate charts - Summarize patterns - Extract specific records #### Privacy-Aware Processing When datasets contain sensitive information, mentorAI follows privacy guardrails—automatically anonymizing or omitting restricted fields when generating outputs. #### Downloadable Results For certain requests, mentorAI can generate CSV outputs you can download and review outside the platform. --- ## How to Use (step by step) #### Verify the Tool Is Enabled In the mentor sidebar, confirm Code Interpreter is toggled On. #### Enter Your Request Type a prompt in the chat, such as: > ”Plot the integral of x² from 1 to 100.” #### MentorAI Restates the Task The AI echoes your request to confirm understanding, for example: > “Sure — plotting ∫ x² dx from 1 to 100.” #### View the Generated Output mentorAI runs the code and displays the visualization directly in the chat. #### Review the Explanation Beneath the output, mentorAI explains what the graph or table represents and why it matters. #### Iterate or Refine You can ask follow-up prompts such as: > ”Add gridlines.” > ”Zoom into x = 1 to 10.” > ”Overlay y = x³ for comparison.” mentorAI updates the visualization accordingly. --- ## Working with Uploaded Data (Advanced Example) #### Upload a Dataset Use the file upload option to provide a dataset, such as a spreadsheet containing incident reports, lab results, or survey data. #### Ask a Visualization Question For example: > “Show me a graph of the campuses with the most incidents reported.“ mentorAI will: - Analyze the uploaded data - Aggregate values as needed - Render a chart (e.g., bar graph) - Explain what the chart shows Sensitive fields are anonymized automatically if required. #### Ask for Filtered Records You can also request specific subsets of the data, for example: > ”Show me all records involving hydrochloric acid.” In this case, mentorAI may: - Provide a short executive summary - Generate a CSV file containing only the relevant records - Include a download link so you can open the data externally --- ## Pedagogical Use Cases #### Calculus Visualization Plot definite or indefinite integrals, derivatives, and Riemann sums to see area under curves and slope behavior. #### Function Exploration Compare multiple functions (e.g., y = sin x vs. y = cos x) on the same axes to study amplitude, period, and phase shifts. #### Statistics & Data Analysis Upload datasets and generate histograms, box plots, bar charts, or scatterplots to analyze distributions and correlations. #### Physics & Engineering Labs Graph experimental data (e.g., projectile motion or lab incident frequency) and fit trend lines or models to validate hypotheses. #### Data Literacy & Real-World Analysis Explore real institutional or research datasets—summarizing trends, identifying outliers, and extracting specific records. #### Quick Concept Checks Replicate textbook graphs or lab figures to verify understanding or test “what-if” scenarios. --- With Code Interpreter enabled, you can move seamlessly from raw data or abstract equations to clear visuals, summaries, and downloadable results—all inside mentorAI. ### Transcript mentorAI | Code Interpreter Demo https://www.youtube.com/watch?v=7awkSGqW1iE [0.2s] Hey, what's going on? This is Jeremy at [1.7s] IBLAI. Right now, I'm showing you one of [4.3s] our newest features, uh, the code [6.5s] interpreter tool. This allows the mentor [9.5s] to create and plot graphs, uh, and other [13.0s] types of visualizations that are [14.6s] particularly useful for math use cases, [19.0s] anything that involves plotting a graph. [21.8s] So to demonstrate, I'm going to ask the [24.9s] mentor to plot an integral of x^2 from 1 [30.2s] to 100. [41.2s] And what the mentor is going to do is [43.4s] basically describe the task that I've [45.8s] given it and then actually create a [49.4s] plotted graph for the integral that I [52.4s] requested. [54.0s] And obviously this reaches far beyond [57.3s] just this integral equation that I've [59.1s] requested um and can be used pretty [61.9s] heavily within any math use case you're [64.2s] looking for. Thank you so much for [65.9s] tuning in. As always, learn more at [68.2s] iibbl.ai. --- ## Finding Mentors URL: https://ibl.ai/docs/student/corefeatures/findingmentors Video: https://www.youtube.com/watch?v=0RrfJBbzuBo # Finding Mentors ## Overview Multiple ways to discover and switch between mentors on the platform. --- ## From the My Mentors Tab 1. Click the **My Mentors** tab in the side panel. 2. Browse through pages of available mentors. 3. Use the **search bar** to find a specific mentor by name (e.g., "Italian"). 4. Click on a mentor to start chatting. --- ## From the Explore Mentors Page 1. Start a **new chat** and click **Explore Mentors**. 2. Or click the **mentors tab** (globe icon) in the side panel. 3. Use the available filters: - **Community mentors**: mentors shared across the platform - **Platform mentors**: mentors specific to your tenant - **My mentors**: mentors you have created 4. Filter by **type**: Standard, Direct LLM, or Tool LLM Provider. 5. Filter by **category** to narrow results further. --- ## Switching Mentors - Click on any mentor from the search results or explore page to switch immediately. - You can also click **Browse All** from the new chat screen to see the full list. --- ## Key Takeaways - **My Mentors** tab provides quick access to mentors you have interacted with - **Explore Mentors** offers search, filtering by source (community, platform, personal), type, and category - Switching mentors is instant — click any mentor to start a new conversation --- ## Guided Mode URL: https://ibl.ai/docs/student/corefeatures/guidedmode Video: https://www.youtube.com/watch?v=txmcwbxPsOs # Guided Mode ## Description Guided Mode turns a mentorAI chat into an interactive lesson engine that teaches, quizzes, and reinforces concepts automatically. Instructors link a course to a mentor, choose a guide style—Spaced Repetition, Instruction Mode, or Socratic Mode—and mentorAI takes over, presenting material, asking questions, and revisiting topics until mastery is reached. Meanwhile, instructors receive detailed analytics on every question‑and‑answer cycle to pinpoint learning gaps and celebrate student progress. ![](/images/guidedmode.png) --- ## Target Audience **Student** --- ## Features #### Course‑to‑Mentor Linking Connect any course (e.g., Engineering Computations) to an existing mentor with just a few clicks in the Admin Panel. #### Three Guide Styles - **Spaced Repetition** – repeats quiz questions until answered correctly, optimizing retention - **Instruction Mode** – teaches a concept, then immediately checks understanding with a follow‑up question - **Socratic Mode** – drives step‑by‑step dialogue, prompting deeper reflection and self‑explanation #### Dynamic Learner Prompts mentorAI adapts prompts on the fly—e.g., > “Can you explain loops in Python?” → “Quiz me on loops.” —based on each learner’s performance. #### Automated Mastery Tracking Every attempt, repetition count, and review interval is logged, giving a precise picture of concept mastery over time. #### Instructor Dashboards View the exact questions asked, how many times they repeated, last/next review timestamps, and which student/mentor/course they belong to. --- ## How to Use (step by step) #### Connect a Course to a Mentor - In the **Admin Panel**, open **Memory Component Settings** - Choose the target course (e.g., **Engineering Computations**) - Link it to an existing mentor (e.g., **Engineering Comp Mentor**) - Select a **Catalog Guide**—typically the same course content #### Pick a Guide Style - Choose **Spaced Repetition**, **Instruction Mode**, or **Socratic Mode** - *(Demo‑ready setup: Spaced Repetition + Instruction Mode)* #### Launch Learner Session - Students open the mentor chat and type **“Start lesson.”** - mentorAI explains a topic, then poses a question - Incorrect answers trigger future repetitions (**Spaced Repetition**) until the learner answers correctly #### Observe Adaptive Prompts Guided prompts evolve with learner progress, e.g., - “Can you explain loops in Python?” - “Quiz me on loops.” #### Review Instructor Insights In the Admin dashboard, inspect logs showing: - Exact question text - Repetition count and interval spacing - Last/next review times - Linked student, mentor, and course #### Iterate & Refine - Use analytics to **spot learning gaps** - Adjust guide style or content as needed for **better outcomes** --- ## Pedagogical Use Cases #### Mastery‑Based Quizzing Combine **Spaced Repetition** with course quizzes to ensure students answer correctly multiple times over increasing intervals—cementing long‑term retention. #### Micro‑Lessons with Immediate Checks Use **Instruction Mode** to present a short explanation (e.g., Python loops), then instantly ask a question to verify understanding. #### Socratic Dialogues for Critical Thinking Activate **Socratic Mode** in philosophy or ethics courses to guide students through reasoned arguments via probing questions. #### Personalized Remediation Learners who miss key questions get automatic repeats at spaced intervals, allowing struggling students to catch up without manual instructor intervention. #### Data‑Driven Course Adjustments Instructors analyze repetition logs to identify common pain points—then add resources or adjust lectures where many students falter. --- By linking courses, selecting a guide style, and reviewing the built‑in analytics, **Guided Mode** empowers instructors to deliver **adaptive, evidence‑based instruction**—while learners enjoy an **engaging, feedback‑rich chat experience** that drives true concept mastery. ### Transcript mentorAI | Guided Mode https://www.youtube.com/watch?v=txmcwbxPsOs [1.0s] I'm going to be showing you how guided [2.6s] mode for mentor AI works on our [5.1s] platform. The first piece here is having [7.5s] a course that you're going to connect to [8.9s] your mentor. Here we have an engineering [11.1s] computations course. And just so you can [13.1s] kind of get an idea of what this looks [14.5s] like in terms of the course content, [16.6s] we're going to go into the first module [18.2s] here. [20.1s] So this is just a basic intro module [22.6s] unit that's called, you know, is Python. [25.1s] it just kind of goes into the beginnings [26.9s] of what Python is, how it's used, so on [29.7s] and so forth. Um, and the rest of the [32.1s] course continues to build upon this. [34.6s] Now, the next piece here is actually [38.0s] going into our [40.5s] admin panel into the memory component [42.9s] settings. I've already set this one up [44.8s] to be guided. You can see that I've [46.7s] connected it to this engineering comp uh [49.4s] mentor that we already have created and [51.7s] a catalog guide which is actually the [53.3s] course we were just looking at. Now [55.4s] there are a couple of different options [58.1s] in terms of what type of guide mentor [60.2s] you can have. Um we have space [62.1s] repetition which will repeat quiz [63.7s] questions based on the space repetition [65.8s] algorithm. Meaning if you ask or the [68.9s] mentor asks the learner a question, the [70.6s] learner gets it wrong. Later down the [72.9s] road, the mentor is going to repeat that [74.5s] same question until they get it right. [76.3s] Um if there are no issues with the way [79.7s] the learner answered that question, then [81.5s] you know the questions will change as [83.4s] they continue throughout their learning [85.0s] journey. Um as the me memory of the [87.8s] mentor knows where that learner is at. [89.7s] It can kind of customize the questions [92.3s] um to exactly where the learner is in [94.8s] their learning journey. [96.8s] Instruction mode is going to teach uh [99.0s] the user about the questions before [100.9s] actually giving a question. So it'll [102.7s] give a little bit of information and [104.2s] then ask a question based off that [105.8s] specific topic. Now Socratic mode is [108.9s] going to be more discussion focused. Uh [111.2s] the mentor is going to talk about the [112.6s] question with a with the learner um and [115.1s] a step-by-step approach. For the [117.4s] purposes of this demo, we're going to do [118.6s] space prepetition in instruction mode. [121.0s] And we can actually hop over to the [123.0s] mentor. [132.0s] So now that you've seen the guided [133.5s] mentor configuration process, we're [135.5s] actually now going to chat with that [137.2s] guided mentor. This is really helpful [140.0s] for learners who need a little push to [142.2s] get, you know, further in their course [143.9s] journey, their learning journey, etc. [147.0s] you know, it's going to ask them [148.5s] questions and provide information [150.2s] alongside of it because we've set up [151.4s] that space repetition. And if I get and [153.8s] also that instruction mode, if I'm [156.4s] getting these questions wrong, as I [158.6s] continue my conversation with this [159.8s] mentor, it's going to actually ask me [161.2s] the same questions over and over again. [163.0s] This is really helpful when it comes to [165.2s] addressing learning gaps for the learner [166.8s] and proactively teaching them. So, to [169.6s] start the lesson, I'm just going to tell [170.9s] the mentor, start lesson. [175.4s] Now it's giving me um a little bit of [177.4s] information but also asking me to answer [180.0s] a question to get started here. Um let's [182.9s] see. So it's asking me for the primate [184.7s] what programming environment is [186.2s] primarily used for interactive computing [188.6s] in scientific disciplines etc etc etc. [192.2s] All right so I'm going to go with [198.2s] I'm not quite sure about the answer. I'm [199.8s] just selecting one here. Um, and what [203.0s] you can see is the mentor is actually [204.8s] telling me, you know, it's a good try, [206.7s] but you know, this is actually what the [208.6s] correct answer is. And now I'm [210.6s] revisiting that question because I did [212.1s] get it wrong. [214.0s] Um, so [216.8s] now that I'm not quite sure still, I'm [219.4s] just going to keep guessing. Um, [224.0s] I'm just going to put B this time. [229.5s] This time I got it correct. Um, and it's [232.0s] asking me if I want to go into another [233.6s] topic or another question. I'm going to [236.4s] say, can I can you explain loops in [238.4s] Python? I'm just selecting one of the [240.3s] guided prompts, which are AI generated [242.0s] prompts based off of the conversation [243.8s] history with this mentor. [248.6s] All right. So, we get that little bit of [250.1s] explanation that I asked for. So, I'm [253.2s] going to say quiz me on loops. [264.2s] All right. And again, I'm just going to [266.2s] put in a quick answer so we can see what [268.2s] the response looks like. [272.9s] All righty. So, you can see that this [274.6s] really helps with, you know, getting [276.9s] learners very much engaged with the [278.7s] materials. It's going to be asking them [280.6s] questions, making them think critically [282.6s] as they continue to have conversations [284.5s] with this mentor. [286.7s] A really cool piece about guided mentors [288.9s] is as as an admin, you're actually going [291.1s] to be able to see what questions your [292.8s] learners are being asked, how well [294.1s] they're performing on them. You can see [296.1s] that we're taking a look at the question [297.6s] cards that I was personally asked uh [299.9s] while communicating with this mentor. [302.6s] And if we select just one of these guys, [306.8s] you can see this is the same question [308.3s] that I was asked while chatting with the [310.1s] mentor. This really helps admins or [313.0s] instructors understand you know where [315.6s] learner gaps are where what kind of [317.9s] questions are causing issues in terms of [320.5s] you know understanding [322.4s] uh you know wherever those learning gaps [324.2s] might be. This really does help identify [326.0s] them. You can see how many times the [327.4s] question was repeated, what the correct [328.9s] answer is, um how many intervals between [331.4s] the last time that question was asked [333.0s] and when it was asked again. [335.9s] And you can see again when it was last [337.9s] reviewed, next review, the specific [340.2s] student, the mentor, and the course [342.6s] associated with it. So this really helps [345.2s] instructors and addins get a higher [347.3s] level understanding of exactly what [350.0s] their learners need to understand [351.5s] better, where they're failing, where [353.4s] they're doing really great. uh and so on [356.2s] and so forth. You really have a lot of [358.3s] visibility into what's going on with [359.9s] your learners, which helps you as an [362.2s] instructor provide them a better [364.0s] learning experience. --- ## Image Generation URL: https://ibl.ai/docs/student/corefeatures/imagegeneration Video: https://www.youtube.com/watch?v=VIYWKbfKGWY # Image Generation ## Description The Image Generation tool lets you create custom images on demand—simply describe what you want in the mentorAI chat, and the AI returns a picture in seconds. It’s perfect for visualizing ideas, illustrating projects, or adding graphics to assignments without leaving the learning environment. --- ## Target Audience **Student** --- ## Features #### Prompt‑Based Creation Type a natural‑language description (e.g., > “Generate an image of a happy puppy” ) and mentorAI produces a matching image. #### Instant In‑Chat Display Generated images appear directly **below the AI’s response**—no downloads or external sites required. #### Tool‑Toggle Status The Image Generation tool can be switched **on or off** by instructors; students can **verify it’s enabled at a glance**. #### Iterative Refinement If the first result isn’t perfect, you can send **follow‑up prompts** (e.g., > “Make the background blue” ) and mentorAI will adjust the image. --- ## How to Use (step by step) #### Confirm the Tool Is On In your mentor’s sidebar, ensure **Image Generation is toggled On**. If it’s off, contact your instructor to enable it. #### Describe the Image Click the **lower prompt box** and type your request, such as: > Generate an image of a happy puppy. #### Send the Prompt Press **Send** (or hit **Enter**). #### View the Output mentorAI quickly creates and **displays the image right in the chat window**. #### Refine if Needed Add **follow‑up instructions** (e.g., > “Add a ball in the puppy’s mouth” ) to generate a new version. --- ## Pedagogical Use Cases #### Concept Illustration Generate diagrams or visuals to accompany explanations—e.g., > “Create a labeled diagram of the water cycle.” #### Creative Writing Support Produce scene images that match a story excerpt, helping visualize narrative settings or characters. #### Project Prototyping Mock up design ideas for engineering, architecture, or art projects without specialized software. #### Language Learning Ask for images matching vocabulary words (e.g., > “Show a bustling marketplace” ) to reinforce word meanings. #### Presentation Enhancements Quickly create unique graphics for slides, posters, or reports when stock images don’t fit the topic. --- Whenever you need a new visual, just write a **clear description** and let **mentorAI render it for you**—fast, simple, and right inside the chat. ### Transcript mentorAI | Image Generation Tool Demo https://www.youtube.com/watch?v=VIYWKbfKGWY [0.2s] Hey, what's going on? This is Jeremy at [1.6s] IBLAI and today I'm showing you one of [3.8s] our newest features, our image [5.8s] generation tool. So, coming into the [9.0s] mentor, [11.0s] we can see [13.3s] that the image generation tool has been [15.8s] toggled on and I'm going to ask it to [19.8s] create or rather generate an image of a [24.8s] happy puppy. [35.4s] And very quickly, the mentor ju does [37.8s] just that. So for any input that you [41.1s] would put in here to create an image, [43.6s] you would just enter the desired image [46.2s] description here in the lower prompt [48.2s] tab, hit send, and very quickly, the [51.8s] mentor will put that image together for [53.6s] you. Thank you so much for taking the [55.9s] time to view this, and [58.1s] As always, learn more at ibl.ai. --- ## In-Chat Uploads URL: https://ibl.ai/docs/student/corefeatures/inchatuploads Video: https://www.youtube.com/watch?v=CFah7RpmIeM # In-Chat Uploads **Instructor Experience:** **Student Experience:** ## Description The In‑Chat Uploads feature lets you send assignments — essays, lab worksheets, slide decks, problem sets, and more — directly to mentorAI inside the chat. The mentor analyzes the file, provides a grade plus detailed feedback, and (when configured) forwards the score to your instructor’s gradebook. No extra portals or email attachments needed. --- ## Target Audience **Student** --- ## Features #### One‑Click File Upload Use the **paper‑clip icon** to attach documents, spreadsheets, images, or PDFs right in the chat window. #### Instant AI Grading & Feedback mentorAI evaluates your submission against the instructor’s rubric and returns a **score with actionable comments**. #### Gradebook Integration (Instructor‑Configured) When enabled, the score is **automatically recorded** in your course gradebook, saving everyone time. #### Iterative Review Re‑upload **revised drafts** to receive updated scores and guidance before final submission. #### Versatile Assignment Support Works for **essays**, **coding notebooks**, **lab sheets**, **slide presentations**, and other file types your course accepts. --- ## How to Use (step by step) #### Open Your Mentor Chat Navigate to the **mentorAI chat** for your course. #### Upload Your File Click the **paper‑clip icon** and select your document (e.g., Finance assignment on evaluating risk and returns). #### Request a Review After the upload completes, type a prompt such as: > Please provide me with a score for my assignment. #### Send the Message Press **Send** (or hit **Enter**). #### Receive Feedback mentorAI analyzes your work and replies with: - **Score / grade** (rubric‑based) - **Strengths and areas for improvement** - **Tips for boosting your score** on the next draft #### Revise & Resubmit (Optional) Make improvements, upload the **new version**, and ask mentorAI for an updated evaluation. --- ## Pedagogical Use Cases #### Rubric Alignment Checks Confirm your assignment meets each **rubric criterion** before final submission. #### Citation & Formatting Feedback Upload essays to get pointers on **citation style** (APA, MLA) and **structural coherence**. #### Lab Worksheet Verification Submit completed lab sheets to **validate calculations** and receive suggestions for **clearer explanations**. #### Coding Notebook Review Send **Jupyter or Colab notebooks** for code correctness, style guidance, and performance notes. #### Iterative Draft Improvement Repeatedly upload drafts to **refine arguments, visuals, or data analysis** based on AI feedback. --- With **In‑Chat Uploads**, you receive **immediate grading and advice**, helping you spot improvements and keep your learning progress on track — all without leaving the **mentorAI conversation**. ### Transcript mentorAI | One Click In Chat File Uploads https://www.youtube.com/watch?v=66PlfmIButM [0.3s] Hey, what's going on? This is Jeremy at [1.7s] IBLAI, and today I'm showing you one of [4.0s] our coolest features, the uh one-click [7.6s] in chat file uploads. Basically, this [10.4s] feature allows you to feed a file from [13.6s] your computer. Um, this can be an [16.2s] assignment, uh maybe a work in progress, [20.2s] uh a resume, notes from a specific uh [24.1s] area that you're working on. you can [26.3s] feed them directly to the mentor that [28.2s] you're working with for context and then [30.4s] get specific feedback on those files in [33.6s] real time. So in order to showcase this [37.0s] uh what we're looking at here is our job [39.2s] interview and resume skills agent on the [41.5s] mentor AI platform. But again this [44.5s] feature can be used with any of the [45.9s] agents that we have on the mentor AI [47.8s] platform, any of the agents that you [49.8s] create and any of the use cases for [52.4s] these agents. So, just to show you what [54.7s] this looks like in real time, I'm going [57.0s] to submit a um example resume that I [60.8s] just threw together to get a little bit [62.7s] of feedback and see how the mentor is [65.0s] able to access the file that I upload in [67.3s] chat. So, in order to upload a file, [70.6s] you're going to navigate down here to [72.2s] the lower left hand corner in the chat [74.5s] box where you see this little plus sign. [77.0s] Hit that and then that's going to upload [79.0s] the prompt to upload a file. So, we're [80.9s] going to click that again. I'm just [82.7s] going to quickly search resumeé, attach [85.7s] mine, and then just ask a few basic [89.3s] questions. I've uploaded [92.4s] my resume as a PDF. [96.5s] Please tell me how I can improve it to [101.6s] be a standout [104.2s] candidate. [106.1s] [snorts] [106.9s] So, the mentor is going to take the [108.8s] prompt that I've given it as well as the [110.3s] file that I've shared with it. um in [112.6s] order to give me some very concrete and [115.5s] specific feedback on that file instead [118.0s] of the kind of general information that [119.8s] you might otherwise receive uh from, you [123.2s] know, a public LLM that isn't able to [125.7s] directly access your files. So, here we [128.9s] have a comprehensive overview. You know, [131.8s] my professional summary, maybe I should [133.5s] be more specific, tailor it a little bit [135.8s] more to each different job that I may be [138.8s] applying to. [140.4s] um include a little more in my [142.6s] professional experience as well, right? [144.2s] So, use some action word word action [146.9s] verbs, quantify achievements, focus on [149.7s] outcomes, um you know, update my [152.3s] education track, basically anything that [155.0s] will improve my resume and uh basically [158.3s] make me a standout candidate. And it was [160.2s] able to do all of this by directly [162.0s] accessing the file that I have uploaded. [164.0s] So, this is just a very quick run [166.5s] through. Again, this can be for feedback [168.8s] on your assignments, feedback for uh a [171.8s] project that you're writing, perhaps uh [174.0s] an article that you're writing, or even [176.0s] a spreadsheet that you've put together [177.4s] and kind of want to analyze data a [179.2s] little more accurately. Any use case you [181.4s] can think of, this feature can be used. [183.4s] So, thanks so much for tuning in. As [185.7s] always, DM us or visit iblai/cont [190.3s] to learn more and experience this [192.3s] feature firsthand. --- ## Phone Calls URL: https://ibl.ai/docs/student/corefeatures/phonecalls Video: https://www.youtube.com/watch?v=9NKX8KJJ184 # Phone Calls ## Description The Phone Calls feature lets you converse with mentorAI by voice—hands‑free and in real time. Simply tap the microphone icon to start a voice session and speak your questions. mentorAI responds aloud, creating a natural, human‑like interaction that supports accessibility, language practice, and on‑the‑go learning. ![](/images/phonecalls.png) --- ## Target Audience **Student** --- ## Features #### Human‑Like Voice Interaction Chat with mentorAI as if you’re on the phone with a tutor—no typing required. #### Accessibility Support Ideal for visually‑impaired learners or anyone who finds reading difficult; listen instead of reading long text. #### Hands‑Free, On‑The‑Go Learning Ask multiple questions while commuting or multitasking, making study time more flexible. #### Language Development & Pronunciation Practice Hear correct pronunciation and practice speaking in a foreign language with immediate feedback. #### Lower Cognitive Load Listening to answers reduces screen fatigue and makes complex content easier to absorb. --- ## How to Use (step by step) #### Open Your Mentor Chat Navigate to the mentorAI chat for your course. #### Start a Voice Session Click the **microphone icon** (voice‑input button). A **phone‑call–style window** opens, and the mentor greets you. #### Ask Your Question Aloud Speak naturally—for example: > Hi mentorAI, can you help me practice ordering food in Spanish? #### Listen to mentorAI’s Response The mentor replies with spoken guidance and may prompt you to respond or practice. #### Switch Topics Anytime Simply ask a new question, such as: > Can you help me understand the difference between mitosis and meiosis? #### Request a Quiz Question Example: > Could you give me a quiz question on World War II Answer aloud; mentorAI confirms or corrects you. #### End the Session Say something like: > That’s all for now, thanks mentorAI. Then close the call window. --- ## Pedagogical Use Cases #### Language Practice Role‑play ordering food, booking travel, or simple conversations to improve pronunciation and fluency. #### Concept Explanations on the Go Get quick clarifications while commuting—no need to read lengthy text. #### Accessibility for Diverse Learners Support students with visual impairments or reading difficulties by delivering content audibly. #### Rapid Quiz Drills Request oral quiz questions to reinforce facts (e.g., history dates, biology terms) and answer verbally for immediate feedback. #### Reduced Screen Time Listen to explanations of complex theories, reducing eye strain and cognitive overload during long study sessions. --- With **voice‑to‑voice enabled**, mentorAI delivers rapid, conversational help that feels **personal** and is **accessible wherever you can speak and listen**—making learning easier, faster, and more engaging. ### Transcript mentorAI | Phone Calls https://www.youtube.com/watch?v=9NKX8KJJ184 [0.9s] Mentor AI allows you to have [2.9s] voice-tovoice interactions with a mentor [5.0s] like you were having a conversation over [6.9s] the phone with a friend. So this [9.5s] basically means that there's going to be [11.0s] a more natural humanlike interaction for [13.1s] the learners when they communicate with [14.9s] voice to the mentor. You can have that [16.9s] kind of back and forth you would in a [18.6s] traditional conversation with another [20.4s] person. This also improves um [23.2s] accessibility to the mentor platform um [25.6s] as a whole as this allows people who [28.1s] might be visually impaired, have reading [30.0s] difficulties or any sort of other [32.4s] challenges that prevent them from being [34.0s] able to read a whole bunch of text, um [36.7s] they can have these kinds of [37.8s] conversations with the mentor via voice [40.2s] instead. It also allows for faster [42.9s] communication and more onthe-go style [45.0s] learning [46.7s] um because you can ask, you know, a [48.1s] whole lot of things at one time and just [50.2s] keep speaking with the mentor. Like I [52.5s] said, as if you're having a conversation [54.0s] with like a tutor. [56.2s] This can also help you with things like [58.5s] language development. you know, you can [60.1s] ask the mentor how to pronounce certain [62.2s] things if, you know, you're learning a [63.8s] new language or are struggling um with [66.8s] like a scientific term that's a little [68.3s] bit more complex um for the average [70.5s] reader. Um this also helps lower [73.4s] cognitive load because learners won't [75.2s] have to read and process long blocks of [77.4s] text. All they have to do is simply [79.0s] listen and focus on understanding the [80.6s] material. [82.5s] That being said, let's hop into seeing [84.2s] what this looks like. [86.4s] So, if we want to chat with the mentor, [88.9s] we'll click on the voice input button [90.9s] here. [93.7s] And you'll see that this pops up a phone [95.8s] call. [97.6s] Now, I'm going to start chatting with [98.6s] the mentor. [101.4s] Great. IBLAI's mentor AI is designed to [105.8s] provide 247 support acting as a personal [109.3s] tutor or teaching assistant based on the [111.6s] material provided by your institution. [114.1s] Feel free to ask questions or engage in [116.9s] learning paths and mentor AI will assist [120.2s] you. If you have any specific questions [123.2s] or need further guidance, let me know. [127.3s] Hi, Mentor AI. Can you help me practice [129.4s] ordering food in Spanish? [133.0s] Of course. Let's practice ordering food [135.9s] in Spanish. I'll start by giving you an [138.2s] example and then you can try ordering [140.5s] something yourself. Example. [145.8s] I would like a pizza, please. Now you [148.7s] can try ordering something. What would [150.7s] you like to order? [158.1s] Perfecto [161.6s] means I would like an ice cream, please. [164.8s] Would you like to continue practicing or [166.8s] try ordering something else in Spanish? [169.8s] No, thank you. I do have another [171.5s] question for you about a different [173.0s] topic. [174.6s] Can you help me understand the [177.0s] difference between mitosis and meiosis? [181.2s] Certainly. Mitosis and meiosis are two [184.6s] different types of cell division. [186.8s] Mitosis is the process by which a single [189.2s] cell divides to produce two genetically [192.4s] identical daughter cells, each with the [195.0s] same number of chromosomes as the parent [197.2s] cell. It is essential for growth, [200.2s] repair, and asexual reproduction. [203.3s] Miosis, on the other hand, is a [205.7s] specialized form of cell division that [208.2s] produces four genetically diverse [210.7s] daughter cells, each with half the [213.1s] number of chromosomes of the parent [215.0s] cell. It's crucial for sexual [217.4s] reproduction, and ensures genetic [219.8s] diversity in offspring. Would you like [222.6s] more details on either process? No, [225.8s] thank you. I do have a history test [228.0s] tomorrow. Could you give me a quiz [229.7s] question on World War II? [233.0s] Sure. Here's a quiz question for your [235.3s] World War II test question. Which event [238.5s] is commonly considered the starting [240.2s] point of World War II? A. The bombing of [243.4s] Pearl Harbor. B. The signing of the [245.8s] Treaty of Versailles. C. The invasion of [248.5s] Poland by Germany. D. The Battle of [251.2s] Britain. What's your answer? C. [257.3s] Correct. The invasion of Poland by [260.1s] Germany in 1939 is commonly considered [263.7s] the starting point of World War II. If [266.8s] you need more questions or further [268.7s] assistance, feel free to ask. That's all [272.4s] for now. Thanks, Mentor AI. [276.2s] And basically, that's all you need to do [277.7s] to have a conversation. Uh it's very [279.6s] similar to, you know, how you would [281.4s] message the mentor or have a [282.8s] conversation with another person. Um and [285.4s] yeah. --- ## Screen Share URL: https://ibl.ai/docs/student/corefeatures/screenshare Video: https://www.youtube.com/watch?v=eimzO8YJ5nc # Screen Share ## Description The Screen Share tool lets mentorAI “see” the browser tab or window you choose to share. By viewing your screen in real time, the mentor can provide context‑aware, step‑by‑step guidance—perfect for technical demos, troubleshooting, or walking through tasks like building a new AI mentor. --- ## Target Audience **Student** --- ## Features #### Real‑Time Context Awareness mentorAI watches your shared screen, understands what you’re seeing, and tailors instructions accordingly. #### Click‑by‑Click Guidance Receive precise directions (“Click the Create + button…”) as you navigate complex workflows. #### Live Confirmation & Troubleshooting Ask questions like > “Did my resource finish training?” and get immediate reassurance or corrections. #### On‑Demand Tool Toggles Screen sharing pairs seamlessly with other mentor tools (**Web Search**, **Code Interpreter**, **Image Generation**, etc.), which you can enable or disable as needed. #### Privacy Control You choose exactly which window or tab to share and can stop sharing at any moment. --- ## How to Use (step by step) #### Ensure Screen Sharing Is Enabled Verify that your mentor’s **Screen Sharing** toggle is **On** (in mentor settings). #### Open a Chat Session Start chatting with the mentor you want to share your screen with. #### Start Screen Share - Hover over the **desktop‑screen icon** in the lower‑right corner of the prompt box - Click the **icon** - Select the **browser tab or window** you wish to share and approve the prompt #### Interact & Receive Guidance The mentor greets you and begins **context‑specific help**. Follow its spoken or written instructions as it “watches” what you do. #### Ask Questions in Real Time Example: > Did my resource finish training? The mentor checks the on‑screen status and confirms instantly. #### Stop Sharing When Done Close the screen‑share window or click **Stop Sharing** to end the session. --- ## How to Use (Screensharing Transcript) #### Purpose After a **screen-sharing session** with the mentor (e.g., in a course side panel), the **voice conversation transcript** appears in the chat automatically once screen sharing ends. #### Steps 1. **Start screen sharing** and choose what to share (e.g., *Full Window*). 2. **Speak with the mentor.** **You:** “Can you see my screen?” **Mentor:** “Yes, I can see your screen. You are on a page titled *War of 1812*…” **You:** “Great. Can you summarize what you’re seeing on the page?” **Mentor:** *(Summarizes the page:)* - Causes: British impressment, support for Native American attacks - Impacts: Boosted American nationalism, weakened Native resistance - Confirms U.S. independence 3. **Stop screen sharing.** 4. **Refresh the page.** The mentor’s **on-call description** from the screen-sharing session now appears in the **chat transcript**, allowing you to reference what the mentor observed even after sharing ends. --- ## Pedagogical Use Cases #### Technical Walkthroughs Get live, click‑by‑click assistance when creating a new AI mentor, configuring datasets, or adjusting model settings. #### Coding & Debugging Sessions Share an **IDE tab** so mentorAI can point out syntax errors or suggest refactors as you code. #### Software Demonstrations Let the mentor observe your use of **design tools**, **spreadsheets**, or **simulations** and provide targeted tips. #### Lab Procedure Guidance Stream a **virtual‑lab environment**; mentorAI ensures you follow each step correctly and safely. #### Accessibility Support Learners who struggle with written instructions benefit from **visual, real‑time guidance** accompanied by **voice or text explanations**. --- With **Screen Share** plus mentorAI’s **context awareness**, you receive **immediate, personalized help—click by click**—making onboarding, troubleshooting, and complex tasks faster and clearer. ### Transcript mentorAI | Screensharing | In-Chat Transcripts https://www.youtube.com/watch?v=dSJO3_jjAc4 [1.9s] When screen sharing with the mentor um [4.6s] on a given page, this could be embedded [6.3s] in a course like you see here with the [7.9s] side panel. We now show the chat [10.3s] transcript, the conversation you have [12.0s] voice to voice with the mentor inside [13.8s] the chat after screen sharing ends. So [16.7s] this is what that will look like. First, [18.3s] we'll do screen sharing. And then I'm [20.9s] just going to select what I want to [21.9s] share, which I'm going to share my full [23.1s] window. [25.2s] >> Can you see my screen? [28.3s] Yes, I can see your screen. You are on a [30.9s] page titled War of 1812, which discusses [33.6s] the causes and impacts of the war. [37.3s] >> Great. Can you summarize what you're [39.0s] seeing on the page? [42.4s] >> The page discusses the War of 1812, [44.9s] which was caused by American anger over [47.1s] British impressment of sailors and [49.1s] support for Native American attacks in [51.0s] the American Midwest. It also mentions [53.1s] that the war boosted American [54.8s] nationalism but weakened Native American [57.3s] resistance and confirmed the US as an [59.4s] independent nation. [63.8s] All right. Now I'm going to stop screen [65.3s] sharing and once we refresh, [69.4s] see [78.2s] you can see that the page uh description [80.6s] that they gave us during the screen [82.2s] sharing now appears here. --- ## Socratic Mode URL: https://ibl.ai/docs/student/corefeatures/socraticmode Video: https://www.youtube.com/watch?v=1_E4rFHK-No # Socratic Mode ## Description **Socratic Mode** helps mentors teach **thinking—not just answers**. Instead of delivering full solutions, mentorAI gives a concise nudge and follows up with guiding questions that drive **analysis, application, and comparison**. It’s ideal for universities shifting away from rote memorization toward deeper **critical-thinking skills**. --- ## Target Audience **Administrator · Instructor** *(students benefit, but do not configure)* --- ## Features #### Structured Replies by Design Each response gives a **1–3 sentence high-leverage insight**, then **2–4 guiding questions** to push reasoning forward. #### Source-Aware Prompts (Optional) When paired with **Document Retrieval**, replies can cite the exact section/page (e.g., *“Intro to Political Science — Ch. 3, civil liberties”*) and offer hints or case studies from the text. #### Simple Setup In any mentor’s **System Prompt**, set the interaction style to *“Socratic by default.”* Save changes—edit anytime. #### Starter Templates for Students Add **built-in question starters** so learners can begin even when they’re unsure how to ask. #### Instructor Control, Learner Safety Admins/instructors refine **tone, question patterns, and citation behavior**; students cannot alter configuration. --- ## How to Use (step by step) #### Open the Prompt Editor - Click the mentor’s name → **Prompts → System Prompt** #### Set the Socratic Style Add language such as: > Core interaction style is Socratic by default. In each reply, share a brief (1–3 sentence) high-level insight, then ask 2–4 guiding questions that prompt the student to reason, apply, compare, or connect concepts. Avoid full, exhaustive answers by default. Offer hints or case studies on request. Cite sources when available. Click **Save**. #### (Optional) Pair with Document Retrieval Enable **Document Retrieval** so replies cite specific sections/pages and the **Source Panel** lists the documents used. #### Add Starter Templates In **Prompts**, create **Suggested/Guided starters** to help students begin. #### Test the Experience (Student View) - Use (for example) a **Political Science 101 mentor** trained on an OpenStax text - Ask: > “What factors impact civil liberties in democracies?” - Expect a concise **1–3 sentence overview** plus guiding questions like: - How might changes in public opinion influence the protection of civil liberties? - In what ways can judicial decisions enhance or restrict civil liberties? - If **Document Retrieval** is enabled, the reply may cite a specific chapter/section. #### Continue with Follow-Ups - Example: > “Can you explain political legitimacy further?” - The mentor again provides a **high-level nudge**, cites source(s) if enabled, and follows with **2–4 new guiding questions** #### Iterate & Refine - Adjust the **System Prompt wording, tone, or number of guiding questions** as you review outcomes in **History** --- ## Pedagogical Use Cases #### Critical-Thinking Drills Replace answer-dumping with **guided questioning** that elicits reasoning, evidence use, and counter-examples. #### Compare/Contrast Exercises Prompt students to **weigh competing theories, cases, or policy choices** (e.g., civil liberties trade-offs). #### Application to Case Studies Pair with **Document Retrieval** so students jump into a cited passage/case and **apply concepts to real examples**. #### Formative Assessment Without Spoilers Give **hints + questions** instead of full solutions; learners articulate steps and **self-correct**. #### Writing & Argumentation Use guiding questions to structure **thesis, evidence, and rebuttals** while avoiding over-scaffolding. #### Across Disciplines Works for: - **Humanities** (interpretation/ethics) - **Social Sciences** (institutions/legitimacy) - **STEM** (conceptual understanding before computation) --- **Socratic Mode** turns mentorAI into a **coach for thinking**: concise cues, targeted questions, and source-aware nudges that help students build **durable understanding**—without reverting to rote memorization. ### Transcript mentorAI | Socratic Mode Demo https://www.youtube.com/watch?v=1_E4rFHK-No [0.3s] Hey, what's going on everybody? This is [1.9s] Jeremy at IBLAI and today I wanted to [4.9s] show you one of the features that comes [6.8s] up most frequently when we're talking [8.5s] with people about how to best implement [11.1s] AI at the university level and that is [15.1s] focus on critical thinking. So most of [18.5s] us who work within education especially [20.8s] public education are aware that [23.5s] promoting wrote memorization has been [26.1s] you know pretty popular over the past 15 [29.3s] 20 years at a detriment to highquality [32.0s] learning. Now there's a shift back [34.2s] towards focusing on critical thinking [36.6s] and promoting a rich deep dive into [41.5s] knowledge instead of just memorizing [43.9s] core facts that you can regurgitate [46.2s] later. So a very common question that we [49.0s] get is how through these mentors are you [53.0s] able to promote critical thinking and [55.4s] the way that we are able to do this is [57.7s] through Socratic mode so to speak. What [61.8s] I mean by this is within the system [64.6s] prompt for each one of these mentors, [66.4s] which you can think of as the [67.8s] instruction manual, you can kind of [71.3s] design the mentor's personality, its [74.0s] style of communication, and more. So, [76.6s] within the system prompt, you can let [78.2s] the mentor know that you shouldn't be [80.6s] sharing an entire answer with students. [83.0s] That's not helping them. That's actually [84.6s] hurting them in the long run. What you [86.6s] should instead do is provide a piece of [89.6s] the information that they're looking for [91.7s] and then follow that up with a series of [93.6s] prompting guiding questions to allow [97.0s] students to kind of continue down that [98.8s] knowledge path on their own and build on [101.4s] that foundation of knowledge instead of [103.7s] just memorizing, [106.3s] you know, rope facts. So, what I'm [109.0s] showing you here is a political science [111.0s] 101 mentor. It's trained on the open [115.5s] stacks uh open resource introductory to [118.6s] political science textbook [121.0s] and within the system prompt [123.9s] I will show you [126.6s] basically how we set this up. So the [128.6s] core interaction style should be [130.5s] socratic by default meaning that the [132.6s] mentor should not provide full [134.5s] exhaustive answers by default. In each [137.7s] reply, share a small high lever piece of [140.8s] the answer, no more than one to three [142.8s] sentences, and then add two to four [145.1s] guiding questions that prompt the [146.6s] student to reason, apply, or compare [149.4s] concepts within the system prompts for [151.9s] the mentors that we have. You can come [153.5s] back and make any changes at any time [155.3s] that you see fit just by editing what's [157.8s] already there, uh, copying and pasting [160.3s] something that you may have written [161.7s] somewhere else, uh, or just deleting [164.1s] portions of the system prompt that [165.4s] aren't working for you. So, if you don't [169.0s] have a Socratic style communicating [171.8s] mentor, it's as simple as just going [173.9s] into the system prompt and letting it [175.8s] know that that's how it should [177.1s] communicate with students. Once we've [179.5s] saved that, [181.4s] the mentor now knows that's how I'm [183.8s] going to communicate. So, what we have [185.8s] down here are a number of starter [188.3s] templates, so to speak. These are [190.3s] questions that students can just select [192.2s] to kick off a conversation. If they come [194.3s] into the mentor and know that they need [197.0s] help, but don't really know where to get [198.6s] that conversation started, this is just [200.9s] one way to kick that off. So, for the [203.7s] purpose of this demo, I'm going to [205.0s] select one of these. What factors impact [207.8s] civil liberties in democracies? [211.5s] So, the mentor like I uh have instructed [214.2s] it within the system prompt is going to [216.6s] provide that quick high-level one to [219.1s] three sentence overview response for the [222.0s] question that I asked, but then follow [224.0s] that up with a number of questions such [225.7s] as how might changes in public opinion [228.0s] influence the protection of civil [229.5s] liberties? In what ways can judicial [231.9s] decisions either enhance or restrict [233.7s] civil liberties? And more. And that's [235.9s] how it's going to communicate throughout [237.3s] this entire conversation. Basically, [240.2s] giving a piece of that information, [242.2s] citing within the textbook where it got [244.4s] that information, and then following it [246.6s] up with a number of guiding questions [249.1s] that kind of continue this conversation [251.0s] and allow the student to think on think [253.8s] of this on a deeper level and kind of [256.9s] develop their own knowledge. It will [259.2s] also ask if you'd like a hint on a [261.5s] relevant case study or an example from [263.5s] the text would which would just allow [265.6s] students to go directly into that [267.7s] section of the textbook and get more of [270.5s] that information that they're looking [271.8s] for. But just to show you one more [275.4s] example, I'm going to ask another guided [277.4s] question here. Can you explain political [279.4s] legitimacy further? And again, the [281.6s] mentor does exactly what I want it to [283.3s] do. It's citing where it pulled the [285.5s] information from, giving me a very [287.8s] quick, high-level overview for the [290.0s] question that I've asked, and then [291.6s] following that up with three guiding [293.8s] questions to continue that line of [295.7s] reasoning. Thank you so much for tuning [297.8s] in, and as always, learn more at ibl.ai. [300.3s] AI. --- ## Surveys URL: https://ibl.ai/docs/student/corefeatures/surveys Video: https://www.youtube.com/watch?v=bISbLUNoJTc # Surveys ## Description Surveys allow mentorAI to politely ask for your feedback at the right moment during a conversation—without interrupting your learning. Based on simple rules set by your instructor (for example, after a few messages or during a later chat session), the mentor shares a one-time survey link so you can help improve the experience. --- ## Target Audience **Student** --- ## Features #### Smart Timing Surveys appear only after specific conditions are met (e.g., after 5 messages or during your 3rd chat session). #### Non-Interruptive Delivery The mentor continues answering your question and adds the survey link at the end of a message, so your flow isn’t disrupted. #### One-Time Prompt Each survey link is shown only once per student—no repeated reminders. #### Optional Participation You can complete the survey, reply “done”, or “skip”. Either way, the mentor won’t ask again. #### Context-Aware Feedback Because the prompt is triggered after real interaction, your feedback reflects an authentic experience. --- ## How to Use (step by step) #### 1. Chat with Your Mentor Start a normal conversation and ask questions as usual. You may also click guided prompts to continue. #### 2. Reach the Survey Trigger After a certain number of messages or sessions (set by your instructor), the survey rule activates automatically. #### 3. Receive the Survey Link The mentor responds to your current question and then adds a short note like: > “If you have a moment, I’d appreciate your feedback.” A clickable survey link appears at the end of the message. #### 4. Choose What to Do - Click the link and complete the survey, **or** - Reply **“done”** if you’ve completed it elsewhere, **or** - Reply **“skip”** if you don’t want to participate. #### 5. Continue Learning No matter what you choose, the mentor will not show the survey again and the conversation continues normally. --- ## Pedagogical Use Cases #### Improving Mentor Quality Your feedback helps instructors fine-tune explanations, prompts, and learning resources. #### Course Experience Insights Surveys give instructors insight into how helpful the mentor is for real coursework. #### Low-Pressure Feedback Because the survey appears naturally within chat—and only once—it’s easy to respond without feeling interrupted or overwhelmed. #### Student Voice Even if you skip, the option ensures learners are invited (not forced) to share their perspective. --- Surveys are designed to be respectful, optional, and well-timed, giving you a voice while keeping mentorAI focused on what matters most—your learning. ### Transcript mentorAI | Learner Survey https://www.youtube.com/watch?v=bISbLUNoJTc [1.7s] I'm going to be walking you through how [4.2s] to have a mentor that can trigger a [7.0s] certain message at a certain time. Um [10.7s] that could be, you know, after five [12.6s] messages or within the third chat [15.1s] session um with a given mentor. In this [18.2s] case, we're using a survey link. And you [21.0s] can see here we just have our default [22.6s] system prompt um as per usual but we [26.1s] actually have added this survey rule um [29.0s] which is essentially telling the mentor [31.3s] at some point during the conversations [33.8s] you're having with a particular student [36.2s] uh you're going to provide them with a a [38.2s] survey link. Now, in this case, I have [40.7s] this set up to be a test mode. So, it's [43.6s] not going to be, you know, 30 messages [46.1s] down the road or, you know, after their [48.5s] fifth or sixth session with the mentor [50.8s] or conversation. [snorts] [53.0s] Um, just for test purposes, but that can [55.1s] obviously be modified um to your [57.3s] preference here. [snorts] And now in [59.1s] this case we are giving it a specific [61.9s] trigger condition saying you know once [64.0s] they've had a few messages again five [66.5s] messages or um you know during the third [69.7s] session in this case [72.1s] um that is when the learner will be [74.7s] prompted with the link to the survey. [77.6s] And another thing that we tell the [79.0s] mentor here is uh you know you're only [81.6s] sending the link to the survey one time [84.3s] to the specific student not multiple [86.6s] times. Um, you don't want them to [89.2s] re-evaluate the survey link. Um, once [91.8s] the mentor said this to the student, we [93.8s] don't mention it again. Um, and this is [97.0s] just a one-time event. [99.7s] Another thing we have here is the survey [102.2s] link. Um, just to include alongside, you [105.8s] know, the conversation being held. One [108.0s] of the things that happens here when a [110.4s] student is talking to the mentor is it's [112.9s] going to be at the end of a message to [115.2s] the to the student just essentially [117.4s] trying to prevent any sort of issues [119.4s] with [120.5s] >> [snorts] [120.6s] >> uh interruptions to to the actual [122.8s] conversation that they have. We don't [124.1s] want to you know distract them too much [125.8s] but we do want to encourage them to take [127.6s] this survey. [129.4s] Another thing here is that we say we can [131.1s] have the learner reply with done or skip [133.4s] if they decline. Uh but regardless of [136.6s] their response, the mentor should not [138.9s] prompt the user again with this link. [143.1s] Now let's dive into the learner side. So [146.2s] we'll just start asking some questions. [148.7s] I'll ask you know what is the powerhouse [153.5s] of the cell start. [159.3s] I'm just going to keep clicking on some [161.3s] of these guided prompts right here. Um, [164.2s] so that way we can just continue the [165.7s] conversation and get that survey link. [169.8s] So, I'm going to do three messages in [171.3s] this chat [173.4s] and then I'll start a new conversation. [177.7s] See, [190.6s] sorry. [207.0s] All right. Now, after going through a [208.4s] handful of questions, we finally get [210.6s] this survey link. And you can see here [212.4s] it says, "If you have a moment, I'd [214.3s] appreciate your feedback." It's not, you [216.9s] know, interrupting the conversation. It [219.3s] still answers the question at hand. Um, [222.0s] but we are able to see that we do get [224.2s] this link. Now, if we just keep chatting [226.2s] a few more times, I'll go through about [228.0s] five questions. [242.6s] All righty. So, that's about --- ## Web Search URL: https://ibl.ai/docs/student/corefeatures/websearch Video: https://www.youtube.com/watch?v=uhhTxbeYfQw # Web Search ## Description The Web Search tool equips a mentorAI with live internet access, allowing it to pull the most current information available—stock prices, breaking news, fresh research, policy deadlines, and more—directly into the chat. Instructors enable the tool from the mentor’s Tools tab; learners then ask open‑ended queries and receive up‑to‑date answers without leaving the platform. ![](/images/websearch.png) --- ## Target Audience **Student** --- ## Features #### Live Internet Queries mentorAI fetches the latest data—financial figures, leadership changes, release announcements, deadlines, etc.—in real time. #### MCP Integration (enabled by default) Connects to MCP server endpoints for proprietary or institution‑specific data alongside public web results. #### Tool Toggle Control Instructors decide when Web Search (and any other optional tools) are available to learners. #### Contextual Results Answers include citations or summaries so students can verify sources and dig deeper. #### Expandable Toolset Additional tools—**Code Interpreter**, **Image Generation**, **Screen Share**, etc.—can be enabled from the same **Tools** tab for richer mentor capabilities. --- ## How to Use (step by step) #### Open the Tools Tab - Click the **mentor’s name** in the header - From the drop‑down, select **Tools** #### Review Default Tools - **Web Search** → live internet queries (**ON by default**) - **MCP** → connects to MCP server endpoints (**ON by default**) - Toggle additional tools **on or off** as needed #### Enable or Disable Web Search - Flip the **switch** next to **Web Search** - Changes take effect **immediately** for that mentor #### Guide Learners on Usage (examples they can try) - “Search the web for current stock prices of Nvidia.” → Returns latest price (e.g., as of 25 June) - “What are the latest AI tools released this month?” → Lists recent AI releases - “Who is the current president of Mexico?” → Provides the current leader - “What’s the latest guidance on FA deadlines?” → Retrieves up‑to‑date financial‑aid information - “Compare tuition fees for online bachelor’s degrees in 2025.” → Supplies universities and costs (per credit, semester, or year) #### Monitor & Support - Encourage students to **ask follow‑up questions** if a result needs clarification or deeper sourcing - Remind them that **Web Search augments**, not replaces, **vetted course materials** --- ## Pedagogical Use Cases #### Current‑Events Research Students investigate unfolding news stories or technological advancements in real time. #### Financial Literacy Lessons Pull live market data (e.g., Nvidia stock price) to practice interpreting financial trends. #### Civic & Global Studies Query up‑to‑date information on world leaders, election results, or international treaties. #### Admissions & Tuition Comparisons Quickly compare program costs or application deadlines across institutions for informed decision‑making. #### Policy & Compliance Updates Retrieve the latest guidance on regulations, grants, or financial‑aid deadlines that change frequently. --- With **Web Search enabled**, mentorAI becomes a **live research assistant**—delivering **current, hard‑to‑find answers** right inside the chat and empowering learners to **stay informed** in a rapidly changing world. ### Transcript mentorAI | Tools | Web Search https://www.youtube.com/watch?v=uhhTxbeYfQw [1.0s] One of the features of Mentor AI is the [3.1s] ability to enable tools. Um to get to [5.9s] the tools, you'll go to the mentor name [8.6s] right here and you'll see a drop down. [11.4s] On this dropown, you'll select the tools [13.0s] tab [14.6s] and you'll see that there are two [15.9s] enabled by default. There are a couple [17.7s] of other tools that we can enable if you [19.6s] so choose. Um but the two that we have [22.3s] by default are their web search tool, [24.2s] which allows you to actually search um [26.6s] for information on the web that's more [28.3s] current. Um, and then we also have the [31.0s] MCP tool, which is a tool that lets you [33.3s] connect to MCP server endpoints. I've [36.0s] already got the search tool enabled here [37.7s] so I can show you exactly how this [39.4s] works. [41.7s] So, one of the first questions I'll ask [44.0s] just so you can see what it looks like [45.1s] if you say search the web. What? Search [47.7s] the web for current stock prices [53.8s] of Nvidia. [56.3s] And this will pull the most up-to-date [58.2s] information. You can see that the date [60.5s] that the mentor said is June 25th, which [62.6s] is the day I that I'm recording this. We [65.5s] can say, let's see what tool what the [68.6s] latest AI tools are this month. What are [72.3s] the latest AI tools released [76.8s] this month? [80.6s] So now we're just kind of asking about [82.2s] like you know what's what's going on in [84.0s] the world um in the world of AI [85.5s] specifically. Um and you can see that [87.3s] the mentor is actually giving us [88.5s] information about all the latest [89.7s] releases for this month. Now it doesn't [92.4s] have to be just you know related to [95.0s] stocks or new tools that are coming out. [97.4s] We can even say who [100.7s] is the current president of Mexico. [107.6s] So, we can touch on, you know, [110.9s] new tools, uh, down to current stock [114.0s] prices, which is very much the latest [116.1s] information, and even ask questions [118.1s] about what's going on just in the world [119.8s] as it is. Um, it also allows you to ask [123.6s] things like, you know, what's going on [125.5s] in the university world. So we could say [127.5s] what's [129.1s] the latest guidance on FA [137.3s] deadlines. [140.7s] So now we're, you know, diving more into [142.8s] like a a more difficult thing that would [145.2s] be harder to find on the web just by [147.0s] searching. So this is actually making it [149.2s] much easier for me to find the [150.6s] information that I'm looking for. Um, if [153.1s] we also want to look at, let's just say [155.8s] we're, you know, a a student who's just [157.9s] about to graduate high school and are [159.8s] looking for their best options in terms [162.1s] of the financial cost of a university. [165.4s] So, we're going to try to compare [167.4s] tuition [170.6s] fees for online [174.9s] bachelor's [176.8s] degrees [178.5s] in 2025. [181.8s] So again, the mentor is going to search [183.9s] the web for this and provide us with a [186.5s] bit of information. Um, it's even [188.4s] pulling specific universities for what [190.4s] the tuition is. It could be down to the [192.5s] credit hour or, you know, tuition for [194.6s] that semester or for the full year. Um, [196.7s] just depending on what information is [198.0s] actually available on the internet. And [201.1s] that's basically how you use the web [202.7s] search tool. Like I said, there are a [204.4s] handful of other tools that can be [205.6s] enabled. Uh, and we're happy to help [207.3s] with that. --- ## Memory URL: https://ibl.ai/docs/student/personalization/memory Video: https://www.youtube.com/watch?v=QFta20kyDQY # Memory ## Description **Memory** lets mentorAI **remember what you share about yourself**—then use it to **personalize future answers**. By default, **Memory is on**. When **Reference Saved Memories** is enabled by your school, the mentor can actively draw on your saved memories (e.g., personal info, knowledge gaps, help requests, lessons learned) to tailor replies and progress checks. ![](/images/memorystudent1.png) --- ## Target Audience **Student** --- ## Features #### On by Default Your mentor remembers details you tell it during chat (e.g., “I’m new to Linux,” “I want help with citations”). #### Reference Saved Memories (Institution Toggle) When enabled, mentorAI uses your saved memories to **customize answers** and **study plans**. #### Memory Categories - **Personal Information** – major, interests, preferences - **Knowledge Gaps** – topics you find difficult - **Help Requests** – areas you asked support for - **Lessons Learned** – what you’ve mastered recently #### Instructor Insight (Read-Only for Students) Instructors and admins can search for a student, review high-level memory, and (on their side) add or edit entries to better support you. --- ## How to Use (step by step) #### Chat Normally Share goals, strengths, and struggles as you work. Memory captures helpful details automatically. #### Ask Memory-Aware Questions Try asking: - “Summarize my progress and knowledge gaps.” - “Based on my history, what should I study next?” - “Remind me what I asked about Linux and why it matters.” #### Get Tailored Guidance mentorAI pulls from your **previous chats** (and, when enabled, your **saved memories**) to provide **targeted suggestions, quizzes, and resources**. #### Keep It Up-to-Date If something changes—new major, goal, or challenge— tell the mentor so Memory stays current. #### See the Benefit Over Time As your history grows, replies become more specific to you (e.g., deeper Linux practice if you’ve asked “What is Linux?” and “What is its purpose?”). --- ## Pedagogical Use Cases #### Personalized Study Plans Generate next-step study plans that reflect your **knowledge gaps** and **lessons learned**. #### Targeted Remediation Ask for drills or practice on **weak topics** the mentor already knows you struggle with. #### Progress Reviews Request periodic summaries of what you’ve asked, learned, and what to focus on next. #### Preference-Aware Support Small facts (even favorites) can be saved—useful for **examples, motivation, and tone**. --- ## Notes for Students - Memory helps mentorAI **personalize your experience**; you won’t see admin tools. - If your school enables **Reference Saved Memories**, the mentor’s responses will more explicitly reflect your **saved profile**. ### Transcript mentorAI | User Memory https://www.youtube.com/watch?v=QFta20kyDQY [1.0s] Hello. Today we're going to be talking [2.6s] about memory and how you can use the [5.0s] memory of a user to be referenced uh via [8.1s] the mentor app. So by default your [11.0s] mentors will have memory on. This means [13.0s] that whenever somebody says something [14.7s] about themselves um the mentor can use [18.1s] or stores that information about the [19.7s] user. You can also include referencing [22.9s] saved memories. So you can see here that [26.0s] with save memories, if this is toggled [27.8s] on, [30.0s] the mentor will actually be able to use [31.9s] the memories about the user when [34.0s] responding to tailor the responses to [35.8s] them directly. This can be something [38.4s] like personal information, knowledge [41.4s] gaps, help requests, and lessons [43.5s] learned. And we're just going to take a [45.0s] quick look at my user. You can also [47.3s] search for specific users of course um [49.8s] in case you want to look at a particular [51.7s] user who is you know struggling with a [54.7s] certain topic in your course or needs [57.4s] assistance in some way shape or form or [59.1s] you even just want to see if people are [61.3s] dressing while they're continuing their [63.1s] educational journey. And you can see [65.5s] here that I I've asked a handful of [68.2s] questions that things um more on the [70.5s] technical side. What is Linux? What is [72.9s] its purpose? And this can be used um to [76.8s] essentially you know again tailor [78.3s] responses to the user but also as an [81.3s] instructor it's really helpful to [82.8s] understand you know where people are [84.8s] struggling. You can also edit this um as [88.2s] attendant admin [90.6s] uh and you can also add memories. [93.8s] So you can, you know, select a category [102.2s] and it can be even arbit even be just as [104.6s] arbitrary as somebody mentioning their [106.3s] favorite color or you know adding that [108.6s] memory for a specific user in this case. [112.8s] Now [114.5s] what we'll do now is take a pause and [117.6s] I'm going to log in as one of my test [119.4s] users so we can see how the memory [121.0s] actually is getting tailored. [123.8s] So, now we're logged in as a regular [125.4s] user um that has no access to any of the [128.5s] admin capabilities that instructors [130.2s] have. You can tell because I don't have [131.7s] the instructor to learn or toggle and I [133.9s] also don't have the ability to create [135.4s] members. I'm going to ask the mentor to [138.9s] summarize [141.4s] my progress and knowledge [145.8s] gaps. [151.0s] And you can see here this is actually [152.3s] pulling information from the memory um [154.5s] from previous chats that this particular [156.7s] user has had with the mentor. [161.6s] And there you have it. Simple as that. ---