Adexia is an AI tool designed to assist with VCE English, offering three main uses: creating class activities for teachers, providing unlimited practice for students, and aiding in SACs and trial exams for benchmarking and grading.
Chapters
00:00
What is Adexia?
Introduction to Adexia and its purpose
00:23
Core Use Cases
Overview of the three core use cases
00:59
Teacher Class Activity
Demonstration of the first use case: teacher creating a class activity
Transcript
00:00
Adexia is an AI which can provide expert feedback for VCE English, trained and calibrated by VCAA assessors.
00:08
There are three core use cases for the technology.
00:11
The first use is as a teacher to create class activities, the second is for students to do unlimited practice.
00:18
And the third is its use
00:20
in SACs and trial exams to facilitate benchmarking, blind grading and moderation.
00:26
To start, I'll go through the first use case of a teacher using this to do a class activity.
00:30
To do this, you just open your class and click the Create Activity button in the top right.
00:35
You simply choose the unit, the text or framework, and the activity type.
00:40
Say you want a body paragraph or a full essay; then you can either write your own prompt or choose one from our library.
00:46
After filtering by theme, command term, difficulty, and whether you want a passage, and then choosing one or multiple prompts for your students to respond to, you can choose the due date.
00:55
Let's say we want it to be due next Wednesday.
00:58
You can choose whether you're happy with the AI giving feedback instantly, or if you want to review it first; whether you want it providing feedback only, or a grade as well; and whether you're happy with the students typing, or if you want them to handwrite the response and then use our scanning software on their phone or laptop webcam to upload it.
01:17
Afterwards, you can publish directly or save it in test mode.
01:20
In test mode, you can view the activity as a student and actually submit as a student would.
01:25
There you can see the instructions and then type your response.
01:28
After typing some of the response, the student can submit to get feedback.
01:32
Afterwards, the AI will provide feedback, both inline on the left-hand side of the script and consolidated on the right-hand side.
01:39
The student can then read and implement this feedback, typing directly into the page and ticking "fixed" on each of the feedback points.
01:46
And then once they're happy with it, they can then submit to get more feedback.
01:49
And then continue to iterate on the response, making it better and better.
01:52
And they can also see all of their past submissions in the dashboard here.
01:56
Then, returning to the teacher view, the teacher sees all of their students, and can also see whether each student is on the platform at this current second.
02:04
And then inside each student, they can review all of their submissions alongside the feedback received for each one, and also view an AI report, where they can watch a full replay of the student's creation of the response. You can see here that it was all just magically pasted in at one point in time, with 99% of the words being pasted and 100% of those being more than 90% likely AI, enabling you to ensure appropriate AI use.
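The paste statistic mentioned here could, in principle, be computed from a replay log. The sketch below is an illustration only: the event shape (`kind` and `text` fields) is an assumption, not Adexia's actual data model.

```python
def paste_stats(events):
    """Percentage of words that arrived via paste events.

    `events` is a hypothetical replay log: each entry has a "kind"
    ("type" or "paste") and the "text" it contributed.
    """
    typed = pasted = 0
    for e in events:
        words = len(e["text"].split())
        if e["kind"] == "paste":
            pasted += words
        else:
            typed += words
    total = typed + pasted
    return 0.0 if total == 0 else 100 * pasted / total

# One word typed, 99 words pasted in a single event.
events = [
    {"kind": "type", "text": "Intro"},
    {"kind": "paste", "text": " ".join(["word"] * 99)},
]
print(paste_stats(events))  # 99.0
```

Classifying how likely each pasted span is to be AI-generated is a separate judgment the platform makes; this sketch only covers the word counting.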
02:30
If you click this button here, you can also see a cross-submission summary of that student's strengths, weaknesses and next steps alongside the replay. Then, once you're happy with it, you can publish it to your students.
02:42
So that's the class creation workflow.
02:44
Secondly, you have the optional student practice: if I view as a student, you can see that they always have this unlimited practice tab, where they can create practice activities similar to the one the teacher just made, but just for themselves.
02:56
And then on the teacher side, you'll see all the student submissions as they come in here, and you can click in and review any of their work.
03:03
Then, for high-stakes usage in SACs and trial exams, we have dedicated benchmarking, blind grading and moderation workflows to drastically increase the accuracy and reliability of judgments.
03:16
Firstly, in benchmarking, you can assign a random selection of scripts, let's say 15, for everyone in the teaching team to grade independently. When they open a script, they don't see any grades or feedback from Adexia until they put in their own grade and feedback, after which they can reveal Adexia's judgments.
03:34
Then, once everyone has reviewed all of their scripts, you go on to phase two, where we reveal everyone's grades across all of the benchmarking samples. You can then order those grades by the spread across all of the graders, and pre-fill the final agreed grade based on the median result, excluding outliers.
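A minimal sketch of what this pre-fill step might look like. The outlier rule here (dropping the single lowest and highest mark when three or more grades exist) is an assumption for illustration; the video does not specify Adexia's actual rule.

```python
from statistics import median

def prefill_agreed_grade(grades):
    """Suggest an agreed grade from independent graders' marks.

    Assumption: with three or more grades, drop the single lowest and
    highest mark before taking the median; the platform's actual
    outlier-exclusion rule is not specified in the video.
    """
    if not grades:
        raise ValueError("no grades to aggregate")
    trimmed = sorted(grades)
    if len(trimmed) >= 3:
        trimmed = trimmed[1:-1]  # drop min and max as outliers
    return median(trimmed)

def grade_spread(grades):
    """Spread used to order scripts for the calibration meeting."""
    return max(grades) - min(grades)

# Five teachers grade the same benchmarking script out of 10.
grades = [7, 8, 8, 9, 4]
print(prefill_agreed_grade(grades))  # 8 (the outlier 4 is excluded)
print(grade_spread(grades))          # 5 (flags this script for discussion)
```

Sorting the benchmarking table by `grade_spread` surfaces the scripts with the most disagreement first, which is what the calibration meeting is for.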
03:55
You can then take this table into a calibration meeting, discuss any misalignments, and go through and approve the final grade for each of the scripts.
04:05
We can then generate a benchmarking graph, where we plot each individual teacher's grades against the final accepted grade agreed upon in the meeting.
04:15
And then we can also generate some calibration advice, where the AI analyzes all of a teacher's judgments and the feedback they gave, tries to identify potential misalignments, and gives them advice for the next scripts they'll be reviewing.
04:29
After this process, we can proceed to blind grading, where during an exam
04:34
we would have been sent, say, a thousand-page PDF. We can then use the cover sheets to identify the student and teacher that each script is linked to.
04:43
However, we put those scripts into teachers' accounts in a de-identified manner: we randomly shuffle the scripts and de-identify them so there's no bias in those teachers' blind grading.
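The shuffle-and-de-identify step can be sketched as below. The field names (`student`, `teacher`, `pages`) and the round-robin assignment are illustrative assumptions, not Adexia's actual data model.

```python
import random

def deidentify_and_assign(scripts, teachers, seed=None):
    """Shuffle scripts, replace identities with anonymous codes, and
    deal them round-robin to teachers for blind grading.

    Returns (assignments, key): `assignments` maps each teacher to a
    list of de-identified scripts; `key` maps anonymous codes back to
    the original scripts and would stay with the administrator only.
    """
    rng = random.Random(seed)
    shuffled = scripts[:]
    rng.shuffle(shuffled)  # random order removes positional bias
    assignments = {t: [] for t in teachers}
    key = {}
    for i, script in enumerate(shuffled):
        code = f"S{i + 1:04d}"
        key[code] = script
        # Graders see only the code and the pages, never the names.
        anon = {"code": code, "pages": script["pages"]}
        assignments[teachers[i % len(teachers)]].append(anon)
    return assignments, key

scripts = [{"student": f"st{i}", "teacher": "T", "pages": [i]} for i in range(5)]
assignments, key = deidentify_and_assign(scripts, ["Ms A", "Mr B"], seed=0)
```

The point of the design is that the grader-facing dicts carry no identifying fields, while the administrator's `key` preserves the link needed to return grades to the right students afterwards.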
04:56
With those scripts they can also view the original handwritten sample.
04:59
However, we will transcribe it for easier review.
05:02
The current handwriting
05:04
transcription accuracy, although not perfect, is substantially better than most human judgments.
05:09
You can then, once again, review Adexia's feedback.
05:12
But you have to enter your own grade and fill in the SAC rubric before you can reveal the AI's judgment.
05:20
That way it's acting as a real time moderator.
05:22
The teacher can, as always, edit or add any feedback that they would like.
05:26
And they can also record voice notes
05:28
("Great work, Jimmy!")
05:29
enabling an additional layer of personalization for your students.
05:33
Finally, once everyone has done their blind grading, we can go into a final moderation step, where each teacher can be assigned a certain number of scripts to cross-check, after which we can reveal another moderation table, showing the initial grader alongside their initial and final judgments.
05:49
After revealing the AI's judgment alongside the moderator's initial and final judgments, we can once again sort by the spread of the judgments and go through and review them in a moderation meeting. We can also once again generate a report, which helps us visualize the teachers' grades versus, in this case, Adexia's grades, and show any moderations visually alongside an AI analysis of any potential moderation that may be required.
06:14
So that is the core platform, and we'll constantly be extending it and trying to provide more value to VCE English teachers.
06:21
If you want to experiment with the platform, please sign up today. We would love to hear your thoughts and feedback.