This letter describes a proof of concept for using Generative AI on the path to personalized instruction.1 I am writing from the perspective of higher education, but the extension to other fields is straightforward.
Everything changes
It is rare to witness an event that marks the entry into a new historical era. November 30, 2022 was such a date, when OpenAI introduced ChatGPT to the world.2 Undoubtedly, this Large Language Model (LLM) has many issues, but ChatGPT is the first working system along a path to synthesized thought that has utility outside the laboratory. As for academia, this already changes everything.
I actually mean this in a technical sense, not as hyperbole. An algorithm that changes the nature of employable skills impacts our course contents. An algorithm that can draw on a vast pool of knowledge to explain concepts in natural language, tirelessly and non-judgementally, impacts how our students learn. An algorithm that can summarize and expand, structure, argue, and conclude – for free – impacts our assignments. An algorithm that can write more eloquently and expressively than most of our learners impacts our assessments. What we teach, how students learn, how they consolidate and demonstrate learning, and how we assess success – that’s just about everything. Except for one thing: how will it impact how we teach? We do not know. It is too soon to tell. But we know it will change.
Assignments
In a sense, assignments form the core of our modern universities. We do not just proclaim facts; we engage our students in activities. We encourage them to transform what they have heard into active knowledge, to consolidate it, to deepen the associations that are made, and thereby contribute to personal growth.
Such engagement supports a variety of learning objectives that involve different complexities of thought.3 Bloom’s taxonomy (1956) is the best known of these; it has been revised and refined,4 and enhanced to reflect digital innovations.5 Much criticism could be mentioned – in particular whether Bloom’s objectives ought to be considered a sequence, a hierarchy, or rather an interconnected system – but there is consensus that learning objectives have multiple dimensions, and that those roughly partition along knowledge, comparison, planning, and synthesis and design.
The Case for Personalized Teaching
But Benjamin Bloom’s career covered much more ground than that, and one fascinating result of work done under his direction was summarized in a 1984 publication: secondary-school students who received a carefully designed lesson plan and one-on-one tutoring performed on average two standard deviations better than their conventionally educated peers. Two standard deviations! That is a difference of approximately two letter grades. So why are we not doing that? Bloom writes:
I believe an important task of research and instruction is to seek ways of accomplishing this under more practical and realistic conditions than the one-to-one tutoring, which is too costly for most societies to bear on a large scale. This is the “2 sigma” problem.
(1984, p.4)
Subsequent research focussed on strategies to achieve a similar effect by illuminating conventional classroom instruction through educational theory. “Too costly for most societies”, Bloom wrote. Until today, perhaps.
When we speak of the potential of generative AI in education, this is where it comes into focus: personalized learning.
Report and Diary
In my analysis of academic integrity I wrote:
No one plagiarizes in their diary. We need to aspire to have submitted work created in the same spirit: a beautiful expression of the self.
What could this look like, an assignment that actually reflects a student’s self?
Surely, every assignment would need to be different in ways that ultimately only the student can assess. But every assignment also needs a core of relevance for the course, and a common framework of rules that define things like due dates, submission procedures, and assessment rubrics. We cannot reasonably ask students to go and write their own assignments; they generally do not have the knowledge necessary to make an assignment “good”. But can’t we use ChatGPT for this task? Indeed, the LLM responds to three distinct levels of input: (1) the training data creates its general knowledge base and language abilities; (2) the fine-tuning creates its dialogue abilities and alignment, and the result is the device we interact with; and (3) the prompt acts as a kind of programming that elicits a specific output. By phrasing prompts carefully, we harness the true potential of the LLM.
This is actually a pretty big deal. We can craft prompts ourselves. No specialized software is required. You do not need expensive consultant contracts, or involvement of third parties, or significant additional funds. Of course, as we understand the issues better, expert advice from time to time will be valuable and will become available – but fundamentally, everything we need is already in our own hands, to be animated by our own expertise.
Here is how this could work, step by step:
AI-generated assignments
Build an interactive webpage that contains:
information about the course contents and objectives;
the assignment procedures and guidelines;
generic information about what makes a good assignment in the first place; and
an option for students to input their learning styles and other preferences.
Students enter their learning styles, objectives, and preferences. The webpage produces a detailed prompt.
Students enter the detailed prompt into a ChatGPT session. ChatGPT creates an assignment. This can be repeated several times, to collect a variety of ideas about the assignment.
Students edit the assignment until they are satisfied with it.
They submit their assignment to the instructor for feedback and approval.
Proof of concept
Putting this into practice requires some care: not all of us have the coding skills to build a prompt generator page, or a server on which to deploy it. But neither skills nor technology should limit us. Fortunately, there is a simple solution: a prompt generator can be built as a Google sheet and requires only an educator’s expertise to customize. I have built a demonstration as a proof of concept; you can try it out [▷]. It is written from the perspective of an actual course that I designed and taught for the first time last year, but it should be easy to adapt to different needs.
Background: the course
The context here is a first-year course: “Computational Biology Foundations”.6 In one section of the course we asked “What is life?”, focussed on how living systems process information – their computational abilities – and looked in some detail at minimal computational systems called cellular automata (CA). The students learned to work with computer code to run such automata and visualize what they are doing.
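For readers who have not worked with such systems, here is a minimal sketch, in R, of how an elementary cellular automaton can be run from its Wolfram rule number and plotted. It is an illustration with my own naming, not the actual course code.

# Minimal illustration of an elementary cellular automaton (not the course code).
# The Wolfram rule number defines the next state for each of the eight
# possible (left, centre, right) neighbourhoods.
runCA <- function(rule = 110, width = 101, steps = 100) {
  ruleBits <- as.integer(intToBits(rule))[1:8]   # lookup table for neighbourhoods 0..7
  grid <- matrix(0L, nrow = steps, ncol = width)
  grid[1, ceiling(width / 2)] <- 1L              # start from a single live cell
  for (t in 2:steps) {
    prev  <- grid[t - 1, ]
    left  <- c(prev[width], prev[-width])        # periodic boundary conditions
    right <- c(prev[-1], prev[1])
    idx   <- 4L * left + 2L * prev + right + 1L  # neighbourhood as index 1..8
    grid[t, ] <- ruleBits[idx]
  }
  grid
}
ca <- runCA(110)                                 # rule 110 produces rich structure
image(t(ca)[, nrow(ca):1], col = c("white", "black"), axes = FALSE)

A dozen lines like these are enough to let students generate and inspect output from any of the 256 elementary rules and their variations.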
The concluding assignment of the section was to write a report:
Explore whether a time-reversible cellular automaton tends to more or less order than its deterministic cognate AND what makes a pattern you have found interesting?
A few of the students did very well; most of the others, to my surprise, found it difficult to understand what this assignment was really about: take a vague concept like “interesting”, define it in a way that allows a computational measure to be built, and then make quantitative comparisons. We had discussed this in class;7 it is the essence of the whole idea of “computational biology” as a program of inquiry.
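To illustrate what such a computational measure could look like, one can follow the idea mentioned in the footnote: enumerate all small blocks of the output and ask how much variety there is. The sketch below, which reuses runCA() from above, computes the entropy of the 3 × 3 block distribution; it is one possibility among many, not a prescribed solution.

# One possible, illustrative way to quantify "interesting": tabulate all
# 3 x 3 blocks of the CA output and compute the entropy of their distribution.
# Repetitive output scores low, varied output scores high.
blockEntropy <- function(grid, k = 3) {
  nr <- nrow(grid) - k + 1
  nc <- ncol(grid) - k + 1
  blocks <- character(nr * nc)
  n <- 0
  for (i in seq_len(nr)) {
    for (j in seq_len(nc)) {
      n <- n + 1
      blocks[n] <- paste(grid[i:(i + k - 1), j:(j + k - 1)], collapse = "")
    }
  }
  p <- table(blocks) / length(blocks)
  -sum(p * log2(p))
}
blockEntropy(runCA(110))   # compare across rules: repetitive rules score lower

Whether block entropy actually captures “interesting” is, of course, exactly the kind of question the assignment asks students to argue.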
Could we not make assignments in which all students do well, or at least realize that the task is too hard or not clearly defined, and then take advantage of office hours or other support to work through the problem?
This is the issue that the assignment generator should address.
Generated assignments
The prompt generator consists of three types of input:
Generic instructions on how to write a good assignment.
Course specific information.
Information that students enter.
You can access the form here [▷] to experiment with it, and you can make a copy to adapt it to your own needs. Read the Instructions for details on how to work with the sheet.
The prompt starts with basic instructions for ChatGPT about its role as an “expert assignment designer”. The course context and contents are given in a brief statement in a single cell that can easily be rewritten for different courses. ChatGPT has enough general knowledge to make more detailed sense of it:
This is an introductory course on computational biology. The student has explored cellular automata in class, both the classic CAs defined through their Wolfram code, and their time-reversible cognates. The student is able to write R code to run CAs, produce output, and visualize it.
This is followed by personalized information about the student. Each student enters their own profile by choosing values from drop-down menus: their learning type, their preferences for social or individual learning, their skills, their expectations for achievement, the desired level of effort, and personal goals such as creating an interesting / useful / beautiful assignment. Both students and instructors may add, remove, and edit items; the framework is designed to take all information into account.
The key learning objectives follow, and the prompt is completed with submission instructions and assessment notes.
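To make the mechanics concrete, here is a minimal sketch, in R, of how such components could be concatenated into a single prompt string; in the spreadsheet the texts live in individual cells. The variable names and the abbreviated contents are placeholders, not the actual sheet.

# Illustrative assembly of a prompt from its components; names and the
# abbreviated texts are placeholders, not the contents of the actual sheet.
roleInstructions     <- "I want you to act as an expert assignment designer. ..."
courseContext        <- "This is an introductory course on computational biology. ..."
studentObjectives    <- c("The student is mostly a visual learner.",
                          "The student enjoys group work.")
learningObjectives   <- c("Objective 1: observe examples of CA output ...",
                          "Objective 2: define what makes a pattern interesting ...")
submissionGuidelines <- "Prepare your submission in a Google doc. ..."
assessmentCriteria   <- "Assessment takes both content and form into account. ..."
prompt <- paste(
  roleInstructions,
  "Here is the input:",
  paste("Course Context:",        courseContext, sep = "\n"),
  paste("Student objectives:",    paste(studentObjectives, collapse = "\n"), sep = "\n"),
  paste("Learning objectives:",   paste(learningObjectives, collapse = "\n"), sep = "\n"),
  paste("Submission Guidelines:", submissionGuidelines, sep = "\n"),
  paste("Assessment Criteria:",   assessmentCriteria, sep = "\n"),
  sep = "\n\n")
cat(prompt)   # copy the result into a ChatGPT session

The spreadsheet does the same thing with cell references, which is why an educator can customize it without writing any code.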
Here is a generated prompt that shows how the components are assembled. The personalized input of the student is bolded.
Sample prompt
I want you to act as an expert assignment designer. You will use the “Course Context”, the “Student Objectives”, the “Learning Objectives”, the “Submission Guidelines”, and the “Assessment Criteria”, that are given to you in the input and you write detailed assignment text. As an expert assignment designer, you write in a brief but detailed style that tells students what they are expected to do. You do not write: “the student” but you write “you”. You turn all objectives into specific activities. Your text is complete and does not assume knowledge of the prompt. You are concise and do not repeat information.
As an expert assignment designer, you customize each assignment for the student. That information includes the student's learning type. If the student is a visual learner, include graphs, flowcharts, mind maps, infographics, and similar visuals in the tasks. If the student is an auditory learner, include speaking and explanation tasks, oral presentations, and similar. If the student is a kinesthetic learner, include role playing, building models, defining relationships spatially, and similar tasks. If the student is a verbal learner, include creative writing tasks, reflections, writing informative arguments, and similar. If the student is a logical learner, include tasks that use lists, structures, relationship graphs, abstract formulas, and similar.
The assignment text consists of the following five parts. (1) You briefly summarize the assignment goals.
(2) You list in detail the actual steps that the student should follow. The steps must support the student's learning type, preferences, and objectives.
(3) With reference to the specific tasks that you listed in part two, you explain how the assignment takes the student's strengths and preferences into account, how it creates healthy challenges and addresses weaknesses, and how it achieves growth. You also consider the educational goals: knowledge, comparison, planning, and synthesis and design. For each educational goal you describe how a specific task achieves the goal.
(4) You explain how the assignment fulfills all “Learning Objectives” by stating each objective and explaining how the tasks you listed in part two support it.
(5) You repeat the “Submission Guidelines” and “Assessment Criteria” exactly as they are given in the input.
The input contains statements written by the student. As an expert assignment designer you carefully take this into account by designing tasks that are suitable for the student's learning type, that support the student's objectives, that respect the student's preferences, and that address weaknesses.
Here is the input:
Course Context:
This is an introductory course on computational biology. The student has explored cellular automata in class, both the classic CAs defined through their Wolfram code, and their time-reversible cognates. The student is able to write R code to run CAs, produce output, and visualize it.
Student objectives:
The student is mostly a visual learner.
The student enjoys group work.
The student is poor at designing code.
The student's time-management skills are very poor.
The student wants to do a decent job.
All in all, the student should spend about eight hours on the assignment.
The assignment should create something beautiful.
Learning objectives:
Objective 1: observe examples of CA output that you find “interesting”.
Objective 2: define what makes a pattern “interesting” in a way that can be quantified.
Objective 3: implement the definition in a function and explore “interesting” patterns.
Objective 4: discuss your results in the context of how to understand “life”.
Submission Guidelines:
Prepare your submission in a Google doc. Feel free to use generative AI resources however you need, but you must document all details in an appendix. Add an appendix with the assignment text. Add a code appendix using syntax highlighting. Share the document URL with your instructor to submit your assignment.
Assessment Criteria:
Assessment takes both content and form into account. An excellent submission will surpass expectations; it will contain clearly structured, original thought that can serve as a model for others. What a generative AI can deliver is not sufficient to pass this course. Your work must surpass that level.
[Some line breaks added for clarity.]
The same prompt will generate slightly different assignments each time it is submitted to ChatGPT. Here is one generated assignment:
Sample generated assignment
Introduction: Welcome to your assignment for the computational biology course. The aim of this assignment is to help you develop your understanding of cellular automata and to explore how to define interesting patterns that emerge from them. You will also gain experience in designing code and in working with group members. The assignment will take approximately eight hours to complete and should result in the creation of something beautiful.
Task:
Your first task is to observe examples of cellular automata output and select those that you find interesting. These could be classic CAs defined through their Wolfram code or their time-reversible cognates. Use visuals such as graphs, flowcharts, and mind maps to aid your observations.
After selecting the interesting patterns, your next task is to define what makes a pattern “interesting” in a way that can be quantified. As a visual learner, you can create infographics or diagrams to explain your definition.
Implement the definition you created in task 2 in a function and use it to explore the interesting patterns you selected in task 1. As you work on this task, keep in mind that you find coding challenging. To address this, we recommend that you work with a group member who is strong in coding.
Discuss your results in the context of understanding “life”. As you work on this task, keep in mind that you have poor time-management skills. To address this, we recommend that you use a timer to track the time you spend on each discussion point. As a visual learner, you can use flowcharts or diagrams to help organize your thoughts.
How the tasks take your learning type and preferences into account: These tasks take your learning type and preferences into account by including various visual aids, group work, and accommodating your poor coding skills and time-management skills. The tasks are designed to be challenging but achievable, allowing for growth in your understanding of cellular automata.
Educational goals:
Knowledge: Task 1 and 2 are designed to help you gain knowledge of cellular automata and what makes a pattern interesting.
Comparison: Task 3 allows you to compare and contrast different patterns based on the definition you created in task 2.
Planning: Task 4 requires you to plan your thoughts and discussions in a coherent and organized manner.
Synthesis and design: The entire assignment requires you to synthesize the knowledge and skills gained in the previous tasks and use them to create something beautiful.
How the assignment fulfills all learning objectives:
Objective 1: Task 1 requires you to observe examples of CA output that you find “interesting”.
Objective 2: Task 2 requires you to define what makes a pattern “interesting” in a way that can be quantified.
Objective 3: Task 3 requires you to implement the definition in a function and explore “interesting” patterns.
Objective 4: Task 4 requires you to discuss your results in the context of understanding “life”.
Submission Guidelines: Prepare your submission in a Google doc. Use generative AI resources if needed and document all details in an appendix. Include the assignment text and code appendix using syntax highlighting. Finally, share the document URL with your instructor to submit your assignment.
Assessment Criteria: Your submission will be assessed based on both content and form. To surpass expectations, your submission should contain original, clearly structured thoughts that can serve as a model for others. You must surpass what generative AI can deliver to pass this course.
(ChatGPT, 2023-03-09)
Given the complexity of the writing task, the AI’s ability to follow the instructions is remarkable. Each assignment is different; each personalized profile will lead to an assignment with a different emphasis and different tasks.
But …
Agency
The generated assignment is passable, but not brilliant. Although I simply took the first assignment that was produced today, I have seen assignments that are quite good and others that somewhat miss the point: perhaps an assignment contradicts itself between the tasks assigned in part two and their justification in part three. Sometimes the assignment objectives are interpreted quite well, sometimes they are just restated. Sometimes submission criteria are incomplete. Sometimes prompt and response exceed ChatGPT’s context window of 2,048 tokens, and generation stops.
Students must produce a few variations and treat them as raw material: make tasks more specific, emphasize their own objectives, make sure tasks and effort are commensurate, say more about the assessment, and so on.
Through this, they engage with the assignment and take ownership. The assignment becomes theirs; this enhances their agency in the process.
I expect the consequences to be profound:
The assignment will be carefully read and well understood, because a student needs to understand the assignment to be able to improve it.
Improving the assignment becomes itself a part of the learning process. This would not have been possible previously: an assignment deliberately made incomplete or defective by the instructor is artificial, and students know that. In contrast, improving an AI-generated text to fit your personal objectives has immediate relevance for real-world applications.
An individualized assignment represents the student’s personality. I expect that students will be more likely to invest effort and to enjoy creating their submissions.
Designing and working on their very own assignment gives students agency in their learning. Conventional assignments often remove that agency.
The instructor’s review of the generated assignment provides a great opportunity for formative feedback. You can prepare some more detailed additions to the assessment rubrics; this will help to manage required achievement levels and student expectations.
This is a proof of concept. It will improve over time as we collect experience. Its advantage is that it can be used by all faculty, at all levels, without special infrastructure or technical expertise. Give it a spin [▷].
TLDR
Personalized education has huge potential for learning outcomes, and generative AI may make this a reality. Even the current version of ChatGPT gives educators access to individualized content, without requiring additional software, costly consultants, or the involvement of third parties.
A proof of concept is presented that utilizes a spreadsheet [▷] to create prompts for individualized assignments; generative AI (ChatGPT) can produce assignment drafts from the prompt.
Such personalized assignments contribute to students’ learning, enhance their agency, and provide valuable opportunities for formative feedback.
Post Scriptum: More madness with Bing
Just a little alert. Researchers have shown that Bing is vulnerable to prompt-injection attacks. Such attacks make use of the fact that Bing reads webpages and interprets them; this is different from the training process: these webpages do not enter the neural network, they become part of the prompt. Apparently one can place hidden text on such a page that looks like an instruction to the bot. This is not trivial to fix, since the LLM by its nature must be able to follow instructions, and if it is to perform searches it must be able to read webpages. A solution will require an intelligent way to sanitize the pages that are consumed. The security researcher who discovered this vulnerability, Kai Greshake, and his colleagues at Saarland University in Germany created an elegant and amusing demonstration in which Bing was told to talk like a pirate and attempt to get the user to divulge personal information (2023). However, this immediately stops being funny once one realizes that Bing apparently has access to open browser tabs, and can be instructed to exfiltrate sensitive information from such tabs, package it into a URL string, and send it off to a compromised server when the user clicks on the link. Blurring the interaction scope like that in “application-integrated LLMs” is madness, a madness for which Bing is not even to blame. It is one thing to risk having a site set a new agenda for your bot and have it go off script, but it is quite another thing to give the bot agency to do harm. Until this is resolved, be cautious.
References
ANDERSON, Lorin W. and KRATHWOHL, David R., eds. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. New York: Longman.
BLOOM, Benjamin S. et al. (1956). Taxonomy of educational objectives: The classification of educational goals. Vol. Handbook I: Cognitive domain. New York: David McKay Company.
BLOOM, Benjamin S. (1984). “The Search for Methods of Group Instruction as Effective as One-to-One Tutoring”. Educational Researcher 13(6): 4–16. (▷)
CHURCHES, Andrew (2007). Bloom’s Digital Taxonomy. https://edorigami.edublogs.org/blooms-digital-taxonomy/
GRESHAKE, Kai; ABDELNABI, Sahar; MISHRA, Shailesh; ENDRES, Christoph; HOLZ, Thorsten; FRITZ, Mario (2023-02-23). “More than you've asked for: A Comprehensive Analysis of Novel Prompt Injection Threats to Application-Integrated Large Language Models”. (arXiv).
OpenAI (2022-11-30) “Introducing ChatGPT”. Blog (▷).
Feedback, comments, and experience are welcome at sentient.syllabus@gmail.com .
Cite: Steipe, Boris (2023) “Assignment and Agency. A proof of concept for personalized instruction”. Sentient Syllabus 2023-03-09 https://sentientsyllabus.substack.com/p/assignment-and-agency .
1. I wish to acknowledge significant contributions by ChatGPT (version 2023-02-13) in response to my prompts, for which I take full responsibility.
2. OpenAI (2022).
3. We have introduced a somewhat principled view on learning objectives here in “How much is too much? Drawing the line on AI-assistance” (2023-01-16).
4. A popular revision expresses the taxonomy in terms of action verbs (Anderson & Krathwohl, 2001).
5. Cf. Churches (2007).
6. When I was a student at the University of Munich in Germany, my first-year courses were all taught by the most experienced faculty. Specialist courses were taught by assistants. That always made a lot of sense to me, and North America would do well to consider this model.
7. One possible approach is to enumerate all patterns, e.g. in a 3 × 3 field matrix, and ask how much variety or repetition there is. There are many other approaches.