Decisions made by developers affect every aspect of our society, including education. The implications for our present and our future are enormous.
As student researchers living through this transformative moment, we have critical insights for developers creating educational AI tools. Our research reveals that students turn to AI when overwhelmed by workload and limited resources, often using it to fill gaps where human support is unavailable. Current AI detection tools are creating distrust between students and faculty, with false accusations damaging educational relationships and learning environments. Students want AI tools that help them understand complex concepts and encourage critical thinking, not replace independent thought. Importantly, developers must consider issues of access and equity, as not all students have equal access to AI tools, and existing systems often show bias. We're seeking AI tools that enhance learning rather than replacing it, and that don't replace our future jobs. We're seeking tools with features that break down difficult material while maintaining academic integrity. These insights come directly from us as students navigating AI in our daily lives, and we invite developers to consider these perspectives when creating AI tools.
From "The Impact of AI on Creative Writing":
"Well, I feel like they should just know what their software is being used for....they should add features that flag potential misuse or even provide educational prompts about academic integrity....I think that could help users really be aware of how their AI use is going to impact their education... maybe a disclaimer about the limits of what AI can do to sort of remind students that their own thoughts and their own learning process is valuable and that they can also create important ideas and work on their own."
From "Time Well Spent?":
"Any AI builder should be aware that it's also a tool used by students. So maybe there should be a feature that they can implement in AI to not allow copy-paste, or something to mark it, so that teachers won't have to flag it as plagiarism but the AI itself prevents that from happening."
From "The AI Crutch":
"When it comes to the people who actually make AI and improve it, I would say one of the improvements that they should add is when someone is looking for a solution or answer to any topic, chemistry, biology, math, whatever it is, in order for you to see the answer, you have to actually go through it step by step and understand it before you're just given the answer."
From "Digital Divide":
"There is a factor of social responsibility. Basically there is an impact of automation of jobs that is being addressed with AI, right - there is this fear that AI is going to replace a lot of jobs. So whoever is creating this software to automate part of these manual processes, they need to come up also with ways to train individuals with upskilling initiatives so they don't find themselves out of a job or really harboring resentment towards this technology that basically benefits everybody."
"Those who create and use these platforms for fun and recreational purposes, I'd say be cautious, because while it's not directly your responsibility if your platform is being used for different purposes, it becomes an issue when you don't consider the wider impact that it could have. In this case, the way AI is used could either help, harm, or undermine an entire generation of students pursuing their dream field or still navigating the education system. It's important to think about potential consequences and the roles these platforms play in shaping students' experiences and the future."
From "Dialogue with AI":
"Just knowing the risk factors that come with making a site that's so widely known and could be used to create more harm than good. I think educating yourself on those would be important, especially when making an app that could hurt a lot of people."
"I feel like they also need to be held accountable for the ethics that they are lacking when they're creating these tools because, as we said earlier, there's students out here that are not using AI but are getting in trouble for doing so. And I feel like that really goes towards the people that are creating these platforms. You need to educate the teachers that are using it. You need to educate the students that are using it, and you also need to put out disclaimers. Say a teacher would use an AI tool to track if a student is using ChatGPT or whatever. There needs to be a disclaimer about the fact that this might not be AI. We're just testing for similarity. You know, people need to be more educated. And who else can educate the most than those that are creating it?"
Copyright © 2024 AI Archives: Student Researchers Document the Transformation - All Rights Reserved.
Take good care of you. Take good care of each other.