When college administrator Lance Eaton created a working spreadsheet about the generative AI policies adopted by universities last spring, it was mostly filled with entries about how to ban tools like ChatGPT.
@readbeanicecream Don’t forget, ChatGPT is not open source. My biggest fear is how the developers and owners of the proprietary ChatGPT (and all tools using its API) will control the “market”.
I mean, we were all taught to use Macs in the classroom. That didn’t stop my parents from having a Windows machine, or me from preferring Windows my entire life (except for maybe the first 2 years of computer instruction at school or so). Sure, it’s not OSS, but the skills are vastly transferable and will serve you for the next couple of decades, until we have a new paradigm shift.
@RheingoldRiver Warning: a somewhat longer reply.
It’s not about the skills, or whether you like a system or not. It’s about being invested in an ecosystem. You only used Macs in the classroom to learn basic computing. But if you own multiple Apple devices, like a phone and a Mac, which work well together and not so well with other systems, and you have lots of personal files and configuration tied to specific programs on your home system, then you are less likely to switch. Same with Windows: documents, photos, and other files are often tied to programs that are only available on Windows, or work best there. You might only know Windows programs, and after years of accumulating personal files and experience, many people are not willing to learn new concepts and restart their organization.
But that’s just an example. The actual thing I’m talking about is ChatGPT and its API. Many cool programs and websites are built to operate on ChatGPT and what it offers through its programming interface. Developers get invested in this ecosystem, and therefore so do their users. It’s not as simple as replacing a URL; they have to “reprogram” the backend to use an alternative system, which might not even offer all the features ChatGPT exposes through its API. Yes, a few people can switch easily, for simple programs. But most would hesitate unless it is horrendously broken.
Look at how people don’t even switch from Chrome to any other browser, even though Google is clearly collecting data and wants to block specific extensions. Most don’t care. And once it’s too late, it’s too late to get everyone to change. I don’t want to paint the end of the world, no, because I do not rely on it. I’m just trying to explain my concerns.
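To make the “it’s not just replacing a URL” point concrete, here is a minimal sketch, with entirely made-up client classes (neither is a real SDK), showing how app code written against one provider’s request/response shape has to be rewritten, not just repointed, to use an alternative with a different interface:

```python
class ChatGPTStyleClient:
    # Hypothetical client mimicking a ChatGPT-style interface:
    # takes a list of role/content messages, returns a nested dict.
    def complete(self, messages):
        last = messages[-1]["content"]
        return {"choices": [{"message": {"content": "reply to: " + last}}]}

class AltProviderClient:
    # Hypothetical alternative provider: plain string in, plain string out.
    def generate(self, prompt):
        return "reply to: " + prompt

def ask_chatgpt_style(client, question):
    # App code coupled to the ChatGPT-style shape: it builds a messages
    # list and digs the answer out of the nested response structure.
    resp = client.complete([{"role": "user", "content": question}])
    return resp["choices"][0]["message"]["content"]

def ask_alt_provider(client, question):
    # Porting means rewriting every call site like this: different method
    # name, different arguments, different result shape.
    return client.generate(question)

print(ask_chatgpt_style(ChatGPTStyleClient(), "hi"))  # reply to: hi
print(ask_alt_provider(AltProviderClient(), "hi"))    # reply to: hi
```

The coupling lives in functions like `ask_chatgpt_style`: every place that constructs the messages list or unpacks `choices` has to change when the backend does, which is exactly the switching cost being described.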
I get where you’re coming from, but I don’t think you’re quite understanding the role of school? In school they teach you one programming language so you can learn many. You might be most comfortable with Java, but you’ll quickly see why you need to switch. And sure, we were taught how to use a bunch of Mac-only programs, but more than that we were taught how the basic UX of programs works, how to use the internet, how to search with keywords; I think a bit later on, once Google existed, we were taught how to use its advanced search. All of this is very transferable knowledge, even if you feel most at home somewhere.
Also keep in mind that most kids are not blindly doing what school says outside of school and building heavily into the ecosystem they were told to use. Certainly not the ones who are using APIs.
Anyway, I guess my point is that it’s much better to learn something than nothing, and foundational knowledge is the point. In that case it doesn’t matter so much what you learn, and you’re best off deep-diving into something kids already have a bit of familiarity with, so that they can build on existing knowledge.
Edit: Although, I would agree with you if this is like, an advanced LLM programming class. In that case for sure an OSS version should be preferred. But this sounds like basic LLM literacy?
I get where you’re coming from, but I don’t think that you’re quite understanding the role of school? In school they teach you one programming language so you can learn many.
I understand the role, and that is exactly what I said. You are not personally invested in the ecosystem just because you were taught the fundamentals of how it works. Let me give another example. If you buy a lot of PlayStation games on the PlayStation Store, then you are invested in that system. That is not just learning how gaming and consoles work; you have spent a lot of your own money in that ecosystem. Then you are less likely to switch to Xbox.
It’s just an example. When you learn a programming language in school, you learn how programming works in general and can then pick up whatever language you want. However, if you are invested in one, say Python, and have lots of personal programs written, libraries you depend on, friends and communities in that ecosystem, and probably lots of tooling built around it, then you don’t just flip a switch and forget all about it. That is the personal investment I’m talking about.
But you know, we’re losing track of the original question, because this is not just about learning to use an AI tool. That is why the school discussion doesn’t work as an example. For example, big websites like Stack Overflow and many others integrate ChatGPT into their ecosystems. My point is not that it’s impossible to switch to a different system; the point is that if the world becomes controlled by it and it becomes the standard, then most likely it won’t just be replaced by everyone. Even code generation with ChatGPT is integrated into programmers’ IDEs.
And then the ChatGPT developers have a lot of influence across many high-profile applications and many, many smaller ones. Even if everyone could change, that doesn’t mean it will happen at scale. Look at Windows vs. Linux, for example: just because most people could switch doesn’t mean they will. Same for developers. Most people don’t care and just use what’s popular.
What is there to teach? It’s conversational. If you can write coherently, you can use GPTs. Someone in the English department should leverage “AI” hype to get more funding.
Expectations and limitations. IMO it is important to teach people what it actually does, that it can generate false statements, and which sorts of questions are more likely to lead to false statements. It is a tool, and like any tool there are incorrect ways to use it, even if it seems simple at the outset.
Oh shit, I remember taking a short course on research in college. The necessity of vetting your sources was a big topic, with an entire section on Wikipedia and social media. AI is gonna be the biggest revision to that course, and to academia in general, yet.