ChatGPT encapsulates norms. For example, if you ask it how to build a bomb, it replies that if you have a problem, you should consult a healthcare professional... But who defines these norms? From a legal perspective, this is interesting. In my view, ChatGPT is characterized above all by its inability to transgress these norms. In law, the concept of morality has been defined by the same article of the Civil Code since 1830, except that what was once contrary to morality was cohabitation, whereas now, to caricature, it is selling your eggs on the internet... Society changes, but ChatGPT is incapable of interpreting and evolving norms. It is also incapable of accounting for cases of force majeure. In law, for example, speeding is illegal unless it is done to rescue someone in danger. ChatGPT is utterly rigid psychologically and has no grasp of exceptions or empathy. In this sense, it would be a magnificent plea for the role of humans: because ChatGPT is incapable of the worst, it is also incapable of the best.
Elise Degrave, Faculty of Law, NaDI Institute