A joint declaration by students and faculty members of the Faculty of Social Sciences

Whether it's search queries or text editing, artificial intelligence (AI) has become an integral part of everyday life at our university. “AI is the ability of a machine to display human-like capabilities such as reasoning, learning, planning and creativity” (European Parliament, 2023). AI tools are dedicated applications that integrate AI functionality and make it usable in practice. They offer potential for scientific work and have long been used in a variety of ways in higher education. Students and lecturers are equally learners in this context, as technical developments are so rapid that didactic and application-related requirements can hardly keep pace.

The writing-intensive social science subjects must teach a critical approach to text-capturing and text-generating AI tools in their courses and practice it in research in order to harness these tools as positive resources. After all, the use of AI tools generally affects the quality of written work. The general quality criteria for good academic work and the regulations on plagiarism remain central to the assessment of academic performance. Examination regulations, however, are not yet geared toward the use of AI tools. Furthermore, the ethical and political implications of an AI-driven society are of particular interest to the social sciences.

An inter-status working group at the Faculty of Social Sciences has developed a joint declaration on standards for the use of AI tools in studies and teaching at the Faculty of Social Sciences at Georg August University of Göttingen.1 Voluntary commitments on four key points have been formulated to support students and lecturers alike in the reflective use of AI tools:

  1. Create transparency for all parties involved
  2. Promote writing skills in a targeted manner
  3. Observe quality criteria for scientific work
  4. Clarify examination-related questions



1. Create transparency for all parties involved

The University of Göttingen is in favor of using AI tools. The Faculty of Social Sciences also emphasizes their creative and critical use in the following statement. Like any ‘new’ technology, AI is not an end in itself. Accordingly, in teaching and learning settings, it can be sensible to use the potential of AI tools, to refrain from using them, or to introduce them later in the work process for didactic reasons.


Example: At the end of a seminar, students should be able to comment on an argument. This includes researching and formulating pros and cons, as well as writing a well-reasoned position statement that includes their own opinion. This cognitively demanding skill can only be trained effectively if the individual steps are first worked out independently. AI cannot replace the formulation of one's own opinion. After the statement has been written, AI can be used as a virtual sparring partner to check the consistency of the argumentation, optimize the rhetoric, and act as a source of ideas for aspects that have not yet been considered.



Example: At the end of a seminar, students should be able to search for information on a specific topic in various text types (e.g., empirical studies, practical journals, anthologies, handbooks, monographs). This targeted focus on the essentials becomes automatic through repeated application during the course of study and saves a lot of time in the long term, for example when preparing a research status report. AI can be consulted for additional information on specific questions (e.g., concrete statistical parameters).


The decision to use or not to use AI tools should therefore be clearly communicated as a framework condition in advance of work assignments, particularly written assignments, and justified from a didactic perspective (e.g., in seminar plans, worksheets, handouts).

Due to rapid technological development, it is neither possible nor necessary for lecturers to maintain a substantial knowledge advantage in the practical use of AI tools. Their strengths lie instead in the critical examination of media. They also bring valuable skills such as “traditional head and hand work” to the table – for example, in scientific research, structured text editing, and the analytical classification of information. Students often already have extensive experience with AI tools in private or university contexts. Their strengths lie in a practical, experimental approach that extends beyond individual courses.

What we agree on: We, the lecturers, strive to be open-minded about the use of AI and to justify its use in teaching in a transparent and comprehensible manner. We, the students, are encouraged to respect the framework conditions and to contribute our knowledge of AI and its possible applications productively as impetus for further development in teaching.



2. Promote writing skills in a targeted manner

One of the key competencies taught at the Faculty of Social Sciences is writing. We use a writing phase model that conceives of academic writing as researching knowledge, processing and structuring it, and composing one's own texts. AI tools can be used in different ways in the individual writing phases. The Virtual Competence Center: Artificial Intelligence and Academic Work offers an overview of external AI tools. The university's Digital Teaching and Learning service brings together the university's own AI tools.

[Figure: Writing phase model with AI tools (English explanation)]

When writing social science texts, writing itself has an “epistemic (‘knowledge-creating’) function” (Struger 2017, p. 107, our translation). This means that writing serves the process of acquiring and producing knowledge: thoughts are not merely represented through writing, but are first formed, deepened, and expanded through it. In this epistemic-heuristic function, writing is an integral part of (social) scientific thinking, working, and knowledge creation. Independent text production is therefore particularly crucial for one's own cognitive progress. It is accordingly advisable to limit the use of text-generating AI tools to specific tasks and purposes, such as generating introductions or designing transitions.

The “mental and manual work” of scientifically sound research – reading and excerpting texts, and citing and writing scientific texts in accordance with scientific quality criteria – must be the initial focus. Only by understanding and learning these writing skills can a critical approach to AI-generated text summaries or fragments be developed in a further step of competence acquisition. Understanding and scholarly penetration of a topic must be evident as an individual achievement in the text produced. The extent to which AI tools help in this process ultimately remains an individual decision based on different teaching and learning skills. There is no obligation to use AI tools.

What we agree on: We, the lecturers, define the content and formal requirements for written work and contribute our experience to point out recurring ‘stumbling blocks’ in the writing process. We welcome questions about content and concept, whether or not they relate to AI tools. We, the students, first plan the writing process carefully and use AI tools in certain phases of writing and for specific purposes, provided that we believe they offer useful support. We discuss any questions or uncertainties with our lecturers.



3. Observe quality criteria for scientific work

When creating scientific work with and without AI tools, the same quality criteria apply as those set out in the Guidelines for Good Academic Practice of the University. The content and methodological quality of an academic paper is measured primarily by a relevant and precise research question that is answered using a suitable methodological approach and whose results are situated within the theoretical background. The formal quality of an academic paper must be measured against the requirements of good text production, argumentation, and careful source documentation.

Study program or course-specific requirements at the Faculty of Social Sciences supplement or specify these general assessment criteria. The Student Office of the Faculty of Social Sciences provides an overview of the guidelines, assessment criteria, and materials for the subjects.

Information on the use of AI tools that may influence both the content and formal quality of work is a useful addition to the mandatory declaration of independence. The university's Declaration on the use of ChatGPT and comparable tools should be used to document the use of AI tools and adapted to the objectives of the respective course. The writing phase model provides a suitable framework for specifying in the declaration, in a comprehensible manner, which AI tools were used and for what purpose. We advise against specifying a percentage of use, as this is neither calculable nor meaningful.


Appendix: Declaration on the use of ChatGPT and similar tools in exams

In this paper, I have used ChatGPT or other AI tools as follows:

[ ] Not at all

[x] during the brainstorming phase: writing phase ‘Orientation’ (e.g., use of Consensus and ScholarGPT to obtain an initial assessment of the state of research on my research question and, if necessary, to revise the research question)

[ ] during the creation of the outline

[x] for developing software source code (e.g., Python, Excel, MPlus): writing phase ‘cross-phase’. Use of ChatGPT to create a formula that sums displayed cells in the Excel data set, but not hidden cells.

[ ] for optimizing or restructuring software source code

[x] for proofreading or optimization: writing phase ‘Revise drafts’. Use of ChatAI on the introduction, with the instruction to adapt the diction so that the introduction makes the reader want to continue reading without becoming unscientific.

[x] Other, namely: writing phase ‘Reading and structuring’. Use of ChatPDF to check longer empirical works in particular for relevance to the research question and to explain complex methodological designs. Use of AI Prompt Generator to specify the tasks for ScholarGPT based on the preliminary state of research.
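The Excel item above can be illustrated with a minimal, hypothetical sketch of the underlying logic (in Excel itself, SUBTOTAL with function number 109 sums a range while ignoring hidden rows; here, spreadsheet rows are simply modeled as value/hidden pairs):

```python
def sum_visible(rows):
    """Sum the values of rows that are not hidden.

    `rows` is a list of (value, hidden) pairs standing in for
    spreadsheet rows; rows flagged as hidden are skipped.
    """
    return sum(value for value, hidden in rows if not hidden)

# Example: the middle row is hidden and is therefore excluded.
data = [(10, False), (20, True), (5, False)]
print(sum_visible(data))  # -> 15
```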


With its consultation hours for academic work, the university offers a central information platform with a wide range of support services to systematically develop academic skills.

What we agree on: We, the lecturers, refer to the university-wide principles of good academic practice. Furthermore, we set out the assessment criteria in general and expectations regarding AI tools in particular at an early stage in the courses for which we are responsible. We strongly recommend the use of citation software. We, the students, take responsibility for the texts we produce and clearly indicate where and for what purpose we have used AI tools in the writing process. We also strive to independently expand our writing and research skills beyond the courses we attend.



4. Clarify examination-related questions

The use of AI tools is not objectionable under examination law, as long as the General Examination Regulations (APO) do not prohibit their use. However, even when AI tools are permitted, if hallucinated sources are cited or paraphrased plagiarism is taken from AI-generated texts, the existing rules on plagiarism and attempted deception in accordance with §18 (5) APO continue to apply, and such violations are sanctioned accordingly under examination law. Appropriate plagiarism detection software may be used in accordance with §15 (3) APO. The Guidelines for dealing with plagiarism for students and for teachers raise awareness of the various forms of plagiarism and measures to prevent it.

If the use of AI tools for examinations is explicitly excluded or specified in the examination regulations (module directory, examination requirements, etc.), failure to comply with these requirements will be considered an attempt at deception. As with plagiarism, the burden of proof lies with the examiners.

Apart from examination-related issues, the quality criteria of academic writing and subject- or course-specific requirements at the Faculty of Social Sciences are decisive for grading. The extent to which dedicated AI verification tools can be used and what consequences this will have for examination regulations in the future remains a question to be clarified at the university level (APO).

What we agree on: We, the lecturers, raise awareness of the different types of plagiarism and point out the possibility of plagiarism checks. In addition, we encourage our students to critically read cited texts against the original sources and to orient their own academic writing toward existing standards and conventions as established in the specialist literature and by other authors. We, the students, recognize our rights and obligations in accordance with the General Examination Regulations (APO) and check the texts we write for any unintentional plagiarism.



The didactic and examination-related questions surrounding the use of AI tools in writing-intensive social science subjects are part of a larger social context. AI raises new political, ethical and ecological questions.

With the AI Act, the European Union requires providers and developers of AI tools who wish to market their applications within the EU to comply with binding standards in all member states. Areas of application such as emotion recognition or social scoring, which certain AI tools would be technically capable of, are prohibited from the outset. However, such political regulation does not release end users from their individual duty to inform themselves about how rights-preserving yet practicable solutions can be found within a data-protection-compliant framework (DSK 2024), especially in critical infrastructures such as universities (cf. ChatAI of the GWDG or Netzwerk KI und digitale Autonomie in Wissenschaft und Bildung).

The results produced by AI tools depend not only on user input, but also to a large extent on the data used to train the large language models (LLMs) working in the background. This carries the risk of unintentionally reproducing structures of inequality if one assumes that AI tools are objective or neutral (Dengg 2023; Gross 2023). This bias can be countered with open-source applications that disclose source code and algorithms. It is therefore important to question which AI tools and their products are accessible to all users in the sense of the FAIR principles and which ones, as commercial offerings, only benefit a specific group of students, teachers or researchers.

On the environmental side, the direct impacts of resource-intensive data centers include water demand (Li et al. 2025) and energy consumption (Burian & Stalla-Bourdillon 2025). Indirect effects of AI appear, for example, in increased consumption induced by applications such as autonomous driving (Hintemann 2025).

These selected challenges have consequences for how individuals use AI tools in their working life in the social sciences. Despite all openness to technical progress, a critical perspective is essential.


1 The statement was presented to the Study Commission on 23 April 2025 and approved by the Faculty Council on 14 May 2025.