Research interests

An overview of the fields currently relevant to the Didactics group at the Department of Informatics.

Educational Experiments and Interventions

Related fields: psychology, statistics

Learning is a mysterious beast. It is influenced by many social processes whose effects we are often unable to estimate until we observe students’ grades or dropout rates. As educational researchers, it is important for us to understand the causes and consequences of these observations. For both externally imposed structural shifts and our own pedagogical interventions, we want to be able to predict the effect they may have on the learning experience in informatics courses.

Experiments are the most reliable method for assessing the effects of such changes. An experimental approach is common to many fields: from validating hypotheses in natural and social science research, to supporting decision-making in industry. As one example from our group, we have run educational experiments grounded in social psychology, manipulating the story setting of INF100 exam questions to assess the potential impact of stereotypes on gender gaps in grades (see the figure below).
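To make the experimental logic concrete, here is a minimal sketch of how the effect of a treatment (such as an alternative exam version) on grades might be assessed with a permutation test. The grade values and group sizes are purely illustrative, not data from our studies.

```python
import random
import statistics

def permutation_test(group_a, group_b, n_permutations=10_000, seed=0):
    """Estimate a two-sided p-value for the observed difference in mean
    grades between two groups by repeatedly shuffling the group labels."""
    rng = random.Random(seed)
    observed = statistics.mean(group_a) - statistics.mean(group_b)
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    extreme = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)
        diff = statistics.mean(pooled[:n_a]) - statistics.mean(pooled[n_a:])
        if abs(diff) >= abs(observed):
            extreme += 1
    return observed, extreme / n_permutations

# Hypothetical grades (0-5 scale) for two exam versions:
version_a = [4.0, 3.5, 5.0, 4.5]
version_b = [3.0, 2.5, 4.0, 3.5]
diff, p_value = permutation_test(version_a, version_b)
```

The shuffle step simulates the null hypothesis that the exam version makes no difference; a real analysis would also need careful randomization and a much larger sample.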

Many interesting questions can be wrapped in an experimental setup. Does the physical environment of the classroom affect students’ attitudes? Does gamifying course assignments support engagement?

Do you have an interesting idea about the learning experience in informatics? Test it with an experiment!

Graph showing different exam results based on gender and "masculine" vs "feminine" exams
<em>Grade gaps across different versions of exams. Here, the story topics of the problems were the experimental treatment, and the grades the outcome (source: as-yet unpublished paper).</em> Photo: Didactics group, UiB

Mastery Learning

Related fields: visualization, machine learning

Mastery learning is an educational philosophy about progressive advancement in the learning process. It maintains that students must achieve a level of mastery in prerequisite knowledge before they move forward to learn subsequent information. Think of it as a ladder, as in this example about learning itself: in order to develop the complex skills higher up in the pyramid, we must first know how to define, memorize, classify, and compare.

A visual representation of Bloom's revised taxonomy, with indications of possible classroom activities associated with each level.
Photo: Tidema, Wikimedia Commons

A consistent representation of prerequisite knowledge allows us to understand how students progress through the curriculum, and which concepts are fundamental to their comprehension of the subject. For example, we can apply causal discovery methods to map students’ CS1 concept mastery (figure below). How can this information support instructors in learning and assessment design, such as adaptive learning approaches? Can these models be useful for students, and if yes, how can they best be presented?
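The mastery-learning idea can be sketched in a few lines of code: given a prerequisite graph, compute which concepts a student is ready to attempt next. The graph fragment below is a hypothetical illustration, not the actual INF100 prerequisite network.

```python
def ready_concepts(prerequisites, mastered):
    """Return the concepts whose prerequisites are all mastered but which
    are not yet mastered themselves - the next rungs of the ladder."""
    return sorted(
        concept
        for concept, prereqs in prerequisites.items()
        if concept not in mastered and prereqs <= mastered
    )

# Hypothetical fragment of a CS1 prerequisite graph (illustrative only):
CS1_PREREQS = {
    "variables": set(),
    "conditionals": {"variables"},
    "loops": {"variables", "conditionals"},
    "functions": {"variables"},
    "recursion": {"functions", "conditionals"},
}

# A student who has mastered only "variables" is ready for
# "conditionals" and "functions", but not yet "loops" or "recursion".
next_steps = ready_concepts(CS1_PREREQS, {"variables"})
```

An adaptive learning system built on such a graph would recommend exactly these "ready" concepts, which is one way the prerequisite networks discussed above could support instructors and students.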

A prototype of a network of curriculum prerequisites in INF100
<em>A prototype of a network of curriculum prerequisites in INF100</em>. Photo: Didactics group, UiB

Interviews and Observation Studies

Related fields: sociology, psychology

Interviews and observations are key to helping educators investigate their teaching. Each year, a few hundred students enrol in the informatics bachelor programs, and all of them have different attitudes, backgrounds, and aspirations in informatics. To adapt course design to such a heterogeneous group of students, we must continually conduct observational studies. How much do grades in INF100 depend on what students already know before university? Does the course itself even make a difference? What do students think about their learning experience in the course?

One specific example is a think-aloud interview study that we conducted for the prior-knowledge PIKA test. The purpose of the interviews was to investigate students’ response processes. Does the task given to a student elicit the cognitive process it is meant to assess? Does it really measure what they know about programming, or could the correct answer be reached through other cues, heuristics, misconceptions, or guessing?

Tool development

(also in collaboration with other institutes)

Related fields: software development, UX/UI design

If you’re more interested in software or tool development, we can offer projects along those lines, too, either based on educational themes or in collaboration with other institutes across UiB. Some project suggestions are listed here, but you are of course welcome to discuss your own software ideas with us.

Dissemination of PIKA Results

To explore how much programming students know before starting higher education, the Prior Informatics Knowledge Assessment (external link) (PIKA) is administered in the very first week of university. The test assesses how well students know the fundamental programming concepts that are typically taught in an introductory programming course.

The students’ test responses are intended to inform course instructors of what their students know, what they don’t know, and which programming misconceptions they hold. We are interested in developing a web tool to disseminate these results. Doing so requires an effective way of summarizing quantitative and qualitative data. This project includes exploring visualization methods, statistical analyses, and web development to present students’ programming answers.

Exam Analyzer

Every semester hundreds of exams are created to check whether students have achieved the learning outcomes of their respective courses. However, exam tasks can lack validity and in some cases fail to measure what they intend to measure. To support more rigorous development and analyses of exam validity, we aim to develop an exam analyzer, a tool that performs item analyses, exploring how well each task supports fair assessment across a broad range of student abilities. The analyzer takes a given exam’s tasks and grades as input and presents task analyses to the course’s instructor, helping them gather information about their exam.

Outreach and science communication

Demonstrating the uses of informatics to create enthusiasm in the next generation is also a big part of what we do in our group. We have created resources for outreach used at Forskningsdagene (external link) and other venues, where we for example demonstrate projects that students at the Department of Informatics have created, how computers work and how the field of informatics has an impact on the society we live in. Our contribution to Forskningsdagene was featured in a Bergens Tidende article in 2025 (external link).

A current project in our group is an LLM explainer website, https://llms.no/ (external link), where the user can build a simple language model step by step, learning about features of LLMs in the process. Effective learning needs good teaching materials and helpful activities. Questions such as “Can we find new and interesting ways of explaining a concept?” and “How do we know that an activity is effective?” are possible starting points for a master’s project.

Last updated: 11.03.2026