

2019


Spatial Continuity Effect vs. Spatial Contiguity Failure. Revising the Effects of Spatial Proximity Between Related and Unrelated Representations

Beege, M., Wirzberger, M., Nebel, S., Schneider, S., Schmidt, N., Rey, G. D.

Frontiers in Education, 4:86, 2019 (article)

Abstract
The split-attention effect refers to learning with related representations in multimedia. Spatial proximity and integration of these representations are crucial for learning processes. The influence of varying amounts of proximity between related and unrelated information has not yet been specified. In two experiments (N1 = 98; N2 = 85), spatial proximity between a pictorial presentation and text labels was manipulated (high vs. medium vs. low). Additionally, in experiment 1, a control group with separated picture and text presentation was implemented. The results revealed a significant effect of spatial proximity on learning performance. In contrast to previous studies, the medium condition led to the highest transfer and, in experiment 2, the highest retention score. These results are interpreted considering cognitive load and instructional efficiency. Findings indicate that transfer efficiency is optimal at a medium distance between representations in experiment 1. Implications regarding the spatial contiguity principle and the spatial contiguity failure are discussed.


link (url) DOI [BibTex]


Load-inducing factors in instructional design: Process-related advances in theory and assessment

Wirzberger, M.

TU Chemnitz, 2019 (phdthesis)

Abstract
This thesis addresses ongoing controversies in cognitive load research related to the scope and interplay of resource-demanding factors in instructional situations from a temporal perspective. In a novel approach, it applies experimental task frameworks from basic cognitive research and combines different methods for assessing cognitive load and underlying cognitive processes. Taken together, the obtained evidence emphasizes a process-related reconceptualization of the existing theoretical cognitive load framework and underlines the importance of a multimethod approach to continuous cognitive load assessment. On a practical side, it informs the development of adaptive algorithms and the learner-aligned design of instructional support and thus leverages a pathway towards intelligent educational assistants.


link (url) [BibTex]


Doing more with less: Meta-reasoning and meta-learning in humans and machines

Griffiths, T., Callaway, F., Chang, M., Grant, E., Krueger, P. M., Lieder, F.

Current Opinion in Behavioral Sciences, 2019 (article)


DOI [BibTex]



Cognitive Prostheses for Goal Achievement

Lieder, F., Chen, O. X., Krueger, P. M., Griffiths, T.

Nature Human Behaviour, 2019 (article)


DOI [BibTex]



Effects of system response delays on elderly humans’ cognitive performance in a virtual training scenario

Wirzberger, M., Schmidt, R., Georgi, M., Hardt, W., Brunnett, G., Rey, G. D.

Scientific Reports, 9:8291, 2019 (article)

Abstract
Observed influences of system response delay in spoken human-machine dialogues are rather ambiguous and mainly focus on perceived system quality. Studies that systematically inspect effects on cognitive performance are still lacking, and effects of individual characteristics are also often neglected. Building on benefits of cognitive training for decelerating cognitive decline, this Wizard-of-Oz study addresses both issues by testing 62 elderly participants in a dialogue-based memory training with a virtual agent. Participants acquired the method of loci with fading instructional guidance and applied it afterward to memorizing and recalling lists of German nouns. System response delays were randomly assigned, and training performance was included as potential mediator. Participants’ age, gender, and subscales of affinity for technology (enthusiasm, competence, positive and negative perception of technology) were inspected as potential moderators. The results indicated positive effects on recall performance with higher training performance, female gender, and less negative perception of technology. Additionally, memory retention and facets of affinity for technology moderated increasing system response delays. Participants also provided higher ratings in perceived system quality with higher enthusiasm for technology but reported increasing frustration with a more positive perception of technology. Potential explanations and implications for the design of spoken dialogue systems are discussed.


link (url) DOI [BibTex]



A meta-analysis of the segmenting effect

Rey, G. D., Beege, M., Nebel, S., Wirzberger, M., Schmitt, T., Schneider, S.

Educational Psychology Review, 2019 (article)

Abstract
The segmenting effect states that people learn better when multimedia instructions are presented in (meaningful and coherent) learner-paced segments, rather than as continuous units. This meta-analysis contains 56 investigations including 88 pairwise comparisons and reveals a significant segmenting effect with small to medium effects for retention and transfer performance. Segmentation also reduces the overall cognitive load and increases learning time. These four effects are confirmed for a system-paced segmentation. The meta-analysis tests different explanations for the segmenting effect that concern facilitating chunking and structuring due to segmenting the multimedia instruction by the instructional designer, providing more time for processing the instruction and allowing the learners to adapt the presentation pace to their individual needs. Moderation analyses indicate that learners with high prior knowledge benefitted more from segmenting instructional material than learners with no or low prior knowledge in terms of retention performance.


DOI [BibTex]



A rational reinterpretation of dual process theories

Milli, S., Lieder, F., Griffiths, T.

2019 (article)


DOI [BibTex]



Probabilistic Linear Solvers: A Unifying View

Bartels, S., Cockayne, J., Ipsen, I. C. F., Hennig, P.

Statistics and Computing, 2019 (article) Accepted


link (url) [BibTex]


2017


Probabilistic Line Searches for Stochastic Optimization

Mahsereci, M., Hennig, P.

Journal of Machine Learning Research, 18(119):1-59, November 2017 (article)


link (url) Project Page [BibTex]



Early Stopping Without a Validation Set

Mahsereci, M., Balles, L., Lassner, C., Hennig, P.

arXiv preprint arXiv:1703.09580, 2017 (article)

Abstract
Early stopping is a widely used technique to prevent poor generalization performance when training an over-expressive model by means of gradient-based optimization. To find a good point to halt the optimizer, a common practice is to split the dataset into a training and a smaller validation set to obtain an ongoing estimate of the generalization performance. In this paper we propose a novel early stopping criterion which is based on fast-to-compute, local statistics of the computed gradients and entirely removes the need for a held-out validation set. Our experiments show that this is a viable approach in the setting of least-squares and logistic regression as well as neural networks.
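The core idea above, halting when local gradient statistics suggest the true gradient is indistinguishable from zero, can be illustrated with a minimal numpy sketch. This is not the paper's exact evidence-based criterion; the function name, the t-like statistic, and the majority-vote threshold are illustrative assumptions.

```python
import numpy as np

def gradients_plausibly_zero(per_example_grads, z=2.0):
    """Heuristic stopping check: declare convergence when the mini-batch
    mean gradient is statistically indistinguishable from zero given its
    sample variance, so no held-out validation set is needed."""
    g = np.asarray(per_example_grads, dtype=float)  # shape (B, D)
    B = g.shape[0]
    mean = g.mean(axis=0)
    var = g.var(axis=0, ddof=1) + 1e-12             # guard against zero variance
    # Per-dimension t-like statistic of the mean gradient.
    t = np.abs(mean) / np.sqrt(var / B)
    # "Converged" if most dimensions lie within z standard errors of zero.
    return float(np.mean(t < z)) > 0.5

# Illustration on synthetic per-example gradients:
rng = np.random.default_rng(0)
noise = rng.normal(0.0, 1.0, size=(256, 8))   # gradients fluctuating around zero
biased = noise + 3.0                           # gradients with a clear common direction
```

On the synthetic data, `gradients_plausibly_zero(noise)` flags convergence while `gradients_plausibly_zero(biased)` does not, mirroring how the criterion distinguishes noise-dominated from signal-dominated gradient phases.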


link (url) Project Page [BibTex]


Krylov Subspace Recycling for Fast Iterative Least-Squares in Machine Learning

Roos, F. D., Hennig, P.

arXiv preprint arXiv:1706.00241, 2017 (article)

Abstract
Solving symmetric positive definite linear problems is a fundamental computational task in machine learning. The exact solution, famously, is cubically expensive in the size of the matrix. To alleviate this problem, several linear-time approximations, such as spectral and inducing-point methods, have been suggested and are now in wide use. These are low-rank approximations that choose the low-rank space a priori and do not refine it over time. While this allows linear cost in the data-set size, it also causes a finite, uncorrected approximation error. Authors from numerical linear algebra have explored ways to iteratively refine such low-rank approximations, at a cost of a small number of matrix-vector multiplications. This idea is particularly interesting in the many situations in machine learning where one has to solve a sequence of related symmetric positive definite linear problems. From the machine learning perspective, such deflation methods can be interpreted as transfer learning of a low-rank approximation across a time-series of numerical tasks. We study the use of such methods for our field. Our empirical results show that, on regression and classification problems of intermediate size, this approach can interpolate between low computational cost and numerical precision.
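The transfer idea described above, reusing a subspace from earlier solves to accelerate the next related SPD system, can be sketched in simplified form (not the authors' method) by warm-starting conjugate gradients with a Galerkin projection onto previous solutions; `cg` and `recycled_start` are illustrative names.

```python
import numpy as np

def cg(A, b, x0=None, tol=1e-8, maxiter=1000):
    """Plain conjugate gradients; returns the solution and iteration count."""
    x = np.zeros(b.size) if x0 is None else x0.astype(float).copy()
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for k in range(maxiter):
        if np.sqrt(rs) < tol:
            return x, k
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x, maxiter

def recycled_start(A, b, W):
    """Galerkin projection of the new right-hand side onto the subspace
    spanned by the columns of W (here: earlier solutions), a cheap way to
    reuse information across a sequence of related SPD systems."""
    AW = A @ W
    return W @ np.linalg.solve(W.T @ AW, W.T @ b)

rng = np.random.default_rng(1)
M = rng.normal(size=(60, 60))
A = M @ M.T + 60 * np.eye(60)           # well-conditioned SPD matrix
b1 = rng.normal(size=60)
x1, k1 = cg(A, b1)                      # first system, solved from scratch

b2 = b1 + 0.01 * rng.normal(size=60)    # closely related second system
x_cold, k_cold = cg(A, b2)              # cold start from zero
W = x1.reshape(-1, 1)                   # recycled subspace: previous solution
x_warm, k_warm = cg(A, b2, x0=recycled_start(A, b2, W))
```

Because the second right-hand side is a small perturbation of the first, the projected starting point already captures most of the solution, so the warm-started solve needs no more CG iterations than the cold one.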


link (url) Project Page [BibTex]


Convergence Analysis of Deterministic Kernel-Based Quadrature Rules in Misspecified Settings

Kanagawa, M., Sriperumbudur, B. K., Fukumizu, K.

arXiv e-prints, arXiv:1709.00147v1 [math.NA], 2017 (article)

Abstract
This paper presents convergence analysis of kernel-based quadrature rules in misspecified settings, focusing on deterministic quadrature in Sobolev spaces. In particular, we deal with misspecified settings where a test integrand is less smooth than a Sobolev RKHS based on which a quadrature rule is constructed. We provide convergence guarantees based on two different assumptions on a quadrature rule: one on quadrature weights, and the other on design points. More precisely, we show that convergence rates can be derived (i) if the sum of absolute weights remains constant (or does not increase quickly), or (ii) if the minimum distance between design points does not decrease very quickly. As a consequence of the latter result, we derive a rate of convergence for Bayesian quadrature in misspecified settings. We reveal a condition on design points to make Bayesian quadrature robust to misspecification, and show that, under this condition, it may adaptively achieve the optimal rate of convergence in the Sobolev space of a lesser order (i.e., of the unknown smoothness of a test integrand), under a slightly stronger regularity condition on the integrand.


arXiv [BibTex]



Fast Bayesian hyperparameter optimization on large datasets

Klein, A., Falkner, S., Bartels, S., Hennig, P., Hutter, F.

Electronic Journal of Statistics, 11, 2017 (article)


[BibTex]



Efficiency of analytical and sampling-based uncertainty propagation in intensity-modulated proton therapy

Wahl, N., Hennig, P., Wieser, H. P., Bangert, M.

Physics in Medicine & Biology, 62(14):5790-5807, 2017 (article)

Abstract
The sensitivity of intensity-modulated proton therapy (IMPT) treatment plans to uncertainties can be quantified and mitigated with robust/min-max and stochastic/probabilistic treatment analysis and optimization techniques. Those methods usually rely on sparse random, importance, or worst-case sampling. Inevitably, this imposes a trade-off between computational speed and accuracy of the uncertainty propagation. Here, we investigate analytical probabilistic modeling (APM) as an alternative for uncertainty propagation and minimization in IMPT that does not rely on scenario sampling. APM propagates probability distributions over range and setup uncertainties via a Gaussian pencil-beam approximation into moments of the probability distributions over the resulting dose in closed form. It supports arbitrary correlation models and allows for efficient incorporation of fractionation effects regarding random and systematic errors. We evaluate the trade-off between run-time and accuracy of APM uncertainty computations on three patient datasets. Results are compared against reference computations facilitating importance and random sampling. Two approximation techniques to accelerate uncertainty propagation and minimization based on probabilistic treatment plan optimization are presented. Runtimes are measured on CPU and GPU platforms; dosimetric accuracy is quantified in comparison to a sampling-based benchmark (5000 random samples). APM accurately propagates range and setup uncertainties into dose uncertainties at competitive run-times (GPU ≤ 5 min). The resulting standard deviations (expectation values) of dose show average global γ(3%/3 mm) pass rates between 94.2% and 99.9% (98.4% and 100.0%). All investigated importance sampling strategies provided less accuracy at higher run-times considering only a single fraction. Considering fractionation, APM uncertainty propagation and treatment plan optimization was proven to be possible at constant time complexity, while run-times of sampling-based computations are linear in the number of fractions. Using sum sampling within APM, uncertainty propagation can only be accelerated at the cost of reduced accuracy in variance calculations. For probabilistic plan optimization, we were able to approximate the necessary pre-computations within seconds, yielding treatment plans of similar quality as gained from exact uncertainty propagation. APM is suited to enhance the trade-off between speed and accuracy in uncertainty propagation and probabilistic treatment plan optimization, especially in the context of fractionation. This brings fully-fledged APM computations within reach of clinical application.


link (url) [BibTex]



Analytical probabilistic modeling of RBE-weighted dose for ion therapy

Wieser, H., Hennig, P., Wahl, N., Bangert, M.

Physics in Medicine and Biology (PMB), 62(23):8959-8982, 2017 (article)


link (url) [BibTex]



Embedded interruptions and task complexity influence schema-related cognitive load progression in an abstract learning task

Wirzberger, M., Bijarsari, S. E., Rey, G. D.

Acta Psychologica, 179, pages: 30-41, Elsevier, 2017 (article)

Abstract
Cognitive processes related to schema acquisition comprise an essential source of demands in learning situations. Since the related amount of cognitive load is supposed to change over time, plausible temporal models of load progression based on different theoretical backgrounds are inspected in this study. A total of 116 student participants completed a basal symbol sequence learning task, which provided insights into underlying cognitive dynamics. Two levels of task complexity were determined by the amount of elements within the symbol sequence. In addition, interruptions due to an embedded secondary task occurred at five predefined stages over the task. Within the resulting 2x5-factorial mixed between-within design, the continuous monitoring of efficiency in learning performance enabled assumptions on relevant resource investment. From the obtained results, a nonlinear change of learning efficiency over time seems most plausible in terms of cognitive load progression. Moreover, different effects of the induced interruptions show up in conditions of task complexity, which indicate the activation of distinct cognitive mechanisms related to structural aspects of the task. Findings are discussed in the light of evidence from research on memory and information processing.


DOI [BibTex]



Empirical Evidence for Resource-Rational Anchoring and Adjustment

Lieder, F., Griffiths, T. L., Huys, Q. J. M., Goodman, N. D.

Psychonomic Bulletin & Review, 25, pages: 775-784, Springer, 2017 (article)


[BibTex]



Strategy selection as rational metareasoning

Lieder, F., Griffiths, T.

Psychological Review, 124, pages: 762-794, American Psychological Association, 2017 (article)


Project Page [BibTex]



A computerized training program for teaching people how to plan better

Lieder, F., Krueger, P. M., Callaway, F., Griffiths, T. L.

PsyArXiv, 2017 (article)


Project Page [BibTex]



Toward a rational and mechanistic account of mental effort

Shenhav, A., Musslick, S., Lieder, F., Kool, W., Griffiths, T., Cohen, J., Botvinick, M.

Annual Review of Neuroscience, 40, pages: 99-124, Annual Reviews, 2017 (article)


Project Page [BibTex]



The anchoring bias reflects rational use of cognitive resources

Lieder, F., Griffiths, T. L., Huys, Q. J. M., Goodman, N. D.

Psychonomic Bulletin & Review, 25, pages: 762-794, Springer, 2017 (article)


[BibTex]


2005


"Bruder sein das ist nicht schwer, Schwester sein dagegen sehr?" - Geschlechtsspezifische Betrachtungsweisen zur Situation von Geschwistern behinderter Kinder und Jugendlicher ["Being a brother is not hard, but being a sister is?" - Gender-specific perspectives on the situation of siblings of children and adolescents with disabilities]

Wirzberger, M.

Protestant University of Applied Sciences, Bochum, 2005 (thesis)

Abstract
This diploma thesis examines the life situation of siblings of children and adolescents with disabilities, with particular attention to the aspect of gender. After outlining foundations from family sociology, the author explains the high significance of sibling relationships within the family as well as their development and change over the lifespan, focusing on childhood and adolescence. Subsequently, the foundations, processes, and mechanisms of gender-specific socialization and the effects of gender on the sibling relationship are addressed. Chapter 3 first deals with the concept of disability with reference to the German Social Code Book IX (SGB IX) and the ICF. The author then describes the specific burdens faced by parents of children and adolescents with disabilities. The effects of a disability on the siblings form the focus of this thesis and are presented in detail on the basis of studies by HACKENBERG and GROSSMAN as well as statements by ACHILLES, with the aspect of gender again incorporated in detail into the description of the situation. To ensure a connection between theory and practice, summarizing hypotheses are formulated and examined exemplarily against three case histories. Finally, the author explains the consequences of her thesis for special-needs educational practice.


DOI [BibTex]