
Birk Diedenhofen

Researcher

Dr. Birk Diedenhofen
Building: 23.03
Floor/Room: 00.25
+49 211 81-12065
+49 211 81-11753

Interests

  • Psychological Assessment & Test Theory
  • Online Research
  • R

Journal Articles

  • Diedenhofen, B., Hoffmann, A., Aust, F., & Müller, S. (2023). Detecting Applicant Faking With a Context-Specific Overclaiming Questionnaire. Journal of Individual Differences. Advance online publication. https://econtent.hogrefe.com/doi/full/10.1027/1614-0001/a000411

  • Hoffmann, A., Diedenhofen, B., & Müller, S. (2022). The utility of overclaiming questionnaires depends on the fit between test content and application context. Current Psychology. https://doi.org/10.1007/s12144-022-03934-x

  • Papenberg, M., Diedenhofen, B., & Musch, J. (2019). An experimental validation of sequential multiple-choice tests. Journal of Experimental Education, 89(2), 402-421. https://doi.org/10.1080/00220973.2019.1671299

  • Diedenhofen, B., & Musch, J. (2019). Option weights should be determined empirically and not by experts when assessing knowledge with multiple-choice items. International Journal of Selection and Assessment, 27(3), 256-266. https://doi.org/10.1111/ijsa.12252

  • Diedenhofen, B., & Musch, J. (2018). An investigation into the usefulness of time-efficient item selection in computerized adaptive testing. Psychological Test and Assessment Modeling, 60(3), 289-308. https://www.psychologie-aktuell.com/fileadmin/Redaktion/Journale/ptam_3-2018_289-308.pdf

  • Meisters, J., Diedenhofen, B., & Musch, J. (2018). Eyewitness identification in simultaneous and sequential lineups: an investigation of position effects using receiver operating characteristics. Memory, 26(9), 1297-1309. https://doi.org/10.1080/09658211.2018.1464581

  • Stoeber, J., & Diedenhofen, B. (2017). Multidimensional perfectionism and counterfactual thinking: Some think upward, others downward. Personality and Individual Differences, 119, 118-121. https://doi.org/10.1016/j.paid.2017.07.009

  • Diedenhofen, B., & Musch, J. (2017). PageFocus: Using paradata to detect and prevent cheating on online achievement tests. Behavior Research Methods, 49, 1444-1459. https://doi.org/10.3758/s13428-016-0800-7

  • Sommer, J., Diedenhofen, B., & Musch, J. (2017). Not to Be Considered Harmful: Mobile-Device Users Do Not Spoil Data Quality in Web Surveys. Social Science Computer Review, 35(3), 378-387. https://doi.org/10.1177/0894439316633452

  • Diedenhofen, B., & Musch, J. (2017). Empirical option weights improve the validity of a multiple-choice knowledge test. European Journal of Psychological Assessment, 33(5), 336-344. https://doi.org/10.1027/1015-5759/a000295

  • Diedenhofen, B., & Musch, J. (2016). cocron: A web interface and R package for the statistical comparison of Cronbach’s alpha coefficients. International Journal of Internet Science, 11, 51-60. http://www.ijis.net/ijis11_1/ijis11_1_diedenhofen_and_musch.html

  • Hoffmann, A., Diedenhofen, B., Verschuere, B., & Musch, J. (2015). A strong validation of the crosswise model using experimentally-induced cheating behavior. Experimental Psychology, 62(6), 403-414. https://doi.org/10.1027/1618-3169/a000304

  • Haarhaus, B., & Diedenhofen, B. (2015). Die Qual der Vorauswahl: Online-Tests identifizieren geeignete Kandidaten besser als Schulnoten. Wirtschaftspsychologie aktuell, 2, 16–18.

  • Blank, H., Diedenhofen, B., & Musch, J. (2015). Looking back on the London Olympics: Independent outcome and hindsight effects in decision evaluation. British Journal of Social Psychology, 54(4), 798-807. https://doi.org/10.1111/bjso.12116

  • Diedenhofen, B., & Musch, J. (2015). cocor: A comprehensive solution for the statistical comparison of correlations. PLoS ONE, 10(6), e0121945. https://doi.org/10.1371/journal.pone.0121945

  • Aust, F., Diedenhofen, B., Ullrich, S., & Musch, J. (2013). Seriousness checks are useful to improve data validity in online research. Behavior Research Methods, 45, 527-535. https://doi.org/10.3758/s13428-012-0265-2

R Packages

Conference Presentations

  • Diedenhofen, B., Hoffmann, A., & Musch, J. (2023). Experimental Validation of a New Axiomatically Derived Scoring Rule for the Assessment of Knowledge over the Internet. Talk at 53rd Annual Meeting of the Society for Computers in Psychology, November 16th, San Francisco, USA.

  • Angelike, T., Diedenhofen, B., & Musch, J. (2023). Experimental Validation of a New Axiomatically Derived Scoring Rule for the Subset Selection Response Format. Talk at 53rd Annual Meeting of the Society for Computers in Psychology, November 16th, San Francisco, USA.

  • Diedenhofen, B., & Hoffmann, A. (2021). Overclaiming questionnaires detect applicant faking as well as an established measure of faking. Talk at 63rd Conference of Experimental Psychologists, March 14–16th, Ulm.

  • Diedenhofen, B., Willing, S. & Musch, J. (2018). Eyewitness performance in sequential lineups: A ROC-based analysis of order effects. Talk at 60th Conference of Experimental Psychologists, March 11–14th, Marburg.
  • Diedenhofen, B., & Musch, J. (2017). Alternative scoring procedures and response formats for online achievement tests. Talk at 47th Annual Meeting of the Society for Computers in Psychology, November 9th, Vancouver, Canada.
  • Diedenhofen, B., & Musch, J. (2017). Das Antworte-Bis-Es-Stimmt-Verfahren verbessert die Validität von Multiple-Choice-Tests. Talk at the 14th Meeting of the DGPs Section for Differential Psychology, Personality Psychology, and Psychological Assessment, September 4–6, Munich.
  • Diedenhofen, B., & Musch, J. (2016). PageFocus: Using paradata to detect window switching behavior in online tests. Talk at 46th Annual Meeting of the Society for Computers in Psychology, November 17th, Boston, USA.
  • Papenberg, M., Diedenhofen, B., & Musch, J. (2016). Alternative Auswertungsverfahren und Antwortformate für Multiple-Choice-Tests. Talk at the 50th Congress of the Deutsche Gesellschaft für Psychologie (DGPs), September 18–22, Leipzig.
  • Diedenhofen, B., & Musch, J. (2016). Option weights should be determined empirically and not by experts when assessing knowledge with multiple-choice items. Talk at 58th Conference of Experimental Psychologists, March 21–23, Heidelberg.
  • Papenberg, M., Diedenhofen, B., & Musch, J. (2016). An experimental validation of sequential multiple-choice tests. Talk at 58th Conference of Experimental Psychologists, March 21–23, Heidelberg.
  • Stangenberg, K., Diedenhofen, B., & Musch, J. (2016). Does the presentation and scoring of pick-n items affect test reliability and validity? An experimental investigation of the pick-n answer format. Talk at 58th Conference of Experimental Psychologists, March 21–23, Heidelberg.
  • Diedenhofen, B., & Musch, J. (2016). PageFocus: A paradata tool to detect window switching behavior in online tests. Talk at 18th General Online Research Conference, March 3–4th, Dresden.
  • Papenberg, M., Diedenhofen, B., & Musch, J. (2016). Changing the scoring procedure and the response format to get the most out of multiple-choice tests conducted online. Talk at 18th General Online Research Conference, March 3–4th, Dresden.
  • Diedenhofen, B., & Musch, J. (2015). Empirische Optionsgewichte verbessern die Validität eines Multiple-Choice-Wissenstests. Talk at the 13th Meeting of the DGPs Section for Differential Psychology, Personality Psychology, and Psychological Assessment, September 21–23, Mainz.
  • Musch, J., & Diedenhofen, B. (2015). PageFocus: Paradatenbasierte Betrugsdetektion und -vermeidung in Online-Leistungstests. Talk at the 13th Meeting of the DGPs Section for Differential Psychology, Personality Psychology, and Psychological Assessment, September 21–23, Mainz.
  • Diedenhofen, B., Trost, S., & Musch, J. (2015). PageFocus: Using paradata to detect and prevent cheating in online achievement tests. Talk at 17th General Online Research Conference, March 18–20th, Cologne.
  • Sommer, J., Diedenhofen, B., & Musch, J. (2015). Not to be considered harmful: Mobile users do not spoil data quality in web surveys. Poster at 17th General Online Research Conference, March 18–20th, Cologne.
  • Diedenhofen, B., & Musch, J. (2015). Answer-until-correct responding improves the validity of multiple-choice tests. Talk at 57th Conference of Experimental Psychologists, March 8–11th, Hildesheim.
  • Hoffmann, A., Diedenhofen, B., Verschuere, B. J., & Musch, J. (2015). Eine experimentelle Validierung des Crosswise-Modells. Talk at the 57th Conference of Experimental Psychologists (TeaP), March 8–11, Hildesheim.
  • Diedenhofen, B., & Musch, J. (2014). Zwei WWW-Seiten zur Signifikanztestung von Korrelationen und Cronbach Alphas. Poster at the 49th Congress of the Deutsche Gesellschaft für Psychologie (DGPs), September 21–25, Bochum.
  • Diedenhofen, B., & Musch, J. (2014). Using JavaScript to detect cheating in online tests. Talk at 56th Conference of Experimental Psychologists, March 31st–April 2nd, Gießen.
  • Diedenhofen, B., & Musch, J. (2014). PageFocus: A new JavaScript to detect cheating in online tests. Talk at 16th General Online Research Conference, March 5–7th, Cologne.
  • Diedenhofen, B., & Musch, J. (2013). cocor und cocron: Zwei R-Pakete zur Signifikanztestung von Korrelationen und Cronbach Alphas. Poster at the 12th Meeting of the DGPs Section for Differential Psychology, Personality Psychology, and Psychological Assessment, September 23–25, Greifswald.
  • Musch, J., & Diedenhofen, B. (2013). Zur Verbesserung der Wissensdiagnostik durch empirische Optionsgewichtung. Talk at the 12th Meeting of the DGPs Section for Differential Psychology, Personality Psychology, and Psychological Assessment, September 23–25, Greifswald.
  • Sommer, J., Diedenhofen, B. & Musch, J. (2013). A comparison of expert and empirical option weighting as alternative procedures to improve the reliability and validity of multiple choice tests. Talk at 55th Conference of Experimental Psychologists, March 24–27th, Vienna.
  • Diedenhofen, B., & Musch, J. (2013). Empirical option weighting increases the reliability and validity of multiple choice tests. Talk at 55th Conference of Experimental Psychologists, March 24–27th, Vienna.
  • Diedenhofen, B., & Musch, J. (2012). Zur Optimierung der Zeiteffizienz beim computeradaptiven Testen. Talk at the 54th Conference of Experimental Psychologists (TeaP), April 1–4, Mannheim.
  • Diedenhofen, B., & Musch, J. (2011). Zur Berücksichtigung der Bearbeitungszeit beim computeradaptiven Testen. Talk at the 11th Meeting of the DGPs Section for Differential Psychology, Personality Psychology, and Psychological Assessment, September 26–28, Saarbrücken.
  • Ullrich, S., Diedenhofen, B., & Musch, J. (2010). Leben wir in der besten aller möglichen (Koalitions)Welten? Eine Analyse der Partei- und Koalitionspräferenzen bei der Bundestagswahl 2009. Poster at the 47th Congress of the Deutsche Gesellschaft für Psychologie (DGPs), September 26–30, Bremen.
  • Musch, J., Ullrich, S. & Diedenhofen, B. (2010). Selection bias in web surveys and the use of propensity scores in forecasting the result of the 2009 German federal election. Talk at 12th General Online Research Conference, May 26–28th, Pforzheim.
  • Diedenhofen, B., Aust, F., Ullrich, S., & Musch, J. (2010). Seriousness checks are useful to improve data validity in online research. Poster at 12th General Online Research Conference, May 26–28th, Pforzheim.