Finally out (already mentioned earlier this year) – now in its full glory @ Current Directions in Psychological Science.
Some papers have somewhat weird starting points – this one had an awesome starting point – Lake Louise (Canada):
In a little suite we (Joe Johnson, Ulf Böckenholt, Dan Goldstein, Jay Russo, Nikki Sullivan, Martijn Willemsen) sat down during a conference called the ‘Choice Symposium‘ and started working on an overview paper about the history and current status of different process tracing methods. One central result (why can’t all papers be like that?) is the figure below, where we try to locate many process tracing methods on two dimensions: temporal resolution and distortion risk (i.e., how fast a method can measure a process and how much the measurement itself disturbs that process).
Schulte-Mecklenbeck, M., Johnson, J.G., Böckenholt, U., Goldstein, D., Russo, J., Sullivan, N., & Willemsen, M. (in press). Process tracing methods in decision making: On growing up in the 70s. Current Directions in Psychological Science.
Ah – everybody was trying to find a path all the time:
This syllabus of an (obviously) awesome class has a ton of good reads:
Everything is fucked: The syllabus
by Sanjay Srivastava
I would have two additions:
- A multi-lab replication project on ego depletion (Hagger & Chatzisarantis, 2016)
- And the response from Roy Baumeister and Kathleen D. Vohs
It’s a really good statement of how f… up things are (in addition to all the other good examples above) …
“A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.” – Max Planck
Cilia Witteman and Nanon Spaanjaars (my Dutch connection) worked together with me on a piece on whether psychodiagnosticians improve over time (they don’t) in their ability to assign symptoms to DSM categories. This turned out to be a pretty cool paper, combining eye-tracking data with a practical and, hopefully, relevant question.
Schulte-Mecklenbeck, M., Spaanjaars, N.L., & Witteman, C.L.M. (in press). The (in)visibility of psychodiagnosticians’ expertise. Journal of Behavioral Decision Making. http://dx.doi.org/10.1002/bdm.1925
This study investigates decision making in mental health care. Specifically, it compares the diagnostic decision outcomes (i.e., the quality of diagnoses) and the diagnostic decision process (i.e., pre-decisional information acquisition patterns) of novice and experienced clinical psychologists. Participants’ eye movements were recorded while they completed diagnostic tasks, classifying mental disorders. In line with previous research, our findings indicate that diagnosticians’ performance is not related to their clinical experience. Eye-tracking data provide corroborative evidence for this result from the process perspective: experience does not predict changes in cue inspection patterns. For future research into expertise in this domain, it is advisable to track individual differences between clinicians rather than study differences on the group level.
Andrew Gelman talked about a really old paper I did together with Anton Kühberger ages ago. It was actually the first paper / ‘real’ scientific project I was involved in.
It generated quite the buzz over its 20-year lifespan and was cited a whopping 13 times (stats look good without a y-axis) …
Going back to it, I was happy to see that we already talked about replication (though we were very reluctant to push the point harder – we would not have made it through the reviews, I guess) … Things have changed.
Recently Ryan Murphy and I realised that a startup here in Berlin builds on ideas from our 2011 Flashlight paper.
Well, the guys at attensee.com did a great job taking the idea we had much further than we ever thought it could be taken …
Here is a feature I totally love – a live heat map of what you are looking at … awesome!
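How attensee.com implements their live heat map I don’t know, but the basic idea behind any gaze heat map is simple enough to sketch: accumulate (x, y) gaze samples into a grid and smooth the result with a Gaussian so individual fixations blend into hot spots. A minimal sketch (function and parameter names are my own, not attensee’s):

```python
import numpy as np

def gaze_heatmap(points, width, height, sigma=20):
    """Accumulate (x, y) gaze samples into a Gaussian-smoothed density grid.

    points: iterable of (x, y) screen coordinates
    width, height: screen size in pixels
    sigma: spread of the smoothing kernel, in pixels
    """
    grid = np.zeros((height, width))
    for x, y in points:
        if 0 <= x < width and 0 <= y < height:
            grid[int(y), int(x)] += 1

    # Separable Gaussian smoothing: convolve rows, then columns.
    radius = int(3 * sigma)
    offsets = np.arange(-radius, radius + 1)
    kernel = np.exp(-offsets**2 / (2 * sigma**2))
    kernel /= kernel.sum()
    grid = np.apply_along_axis(
        lambda r: np.convolve(r, kernel, mode="same"), 1, grid)
    grid = np.apply_along_axis(
        lambda c: np.convolve(c, kernel, mode="same"), 0, grid)
    return grid
```

For a “live” view one would simply recompute (or incrementally update) this grid as new samples stream in from the eye tracker and overlay it, colour-mapped, on the stimulus.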