ICMPC 12 Day 1: Second afternoon session
The last session of the full Day 1 (Day 2 in the schedule) again featured a choice of five parallel sessions. I decided to attend the curiously named ‘Replication and ‘truth’ in music psychology’, partly because of the interesting title and partly because our research group (and one of our masters students, Kelly) had been involved in this symposium.
The session was held in Hall 1. At this point I will permit myself a little whinge about this location – the acoustics are appalling! It is a huge barn of a structure, made much worse by the labyrinthine air conditioning system, which sounds like a freight train rumbling past and which is on the whole time.
I am still in search of the perfect location within the room where these issues are minimised. I hope readers will bear this in mind if I miss out an important point from their talks!
So, on with the symposium. The first speaker, Reinhard Kopiez, gave a superb overview of the field of replication. He pointed out that music psychology is now entering a consolidation phase and presented the interesting argument that we don’t need more new findings at this point (rather an extreme view, probably made just for effect, so bear with him); instead we need to go back over our major findings, the ones repeated over and over again in the textbooks, and determine the extent to which they are ‘true’ – i.e. replicable – and establish the real size of the underlying effects.
He outlined a couple of famous cases in science where breakthrough findings that could have changed the course of a discipline turned out to be non-replicable: the claimed discovery of cold fusion (best explained by methodological failures) and the production of semiconductors made from plastic (attributed to fraudulent data).
The importance of replication in science is therefore beyond dispute: it allows ‘true’ effects to be revealed and their relative importance to be determined – in music psychology we have barely begun to apply this gold standard of academic quality control.
Replication studies also open the door to a second important phase in science, that of meta-analysis. Again, music psychology is in its infancy with meta-analysis – a quick search of the literature revealed only about six such studies in the whole field.
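To give a flavour of what a meta-analysis actually does, here is a toy sketch of my own (in Python, with made-up effect sizes rather than real data from the literature) of the simplest fixed-effect approach: each study’s standardised effect is weighted by the inverse of its variance, and the weighted average gives a pooled estimate of the ‘true’ effect.

```python
import numpy as np

# Hypothetical Cohen's d values and their sampling variances from five
# imaginary studies (NOT real findings from the music psychology literature).
d = np.array([0.45, 0.30, 0.62, 0.10, 0.38])    # study effect sizes
var = np.array([0.04, 0.02, 0.09, 0.03, 0.05])  # their sampling variances

w = 1.0 / var                          # inverse-variance weights
d_pooled = np.sum(w * d) / np.sum(w)   # fixed-effect pooled estimate
se_pooled = np.sqrt(1.0 / np.sum(w))   # standard error of the pooled estimate

print(f"Pooled d = {d_pooled:.2f} (SE = {se_pooled:.2f})")
```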
Reinhard went on to outline key issues regarding the impact of Type 1 error in science (i.e. you think you have found something but there is really nothing there – a false positive) and stressed the need to consider a priori power analysis to minimise these risks. He recommended a number of texts for help with this area, but I wrote down ‘The Essential Guide to Effect Sizes’ by Ellis (2010) as one that he particularly eulogised.
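For readers who want to try this themselves, here is a minimal sketch (using Python and the statsmodels library, with assumed example numbers rather than anything from Reinhard’s talk) of the kind of a priori power calculation he was referring to: how many participants per group would be needed to detect a medium-sized effect (Cohen’s d = 0.5) with 80% power at the conventional alpha of .05?

```python
from statsmodels.stats.power import TTestIndPower

# A priori power analysis for an independent-samples t-test:
# solve for the sample size per group given effect size, alpha and power.
analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8,
                                    alternative='two-sided')
print(f"Required sample size per group: {n_per_group:.0f}")  # roughly 64
```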
Reinhard went on to outline one of his own replication studies, which used effective power analysis and careful design. He showed that bimanual training does not in fact have an effect on handedness, where previous studies had reported an association. In this case he attributed the original finding to sampling error: when the N is too small you can mistakenly think an effect is there because you haven’t sampled effectively from the population. A very good point, and one that we have to be particularly careful about in studies of amusia, where the population sample is often too small.
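To illustrate the point (with a made-up simulation of my own, not the data from Reinhard’s study): if you repeatedly draw very small samples from two groups that in truth do not differ, the occasional ‘significant’ result you stumble upon will carry a wildly inflated effect size.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, n_sims = 10, 10_000          # tiny samples, many simulated 'studies'
sig_effects = []

for _ in range(n_sims):
    a = rng.normal(0, 1, n)     # both groups drawn from the SAME population,
    b = rng.normal(0, 1, n)     # so the true effect is exactly zero
    t, p = stats.ttest_ind(a, b)
    if p < 0.05:                # a false positive
        pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
        sig_effects.append(abs(a.mean() - b.mean()) / pooled_sd)

print(f"False-positive rate: {len(sig_effects) / n_sims:.3f}")        # about 0.05
print(f"Mean |d| among false positives: {np.mean(sig_effects):.2f}")  # around 1, despite a true d of 0
```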
At the end Reinhard recommended the ‘Reproducibility Project’ website, where you can find more information.
The next two papers presented replication studies in our field. The first looked at the Deutsch octave illusion, where perception of a two-tone pattern differs depending on handedness. The original finding was that most right-handers report hearing the high tone in the right ear, whereas left-handers either show no bias or perceive more complex tone patterns.
The authors set out to replicate this effect but also added a new spin. This ‘new spin’ idea is very important for replication studies – whatever journal editors claim, it is my experience that you need at least some new contribution to the literature to get published. They tried a different measure of handedness (a manual tapping task) as well as the traditional questionnaire that Deutsch had used.
In the end the authors reported that the Deutsch octave illusion could be replicated when using her original materials. Their new measure of handedness actually improved the paradigm, giving a much clearer distinction between right and left handers. So they had shown the power of the original effect and reduced the misclassification of handedness that often happens with a simple questionnaire.
The second study looked at the ‘Levitin effect’, whereby normal listeners without absolute pitch are capable of reproducing favourite music to a high degree of accuracy, suggesting that a component of absolute pitch memory must exist in most of us. The aim of the paper was to attempt to replicate this effect across six different European labs (including the MMB group).
Participants were simply asked to sing a favourite pop song, and their first recognisable note was compared to the corresponding note in an established recording.
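As an illustration of what such a comparison involves (a sketch of my own, with assumed frequencies rather than the labs’ actual scoring procedure), the deviation between a sung note and the note in the reference recording can be expressed in semitones from the two frequencies:

```python
import math

def semitone_deviation(f_sung: float, f_reference: float) -> float:
    """Signed deviation in semitones between a sung pitch and a reference pitch."""
    return 12 * math.log2(f_sung / f_reference)

# Hypothetical example: the recording's first note is A4 (440 Hz) and the
# participant sings roughly 415 Hz (close to G#4), i.e. about a semitone flat.
print(f"{semitone_deviation(415.0, 440.0):+.2f} semitones")  # about -1.01
```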
The authors found that the original effect could be replicated, but that the true effect size was likely to be smaller than that reported in the original study. The majority of the labs found performance that was worse than in the Levitin paper (this is termed a ‘decline effect’), but one that still hinted at the existence of an absolute pitch memory component in normal listeners for very well-known tunes.
At the end of the session Henkjan Honing gave a short but stimulating discussion of the points raised and outlined a number of the main drivers for replication. In particular I noted the ‘Laboratory effect’, whereby different labs almost inevitably tend to produce somewhat different results – the important thing here is to determine the nature of the distributions and to narrow down (by meta-analysis) the likely size of the ‘true’ effect.
He also highlighted the danger of the ‘Media attention effect’, whereby a finding makes its way into the public consciousness and people therefore do not want to hear that it might not be true. Mozart, anyone? Finally, he pointed out the danger of the ‘Overestimated effect of effects’, with which I thoroughly agree – this describes the problem of focusing on finding an effect and then not worrying about an explanation: most of the time the ‘truth’ is in the underlying mechanism, not the firework on the surface.
To close, Henkjan suggested a number of ways forward for the discipline at this point:
1) Mandatory publication of data with every paper
2) Publication of reviews along with a quality assessment process for reviews
3) Transparency at all times – this allows verifiability and ultimately REPLICATION.
All in all, a very stimulating session to bring the first full day to a close.
6 Comments
Aaron Wolf
Excellent review! Thanks so much!
Mark Riggle
These are very good points about replication of results, but as Henri Poincare said: “Science is built up with facts, as a house is with stones. But a collection of facts is no more a science than a heap of stones is a house.” What is missing in Music Psyc are theories of cause-and-effect — mechanisms that explain the data. There are patterns in the data, but that is not explanatory theory; at best they become descriptive theory. Until explanatory theories exist that are consistent or directly inconsistent with the data, replication of experimental results still will have little meaning.
Thomas
Thanks for the great writeups so far! Could you list those 6 music psychology meta-analyses? I’m interested in reading them.
vicky
You are very welcome. I am afraid that they were not presented in the talk but you could contact the author (Reinhard) for the details.
All the best
Vicky
vicky
An excellent and thoughtful argument. I would like to add to your points if I may and reiterate the importance of not going down the route of assuming identifying a brain area is locating a cause – I see this too often in the literature! Best, Vicky