This page answers a series of questions about the analysis of press regulation coverage:
What is the purpose of the research?
The project investigates how press regulation was covered in the UK national press following the publication of the Leveson Report in November 2012.
Press regulation is an unusual area of public life, in which the traditional channels of information about complex policy areas – large news media organisations – have a concrete interest in policy outcomes.
As a result, scrutiny of coverage of press regulation provides a rare insight into the ability of affected news organisations to balance commercial or strategic interests against the quasi-constitutional functions of a free and plural press that underpin modern democratic politics:
- To foster the transparency and accountability of powerful interests in society;
- To provide the public with a diverse range of information and viewpoints;
- To accurately portray issues of public importance to citizens.
This project explores whether these functions were fulfilled in reporting on press regulation.
What is being measured?
The project analyses articles about, or containing references to, press regulation in the UK national press in the 12-month period following the publication of the Leveson Report, from 29th November 2012 to 29th November 2013.
Though a small number of articles may have been missed, the intention was to gather every relevant article published during this period. News databases and websites were searched for every article containing one or more of the following phrases:
- “Royal Charter”
- “Privy Council”
- “Independent Press Standards Organisation”
- “Press Standards Board of Finance”
- “Hacked Off”
- “Press Regulation”
- “Press Laws”
This produced a total of 2,047 articles (duplicates excluded). Given the size of this sample, it is extremely unlikely that any articles that were missed would have a significant impact on the main findings.
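The collection step described above can be sketched in code. This is a minimal illustration, not the project's actual tooling: the article data structure, field names, and deduplication key are assumptions; the search phrases are those listed in the text.

```python
# Illustrative sketch of the search-and-deduplicate sampling step.
# Assumes each article is a dict with "headline", "date", and "text" keys.

SEARCH_PHRASES = [
    "royal charter",
    "privy council",
    "independent press standards organisation",
    "press standards board of finance",
    "hacked off",
    "press regulation",
    "press laws",
]

def is_relevant(text: str) -> bool:
    """An article is relevant if it contains any search phrase (case-insensitive)."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in SEARCH_PHRASES)

def collect_sample(articles):
    """Keep relevant articles, excluding duplicates (here keyed on headline and date)."""
    seen = set()
    sample = []
    for article in articles:
        key = (article["headline"], article["date"])
        if key in seen or not is_relevant(article["text"]):
            continue
        seen.add(key)
        sample.append(article)
    return sample
```

In practice the deduplication rule would need to handle near-duplicates (e.g. print versus online editions); the exact-match key here is only to show the shape of the step.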
Beyond the basic statistics of articles (publisher, headline, date, wordcount, etc.), the project also analyses aspects of the ‘tone’ of coverage, to compare how press regulation was portrayed by national newspapers.
How is ‘tone’ measured?
In any research project, a balance has to be struck between deriving information from the source material, representing the complexity of that information, and practically conducting the analysis. As with a map, the larger the area represented, the more detail must be sacrificed to produce an accurate overall picture.
Accordingly, given the very large number of articles analysed, this project applies a relatively simple measurement of tone: whether ‘positive’ or ‘negative’ statements about key issues in press regulation are found in an article.
These key issues were:
- The Leveson Report and its recommendations
- The Royal Charter that was agreed in March 2013 and eventually sealed by the Privy Council in October 2013
Positivity and/or negativity in an article is confirmed by the presence of specific ‘frames’ or evaluative statements (the terms ‘positive’ and ‘negative’ are used here in relation to their attitude to Leveson or the Charter, rather than any absolute sense of being ‘right’ or ‘wrong’):
- Supportive of Leveson Recommendations: Any statement in support of (a) the Leveson Report in general, or (b) any of its recommendations.
- Supportive of statutory underpinning of press regulation: either (a) a statement in support specifically of the Leveson recommendation on statutory underpinning; or (b) a general statement in support of statutory underpinning for press regulation.
- Supportive of Royal Charter: Any statement in support of the Cross-Party Charter, or its specific provisions.
- Threat to press freedom: Any reference to either Leveson or any proposed method of press regulation as a potential threat to press freedom, or to freedom of expression.
- Criticism of Leveson recommendations/cross-party Royal Charter provisions: Any critical reference to specific recommendations in the Leveson Report, or to any of the provisions of the Cross-Party Royal Charter.
- Questions the Legitimacy of the Leveson Report: Critical references that directly imply that the Leveson Inquiry or Report were flawed, corrupt, or otherwise illegitimate (including conspiracy, narrowness of remit or expertise of the judge, misconception in setting-up of the Inquiry, waste of public money).
- Damage to the UK’s international reputation: Any reference to either of two approximate arguments: that Britain will no longer set a good example for press freedom worldwide if Leveson or the Royal Charter system were to be implemented; or, the implementation of Leveson or the Royal Charter will be copied by undemocratic governments to crack down on journalists.
- Criticism of the process of agreeing the Royal Charter: Critical references specifically to the process of agreeing the Royal Charter – references to the “pizza deal”, “stitch-up”, etc.
These frames were then used to designate each article as one of the following:
- ‘Positive-only’ (contained only a combination of supportive frames)
- ‘Negative-only’ (contained only a combination of critical frames)
- ‘Both’ (contained a combination of both supportive and critical frames)
- ‘None’ (contained none of these frames)
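The mapping from frames to designations can be expressed as a short function. This is a sketch of the logic described above; the frame labels are hypothetical shorthand for the eight frames listed in the text, not names used by the project.

```python
# Illustrative mapping from the frames found in an article to its tone
# designation. Frame labels are shorthand invented for this sketch.

SUPPORTIVE_FRAMES = {"support_leveson", "statutory_underpinning", "support_charter"}
CRITICAL_FRAMES = {"threat", "criticism", "legitimacy", "international", "process"}

def designate(frames: set) -> str:
    """Return the tone designation for the set of frames found in an article."""
    has_support = bool(frames & SUPPORTIVE_FRAMES)
    has_critical = bool(frames & CRITICAL_FRAMES)
    if has_support and has_critical:
        return "Both"
    if has_support:
        return "Positive-only"
    if has_critical:
        return "Negative-only"
    return "None"
```

For example, an article containing only the ‘Threat to press freedom’ frame would be designated ‘Negative-only’, while one that also supported the Royal Charter would be designated ‘Both’.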
What about bias in the research?
In content analysis of this kind, it is difficult to ensure that inherent or subconscious biases on the part of the researcher are not replicated in the analysis. For research to generate meaningful results, a project must be capable of being replicated by a different researcher following the same instructions.
To demonstrate that this research is replicable and that the results are reliable, a process of Inter-Coder Reliability (ICR) testing was undertaken. 10% of the sample (slightly over 200 randomly selected articles) was distributed to researchers with no links to the project or to the Media Standards Trust. These researchers were nominated by the LSE Media Governance Master’s Degree programme, and coded the articles according to the methods used in the project.
The list below shows the results for each variable: the Percentage Agreement between coders (i.e. how often the guest coder agreed with the main researcher), and the Cohen’s Kappa score – a statistical measure of inter-coder agreement that corrects for chance, where values over 0.8 (on a scale of -1 to 1) are generally agreed to indicate very high levels of agreement:
- Threat / 91.6% / 0.831
- Criticism / 94.1% / 0.837
- Legitimacy / 98.0% / 0.836
- International / 99.0% / 0.936
- Process / 98.5% / 0.895
- Support Leveson / 96.1% / 0.862
- Stat. Underpinning / 97.5% / 0.844
- Support Charter / 98.0% / 0.872
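The two statistics reported above can be computed as follows. This is a minimal sketch for two coders: the project itself used ReCal2 to generate its reliability report, not this code.

```python
# Illustrative computation of Percentage Agreement and Cohen's Kappa
# for two coders' judgements on the same set of articles.
from collections import Counter

def percentage_agreement(coder_a, coder_b):
    """Proportion of cases on which the two coders gave the same answer."""
    return sum(a == b for a, b in zip(coder_a, coder_b)) / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Kappa = (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(coder_a)
    p_o = percentage_agreement(coder_a, coder_b)
    counts_a = Counter(coder_a)
    counts_b = Counter(coder_b)
    # Chance agreement: probability both coders pick the same category at random,
    # given each coder's observed distribution of answers.
    categories = set(counts_a) | set(counts_b)
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)
```

Because Kappa subtracts the agreement expected by chance, it is a stricter measure than raw percentage agreement: two coders who agree 98% of the time on a rare frame may still score a Kappa well below 0.98.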
These results suggest that the methods used to ascertain tone in articles are straightforward enough to be employed by other researchers, with a high degree of agreement on the results. In other words, if the tone analysis were conducted by another researcher or team of researchers, it is highly likely that very similar results would be recorded. It can be argued, then, that the results in this aspect of the project are both reliable and replicable.
The full list of coding results can be downloaded here:
(NB: For each group of two columns, the first column is the main researcher’s answer, and the second column contains the guest coder’s answer. Cases 2-102 show the comparison between the main researcher and guest coder 1, and cases 103-204 show the comparison between the main researcher and guest coder 2).
The results report generated by using ReCal2 can be downloaded here:
Can the results be verified?
As with the previous instalment of the MST’s analysis of Leveson coverage, all of the data from which conclusions have been derived is made freely available online here. A log of any errors or amendments to the dataset is also made public here. All data is derived from the text of the 2,047 articles that make up the sample. Since some of these articles are behind paywalls, we cannot publish their full text online. However, we can make a document containing all source material available to interested researchers; it can be requested by emailing firstname.lastname@example.org. In this way, all of the data, and the ways in which this project has derived conclusions from it, can be verified by the public.