Counting on an Iterative Process: Initial Lessons from the Research Assessment Exercise 2008

2011 
1. Introduction

We may 'all know' the Research Assessment Exercise (RAE), but it is far from true that we all 'love' it (Oppenheim, 1996). The RAE 2008 was intended to create a ranking of institutions according to the quality of their research activity, as a basis for determining research grants from the HE funding bodies (RAE, undated). The results in the field of Business and Management Studies may have been disappointing for a number of institutions.

The ratings themselves form a type of feedback, especially by ranking the institutions in comparison with each other. However, this feedback is difficult to use in making improvements to performance, as there is no indication of the direction for renewed effort. The ratings in the areas of Research Outputs, Research Environment and Esteem Indicators graded business schools according to published criteria requiring 'originality, significance and rigour' at 'world-leading' (4*), 'internationally excellent' (3*), 'recognized internationally' (2*) and 'recognized nationally' (1*) levels (Ghobadian, 2009; RAE, undated; Seale, 1999).

'Originality' requires the innovation or distinctiveness of the methodological approach, the datasets used, the research questions posed and the underlying theoretical framework. 'Significance' includes the insight and scope of coverage of the work, its impact on the discipline in the UK or internationally, the extent to which it has opened up new areas of research, and its current or potential impact on policy and practice. 'Rigour' involves the contextualisation of the work; its strength, appropriateness and intellectual coherence; and the extent to which the research outcomes are supported (Ghobadian, 2009).

This on-going study 'drills down' into material available on the RAE 2008 website in order to evaluate critically the salient features of the more successful business schools.
We intend to generate a model that can be of practical use in the RAE's successor, the Research Excellence Framework (REF) 2013. This is an area that has received little attention in the academic literature on Research Methods in Business and Management; however, it is important to investigate and analyse the peer evaluation of scholarly endeavours. For this paper we have focussed on the outputs of the research from the perspective of research methodology: the paradigms which seem to be in opposition and the methods employed. We demonstrate the iterative character of our project in terms of both the theoretical base and the sample chosen.

Assessing the quality of research publications is clearly a contentious activity; even the search for judgement criteria is seen as 'controversial' (Seale, 1999). The problem is magnified by the existence of two seemingly monolithic paradigms. The positivist approach, as we know, favours quantitative methods, which are able to 'prove' results by counting responses (often large numbers of them) and performing mathematical feats to demonstrate correlations; the newer interpretivist, or phenomenological, qualitative tradition, on the other hand, prefers to delve behind the figures (often very few of them, even a sample of only one) to discover the reasoning behind behaviours and attitudes.

The battle over what constitutes the best research is waged on a field where the definition of 'best' is not agreed. An extreme personal experience illustrates this contest in unusually concrete terms: one of the writers of this paper witnessed a scene at an international conference (not the ECRM) three years ago in which a keynote speaker giving an address on qualitative methods was booed and shouted down by a group of researchers who manifestly identified themselves as quantitative.

This paper continues with the following structure.
After reviewing the literature in the field of quality in Research Methods and discussing the method for this study, it will use Westminster Business School, a post-1992 institution, as an example, and focus on the Research Output aspect of the RAE ratings. …