Developing an evaluation methodology for The Media Majlis Audiences

[Cover image: Majlis360, MHM]

Morris Hargreaves McIntyre (MHM) is working collaboratively with The Media Majlis to help embed a culture of evaluation, and skilled evaluation practice, within the museum and among its staff. This work follows a logic model approach to evaluation, as illustrated in the following model:

___Evaluation logic model


The first article in this series of three described the first four stages of the logic model: articulating the context, the ambitions, the strategies through which those ambitions will be achieved, and the desired results (Hargreaves, 2020). The resulting vision, aims, objectives, strategies, outputs and outcomes were presented in the form of a Strategy Tree, a deceptively simple but rigorous and visually compelling presentation of the why, what and how of the museum’s strategic plan. The strategy for The Media Majlis resulted in 30 outputs and 34 outcomes that will evidence the museum’s achievement across the seven pillars of its strategy: artistic content, learning approach, grow visitation, audience engagement, build knowledge, recognition and organizational excellence. These range from the number of programs delivered, an output, to audiences feeling empowered to interrogate the media around them, an outcome.

___A two-part evaluation framework

The Strategy Tree is one of two tools that make up the Evaluation Framework. The second is a Methodology Matrix, which sets out the monitoring needed to answer the question arising in Stage 5 of the logic model: “How will we know?” (the green circle in the diagram). The Methodology Matrix stems directly from the Strategy Tree: it takes the desired results that the organization wants to achieve, expressed as outputs and outcomes on the tree, and turns them into indicators that can be measured. The matrix also identifies the audience from whom the data will be collected, the type of data (quantitative or qualitative), and the research methods through which the data for each output and outcome can be collected.

___Differentiating target audiences from evaluation audiences

The Media Majlis has identified six target audience segments encompassing professionals, faculty, students, work experience, locals and tourists, with sub-segments as illustrated in the table below.


The Media Majlis Audience Segments

Professionals in areas of …

Faculty at:
- Education City
- Other higher education institutions

Students:
- Education City
- Further afield
- Prospective students
- Graduate students

Work experience:
- Student workers

Visitors to other Education City cultural sites (e.g., QNL)

*Defined as a secondary audience segment


These target audiences are described either by the type of connection they may have, such as media or journalism professionals, students or academics, or by their socio-demographic/geographic profile, such as locals or tourists.

However, for evaluation purposes, audiences need to be identified by the route through which they will engage with the museum, as this determines the type of research that will be undertaken with them. Data from the same target audience may be captured in multiple ways through the evaluation: a single person may be an exhibition visitor, a program participant and a virtual visitor, three different routes to the museum. Undertaking research with ‘on-site visitors’, for example, will likely capture data from professionals, faculty, students, locals and tourists. Thinking about target audiences in this way resulted in the identification of eight evaluation audiences: on-site visitors; program/event participants; Northwestern faculty; Northwestern students; prospective students; group visitors; virtual visitors; and sector peers.

___Exploring potential methods

Having identified the key evaluation audiences, the next task was to find the most effective methods to capture the wide range of data desired. A number of factors were considered when identifying potential methods: the location of the museum; the highly interactive nature of the exhibitions and their use of the latest, often emerging, digital technologies; profile factors specific to individual audiences; and the range of data desired and the resources available to undertake the evaluation.

Although the body of research undertaken with museum audiences in Qatar is growing, it remains limited. Knowledge gained by both MHM and museum staff in museums in Qatar suggested that a smaller-than-average proportion of exhibition visitors might be willing to respond to surveys, and that response numbers would be low unless surveys were very short. However, given that a core audience for the museum is an academic one, we felt this familiarity with research might help overcome such resistance. Previous local research has also found a preference for being interviewed rather than completing self-completion surveys. Prior experience further suggested that audiences may be reluctant to offer critique willingly and openly, even when it is requested in the spirit of improving the museum’s offer. For qualitative research, respondents have been found to prefer one-to-one interviews, or discussions with only a small number of other respondents known to them.

In terms of the museum-specific context, there were five initial factors that we believed would be relevant in determining the most appropriate, effective and efficient research methods. Firstly, the main exhibition space is 372 sq.m./4,000 sq.ft. overall and, while the internal structures are flexible, the design of exhibitions breaks the space down into zones: in the first exhibition (2019), visitors experienced seven smaller zones within the space. In selecting methods, this meant accounting for the fact that at any point within the exhibition a visitor might well be aware of the presence of a researcher, and we wanted neither to intimidate visitors nor to influence their behavior.

Secondly, we knew the first exhibition, Arab Identities, images in film, would offer the potential to gather digital analytics on the use of interactives, and that some interactive screens would also include questions gathering audience responses to the content of the exhibition. However, due to technical delays, detail on exactly what data would be collectable and available was limited.

Thirdly, there was potential to use closed-circuit television (CCTV) to observe visitor behavior in the exhibition, thereby removing the aforementioned researcher from the physical space. The omnipresence of CCTV in the Gulf region suggested that, while the museum would still want to meet both Qatari and American ethical standards in its use for visitor research, there may not be the level of ethical concern around its use in this way that might exist in some other parts of the world (Erskine-Loftus, 2016).

Fourthly, a factor separate from any particular exhibition should not be overlooked: the main exhibition space sits within the university, apart from the reception desk at the university’s entrance. Visitors engage with staff at the reception desk, but not at or in the exhibition itself.

Finally, the museum decided not to collect visitor contact data because of the stringent European General Data Protection Regulation (GDPR, 2016), which applies owing to the museum’s visitation by, and employment of, European Union nationals. Undertaking follow-up research online after a person’s visit was therefore not an option.

When considering potential evaluation methods for different audiences, some informed presumptions had to be made. One evaluation audience is ‘groups’, such as faculty and students, who we believed were likely to have both a much closer connection to the museum and greater knowledge of the subject matter, given its relevance to their studies. This audience will probably feel confident in their opinions and therefore be comfortable sharing them. Other groups, such as tourists or more social group visitors, may have looser connections to the museum and university, and may have significantly less prior subject knowledge, which may affect their willingness to participate in evaluation. Such visitors may have learnt a lot from an exhibition (learning more about the subject being their motivation for visiting), but this does not necessarily make them confident in sharing their opinions of it. This range within just one evaluation audience, with likely discrete needs and motivations for visiting the museum, requires the measurement of audience-specific experiences and outcomes.

As outlined above, the range of data desired to evaluate the achievement of The Media Majlis spans the seven pillars of the strategy: artistic content, learning approach, grow visitation, audience engagement, build knowledge, recognition and organizational excellence. The measures of success also required gathering both quantitative data, for example the number of exhibition-accompanying programs and participants, and qualitative data, such as audiences understanding different perspectives or students gaining unrivalled learning experiences.

The evaluation plan also needed to consider the resources available in terms of time, skills and funding. At the time of planning the initial evaluation, the museum had a small team of six staff and no dedicated evaluation position. While all staff were involved in developing the evaluation framework, which resulted in it being truly owned across and embedded in the organization, responsibility for implementing the evaluation plan was shared between the Director and the Exhibitions Coordinator, both of whom have prior experience of directing or undertaking visitor research in museums. Additional resources for undertaking the research itself were available through student workers and placement students, who are given opportunities to gain hands-on experience in all aspects of evaluation, including specific training for this work.

___A pilot study utilizing multiple methods

Given the range of unknowns arising from the innovative nature of the museum and its location, the range of data desired, and the factors highlighted above, the first two years (2019–2020) were planned as pilot studies to test what might work in practice. Taking all the identified factors into consideration, the following methods were judged potentially effective:

On-site exhibition visitors
  Primary: Exhibition exit survey; Digital component analytics
  Secondary: Reception desk count of visitors; Exhibition entry count (via CCTV); Bookshop sales

Program/event participants
  Secondary: Event monitoring (number of programs; number of tickets sold; number of attendees)

Northwestern Qatar faculty
  Primary: Pre- and post-course survey
  Secondary: Knowledge audit; Course usage

Northwestern Qatar students
  Primary: Pre- and post-course survey
  Secondary: Analysis of work produced; Course usage

Prospective students
  Secondary: Event monitoring

Group bookers
  Secondary: Booking records

Virtual visitors
  Primary: Web survey
  Secondary: Social media coverage analytics; Social media content analysis; Web analytics

Potential visitors
  Secondary: Visit enquiry records

Sector peers
  Secondary: Media monitoring


___The Methodology Matrix

As described above, all this is brought together in the Methodology Matrix which forms the second of the two documents making up the Evaluation Framework. The image below represents an excerpt, to illustrate the structure and scope of the methodology matrix.




___Putting the evaluation strategy into practice

The third article of this series will share how the Evaluation Strategy was put into practice at The Media Majlis and critique its effectiveness.



Erskine-Loftus, P. (2016) ‘Technology and the “point of experience”: aspects of CCTV as possible museum exhibition evaluation and experience tracker in Qatar’, Multaqa, Gulf Museum Educators Journal 2(autumn), 20–27.

GDPR (2016) General Data Protection Regulation (EU) 2016/679 of the European Parliament and of the Council.

Hargreaves, J. (2020, June 3) How will the Media Majlis know what difference it has made to audiences?  The Media Majlis. Retrieved from:


  • Author credits

    Jo Hargreaves

    Jo Hargreaves is a co-founder of the international consultancy Morris Hargreaves McIntyre (MHM).  She is a leading authority on strategic planning, evaluation and audience development in the cultural sectors and is at the forefront of developing frameworks to explore the difference that culture makes to people and to measure its social, economic and cultural value.  Jo has led on projects in Qatar, New Zealand, Australia, Hong Kong and the UK.  An accomplished trainer and mentor, Jo lectures worldwide on strategic planning, audience development and evaluation.  She is a Fellow of the Royal Society of Arts, a Fellow of the Chartered Institute of Marketing, and a member of the Market Research Society and the Evaluation Society.