
This page outlines the methodology used to collect data for the 2020 Profiling the Profession project.
The project started in June 2020, after funding was agreed under a Covid-19 Emergency Response Funding Agreement with Historic England. The project team consisted of: Kenneth Aitchison, Poppy German and Doug Rocks-Macqueen.
Survey Development
An initial draft of questions was created by the project team in a Word document – developed from previous iterations of the survey and split into two sections: questions for individuals and questions for organisations. It was decided from the outset of the project to collect data from both individuals and organisations. The reasoning behind this was to:
- reduce the time organisations would have to spend filling in the survey;
- ask individuals questions that employers would not necessarily know the answers to but which are important to the sector.
Drafts were distributed to the following stakeholder organisations for comment, listed in alphabetical order:
- ALGAO
- Archaeological Training Forum
- Archaeology Scotland
- BAJR
- CADW
- CBA
- CIfA (staff and Equality and Diversity special interest group)
- Department of Communities
- FAME
- Historic England
- Historic Environment Scotland
- Prospect trade union
- Society of Museum Archaeologists
- University Archaeology UK
Taking on board the feedback provided by these organisations, the questionnaire document went through a total of 19 drafts. In addition, four other documents containing suggested questions were sent to the project team by organisations. This draft set of questions, in a Word document, was then used to create the first draft of the digital survey in Novisurvey. The survey was designed so that not all questions were asked of all respondents, as they would not all be relevant. ‘Logic’ functionality was created so that, depending on the responses to certain questions, respondents would or would not see certain other questions.
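The branching behaviour described above can be sketched in a few lines. The survey used Novisurvey's built-in ‘Logic’ feature; the Python fragment below is purely illustrative, and the question names are hypothetical, not taken from the actual instrument.

```python
# Hypothetical skip-logic rule: respondents only see follow-up
# questions that are relevant to an earlier answer.

def questions_to_show(responses):
    """Return the list of questions a respondent should see."""
    shown = ["Q1: Do you currently work in archaeology?"]
    if responses.get("Q1") == "Yes":
        # current workers get employment-detail questions
        shown.append("Q2: Is that work full-time or part-time?")
    else:
        # everyone else is routed to a different branch
        shown.append("Q3: Have you previously worked in archaeology?")
    return shown
```

For example, `questions_to_show({"Q1": "Yes"})` includes Q2 but not Q3.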
The survey instrument was then tested by the project team and sent to eight other individuals to test. Twelve drafts of the two survey instruments – individuals and organisations – were created using testing feedback. These changes mainly concerned functionality, e.g. ensuring that the drop-down options worked, results were recorded, etc. Typographic errors were corrected and some minor changes were made to the question text in both surveys.
Survey Distribution
While the questionnaire was being prepared, the team compiled a list of organisations, together with some self-employed individuals, to be the target population for the organisation survey. This was based on lists previously compiled by Landward Research, together with the following means:
- examining both CIfA and BAJR directories
- examining the list of authors in OASIS
- searching various charity directories for terms such as ‘archaeology’
- examining University archaeology programmes
Contacts were found via websites for all the potential participants. In total, there were 2,017 contacts in the initial list. An introduction message was created for each of the two surveys, outlining who should fill them in. These also included links to PDFs of the questions so that potential respondents could see what they would be filling in.
The organisation survey was distributed to that list via email on November 30th, 2020. Instructions in the email explained the project, provided a link, and asked organisations to pass on the link to the individuals survey to their employees. Both FAME and CIfA posted news articles on their websites and encouraged their member organisations to fill it in. ALGAO sent messages to their members. CIfA also sent a message to all of its individual members. Links to the surveys were also posted on social media by Landward Research and many of the partner or supporting organisations, including tweets and posts in Facebook groups such as BAJR's. The survey was sent to various university archaeology societies through their Facebook pages. Automatic reminder emails were sent by Novisurvey two weeks later. Of the 2,017 contacts, 49 bounced back as undelivered and 77 opted out/declined to participate.
Beginning in January 2021, the project team looked at the responses and started to target non-respondent organisations for follow-up. Emails were sent and, in some cases, calls were made to non-respondent organisations. For individuals, reminders were sent out by CIfA, FAME and ALGAO. Links were reposted on social media.
The survey closed on January 20th, 2021.
Data Cleaning
There were 195 responses to the organisational survey and 1,312 to the individual survey. The data were downloaded from Novisurvey as an Excel spreadsheet and examined. For organisations, there were several duplicate responses and several responses that had been started but in which no information was provided. Once those were removed, we were left with 169 usable responses from organisations.
It is important to note that not all respondents answered all the questions they were presented with:
‘I didn’t have permission from our HR Department to complete elements of the About Our Staff page so have had to leave blank.’
– Respondent
The answers were further examined to see if there were any notable errors in responses. Several minor ones were found and corrected:
- Answers given as percentages when whole numbers were required. This occurred on staff location, where several small organisations entered 100 to mean that 100% of their staff worked in a certain location. This was caught because their per-location staff numbers did not match their total staff numbers; phone calls and emails to the organisations concerned confirmed the error.
- For a very small number of responses, the survey returned numeric answers counted down from 10,000. So when a respondent put in working 2 months, the form recorded 9998. Similarly for organisations, if they put in having two staff, it came back as 9998. This was confirmed by contacting some of the organisations to see if they indeed had 2 staff or 9998 – they had two. This appears to be an error in Novisurvey. As this was an easily identifiable error, these values were corrected in the responses.
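As a sketch of the correction applied for the second error: the value recorded appears to have been 10,000 minus the entered number, so the fix is a simple reversal. The cut-off used below is an illustrative assumption (any genuine answer was far smaller than 9,000).

```python
def fix_countdown_error(value, cutoff=9000):
    """Reverse the Novisurvey counting-down error.

    Implausibly large counts (e.g. 9998 recorded for an organisation
    with two staff) are mapped back to the intended value; ordinary
    answers pass through unchanged.
    """
    if value >= cutoff:
        return 10000 - value
    return value
```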
Once the data were cleaned work began on presenting the results and analysis of the individual responses.
Data Analysis
The individual questions underwent further analysis. Individual responses were placed into a combined all-responses document, as well as into four other documents based on whether respondents considered themselves: a professional, a student, both a student and a professional, or a former archaeologist.
Using the software R, each column of data in the cleaned Excel spreadsheet was compared to every other column to derive p-values, which were used to evaluate possible patterns between responses. To accomplish this work in R, two libraries were used: openxlsx and plyr. All answers were converted to categorical data so that Chi-square tests could be run on them, e.g. individual ages were put into 5- and 10-year categories. Most of the data was already categorical. Cells with null values would throw errors in R, so they were replaced with the text value ‘UNANSWERED’.
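The preparation steps described above – binning ages into categories, replacing nulls, and cross-tabulating pairs of columns – can be sketched as follows. The project's analysis was done in R with openxlsx and plyr; this Python version is only an illustration, and the bin width and labels are assumptions.

```python
def bin_age(age, width=5):
    """Place an individual age into a 5-year category, e.g. 32 -> '30-34'."""
    if age is None:
        return "UNANSWERED"  # null cells replaced so they do not throw errors
    low = (age // width) * width
    return f"{low}-{low + width - 1}"

def contingency_table(col_a, col_b):
    """Cross-tabulate two categorical columns ahead of a Chi-square test."""
    table = {}
    for a, b in zip(col_a, col_b):
        table[(a, b)] = table.get((a, b), 0) + 1
    return table
```

The resulting table for each pair of columns is what a Chi-square test is run on to derive the p-value.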
p-values
First developed by Ronald Fisher in the 1920s, the p-value provides an index of the evidence against the null hypothesis (that two variables are not related). Originally, Fisher only intended the p-value to determine whether further research into a phenomenon could be justified (Fisher 1925). He saw it as one piece of evidence to support further investigation, rather than as conclusive evidence of significance. This is how p-values are used in this report: as an indication of the need for further investigation. Given the drawbacks of p-values, we used an arbitrary cut-off point of p = .001 instead of the more commonly used .05. This lowers, but does not eliminate, the chance of false positives. Readers should take the following into account when reviewing these results:
1. p-values are indications of the need for further research, not indications of significance.
2. The survey is not a random sample of the target population. Any p-value results represent the respondents, not all archaeologists. However, with such large samples of organisations and individuals, it is highly likely that the sample is representative and thus an accurate representation of the profession.
Over 2,000 combinations were returned with p < .001. However, many of those were the result of mutually exclusive answers. For example, if a respondent's age was 30-34, then that column will return low p-values against the columns for ages 25-29, 35-39, etc., because each respondent could only have one age. There was a relationship with the other responses, i.e. they did not answer them, but those are not informative results and were discarded. After sifting through the responses, less than a quarter, roughly 400, were worth further examination. These were then examined in more detail, and graphs or tables were made to visualise the relationships where appropriate (not for all 400 combinations). There was overlap in some of these results, e.g. multiple relationships between ages and gender could all be examined in one graph/table of gender by age.
Questionnaires and Messages
The list of questions can be found in these PDFs along with the text that respondents were shown explaining the project:
Methodology for calculating sector size
Respondent organisations were asked to provide counts of their staff – including all staff directly employed in support of archaeology, which could include administrators, accountants, etc.
Full-time equivalent
Respondents were asked to give these numbers as full-time equivalents (FTE), which counts workers based on hours worked, e.g. two members of staff each working 50% of full-time hours would equal one full-time equivalent person (0.5 + 0.5 = 1). The reason for this is to avoid double counting. If one person worked part-time for one organisation and part-time for another, and we asked both organisations for whole-person staff numbers, we would have counted that person twice, inflating the figures. In our individual survey, roughly 5% of respondents provided information from more than one archaeology job, so not using FTE would inflate the numbers by ~5%.
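The FTE conversion above amounts to summing each worker's fraction of a full-time week. A minimal sketch, assuming a 37.5-hour full-time week (the actual full-time baseline is not specified here):

```python
def fte(weekly_hours, full_time_hours=37.5):
    """Convert a list of weekly hours worked into an FTE count."""
    return sum(hours / full_time_hours for hours in weekly_hours)
```

Two staff each working 18.75 hours (50% of full-time) give `fte([18.75, 18.75]) == 1.0`, i.e. one FTE rather than two people.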
Responses
Using the organisations' categories of work (see Section 1 of the results), we multiplied their reported staff numbers by the percentage of work in each area to get counts for each category, e.g. if an organisation had ten staff and undertook 70% of its work as contracting and 30% as consulting, we estimated that it had 7 contracting staff and 3 consultants.
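The allocation step above can be sketched as a direct multiplication of the staff total by each reported work percentage (category names are illustrative):

```python
def allocate_staff(total_staff, work_percentages):
    """Split a staff total across categories by reported % of work."""
    return {area: total_staff * pct / 100
            for area, pct in work_percentages.items()}
```

For example, `allocate_staff(10, {"contracting": 70, "consulting": 30})` gives 7.0 contracting staff and 3.0 consultants.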
NOTE: for future surveys, such as State of the Archaeological Market and the next Profiling the Profession, we will consider directly asking for the numbers of staff working in each category, rather than asking about percentages of work. Cleaning this data was very time consuming, and introduced some uncertainty to the results. We do not recommend gathering data using the precise method used for this questionnaire (asking for proportions of work and then applying those percentages to the total number of staff).
108 organisations undertook work in multiple areas, and many put in 1% or 2% for some of their work in areas such as academia (see Table 1.1.1 in 1.1 Size of UK Archaeology). Indeed, many of those working in development-led archaeology publish monographs and undertake research, which is the sort of work defined by that category. However, we used two external sources of data for counting the size of the sector, one providing information on the size of university archaeology departments, which meant that for the purposes of this count ‘academia’ meant working at a university and not undertaking development-led archaeology while doing so. Because of this, results of 10% or less were evenly redistributed to respondents' other categories, which ensured the results were comparable to other data sources, i.e. only those working at universities were classified as academia.
NOTE: classifying academic work as only taking place at universities does not reflect the views of the authors but is a necessity for obtaining the most accurate total number of working archaeologists. See ‘External data’ below.
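The redistribution described above can be sketched as follows; the even split across a respondent's remaining categories is our reading of ‘evenly redistributed’, and the function and category names are illustrative.

```python
def redistribute_small_share(percentages, category="academia", cutoff=10):
    """Remove a small reported share and spread it evenly over the rest."""
    share = percentages.get(category, 0)
    if share == 0 or share > cutoff:
        return dict(percentages)  # nothing to redistribute
    rest = {k: v for k, v in percentages.items() if k != category}
    bump = share / len(rest)
    return {k: v + bump for k, v in rest.items()}
```

For example, a respondent reporting 10% academia, 60% contracting and 30% consulting becomes 65% contracting and 35% consulting.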
Five organisations were contacted for clarification, as they reported that a significant percentage of their work involved local heritage management but were believed by the research team not to undertake this work, or only to a small extent. After confirming with the organisations, that number was moved to other categories for four of them; for one it was reduced to 5%.
External data
We used two external data sources to facilitate our estimations of the sector's size: the Higher Education Statistics Agency (HESA) and the ALGAO counts of local authority archaeologists.
For archaeologists based at universities, we used the data provided by HESA. All universities report their staff numbers to HESA, so this is an exact count of archaeologists working for universities. We gathered the data on academic and non-academic positions from the Archaeology Cost Centre. For non-academic staff, we removed the counts from universities with development-led contracting services, e.g. the Universities of Durham and Leicester. For the University of Cambridge, their development-led staff were listed as academic staff and so were removed from that count. This was done to avoid over-counting against the other categories. We had to check and correctly allocate the data provided for University College London. In total, there were 725 academic and 105 non-academic archaeologists employed by universities. The non-academic staff include administrators, technicians, lab managers, etc.
ALGAO data were obtained from their annual counts of staff in England, plus hand counts of staff in Wales and Scotland. Both the ALGAO and HESA data were in FTE.
Estimated missing archaeologists
With exact counts for ‘academic’ (in a very narrow definition) and local authority curation services (not all delivered by local authorities), we applied the following methods to estimate or count the other sub-sectors.
National bodies
For national bodies, we received responses from all except two organisations, and one response included non-archaeology-related staff. The missing organisations were contacted and exact numbers obtained.
Museums and visitor attractions
For museums and heritage-site-based archaeologists, we counted the staff at the national museums, i.e. the British Museum, the National Museums in Wales and Scotland, and the Ulster Museum. For non-national museums, we estimated that non-respondents had similar staffing levels to respondents: an average of 1 FTE archaeologist on staff. We multiplied this by the number of organisations that did not respond but which we considered likely to have archaeologists on staff, and combined this with the hand-count data and the respondents to generate an estimated total of 170 for the whole sub-sector.
Contractors and consultants
We merged contractors and consultants into a single category to represent those working in development-led archaeology, but not in a local government oversight capacity. We contacted all of the organisations that did not respond, this time asking just for staff numbers. For those that did not respond but were CIfA Registered Organisations, we used the numbers presented in the CIfA Yearbook or publicly available information, e.g. a company's website saying ‘this is a partnership of two archaeologists’. Most responded or had publicly available information, but we were left with 55 organisations that did not. Looking at their websites and the number and types of reports they had produced in local HERs, we estimated their size based on comparisons with organisations of similar profile for which we did have size data. We thus estimated that these 55 organisations employed 130 staff.
For self-employed archaeologists, we used the respondents who identified as self-employed in the individual survey (~100 FTE) and a multiplier of five to estimate that there are 500 self-employed archaeologists. The individual survey, as discussed elsewhere in this report, is a representative sample of the sector, so we assume that the sample scales up, i.e. we sampled 18.75% of the sector, so the true number is approximately 5.3 times the number that responded, which we have rounded to 500.
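The scaling above is simply the respondent count divided by the estimated sampling fraction; a one-line sketch:

```python
def scale_up(respondent_fte, sampling_fraction):
    """Population estimate from a sample, e.g. 100 FTE at 18.75% coverage."""
    return respondent_fte / sampling_fraction
```

Here `scale_up(100, 0.1875)` gives approximately 533, which is rounded to 500 in the report.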
There is a risk that the individual survey under- or over-sampled self-employed archaeologists. We do know from previous surveys, and from the list gathered for this survey, that there are 250+ self-employed specialists, and several organisations only use self-employed workers for fieldwork. Any error could not be greater than ~250 fewer self-employed archaeologists.
Public archaeologists
We used the same method as for self-employed archaeologists. We had roughly 50 FTE responses to the individual survey and, as above, used a multiplier of 5 to get an estimated total of 250 FTE for this category.
References
Fisher, Ronald Aylmer. “Theory of statistical estimation.” Mathematical Proceedings of the Cambridge Philosophical Society. Vol. 22. No. 5. Cambridge University Press, 1925.
Image Credit
Inside my Archaeology Bag by Terry Brock. From Flickr CC BY NC 2.0
Version control and change log
As this is a digital document, we may update parts of this page in the future to account for corrections or the need for clarification. Please include the version number when citing:
Version: 1.1
Change log: no changes
CREDITS
Title: Profiling the Profession
2020 Authors: Kenneth Aitchison, Poppy German and Doug Rocks-Macqueen
Published by: Landward Research Ltd
Version Date: 2021
ISBN: 978-0-9572452-8-0
DOI: https://doi.org/10.6084/m9.figshare.14333387
License: CC BY SA 4.0 for all text and figures. Header images are from different sources; check the image credits for their specific licensing.
2020 funders: Historic England, with support from Historic Environment Scotland, CIfA and FAME.
Questions about Profiling the Profession: enquiries@landward.eu