Organizational Benchmark Survey Case Study - Identifying Need


This organizational benchmark survey case study from PGA Group Consulting Psychologists gives an insight into the construction, implementation and outcomes of a benchmark survey.

Making effective decisions and measuring their outcome formed the theme of the survey. This organizational benchmark survey case study shows how the methodology and process employed helped our client build a useful tool to identify customer needs, focus priorities, and develop new business opportunities.


Client

A UK-based management consultancy company.


Business

Management training and development.


Location

UK (the organizational benchmark survey was conducted in English; the final report and the client’s feedback were also in English).


Background

This organizational benchmark survey was an unusual and novel assignment for us:

Unusual, in that the client was, in some respects, a competitor of ours. They had little real experience of survey work and had approached us on recommendation. The client decided to place their trust in us and suggested we share findings which were relevant to us both (an arrangement to be reflected in the fee). Being flexible, we agreed with the client to take this a stage further by incorporating into the organizational benchmark survey some items which were of relevance only to us.

Novel, in two main respects: we had an opportunity to try out some new ideas of our own in relation to current views on scale construction expressed by psychologists in the human potential movement; and we became acquainted with some of the finer aspects of direct mail marketing.

The enduring impression we were left with was how much fun this project was. Obviously, this was serious business. Nevertheless, the whole project was really enjoyable. We had a client who was decisive and open. They were patient about the sheer amount of research and preparation that had to be done. After all, any building needs a firm foundation upon which to stand. Everything ran as smoothly as anyone could reasonably expect. And we achieved good results and made our client happy.


Organizational Benchmark Survey Aims and Objectives

Our client’s objective was to gain marketing information by surveying Human Resources Directors and Managing Directors sampled from their population of interest. Further, our client wanted to build a needs-analysis tool to help them win new business and strengthen their position. They envisaged that the organizational benchmark survey data could help them with the latter goal also.

Our client had the idea of building benchmark data so that the needs of any prospective client organization could be seen relative to its peers, and in absolute terms. Our client wanted to be able to mine their database in the future to analyse trends.

Fortunately, our client had done a lot of homework prior to the start of the organizational benchmark survey project. They knew exactly what they wanted to find out, and the size and composition of their organizations of interest.

One area in which our client was unsure, however, was how best to segment their sample population by industry sector. They realised that the Standard Industrial Classification (SIC) of economic activity was too complex and unwieldy for their purposes. We suggested that they adopt a condensation of the SIC, published by the Market Research Society (MRS). This they did, and it proved useful. PGA Group Consulting Psychologists’ Business Activity Classification (BAC) - SIC Alternative features the condensation used.


Methodology

A self-completion survey questionnaire was developed to gather the information sought. Given the target respondents’ seniority, our client decided to mail out the organizational benchmark survey pack for completion and return using prepaid business first-class mail.


Pilot Work

A number of issues had to be resolved before the organizational benchmark survey questionnaire could be finalised. This would involve the testing of pilot material.

The client had determined the content, i.e. what they wanted to know. Suitable survey scales and items were constructed and tested.

Psychologists in the human potential movement had experimented with test item construction without the use of traditional number scales. A typical number scale might be, “Mark your level of agreement on this six-point scale between ‘1. Agree’ and ‘6. Disagree’”. Such numbering was in common use, even with semantic scales, e.g., “Do you: 1. Agree Strongly, 2. Agree Broadly, 3. Agree Slightly,... 6. Disagree Strongly,” and so on.

The human potential movement psychologists were of the opinion that numerical rating scales may inadvertently ‘lead’ a respondent while making their choices. The individual might be influenced in selecting a value from the response categories offered because of the numbers they are labelled with. Further, it had been argued that confusion for some respondents may be intensified because individual questions or items, as well as blocks or sections of questions, are often numbered.

While the above may or may not be true, it is probable that the task of judging a shade of agreement would place a cognitive burden on the respondent, given the abstract nature of the task. Of course, a semantic differential scale could be used without a numbering system, provided that responses could be coded accurately.

We decided with our client to experiment at the pilot stage with removing numbers from the organizational benchmark survey completely, except for items which required a ranked order response (for obvious reasons).

It was felt that a sample population of likely high cognitive ability would be relatively unaffected by the approach. As for the response coding, it was decided to scan completed questionnaires, thus obviating the appearance of numerical coding on the organizational benchmark survey form itself.

One other novel idea we wanted to try was to elicit satisfaction/dissatisfaction and need/lack of need by computing a differential score from two distinct responses to the same item (with very little or no additional cognitive or time overhead being placed on respondents). For example, we could state the positive proposition ‘Our organization is fulfilling its true potential.’ By asking both for a shade of agreement (how it is) and for the shade of importance or relevance of the issue or topic to the respondent organization, we may infer the degree and direction of satisfaction (relative to how they would like it to be) and the potential interest or motivation to do something about it if the relevant situation is unsatisfactory.
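As a minimal illustration of the idea only, the Python sketch below derives a differential score from the two responses to one item. The category labels and the numeric values attached to them are assumptions made purely for the example; they are not the scales or scoring rules used in the actual questionnaire.

    # Minimal sketch of a differential score: agreement (how it is) minus
    # importance (how much the topic matters). Labels and values below are
    # illustrative assumptions, not the survey's actual scales.

    AGREEMENT = {
        "Disagree Strongly": 1, "Disagree Broadly": 2, "Disagree Slightly": 3,
        "Agree Slightly": 4, "Agree Broadly": 5, "Agree Strongly": 6,
    }
    IMPORTANCE = {
        "Not Important": 1, "Slightly Important": 2, "Moderately Important": 3,
        "Important": 4, "Very Important": 5, "Critically Important": 6,
    }

    def differential(agreement_label: str, importance_label: str) -> int:
        """Negative values flag a potential need: the topic matters more than
        the organization is judged to be performing on it."""
        return AGREEMENT[agreement_label] - IMPORTANCE[importance_label]

    # A respondent only slightly agrees that 'Our organization is fulfilling
    # its true potential', yet rates the topic as critically important:
    print(differential("Agree Slightly", "Critically Important"))  # -2, a likely need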

Our client was especially excited by the above approach because they realised that they might be able to persuade a prospective client who did not feel the need to address a given issue in their organization to think again, because their competitors were doing so (a ‘fact’ demonstrated by the ‘social proof’ of the benchmark data).

The pilot work suggested that these ideas would be workable, and it was decided to incorporate them into the final form of the organizational benchmark survey questionnaire document.


The Organizational Benchmark Survey

The final form of the organizational benchmark survey document was designed in-house by us for eventual scanned input to our relational database management system (RDBMS).

We calculated the necessary sample sizes, taking into account assumptions made about response rates. This was to help ensure that a meaningful analysis of the organizational benchmark survey data could be made.
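To illustrate the kind of arithmetic involved (the figures below are assumptions for the example, not the targets or rates used in the project), a simple Python sketch:

    import math

    def required_mailout(target_returns: int, expected_response_rate: float) -> int:
        # Number of survey packs to post in order to expect `target_returns`
        # usable completed forms, given an assumed response rate.
        return math.ceil(target_returns / expected_response_rate)

    # e.g. if roughly 200 usable returns were wanted for sector-level analysis
    # and a 'cold' direct-mail response rate of about 5% were assumed:
    print(required_mailout(200, 0.05))  # 4000 packs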

Our client identified and rented a suitable respondent mailing list. The list contained some interesting commercial data which would later prove useful in the analysis. List data were prepared and read into our database system.

Our client consulted with us on the construction of a suitable cover letter to accompany the organizational benchmark survey document. In particular, we showed our client the key factors to be addressed in the survey pack which can influence the rate of return of usefully completed survey forms.

We supplied the survey form artwork to a commercial printer for document production. Having the documents printed commercially was less expensive and more efficient than outputting to a laser or inkjet printer, for instance.

Our client produced the personalized covering letter and mailing labels in-house by merging information from within the rented list database, and stuffed the survey packs themselves. The mailing out day was chosen carefully, in line with recommendations from the direct mail industry; this is because the day of week for mailing out can influence the response rate.

Completed organizational benchmark survey forms were received at our reply-paid address. Returned forms started to arrive four days after the survey pack was mailed out. Returns peaked at three weeks from the date of mail out, much in line with industry expectations.

The return rate achieved was an impressive 8.4% - very good, especially as the mailing had gone out ‘cold’ to a busy, high-status sample population.

Returned forms were scanned on arrival. The scanning system we use virtually eliminates input error, and coding error is virtually non-existent. Data are available for immediate analysis from our relational database management system.

Everyone was pleased with the overall smoothness of the operation.

As an aside, it would be easy today for survey forms to be e-mailed to recipients for completion and return, or for survey questionnaires to be completed via the Internet. Of course, a number of additional issues to those concerning survey design and analysis would need to be addressed. Nevertheless, the technology could bring greater benefits and opportunities. Sophisticated rules or adaptive questions (based on the survey answers input so far) could be constructed. Data could be analysed on a completely ad hoc basis by accessing the database via a secure Extranet, 24 hours a day, seven days a week. Feedback of selected data or reports could be provided to respondents via the Internet also. There is potential here for an interesting, interactive survey experience to take place.


Analysis

An ad hoc descriptive analysis of the returned data was made, using a computer spreadsheet application, as the completed organizational benchmark survey forms were received. However, serious analysis of the data using IBM SPSS was carried out once the expected quantity of returned survey forms had been received.


Results

Making effective decisions and measuring their outcome formed the theme of the organizational benchmark survey. The issues addressed were confirmed as important to a representative sample of private sector and not-for-profit organizations.

Quality in human resource decision-making has implications for the whole organization. The value of innovation in identifying where investment is best directed is an essential component of business process re-engineering and other ideas, including the learning organization.

The results of this organizational benchmark survey proved to be highly interesting and potentially very useful.

An astonishing result was that few organizations stated that they measured the return on investment in the development of their people, despite the huge interest expressed in this topic in the management literature of the day. At the time of the original organizational benchmark survey, it was estimated (by extrapolation) that approximately £9,000,000,000 (GBP 9 billion) was being spent annually in the UK on training and development with little or no measurement of intervention effectiveness being made.

The detailed findings focussed on:

  • A structured analysis of an individual organization’s standing, showing which issues are identifiable as really important and where resources may best be directed.

  • A benchmark indication of how each respondent organization compared with others in the survey, i.e. which issues are of common importance and where their organization is placed in relation to them (a minimal sketch of such a comparison follows this list).
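By way of illustration only, the Python sketch below shows one simple way such a benchmark comparison might be expressed. The scores are invented for the example and are not survey data.

    from statistics import mean, pstdev

    # Hypothetical peer ratings for a single survey item, plus one respondent
    # organization's own score (invented figures, not survey results).
    peer_scores = [4.1, 5.0, 3.8, 4.6, 4.4, 5.2, 3.9]
    org_score = 3.5

    benchmark_mean = mean(peer_scores)                        # the absolute benchmark
    z = (org_score - benchmark_mean) / pstdev(peer_scores)    # relative standing

    print(f"Peer mean {benchmark_mean:.2f}, organization {org_score:.2f}, z = {z:+.2f}")
    # A clearly negative z flags an issue on which the organization lags its peers.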

All of the 34 items in part one of the organizational benchmark survey (which addressed organizational issues, management development, management effectiveness, recruitment and selection, and residual organizational issues) were rated as being of high importance by the sample group as a whole. This indicates, in part, that the items selected for inclusion in the survey were of high relevance to respondents in the survey population as a whole.

It should be noted that ratings were derived from participants’ perceptions about their organization’s performance (in terms of the extent to which they agreed/disagreed with the positive aspects of each topic) against the relative importance they placed on each topic in terms of their current experience within their organization.

Perceptions are not to be confused with objective fact of course; as stated in Korzybski’s dictum, “The map is not the territory” (interested readers will find a discussion related to this topic on our site in the article Self and Inner Self: Existence, Identity and Being - “I Am Me”).

It may be worthwhile, we would suggest, to explore the factors which could have contributed to respondents’ ratings. Where topics are highlighted as potential issues, further consideration might be given to the potential real existence of the issue, its impact on the effectiveness of the respondent organization and how it may best be addressed.

In terms of strengths, respondent organizations might usefully consider the following questions:

  • Does the organization recognize and capitalize upon these strengths?

  • If not, how could better use be made of them?

  • Would it be beneficial to develop some of these strengths further and, if so, how could the organization go about doing this?

In terms of potential issues, respondent organizations might want to pay attention to the interaction between issues:

  • What are the potential consequences of not paying attention to these issues?

  • What might the benefits be of paying attention to these issues?

  • How would the organization best go about addressing these issues?


Outcome

Our client’s aims and objectives were satisfied. Interesting and useful data were obtained. The methodology was proven. Further, a useful database of information which would form the basis of future organizational benchmark surveys and trends analysis was built.

Our client went on to develop an effective organizational needs analysis instrument, based upon the methodology used and the survey findings obtained. The instrument maps the relative performance on, and the importance of, key topics concerning organizational effectiveness, both for the respondent organization and for the relevant sample group as a whole. Comparisons can readily be made between the two. This has since proved useful in helping our client obtain new and repeat business, and in maintaining and strengthening their position in their marketplace.


Links

You may find these links on our site to be of interest:

Employee Attitude Survey Case Study - Performance Turnaround
This case study is about how negative performance factors were identified and addressed to the benefit of the organization and its people.

Stress Audit Case Studies - Successful Stress Management At Work
The stress audit can be classed as a form of specialised survey. These case studies are about how the negative effects of stress at work were identified, addressed and relieved to the benefit of three organizations and their people.

Self and Inner Self: Existence, Identity and Being - “I Am Me”
This article discusses, among other things, the relationship between perception and reality. It identifies some of the difficulties in arriving at objective truth via observation or reasoning. Explained is Korzybski’s dictum, “The map is not the territory.”



I hope you have found this organizational benchmark survey case study from PGA Group Consulting Psychologists informative, useful and beneficial.

Should you have any questions, or would like further information, my team and I would be very happy to help. Details and an e-mail form to contact/locate us can be found here: www.pgagroup.com/contact-pga-group.html

Peter Gerstmann
Principal
PGA Group Consulting Psychologists - www.pgagroup.com

