Educational Technology & Society 5 (2) 2002
ISSN 1436-4522

Technology in Organizational Learning: Using High Tech for High Touch

Jane B. Maestro-Scherer
Robert E. Rich
Clifford W. Scherer
Schelley Michell-Nunn

Fathom Inc., 10 Boiceville Road, Brooktondale, NY, USA
Tel: 607-539-6372
JM24@Cornell.edu

 

ABSTRACT

This study describes the use of technology to enhance an experiential adult learning process, which occurred in a participatory organizational climate assessment. In this case, computer software and hardware capabilities enabled greater hands-on involvement by employees, and promoted the self-discovery of knowledge that was then converted into action strategies for organizational improvement. It is a pilot effort to join survey research with participatory organizational change. The setting is a small city government of approximately 420 employees in the Northeastern United States. The application is an organization-wide assessment of the work environment with special emphasis on issues of diversity.

Keywords: Organizational change, Organizational learning, Self-discovery, Survey research


Introduction

Technologies have generally been successful in transcending distance and time. It is not surprising, therefore, that the commonly cited advantages of new technologies in learning relate to distance neutralization and asynchronicity. Examples include bringing video into the classroom and using the Internet to link students at remote sites. Groups and individuals can access information from remote locations at any time of day via the Internet. They can “meet” with others in multiple locations while sitting at their computer terminals or in ‘wired’ meeting rooms, and they can share ideas through email or groupware without the restrictions of time, the constraints of multiple geographic locations, or the complications of scheduling.

The presence of technology in our everyday lives is not, however, without cost. As we have moved to increasing use of technology in learning environments, there is a potential loss of interpersonal contact due to the lack of visual and auditory cues, and a potential loss of active learner engagement when learners are removed from direct involvement with other learners, teachers or experts. Technologies in adult learning have generally been used to provide images of individuals separated by time and space, or to act as projectors showing the results of analysis completed by experts and conveying conclusions, questions or agendas predetermined by the presenter. These uses put the learner in the passive role of receiving information. The point is that the technology easily bridges time and space, and facilitates the transfer of prepared information, but does not appear to facilitate interaction. Experience suggests, however, that new technologies can enhance interpersonal participation in the learning process. Advances in computer software and hardware, along with group process techniques, can be used to create an interactive learning experience that engages learners in generating a collective knowledge base from which to construct action plans. Active involvement of a group in knowledge creation has been found to be an important part of producing organizational change (Schein, 1996). The challenge has been to develop a process that effectively utilizes new technologies to significantly increase learner engagement in group activities.

The case described here can be categorized as a participatory example of pragmatic action research, which attempts to increase the ability of the involved community or organization members to control their own destinies more effectively and to keep improving their capacity to do so (Greenwood & Levin, 1998). The capacity of the organization is increased through learning about both the process (the ‘how’ of conducting the research, such as developing a valid survey instrument) and the content (the ‘what’ of the research focus, in this case, the workplace environment).

 

An Overview of The Case

The case study is one in which employees of a city - ranging from youth workers, tree trimmers, and firefighters to senior managers - took on the task of examining and understanding their work environment. The chronology of the participation approach was as follows:

  1. Formation of a Work Environment Task Force
  2. Design and implementation of a survey by a Survey Design Team recruited from a “diagonal slice” of the workforce
  3. Convening of data analysis sessions where employees determined the issues they considered important for further examination
  4. Production of recommendations by Issue Action Teams for final presentation to the City Common Council.

Out of approximately 420 employees, the process engaged nearly 150 as voluntary participants. The work environment survey designed and implemented by the organization’s employees achieved a response rate of 68%.

This case originated as a result of alleged incidents of diversity-related oppression and intimidation in the work environment. In response, city leadership created a Work Environment Task Force and charged it with conducting a study to determine whether the reported incidents were widespread in the fabric of the organization and, if so, to identify what could be done to take corrective action. The initial step in implementing the project was for the city to assemble a Survey Design Team that would work with the consultants to draft the survey instrument. The Survey Design Team was ultimately composed of a representative sample of the organization, ranging from clerks to department heads. Twenty-seven individuals volunteered from the 12 separate departments.

The consultants conducted five half-day design sessions with the group to construct the final survey instrument. The Survey Design Team sessions began with brainstorming and prioritizing questions for the survey. The consultants also asked the group to predict findings related to the questions. The purpose of this step was to identify current assumptions about what was going on in the organization, which could be used later in data analysis. The range of questions moved well beyond the initial focus and ultimately produced a comprehensive work environment questionnaire. The questionnaire was then pre-tested with a different group to help eliminate questions that did not produce significant variance and to make revisions where confusion or gaps were reported.

The survey was then administered to the whole organization with a subset of Survey Design Team members serving as survey administrators. Sessions were set up in the departmental locations where the survey project was explained and its confidential nature reinforced. The consultants attended these sessions and physically received the completed surveys to further reassure employees that their individual responses would not be seen by their colleagues or superiors. The consultants then performed the normal tasks of data entry, verification and cleaning.
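
The article does not specify what tools the consultants used for data entry, verification and cleaning. Purely as an illustration, a minimal sketch of such a verification pass, assuming pandas, a hypothetical raw export file, and a hypothetical item-naming convention, might look like this:

```python
# Illustrative sketch only: the article does not name the tools used.
# The file name and the "q1", "q2", ... column convention are hypothetical.
import pandas as pd

LIKERT_MIN, LIKERT_MAX = 1, 5  # assumed 5-point response scale


def clean_survey(path: str) -> pd.DataFrame:
    """Load raw responses, drop duplicate records, and blank out-of-range codes."""
    raw = pd.read_csv(path)

    # Remove accidental duplicate records introduced during data entry.
    deduped = raw.drop_duplicates()

    # Treat out-of-range codes on Likert-style items as missing values.
    likert_cols = [c for c in deduped.columns if c.startswith("q")]
    for col in likert_cols:
        values = pd.to_numeric(deduped[col], errors="coerce")
        deduped[col] = values.where(values.between(LIKERT_MIN, LIKERT_MAX))

    return deduped


if __name__ == "__main__":
    cleaned = clean_survey("survey_raw.csv")  # hypothetical file name
    print(cleaned.describe())
```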

Once the database was assembled, interactive data analysis sessions were convened with various employee groups. In these sessions, employees were first presented with demographics to understand how representative the respondent group was of the overall workforce. They then moved into their own exploration of the data, assisted in technical aspects by the consultants. Experience shows that non-researchers can become remarkably adept at data analysis. They are clever in the use of demographics and inter-question relationships to further explicate generalized results. In one especially gratifying data analysis session, for example, one of the city government’s tree trimmers quickly learned to read frequency tables. He soon crossed over to cross-tabulations and, before the session was over, was leading the data interpretation for the group with multi-layered tables. Reading statistical tables is not a noteworthy skill in and of itself. Most important is the participative and interactive nature of these sessions. A level of sensitivity and complexity is engendered by the differences in role positions and organizational experiences of the group members. This contextualization enhances analysis of survey results. No outside consultant, no matter how skilled, can match the richness of interpretation provided by those immersed in the culture and the jobs on a daily basis.
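
The article does not identify the statistical package used in these sessions. As an illustration of the kinds of tables described (a frequency table, a cross-tabulation, and a “multi-layered” table), a sketch using pandas with hypothetical column names might look like the following:

```python
# Illustrative sketch: the data file and the column names ("department",
# "is_supervisor", "feels_respected") are hypothetical.
import pandas as pd

responses = pd.read_csv("survey_clean.csv")

# A simple frequency table for one survey item.
print(responses["feels_respected"].value_counts().sort_index())

# A cross-tabulation of that item against a demographic question, shown as
# row percentages so departments of different sizes can be compared.
print(
    pd.crosstab(
        responses["department"],
        responses["feels_respected"],
        normalize="index",
    ).round(2)
)

# A "multi-layered" table: the same item broken out by department
# and supervisory status together.
print(
    pd.crosstab(
        [responses["department"], responses["is_supervisor"]],
        responses["feels_respected"],
    )
)
```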

A total of eight of these data analysis sessions were held, and the findings were compiled into a document that was presented back to the Work Environment Task Force. The Task Force then chose three priority areas to address immediately: Oppression and Intimidation, Supervision and Discipline, and Health and Safety. The consultants then designed and conducted Issue Action Team sessions for these areas, involving, again, diagonal slices of the workforce. In these sessions, participants reviewed findings, further analyzed data, and formulated recommendations for action. These recommendations were presented to the Work Environment Task Force and then to the City’s Common Council. The Common Council adopted the recommendations and allocated resources for implementation. Special improvement teams have since been established to develop more detailed action plans and oversee implementation.

 

The Action Research Methodology

The action research method used here is designed as a development tool in organizations (Rich & Maestro-Scherer, 2001). It assumes that the organization already recognizes that a problem, concern or need for some type of change exists. In these situations, more often than not, organizational members come to a realization that information held collectively within their organization needs systematic collection, processing and analysis if they are to have a better understanding of the circumstances and issues relevant to their concerns.

The approach used in this case depends upon organizational members themselves to conduct the research. This collaboration forms a partnership with organizational members as experts in content and consultants as experts in process and technical areas. Figure 1 expresses this relationship.

 


Figure 1. Action research (AR) working relationship


Survey research is used as a central element in this process. “Surveys are especially useful for the systematic measurement of attitudes, beliefs and values across a sample of the population under study” (Whyte, 1991, p. 269). Surveys provide an avenue of expression that is available to all organizational members. If response rates are high, aggregate results may represent the entire organization. Competence in survey design and response rates, of course, affect the validity and reliability of outcomes.

Whyte (1991) cautions that surveys provide an unreliable guide to understanding human actions. This limitation calls for participatory methods in absorbing and making meaning of survey results and supports the technique of data analysis in face-to-face settings where members of the organization can use their knowledge to augment interpretation.

The appropriateness and effectiveness of a survey research process begin with the determination of what questions to ask and then how to ask them. In the traditional model, these decisions are left largely to the outside researchers, with some organizational consultation. In this model, the consultants and the members of the organization take on this task together. Decisions about content are left to employees. Decisions about research design are left largely to the consultants. The work provided by the consultants is subject to the critical review of the group in terms of the clarity of the language used vis-à-vis the survey population.

Once data has been collected, verified and entered into a database, employees are directly engaged in analysis. Groups are convened with the data loaded on a computer and the display projected onto a large screen. Participants use the original survey questionnaire to make queries, and everyone sees the result at the same time. Advances in statistical software allow fast production of frequency or cross-tabulation tables. As different types of tables are requested and produced, participants are given training in how to read and analyze them. As the group deepens its analysis, participants become familiar with basic concepts of data manipulation.
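
The article does not describe the specific interface used for these projected, on-the-spot queries. A rough sketch of such a session-driven query loop, again assuming pandas and hypothetical file and column names, could be as simple as the following:

```python
# Illustrative sketch of a live query loop for a projected analysis session.
# The software actually used is not named in the article; the file and
# column names here are hypothetical.
import pandas as pd

responses = pd.read_csv("survey_clean.csv")


def run_query(item, breakdown=None):
    """Return a frequency table, or a cross-tabulation if a breakdown is given."""
    if breakdown:
        return pd.crosstab(responses[breakdown], responses[item], normalize="index")
    return responses[item].value_counts().sort_index().to_frame("count")


while True:
    item = input("Survey item (blank to quit): ").strip()
    if not item:
        break
    breakdown = input("Break down by (optional): ").strip() or None
    try:
        print(run_query(item, breakdown))  # projected for the whole group to see
    except KeyError as missing:
        print(f"No column named {missing} in the data set")
```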

Under the traditional model, additional inquiries to isolate results for sub-groups, for example, could only be produced by ‘going back to the shop’ to do additional runs. Days and weeks were often required to respond to the need for data to be presented differently to improve its relevance and explanatory power for the organization. There was no possibility of conducting a free-flowing inquiry process that allowed groups to spontaneously follow their instincts in making meaning of survey results. This discontinuity significantly limited the ability of organizations to gain an adequate understanding of their own data from which to develop action plans. Often, outside researchers constructed not only the analysis but also the recommended actions, creating a further ‘disconnect’ and possibility for error in interpretation and action. In contrast, conducting this work within an action research framework integrates the technology into the group process (see Figure 2 below).

 


Figure 2. Technology use in traditional and AR settings


Direct and continuous engagement of organizational members in the process has a further benefit for the success of the survey effort: improved response rates. In the organization involved in this case, previous surveys had produced response rates below 20%. The response rate for the survey developed collaboratively was 68%. This difference may be attributed to the extent of diffusion involved in the development process itself. Employees who were involved in the design of the survey instrument were able to brief their colleagues as the instrument was being developed, thereby creating some readiness and expectation. Further, they were able to function as survey administrators who set up sessions in their units for the surveys to be completed and, because of their intimate involvement, could explain the purpose of the survey and, in fact, individual questions within it. This added dramatically to the credibility of the effort in an environment where there was considerable skepticism about yet another survey.

 

Findings

While the use of technology has moved closer to an idea of interactivity, that interactivity is generally between people and machines. The case described in this paper examines the use of technology in a group process to increase individual participation and group learning. A number of findings and observations seem relevant. The use of interactive technology in sessions gave participants immediate feedback on questions they formulated. This immediacy appears to stimulate excitement and heighten interest. The use of interactive technology promotes the free flow of ideas, with new questions feeding off previous discoveries. Participants engage in discussions of how best to analyze the data to probe areas of concern, thus increasing the anticipation of and interest in the resulting answer provided by the data.

Research literacy (Merrifield, 1997) and confidence increase as participants begin to understand the nearly unlimited opportunities in mining the data. The interactive process enables a more complex understanding of the organization, while at the same time introducing the realization that there are limits to analyzing the data, and that there is only so much that can be understood and used. Ultimately, participants conclude that they need to focus on the most important issues and concentrate on greater depth of understanding.

Self-discovery of data makes understanding more real and compelling. Because the data they are analyzing in real time is data they helped create through development of the questionnaire and data collection, the ownership the group feels diminishes the tendency to discount findings. As they develop their competencies and understanding of the data, the group begins to better understand the range of opinions represented in the organization, while at the same time recognizing that the few extreme positions are, in fact, extreme. This appears to do two things. First, it increases coherence within the organization as members realize that there are centrally held, generally moderate opinions. Second, while this realization diminishes the power of extreme positions, it also identifies extreme positions that may require attention.

The establishment of ownership early on in the process, and the immediate feedback offered by interactive data analysis, act to discourage the development of defensive positions vis-à-vis findings that run counter to preconceived notions about the organization.

Perhaps it is this last finding that is the most important. The interactive use of technology appears to be instrumental in helping push participants into double-loop learning, challenging basic assumptions held about the organization and its members (Argyris et al., 1985). This deeper, frame-breaking type of learning is generally met with substantial resistance, particularly if it is delivered to the organization by an external source. If an outside consultant, for example, operating in the typical mode of developing a survey, collecting data and reporting back to the organization, reported that management believed that workers were underpaid, there would most likely be disbelief and an attempt to discredit the information coming from the consultant, because the finding is so far removed from the internal “truisms” of the organization. In this particular case, one of the survey findings was exactly that: management thought workers were underpaid.

Participants’ reaction to this finding was not to discredit or challenge it, but one of surprise. How, after all, could the workgroups discredit or challenge the findings when they had ownership of the data and the entire process? Of all the findings of this experience, it is perhaps this single factor that may challenge the typical organization-consultant paradigm and suggest that the highly interactive nature of this design advances the potential for double-loop learning.

 

Conclusion

In participatory work, methods are the message. In this case, new technology, along with a horizontally structured group process, enabled participants to experience a high degree of ownership during all phases of the project. This combination allowed them to contribute and value their own expertise and to develop new skills and confidence. As Hirschhorn (1988) points out, “...techniques themselves can function as transitional objects. They help adult learners make the transition from feelings of incompetence to feelings of competence” (p. 116). This effect is critically important to the goal in pragmatic action research of increasing the capacity of the group for future learning, planning and problem solving. In the context of this case, two major objectives were achieved: 1) expanding and deepening the organization’s understanding of itself; and 2) developing ‘research literacy’ for future inquiries.

While technology tends to be used as a passive mechanism, it need not limit participation; it can even be used to increase it. In this project, technology was embracing rather than distancing. Software and hardware advances allowed participants to perform data analysis themselves in real time, so that they knew thoroughly how and why findings were derived and, ultimately, converted into recommendations for improvement. This transparency, distributed deeply within the organization, can significantly assist in mobilizing change efforts.

 

References

  • Argyris, C., Putnam, R., & Smith, D. M. (1985). Action science. San Francisco, CA: Jossey-Bass.
  • Dixon, N. M. (1997). The hallways of learning. Organizational Dynamics, 25(4), 23-34.
  • Greenwood, D. J., & Levin, M. (1998). Introduction to action research. Thousand Oaks, CA: Sage Publications, Inc.
  • Hirschhorn, L. (1988). The workplace within. Cambridge, MA: The MIT Press.
  • Merrifield, J. (1997). Knowing, learning, doing: Participatory action research. Focus on Basics, 1(A), 23-26.
  • Reason, P., & Bradbury, H. (2000). Handbook of action research. Thousand Oaks, CA: Sage Publications, Inc.
  • Rich, R., & Maestro-Scherer, J. B. (2001). Self-informing organizational change: A participatory method of data collection, analysis and action planning. Revue Internationale de Psychosociologie, 16/17, 1-16.
  • Schein, E. H. (1996). Kurt Lewin’s change theory in the field and in the classroom: Notes toward a model of managed learning. Systems Practice, 9(1), 27-47.
  • Whyte, W. F. (1991). Social theory for action. Newbury Park, CA: Sage Publications, Inc.
