List of working papers.
SPREAD OF A NEW IDEA INFLUENCING LIVING CONDITIONS: FORMAL MATHEMATICAL MODEL
August 1990
The process of acceptance or rejection of new ideas influencing living conditions is formulated as a deterministic continuous-time version of information diffusion. The resulting model consists of two coupled nonlinear differential equations with a time lag. The model is treated analytically and solved numerically. The solutions, which exhibit qualitatively different behavior depending on the model's parameters, are interpreted sociologically.
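The abstract does not reproduce the equations themselves. As an illustration only, a two-variable system with a time lag can be integrated by a simple Euler scheme with a history buffer; the right-hand sides and parameter values below are hypothetical stand-ins, not the paper's model:

```python
# Minimal Euler integration of a hypothetical two-variable system with a
# time lag. The functions f and g below are illustrative stand-ins, NOT
# the paper's actual equations (which the abstract does not reproduce).

def simulate(f, g, x0, y0, tau, dt=0.01, t_max=50.0):
    """Integrate x' = f(x, y_delayed), y' = g(x, y) with delay tau on y."""
    n_lag = int(round(tau / dt))          # delay expressed in time steps
    xs, ys = [x0], [y0]
    steps = int(t_max / dt)
    for k in range(steps):
        y_lag = ys[max(0, k - n_lag)]     # constant history before t = tau
        x, y = xs[-1], ys[-1]
        xs.append(x + dt * f(x, y_lag))
        ys.append(y + dt * g(x, y))
    return xs, ys

# Example: logistic-style idea adoption coupled to a delayed influence term.
xs, ys = simulate(
    f=lambda x, y_lag: 0.5 * x * (1.0 - x) - 0.1 * y_lag,
    g=lambda x, y: 0.3 * x - 0.2 * y,
    x0=0.01, y0=0.0, tau=2.0,
)
```

Varying the parameters of such a system is what produces the qualitatively different solution regimes the abstract refers to.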
A MODIFIED EQUILIBRIUM MODEL
January 1991
An iterative algorithm is proposed that simulates the dynamics of change in the prestige of actors who exchange attention until an equilibrium state is reached. The algorithm's performance was checked on several mock tasks. Although the results have only educational value, the search for patterns relating the dynamics of prestige change to the initial distribution of prestige seems interesting and promising. Questions of convergence to the equilibrium state and of its stability remain to be addressed.
The proposed algorithm can easily be elaborated in two directions. First, a variety of mechanisms of development toward equilibrium may be devised using concrete information about particular processes going on in the real world. Second, an advancement may deal with the transformation of the diagonal elements, which represent the actors' self-attention and stay constant in the present algorithm.
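As an illustration of the kind of iteration described (the paper's actual update rule is not given in this summary), prestige might be redistributed in proportion to received attention until a fixed point is reached; the scheme, the attention matrix, and the initial prestige values below are all assumed:

```python
import numpy as np

# Illustrative sketch only: the paper's update rule is not reproduced in
# the abstract, so this particular mechanism (prestige redistributed in
# proportion to received attention) is an assumption.

def iterate_prestige(attention, p0, tol=1e-10, max_iter=10_000):
    """attention[i, j]: share of actor i's attention paid to actor j
    (rows sum to 1; the diagonal is self-attention and is never changed).
    Each step, an actor's prestige becomes the attention-weighted sum of
    the prestige of those who attend to it; total prestige is conserved."""
    p = np.asarray(p0, dtype=float)
    total = p.sum()
    for _ in range(max_iter):
        p_new = attention.T @ p
        p_new *= total / p_new.sum()       # conserve total prestige
        if np.abs(p_new - p).max() < tol:  # equilibrium reached
            return p_new
        p = p_new
    return p

# Three actors with assumed attention shares; diagonal = self-attention.
A = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.4, 0.1, 0.5]])
p_eq = iterate_prestige(A, [1.0, 1.0, 1.0])
```

For a positive attention matrix this iteration converges to a unique equilibrium regardless of the initial prestige distribution, which is one concrete way the convergence and stability questions mentioned above could be posed.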
READING ECONOMY AND SOCIETY BY MAX WEBER: FOUR ESSAYS ON RATIONALITY
December 1991
The first essay draws inferences about the role of the scientific institution from an examination of Luther's attack on the formal rationality of Roman law.
The second essay studies how Weber constructs ideal-type rational economic actors in his treatment of capital accounting.
The third essay points to the forces that promote and hinder the emergence of rational bureaucratic rule.
The fourth essay examines how Weber applies his theory of ideal-type bureaucracy when participating in the debate among his contemporaries about eliminating the constitutional article that forbids the same person to be simultaneously a member of the Bundesrat and the Reichstag.
RECENT DEVELOPMENTS IN NON-LINEAR SYSTEM DYNAMICS: UNFOLDING THE MEANING OF SOCIOLOGICALLY RELEVANT CONCEPTS
April 1992
Three interrelated concepts--deterministic chaos, catastrophe, and the slaving principle--may turn out to be powerful tools for analyzing richly detailed, non-common-sense historical social phenomena. With these concepts the modern theory of non-linear differential equations is able to describe discrete and chaotic processes, which were previously considered amenable only to probabilistic treatment.
The concepts facilitate both understanding and mathematical description of the following phenomena:
1) Evolution of systems that are equal in all respects and emerge under virtually the same initial conditions, yet take different developmental paths. Three possible explanations--deterministic chaos, a "strange attractor", and a bifurcation point--are presented.
2) Sudden jumps of smoothly developing systems (catastrophes). The system's development during an abrupt change is governed by the same laws but is much more rapid than that generally characteristic of the system.
3) Hysteresis, defined as the dependence of a system's state both on the values of the independent variables and on the system's history.
4) Enslavement: some variables become completely dependent on, i.e., enslaved by, others. The dimensionality of the system decreases and a pattern of development can be recognized (self-organization). The opposite shift, toward chaos, may also occur. In other words, the number of variables that adequately describe the system may decrease and increase over time (intermittency).
Mathematical and statistical modeling of these four phenomena requires a combination of formal theorizing and empirical research of the ethnomethodological kind. There are fundamental limits to the models' predictive accuracy, and estimating these limits becomes a paramount issue.
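Deterministic chaos in the sense used above, a fully deterministic rule whose trajectories diverge from virtually identical initial conditions, can be illustrated with the textbook logistic map (not an example from the paper itself):

```python
# The logistic map x -> r*x*(1-x) is a standard textbook illustration of
# deterministic chaos (not taken from the paper): in the chaotic regime
# (r = 4) two trajectories started a tiny distance apart diverge, while
# in a stable regime (r = 2.5) the difference dies out.

def trajectory(r, x0, steps):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

def divergence(r, eps=1e-9, steps=60):
    """Largest gap between two runs whose starting points differ by eps."""
    a = trajectory(r, 0.2, steps)
    b = trajectory(r, 0.2 + eps, steps)
    return max(abs(x - y) for x, y in zip(a, b))

chaotic = divergence(4.0)  # the 1e-9 difference is amplified enormously
stable = divergence(2.5)   # the 1e-9 difference never grows
```

This sensitivity to initial conditions is also why the predictive accuracy of such models is fundamentally limited, as noted above.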
TOTAL QUALITY MANAGEMENT AS A FIELD TO STUDY RATIONALIZATION
November 1992
The evolution of a firm's control structure has naturally led to selecting quality as a bottom-line indicator because of quality's holistic nature. Exactly this nature requires a firm to increase the rationality of its operations to a level at which it is perceived as threatening individual autonomy. A deeper understanding of rationalization and its side effects may be a way to deal with this problem, and social research may play an important role here. The Total Quality Management (TQM) movement provides scientists with an opportunity to study, and perhaps to take part in, the large-scale endeavor to find a balance between ritual activities and action guided by technical rationality. This issue is closely related to the basic question of the relationship between individual autonomy and the acceptance of organizational values and norms. Scientific exploration of this question may have a deep impact on organizations, which should be taken into consideration by a responsible researcher.
SOVIET BUREAUCRACY: AN ATTEMPT TO FIT INTO THE WEBERIAN FRAME OF ANALYSIS
March 1993
Two interrelated purposes are pursued in this paper: to find out what features of the modern (1965-85) Soviet bureaucracy may be analyzed within the existing social-scientific framework, and to take one more step toward better understanding and refining the abstract concepts of this framework. The undertaking will increase our ability to comprehend the development of the states formed from the Soviet Union.
IMPACT OF CULTURE ON LEARNING: SOME THOUGHTS
March 1995
A culture that emphasizes the values of rationality and of deferred gratification of needs may impede individual learning. The resulting gap between individual and organizational learning may lead to stagnation of learning of both kinds. A more diversified culture, which legitimates intuition, the direct pursuit of gratification of curiosity, and differences in cognitive styles and thought processes, may be more learning-friendly. Yet the loss of universality of values may decrease the legitimacy of the whole cultural system.
EMERGENCE OF POLITICS IN COLLECTIVIZED SCIENCE
OR EROSION OF LEARNING ORGANIZATION OF TRADITIONAL SCIENCE
April 1997
The paper analyzes changes in the match between the motivations and abilities of scientists and the social structure of their embedding institution.
Using Popper's (1982) notion of three worlds, a normative definition of science is given. Projecting it onto a list of scientists' motivations, compiled from the literature and organized according to Maslow's (1970) classification of basic human needs, yields the profile of motivations and abilities of the ideal-type scientist. It is shown that this idealization was close enough to reality before the Second World War for science's regulative mechanism, built around the publication system, to work.
The core of the paper analyzes how the changes in the scientific institution induced by its growing prestige in the larger society made its traditional normative system inadequate and led to the increasing politicization of science over the last fifty years. It is argued that coping with recent institutional changes has eroded science as a place in the social universe where an individual could be engaged in work that is "innocent, enjoyable, and ultimately beneficial to mankind" (Ravetz 1971).
Finally, several forecasts about the future of science and a current experiment (Hock 1994; Roth and Senge 1996) in creating an institution able to integrate the ideal-type scientist are presented.
RELIABILITY OF CODING COGNITIVE STATES OF PARTICIPANTS FROM VIDEORECORDINGS OF A BUSINESS PROCESS RE-ENGINEERING WORKSHOP
April 1998
Inter- and intra-coder reliability coefficients--both abbreviated as ICR from now on--are an important measure of the quality of coding (Cronbach and Gleser 1953; Cohen 1969; Bakeman and Gottman 1997). If their value is approximately 0.8 or higher, reviewers will have no objection to publishing such a paper, and other researchers will take the findings and the paper's author seriously. But why is 0.8 acceptable? Is a Cohen's Kappa of 0.7 still sufficient? Sufficient for what? On a more practical note, how much effort should one invest in drilling one's coders to maximize ICR? What if one has obtained Kappa = 0.1? Should one select different coders, simplify the coding scheme, or just scrap the whole study? If we want to plunge into the philosophy of science, we may ask how obtaining Kappa = 0.8, or looking at this number, promotes anybody's understanding.
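For reference, the Kappa statistic discussed here corrects observed agreement for the agreement expected by chance from each coder's marginal code frequencies. A minimal computation for two coders' nominal codes (the code labels and data below are invented for illustration):

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's Kappa for two coders' nominal codes of the same items:
    (p_observed - p_chance) / (1 - p_chance), where p_chance is the
    agreement expected from each coder's marginal code frequencies."""
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    p_obs = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    p_chance = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n**2
    return (p_obs - p_chance) / (1.0 - p_chance)

# Invented example: two coders labeling ten segments with made-up codes.
a = ["Q", "Q", "S", "S", "Q", "E", "E", "Q", "S", "Q"]
b = ["Q", "S", "S", "S", "Q", "E", "Q", "Q", "S", "Q"]
k = cohens_kappa(a, b)  # raw agreement 0.8, but Kappa is noticeably lower
```

The gap between raw agreement (0.8 here) and Kappa is exactly the chance correction, which is why a raw-agreement threshold and a Kappa threshold mean different things.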
The paper addresses the above issues by describing an empirical study of ICR in the case of coding the cognitive states of participants in complex collaborative problem solving. The main purpose of the study was to develop and apply a methodology for reliable coding and reliable estimation of ICR in a way that facilitates further inquiry, deepens understanding, and leads to cumulative science (Levy 1993).
The first section formulates concrete goals for estimating inter-coder reliability in the present study. Second, phenomena influencing the value of ICR are discussed; it is shown that the selection and training of coders must be explicitly considered. Third, coder selection procedures--derived from the goals formulated in the first section, the nature of the coding task, and financial constraints that allowed hiring only one person in addition to the researcher--are presented. Fourth, a procedure adopted in this study for calculating an ICR coefficient for time-delimited codes is introduced. The fifth section describes how the coder was trained, how the researcher learned as well, and what longitudinal data were collected to estimate the impact of the training and of the coder on inter-coder reliability. Sixth, analyses of the collected data are presented and lessons for reliable coding are drawn; a number of necessary conditions for maintaining this level of reliability are formulated. By this point several dozen reliability coefficients have been calculated, ranging from 0.17 to 0.99. The concluding part of the paper presents a "reliability square" that combines four of them in a visual display helpful for interpreting several meaningful facets of ICR.
CHALLENGES OF BUILDING A TRULY DYNAMIC THEORY OF COLLABORATIVE FACE-TO-FACE PROBLEM SOLVING
March 1998
Four challenges may be of general interest for researchers studying processes of organizational learning and, in particular, small group processes.
First, in trying to relate the existing literature on small group research to what I was observing in the field, I once again became keenly aware of the well-known and seemingly irreconcilable trade-off between the accuracy and the generality of a description. Yet it now seems possible to resolve the dilemma by generalizing not directly from data but from a sample of models. The dilemma and a proposal for dealing with it are discussed in the 1st section of this manuscript.
Videotaping serves as the best means of collecting the data necessary for modeling problem-solving processes. Yet most companies were reluctant to allow a person with a camera, or more precisely several cameras, to record their meetings. There was also a legitimate question of the extent to which videotaping would distort the natural flow of interaction, making the recorded material unrepresentative of real situations. It can now be reported that this challenge was successfully overcome, as described in the 2nd section.
Coding the videotapes raised the issue of the usefulness of efforts to maximize inter-coder reliability coefficients. Judging from the research literature, the practice is pervasive in modern social-scientific discourse. The 3rd section inquires whether it serves the goals of the scientific enterprise well and proposes an alternative approach.
Finally, the endeavor of designing coding schemes to provide a factual framework for modeling led me to distinguish between two goals of modeling: to replicate reality, i.e., to create artificial life, and to understand critical influences on group effectiveness. Because my dissertation study needs both kinds of models, the effort to bring them together resulted in a three-phase iterative approach for creating a description of cognitive and emotional dynamics. Both the distinction between the two kinds of modeling and the methodology for relating them are presented in the 4th section of the manuscript.
SOLVING A PROBLEM AND GETTING ALONG: TOWARD THE EFFECTIVE ROOT CAUSE ANALYSIS
January 1999
This study is based on action research at a Business Process Re-engineering (BPR) workshop. It proposes a classification scheme for the problem-solving states of participants that can serve as a conceptual bridge between the cognitive and emotional dynamics of Root Cause Analysis (RCA) and its effectiveness. A number of practical suggestions based on the analysis of the videotapes are formulated and illustrated with examples and excerpts from the transcripts.