
		 Systems Theory and Innovation

                                                       by Paul A. Herbig*  


ABSTRACT

This paper takes a different approach to technological change, relating the amount of change and its rapidity to a system construct we call the openness or closedness of a system, and treating a society or culture as a type of system.  The origin and diffusion of technological change are greatly influenced by the economic environment in which they take place; it is a dynamic process.




Introduction
DEFINITIONS:
system:  a set of complex elements, interdependent and intimately interwoven within an environment, that act to generate a stable equilibrium of events; a natural phenomenon.  A system is analogous to a skyscraper: the elements being the supports, walls, plumbing, wiring, etc., and the foundation being the interdependence of the elements.  In a system, members are not significantly connected with each other except with reference to the whole system.  System implies relationships, structure, and interdependence.

open system:  a system whose inputs and/or outputs do not necessarily end within the system.  A significant amount of the energy radiated is derived from outside the system or escapes from it.

"Living systems" must be open systems--open to exchanges with other systems--not closed systems.

closed system:  a system in which all inputs are derived from the system and all outputs return to the system.  Equilibria can exist only within a closed system.

system net:  the inputs to a system minus the outputs escaping from it; a zero-based system is one in which inputs equal outputs.  A steady state is achieved only at the level of maximum potential energy.

elements:  the individual items--either physical or intangible--that compose a system.

change:  a new, unknown variable; a significant increase or decrease of intensity in an element; or a significant time shift in a cyclical occurrence.

cycle:  a regularly occurring, predictable event.
general systems paradigm:  theory development must begin with careful specification of the institutions, and the linkages among institutions, within a system.
The state of a system at a moment of time is the set of relevant properties that the system has at that time.
GENERAL RULES OF A SYSTEM

I.  A system is dynamic, always in a constant flux of change

II.  At any point in time, the system appears stable, unchangeable, eternal

III.  Systems tend to exhibit similar physical characteristics:
		inertia--the tendency to continue movement in the same direction
		friction--the resistance produced when forces meet
		momentum--the force contained in motion

IV.  All systems will seek an equilibrium point at all times.  All systems are self-correcting, with built-in feedback loops.

V.  Systems adapt to change of any intensity by changing the composition of the system (that is, eliminating, adding, or rearranging elements as necessary to reestablish equilibrium).  This evolution is the price paid for maintaining stability and system integrity.  Systems adapt (evolve) to survive; systems that fail to adapt become extinct.
	
VI.  Elements of a system are not necessarily unique to that system; they can be parts of many other systems, and their roles and importance (criticality) can vary between systems.

VII.  The extremes--the most significant of a range of potential changes--tend to cause lasting change and to influence the stability or integrity of a system.  Unless counterbalanced, extremes will cause system instability, and when resolved a different equilibrium will result.

VIII.  A society is a human system.

IX.  Systems react in self-defense; negative feedback is a defense mechanism that allows for the continuous stability of a system through the feedback loop process.  All systems tend to protect themselves and their own survival first.  Systems evolve slowly by adapting to continuous change.
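
The negative-feedback mechanism of Rules IV and IX can be sketched in code.  This is an illustrative simulation, not part of the paper: the proportional-correction form, the function name `negative_feedback`, and the gain value are all assumptions chosen for simplicity.

```python
def negative_feedback(state, equilibrium=0.0, gain=0.5, steps=20):
    """Each period, the feedback loop cancels a fixed fraction (gain)
    of the system's deviation from its equilibrium point."""
    history = [state]
    for _ in range(steps):
        state -= gain * (state - equilibrium)  # self-correcting step
        history.append(state)
    return history

# A disturbance of intensity 10 is damped geometrically back toward 0:
path = negative_feedback(state=10.0)   # 10.0, 5.0, 2.5, 1.25, ...
```

The geometric decay illustrates why, under this assumed proportional feedback, the system "appears stable" (Rule II) even though it is in constant flux (Rule I).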

X.  The larger the population, the more predictable it becomes as a mass.  Individual traits are random and unpredictable, but in the mass--by the law of large numbers, a summing of all the parts--a symmetry and defined characteristics of the group exist.  The occurrence of an event can be predicted with reasonable probability, but its timing cannot.  The narrower the time range, the lower the probability.  Probability never truly reaches 0 or 1 but always lies between.
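
Rule X is essentially the law of large numbers, and a small simulation illustrates it.  This sketch is not from the paper; the coin-flip trait, the fixed random seed, and the population sizes are illustrative assumptions.

```python
import random

random.seed(7)  # fixed seed so the sketch is reproducible

def individual():
    """One member of the population: an unpredictable yes/no trait."""
    return random.random() < 0.5

def mass_fraction(n):
    """The trait's observed frequency across a population of n members."""
    return sum(individual() for _ in range(n)) / n

small_group = mass_fraction(10)        # can swing widely from 0.5
large_mass  = mass_fraction(100_000)   # clusters tightly around 0.5
```

Each individual draw is unpredictable, yet the large population's frequency is predictable within a narrow band--and, echoing the rule, it lies strictly between 0 and 1.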

XI.  In a revolutionary change, the system is altered abruptly, severely, and totally.  Such a change requires a long time to reach a new equilibrium point.  In a revolutionary change, elements can be expunged or new elements can appear almost instantaneously.  The stability and even the survival of a system are always at risk when it undergoes a revolutionary change.

XII.  When evolving systems are stunted or restrained, friction and tension develop.  Much as with an earthquake, continual restraint will cause instability and a "fault."  When tension mounts unacceptably, the fault slips, a revolutionary change results, instant instability is achieved, and the time to equilibrium is indefinite--the day of reckoning has arrived.

XIII.  Externalities (external actions) can cause immediate instability within a system.  Instability will remain until the externality is resolved; thereafter the effect will be damped only by time.

XIV.  It is rare that an internally generated action causes immediate instability.  Normally such an action is the result of a long-term trend.

XV.  Actions and effects are commonly separated by a significant time.

XVI.  Commonalities occur regularly within system dynamics (history does indeed repeat itself): the same set of events with the same set of stimuli should cause the same set of results.

XVII.  Society is a system, conflict a process, and conflict resolution a function of the institutions within the system.

PRINCIPLES OF SYSTEMS

A.  Perfectly closed systems are improbable, with a tendency toward stagnancy if not regression.  If continued, inbreeding results in inevitable decline and extinction.

B.  Open systems are unpredictable.  An inherently unstable situation can be staved off indefinitely by conversion to an open system; stability can be tentatively achieved via an open system.  However, upon reversion to a closed system, instability is intensified and change is quickened.

C.  Most systems tend to be net closed systems.  Some external inputs to a system and some outputs escaping from it constitute a healthy state.  When the level of these inputs and outputs becomes significant relative to the entire system's resources, the system tends to cease being an independent system and tends instead to become part of a larger system--a dependent.

D.  As cycles are inherently natural events, systems learn to accommodate all cycles.

E.  Systems are independent of each other but do interact.  One system may act as a change catalyst for another.

F.  The effect of a change upon a system lags its initiation; the greater the intensity of the action (change), the more quickly the first response is felt.  However, the time to equilibrium is directly proportional to the severity of the change.

G.  The greater the intensity of a change variable, the greater the response.  The relationship is not merely arithmetic but exponential.

H.  One massive change has a greater effect on the system than a number of smaller changes that add up to the same intensity.
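
Principles G and H can be illustrated together.  The paper does not specify the functional form of the "exponential" relationship; any super-linear (convex) response yields Principle H, and the power-law used here is purely an assumed stand-in.

```python
def response(intensity, exponent=2.0):
    """Assumed super-linear (here, power-law) response to a change."""
    return intensity ** exponent

one_massive = response(10.0)       # a single change of intensity 10
many_small  = 10 * response(1.0)   # ten changes of intensity 1 each
assert one_massive > many_small    # Principle H follows from convexity
```

Under this assumed form, one change of intensity 10 provokes a response of 100, while ten changes of intensity 1 sum to only 10, even though the total intensity applied is identical.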

J.  Systems tend to react to a change action in a predictable manner; however, the type, intensity, and timing of the reaction are unknown and unpredictable.  Change is similar to 'noise'--an all-encompassing randomness.

K.  Systems evolve in a natural progression.  External factors (change) may affect, override, reverse, or accelerate this evolution.

L.  Reaction to change follows a sinusoidal pattern; several iterations are normal.  Action 1 leads to an opposite, lesser-intensity reaction 2, which leads to an opposite, lesser-intensity action 3, and so on until the equilibrium point is reached.
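
The damped, alternating sequence of Principle L can be sketched as follows.  The function name, the damping factor, and the stopping tolerance are illustrative assumptions, not from the paper.

```python
def reaction_series(action=8.0, damping=0.5, tolerance=0.01):
    """Each reaction opposes the previous one at a lesser intensity,
    continuing until the swings fall within the tolerance band."""
    swings = [action]
    while abs(swings[-1]) > tolerance:
        swings.append(-damping * swings[-1])  # opposite, smaller swing
    return swings

series = reaction_series()
# signs alternate and magnitudes shrink: 8.0, -4.0, 2.0, -1.0, 0.5, ...
```

The alternating signs are the "sinusoidal" back-and-forth, and the shrinking magnitudes are why only several iterations are needed before the equilibrium point is effectively reached.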

M.  Each element of a system has a distinctive place, purpose, and relationship (cause and effect) with other elements.  The elimination or paralysis of any element will have a detrimental effect on the stability of the entire system.

N.  Collapse of a system will occur when a critical number of elements are paralyzed, defunct, or otherwise disabled.  Collapse will appear spontaneous and nearly instantaneous--analogous to knocking one too many walls or supports from a building, the last straw on the camel's back, or the final cut on a giant tree.

O.  Change, in effect, creates confusion, chaos, and uncertainty within the system.  A fixed time is required for an equilibrium to be re-established and for the effects of change to be damped to nearly zero.

P.  To minimize the effect of change on a system, change should be smoothed both in intensity and over time; the instability to the system will then be minimal.

Q.  To produce a quick response, a large intensity of change should be applied to the system.

R.  Nature is simple; the best systems are simplistic in nature.  All systems have common sense, elegance, and a minimum of complexity (complexity here meaning not knowing why something is the way it is).  Nature is complex, but its individual parts are simple, the relationships are rather simple, and nothing is extremely complicated.  A goal of simplicity within all structures and operating elements is desired.

S.  Adaptation to a constant environment: gradual evolution of the internal structure to produce fine tuning.

T.  Adaptation to a changing environment: radical or incremental changes of the internal structure may result, depending upon the severity of the change.

U.  Closed systems must attain a time-independent equilibrium state; open systems may attain a steady state.

V.  The form of a system must be appropriate to its size.

W.  All living entities are open systems composed of subsystems which interrelate to accomplish input, transformation, and output activities (Miller 1978).

X.  Systems can be classified according to their common properties.  By knowing the class to which a system belongs, one can know many of a system's properties without having to observe the system itself.



Constructs of a System (Carman 1979)

1.  the goals of the system under study

2.  the environment and constraints within which the system exists, including the degree of openness of the system

3.  the control mechanisms for the system

4.  the actors that comprise the system--their number and definition

5.  a description of these actors--their internal structure, resources, relative sizes, and relative power

6.  the functions performed by the actors

7.  the linkages between actors

8.  a description of exchange activities between actors

9.  how values are assigned by the actors

10.  the rules governing exchanges

11.  transaction cost functions

12.  the level of information available to the actors

Miller's Universal Subsystems of Living Systems:
see also (Ashmos and Huber 1987) and (Reidenbach and Oliva 1981)

Input Transducer
Internal Transducer
Channel and Net
Decoder
Associator
Memory
Decider
Encoder
Output Transducer
Reproducer
Boundary
Ingestor
Distributor
Convertor
Producer
Storage
Extruder
Motor
Supporter



SYSTEMS and Innovation:

We define openness as the freedom to move into and out of a system at will.  The barriers falling during Europe 1992 will transform Western Europe (the Common Market, the European Community) into an open state.  A perfectly open society is one without restrictions on the movement of goods, services, or people into or out of the society.  An open system therefore experiences a continual change in its components.  Often an open system attains a state in which the system remains constant even though there is a continuous flow of components through it--a steady state.  This process is irreversible, and the system can move from one steady state to another.  This property is important in explaining the increasing complexity that exists in open systems.

We define closedness as guarded borders, restricting the entry and exit of goods, services, and people to and from the society.  A perfectly closed society allows no entry into and no exit out of the society.  A closed system must eventually attain an equilibrium state.  In this equilibrium, the internal elements reach a state of maximum disorder, randomness, and entropy and of minimum free energy.

Technologies are complex systems.  A complex system is made up of a large number of parts which interact in a non-simple way.  In such a system the whole is more than the sum of the parts: the properties of the whole cannot be inferred from the properties of the parts and their interactions.  Any technological system has an inner environment, characterized by the parts which constitute the system and their arrangement, and an external environment.  The operation of the system and the achievement of its goals imply adaptation to its external environment.  A system is separated from its environment by means of a boundary or interface (Saviotti 1986).

Homeostasis is the process by which a complex system shields itself from external changes and maintains the invariance of some internal properties.  Homeostasis works within a range; changes outside that range will cause severe internal changes.  In fact, once changes surpass the homeostatic range, drastic increases in internal elements occur in an attempt to regain equality with the external force.  Homeostasis occurs through feedback, or self-regulating behavior.  To achieve homeostasis, the system must have internal variables which can monitor and counteract the external environment.
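
Homeostatic regulation within a range can be sketched as a simple conditional.  This toy model is an assumption-laden illustration, not from the paper: the tolerance value and the rule that the internal state chases the external force once the range is exceeded are both invented for the sketch.

```python
def homeostat(internal, external, tolerance=5.0):
    """Hold the internal state invariant while the external deviation
    stays within the homeostatic range; once the range is exceeded, the
    internal state changes drastically to match the external force."""
    deviation = external - internal
    if abs(deviation) <= tolerance:
        return internal          # feedback absorbs the disturbance
    return internal + deviation  # range exceeded: severe internal change

within = homeostat(internal=20.0, external=23.0)  # deviation 3: invariant
beyond = homeostat(internal=20.0, external=30.0)  # deviation 10: drastic shift
```

The sharp switch at the tolerance boundary mirrors the text's claim that homeostasis buffers small changes but that changes surpassing the range force severe internal adjustment.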

Systems evolve when they reach a sufficient level of complexity, have flexible feedbacks between their components, are exposed to a sufficiently rich and constant energy flow, and have their normal functioning disturbed (Laszlo 1985).  It is this factor of disturbance that is the evolutionary trigger for systems.  If the disturbance is below the critical level, the system's normal feedback buffers it out, and a return to stability occurs with no evolutionary change.  If it surpasses the critical level, the feedback cycles are disrupted; the previous system vanishes and decomposes into more strongly bound components at another stable level.  But just at that critical level, the system is moved out of its normal flow to another level: when the critical level is reached, a freedom of choice occurs--a bifurcation--and a new system diverges from the old.



PROPOSITIONS:

Proposition 1a:  The greater the openness of a society, the greater the innovative capacity of that society.

Proposition 1b: The greater the openness of a society, the more readily an innovation is adopted (less resistance) and the faster its diffusion.

Proposition 2a: The more closed a society is, the fewer innovations that occur. 

Proposition 2b: The more closed a society is, the slower the adoption of innovations (diffusion) and the more resistance is encountered prior to its diffusion.

Proposition 3: A closed society, when opened, undergoes revolutionary changes.  The more abrupt and severe the transition, the greater the magnitude of change to the society.

Proposition 4: A closed society's equilibrium tends to be significantly different from equilibria residing outside the closed society.  In most cases the equilibrium is considerably lower and represents a stagnant society.  Only by infusion from the outside does it progress; innovation comes from the outside.

Proposition 5: An evolutionary process of societies exists.  Systems evolve in an evolutionary manner; in so doing, elements may change, mutate, alter, disappear, or become extinct, or new elements may appear.  As in any evolution, the survival of elements depends upon adaptability and the ability to change as the system changes.  However much the elements change, the system's cohesiveness, integrity, and stability remain.  When evolving systems are stunted or restrained, friction and tension develop.  Continual restraint will cause system-wide instability--a "fault."  When tension mounts unacceptably, the fault slips, instant instability results, and a revolutionary change is created--much like an earthquake.


Proposition 6:  Openness is a function of a society's democratic and capitalistic characteristics; closedness is considered a lack of those characteristics.  If this is true, then:

Proposition 7:  Non-democratic, non-capitalistic societies should be poor innovators and poor adopters of innovations.

Proposition 8: An externality to a system will affect closed systems more strongly than open systems.  Open systems are more resilient and adaptable to externalities.

Proposition 8b: In a revolutionary change, the system is altered abruptly, severely, and totally.  This change takes an extreme amount of time before a new equilibrium is reached.  In a revolutionary change, elements can cease to exist or new elements can appear nearly instantaneously.  The stability of the system is always threatened in a revolutionary change.  Open systems adapt, and can accept revolutionary changes better than closed systems.  These revolutionary changes could be caused by radical innovations.

So far we have discussed only the effects of the system upon the innovative process.  But can the technology--the new innovation--affect the system?  Could it be the impetus for a state change in the system?  Why should this be so?

Proposition 9: Radical innovations can be necessary and sufficient to initiate a state change from a closed to an open system.

Proposition 10:  Radical innovations are not sufficient to effect a state change from an open to a closed system.

Innovations can also be part of a system.
Proposition 11:  Innovations that are part of a system or cluster  will be adopted or rejected as a cluster.  

Proposition 12:  Innovations that are part of a system will tend to be adopted more quickly than innovations that are isolated and appear as individual changes.

SUMMARY AND CONCLUSIONS:

    REFERENCES
Ackoff, Russell L (1971),"Towards a System of Systems Concepts", 	Management Science, Vol 17, No 11 (July) pg 661-670

Ashmos, Donde P. and George P. Huber (1987),"The Systems Paradigm in 	Organizational Theory: Correcting the Record and Suggesting the 	Future",Academy of Management Review, Vol 12 No 4, pg 607-621

Beck, Don Edward (1982),"Beyond the Grid and Situationalism: A Living 	Systems View",Training and Development Journal, August ,76-83

Boulding, Kenneth E. (1956),"General Systems Theory: The Skeleton of Science", Management Science, Vol 2 No 3 (April) pg 197-207

Bryant, Denise and Stephen L. Merker (1987),"A Living Systems Process 	Analysis of a Public Transit System," Behavioral Science, Volume 	32, pg 293-303

Carman, James C. (1979), "A Systems/Exchange Approach to the Analysis of Health Care Delivery", Fourth MacroMarketing Seminar , August Boulder Colorado, pg 47-63

Dixon, D.F. and I.F. Wilkinson (1984)," An Alternative Paradigm for 	Marketing Theory," European Journal of Marketing, pg 40-49

Dowling, Grahame R.(1983),"The Application of General Systems Theory 	to an Analysis of Marketing Systems", Journal of MacroMarketing,  Fall,  pg 22-31

Glaser, S. (1985),"Marketing System and the Environment", European 	Journal of Marketing, Vol 19 No 4, pg 54-72

Henderson, Bruce D. (1983),"The Anatomy of Competition", Journal of 	Marketing, Vol 47 (Spring) 7-11
Merker, Stephen L. and Connie Lusher (1987)," A Living Systems Process Analysis of an Urban Hospital", Behavioral Science, Volume 32, pg 	304-315
------------------(1985),"Living Systems Theory: A Framework for 	Management," Behavioral Science,  Volume 30, pgs 187-194

Miller, James  (1978), Living  Systems, McGraw Hill, New York

Mokwa, Michael P., Ben M. Enis, and Ricky W. Griffin(1979), 	"MacroMarketing Analysis and Theory Development: Perspectives from 	Parsons' General Theory of Social Systems,"  Fourth MacroMarketing Seminar, August, Boulder Colorado, pg 41-56

Reidenbach, R. Eric and Terence A. Oliva (1981)," General Living Systems 	Theory and Marketing: A Framework for Analysis", Journal of 	Marketing, Vol 45 (Fall), pg 30-37

________________________________(1983)," Toward a Theory of the 	Macro Systemic Effects of the Marketing Function", Journal of MacroMarketing, Fall, 33-40

Ruscoe, Gordon C., Robert L. Fell, Kenneth T. Hunt, Steven L. Merker, 	Lorena R. Peter, James S. Cary, James Grier Miller, Bradford G. 	Loo, Robert W. Reed, and Mark I. Sturm (1985), "The Application of 	Living Systems Theory to 41 US Army Battalions", Vol 30, pg 7-49

Saviotti, P.P. (1986),"Systems Theory and Technological Change," Futures, December, pg 773-785

Slater, Charles C. and Dorothy Jenkins(1978), "Systems Approaches to 	Comparative Macro-Marketing", Third MacroMarketing Seminar, 	August, Kingston, Rhode Island    pg 371-379

Sweeney, Daniel J. (1972), "Marketing: Management Technology or Social 	Process", Journal of Marketing, Vol 36 (October), pg 3-10

White, Phillip D. (1981), "The Systems Dimension in the Definition of 	MacroMarketing," Macromarketing, Vol 1 (spring) pg 11-13