

IS THE INCREASING ENTROPY OF INFORMATION SYSTEMS INEVITABLE?

Rene MANDEL

www.value-architecture.com

Version 1 13/04/2015


TABLE OF CONTENTS

1 FOREWORD
1.1 CONTROL OF INFORMATION SYSTEMS COMPLEXITY
1.2 A SCIENTIFIC AND TECHNICAL CHALLENGE
2 INTEGRATION OF INFORMATION SYSTEMS
2.1 ALGEBRA OF IS COMPONENTS
2.2 THE FATE OF COMPLEXITY
2.2.1 Drift towards complexity
2.2.2 Existence of reference data
2.2.3 Imperfect integration of reference components
2.3 MECHANICS OF COMPLEXITY
2.3.1 Divergence according to the rate of imperfection
2.3.2 Real reasons for particularities
3 ACT AT THE HEART OF INTEGRATION
3.1 MAKE THE COMPONENTS MARKET AS PERFECT AS POSSIBLE
3.1.1 ROI of the IS under a perfect algebra
3.1.2 Act on the internal market
3.2 GENERIC INTEGRATION CAPABILITIES ("JANUS" PRINCIPLE) FOR REFERENCE COMPONENTS
3.2.1 Generic data integration functions
3.2.2 Generic modelling of "data wells"
3.2.3 Case of master data
3.2.4 Manage subsidiarity
3.3 PROFESSIONALIZING INFORMATION SYSTEM ACTORS
4 AN ALTERNATIVE TO GOVERNANCE
4.1 RESISTANCE TO PYRAMIDAL GOVERNANCE AND HEAVY METHODS
4.2 HAVE DECISIONS NATURALLY GO IN THE RIGHT DIRECTION
4.3 SYNCHRONIZE DATA ON SIMPLICITY
4.4 CHANGE OF PARADIGM


1 FOREWORD

1.1 CONTROL OF INFORMATION SYSTEMS COMPLEXITY

Information systems are the product of an increasingly complex reality: multiplication of sensors, extension of application areas, stacking of technologies, sedimentation of applications, and so on. Intuitively, this drift over time can be described as an increase in the "entropy" of systems (distinct from other known entropies, since a human factor is involved). Enterprise Architecture aims to impose order through a model. This is a rational and authoritarian approach; it assumes that the model is respected and applied over time. In practice, rational planning, however desirable, is often impractical and expensive. A more modest approach relies on gradual ordering, at the pace of projects and at lower cost. But clearly, in the absence of a central and collective effort, disorder increases "naturally". How much energy is needed to return to order? By imposing, or by facilitating, agility? And how should IS entropy even be defined? In any case, is the increase in IS entropy inevitable?

The same causes that lead to fatal dead ends and embolism, in a vicious-circle mode of operation, could instead act through virtuous circles and gradually reduce complexity. This simplification would be natural, unplanned and undirected. One way to achieve this natural transformation of the IS heritage is to introduce a few key components, "entropy killers": the socio-technical system then transforms itself, in a localized manner. The ordering effort would be minimal and relayed naturally by all IS stakeholders. However this path, opened up by technological advances as described below, is ignored by the dominant ideology.

The increase in IS entropy is not inevitable. It results, on the one hand, from a natural drift and, on the other hand, from the utopia of an unrealistic control that ignores both the dynamics of the micro-decisions of IS construction and the technological opportunities. We need to change the EA paradigm and consider alternatives to purely methodological approaches.

1.2 A SCIENTIFIC AND TECHNICAL CHALLENGE

Over time, software development has become a major activity in the economy:

• Software components are present everywhere, in all types of physical devices and, virtually, in the "cloud".

• All organizations rely on internal teams or subcontractors to assemble and integrate components into information systems that are more and more crucial to their activity: analysis, strategy, customer engagement...

• The macroeconomic weight of these activities is considerable: number of jobs, global redistribution of skills, induced financial flows...

Software engineering should rest on a science of software commensurate with these stakes. It is clear that such a science has not, so far, been up to the task, and that in practice the information systems of large organizations face serious challenges of complexity. IS are indeed rigid, over-complex, expensive, fragile, built as stovepipes "in silos", and ultimately difficult to control.


Governance efforts and Enterprise Architecture methodologies seek to alleviate these practical and organizational difficulties. This authoritarian and rational approach has its limits, in particular its heaviness and its pyramidal organizational implementation, even when it tries to reform itself by integrating "agile" approaches. Indeed the expansion of IS is in an explosive phase, which no longer allows relying exclusively on old recipes based on an obsolete technological past and outdated tools. The challenge is both scientific and technical:

• Scientific, because there is not yet enough hindsight to explain and model the behavior of project stakeholders that results in the nesting, duplication and dysfunction observed.

• Technical, because the technological world is evolving rapidly, and new technological foundations allow us to expect a paradigm shift in which mastery of complexity becomes consubstantial with software development. This major fact is an overlooked opportunity that should shake up the methodological approaches, provided the subject is treated with a rigor that matches the stakes.


2 INTEGRATION OF INFORMATION SYSTEMS

The limitations of software production have been known for several decades. In particular, the perverse effects of poorly controlled integration led to the development of Enterprise Architecture methods (Zachman, TOGAF,...) and of governance. One of the key questions is the integration of components to constitute, by assembly, information systems.

2.1 ALGEBRA OF IS COMPONENTS

All IS, whatever they may be, are formed from various components (software, technical). We can take a global view of an IS as a whole, which evolves through:

• the assembly and disassembly of components,
• the life cycles of the components themselves.

This IS cycle, which is not the "run-time" cycle but the evolution of the software and technical components, is complex. Each component has its own life cycle, with a creation phase, maintenance episodes, redesign, necrosis (phasing out),... A strong feature of IS is the interdependence of components. Multiple interactions are indeed necessary, appropriate, or even involuntary. We can say that, between two components, there is an interaction "function" which allows us to consider that they constitute a new component. This new component can itself interact with others. This function is "transitive", and successive assemblies create sets of larger size (associativity). Thus, from a set-theoretic point of view, components obey specific laws. An analogy could be drawn with molecular physics and the behavior of atoms among themselves (see http://www.value-architecture.com/2015/03/les-limites-de-la-complexite-des-si.html). Appropriate interactions are called "integration". Component integration obeys a typical "algebra" (it can be given an algebraic formulation). One could thus take as a hypothesis a "perfect" algebra of components, respecting a composition law that is associative ((x * y) * z = x * (y * z)) and commutative (x * y * z = x * z * y). But in practice, the particularities of integration, such as the dispersion of projects across teams and over time, suggest that this assumption is not realistic in the current state of the art of software development.
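To make the hypothesis tangible, here is a minimal sketch (hypothetical names, not a formal proof): components are modeled as sets of atomic parts and integration as set union, which is associative and commutative by construction, i.e. a "perfect" algebra.

```python
# Minimal sketch with hypothetical names: components as frozen sets of atomic
# parts, integration "*" as set union. Union is associative and commutative,
# so this toy model satisfies the "perfect algebra" hypothesis by construction.
from dataclasses import dataclass

@dataclass(frozen=True)
class Component:
    parts: frozenset   # atomic software/technical elements

    def integrate(self, other: "Component") -> "Component":
        """Integration creates a new, larger component (transitivity)."""
        return Component(self.parts | other.parts)

x = Component(frozenset({"crm"}))
y = Component(frozenset({"billing"}))
z = Component(frozenset({"ledger"}))

# Associativity: (x * y) * z == x * (y * z)
assert x.integrate(y).integrate(z) == x.integrate(y.integrate(z))
# Commutativity: the order of assembly does not matter in the ideal case
assert x.integrate(y) == y.integrate(x)
```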


2.2 THE FATE OF COMPLEXITY

2.2.1 Drift towards complexity

Over time, new interactions appear by "transitivity" of integration: step by step, involuntary and unarticulated dependencies emerge. Integration is indeed controlled by proximity, but the spread of knowledge remains limited and rests on a human basis. Thus, beyond the circles of proximity, indirect and unintended dependencies are created. Imperfections increase over time:

• knowledge is gradually diluted,
• new needs appear, answered with new integrations,
• new components emerge, creating gaps or inconsistencies.

This phenomenon is classic and well known in its different forms: spaghetti effect, chaining of systems with systemic risk (domino effect), rigidity of the heritage making evolutions expensive or impossible (heavy redesigns needed), technology stacking,... The spaghetti effect, for example, is due to the combination of the particularities of emitters and receivers: if a set of information is emitted under n variants and received under p other variants, then n x p types of exchange are built up over the projects, whereas passing through a focal point would require only n + p. More generally, n + p integrations are turned into n x p. Thus, IS are 'over-complex', having been extended like a "coral reef" by opportunistic additions and integrations focused on proximity and the short term. They are over-complex because a simpler composition could produce the same result, or even a better one, once the gain in flexibility is taken into account.
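A purely illustrative computation of this n x p versus n + p effect (the figures are arbitrary):

```python
# Purely illustrative arithmetic (figures are arbitrary): n emitters and p
# receivers, each with its own variant of the same information set.
n, p = 12, 8
point_to_point = n * p    # one bespoke exchange per (emitter, receiver) pair
via_focal_point = n + p   # each variant converted once, to/from a shared pivot
print(point_to_point, via_focal_point)   # 96 versus 20 types of exchange
```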

2.2.2 Existence of reference data

Every IS needs to share "reference data". This fact, which suffers no exception, has been known for a long time. The key role of these data in controlling system disorder, well beyond the strict sphere of the IS, is also accepted by all organizations. The organizational response is to implement governance of these data, whether minimalist or, on the contrary, maximalist, but it is often a source of internal tension. In fact, a balance of powers must be found to preserve the essential consistency at the center and the no less essential autonomy at the periphery. This search for balance results in excess in one direction or the other, which generates dysfunctions: inconsistencies, redundancies, rigidity, confusion, double entries, workarounds. And in many cases the reference data, for example about customers (notion of 'golden data'), are not of adequate quality.

2.2.3 Imperfect integration of reference components

Staying with the set-based vision, we should focus on a particular class of components: the reference components (of the 'master data' or 'data well' type). A reference component plays a particular role within a perimeter (whether the whole set or a subset): it is likely to be 'integrated' with all the other components of that subset, and this integration is perfectly transitive.


A component incorporating such a reference inherits this property and can in turn serve, identically, as a reference. In some sense the law of composition is associative, (x * y) * r = x * (y * r), and the inheritance of the reference property can be noted Hr(x * y * r) = Hr(y * r) = Hr(r). In reality, the integration of these components is always "imperfect" and gives rise to aberrations. How can the imperfections of this "algebra" be explained?
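A minimal sketch of this inheritance law, with hypothetical names, where the reference property Hr is carried along by any assembly that contains the reference component r:

```python
# Minimal sketch with hypothetical names: the reference property Hr is carried
# by the reference component r and inherited by any assembly containing it,
# so Hr(x*y*r) = Hr(y*r) = Hr(r).
from dataclasses import dataclass

@dataclass(frozen=True)
class Component:
    parts: frozenset
    is_reference: bool = False   # the Hr property of this sketch

    def integrate(self, other: "Component") -> "Component":
        # An assembly can serve as a reference as soon as one of its parts can.
        return Component(self.parts | other.parts,
                         self.is_reference or other.is_reference)

r = Component(frozenset({"customer-master"}), is_reference=True)
x = Component(frozenset({"web-portal"}))
y = Component(frozenset({"billing"}))

assert x.integrate(y).integrate(r).is_reference == y.integrate(r).is_reference == r.is_reference
```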

2.2.3.1 Usurper components

Components which are not classified as reference components can, for various reasons (lack of knowledge, desire for control, will to modify the model or to add particularities...), substitute themselves as the reference within the subset. This spoofing can be:

• Conceptual: modification of the model, introduction of a variant without any real contribution, denial of genericity, nesting of a supplement,

• Syntactic: introduction of a local dialect, which makes local exchanges specific and requires a translation to be included within the natural scope of the reference component,

• On latency modes: introduction of special exchange kinematics (batches, update pace, messages, service invocation...).

2.2.3.2 Opportunistic integration

Even without the deliberate creation of usurper components, integration links can grow in disarray, by proximity, without respecting the quality of integration offered by the reference components. For instance, if a reference component does not offer the expected exchange mode or the required temporality, a usurper component may appear deceptively legitimate and take a lasting position, even if the gap that justified it disappears. A haphazard mesh of components is thus created, which makes the system more complex.


[Figure: a healthy configuration, in which components integrate with the true source, versus a perverse configuration, in which a usurper interposes itself in front of the true source, leading gradually to the development of an over-complex mesh.]

2.2.3.3 Integration gaps

Given the difficulty of integrating components, it is also common to avoid integration altogether and rely on "manual" solutions: double data entry, written instructions, data reports to be consulted on screen,... Even when volumes are low, these solutions are hardly safe in the medium term and must be framed by controls, which again raise the question of reference data.


2.3 MECHANICS OF COMPLEXITY

2.3.1 Divergence according to the rate of imperfection

Over time, components evolve, the population changes, and integration links are made and unmade. There is a certain intensity of projects, and a demography of all these elements. First, there is a drift of the perimeter, generally towards an extension of the system and of its component population. The system thus becomes more complex. But it should not, over time, become 'over-complex' through the growth of the problems described above. In practice, with a high imperfection rate, complexity grows combinatorially, calling for redesigns and costly reengineering and requiring heavy governance. In short, complexity obeys a series that diverges rapidly: natural imperfections are introduced over time, with a cumulative negative effect. If, however, no new imperfections are introduced, old imperfections are naturally resorbed over successive developments (an imperfect integration being replaced by a perfect one). The series then converges and complexity reduces naturally. To simulate this series, a numerical model could be built on the basis of a few assumptions (coupling between generations, rate of imperfection in these couplings, size of successive generations,...). Entropy would be measured on the graph of dependencies between basic components ("atoms"): for instance, the average number of arcs per component.
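Such a numerical model could, for instance, look like the sketch below; every parameter (generation size, coupling, imperfection and resorption rates) is an arbitrary assumption chosen only to illustrate the divergence/convergence argument.

```python
# Toy simulation of the divergence/convergence argument; all parameters are
# illustrative assumptions, not calibrated values.
import random

def simulate(generations=30, gen_size=50, coupling=2,
             imperfection_rate=0.3, resorption_rate=0.1, seed=1):
    random.seed(seed)
    components, arcs, imperfect_arcs = 0, 0, 0
    history = []
    for _ in range(generations):
        for _ in range(gen_size):
            components += 1
            arcs += coupling                      # intended integrations
            if random.random() < imperfection_rate:
                # opportunistic, proximity-driven links whose number grows
                # with the size of the existing system (combinatorial drift)
                extra = random.randint(1, max(1, components // 100))
                arcs += extra
                imperfect_arcs += extra
        # during maintenance, a fraction of old imperfections is replaced
        # by clean integrations and disappears
        resorbed = int(imperfect_arcs * resorption_rate)
        arcs -= resorbed
        imperfect_arcs -= resorbed
        history.append(arcs / components)         # "entropy": average arcs per component
    return history

print(simulate(imperfection_rate=0.5)[-1])   # high imperfection rate: entropy keeps rising
print(simulate(imperfection_rate=0.0)[-1])   # no new imperfections: converges to the coupling
```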

2.3.2 Real reasons for particularities

There are indeed real reasons behind the interposition of "usurper components":

• Lack of modeling of subsidiarity:
  o example of repositories (structures, identities,...),
  o example of semantic subsidiarity.

Frequent differences of opinion are actually due to a hidden subsidiarity, which has not been made explicit. A practical way to combat this is to objectify subsidiarity through suitable modelling, and to parameterize this model so as to make it flexible and scalable.

• No management of multiple latencies:
  o workflow and services at different time scales,
  o batches of variable size: updates, stock,...

• Partition between two worlds (batch, real-time) and multiplication of protocols and formats,
• No management of the three dates and of historical depth (see the 'data wells' approach).

These subjects, poorly studied at the conceptual level, divide software vendors and compartmentalize the IS. Yet these are facets of the same IS objects, treated as different, without objectifying the common 'model'.


3 ACT AT THE HEART OF INTEGRATION

As in other sciences, we can either study the system in its entirety or study its detailed workings. Physics, for example, first established clear general laws, even though the intuition of the existence of atoms is very old. Quantum physics then came to formulate models of behavior and interaction at the scale of atoms. But in physics, humans are irrelevant to these atomic interactions. In the IS lifecycle, on the other hand, human behaviors, those of individuals and especially those due to social conditioning, are decisive. Moreover, the tools available for IS design and during maintenance activities are rudimentary, based on mostly handcrafted maps.

3.1 MAKE THE COMPONENTS MARKET AS PERFECT AS POSSIBLE

3.1.1 ROI of the IS under a perfect algebra

The question of IS ROI is paramount: an IS costs and adds value throughout its lifespan, and its usefulness can only be judged over this whole duration. We have proposed a mathematical model for the ROI of infrastructure components (see http://fr.slideshare.net/RenMANDEL/roi-infrastructure-v8). The integration of two components, which creates a new logical component, alters this ROI: it has its own cost and value, and it impacts the costs and values of the integrated components. One can therefore consider that this integration also has an ROI, and that the integration function, acting on all components, is "projected" onto the space of ROI. If actors are rational, an integration is carried out only if it improves the ROI. This "atomic" model is simple. We can propagate it to the macro level by making assumptions, in particular that the algebra of components is "perfect":

• A component useful to other components would be 'integrated', whatever the 'distance' between them.

• The above-mentioned adverse effects would be known and mastered by the actors concerned, and limited so as never to degrade the ROI (this could be formulated probabilistically).

Of course, these calculations are difficult and somewhat arbitrary: costs are known, but value is variable and depends on the external environment. Reference components find their justification in their integration with the others, as if there were an internal transaction with a transfer price. This raises the issue of the internal 'market' of components.
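A minimal sketch of this micro-decision rule, with purely hypothetical figures: an integration is accepted only if the ROI of the resulting assembly improves.

```python
# Minimal sketch with purely hypothetical figures: an integration is accepted
# only if the ROI of the resulting assembly improves on the ROI without it.
def roi(value, cost):
    return (value - cost) / cost

def should_integrate(value_alone, cost_alone, value_gain, integration_cost):
    """Rational micro-decision at project level."""
    before = roi(value_alone, cost_alone)
    after = roi(value_alone + value_gain, cost_alone + integration_cost)
    return after > before

# A transparent internal "market" lowers integration costs and makes the value
# contributed by reference components visible, tilting micro-decisions toward reuse.
print(should_integrate(100, 40, value_gain=30, integration_cost=5))    # True: ROI improves
print(should_integrate(100, 40, value_gain=30, integration_cost=60))   # False: ROI degrades
```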

3.1.2 Act on the internal market

Beyond this desirable perfection of integration, the issue is the perfection of the internal market in which components are exchanged: a reference component will be 'integrated' (a kind of internal economic transaction) if it allows an optimal evolution of the ROI (see the note on ROI cited above). Hence the need for balanced information, as in a 'perfect' market, so that buyers are aware of the benefits, guarantees and risks. A contractual approach (see contract theory in industrial economics) would meet this objective.


This transparency of the internal market of components, applied to the reference components, would prevent the 'spoofing' that places usurper components in the heritage, with perverse effects in the short term and above all in the medium term. It also implies that reference components are available on this market "ahead of phase", compared with the development cycles of the other components.

3.2 GENERIC INTEGRATION CAPABILITIES ("JANUS" PRINCIPLE) FOR REFERENCE COMPONENTS

Another key to optimizing integrations around the reference components is to give them 'generic' integration capabilities, enabling them to be assembled in all contexts, 'perfectly' and at no extra cost. The required qualities are:

• Anticipation of future developments, to offer projects a 'service' adapted to new technologies and future requirements: new exchange methods, better aligned with current project opportunities (ESB, low latency, Cloud,...) and with the expectations placed on IS (everything connected, mobility, traceability, transparency, all sources, agility,...).

• Interfaces allowing a 'non-intrusive' insertion into the existing heritage: acceptance of all existing exchange and integration methods, for coexistence with the legacy without additional costs and within a short time.

This can be described as a "Janus" capacity, after the two-faced Roman god, one face turned towards the past, the other towards the future.

This capacity is obtained by developing data services on top of a data integration library (several solutions exist on the market), and by designing a generic model adapted to the type of reference data.
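As a hedged illustration (the interfaces and data below are invented for the example, not taken from the text), a "Janus" reference component could expose the same data both through a legacy, batch-style exchange and through a message/service-style exchange:

```python
# Hedged sketch: a "Janus" reference component serves the same customer data
# through a flat-file export (face turned to the past) and through incremental
# JSON messages (face turned to the future). Names and data are hypothetical.
import csv
import io
import json
from typing import Dict, Iterable, List

class CustomerReferenceService:
    def __init__(self, records: List[Dict]):
        self._records = records

    def export_batch_csv(self) -> str:
        """Past-facing exchange: a flat file suitable for legacy batch transfers."""
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=["id", "name", "updated"])
        writer.writeheader()
        writer.writerows(self._records)
        return buf.getvalue()

    def stream_events(self, since: str) -> Iterable[str]:
        """Future-facing exchange: incremental messages for an ESB or service layer."""
        for rec in self._records:
            if rec["updated"] >= since:
                yield json.dumps(rec)

svc = CustomerReferenceService([
    {"id": "C1", "name": "Acme", "updated": "2015-04-01"},
    {"id": "C2", "name": "Globex", "updated": "2015-04-10"},
])
print(svc.export_batch_csv())
print(list(svc.stream_events(since="2015-04-05")))
```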


An interoperability architecture can, for example, be introduced to cover both the world of flow distribution (batches, messages) and that of service orchestration (web services, SOA,...), and to make them converge.

[Figure: an interoperability system combining service orchestration and flow distribution, connecting data wells, MDM, CRM, ERP, web and legacy applications.]

3.2.1 Generic data integration functions

The data integration library breaks down into several functions. It must also be enriched with a layer of services and steering. These functions are described below (one can also refer to Gartner's publications on the subject).

3.2.1.1 The "transport" function. This is the plumbing level that carries batches of data or messages: batch data file transfers on the one hand, and messages, for example through an ESB, on the other.

3.2.1.2 "Connectivity". These components enable to connect to a variety of databases of various publishers or the open source world. Allow others to interact in various formats: fixed fields, marked (XML, JSON, EDI), specific to ERP,...

3.2.1.3 "Conversion" functions. Building on the connectivity components, various formats can be converted into one another. In addition, some libraries allow, in a few clicks, data mappings that recompose new exchanges or flows from data selected in existing streams or exchanges. These components can be activated in a batch context, for example in a classic ETL, as well as in a 'message' or real-time context.
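A minimal sketch of such a conversion/mapping function (the field names are hypothetical); the same mapping can run inside a batch ETL loop or be applied to one incoming message in real time:

```python
# Minimal sketch of a conversion function: map a legacy XML record onto a pivot
# JSON exchange format. Field names are hypothetical.
import json
import xml.etree.ElementTree as ET

def xml_order_to_json(xml_text):
    """Map a legacy XML order record onto a pivot JSON exchange format."""
    root = ET.fromstring(xml_text)
    pivot = {
        "order_id": root.findtext("Id"),
        "customer": root.findtext("Client/Name"),
        "amount": float(root.findtext("Total")),
    }
    return json.dumps(pivot)

legacy = "<Order><Id>42</Id><Client><Name>Acme</Name></Client><Total>99.5</Total></Order>"
print(xml_order_to_json(legacy))   # {"order_id": "42", "customer": "Acme", "amount": 99.5}
```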

3.2.1.4 "Storage" function. This 'layer' is of variable use according to usage. The 'storage' function can use several techniques: Classic DBMS, or new family "NoSql" solutions. It is necessary to offer all historical views of data.


3.2.1.5 "Synchronizing" and data quality functions. Consolidating different flows and exchanges reveals various data quality problems: duplicates, discrepancies, inconsistent identification against the repositories, temporal inconsistencies, poor quality of identifiers, addresses,... These functions must encapsulate diagnosis and correction. They generally rely on solutions associated with Master Data Management (MDM).
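A toy sketch of such a reconciliation step (the matching rule is deliberately naive; real MDM and data-quality tools use far richer matching):

```python
# Toy sketch of a "synchronizing" / data-quality step: flag probable duplicates
# across two flows by comparing normalized keys. Deliberately naive.
def normalize(name):
    """Crude normalization used as a matching key."""
    return "".join(ch for ch in name.lower() if ch.isalnum())

def probable_duplicates(flow_a, flow_b):
    index = {normalize(r["name"]): r for r in flow_a}
    return [(index[normalize(r["name"])], r)
            for r in flow_b if normalize(r["name"]) in index]

crm = [{"id": "C1", "name": "ACME Corp."}]
erp = [{"id": "E9", "name": "Acme corp"}]
print(probable_duplicates(crm, erp))   # both records refer to the same customer
```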

3.2.1.6 "Data services" functions. Data services are aimed to expose and disseminate data. Functional approaches must expose them by grouped, batchs, latencies expected by internal or external structures, and applications that wish to use them. This is to develop and adapt to the domain.

3.2.1.7 The "cockpit". This level pilots and oversees flow distribution and service orchestration.

3.2.2 Generic modelling of "data wells"

The wells were initially designed to substitute n + p interactions for the n x p interactions that emerge naturally, according to the following scheme:

[Figure: the n x p flows and format conversions between sources (CRM, external sources, master data) and consumers (360 vision, business intelligence, partners) are replaced by a well with a single set of format conversions.]


For a discussion of the principle, see http://www.value-architecture.com/2014/03/a-la-decouverte-des-puits-de-donnees.html. The key to the design lies in identifying the generic "grain" that makes it possible to trace all the evolutions of objects. Data, at this finest "grain" level, is placed in the well. Defining the events and the different dates is essential to ensure the stability of the model (see also http://fr.slideshare.net/RenMANDEL/principe-du-puits). In principle the wells must not be intrusive, nor require process redesign. It is out of the question to dispossess applications of their processes, which are in general deeply nested. Wells do, however, provide traceability, in particular for monitoring data quality (cf. the "tri-dated" generic model).

[Figure: the "tri-dated" generic model of a well.]
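The sketch below illustrates one plausible reading of this "tri-dated" grain (the exact semantics of the three dates is an assumption made for illustration, not the author's definition): each record carries a business event date, the date the fact became known, and the date it was loaded into the well, which makes historical views reconstructible.

```python
# Hedged sketch of a well record at the finest "grain". The three dates below
# are one plausible reading of the "tri-dated" model, assumed for illustration.
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class WellRecord:
    object_id: str        # identifier of the traced business object
    attribute: str        # which facet of the object the event changes
    value: str            # value carried by the event
    event_date: date      # when the business event occurred
    knowledge_date: date  # when the organization learned of it
    load_date: date       # when it was written into the well

history = [
    WellRecord("C1", "address", "12 rue X", date(2015, 3, 1), date(2015, 3, 3), date(2015, 3, 4)),
    WellRecord("C1", "address", "5 av. Y", date(2015, 4, 2), date(2015, 4, 2), date(2015, 4, 3)),
]

# Events are appended at the finest grain and never overwritten, so any
# historical view ("state as known on a given date") can be rebuilt.
as_known_on = date(2015, 3, 15)
latest = max((r for r in history if r.knowledge_date <= as_known_on),
             key=lambda r: r.event_date)
print(latest.value)   # -> "12 rue X"
```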

There is a simple way to identify potential wells: objectify the cycles present in the ecosystem (see http://www.value-architecture.com/2013/02/les-azimuts-des-chaines-de-valeur.html), because each cycle is at the origin of events and therefore motivates a dedicated well. A well, as a key to coherence, is aimed at tracing one such cycle.

3.2.3 Case of master data

Note that master data, which structure the IS, are rather positioned between several cycles and ensure inter-cycle consistency. The market of MDM solutions provides flexible and operational answers. Using the integration library also allows a non-intrusive and scalable insertion, unlike some heavily governance-centered MDM projects. The principle is to organize a focal point for disseminating the data that currently play the role of master data.


This does not exclude a gradually controlled evolution, in particular to ensure data quality and synchronize the sources. The impact on the supply processes must then be examined and involves some reengineering. Some data and control functions can be shared between wells and master data repositories.

3.2.4 Manage subsidiarity

One might think that there is only one possible configuration for the reference data architecture. In reality, the perimeters of these systems can vary, and current solutions can anticipate this flexibility. As mentioned above, subsidiarity should on this occasion be modeled, and the model propagated between the different repositories and wells. A single, fully integrated model is unlikely to allow a balance between:

• Consistency on objects and on common concepts,

• Essential autonomy.

Subsidiarity is a sensitive issue, and the "cursors" of subsidiarity are not always obvious. It is therefore useful to make this setting adjustable, to create flexibility and withstand possible organizational earthquakes.

3.3 PROFESSIONALIZING INFORMATION SYSTEM ACTORS

Around this issue of IS simplification, the opportunities presented above should be seized, provided the actors move beyond today's craft practices. The overall question is how to redirect part of the energy currently spent on opportunistic investments, or on heavy methodological commitments, towards:

• Investing, at lower cost, in "Janusian" components, avoiding the pitfall of conventional MDM approaches that are too greedy in governance and "big bang" deployments,

• Promoting internal "commercialization", with offers of high-level data services,

• Taking into account the needs of the projects, at the decentralized level.

[Figure: a focal point combining the wells ("360 vision") and the conversion library.]


4 AN ALTERNATIVE TO GOVERNANCE

4.1 RESISTANCE TO PYRAMIDAL GOVERNANCE AND HEAVY METHODS

We have seen that the response to the inconsistencies, drifts and other confusions that occur naturally has been to introduce ever more governance of data and, more particularly, of reference data. This governance does respond to a major control issue. However it runs into several forms of resistance:

• It requires an organization to be set up, often with dedicated resources,

• There is often internal opposition, between different objectives and different trades, and a pyramidal build-up in the organization,

• The transitional arrangements for implementing the system (MDM or other), in a rational and exclusive approach, are long and strewn with pitfalls,

• The conceptual foundations of these systems are fragile, and do not solve the issues of subsidiarity, dating, archiving, traceability,...

Historical Enterprise Architecture methodologies are also heavy (impressive documentation volumes, consulting...) and costly to implement ("certified" personnel, specialized procedures), and find their justification only in large projects, which combine high costs and high risks.

4.2 HAVE DECISIONS NATURALLY GO IN THE RIGHT DIRECTION

We have seen that the availability on the market of "data integration" routines makes it possible to create, around the wells and repositories, a conversion belt giving access in all temporalities and latencies, and through all types of protocols. These components, organized into 'data services', absorb the complexity of the "dialects" and opportunistic variabilities. The challenge is to make these services transparent, easily accessible, operational and demonstrable through a proof of concept, so that project stakeholders make good decisions without further ado. The technological opportunity offers, at low cost, a solution to the enormous issues mentioned here. The motivation must come from the project actors themselves, responding to their own wishes:

• Cost control,

• Availability in the short term,

• Agility in the medium term,


• Guarantees of reliability.

Project decisions will then 'naturally' go in the right direction, that of simplicity. And these same forces, acting on an imperfect legacy with 'Janusian' components, will gradually simplify the IS.

4.3 SYNCHRONIZE DATA ON SIMPLICITY

In this quest for simplicity, reference data are key. They too exist universally, in all contexts. They are the pillars of every IS, in the existing systems as in the target. They trace the cycles of objects across all ages and organizations:

• People, structures, products, concepts, services, entities...

The IS has become complicated by the overlapping of features; we must recover fundamental modularity and simplicity by synchronizing on these pillars. This requires managing particularities in all their dimensions: subsidiarities, dates, latencies, protocols,... This will guide the capitalization needed for the new advances of extensive digitalization: technical disruption (Cloud, Big Data, Hadoop) and the cultural gap call for a renovated vision, out of the silos, far from IS and process mapping approaches. A vision aligned with the fundamental value chains and with the invariants of reference data models.

4.4 CHANGE OF PARADIGM

The EA paradigm is based on a rational approach that claims to control the IS through a global vision. The dominant international trend, however, favors 'reassuring' standards, which look serious given the thickness of their documentation and the number of experts they mobilize. This trend is relayed by trainings and certifications, which spread and perpetuate the model. Certainly, the need for broad guidelines remains, and the usefulness of reflections on the target should not be rejected. But the paradigm clearly needs to be challenged and revisited, the burden of proof reversed, and solutions explored that are better adapted to today's "agile", fully technological and connected civilization.