
Interpretable neural networks based on continuous-valued logic and multicriteria decision operators

Orsolya Csiszár a,b,∗, Gábor Csiszár c, József Dombi d

a Faculty of Basic Sciences, University of Applied Sciences Esslingen, Esslingen, Germany
b Institute of Applied Mathematics, Óbuda University, Budapest, Hungary
c Institute of Materials Physics, University of Stuttgart, Stuttgart, Germany
d Institute of Informatics, University of Szeged, Szeged, Hungary

Abstract

Combining neural networks with continuous logic and multicriteria decision making tools can reduce the black box nature of neural models. In this study, we show that nilpotent logical systems offer an appropriate mathematical framework for a hybridization of continuous nilpotent logic and neural models, helping to improve the interpretability and safety of machine learning. In our concept, perceptrons model soft inequalities, namely membership functions and continuous logical operators. We design the network architecture before training, using continuous logical operators and multicriteria decision tools with given weights working in the hidden layers. Designing the structure appropriately leads to a drastic reduction in the number of parameters to be learned. The theoretical basis offers a straightforward choice of activation functions (the cutting function or its differentiable approximation, the squashing function) and also suggests an explanation for the great success of the rectified linear unit (ReLU). In this study, we focus on the architecture of a hybrid model and introduce the building blocks for future application in deep neural networks. The concept is illustrated with some toy examples taken from an extended version of the TensorFlow Playground.

Keywords: neural network, XAI, continuous logic, nilpotent logic, adversarial problems

1. Introduction

AI techniques, especially deep learning models, are revolutionizing the business and technology world. One of the greatest challenges is the increasing need to address the problem of interpretability and to improve model transparency, performance and safety. Although deep neural networks have achieved impressive experimental results, e.g. in image classification, they may be surprisingly unstable when it comes to adversarial perturbations, that is, minimal changes to the input image that cause the network to misclassify it [1, 17, 22, 23]. Combining deep neural networks with structured logical rules and multicriteria decision tools, where logical operators are applied on clusters created in the first layer, contributes to the reduction of the black box nature of neural models. Aiming at interpretability, transparency and safety, implementing continuous-valued logical operators offers a promising direction.

Although Boolean units and multilayer perceptrons have a long history, to the best of our knowledge there has been little attempt to combine neural networks with continuous logical systems so far. The basic idea of continuous logic

∗ Corresponding author
Email addresses: [email protected] (Orsolya Csiszár), [email protected] (Gábor Csiszár), [email protected] (József Dombi)

is the replacement of the space of truth values {T, F} by a compact interval such as [0, 1]. This means that the inputs and the outputs of the extended logical gates are real numbers of the unit interval, representing truth values of inequalities. Quantifiers ∀x and ∃x are replaced by sup_x and inf_x, and logical connectives are continuous functions. Based on this idea, human thinking and natural language can be modeled in a sophisticated way.

Among other families of many-valued logics, t-norm fuzzy logics are broadly used in applied fuzzy logic and fuzzy set theory as a theoretical basis for approximate reasoning. In fuzzy logic, the membership function of a fuzzy set represents the degree of truth as a generalization of the indicator function in classical sets. Both propositional and first-order (or higher-order) t-norm fuzzy logics, as well as their expansions by modal and other operators, have been studied thoroughly. Important examples of t-norm fuzzy logics are the monoidal t-norm logic of all left-continuous t-norms, the basic logic of all continuous t-norms, the product fuzzy logic of the product t-norm, and the nilpotent minimum logic of the nilpotent minimum t-norm. Some independently motivated logics belong among t-norm fuzzy logics as well, like Łukasiewicz logic (which is the logic of the Łukasiewicz t-norm) and Gödel-Dummett logic (which is the logic of the minimum t-norm).

Recent results [3, 5–9] show that in the field of continuous logic, nilpotent logical systems are the most suitable

Preprint submitted to Information Sciences February 10, 2020

arXiv:1910.02486v2 [cs.AI] 7 Feb 2020


for neural computation, mainly because of their bounded generator functions. Moreover, among other preferable properties, the fulfillment of the law of contradiction and the excluded middle, and the coincidence of the residual and the S-implication [10, 25] also make the application of nilpotent operators in logical systems promising. In [3, 5–9], a rich set of operators was examined thoroughly: in [5], negations, conjunctions and disjunctions; in [6], implications; and in [7], equivalence operators. In [8], a parametric form of a general operator o_ν was given by using a shifting transformation of the generator function. Varying the parameters, nilpotent conjunctive, disjunctive, aggregative (where a high input can compensate for a lower one) and negation operators can all be obtained. Moreover, as was shown in [9], membership functions, which play a substantial role in the overall performance of fuzzy representation, can also be defined by means of a generator function.

In this study, we introduce a nilpotent neural model, where nilpotent logical operators and multicriteria decision tools are implemented in the hidden layers of neural networks (see Figure 3). Only the weights of the first layer (the parameters of the hyperplanes separating the decision space) are to be learned, while the architecture is designed in advance. In a more sophisticated version, left for future work, the type of the operators in the hidden layers can also be learned by the network, or e.g. by a genetic algorithm. Moreover, in [9] the authors showed that the most important logical operators can be expressed as a composition of a parametric unary operator and the arithmetic mean. This means that the neural network only needs to learn the parameters of the first layer and (if not initially given) the parameters of these unary operators in the hidden layers.

In the nilpotent neural model, the activation functions in the first layer are membership functions representing truth values of inequalities, normalizing the inputs. At the same time, the activation functions in the hidden layers model the cutting function (or, to avoid the vanishing gradient problem, its differentiable approximation, the so-called squashing function) in the nilpotent logical operators. The theoretical background offers a straightforward choice of activation functions: the squashing function, which is an approximation of the rectifier. The fact that the squashing function, in contrast to the rectifier, is bounded from above makes the continuous logical concept applicable.

The article is organized as follows. After summarizing the most important related work in Section 2, we revisit the relevant preliminaries concerning nilpotent logical systems in Section 3. The nilpotent neural concept is described in Section 4. In Section 5, the model is illustrated with some extended TensorFlow Playground examples. Finally, the main results are summarized in Section 6.

2. Related Work

Combinations of neural networks and logic rules have been considered in different contexts. Neuro-fuzzy systems [20] were examined thoroughly in the literature. These hybrid intelligent systems synergize the human-like reasoning style of fuzzy systems with the learning structure of neural networks through the use of fuzzy sets and a linguistic model consisting of a set of IF-THEN fuzzy rules. These models were the first attempts to combine continuous logical elements and neural computation.

KBANN [24] and neural-symbolic systems [14], such as CILP++ [13], constructed network architectures from given rules to perform knowledge acquisition.

Kulkarni et al. [19] used a specialized training procedure to obtain an interpretable neural layer of an image network.

In [18], Hu et al. proposed a general framework capable of enhancing various types of neural networks (e.g., CNNs and RNNs) with declarative first-order logic rules. Specifically, they developed an iterative distillation method that transfers the structured information of logic rules into the weights of neural networks. With a few highly intuitive rules, they obtained substantial improvements and achieved state-of-the-art or comparable results to previous best-performing systems.

In [26], Xu et al. developed a novel methodology for using symbolic knowledge in deep learning by deriving a semantic loss function that bridges between neural output vectors and logical constraints. This loss function captures how close the neural network is to satisfying the constraints on its output.

In [12], Fischer et al. presented DL2, a system for training and querying neural networks with logical constraints. Using DL2, one can declaratively specify domain knowledge constraints to be enforced during training, as well as pose queries on the model to find inputs that satisfy a set of constraints. DL2 works by translating logical constraints into a loss function with desirable mathematical properties. The loss is then minimized with standard gradient-based methods.

All of these promising approaches point towards the desirable mathematical framework that nilpotent logical systems can offer. Our general aspiration here is to provide a general mathematical framework in order to benefit from a tight integration of machine learning and continuous logical methods.

3. Nilpotent Logical Systems and Multicriteria Decision Tools

In this section, we show why a specific logical system, the nilpotent logical system, is well-suited to the neural environment. First, we provide some basic preliminaries.

The most important operators in classical logic are the conjunction, the disjunction and the negation operator.



Figure 1: Nilpotent conjunction and disjunction followed by their approximations using the squashing function

These three basic operators together form a so-called connective system. When extending classical logic to continuous logic, compatibility and consistency are crucial. The negation should also be involutive, i.e. n(n(x)) = x for all x ∈ [0, 1]. Involutive negations are called strong negations.

Definition 1. The triple (c, d, n), where c is a t-norm, d is a t-conorm and n is a strong negation, is called a connective system.

As mentioned in the Introduction, numerous continuous logical systems have been introduced and studied in the literature. In this study, we will show how nilpotent logical systems relate to neural networks.

Definition 2. [5] A connective system is nilpotent if the conjunction c is a nilpotent t-norm, and the disjunction d is a nilpotent t-conorm.

In the nilpotent case, the generator functions of the conjunction and the disjunction (denoted by t(x) and s(x), respectively) are bounded functions, determined up to a multiplicative constant. This means that they can be normalized in the following way:

f_c(x) := t(x)/t(0),    f_d(x) := s(x)/s(1).    (1)

Note that the normalized generator functions are now uniquely defined.

Next, we recall the definition of the cutting function to simplify the notation used. The differentiable approximation of this cutting function, the squashing function S(x) introduced and examined in [16], will be a ReLU-like bounded activation function in our model. In [8], the authors showed that all the nilpotent operators can be described by using one generator function f(x) and the cutting function.

Definition 3. Let us define the cutting operation [ ] by

[x] = 0 if x < 0,
      x if 0 ≤ x ≤ 1,
      1 if 1 < x.

Remark 1. Note that the cutting function has the same values as the ReLU (rectified linear unit) for x ≤ 1, but it remains bounded for all x ∈ R.
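As a quick numerical illustration of this remark (a minimal sketch; the helper names `cut` and `relu` are ours, not from the text), the cutting operation is a clamp to the unit interval, coinciding with the ReLU below 1:

```python
import numpy as np

def cut(x):
    """Cutting operation [x]: clamp the input to the unit interval."""
    return np.clip(x, 0.0, 1.0)

def relu(x):
    """Rectified linear unit, unbounded from above."""
    return np.maximum(x, 0.0)

xs = np.linspace(-1.0, 2.0, 301)
# The two functions agree for x <= 1 ...
assert np.allclose(cut(xs[xs <= 1.0]), relu(xs[xs <= 1.0]))
# ... but only the cutting function stays bounded above 1.
assert float(cut(1.5)) == 1.0 and float(relu(1.5)) == 1.5
```
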

Proposition 1. With the help of the cutting operator, we can write the conjunction and disjunction in the following form, where f_c and f_d are decreasing and increasing normalized generator functions, respectively:

c(x, y) = f_c^{-1}[f_c(x) + f_c(y)],    (2)

d(x, y) = f_d^{-1}[f_d(x) + f_d(y)].    (3)

Remark 2. For the natural negations to coincide, as shown in [5], f_c(x) + f_d(x) = 1 must hold for all x ∈ [0, 1], which means that only one generator function, e.g. f_d(x), is needed to describe the operators. Henceforth, f_d is represented by f(x).

Remark 3. Note that the min and max operators (often used as conjunction and disjunction in applications) can also be expressed by [ ] in the following way:

min(x, y) = [x + [y − x + 1] − 1],    (4)

max(x, y) = [x + [y − x]],    (5)

where x, y ∈ [0, 1].
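Identities (4) and (5) are easy to check numerically; the sketch below (helper names are ours) verifies them on a grid over the unit square:

```python
import itertools

def cut(x):
    """Cutting operation [x]."""
    return min(max(x, 0.0), 1.0)

def nil_min(x, y):
    return cut(x + cut(y - x + 1.0) - 1.0)  # Eq. (4)

def nil_max(x, y):
    return cut(x + cut(y - x))              # Eq. (5)

grid = [i / 10.0 for i in range(11)]
for x, y in itertools.product(grid, repeat=2):
    assert abs(nil_min(x, y) - min(x, y)) < 1e-12
    assert abs(nil_max(x, y) - max(x, y)) < 1e-12
```
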

The associativity of t-norms and t-conorms permits us to consider their extensions to the multivariable case. In [8], the authors examined a general parametric operator o_ν(x) of nilpotent systems.



Table 1: The most important two-variable operators o_w(x)

                        w1     w2    C      o_w(x, y)                      for f(x) = x     Notation
LOGICAL OPERATORS
  disjunction            1      1    0      f^{-1}[f(x) + f(y)]            [x + y]          d(x, y)
  conjunction            1      1   −1      f^{-1}[f(x) + f(y) − 1]        [x + y − 1]      c(x, y)
  implication           −1      1    1      f^{-1}[f(y) − f(x) + 1]        [y − x + 1]      i(x, y)
MULTICRITERIA DECISION TOOLS
  arithmetic mean        0.5    0.5  0      f^{-1}[(f(x) + f(y))/2]        (x + y)/2        m(x, y)
  preference            −0.5    0.5  0.5    f^{-1}[(f(y) − f(x) + 1)/2]    (y − x + 1)/2    p(x, y)
  aggregative operator   1      1   −0.5    f^{-1}[f(x) + f(y) − 1/2]      [x + y − 1/2]    a(x, y)

Definition 4. Let f : [0, 1] → [0, 1] be an increasing bijection, ν ∈ [0, 1], and x = (x_1, . . . , x_n), where x_i ∈ [0, 1], and let us define the general operator by

o_ν(x) = f^{-1}[ Σ_{i=1}^n (f(x_i) − f(ν)) + f(ν) ] = f^{-1}[ Σ_{i=1}^n f(x_i) − (n − 1) f(ν) ].    (6)

Remark 4. Note that the general operator for ν = 1 is conjunctive, for ν = 0 it is disjunctive, and for ν = ν∗ = f^{-1}(1/2) it is self-dual.

On the basis of Remark 4, the conjunction, the disjunction and the aggregative operator can be defined in the following way.

Definition 5. Let f : [0, 1] → [0, 1] be an increasing bijection and x = (x_1, . . . , x_n), where x_i ∈ [0, 1]. Let us define the conjunction, the disjunction and the aggregative operator by

c(x) := o_1(x) = f^{-1}[ Σ_{i=1}^n f(x_i) − (n − 1) ],    (7)

d(x) := o_0(x) = f^{-1}[ Σ_{i=1}^n f(x_i) ],    (8)

a(x) := o_{ν∗}(x) = f^{-1}[ Σ_{i=1}^n f(x_i) − (n − 1)/2 ],    (9)

respectively, where ν∗ = f^{-1}(1/2).

A conjunction, a disjunction and an aggregative operator differ only in one parameter of the general operator in (6). The parameter ν has the semantic meaning of the level of expectation: maximal for the conjunction, neutral for the aggregation and minimal for the disjunction. Next, let us recall the weighted form of the general operator:

Definition 6. Let w ∈ R^n, f : [0, 1] → [0, 1] an increasing bijection, ν ∈ [0, 1], and x = (x_1, . . . , x_n), where x_i ∈ [0, 1]. The weighted general operator is defined by

o_{ν,w}(x) := f^{-1}[ Σ_{i=1}^n w_i (f(x_i) − f(ν)) + f(ν) ].    (10)

Note that if the weight vector is normalized, i.e. Σ_{i=1}^n w_i = 1, then

o_{ν,w}(x) = f^{-1}( Σ_{i=1}^n w_i f(x_i) ).    (11)
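For normalized weights the ν-dependence indeed cancels, as a quick check with f(x) = x shows (a sketch; the function name is ours, and for convex weights the result stays in [0, 1], so no truncation is needed):

```python
def o_nu_w(xs, w, nu):
    """Weighted general operator of Eq. (10) for f(x) = x."""
    return sum(wi * (xi - nu) for wi, xi in zip(w, xs)) + nu

xs = [0.2, 0.9]
w = [0.5, 0.5]
# With sum(w) = 1 the operator reduces to the weighted mean of Eq. (11),
# independently of nu:
assert abs(o_nu_w(xs, w, nu=0.0) - 0.55) < 1e-12
assert abs(o_nu_w(xs, w, nu=1.0) - 0.55) < 1e-12
```
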

For future application, we introduce a threshold-based operator in the following way.

Definition 7. Let w ∈ R^n, x = (x_1, . . . , x_n) ∈ [0, 1]^n, ν = (ν_1, . . . , ν_n) ∈ [0, 1]^n, and let f : [0, 1] → [0, 1] be a strictly increasing bijection. Let us define the threshold-based nilpotent operator by

o_{ν,w}(x) = f^{-1}[ Σ_{i=1}^n w_i (f(x_i) − f(ν_i)) + f(ν) ] = f^{-1}[ Σ_{i=1}^n w_i f(x_i) + C ],    (12)

where

C = f(ν) − Σ_{i=1}^n w_i f(ν_i).    (13)

Note that for f(x) = x, (12) gives the functions modeled by perceptrons in neural networks:

[ Σ_{i=1}^n w_i x_i + C ].    (14)

Based on Equations (7) to (9), it is easy to see that the conjunction, the disjunction and also the aggregative operator can be expressed in this form. The most commonly used operators for n = 2, for special values of w_i and C, and for f(x) = x, are listed in Table 1.
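In other words, each operator in Table 1 is a single perceptron with fixed weights and bias and the cutting function as activation. A minimal sketch for n = 2 (helper names are ours, not from the text):

```python
import numpy as np

def cut(x):
    """Cutting activation [x]."""
    return float(np.clip(x, 0.0, 1.0))

def perceptron(x, w, C):
    """Eq. (14): [w . x + C], a linear unit with the cutting activation."""
    return cut(np.dot(w, x) + C)

x = np.array([0.8, 0.4])
# Weight/bias choices taken from Table 1 (f(x) = x):
assert np.isclose(perceptron(x, [1.0, 1.0], 0.0), 1.0)    # disjunction [x + y]
assert np.isclose(perceptron(x, [1.0, 1.0], -1.0), 0.2)   # conjunction [x + y - 1]
assert np.isclose(perceptron(x, [-1.0, 1.0], 1.0), 0.6)   # implication [y - x + 1]
assert np.isclose(perceptron(x, [0.5, 0.5], 0.0), 0.6)    # arithmetic mean (x + y)/2
```
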



Table 2: The most important unary operators o_{α,γ}(x)

               γ            o_{α,γ}(x)                   for f(x) = x        Notation
  possibility  0            f^{-1}[αf(x)]                [αx]                τ_P(x)
  necessity    1 − α        f^{-1}[αf(x) − (α − 1)]      [αx − (α − 1)]      τ_N(x)
  sharpness    (1 − α)/2    f^{-1}[αf(x) − (α − 1)/2]    [αx − (α − 1)/2]    τ_S(x)

Now let us focus on the unary (1-variable) case, examined in [9], which also plays an important role in the nilpotent neural model. The unary operators are mainly used to construct modifiers and membership functions by using a generator function. The membership functions can be interpreted as modeling an inequality [4]. Note that non-symmetrical membership functions can also be constructed by connecting two unary operators with a conjunction [3, 9].

Definition 8. Let x ∈ [0, 1], α, γ ∈ R, and let f : [0, 1] → [0, 1] be a strictly increasing bijection. Then

o_{α,γ}(x) := f^{-1}[αf(x) + γ].    (15)

Remark 5. Note that, as shown in [9], Equation (15) composed with the (weighted) arithmetic mean operator as an inner function yields Equation (12).

The most important unary operators for special γ values are listed in Table 2.
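For f(x) = x these unary operators are again single-input perceptrons with the cutting activation; a quick check of the Table 2 rows (a sketch; function names are ours, not from the text):

```python
def cut(x):
    """Cutting operation [x]."""
    return min(max(x, 0.0), 1.0)

def o_ag(x, alpha, gamma):
    """Unary operator of Eq. (15) for f(x) = x: [alpha * x + gamma]."""
    return cut(alpha * x + gamma)

alpha = 2.0
x = 0.3
assert o_ag(x, alpha, 0.0) == 0.6                              # possibility [ax]
assert o_ag(x, alpha, 1.0 - alpha) == 0.0                      # necessity [ax - (a - 1)]
assert abs(o_ag(x, alpha, (1.0 - alpha) / 2.0) - 0.1) < 1e-9   # sharpness [ax - (a - 1)/2]
```
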

Our attention can now be turned to the cutting function. The main drawback of the cutting function in the nilpotent operator family is the lack of differentiability, which would be necessary for numerous practical applications. Although most fuzzy applications (e.g. embedded fuzzy control) use piecewise linear membership functions owing to their easy handling, there are areas where the parameters are learned by a gradient-based optimization method. In this case, the lack of continuous derivatives makes the application impossible. For example, the membership functions have to be differentiable for each input in order to fine-tune a fuzzy control system by a simple gradient-based technique. This problem can be solved by using the so-called squashing function, a continuously differentiable approximation of the cutting function.

The squashing function given in Definition 9 is a continuously differentiable approximation of the generalized cutting function by means of sigmoid functions (see Figure 2).

Definition 9. The squashing function [9, 16] is defined as

S^{(β)}_{a,λ}(x) = (1/(λβ)) ln[ (1 + e^{β(x − (a − λ/2))}) / (1 + e^{β(x − (a + λ/2))}) ] = (1/(λβ)) ln[ σ^{(−β)}_{a+λ/2}(x) / σ^{(−β)}_{a−λ/2}(x) ],

Figure 2: Squashing functions for a = 0.5, λ = 1, for different β values (β_1 = 1, β_2 = 2, β_3 = 5, and β_4 = 50)

where x, a, λ, β ∈ R and σ^{(β)}_d(x) denotes the logistic function:

σ^{(β)}_d(x) = 1 / (1 + e^{−β(x − d)}).    (16)
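Definition 9 can be implemented directly; the sketch below (function names are ours) evaluates ln(1 + e^z) via logaddexp for numerical stability at large β, and checks that the maximum deviation from the generalized cutting function shrinks as β grows:

```python
import numpy as np

def squash(x, a=0.5, lam=1.0, beta=50.0):
    """Squashing function S^(beta)_{a,lambda}(x) of Definition 9, computed stably."""
    z1 = beta * (x - (a - lam / 2.0))
    z2 = beta * (x - (a + lam / 2.0))
    # ln(1 + e^z) evaluated as logaddexp(0, z) to avoid overflow for large beta
    return (np.logaddexp(0.0, z1) - np.logaddexp(0.0, z2)) / (lam * beta)

def gen_cut(x, a=0.5, lam=1.0):
    """Generalized cutting function with center a and width lam."""
    return np.clip((x - a) / lam + 0.5, 0.0, 1.0)

xs = np.linspace(-1.0, 2.0, 601)
err = lambda beta: np.max(np.abs(squash(xs, beta=beta) - gen_cut(xs)))
# The deviation from the cutting function decreases as beta grows.
assert err(500.0) < err(5.0) / 5.0
```
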

By increasing the value of β, the squashing function approaches the generalized cutting function. In other words, β determines the accuracy of the approximation, while the parameters a and λ determine the center and width. The error of the approximation can be upper bounded by c/β, which means that by increasing the parameter β, the error decreases by the same order of magnitude. The derivatives of the squashing function are easy to calculate and can be expressed in terms of sigmoid functions and the squashing function itself:

∂S^{(β)}_{a,λ}(x)/∂x = (1/λ) (σ^{(β)}_{a−λ/2}(x) − σ^{(β)}_{a+λ/2}(x)),    (17)

∂S^{(β)}_{a,λ}(x)/∂a = (1/λ) (σ^{(β)}_{a+λ/2}(x) − σ^{(β)}_{a−λ/2}(x)),    (18)

∂S^{(β)}_{a,λ}(x)/∂λ = −(1/λ) S^{(β)}_{a,λ}(x) + (1/(2λ)) (σ^{(β)}_{a+λ/2}(x) + σ^{(β)}_{a−λ/2}(x)).    (19)
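Equation (17) can be cross-checked against a central finite difference of the squashing function itself (a sketch under the same notation; helper names are ours):

```python
import numpy as np

def sigma(x, d, beta):
    """Logistic function sigma^(beta)_d(x) of Eq. (16)."""
    return 1.0 / (1.0 + np.exp(-beta * (x - d)))

def squash(x, a, lam, beta):
    """Squashing function of Definition 9, via stable log(1 + e^z)."""
    z1 = beta * (x - (a - lam / 2.0))
    z2 = beta * (x - (a + lam / 2.0))
    return (np.logaddexp(0.0, z1) - np.logaddexp(0.0, z2)) / (lam * beta)

a, lam, beta = 0.5, 1.0, 5.0
x = np.linspace(-1.0, 2.0, 31)
# Right-hand side of Eq. (17) ...
analytic = (sigma(x, a - lam / 2.0, beta) - sigma(x, a + lam / 2.0, beta)) / lam
# ... against a central finite difference of S
h = 1e-6
numeric = (squash(x + h, a, lam, beta) - squash(x - h, a, lam, beta)) / (2.0 * h)
assert np.max(np.abs(analytic - numeric)) < 1e-5
```
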

The squashing function defined above is an approximation of the rectifier (rectified linear unit, ReLU) for x ≤ 1, with the benefit of having an upper bound. Being bounded from above makes the use of continuous logic possible. Also note the significant difference between the properties of the squashing function and the sigmoid: using sigmoids, nilpotent logic can never be modeled. The fact that, on the other hand, the ReLU can approximate the cutting function may offer an interpretation of its effectiveness and success. The fact that the squashing function is differentiable and its derivatives can be expressed by sigmoids improves efficiency in applications. An illustration of the nilpotent conjunction and disjunction operators with their soft approximations using the squashing function is shown in Figure 1. Note that not only logical operators but also multicriteria decision tools, like the preference operator, can be described similarly. This means that our model offers a unified framework in which logic and multicriteria decision tools cooperate and supplement each other.

4. Nilpotent Logic-based Interpretation of Neural Networks

The results on nilpotent logical systems discussed in Section 3 offer a new approach to designing neural networks using continuous logic, since membership functions (representing the truth value of an inequality) and also nilpotent logical operators can be modeled by perceptrons. Whether for image classification or for multicriteria decision support, structured logical rules can contribute to the performance of a deep neural network. Given that the network has to find a region in the decision space or in an image, after designing the architecture appropriately, the network only has to find the parameters of the boundary. Here, we propose creating basic building blocks by applying the nilpotent logical concept in the perceptron model and also in the neural architecture.

Boolean units and multilayer perceptrons have a long history. Logical gates (such as the AND, NOT and OR gates) are the basis of any modern-day computer. It is well known that any Boolean function can be composed using a multilayer perceptron. As examples, the conjunction and the disjunction are illustrated in Figure 6. Note that for the XOR gate, an additional hidden layer is also required. It can be shown that a network of linear classifiers that fires if the input is in a given area with arbitrarily complex decision boundaries can be constructed with only one hidden layer and a single output. This means that if a neural network learns to separate different regions in the n-dimensional space having n input values, each node in the first layer can separate the space into two half-spaces by drawing one hyperplane, while the nodes in the hidden layers can combine them using logical operators.

In Figure 5, some basic types of neural networks are shown with two input values, finding different regions of the plane. Generally speaking, each node in the neural net represents one threshold and therefore it can draw one line in the picture. The line may be diagonal if the nodes

Figure 3: Nilpotent neural model (hyperplanes in the decision space; soft inequalities; logical operators and multicriteria decision tools; circles for typical (counter-)examples; the parameters of the borders are learned)

[Figure: a perceptron computing the truth value of an inequality: weighted sum + bias, followed by the cutting / squashing function]
sha1_base64="Akp+sGFHCQNeqan2vZLSuyZOX9Y=">AAAB63icbVBNS8NAEJ34WetX1aOXxSJ4KkkR9Fj04rGC/YA2lM120y7d3YTdiVhC/4IXD4p49Q9589+YtDlo64OBx3szzMwLYiksuu63s7a+sbm1Xdop7+7tHxxWjo7bNkoM4y0Wych0A2q5FJq3UKDk3dhwqgLJO8HkNvc7j9xYEekHnMbcV3SkRSgYxVx6GtTLg0rVrblzkFXiFaQKBZqDyld/GLFEcY1MUmt7nhujn1KDgkk+K/cTy2PKJnTEexnVVHHrp/NbZ+Q8U4YkjExWGslc/T2RUmXtVAVZp6I4tsteLv7n9RIMr/1U6DhBrtliUZhIghHJHydDYThDOc0IZUZktxI2poYyzOLJQ/CWX14l7XrNc2ve/WW1cVPEUYJTOIML8OAKGnAHTWgBgzE8wyu8Ocp5cd6dj0XrmlPMnMAfOJ8/Qk2NtQ==</latexit>

xn<latexit sha1_base64="lCtG22HLO6NTgevdeCwz7kRPYV0=">AAAB63icbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRi8cK9gPaUDbbSbt0dxN2N2IJ/QtePCji1T/kzX9j0uagrQ8GHu/NMDMviAU31nW/ndLa+sbmVnm7srO7t39QPTxqmyjRDFssEpHuBtSg4ApblluB3VgjlYHATjC5zf3OI2rDI/VgpzH6ko4UDzmjNpeeBqoyqNbcujsHWSVeQWpQoDmofvWHEUskKssENabnubH1U6otZwJnlX5iMKZsQkfYy6iiEo2fzm+dkbNMGZIw0lkpS+bq74mUSmOmMsg6JbVjs+zl4n9eL7HhtZ9yFScWFVssChNBbETyx8mQa2RWTDNCmebZrYSNqabMZvHkIXjLL6+S9kXdc+ve/WWtcVPEUYYTOIVz8OAKGnAHTWgBgzE8wyu8OdJ5cd6dj0VrySlmjuEPnM8fnXmN8Q==</latexit><latexit sha1_base64="lCtG22HLO6NTgevdeCwz7kRPYV0=">AAAB63icbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRi8cK9gPaUDbbSbt0dxN2N2IJ/QtePCji1T/kzX9j0uagrQ8GHu/NMDMviAU31nW/ndLa+sbmVnm7srO7t39QPTxqmyjRDFssEpHuBtSg4ApblluB3VgjlYHATjC5zf3OI2rDI/VgpzH6ko4UDzmjNpeeBqoyqNbcujsHWSVeQWpQoDmofvWHEUskKssENabnubH1U6otZwJnlX5iMKZsQkfYy6iiEo2fzm+dkbNMGZIw0lkpS+bq74mUSmOmMsg6JbVjs+zl4n9eL7HhtZ9yFScWFVssChNBbETyx8mQa2RWTDNCmebZrYSNqabMZvHkIXjLL6+S9kXdc+ve/WWtcVPEUYYTOIVz8OAKGnAHTWgBgzE8wyu8OdJ5cd6dj0VrySlmjuEPnM8fnXmN8Q==</latexit><latexit sha1_base64="lCtG22HLO6NTgevdeCwz7kRPYV0=">AAAB63icbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRi8cK9gPaUDbbSbt0dxN2N2IJ/QtePCji1T/kzX9j0uagrQ8GHu/NMDMviAU31nW/ndLa+sbmVnm7srO7t39QPTxqmyjRDFssEpHuBtSg4ApblluB3VgjlYHATjC5zf3OI2rDI/VgpzH6ko4UDzmjNpeeBqoyqNbcujsHWSVeQWpQoDmofvWHEUskKssENabnubH1U6otZwJnlX5iMKZsQkfYy6iiEo2fzm+dkbNMGZIw0lkpS+bq74mUSmOmMsg6JbVjs+zl4n9eL7HhtZ9yFScWFVssChNBbETyx8mQa2RWTDNCmebZrYSNqabMZvHkIXjLL6+S9kXdc+ve/WWtcVPEUYYTOIVz8OAKGnAHTWgBgzE8wyu8OdJ5cd6dj0VrySlmjuEPnM8fnXmN8Q==</latexit><latexit 
sha1_base64="lCtG22HLO6NTgevdeCwz7kRPYV0=">AAAB63icbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRi8cK9gPaUDbbSbt0dxN2N2IJ/QtePCji1T/kzX9j0uagrQ8GHu/NMDMviAU31nW/ndLa+sbmVnm7srO7t39QPTxqmyjRDFssEpHuBtSg4ApblluB3VgjlYHATjC5zf3OI2rDI/VgpzH6ko4UDzmjNpeeBqoyqNbcujsHWSVeQWpQoDmofvWHEUskKssENabnubH1U6otZwJnlX5iMKZsQkfYy6iiEo2fzm+dkbNMGZIw0lkpS+bq74mUSmOmMsg6JbVjs+zl4n9eL7HhtZ9yFScWFVssChNBbETyx8mQa2RWTDNCmebZrYSNqabMZvHkIXjLL6+S9kXdc+ve/WWtcVPEUYYTOIVz8OAKGnAHTWgBgzE8wyu8OdJ5cd6dj0VrySlmjuEPnM8fnXmN8Q==</latexit>

o(x)<latexit sha1_base64="xnOuw06z5oNcN8P/A5N7khe5R+M=">AAAB+nicbVDLSsNAFL3xWesr1aWbwSLUTUlE0GXRjcsK9gFtKJPJpB06mYSZiVpiP8WNC0Xc+iXu/BsnbRbaeuDC4Zx7Z+49fsKZ0o7zba2srq1vbJa2yts7u3v7duWgreJUEtoiMY9l18eKciZoSzPNaTeRFEc+px1/fJ37nXsqFYvFnZ4k1IvwULCQEayNNLArca2fioDK/IHscXpaHthVp+7MgJaJW5AqFGgO7K9+EJM0okITjpXquU6ivQxLzQin03I/VTTBZIyHtGeowBFVXjZbfYpOjBKgMJamhEYz9fdEhiOlJpFvOiOsR2rRy8X/vF6qw0svYyJJNRVk/lGYcqRjlOeAAiYp0XxiCCaSmV0RGWGJiTZp5SG4iycvk/ZZ3XXq7u15tXFVxFGCIziGGrhwAQ24gSa0gMADPMMrvFlP1ov1bn3MW1esYuYQ/sD6/AHQjJOx</latexit><latexit sha1_base64="xnOuw06z5oNcN8P/A5N7khe5R+M=">AAAB+nicbVDLSsNAFL3xWesr1aWbwSLUTUlE0GXRjcsK9gFtKJPJpB06mYSZiVpiP8WNC0Xc+iXu/BsnbRbaeuDC4Zx7Z+49fsKZ0o7zba2srq1vbJa2yts7u3v7duWgreJUEtoiMY9l18eKciZoSzPNaTeRFEc+px1/fJ37nXsqFYvFnZ4k1IvwULCQEayNNLArca2fioDK/IHscXpaHthVp+7MgJaJW5AqFGgO7K9+EJM0okITjpXquU6ivQxLzQin03I/VTTBZIyHtGeowBFVXjZbfYpOjBKgMJamhEYz9fdEhiOlJpFvOiOsR2rRy8X/vF6qw0svYyJJNRVk/lGYcqRjlOeAAiYp0XxiCCaSmV0RGWGJiTZp5SG4iycvk/ZZ3XXq7u15tXFVxFGCIziGGrhwAQ24gSa0gMADPMMrvFlP1ov1bn3MW1esYuYQ/sD6/AHQjJOx</latexit><latexit sha1_base64="xnOuw06z5oNcN8P/A5N7khe5R+M=">AAAB+nicbVDLSsNAFL3xWesr1aWbwSLUTUlE0GXRjcsK9gFtKJPJpB06mYSZiVpiP8WNC0Xc+iXu/BsnbRbaeuDC4Zx7Z+49fsKZ0o7zba2srq1vbJa2yts7u3v7duWgreJUEtoiMY9l18eKciZoSzPNaTeRFEc+px1/fJ37nXsqFYvFnZ4k1IvwULCQEayNNLArca2fioDK/IHscXpaHthVp+7MgJaJW5AqFGgO7K9+EJM0okITjpXquU6ivQxLzQin03I/VTTBZIyHtGeowBFVXjZbfYpOjBKgMJamhEYz9fdEhiOlJpFvOiOsR2rRy8X/vF6qw0svYyJJNRVk/lGYcqRjlOeAAiYp0XxiCCaSmV0RGWGJiTZp5SG4iycvk/ZZ3XXq7u15tXFVxFGCIziGGrhwAQ24gSa0gMADPMMrvFlP1ov1bn3MW1esYuYQ/sD6/AHQjJOx</latexit><latexit 
sha1_base64="xnOuw06z5oNcN8P/A5N7khe5R+M=">AAAB+nicbVDLSsNAFL3xWesr1aWbwSLUTUlE0GXRjcsK9gFtKJPJpB06mYSZiVpiP8WNC0Xc+iXu/BsnbRbaeuDC4Zx7Z+49fsKZ0o7zba2srq1vbJa2yts7u3v7duWgreJUEtoiMY9l18eKciZoSzPNaTeRFEc+px1/fJ37nXsqFYvFnZ4k1IvwULCQEayNNLArca2fioDK/IHscXpaHthVp+7MgJaJW5AqFGgO7K9+EJM0okITjpXquU6ivQxLzQin03I/VTTBZIyHtGeowBFVXjZbfYpOjBKgMJamhEYz9fdEhiOlJpFvOiOsR2rRy8X/vF6qw0svYyJJNRVk/lGYcqRjlOeAAiYp0XxiCCaSmV0RGWGJiTZp5SG4iycvk/ZZ3XXq7u15tXFVxFGCIziGGrhwAQ24gSa0gMADPMMrvFlP1ov1bn3MW1esYuYQ/sD6/AHQjJOx</latexit>

·<latexit sha1_base64="OWKfto14+6ruTIxUg53GwPMFvUc=">AAAB7XicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRi8cK9gPaUDabTbt2kw27E6GU/gcvHhTx6v/x5r9x0+agrQ8GHu/NMDMvSKUw6LrfTmltfWNzq7xd2dnd2z+oHh61jco04y2mpNLdgBouRcJbKFDybqo5jQPJO8H4Nvc7T1wboZIHnKTcj+kwEZFgFK3U7rNQYWVQrbl1dw6ySryC1KBAc1D96oeKZTFPkElqTM9zU/SnVKNgks8q/czwlLIxHfKepQmNufGn82tn5MwqIYmUtpUgmau/J6Y0NmYSB7Yzpjgyy14u/uf1Moyu/alI0gx5whaLokwSVCR/nYRCc4ZyYgllWthbCRtRTRnagPIQvOWXV0n7ou65de/+sta4KeIowwmcwjl4cAUNuIMmtIDBIzzDK7w5ynlx3p2PRWvJKWaO4Q+czx8PM47G</latexit><latexit sha1_base64="OWKfto14+6ruTIxUg53GwPMFvUc=">AAAB7XicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRi8cK9gPaUDabTbt2kw27E6GU/gcvHhTx6v/x5r9x0+agrQ8GHu/NMDMvSKUw6LrfTmltfWNzq7xd2dnd2z+oHh61jco04y2mpNLdgBouRcJbKFDybqo5jQPJO8H4Nvc7T1wboZIHnKTcj+kwEZFgFK3U7rNQYWVQrbl1dw6ySryC1KBAc1D96oeKZTFPkElqTM9zU/SnVKNgks8q/czwlLIxHfKepQmNufGn82tn5MwqIYmUtpUgmau/J6Y0NmYSB7Yzpjgyy14u/uf1Moyu/alI0gx5whaLokwSVCR/nYRCc4ZyYgllWthbCRtRTRnagPIQvOWXV0n7ou65de/+sta4KeIowwmcwjl4cAUNuIMmtIDBIzzDK7w5ynlx3p2PRWvJKWaO4Q+czx8PM47G</latexit><latexit sha1_base64="OWKfto14+6ruTIxUg53GwPMFvUc=">AAAB7XicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRi8cK9gPaUDabTbt2kw27E6GU/gcvHhTx6v/x5r9x0+agrQ8GHu/NMDMvSKUw6LrfTmltfWNzq7xd2dnd2z+oHh61jco04y2mpNLdgBouRcJbKFDybqo5jQPJO8H4Nvc7T1wboZIHnKTcj+kwEZFgFK3U7rNQYWVQrbl1dw6ySryC1KBAc1D96oeKZTFPkElqTM9zU/SnVKNgks8q/czwlLIxHfKepQmNufGn82tn5MwqIYmUtpUgmau/J6Y0NmYSB7Yzpjgyy14u/uf1Moyu/alI0gx5whaLokwSVCR/nYRCc4ZyYgllWthbCRtRTRnagPIQvOWXV0n7ou65de/+sta4KeIowwmcwjl4cAUNuIMmtIDBIzzDK7w5ynlx3p2PRWvJKWaO4Q+czx8PM47G</latexit><latexit 
sha1_base64="OWKfto14+6ruTIxUg53GwPMFvUc=">AAAB7XicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRi8cK9gPaUDabTbt2kw27E6GU/gcvHhTx6v/x5r9x0+agrQ8GHu/NMDMvSKUw6LrfTmltfWNzq7xd2dnd2z+oHh61jco04y2mpNLdgBouRcJbKFDybqo5jQPJO8H4Nvc7T1wboZIHnKTcj+kwEZFgFK3U7rNQYWVQrbl1dw6ySryC1KBAc1D96oeKZTFPkElqTM9zU/SnVKNgks8q/czwlLIxHfKepQmNufGn82tn5MwqIYmUtpUgmau/J6Y0NmYSB7Yzpjgyy14u/uf1Moyu/alI0gx5whaLokwSVCR/nYRCc4ZyYgllWthbCRtRTRnagPIQvOWXV0n7ou65de/+sta4KeIowwmcwjl4cAUNuIMmtIDBIzzDK7w5ynlx3p2PRWvJKWaO4Q+czx8PM47G</latexit>

·<latexit sha1_base64="OWKfto14+6ruTIxUg53GwPMFvUc=">AAAB7XicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRi8cK9gPaUDabTbt2kw27E6GU/gcvHhTx6v/x5r9x0+agrQ8GHu/NMDMvSKUw6LrfTmltfWNzq7xd2dnd2z+oHh61jco04y2mpNLdgBouRcJbKFDybqo5jQPJO8H4Nvc7T1wboZIHnKTcj+kwEZFgFK3U7rNQYWVQrbl1dw6ySryC1KBAc1D96oeKZTFPkElqTM9zU/SnVKNgks8q/czwlLIxHfKepQmNufGn82tn5MwqIYmUtpUgmau/J6Y0NmYSB7Yzpjgyy14u/uf1Moyu/alI0gx5whaLokwSVCR/nYRCc4ZyYgllWthbCRtRTRnagPIQvOWXV0n7ou65de/+sta4KeIowwmcwjl4cAUNuIMmtIDBIzzDK7w5ynlx3p2PRWvJKWaO4Q+czx8PM47G</latexit><latexit sha1_base64="OWKfto14+6ruTIxUg53GwPMFvUc=">AAAB7XicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRi8cK9gPaUDabTbt2kw27E6GU/gcvHhTx6v/x5r9x0+agrQ8GHu/NMDMvSKUw6LrfTmltfWNzq7xd2dnd2z+oHh61jco04y2mpNLdgBouRcJbKFDybqo5jQPJO8H4Nvc7T1wboZIHnKTcj+kwEZFgFK3U7rNQYWVQrbl1dw6ySryC1KBAc1D96oeKZTFPkElqTM9zU/SnVKNgks8q/czwlLIxHfKepQmNufGn82tn5MwqIYmUtpUgmau/J6Y0NmYSB7Yzpjgyy14u/uf1Moyu/alI0gx5whaLokwSVCR/nYRCc4ZyYgllWthbCRtRTRnagPIQvOWXV0n7ou65de/+sta4KeIowwmcwjl4cAUNuIMmtIDBIzzDK7w5ynlx3p2PRWvJKWaO4Q+czx8PM47G</latexit><latexit sha1_base64="OWKfto14+6ruTIxUg53GwPMFvUc=">AAAB7XicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRi8cK9gPaUDabTbt2kw27E6GU/gcvHhTx6v/x5r9x0+agrQ8GHu/NMDMvSKUw6LrfTmltfWNzq7xd2dnd2z+oHh61jco04y2mpNLdgBouRcJbKFDybqo5jQPJO8H4Nvc7T1wboZIHnKTcj+kwEZFgFK3U7rNQYWVQrbl1dw6ySryC1KBAc1D96oeKZTFPkElqTM9zU/SnVKNgks8q/czwlLIxHfKepQmNufGn82tn5MwqIYmUtpUgmau/J6Y0NmYSB7Yzpjgyy14u/uf1Moyu/alI0gx5whaLokwSVCR/nYRCc4ZyYgllWthbCRtRTRnagPIQvOWXV0n7ou65de/+sta4KeIowwmcwjl4cAUNuIMmtIDBIzzDK7w5ynlx3p2PRWvJKWaO4Q+czx8PM47G</latexit><latexit 
sha1_base64="OWKfto14+6ruTIxUg53GwPMFvUc=">AAAB7XicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRi8cK9gPaUDabTbt2kw27E6GU/gcvHhTx6v/x5r9x0+agrQ8GHu/NMDMvSKUw6LrfTmltfWNzq7xd2dnd2z+oHh61jco04y2mpNLdgBouRcJbKFDybqo5jQPJO8H4Nvc7T1wboZIHnKTcj+kwEZFgFK3U7rNQYWVQrbl1dw6ySryC1KBAc1D96oeKZTFPkElqTM9zU/SnVKNgks8q/czwlLIxHfKepQmNufGn82tn5MwqIYmUtpUgmau/J6Y0NmYSB7Yzpjgyy14u/uf1Moyu/alI0gx5whaLokwSVCR/nYRCc4ZyYgllWthbCRtRTRnagPIQvOWXV0n7ou65de/+sta4KeIowwmcwjl4cAUNuIMmtIDBIzzDK7w5ynlx3p2PRWvJKWaO4Q+czx8PM47G</latexit>

·<latexit sha1_base64="OWKfto14+6ruTIxUg53GwPMFvUc=">AAAB7XicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRi8cK9gPaUDabTbt2kw27E6GU/gcvHhTx6v/x5r9x0+agrQ8GHu/NMDMvSKUw6LrfTmltfWNzq7xd2dnd2z+oHh61jco04y2mpNLdgBouRcJbKFDybqo5jQPJO8H4Nvc7T1wboZIHnKTcj+kwEZFgFK3U7rNQYWVQrbl1dw6ySryC1KBAc1D96oeKZTFPkElqTM9zU/SnVKNgks8q/czwlLIxHfKepQmNufGn82tn5MwqIYmUtpUgmau/J6Y0NmYSB7Yzpjgyy14u/uf1Moyu/alI0gx5whaLokwSVCR/nYRCc4ZyYgllWthbCRtRTRnagPIQvOWXV0n7ou65de/+sta4KeIowwmcwjl4cAUNuIMmtIDBIzzDK7w5ynlx3p2PRWvJKWaO4Q+czx8PM47G</latexit><latexit sha1_base64="OWKfto14+6ruTIxUg53GwPMFvUc=">AAAB7XicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRi8cK9gPaUDabTbt2kw27E6GU/gcvHhTx6v/x5r9x0+agrQ8GHu/NMDMvSKUw6LrfTmltfWNzq7xd2dnd2z+oHh61jco04y2mpNLdgBouRcJbKFDybqo5jQPJO8H4Nvc7T1wboZIHnKTcj+kwEZFgFK3U7rNQYWVQrbl1dw6ySryC1KBAc1D96oeKZTFPkElqTM9zU/SnVKNgks8q/czwlLIxHfKepQmNufGn82tn5MwqIYmUtpUgmau/J6Y0NmYSB7Yzpjgyy14u/uf1Moyu/alI0gx5whaLokwSVCR/nYRCc4ZyYgllWthbCRtRTRnagPIQvOWXV0n7ou65de/+sta4KeIowwmcwjl4cAUNuIMmtIDBIzzDK7w5ynlx3p2PRWvJKWaO4Q+czx8PM47G</latexit><latexit sha1_base64="OWKfto14+6ruTIxUg53GwPMFvUc=">AAAB7XicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRi8cK9gPaUDabTbt2kw27E6GU/gcvHhTx6v/x5r9x0+agrQ8GHu/NMDMvSKUw6LrfTmltfWNzq7xd2dnd2z+oHh61jco04y2mpNLdgBouRcJbKFDybqo5jQPJO8H4Nvc7T1wboZIHnKTcj+kwEZFgFK3U7rNQYWVQrbl1dw6ySryC1KBAc1D96oeKZTFPkElqTM9zU/SnVKNgks8q/czwlLIxHfKepQmNufGn82tn5MwqIYmUtpUgmau/J6Y0NmYSB7Yzpjgyy14u/uf1Moyu/alI0gx5whaLokwSVCR/nYRCc4ZyYgllWthbCRtRTRnagPIQvOWXV0n7ou65de/+sta4KeIowwmcwjl4cAUNuIMmtIDBIzzDK7w5ynlx3p2PRWvJKWaO4Q+czx8PM47G</latexit><latexit 
sha1_base64="OWKfto14+6ruTIxUg53GwPMFvUc=">AAAB7XicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRi8cK9gPaUDabTbt2kw27E6GU/gcvHhTx6v/x5r9x0+agrQ8GHu/NMDMvSKUw6LrfTmltfWNzq7xd2dnd2z+oHh61jco04y2mpNLdgBouRcJbKFDybqo5jQPJO8H4Nvc7T1wboZIHnKTcj+kwEZFgFK3U7rNQYWVQrbl1dw6ySryC1KBAc1D96oeKZTFPkElqTM9zU/SnVKNgks8q/czwlLIxHfKepQmNufGn82tn5MwqIYmUtpUgmau/J6Y0NmYSB7Yzpjgyy14u/uf1Moyu/alI0gx5whaLokwSVCR/nYRCc4ZyYgllWthbCRtRTRnagPIQvOWXV0n7ou65de/+sta4KeIowwmcwjl4cAUNuIMmtIDBIzzDK7w5ynlx3p2PRWvJKWaO4Q+czx8PM47G</latexit>

nX

i=1

wixi � b > 0<latexit sha1_base64="8fUIboWcbuhGiSFWZ+dIwjLG9CM=">AAACBnicbZDLSgMxFIYz9VbrbdSlCMEiuLHMiKAbpejGZQV7gXYcMmnahiaZIcmoZZiVG1/FjQtF3PoM7nwbM+0stPVAwsf/n0Ny/iBiVGnH+bYKc/MLi0vF5dLK6tr6hr251VBhLDGp45CFshUgRRgVpK6pZqQVSYJ4wEgzGF5mfvOOSEVDcaNHEfE46gvaoxhpI/n2bkfF3E/omZveJiKF94bTh+w6DM4d3y47FWdccBbcHMogr5pvf3W6IY45ERozpFTbdSLtJUhqihlJS51YkQjhIeqTtkGBOFFeMl4jhftG6cJeKM0RGo7V3xMJ4kqNeGA6OdIDNe1l4n9eO9a9Uy+hIoo1EXjyUC9mUIcwywR2qSRYs5EBhCU1f4V4gCTC2iRXMiG40yvPQuOo4joV9/q4XL3I4yiCHbAHDoALTkAVXIEaqAMMHsEzeAVv1pP1Yr1bH5PWgpXPbIM/ZX3+AEu/mPw=</latexit><latexit sha1_base64="8fUIboWcbuhGiSFWZ+dIwjLG9CM=">AAACBnicbZDLSgMxFIYz9VbrbdSlCMEiuLHMiKAbpejGZQV7gXYcMmnahiaZIcmoZZiVG1/FjQtF3PoM7nwbM+0stPVAwsf/n0Ny/iBiVGnH+bYKc/MLi0vF5dLK6tr6hr251VBhLDGp45CFshUgRRgVpK6pZqQVSYJ4wEgzGF5mfvOOSEVDcaNHEfE46gvaoxhpI/n2bkfF3E/omZveJiKF94bTh+w6DM4d3y47FWdccBbcHMogr5pvf3W6IY45ERozpFTbdSLtJUhqihlJS51YkQjhIeqTtkGBOFFeMl4jhftG6cJeKM0RGo7V3xMJ4kqNeGA6OdIDNe1l4n9eO9a9Uy+hIoo1EXjyUC9mUIcwywR2qSRYs5EBhCU1f4V4gCTC2iRXMiG40yvPQuOo4joV9/q4XL3I4yiCHbAHDoALTkAVXIEaqAMMHsEzeAVv1pP1Yr1bH5PWgpXPbIM/ZX3+AEu/mPw=</latexit><latexit sha1_base64="8fUIboWcbuhGiSFWZ+dIwjLG9CM=">AAACBnicbZDLSgMxFIYz9VbrbdSlCMEiuLHMiKAbpejGZQV7gXYcMmnahiaZIcmoZZiVG1/FjQtF3PoM7nwbM+0stPVAwsf/n0Ny/iBiVGnH+bYKc/MLi0vF5dLK6tr6hr251VBhLDGp45CFshUgRRgVpK6pZqQVSYJ4wEgzGF5mfvOOSEVDcaNHEfE46gvaoxhpI/n2bkfF3E/omZveJiKF94bTh+w6DM4d3y47FWdccBbcHMogr5pvf3W6IY45ERozpFTbdSLtJUhqihlJS51YkQjhIeqTtkGBOFFeMl4jhftG6cJeKM0RGo7V3xMJ4kqNeGA6OdIDNe1l4n9eO9a9Uy+hIoo1EXjyUC9mUIcwywR2qSRYs5EBhCU1f4V4gCTC2iRXMiG40yvPQuOo4joV9/q4XL3I4yiCHbAHDoALTkAVXIEaqAMMHsEzeAVv1pP1Yr1bH5PWgpXPbIM/ZX3+AEu/mPw=</latexit><latexit 
sha1_base64="8fUIboWcbuhGiSFWZ+dIwjLG9CM=">AAACBnicbZDLSgMxFIYz9VbrbdSlCMEiuLHMiKAbpejGZQV7gXYcMmnahiaZIcmoZZiVG1/FjQtF3PoM7nwbM+0stPVAwsf/n0Ny/iBiVGnH+bYKc/MLi0vF5dLK6tr6hr251VBhLDGp45CFshUgRRgVpK6pZqQVSYJ4wEgzGF5mfvOOSEVDcaNHEfE46gvaoxhpI/n2bkfF3E/omZveJiKF94bTh+w6DM4d3y47FWdccBbcHMogr5pvf3W6IY45ERozpFTbdSLtJUhqihlJS51YkQjhIeqTtkGBOFFeMl4jhftG6cJeKM0RGo7V3xMJ4kqNeGA6OdIDNe1l4n9eO9a9Uy+hIoo1EXjyUC9mUIcwywR2qSRYs5EBhCU1f4V4gCTC2iRXMiG40yvPQuOo4joV9/q4XL3I4yiCHbAHDoALTkAVXIEaqAMMHsEzeAVv1pP1Yr1bH5PWgpXPbIM/ZX3+AEu/mPw=</latexit>

�1<latexit sha1_base64="VHuAnIK6HQukuXfWtT+pc/O3tuM=">AAAB6XicbVBNS8NAEJ3Ur1q/qh69LBbBiyWRgh6LXjxWsR/QhrLZTtqlm03Y3Qgl9B948aCIV/+RN/+N2zYHbX0w8Hhvhpl5QSK4Nq777RTW1jc2t4rbpZ3dvf2D8uFRS8epYthksYhVJ6AaBZfYNNwI7CQKaRQIbAfj25nffkKleSwfzSRBP6JDyUPOqLHSw4XXL1fcqjsHWSVeTiqQo9Evf/UGMUsjlIYJqnXXcxPjZ1QZzgROS71UY0LZmA6xa6mkEWo/m186JWdWGZAwVrakIXP190RGI60nUWA7I2pGetmbif953dSE137GZZIalGyxKEwFMTGZvU0GXCEzYmIJZYrbWwkbUUWZseGUbAje8surpHVZ9dyqd1+r1G/yOIpwAqdwDh5cQR3uoAFNYBDCM7zCmzN2Xpx352PRWnDymWP4A+fzB+PijOw=</latexit><latexit sha1_base64="VHuAnIK6HQukuXfWtT+pc/O3tuM=">AAAB6XicbVBNS8NAEJ3Ur1q/qh69LBbBiyWRgh6LXjxWsR/QhrLZTtqlm03Y3Qgl9B948aCIV/+RN/+N2zYHbX0w8Hhvhpl5QSK4Nq777RTW1jc2t4rbpZ3dvf2D8uFRS8epYthksYhVJ6AaBZfYNNwI7CQKaRQIbAfj25nffkKleSwfzSRBP6JDyUPOqLHSw4XXL1fcqjsHWSVeTiqQo9Evf/UGMUsjlIYJqnXXcxPjZ1QZzgROS71UY0LZmA6xa6mkEWo/m186JWdWGZAwVrakIXP190RGI60nUWA7I2pGetmbif953dSE137GZZIalGyxKEwFMTGZvU0GXCEzYmIJZYrbWwkbUUWZseGUbAje8surpHVZ9dyqd1+r1G/yOIpwAqdwDh5cQR3uoAFNYBDCM7zCmzN2Xpx352PRWnDymWP4A+fzB+PijOw=</latexit><latexit sha1_base64="VHuAnIK6HQukuXfWtT+pc/O3tuM=">AAAB6XicbVBNS8NAEJ3Ur1q/qh69LBbBiyWRgh6LXjxWsR/QhrLZTtqlm03Y3Qgl9B948aCIV/+RN/+N2zYHbX0w8Hhvhpl5QSK4Nq777RTW1jc2t4rbpZ3dvf2D8uFRS8epYthksYhVJ6AaBZfYNNwI7CQKaRQIbAfj25nffkKleSwfzSRBP6JDyUPOqLHSw4XXL1fcqjsHWSVeTiqQo9Evf/UGMUsjlIYJqnXXcxPjZ1QZzgROS71UY0LZmA6xa6mkEWo/m186JWdWGZAwVrakIXP190RGI60nUWA7I2pGetmbif953dSE137GZZIalGyxKEwFMTGZvU0GXCEzYmIJZYrbWwkbUUWZseGUbAje8surpHVZ9dyqd1+r1G/yOIpwAqdwDh5cQR3uoAFNYBDCM7zCmzN2Xpx352PRWnDymWP4A+fzB+PijOw=</latexit><latexit 
sha1_base64="VHuAnIK6HQukuXfWtT+pc/O3tuM=">AAAB6XicbVBNS8NAEJ3Ur1q/qh69LBbBiyWRgh6LXjxWsR/QhrLZTtqlm03Y3Qgl9B948aCIV/+RN/+N2zYHbX0w8Hhvhpl5QSK4Nq777RTW1jc2t4rbpZ3dvf2D8uFRS8epYthksYhVJ6AaBZfYNNwI7CQKaRQIbAfj25nffkKleSwfzSRBP6JDyUPOqLHSw4XXL1fcqjsHWSVeTiqQo9Evf/UGMUsjlIYJqnXXcxPjZ1QZzgROS71UY0LZmA6xa6mkEWo/m186JWdWGZAwVrakIXP190RGI60nUWA7I2pGetmbif953dSE137GZZIalGyxKEwFMTGZvU0GXCEzYmIJZYrbWwkbUUWZseGUbAje8surpHVZ9dyqd1+r1G/yOIpwAqdwDh5cQR3uoAFNYBDCM7zCmzN2Xpx352PRWnDymWP4A+fzB+PijOw=</latexit>

w1<latexit sha1_base64="x3+vf06QaiZbytuTgzZc+MW2BGw=">AAAB6nicbVBNS8NAEJ3Ur1q/qh69LBbBU0lEqMeiF48V7Qe0oWy2k3bpZhN2N0oJ/QlePCji1V/kzX/jts1BWx8MPN6bYWZekAiujet+O4W19Y3NreJ2aWd3b/+gfHjU0nGqGDZZLGLVCahGwSU2DTcCO4lCGgUC28H4Zua3H1FpHssHM0nQj+hQ8pAzaqx0/9T3+uWKW3XnIKvEy0kFcjT65a/eIGZphNIwQbXuem5i/Iwqw5nAaamXakwoG9Mhdi2VNELtZ/NTp+TMKgMSxsqWNGSu/p7IaKT1JApsZ0TNSC97M/E/r5ua8MrPuExSg5ItFoWpICYms7/JgCtkRkwsoUxxeythI6ooMzadkg3BW355lbQuqp5b9e4uK/XrPI4inMApnIMHNajDLTSgCQyG8Ayv8OYI58V5dz4WrQUnnzmGP3A+fwAKKo2f</latexit><latexit sha1_base64="x3+vf06QaiZbytuTgzZc+MW2BGw=">AAAB6nicbVBNS8NAEJ3Ur1q/qh69LBbBU0lEqMeiF48V7Qe0oWy2k3bpZhN2N0oJ/QlePCji1V/kzX/jts1BWx8MPN6bYWZekAiujet+O4W19Y3NreJ2aWd3b/+gfHjU0nGqGDZZLGLVCahGwSU2DTcCO4lCGgUC28H4Zua3H1FpHssHM0nQj+hQ8pAzaqx0/9T3+uWKW3XnIKvEy0kFcjT65a/eIGZphNIwQbXuem5i/Iwqw5nAaamXakwoG9Mhdi2VNELtZ/NTp+TMKgMSxsqWNGSu/p7IaKT1JApsZ0TNSC97M/E/r5ua8MrPuExSg5ItFoWpICYms7/JgCtkRkwsoUxxeythI6ooMzadkg3BW355lbQuqp5b9e4uK/XrPI4inMApnIMHNajDLTSgCQyG8Ayv8OYI58V5dz4WrQUnnzmGP3A+fwAKKo2f</latexit><latexit sha1_base64="x3+vf06QaiZbytuTgzZc+MW2BGw=">AAAB6nicbVBNS8NAEJ3Ur1q/qh69LBbBU0lEqMeiF48V7Qe0oWy2k3bpZhN2N0oJ/QlePCji1V/kzX/jts1BWx8MPN6bYWZekAiujet+O4W19Y3NreJ2aWd3b/+gfHjU0nGqGDZZLGLVCahGwSU2DTcCO4lCGgUC28H4Zua3H1FpHssHM0nQj+hQ8pAzaqx0/9T3+uWKW3XnIKvEy0kFcjT65a/eIGZphNIwQbXuem5i/Iwqw5nAaamXakwoG9Mhdi2VNELtZ/NTp+TMKgMSxsqWNGSu/p7IaKT1JApsZ0TNSC97M/E/r5ua8MrPuExSg5ItFoWpICYms7/JgCtkRkwsoUxxeythI6ooMzadkg3BW355lbQuqp5b9e4uK/XrPI4inMApnIMHNajDLTSgCQyG8Ayv8OYI58V5dz4WrQUnnzmGP3A+fwAKKo2f</latexit><latexit 
sha1_base64="x3+vf06QaiZbytuTgzZc+MW2BGw=">AAAB6nicbVBNS8NAEJ3Ur1q/qh69LBbBU0lEqMeiF48V7Qe0oWy2k3bpZhN2N0oJ/QlePCji1V/kzX/jts1BWx8MPN6bYWZekAiujet+O4W19Y3NreJ2aWd3b/+gfHjU0nGqGDZZLGLVCahGwSU2DTcCO4lCGgUC28H4Zua3H1FpHssHM0nQj+hQ8pAzaqx0/9T3+uWKW3XnIKvEy0kFcjT65a/eIGZphNIwQbXuem5i/Iwqw5nAaamXakwoG9Mhdi2VNELtZ/NTp+TMKgMSxsqWNGSu/p7IaKT1JApsZ0TNSC97M/E/r5ua8MrPuExSg5ItFoWpICYms7/JgCtkRkwsoUxxeythI6ooMzadkg3BW355lbQuqp5b9e4uK/XrPI4inMApnIMHNajDLTSgCQyG8Ayv8OYI58V5dz4WrQUnnzmGP3A+fwAKKo2f</latexit>

w2<latexit sha1_base64="EsfmBFLUBuuqmk6k+rvdBrdsh0Q=">AAAB6nicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKoMeiF48V7Qe0oWy2k3bpZhN2N0oJ/QlePCji1V/kzX/jts1BWx8MPN6bYWZekAiujet+O4W19Y3NreJ2aWd3b/+gfHjU0nGqGDZZLGLVCahGwSU2DTcCO4lCGgUC28H4Zua3H1FpHssHM0nQj+hQ8pAzaqx0/9Sv9csVt+rOQVaJl5MK5Gj0y1+9QczSCKVhgmrd9dzE+BlVhjOB01Iv1ZhQNqZD7FoqaYTaz+anTsmZVQYkjJUtachc/T2R0UjrSRTYzoiakV72ZuJ/Xjc14ZWfcZmkBiVbLApTQUxMZn+TAVfIjJhYQpni9lbCRlRRZmw6JRuCt/zyKmnVqp5b9e4uKvXrPI4inMApnIMHl1CHW2hAExgM4Rle4c0Rzovz7nwsWgtOPnMMf+B8/gALro2g</latexit><latexit sha1_base64="EsfmBFLUBuuqmk6k+rvdBrdsh0Q=">AAAB6nicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKoMeiF48V7Qe0oWy2k3bpZhN2N0oJ/QlePCji1V/kzX/jts1BWx8MPN6bYWZekAiujet+O4W19Y3NreJ2aWd3b/+gfHjU0nGqGDZZLGLVCahGwSU2DTcCO4lCGgUC28H4Zua3H1FpHssHM0nQj+hQ8pAzaqx0/9Sv9csVt+rOQVaJl5MK5Gj0y1+9QczSCKVhgmrd9dzE+BlVhjOB01Iv1ZhQNqZD7FoqaYTaz+anTsmZVQYkjJUtachc/T2R0UjrSRTYzoiakV72ZuJ/Xjc14ZWfcZmkBiVbLApTQUxMZn+TAVfIjJhYQpni9lbCRlRRZmw6JRuCt/zyKmnVqp5b9e4uKvXrPI4inMApnIMHl1CHW2hAExgM4Rle4c0Rzovz7nwsWgtOPnMMf+B8/gALro2g</latexit><latexit sha1_base64="EsfmBFLUBuuqmk6k+rvdBrdsh0Q=">AAAB6nicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKoMeiF48V7Qe0oWy2k3bpZhN2N0oJ/QlePCji1V/kzX/jts1BWx8MPN6bYWZekAiujet+O4W19Y3NreJ2aWd3b/+gfHjU0nGqGDZZLGLVCahGwSU2DTcCO4lCGgUC28H4Zua3H1FpHssHM0nQj+hQ8pAzaqx0/9Sv9csVt+rOQVaJl5MK5Gj0y1+9QczSCKVhgmrd9dzE+BlVhjOB01Iv1ZhQNqZD7FoqaYTaz+anTsmZVQYkjJUtachc/T2R0UjrSRTYzoiakV72ZuJ/Xjc14ZWfcZmkBiVbLApTQUxMZn+TAVfIjJhYQpni9lbCRlRRZmw6JRuCt/zyKmnVqp5b9e4uKvXrPI4inMApnIMHl1CHW2hAExgM4Rle4c0Rzovz7nwsWgtOPnMMf+B8/gALro2g</latexit><latexit 
sha1_base64="EsfmBFLUBuuqmk6k+rvdBrdsh0Q=">AAAB6nicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKoMeiF48V7Qe0oWy2k3bpZhN2N0oJ/QlePCji1V/kzX/jts1BWx8MPN6bYWZekAiujet+O4W19Y3NreJ2aWd3b/+gfHjU0nGqGDZZLGLVCahGwSU2DTcCO4lCGgUC28H4Zua3H1FpHssHM0nQj+hQ8pAzaqx0/9Sv9csVt+rOQVaJl5MK5Gj0y1+9QczSCKVhgmrd9dzE+BlVhjOB01Iv1ZhQNqZD7FoqaYTaz+anTsmZVQYkjJUtachc/T2R0UjrSRTYzoiakV72ZuJ/Xjc14ZWfcZmkBiVbLApTQUxMZn+TAVfIjJhYQpni9lbCRlRRZmw6JRuCt/zyKmnVqp5b9e4uKvXrPI4inMApnIMHl1CHW2hAExgM4Rle4c0Rzovz7nwsWgtOPnMMf+B8/gALro2g</latexit>

wn<latexit sha1_base64="X03Pa3S0dA/LBqsepHC/FWWwqRY=">AAAB6nicbVBNS8NAEJ3Ur1q/qh69LBbBU0lEqMeiF48V7Qe0oWy2m3bpZhN2J0oJ/QlePCji1V/kzX/jts1BWx8MPN6bYWZekEhh0HW/ncLa+sbmVnG7tLO7t39QPjxqmTjVjDdZLGPdCajhUijeRIGSdxLNaRRI3g7GNzO//ci1EbF6wEnC/YgOlQgFo2il+6e+6pcrbtWdg6wSLycVyNHol796g5ilEVfIJDWm67kJ+hnVKJjk01IvNTyhbEyHvGupohE3fjY/dUrOrDIgYaxtKSRz9fdERiNjJlFgOyOKI7PszcT/vG6K4ZWfCZWkyBVbLApTSTAms7/JQGjOUE4soUwLeythI6opQ5tOyYbgLb+8SloXVc+teneXlfp1HkcRTuAUzsGDGtThFhrQBAZDeIZXeHOk8+K8Ox+L1oKTzxzDHzifP2aejdw=</latexit><latexit sha1_base64="X03Pa3S0dA/LBqsepHC/FWWwqRY=">AAAB6nicbVBNS8NAEJ3Ur1q/qh69LBbBU0lEqMeiF48V7Qe0oWy2m3bpZhN2J0oJ/QlePCji1V/kzX/jts1BWx8MPN6bYWZekEhh0HW/ncLa+sbmVnG7tLO7t39QPjxqmTjVjDdZLGPdCajhUijeRIGSdxLNaRRI3g7GNzO//ci1EbF6wEnC/YgOlQgFo2il+6e+6pcrbtWdg6wSLycVyNHol796g5ilEVfIJDWm67kJ+hnVKJjk01IvNTyhbEyHvGupohE3fjY/dUrOrDIgYaxtKSRz9fdERiNjJlFgOyOKI7PszcT/vG6K4ZWfCZWkyBVbLApTSTAms7/JQGjOUE4soUwLeythI6opQ5tOyYbgLb+8SloXVc+teneXlfp1HkcRTuAUzsGDGtThFhrQBAZDeIZXeHOk8+K8Ox+L1oKTzxzDHzifP2aejdw=</latexit><latexit sha1_base64="X03Pa3S0dA/LBqsepHC/FWWwqRY=">AAAB6nicbVBNS8NAEJ3Ur1q/qh69LBbBU0lEqMeiF48V7Qe0oWy2m3bpZhN2J0oJ/QlePCji1V/kzX/jts1BWx8MPN6bYWZekEhh0HW/ncLa+sbmVnG7tLO7t39QPjxqmTjVjDdZLGPdCajhUijeRIGSdxLNaRRI3g7GNzO//ci1EbF6wEnC/YgOlQgFo2il+6e+6pcrbtWdg6wSLycVyNHol796g5ilEVfIJDWm67kJ+hnVKJjk01IvNTyhbEyHvGupohE3fjY/dUrOrDIgYaxtKSRz9fdERiNjJlFgOyOKI7PszcT/vG6K4ZWfCZWkyBVbLApTSTAms7/JQGjOUE4soUwLeythI6opQ5tOyYbgLb+8SloXVc+teneXlfp1HkcRTuAUzsGDGtThFhrQBAZDeIZXeHOk8+K8Ox+L1oKTzxzDHzifP2aejdw=</latexit><latexit 
sha1_base64="X03Pa3S0dA/LBqsepHC/FWWwqRY=">AAAB6nicbVBNS8NAEJ3Ur1q/qh69LBbBU0lEqMeiF48V7Qe0oWy2m3bpZhN2J0oJ/QlePCji1V/kzX/jts1BWx8MPN6bYWZekEhh0HW/ncLa+sbmVnG7tLO7t39QPjxqmTjVjDdZLGPdCajhUijeRIGSdxLNaRRI3g7GNzO//ci1EbF6wEnC/YgOlQgFo2il+6e+6pcrbtWdg6wSLycVyNHol796g5ilEVfIJDWm67kJ+hnVKJjk01IvNTyhbEyHvGupohE3fjY/dUrOrDIgYaxtKSRz9fdERiNjJlFgOyOKI7PszcT/vG6K4ZWfCZWkyBVbLApTSTAms7/JQGjOUE4soUwLeythI6opQ5tOyYbgLb+8SloXVc+teneXlfp1HkcRTuAUzsGDGtThFhrQBAZDeIZXeHOk8+K8Ox+L1oKTzxzDHzifP2aejdw=</latexit>

Figure 4: Nilpotent perceptron model

receives both of the inputs i1 and i2. The line has to be horizontal or vertical if the node only receives one of the inputs. The deeper hidden levels are responsible for the logical operations.

From several perspectives, as mentioned in the Introduction, a continuous logical framework can provide a more sophisticated and effective approach to this problem than a Boolean one can.

Among continuous logical systems, the nilpotent logical framework described above is well-suited for the neural architecture when it comes to implementing logical rules. For the sake of simplicity, we henceforth assume that the generator function is f(x) = x, and we design the neural network architecture in the following way. In the first layer, the perceptrons model membership functions as truth values of inequalities, such as

$$\sum_{i=1}^{n} w_i x_i - b > 0; \qquad (20)$$

representing a half space bounded by a hyperplane in the decision space (see Figure 4). Here, the weights $w_i$ and the bias $b$ are to be learned. The truth value of this inequality


Semantic Interpretation of Deep Neural Networks Based on Continuous Logic. József Dombi, Orsolya Csiszár, and Gábor Csiszár.


Figure 5: Basic types of neural networks with two input values using logical operators in the hidden layer used to find different regions of the plane

can be modeled by

$$\left[\sum_{i=1}^{n} w_i x_i - b\right]; \qquad (21)$$

or, to avoid the vanishing gradient problem, the cutting function can be replaced by its differentiable approximation, the so-called squashing function:

$$S\!\left(\sum_{i=1}^{n} w_i x_i - b\right). \qquad (22)$$

The parameters of the squashing function in (22) now have a context-dependent semantic meaning.
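As a concrete illustration, the first-layer unit of equations (20)-(22) can be sketched in a few lines of Python. The parametrization of the squashing function below is one common differentiable approximation of the cutting function, used here as an assumption for illustration; the paper's exact parametrization may differ.

```python
import math

def cut(x):
    """Cutting function [x] = min(max(x, 0), 1)."""
    return min(max(x, 0.0), 1.0)

def squash(x, beta=50.0):
    """One differentiable approximation of the cutting function
    (illustrative parametrization): tends to cut(x) as beta -> infinity."""
    return (math.log1p(math.exp(beta * x))
            - math.log1p(math.exp(beta * (x - 1.0)))) / beta

def membership(x, w, b, activation=cut):
    """Truth value of the soft inequality w . x - b > 0 for input vector x."""
    return activation(sum(wi * xi for wi, xi in zip(w, x)) - b)

# A point deep inside the half space x1 + x2 > 1 gets truth value 1,
# a point far outside gets 0, and boundary points get values in between.
print(membership([2.0, 2.0], [1.0, 1.0], 1.0))    # 1.0
print(membership([-2.0, -2.0], [1.0, 1.0], 1.0))  # 0.0
```

Note that for inputs well inside the unit interval's interior, `squash` is numerically close to `cut` for large `beta`, while remaining differentiable everywhere.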

Since the nilpotent logical operators also represent inequalities and therefore have the same structure (compare with Equation (14), and see also Table 1 and Figure 4), in the hidden layers we can apply them on the clusters created in the first layer (see Figure 3). Here, the weights and biases characterize the type of the logical operator. As an illustration, the perceptron models of the conjunction and of the disjunction can be seen in Figure 6. This means that for a given logical operator, the weights and the bias can be frozen. The squashing function plays the role of the activation function in all of the layers. The backpropagation algorithm needs to be adjusted: the error function is calculated based on all of the weights and biases (frozen and learnable), but the backpropagation leaves the frozen layers out. Moreover, in this nilpotent model, the conjunction, the disjunction and the aggregation differ only in a translation parameter; i.e. the weights are equal for all of them and only the biases are different. This fact makes it possible for the network to learn the type of logical operators just by learning the bias.
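The observation that the operators share their weights and differ only in the bias can be checked directly. The sketch below uses the Łukasiewicz-style operators that the nilpotent framework reduces to for the generator f(x) = x; the sample truth values are illustrative.

```python
def cut(x):
    """Cutting function [x] = min(max(x, 0), 1)."""
    return min(max(x, 0.0), 1.0)

# Both operators are cut weighted sums with weights (1, 1);
# only the bias distinguishes them, so a network can learn the
# operator type by learning a single bias.
def conjunction(x, y):
    return cut(x + y - 1.0)   # weights (1, 1), bias 1

def disjunction(x, y):
    return cut(x + y)         # weights (1, 1), bias 0

def negation(x):
    return 1.0 - x

print(conjunction(0.75, 0.75))  # 0.5
print(disjunction(0.75, 0.75))  # 1.0
print(negation(0.75))           # 0.25
```

As a quick sanity check, these operators satisfy the De Morgan identity: `negation(conjunction(x, y))` equals `disjunction(negation(x), negation(y))`.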

To illustrate the model, two basic examples are given.

Example 1. As an example, let us assume that a network needs to find positive examples which lie inside a triangular region. This means that we should design the network to conjunct three half planes, and to find the parameters of the boundary lines. The output values for a triangular domain


Figure 6: Perceptron model of the conjunction and the disjunction


Figure 7: Perceptron model classifying a circle with radius r

using nilpotent logic and its continuous approximation are illustrated in Figure 8.
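A minimal sketch of Example 1, assuming three illustrative boundary lines (x > 0, y > 0, x + y < 1) and a hypothetical steepness factor that sharpens the half-plane memberships; the n-ary nilpotent conjunction of the three truth values is [t1 + t2 + t3 - 2]:

```python
def cut(x):
    """Cutting function [x] = min(max(x, 0), 1)."""
    return min(max(x, 0.0), 1.0)

def half_plane(x, y, w1, w2, b, steepness=10.0):
    """Truth value of w1*x + w2*y - b > 0; the steepness factor is an
    illustrative assumption that sharpens the soft boundary."""
    return cut(steepness * (w1 * x + w2 * y - b))

def triangle(x, y):
    t1 = half_plane(x, y,  1.0,  0.0,  0.0)  # x > 0
    t2 = half_plane(x, y,  0.0,  1.0,  0.0)  # y > 0
    t3 = half_plane(x, y, -1.0, -1.0, -1.0)  # x + y < 1
    # n-ary nilpotent conjunction: [t1 + t2 + t3 - (n - 1)]
    return cut(t1 + t2 + t3 - 2.0)

print(triangle(0.5, 0.25))   # 1.0, inside the triangle
print(triangle(-1.0, -1.0))  # 0.0, outside
```

In the full model the weights and biases of the three half planes would be learned in the first layer, while the conjunction layer stays frozen.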

Example 2. The flow chart and the model for the logical expression "((x > 0) AND (y > 0)) OR ((x < 0) AND (y < 0))" can be seen in Figure 10. Here, x < 0 is modeled by NOT(x > 0). Note that
$$\left[[x + y - 1] + [(1 - x) + (1 - y) - 1]\right] = \left[[x + y - 1] + [-x - y + 1]\right],$$
therefore the bias for the negated inputs is +1. The weights and biases for the logical operators are listed in Table 3.

Additionally, taking into account the fact that the area inside or outside a circle is described by an inequality containing the squares of the input values, it is also possible to construct a novel type of unit by adding the square of each input into the input layer (see Figure 7). This way, the polygon approximation of the circle can be eliminated. For an illustration, see Figure 9. Note that by modifying the weights, an arbitrary conic section can also be described.
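A sketch of the circle unit, with a hypothetical steepness factor: the single inequality r² − x² − y² > 0 on the squared inputs replaces any polygon approximation.

```python
def cut(x):
    """Cutting function [x] = min(max(x, 0), 1)."""
    return min(max(x, 0.0), 1.0)

def inside_circle(x, y, r, steepness=10.0):
    """Truth value of r^2 - x^2 - y^2 > 0, i.e. membership in the disk
    of radius r.  The squared inputs x*x and y*y play the role of the
    extra input-layer units; the steepness factor is illustrative."""
    return cut(steepness * (r * r - x * x - y * y))

print(inside_circle(0.0, 0.0, 1.0))  # 1.0 at the center
print(inside_circle(2.0, 0.0, 1.0))  # 0.0 well outside
```

Changing the weights of the squared and linear inputs would describe other conic sections in the same way.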

Choosing the right activation function for each layer is crucial and may have a significant impact on metric scores and the training speed of the neural model. In the model introduced in this Section, the smooth approximation of the cutting function is a natural choice for the activation function in the first layer as well as in the hidden layers, where the logical operators work. Although there are a vast number of activation functions (e.g. linear, sigmoid, tanh, or the recently introduced Rectified Linear Unit (ReLU) [21], exponential linear unit (ELU) [2], sigmoid-weighted linear unit (SiLU) [11]) considered in the literature, most of them are introduced based on some desired properties, without any theoretical background. The parameters are usually fitted only on the basis of experimental results. The squashing function stands out from the other candidates by having a theoretical background thanks to the nilpotent logic which lies behind the scenes.

Figure 8: Output values for a triangular domain using nilpotent logic and its continuous approximation for different parameter values

Figure 9: Output values for a circular region using nilpotent logic (a), and its differentiable approximation (b)

To sum up, on the one hand, this structure leads to a drastic reduction in the number of parameters to be learned, and on the other hand, it supports the interpretation, making the debugging process manageable. Given the logical structure, the parameters to be learned are located in the first layer. The choice of the activation functions in the first layer, as well as in the hidden, logical layers, has a sound theoretical background. By designing the network architecture appropriately, arbitrary regions can be described as intersections and unions of polyhedra in the decision space. Moreover, multicriteria decision tools can also be integrated with given weights and thresholds. Note that the weights and biases in the hidden layers define the type of operator to be used. These parameters can also be learned in a more sophisticated model to be examined in future work.

5. Playground Examples

To illustrate our model with some simple examples, we extended the Tensorflow Playground with the squashing function.


wfix = 1

<latexit sha1_base64="AIv7nkdNyq8HCJ208V4S1hOhr9g=">AAAAAHicbVBNSwMxEJ2tX7V+VT16CRbBU9kVQS9C0YvHCvYD2rVk02wbmmSXJGsty/4PLx4U8ep/8ea/MW33oK0PBh7vzTAzL4g508Z1v53Cyura+kZxs7S1vbO7V94/aOooUYQ2SMQj1Q6wppxJ2jDMcNqOFcUi4LQVjG6mfuuRKs0ieW8mMfUFHkgWMoKNlR7GvbSrRBqypyy78nrlilt1Z0DLxMtJBXLUe+Wvbj8iiaDSEI617nhubPwUK8MIp1mpm2gaYzLCA9qxVGJBtZ/Ors7QiVX6KIyULWnQTP09kWKh9UQEtlNgM9SL3lT8z+skJrz0UybjxFBJ5ovChCMToWkEqM8UJYZPLMFEMXsrIkOsMDE2qJINwVt8eZk0z6reedW7O6/UrvM4inAEx3AKHlxADW6hDg0goOAZXuHNGTsvzrvzMW8tOPnMIfyB8/kD4W2SwA==</latexit>

wfix = 1

<latexit sha1_base64="AIv7nkdNyq8HCJ208V4S1hOhr9g=">AAAAAHicbVBNSwMxEJ2tX7V+VT16CRbBU9kVQS9C0YvHCvYD2rVk02wbmmSXJGsty/4PLx4U8ep/8ea/MW33oK0PBh7vzTAzL4g508Z1v53Cyura+kZxs7S1vbO7V94/aOooUYQ2SMQj1Q6wppxJ2jDMcNqOFcUi4LQVjG6mfuuRKs0ieW8mMfUFHkgWMoKNlR7GvbSrRBqypyy78nrlilt1Z0DLxMtJBXLUe+Wvbj8iiaDSEI617nhubPwUK8MIp1mpm2gaYzLCA9qxVGJBtZ/Ors7QiVX6KIyULWnQTP09kWKh9UQEtlNgM9SL3lT8z+skJrz0UybjxFBJ5ovChCMToWkEqM8UJYZPLMFEMXsrIkOsMDE2qJINwVt8eZk0z6reedW7O6/UrvM4inAEx3AKHlxADW6hDg0goOAZXuHNGTsvzrvzMW8tOPnMIfyB8/kD4W2SwA==</latexit>

wfix = 1

<latexit sha1_base64="AIv7nkdNyq8HCJ208V4S1hOhr9g=">AAAAAHicbVBNSwMxEJ2tX7V+VT16CRbBU9kVQS9C0YvHCvYD2rVk02wbmmSXJGsty/4PLx4U8ep/8ea/MW33oK0PBh7vzTAzL4g508Z1v53Cyura+kZxs7S1vbO7V94/aOooUYQ2SMQj1Q6wppxJ2jDMcNqOFcUi4LQVjG6mfuuRKs0ieW8mMfUFHkgWMoKNlR7GvbSrRBqypyy78nrlilt1Z0DLxMtJBXLUe+Wvbj8iiaDSEI617nhubPwUK8MIp1mpm2gaYzLCA9qxVGJBtZ/Ors7QiVX6KIyULWnQTP09kWKh9UQEtlNgM9SL3lT8z+skJrz0UybjxFBJ5ovChCMToWkEqM8UJYZPLMFEMXsrIkOsMDE2qJINwVt8eZk0z6reedW7O6/UrvM4inAEx3AKHlxADW6hDg0goOAZXuHNGTsvzrvzMW8tOPnMIfyB8/kD4W2SwA==</latexit>

wfix = 1

<latexit sha1_base64="AIv7nkdNyq8HCJ208V4S1hOhr9g=">AAAAAHicbVBNSwMxEJ2tX7V+VT16CRbBU9kVQS9C0YvHCvYD2rVk02wbmmSXJGsty/4PLx4U8ep/8ea/MW33oK0PBh7vzTAzL4g508Z1v53Cyura+kZxs7S1vbO7V94/aOooUYQ2SMQj1Q6wppxJ2jDMcNqOFcUi4LQVjG6mfuuRKs0ieW8mMfUFHkgWMoKNlR7GvbSrRBqypyy78nrlilt1Z0DLxMtJBXLUe+Wvbj8iiaDSEI617nhubPwUK8MIp1mpm2gaYzLCA9qxVGJBtZ/Ors7QiVX6KIyULWnQTP09kWKh9UQEtlNgM9SL3lT8z+skJrz0UybjxFBJ5ovChCMToWkEqM8UJYZPLMFEMXsrIkOsMDE2qJINwVt8eZk0z6reedW7O6/UrvM4inAEx3AKHlxADW6hDg0goOAZXuHNGTsvzrvzMW8tOPnMIfyB8/kD4W2SwA==</latexit>

wfix = 1

<latexit sha1_base64="AIv7nkdNyq8HCJ208V4S1hOhr9g=">AAAAAHicbVBNSwMxEJ2tX7V+VT16CRbBU9kVQS9C0YvHCvYD2rVk02wbmmSXJGsty/4PLx4U8ep/8ea/MW33oK0PBh7vzTAzL4g508Z1v53Cyura+kZxs7S1vbO7V94/aOooUYQ2SMQj1Q6wppxJ2jDMcNqOFcUi4LQVjG6mfuuRKs0ieW8mMfUFHkgWMoKNlR7GvbSrRBqypyy78nrlilt1Z0DLxMtJBXLUe+Wvbj8iiaDSEI617nhubPwUK8MIp1mpm2gaYzLCA9qxVGJBtZ/Ors7QiVX6KIyULWnQTP09kWKh9UQEtlNgM9SL3lT8z+skJrz0UybjxFBJ5ovChCMToWkEqM8UJYZPLMFEMXsrIkOsMDE2qJINwVt8eZk0z6reedW7O6/UrvM4inAEx3AKHlxADW6hDg0goOAZXuHNGTsvzrvzMW8tOPnMIfyB8/kD4W2SwA==</latexit>

wfix = 1

<latexit sha1_base64="AIv7nkdNyq8HCJ208V4S1hOhr9g=">AAAAAHicbVBNSwMxEJ2tX7V+VT16CRbBU9kVQS9C0YvHCvYD2rVk02wbmmSXJGsty/4PLx4U8ep/8ea/MW33oK0PBh7vzTAzL4g508Z1v53Cyura+kZxs7S1vbO7V94/aOooUYQ2SMQj1Q6wppxJ2jDMcNqOFcUi4LQVjG6mfuuRKs0ieW8mMfUFHkgWMoKNlR7GvbSrRBqypyy78nrlilt1Z0DLxMtJBXLUe+Wvbj8iiaDSEI617nhubPwUK8MIp1mpm2gaYzLCA9qxVGJBtZ/Ors7QiVX6KIyULWnQTP09kWKh9UQEtlNgM9SL3lT8z+skJrz0UybjxFBJ5ovChCMToWkEqM8UJYZPLMFEMXsrIkOsMDE2qJINwVt8eZk0z6reedW7O6/UrvM4inAEx3AKHlxADW6hDg0goOAZXuHNGTsvzrvzMW8tOPnMIfyB8/kD4W2SwA==</latexit>

output

<latexit sha1_base64="NlHkbqXBk4LlGOgLjPCWQdqIpGc=">AAAAAHicbVDLSgMxFM3UV62vqks3wSK4KjNS0GXRjcsK9gHToWTSTBuax5DcEcrQz3DjQhG3fo07/8a0nYW2HggczrmX3HPiVHALvv/tlTY2t7Z3yruVvf2Dw6Pq8UnH6sxQ1qZaaNOLiWWCK9YGDoL1UsOIjAXrxpO7ud99YsZyrR5hmrJIkpHiCacEnBT2jcx1BmkGs0G15tf9BfA6CQpSQwVag+pXf6hpJpkCKoi1YeCnEOXEAKeCzSr9zLKU0AkZsdBRRSSzUb44eYYvnDLEiTbuKcAL9fdGTqS1Uxm7SUlgbFe9ufifF2aQ3EQ5Vy4SU3T5UZIJDBrP8+MhN4yCmDpCqOHuVkzHxBAKrqWKKyFYjbxOOlf1oFEPHhq15m1RRxmdoXN0iQJ0jZroHrVQG1Gk0TN6RW8eeC/eu/exHC15xc4p+gPv8wckwpHQ</latexit>

Z

<latexit sha1_base64="KHPZnWxvapM+F0b0AqWcNPAxI9E=">AAAAAHicbVBNS8NAEJ3Ur1q/qh69BIvgqSRS0GPRi8cK9gPaUDbbTbt0dxN2J0IJ/QtePCji1T/kzX/jps1BWx8MPN6bYWZemAhu0PO+ndLG5tb2Tnm3srd/cHhUPT7pmDjVlLVpLGLdC4lhgivWRo6C9RLNiAwF64bTu9zvPjFteKwecZawQJKx4hGnBHNpwBUOqzWv7i3grhO/IDUo0BpWvwajmKaSKaSCGNP3vQSDjGjkVLB5ZZAalhA6JWPWt1QRyUyQLW6duxdWGblRrG0pdBfq74mMSGNmMrSdkuDErHq5+J/XTzG6CTKukhSZostFUSpcjN38cXfENaMoZpYQqrm91aUToglFG0/FhuCvvrxOOld1v1H3Hxq15m0RRxnO4BwuwYdraMI9tKANFCbwDK/w5kjnxXl3PpatJaeYOYU/cD5/ACP4jk0=</latexit>

Z

<latexit sha1_base64="KHPZnWxvapM+F0b0AqWcNPAxI9E=">AAAAAHicbVBNS8NAEJ3Ur1q/qh69BIvgqSRS0GPRi8cK9gPaUDbbTbt0dxN2J0IJ/QtePCji1T/kzX/jps1BWx8MPN6bYWZemAhu0PO+ndLG5tb2Tnm3srd/cHhUPT7pmDjVlLVpLGLdC4lhgivWRo6C9RLNiAwF64bTu9zvPjFteKwecZawQJKx4hGnBHNpwBUOqzWv7i3grhO/IDUo0BpWvwajmKaSKaSCGNP3vQSDjGjkVLB5ZZAalhA6JWPWt1QRyUyQLW6duxdWGblRrG0pdBfq74mMSGNmMrSdkuDErHq5+J/XTzG6CTKukhSZostFUSpcjN38cXfENaMoZpYQqrm91aUToglFG0/FhuCvvrxOOld1v1H3Hxq15m0RRxnO4BwuwYdraMI9tKANFCbwDK/w5kjnxXl3PpatJaeYOYU/cD5/ACP4jk0=</latexit>

Z

<latexit sha1_base64="KHPZnWxvapM+F0b0AqWcNPAxI9E=">AAAAAHicbVBNS8NAEJ3Ur1q/qh69BIvgqSRS0GPRi8cK9gPaUDbbTbt0dxN2J0IJ/QtePCji1T/kzX/jps1BWx8MPN6bYWZemAhu0PO+ndLG5tb2Tnm3srd/cHhUPT7pmDjVlLVpLGLdC4lhgivWRo6C9RLNiAwF64bTu9zvPjFteKwecZawQJKx4hGnBHNpwBUOqzWv7i3grhO/IDUo0BpWvwajmKaSKaSCGNP3vQSDjGjkVLB5ZZAalhA6JWPWt1QRyUyQLW6duxdWGblRrG0pdBfq74mMSGNmMrSdkuDErHq5+J/XTzG6CTKukhSZostFUSpcjN38cXfENaMoZpYQqrm91aUToglFG0/FhuCvvrxOOld1v1H3Hxq15m0RRxnO4BwuwYdraMI9tKANFCbwDK/w5kjnxXl3PpatJaeYOYU/cD5/ACP4jk0=</latexit>

Z

<latexit sha1_base64="KHPZnWxvapM+F0b0AqWcNPAxI9E=">AAAAAHicbVBNS8NAEJ3Ur1q/qh69BIvgqSRS0GPRi8cK9gPaUDbbTbt0dxN2J0IJ/QtePCji1T/kzX/jps1BWx8MPN6bYWZemAhu0PO+ndLG5tb2Tnm3srd/cHhUPT7pmDjVlLVpLGLdC4lhgivWRo6C9RLNiAwF64bTu9zvPjFteKwecZawQJKx4hGnBHNpwBUOqzWv7i3grhO/IDUo0BpWvwajmKaSKaSCGNP3vQSDjGjkVLB5ZZAalhA6JWPWt1QRyUyQLW6duxdWGblRrG0pdBfq74mMSGNmMrSdkuDErHq5+J/XTzG6CTKukhSZostFUSpcjN38cXfENaMoZpYQqrm91aUToglFG0/FhuCvvrxOOld1v1H3Hxq15m0RRxnO4BwuwYdraMI9tKANFCbwDK/w5kjnxXl3PpatJaeYOYU/cD5/ACP4jk0=</latexit>

Z

<latexit sha1_base64="KHPZnWxvapM+F0b0AqWcNPAxI9E=">AAAAAHicbVBNS8NAEJ3Ur1q/qh69BIvgqSRS0GPRi8cK9gPaUDbbTbt0dxN2J0IJ/QtePCji1T/kzX/jps1BWx8MPN6bYWZemAhu0PO+ndLG5tb2Tnm3srd/cHhUPT7pmDjVlLVpLGLdC4lhgivWRo6C9RLNiAwF64bTu9zvPjFteKwecZawQJKx4hGnBHNpwBUOqzWv7i3grhO/IDUo0BpWvwajmKaSKaSCGNP3vQSDjGjkVLB5ZZAalhA6JWPWt1QRyUyQLW6duxdWGblRrG0pdBfq74mMSGNmMrSdkuDErHq5+J/XTzG6CTKukhSZostFUSpcjN38cXfENaMoZpYQqrm91aUToglFG0/FhuCvvrxOOld1v1H3Hxq15m0RRxnO4BwuwYdraMI9tKANFCbwDK/w5kjnxXl3PpatJaeYOYU/cD5/ACP4jk0=</latexit>

Z

<latexit sha1_base64="KHPZnWxvapM+F0b0AqWcNPAxI9E=">AAAAAHicbVBNS8NAEJ3Ur1q/qh69BIvgqSRS0GPRi8cK9gPaUDbbTbt0dxN2J0IJ/QtePCji1T/kzX/jps1BWx8MPN6bYWZemAhu0PO+ndLG5tb2Tnm3srd/cHhUPT7pmDjVlLVpLGLdC4lhgivWRo6C9RLNiAwF64bTu9zvPjFteKwecZawQJKx4hGnBHNpwBUOqzWv7i3grhO/IDUo0BpWvwajmKaSKaSCGNP3vQSDjGjkVLB5ZZAalhA6JWPWt1QRyUyQLW6duxdWGblRrG0pdBfq74mMSGNmMrSdkuDErHq5+J/XTzG6CTKukhSZostFUSpcjN38cXfENaMoZpYQqrm91aUToglFG0/FhuCvvrxOOld1v1H3Hxq15m0RRxnO4BwuwYdraMI9tKANFCbwDK/w5kjnxXl3PpatJaeYOYU/cD5/ACP4jk0=</latexit>

Z

<latexit sha1_base64="KHPZnWxvapM+F0b0AqWcNPAxI9E=">AAAAAHicbVBNS8NAEJ3Ur1q/qh69BIvgqSRS0GPRi8cK9gPaUDbbTbt0dxN2J0IJ/QtePCji1T/kzX/jps1BWx8MPN6bYWZemAhu0PO+ndLG5tb2Tnm3srd/cHhUPT7pmDjVlLVpLGLdC4lhgivWRo6C9RLNiAwF64bTu9zvPjFteKwecZawQJKx4hGnBHNpwBUOqzWv7i3grhO/IDUo0BpWvwajmKaSKaSCGNP3vQSDjGjkVLB5ZZAalhA6JWPWt1QRyUyQLW6duxdWGblRrG0pdBfq74mMSGNmMrSdkuDErHq5+J/XTzG6CTKukhSZostFUSpcjN38cXfENaMoZpYQqrm91aUToglFG0/FhuCvvrxOOld1v1H3Hxq15m0RRxnO4BwuwYdraMI9tKANFCbwDK/w5kjnxXl3PpatJaeYOYU/cD5/ACP4jk0=</latexit>

b = 0

<latexit sha1_base64="DmmJgeZ2G1lW5ypx8ACjsua3BxQ=">AAAAAHicbVBNS8NAEJ3Ur1q/qh69LBbBU0mkoBeh6MVjRfsBbSib7aRdutmE3Y1QQn+CFw+KePUXefPfuG1z0NYHA4/3ZpiZFySCa+O6305hbX1jc6u4XdrZ3ds/KB8etXScKoZNFotYdQKqUXCJTcONwE6ikEaBwHYwvp357SdUmsfy0UwS9CM6lDzkjBorPQTXbr9ccavuHGSVeDmpQI5Gv/zVG8QsjVAaJqjWXc9NjJ9RZTgTOC31Uo0JZWM6xK6lkkao/Wx+6pScWWVAwljZkobM1d8TGY20nkSB7YyoGellbyb+53VTE175GZdJalCyxaIwFcTEZPY3GXCFzIiJJZQpbm8lbEQVZcamU7IheMsvr5LWRdWrVb37WqV+k8dRhBM4hXPw4BLqcAcNaAKDITzDK7w5wnlx3p2PRWvByWeO4Q+czx+2P41r</latexit>

b = 0

<latexit sha1_base64="DmmJgeZ2G1lW5ypx8ACjsua3BxQ=">AAAAAHicbVBNS8NAEJ3Ur1q/qh69LBbBU0mkoBeh6MVjRfsBbSib7aRdutmE3Y1QQn+CFw+KePUXefPfuG1z0NYHA4/3ZpiZFySCa+O6305hbX1jc6u4XdrZ3ds/KB8etXScKoZNFotYdQKqUXCJTcONwE6ikEaBwHYwvp357SdUmsfy0UwS9CM6lDzkjBorPQTXbr9ccavuHGSVeDmpQI5Gv/zVG8QsjVAaJqjWXc9NjJ9RZTgTOC31Uo0JZWM6xK6lkkao/Wx+6pScWWVAwljZkobM1d8TGY20nkSB7YyoGellbyb+53VTE175GZdJalCyxaIwFcTEZPY3GXCFzIiJJZQpbm8lbEQVZcamU7IheMsvr5LWRdWrVb37WqV+k8dRhBM4hXPw4BLqcAcNaAKDITzDK7w5wnlx3p2PRWvByWeO4Q+czx+2P41r</latexit>

b = 1

<latexit sha1_base64="FAR+HdrPqof8WeTi4vkrTet5vTY=">AAAAAHicbVBNS8NAEJ3Ur1q/qh69LBbBU0mkoBeh6MVjRfsBbSib7aZdutmE3YlQQn+CFw+KePUXefPfuG1z0NYHA4/3ZpiZFyRSGHTdb6ewtr6xuVXcLu3s7u0flA+PWiZONeNNFstYdwJquBSKN1Gg5J1EcxoFkreD8e3Mbz9xbUSsHnGScD+iQyVCwSha6SG49vrlilt15yCrxMtJBXI0+uWv3iBmacQVMkmN6Xpugn5GNQom+bTUSw1PKBvTIe9aqmjEjZ/NT52SM6sMSBhrWwrJXP09kdHImEkU2M6I4sgsezPxP6+bYnjlZ0IlKXLFFovCVBKMyexvMhCaM5QTSyjTwt5K2IhqytCmU7IheMsvr5LWRdWrVb37WqV+k8dRhBM4hXPw4BLqcAcNaAKDITzDK7w50nlx3p2PRWvByWeO4Q+czx+3w41s</latexit>

b = 1

<latexit sha1_base64="FAR+HdrPqof8WeTi4vkrTet5vTY=">AAAAAHicbVBNS8NAEJ3Ur1q/qh69LBbBU0mkoBeh6MVjRfsBbSib7aZdutmE3YlQQn+CFw+KePUXefPfuG1z0NYHA4/3ZpiZFyRSGHTdb6ewtr6xuVXcLu3s7u0flA+PWiZONeNNFstYdwJquBSKN1Gg5J1EcxoFkreD8e3Mbz9xbUSsHnGScD+iQyVCwSha6SG49vrlilt15yCrxMtJBXI0+uWv3iBmacQVMkmN6Xpugn5GNQom+bTUSw1PKBvTIe9aqmjEjZ/NT52SM6sMSBhrWwrJXP09kdHImEkU2M6I4sgsezPxP6+bYnjlZ0IlKXLFFovCVBKMyexvMhCaM5QTSyjTwt5K2IhqytCmU7IheMsvr5LWRdWrVb37WqV+k8dRhBM4hXPw4BLqcAcNaAKDITzDK7w50nlx3p2PRWvByWeO4Q+czx+3w41s</latexit>

b = 1

<latexit sha1_base64="FAR+HdrPqof8WeTi4vkrTet5vTY=">AAAAAHicbVBNS8NAEJ3Ur1q/qh69LBbBU0mkoBeh6MVjRfsBbSib7aZdutmE3YlQQn+CFw+KePUXefPfuG1z0NYHA4/3ZpiZFyRSGHTdb6ewtr6xuVXcLu3s7u0flA+PWiZONeNNFstYdwJquBSKN1Gg5J1EcxoFkreD8e3Mbz9xbUSsHnGScD+iQyVCwSha6SG49vrlilt15yCrxMtJBXI0+uWv3iBmacQVMkmN6Xpugn5GNQom+bTUSw1PKBvTIe9aqmjEjZ/NT52SM6sMSBhrWwrJXP09kdHImEkU2M6I4sgsezPxP6+bYnjlZ0IlKXLFFovCVBKMyexvMhCaM5QTSyjTwt5K2IhqytCmU7IheMsvr5LWRdWrVb37WqV+k8dRhBM4hXPw4BLqcAcNaAKDITzDK7w50nlx3p2PRWvByWeO4Q+czx+3w41s</latexit>

b = �1

<latexit sha1_base64="PcPzapyK1isfImK2WkEpTyoOaC8=">AAAAAHicbVBNS8NAEJ3Ur1q/qh69LBbBiyWRgl6EohePFUxbaEPZbDft0t1N2N0IJfQvePGgiFf/kDf/jZs2B219MPB4b4aZeWHCmTau++2U1tY3NrfK25Wd3b39g+rhUVvHqSLUJzGPVTfEmnImqW+Y4bSbKIpFyGknnNzlfueJKs1i+WimCQ0EHkkWMYJNLoU3F96gWnPr7hxolXgFqUGB1qD61R/GJBVUGsKx1j3PTUyQYWUY4XRW6aeaJphM8Ij2LJVYUB1k81tn6MwqQxTFypY0aK7+nsiw0HoqQtspsBnrZS8X//N6qYmug4zJJDVUksWiKOXIxCh/HA2ZosTwqSWYKGZvRWSMFSbGxlOxIXjLL6+S9mXda9S9h0ateVvEUYYTOIVz8OAKmnAPLfCBwBie4RXeHOG8OO/Ox6K15BQzx/AHzucPIbqNow==</latexit>

AND

<latexit sha1_base64="pga+w7dOh3Y2jg9rfmfwFgePrUs=">AAAAAHicbVDLSsNAFJ34rPVVdSVugkVwVRIp6LI+Fq6kgn1AU8pketsOnUzCzI1YQnDjr7hxoYhbv8Kdf+O0zUJbD1w4nHPv3LnHjwTX6Djf1sLi0vLKam4tv76xubVd2Nmt6zBWDGosFKFq+lSD4BJqyFFAM1JAA19Awx9ejv3GPSjNQ3mHowjaAe1L3uOMopE6hX1v8kbSVwAyTTyEB0zOb67StFMoOiVnAnueuBkpkgzVTuHL64YsDkAiE1TrlutE2E6oQs4EpHkv1hBRNqR9aBkqaQC6nUzWp/aRUbp2L1SmJNoT9fdEQgOtR4FvOgOKAz3rjcX/vFaMvbN2wmUUI0g2XdSLhY2hPc7D7nIFDMXIEMoUN3+12YAqytCkljchuLMnz5P6Scktl9zbcrFykcWRIwfkkBwTl5ySCrkmVVIjjDySZ/JK3qwn68V6tz6mrQtWNrNH/sD6/AFm7JgR</latexit>

AND

<latexit sha1_base64="pga+w7dOh3Y2jg9rfmfwFgePrUs=">AAAAAHicbVDLSsNAFJ34rPVVdSVugkVwVRIp6LI+Fq6kgn1AU8pketsOnUzCzI1YQnDjr7hxoYhbv8Kdf+O0zUJbD1w4nHPv3LnHjwTX6Djf1sLi0vLKam4tv76xubVd2Nmt6zBWDGosFKFq+lSD4BJqyFFAM1JAA19Awx9ejv3GPSjNQ3mHowjaAe1L3uOMopE6hX1v8kbSVwAyTTyEB0zOb67StFMoOiVnAnueuBkpkgzVTuHL64YsDkAiE1TrlutE2E6oQs4EpHkv1hBRNqR9aBkqaQC6nUzWp/aRUbp2L1SmJNoT9fdEQgOtR4FvOgOKAz3rjcX/vFaMvbN2wmUUI0g2XdSLhY2hPc7D7nIFDMXIEMoUN3+12YAqytCkljchuLMnz5P6Scktl9zbcrFykcWRIwfkkBwTl5ySCrkmVVIjjDySZ/JK3qwn68V6tz6mrQtWNrNH/sD6/AFm7JgR</latexit>

OR

<latexit sha1_base64="Qen2Gbu2Ld1G3XeZXq9p6dkXh8A=">AAAAAHicbVDLSsNAFJ34rPUVdeHCTbAIrkoiBV0W3bizin1AE8pketsOnTyYuRFLyMZfceNCEbd+hjv/xmmahbYeuHA45965c48fC67Qtr+NpeWV1bX10kZ5c2t7Z9fc22+pKJEMmiwSkez4VIHgITSRo4BOLIEGvoC2P76a+u0HkIpH4T1OYvACOgz5gDOKWuqZh27+RuqLBLLURXjE9OYuy3pmxa7aOaxF4hSkQgo0euaX249YEkCITFCluo4do5dSiZwJyMpuoiCmbEyH0NU0pAEoL82XZ9aJVvrWIJK6QrRy9fdESgOlJoGvOwOKIzXvTcX/vG6Cgwsv5WGcIIRstmiQCAsja5qG1ecSGIqJJpRJrv9qsRGVlKHOrKxDcOZPXiSts6pTqzq3tUr9soijRI7IMTklDjkndXJNGqRJGMnIM3klb8aT8WK8Gx+z1iWjmDkgf2B8/gAegZdi</latexit>

(¬(x > 0)) ^ (¬(y > 0)) : [[1 � x]+[1 � y]�1]= [1 � x � y]

<latexit sha1_base64="VrZkVtehDVte2dTRCdfbMZuzwOw=">AAAAAHicbVHbattAEF0pbeO6Nzd5CfRF1BQcSoxUDGkKDSGhtI8u1E5AEma1GslLViuxO0ojhD6pP9O3/k1WjmhtpwMLh3POXHYmKgTX6Lp/LHvn0eMnu72n/WfPX7x8NXi9N9d5qRjMWC5ydRVRDYJLmCFHAVeFAppFAi6j64tWv7wBpXkuf2BVQJjRVPKEM4qGWgx+BasatYK4qQMBCY4CCeno9tQ9DBRPl3gY/IQ4hTWt+qc1n7r8VAHIpvab9Xq+d3QbNpuO99uOattx5P1lIkHZdVN/bgu1xsVg6I7dVTgPgdeBIeliuhj8DuKclRlIZIJq7XtugWFNFXImoOkHpYbC9KAp+AZKmoEO61XzxnlnmNhJcmWeRGfFrmfUNNO6yiLjzCgu9bbWkv/T/BKTj2HNZVEiSHbfKCmFg7nT3siJuQKGojKAMsXNrA5bUkUZmkv2zRK87S8/BPMPY28y9r5Phmfn3Tp65A15S0bEI8fkjHwjUzIjzNq3Tqxz68I+sE/tL/bXe6ttdTn7ZCPs6R2E/c34</latexit>

inputs

<latexit sha1_base64="XWadSEmQcwJMVHV+v2Vm5BbqItU=">AAAAAHicbVBNS8NAEN3Ur1q/qh69LBbBU0lE0GPRi8cK9gPaUDbbSbt0s4m7k2IJ/R1ePCji1R/jzX/jts1BWx8MPN6bYWZekEhh0HW/ncLa+sbmVnG7tLO7t39QPjxqmjjVHBo8lrFuB8yAFAoaKFBCO9HAokBCKxjdzvzWGLQRsXrASQJ+xAZKhIIztJLfRXjCTKgkRTPtlStu1Z2DrhIvJxWSo94rf3X7MU8jUMglM6bjuQn6GdMouIRpqZsaSBgfsQF0LFUsAuNn86On9MwqfRrG2pZCOld/T2QsMmYSBbYzYjg0y95M/M/rpBhe+4ufQPHFojCVFGM6S4D2hQaOcmIJ41rYWykfMs042pxKNgRv+eVV0ryoepdV7/6yUrvJ4yiSE3JKzolHrkiN3JE6aRBOHskzeSVvzth5cd6dj0VrwclnjskfOJ8/v+ySvA==</latexit>

layer1

<latexit sha1_base64="DZdlbJPuWoxQfQt888VVSiReuHE=">AAAAAHicbVBNS8NAEN3Ur1q/qh69LBbBU0lE0GPRi8cK9gPaUDbbabt0s4m7k2II/R1ePCji1R/jzX/jts1BWx8MPN6bYWZeEEth0HW/ncLa+sbmVnG7tLO7t39QPjxqmijRHBo8kpFuB8yAFAoaKFBCO9bAwkBCKxjfzvzWBLQRkXrANAY/ZEMlBoIztJLfRXjCTLIUtDftlStu1Z2DrhIvJxWSo94rf3X7EU9CUMglM6bjuTH6GdMouIRpqZsYiBkfsyF0LFUsBONn86On9MwqfTqItC2FdK7+nshYaEwaBrYzZDgyy95M/M/rJDi49jOh4gRB8cWiQSIpRnSWAO0LDRxlagnjWthbKR8xzTjanEo2BG/55VXSvKh6l1Xv/rJSu8njKJITckrOiUeuSI3ckTppEE4eyTN5JW/OxHlx3p2PRWvByWeOyR84nz8+l5Jn</latexit>

layer2

<latexit sha1_base64="SDsESt3SVfbEfz7NW6eXzsOUg0k=">AAAAAHicbVBNS8NAEN3Ur1q/qh69BIvgqSSloMeiF48V7Ae0oWy2k3bpZhN3J8UQ+ju8eFDEqz/Gm//GbZuDtj4YeLw3w8w8PxZco+N8W4WNza3tneJuaW//4PCofHzS1lGiGLRYJCLV9akGwSW0kKOAbqyAhr6Ajj+5nfudKSjNI/mAaQxeSEeSB5xRNJLXR3jCTNAUVG02KFecqrOAvU7cnFRIjuag/NUfRiwJQSITVOue68ToZVQhZwJmpX6iIaZsQkfQM1TSELSXLY6e2RdGGdpBpExJtBfq74mMhlqnoW86Q4pjverNxf+8XoLBtZdxGScIki0XBYmwMbLnCdhDroChSA2hTHFzq83GVFGGJqeSCcFdfXmdtGtVt1517+uVxk0eR5GckXNySVxyRRrkjjRJizDySJ7JK3mzptaL9W59LFsLVj5zSv7A+vwBQBySaA==</latexit>

layer3

<latexit sha1_base64="yTp7/WHBAfdvD6KGakuxnm4DxTI=">AAAAAHicbVDLSgNBEJyNrxhfUY9eBoPgKexqQI9BLx4jmAckS5id9CZDZh/O9AaXJd/hxYMiXv0Yb/6Nk2QPmljQUFR1093lxVJotO1vq7C2vrG5Vdwu7ezu7R+UD49aOkoUhyaPZKQ6HtMgRQhNFCihEytggSeh7Y1vZ357AkqLKHzANAY3YMNQ+IIzNJLbQ3jCTLIU1OW0X67YVXsOukqcnFRIjka//NUbRDwJIEQumdZdx47RzZhCwSVMS71EQ8z4mA2ha2jIAtBuNj96Ss+MMqB+pEyFSOfq74mMBVqngWc6A4YjvezNxP+8boL+tZuJME4QQr5Y5CeSYkRnCdCBUMBRpoYwroS5lfIRU4yjyalkQnCWX14lrYuqU6s697VK/SaPo0hOyCk5Jw65InVyRxqkSTh5JM/klbxZE+vFerc+Fq0FK585Jn9gff4AQaGSaQ==</latexit>

NOT : [1 � x]

<latexit sha1_base64="cyG07V0t125aPvu13uIRLMVtdu0=">AAAAAHicbVDJSgNBEO2JW4zbqEcRBoPgxTAjAcVT0IsnjZANkiH0dCpJk56F7hpJGObkxV/x4kERr36DN//GznLQxAcFj/eququeFwmu0La/jczS8srqWnY9t7G5tb1j7u7VVBhLBlUWilA2PKpA8ACqyFFAI5JAfU9A3Rtcj/36A0jFw6CCowhcn/YC3uWMopba5mFr8kYioZMmLYQhJrd3lTS9bDqnQ7dt5u2CPYG1SJwZyZMZym3zq9UJWexDgExQpZqOHaGbUImcCUhzrVhBRNmA9qCpaUB9UG4yWSG1jrXSsbqh1BWgNVF/TyTUV2rke7rTp9hX895Y/M9rxti9cBMeRDFCwKYfdWNhYWiNM7E6XAJDMdKEMsn1rhbrU0kZ6uRyOgRn/uRFUjsrOMWCc1/Ml65mcWTJATkiJ8Qh56REbkiZVAkjj+SZvJI348l4Md6Nj2lrxpjN7JM/MD5/AMLqmUk=</latexit>

AND : [x + y � 1]

<latexit sha1_base64="AVq1GKDbii/ySVjfBFic6ULf/qc=">AAAAAHicbVBNS8NAEN34WetX1KOXaBEEsSQiKJ7qx8GTKNgqJKFsttO6dLMJuxOxhJy9+Fe8eFDEq7/Am//Gbe1Bqw8GHu/N7M68KBVco+t+WmPjE5NT06WZ8uzc/MKivbTc0EmmGNRZIhJ1HVENgkuoI0cB16kCGkcCrqLucd+/ugWleSIvsZdCGNOO5G3OKBqpaa8FgzfyjgKQRR4g3GF+eHZSFAf+3VZv2wubdsWtugM4f4k3JBUyxHnT/ghaCctikMgE1dr33BTDnCrkTEBRDjINKWVd2gHfUElj0GE+WKNwNozSctqJMiXRGag/J3Iaa92LI9MZU7zRo15f/M/zM2zvhzmXaYYg2fdH7Uw4mDj9XJwWV8BQ9AyhTHGzq8NuqKIMTXplE4I3evJf0tipertV72K3UjsaxlEiq2SdbBKP7JEaOSXnpE4YuSeP5Jm8WA/Wk/VqvX23jlnDmRXyC9b7F5ipms0=</latexit>

OR : [x + y]

<latexit sha1_base64="QjRqsbtzEXRBoe7GSmh6rKFCCNI=">AAAAAHicbVDJSgNBEO1xjXGLehRhMAiCEGYkoHgKevFmFLPAzBB6OpWkSc9Cd40kDHPy4q948aCIV7/Bm39jZzlo4oOCx3tV3VXPjwVXaFnfxsLi0vLKam4tv76xubVd2NmtqyiRDGosEpFs+lSB4CHUkKOAZiyBBr6Aht+/GvmNB5CKR+E9DmPwAtoNeYczilpqFQ7c8RupLxLIUhdhgOnNXZZdOIOTodcqFK2SNYY5T+wpKZIpqq3Cl9uOWBJAiExQpRzbitFLqUTOBGR5N1EQU9anXXA0DWkAykvHK2TmkVbaZieSukI0x+rviZQGSg0DX3cGFHtq1huJ/3lOgp1zL+VhnCCEbPJRJxEmRuYoE7PNJTAUQ00ok1zvarIelZShTi6vQ7BnT54n9dOSXS7Zt+Vi5XIaR47sk0NyTGxyRirkmlRJjTDySJ7JK3kznowX4934mLQuGNOZPfIHxucPWsCZrA==</latexit>

AND

<latexit sha1_base64="pga+w7dOh3Y2jg9rfmfwFgePrUs=">AAAAAHicbVDLSsNAFJ34rPVVdSVugkVwVRIp6LI+Fq6kgn1AU8pketsOnUzCzI1YQnDjr7hxoYhbv8Kdf+O0zUJbD1w4nHPv3LnHjwTX6Djf1sLi0vLKam4tv76xubVd2Nmt6zBWDGosFKFq+lSD4BJqyFFAM1JAA19Awx9ejv3GPSjNQ3mHowjaAe1L3uOMopE6hX1v8kbSVwAyTTyEB0zOb67StFMoOiVnAnueuBkpkgzVTuHL64YsDkAiE1TrlutE2E6oQs4EpHkv1hBRNqR9aBkqaQC6nUzWp/aRUbp2L1SmJNoT9fdEQgOtR4FvOgOKAz3rjcX/vFaMvbN2wmUUI0g2XdSLhY2hPc7D7nIFDMXIEMoUN3+12YAqytCkljchuLMnz5P6Scktl9zbcrFykcWRIwfkkBwTl5ySCrkmVVIjjDySZ/JK3qwn68V6tz6mrQtWNrNH/sD6/AFm7JgR</latexit>

x > 0

<latexit sha1_base64="7dmjQ7IfS6kVmUO530rwxCI2s5k=">AAAAAHicdVDLSsNAFL2pr1pfVZduBovgKiRtoLqRohuXFe0D2lAm00k7dDIJMxOxhH6CGxeKuPWL3Pk3Th+Kih64cDjnXu69J0g4U9px3q3c0vLK6lp+vbCxubW9U9zda6o4lYQ2SMxj2Q6wopwJ2tBMc9pOJMVRwGkrGF1M/dYtlYrF4kaPE+pHeCBYyAjWRrq+O3N6xZJjV8petVxBc+J+kVPk2s4MJVig3iu+dfsxSSMqNOFYqY7rJNrPsNSMcDopdFNFE0xGeEA7hgocUeVns1Mn6MgofRTG0pTQaKZ+n8hwpNQ4CkxnhPVQ/fam4l9eJ9XhiZ8xkaSaCjJfFKYc6RhN/0Z9JinRfGwIJpKZWxEZYomJNukUTAifn6L/SbNsu57tXnml2vkijjwcwCEcgwtVqMEl1KEBBAZwD4/wZHHrwXq2XuatOWsxsw8/YL1+ADoRjcU=</latexit>

y > 0

<latexit sha1_base64="8jjThkPRskX8RBq0HvGta4YXnxY=">AAAAAHicdVDLSsNAFL2pr1ofrbp0M1gEVyFpC9WNFN24rGgf0IYymU7aoZNJmJkIIfQT3LhQxK1f5M6/cfpQVPTAhcM593LvPX7MmdKO827lVlbX1jfym4Wt7Z3dYmlvv62iRBLaIhGPZNfHinImaEszzWk3lhSHPqcdf3I58zt3VCoWiVudxtQL8UiwgBGsjXSTnjuDUtmxq5VavVJFC+J+kTPk2s4cZViiOSi99YcRSUIqNOFYqZ7rxNrLsNSMcDot9BNFY0wmeER7hgocUuVl81On6NgoQxRE0pTQaK5+n8hwqFQa+qYzxHqsfnsz8S+vl+jg1MuYiBNNBVksChKOdIRmf6Mhk5RonhqCiWTmVkTGWGKiTToFE8Lnp+h/0q7Ybs12r2vlxsUyjjwcwhGcgAt1aMAVNKEFBEZwD4/wZHHrwXq2XhatOWs5cwA/YL1+ADuXjcY=</latexit>

b = 0

<latexit sha1_base64="DmmJgeZ2G1lW5ypx8ACjsua3BxQ=">AAAAAHicbVBNS8NAEJ3Ur1q/qh69LBbBU0mkoBeh6MVjRfsBbSib7aRdutmE3Y1QQn+CFw+KePUXefPfuG1z0NYHA4/3ZpiZFySCa+O6305hbX1jc6u4XdrZ3ds/KB8etXScKoZNFotYdQKqUXCJTcONwE6ikEaBwHYwvp357SdUmsfy0UwS9CM6lDzkjBorPQTXbr9ccavuHGSVeDmpQI5Gv/zVG8QsjVAaJqjWXc9NjJ9RZTgTOC31Uo0JZWM6xK6lkkao/Wx+6pScWWVAwljZkobM1d8TGY20nkSB7YyoGellbyb+53VTE175GZdJalCyxaIwFcTEZPY3GXCFzIiJJZQpbm8lbEQVZcamU7IheMsvr5LWRdWrVb37WqV+k8dRhBM4hXPw4BLqcAcNaAKDITzDK7w5wnlx3p2PRWvByWeO4Q+czx+2P41r</latexit>

Figure 10: Nilpotent neural structure representing the expression(x > 0) AND (y > 0)) OR ((x < 0) AND (y < 0)

function (β = 50, λ = 1, a = 0.5) as activation function and modified the backpropagation algorithm according to the frozen weights in the hidden layers.
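The exact parametric form of the squashing function is defined earlier in the paper and is not reproduced in this excerpt. As an illustrative stand-in, the sketch below uses a softplus-based smooth approximation of the cutting function [x] = min(1, max(0, x)) that converges to it as β grows; the function names and this particular parametrization are the editor's assumptions, not the paper's definition:

```python
import math

def cut(x):
    # Cutting function [x]: clamps a truth value to the unit interval.
    return min(1.0, max(0.0, x))

def squash(x, beta=50.0):
    # Differentiable surrogate: (softplus(beta*x) - softplus(beta*(x-1))) / beta
    # approaches cut(x) as beta -> infinity.
    def softplus(t):
        # Numerically stable log(1 + exp(t)).
        return max(t, 0.0) + math.log1p(math.exp(-abs(t)))
    return (softplus(beta * x) - softplus(beta * (x - 1.0))) / beta

# With beta = 50 the surrogate is already close to the hard cut.
for v in (-0.5, 0.25, 0.5, 1.5):
    assert abs(squash(v) - cut(v)) < 1e-2
```

Unlike the cutting function, this surrogate has nonzero gradient everywhere, which is what makes backpropagation through the frozen logical layers possible.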

5.1. XOR

Let us first consider an example on a particular dataset based on Example 2. An image of a generated set of data is shown in Figure 11. Orange data points have a value of −1 and blue points have a value of +1. Here, the target variable is positive when x and y are both positive or both negative. In a logical network:

• If (x1 > 0) AND (x2 > 0) THEN predict +1

• If (x1 < 0) AND (x2 < 0) THEN predict +1

• Else predict −1

An efficient neural network can be built to make predictions for this logical expression even without using the cross feature x ∗ y. For the structure and for the frozen weights and biases, see Table 3.


Semantic Interpretation of Deep Neural Networks Based on Continuous Logic, József Dombi, Orsolya Csiszár, and Gábor Csiszár

Table 3: Weights and biases for modeling the XOR logical gate

x > 0                   w1 = 1     w2 = 0     b = 0     [x]
y > 0                   w1 = 0     w2 = 1     b = 0     [y]
x AND y                 w1 = 1     w2 = 1     b = −1    [x + y − 1]
x OR y                  w1 = 1     w2 = 1     b = 0     [x + y]
NOT(x)                  w1 = −1    w2 = 0     b = 1     [1 − x]
NOT(y)                  w1 = 0     w2 = −1    b = 1     [1 − y]
(NOT(x)) AND (NOT(y))   w1 = −1    w2 = −1    b = 1     [−x − y + 1]
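The frozen weights of Table 3 can be exercised directly in a few lines. The sketch below hard-codes crisp first-layer truth values of x > 0 and y > 0 (in the trained network these come from learned membership functions with squashing activations), then wires the AND, NOT-AND, and OR perceptrons exactly as tabulated:

```python
def cut(z):
    # Cutting function [z]: clamps a truth value to the unit interval.
    return min(1.0, max(0.0, z))

def neuron(w1, w2, b, a, c):
    # One perceptron with frozen weights models a soft logical operator.
    return cut(w1 * a + w2 * c + b)

def xor_block(x, y):
    # Crisp stand-ins for the first-layer truth values of x > 0 and y > 0.
    tx = 1.0 if x > 0 else 0.0
    ty = 1.0 if y > 0 else 0.0
    both_pos = neuron(1, 1, -1, tx, ty)   # AND: [tx + ty - 1]
    both_neg = neuron(-1, -1, 1, tx, ty)  # (NOT tx) AND (NOT ty): [1 - tx - ty]
    return neuron(1, 1, 0, both_pos, both_neg)  # OR: [both_pos + both_neg]

# Predict +1 when the output is 1, and -1 otherwise.
assert xor_block(0.7, 0.4) == 1.0 and xor_block(-0.3, -0.9) == 1.0
assert xor_block(0.7, -0.4) == 0.0
```

Note that no cross feature x ∗ y is needed: the two AND nodes and the final OR node realize the piecewise decision regions on their own.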

According to our model, the smooth approximation of the cutting function, called the squashing function, is a natural choice for the activation function in the first layer as well as in the hidden layers, where the logical operators are used. If we design this logical structure before training, an interpretation of the network naturally emerges.

Notice how the neurons in the hidden layer reveal the logical structure of the network (Figure 11), assisting the interpretability of the neural model.

5.2. Preference

Another image of a generated set of data is shown in Figure 12. Orange data points have a value of −1 and blue points have a value of +1. Here, the network has to learn the parameters of the straight lines separating the different regions. The target variable is positive when both x > y and y > −x hold or when both x < y and y < −x hold. In a logical network:

• If (x > y) AND (y > −x) THEN predict +1

• If (x < y) AND (y < −x) THEN predict +1

• Else predict −1

The network structure is illustrated in Figure 12. Here,

Table 4: Weights and biases for modeling the preference operator

x > y     w1 = 0.5     w2 = −0.5    b = 0.5    [(x − y + 1)/2]
y > −x    w1 = 0.5     w2 = 0.5     b = 0.5    [(x + y + 1)/2]
x < y     w1 = −0.5    w2 = 0.5     b = 0.5    [(−x + y + 1)/2]
y < −x    w1 = −0.5    w2 = −0.5    b = 0.5    [(−x − y + 1)/2]

the expression x > y is modeled by the preference operator p(x, y) (see Tables 1 and 4). Notice how the neurons in the hidden layer reveal the logical structure of the network (see Figure 12), assisting the interpretability of the neural model.
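The rows of Table 4 can likewise be checked numerically: each inequality is one frozen perceptron computing a preference truth value, two nilpotent AND nodes mark the two target regions, and a final OR node combines them. A minimal sketch with the weights copied from Table 4:

```python
def cut(z):
    # Cutting function [z]: clamps a truth value to the unit interval.
    return min(1.0, max(0.0, z))

def pref_block(x, y):
    # First layer: preference truth values (Table 4, b = 0.5 in every row).
    x_gt_y  = cut(0.5 * x - 0.5 * y + 0.5)   # x > y:  [(x - y + 1) / 2]
    y_gt_mx = cut(0.5 * x + 0.5 * y + 0.5)   # y > -x: [(x + y + 1) / 2]
    x_lt_y  = cut(-0.5 * x + 0.5 * y + 0.5)  # x < y:  [(-x + y + 1) / 2]
    y_lt_mx = cut(-0.5 * x - 0.5 * y + 0.5)  # y < -x: [(-x - y + 1) / 2]
    # Hidden layer: nilpotent AND [a + b - 1]; output layer: OR [a + b].
    region1 = cut(x_gt_y + y_gt_mx - 1)
    region2 = cut(x_lt_y + y_lt_mx - 1)
    return cut(region1 + region2)

assert pref_block(1, 0) == 1.0    # x > y and y > -x
assert pref_block(-1, 0) == 1.0   # x < y and y < -x
assert pref_block(0, 1) == 0.0    # neither region
```

On the boundary lines the preference neurons output intermediate truth values, so the block degrades gracefully rather than jumping between classes.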

Remark 6. Networks can also be readily designed for finding concave regions. For example, see Figure 13.

Figure 11: Nilpotent neural network block designed for modeling XOR

Figure 12: Nilpotent neural network block designed for modeling preference

Remark 7. Note that the frequently used min and max operators can also be modeled by a similar network, based on Equations (4) and (5).
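Equations (4) and (5) are not part of this excerpt. As one illustration of how such a network can realize min and max, the identities max(x, y) = [y + [x − y]] and min(x, y) = [x − [x − y]] hold on the unit interval and need only two stacked cutting-function neurons each; these specific identities are the editor's example, not necessarily the paper's Equations (4) and (5):

```python
def cut(z):
    # Cutting function [z]: clamps a truth value to the unit interval.
    return min(1.0, max(0.0, z))

def nil_max(x, y):
    # max(x, y) = [y + [x - y]] for x, y in [0, 1]:
    # if x > y the inner cut passes x - y through, giving x; otherwise it is 0.
    return cut(y + cut(x - y))

def nil_min(x, y):
    # min(x, y) = [x - [x - y]] for x, y in [0, 1].
    return cut(x - cut(x - y))

assert nil_max(0.3, 0.8) == 0.8 and nil_min(0.3, 0.8) == 0.3
assert nil_max(1.0, 0.0) == 1.0 and nil_min(1.0, 0.0) == 0.0
```

Each operator is thus a two-layer sub-network with frozen weights in {−1, 1} and zero biases, matching the architecture style used for the other connectives.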

6. Conclusions

In this study, we suggested interpreting neural networks by using continuous nilpotent logic and multicriteria decision tools to reduce the black box nature of neural models, aiming at the interpretability and improved safety of machine learning. We introduced the main concept and the basic building blocks of the model to lay the foundations for the future steps of the application. In our



Figure 13: Nilpotent neural network block designed for finding a concave region

model, membership functions (representing truth values of inequalities) and also nilpotent operators are modeled by perceptrons. The network architecture is designed prior to training. In the first layer, the parameters of the membership functions need to be learned, while in the hidden layers, the nilpotent logical operators work with given weights and biases. Based on previous results, a rich set of logical operators with rigorously examined properties is available. A novel type of neural unit was also introduced by adding the square of each input to the input layer (see Figure 7) to describe the inside or the outside of a circle without polygon approximation.

The theoretical basis offers a straightforward choice of activation functions: the cutting function or its differentiable approximation, the squashing function. Both functions represent truth values of soft inequalities, and their parameters have a semantic meaning. Our model also seems to provide an explanation for the great success of the rectified linear unit (ReLU).

The concept was illustrated with some toy examples taken from an extended version of the tensorflow playground. The implementation of this hybrid model in deeper networks (by combining the building blocks introduced here) and its application, e.g. in multicriteria decision making or image classification, is left for future work.

Acknowledgments

This study was partially supported by grant TUDFO/47138-1/2019-ITM of the Ministry for Innovation and Technology, Hungary.

References

[1] B. Biggio, I. Corona, D. Maiorca, B. Nelson, N. Srndic, P. Laskov, G. Giacinto and F. Roli, Evasion Attacks Against Machine Learning at Test Time, Lecture Notes in Computer Science, 387-402, 2013.

[2] D. Clevert, T. Unterthiner and S. Hochreiter, Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs), arXiv:1511.07289, 2015.

[3] O. Csiszar, J. Dombi, Generator-based Modifiers and Membership Functions in Nilpotent Operator Systems, IEEE International Work Conference on Bioinspired Intelligence (IWOBI 2019), July 3-5, 2019, Budapest, Hungary, 2019.

[4] J. Dombi, Membership function as an evaluation, Fuzzy Sets Syst., 35, 1-21, 1990.

[5] J. Dombi, O. Csiszar, The general nilpotent operator system, Fuzzy Sets Syst., 261, 1-19, 2015.

[6] J. Dombi, O. Csiszar, Implications in bounded systems, Inform. Sciences, 283, 229-240, 2014.

[7] J. Dombi, O. Csiszar, Equivalence operators in nilpotent systems, Fuzzy Sets Syst., doi:10.1016/j.fss.2015.08.012, available online, 2015.

[8] J. Dombi, O. Csiszar, Self-dual operators and a general framework for weighted nilpotent operators, Int J Approx Reason, 81, 115-127, 2017.

[9] J. Dombi, O. Csiszar, Operator-dependent Modifiers in Nilpotent Logical Systems, in Proceedings of the 10th International Joint Conference on Computational Intelligence (IJCCI 2018), 126-134, 2018.

[10] D. Dubois, H. Prade, Fuzzy sets in approximate reasoning. Part 1: Inference with possibility distributions, Fuzzy Sets Syst., 40, 143-202, 1991.

[11] S. Elfwing, E. Uchibe and K. Doya, Sigmoid-weighted linear units for neural network function approximation in reinforcement learning, Neural Networks, 107, 3-11, 2018.

[12] M. Fischer, M. Balunovic, D. Drachsler-Cohen, T. Gehr, C. Zhang and M. Vechev, DL2: Training and Querying Neural Networks with Logic, Proceedings of the 36th International Conference on Machine Learning, Long Beach, California, PMLR 97, 2019.

[13] M. V. Franca, G. Zaverucha and A. S. d. Garcez, Fast relational learning using bottom clause propositionalization with artificial neural networks, Machine Learning, 94(1), 81-104, 2014.

[14] A. S. d. Garcez, K. Broda and D. M. Gabbay, Neural-symbolic learning systems: foundations and applications, Springer Science & Business Media, 2012.

[15] J. Dombi, Zs. Gera, Fuzzy rule based classifier construction using squashing functions, J. Intell. Fuzzy Syst., 19, 3-8, 2008.

[16] J. Dombi, Zs. Gera, The approximation of piecewise linear membership functions and Lukasiewicz operators, Fuzzy Sets Syst., 154, 275-286, 2005.

[17] I. J. Goodfellow, J. Shlens and C. Szegedy, Explaining and Harnessing Adversarial Examples, arXiv:1412.6572, 2014.

[18] Z. Hu, X. Ma, Z. Liu, E. Hovy and E. P. Xing, Harnessing Deep Neural Networks with Logic Rules, arXiv:1603.06318v5.

[19] T. D. Kulkarni, W. F. Whitney, P. Kohli and J. Tenenbaum, Deep convolutional inverse graphics network, in Proc. of NIPS, 2530-2538, 2015.

[20] C. T. Lin, C. S. G. Lee, Neural Fuzzy Systems: A Neuro-Fuzzy Synergism to Intelligent Systems, Upper Saddle River, NJ: Prentice Hall, 1996.

[21] A. L. Maas, A. Y. Hannun and A. Y. Ng, Rectifier Nonlinearities Improve Neural Network Acoustic Models, 2014.

[22] C. Szegedy, W. Zaremba, I. Sutskever, J. Bruna, D. Erhan, I. Goodfellow and R. Fergus, Intriguing properties of neural networks, arXiv:1312.6199, 2013.

[23] S. Thys, W. Van Ranst and T. Goedeme, Fooling automated surveillance cameras: adversarial patches to attack person detection, arXiv:1904.08653, 2019.

[24] G. G. Towell, J. W. Shavlik and M. O. Noordewier, Refinement of approximate domain theories by knowledge-based neural networks, in Proceedings of the Eighth National Conference on Artificial Intelligence, Boston, MA, 861-866, 1990.

[25] E. Trillas, L. Valverde, On some functionally expressable implications for fuzzy set theory, Proc. of the 3rd International Seminar on Fuzzy Set Theory, Linz, Austria, 173-190, 1981.

[26] J. Xu, Z. Zhang, T. Friedman, Y. Liang and G. V. den Broeck, A semantic loss function for deep learning with symbolic knowledge, Proceedings of the 35th International Conference on Machine Learning, ICML 2018, Stockholm, Volume 80, 5498-5507, 2018.
