UDC 519.248
Properties of the Entropy of Multiplicative-Truncated Approximations of Eventological Distributions
Oleg Yu. Vorobyev*
Institute of Mathematics, Siberian Federal University, Svobodny, 79, Krasnoyarsk, 660041
Russia
Nataly A. Lukyanova^
Siberian Federal University, Kirensky, 26, Krasnoyarsk, 660074,
Russia
Received 10.09.2010, received in revised form 10.10.2010, accepted 20.11.2010

Theorems on the entropy of an eventological distribution relative to its multiplicative-truncated approximation are formulated and proved, which extends the mathematical toolkit of eventology. The entropies of eventological approximations of various powers, as well as the relative entropies of full eventological distributions of a set of events with respect to their multiplicative-truncated approximations, are considered for a simple example of an arbitrary triplet of events.
Keywords: event, set of events, probability, eventological distribution, multiplicative-truncated projection, multicovariance, multiplicative-truncated approximation, entropy, relative entropy.
Introduction
The study of entropy properties of eventological distributions (E-distributions) relies on the eventological theory of multicovariances [1] and the theory of wide dependence [2]. An E-distribution of a set of events $\mathfrak{X}$ is characterized by the multicovariances of events, which measure the multiplicative deviation of the set of events from the independent situation. The appearance of unit values in the range of the multicovariances reduces the set of parameters sufficient to characterize the E-distribution of $\mathfrak{X}$. If all multicovariances $\tau(X)$ of subsets $X \subseteq \mathfrak{X}$ whose power exceeds some fixed value, $|X| > m$, $m = 0, 1, \ldots, |\mathfrak{X}|$, are equal to unity, then the E-distribution has maximum entropy among all E-distributions in which the values of the characteristic parameters from the given reduced set are fixed. The role of E-distributions maximizing entropy under known restrictions is played by the multiplicative-truncated approximations of the corresponding power [2].
The entropy of an E-distribution $\{p(X), X \subseteq \mathfrak{X}\}$ of a set of events $\mathfrak{X}$ is defined by the classical formula, just as the entropy of a probability distribution, or of a random variable with a finite number of values, is defined in probability theory; it can be interpreted as a measure of uncertainty of the E-distribution of the set of events [2, 3, 4]. Another type of entropy, the relative entropy, is defined as a measure of the deviation of one E-distribution from another.
In this paper a simple example for an arbitrary triplet of events is presented: the entropies of eventological approximations of various powers are considered, as well as the relative entropy of the E-distribution $p(X)$ of a set of events $\mathfrak{X}$ with respect to its multiplicative-truncated approximation $\hat p^{[N_0]}(X)$. Theorems on the entropy of a distribution relative to its multiplicative-truncated approximation are formulated and proved, using the fact, established in [3], that for fixed probabilities of the II-nd sort the maximum of the entropy of an E-distribution of the I-st sort is attained on the corresponding multiplicative-truncated approximation.

*[email protected]   ^[email protected]   © Siberian Federal University. All rights reserved
1. Entropy of the Multiplicative-Truncated Approximation
The multiplicative-truncated approximation of order $N_0$ of the E-distribution $\{p(X), X \subseteq \mathfrak{X}\}$ is defined as the E-distribution $\{\hat p^{[N_0]}(X), X \subseteq \mathfrak{X}\}$ in which

$$\hat p^{[N_0]}(X) = \frac{1}{Z_{N_0}}\prod_{m=0}^{N_0} p^{[m]}(X), \qquad (1)$$

where

$$Z_{N_0} = \sum_{X \subseteq \mathfrak{X}}\prod_{m=0}^{N_0} p^{[m]}(X) \qquad (2)$$

is the factor providing the global normalization of the multiplicative-truncated approximation $\hat p^{[N_0]}$, the order $N_0$ is any of $N_0 = 0, 1, \ldots, |\mathfrak{X}|$, and

$$p^{[m]}(X) = \prod_{Y_m \subseteq X}\tau(Y_m), \qquad X \subseteq \mathfrak{X}, \qquad (3)$$

is the multiplicative-truncated projection of order $m$ of the E-distribution $p(X)$; here $Y_m$ denotes an $m$-subset of events, $|Y_m| = m$, and $\{\tau(Y), Y \subseteq \mathfrak{X}\}$ are the multicovariances of the E-distribution $p(X)$ [1, 4].
Note that each multiplicative projection $p^{[m]}(X)$ of order $m$ of any full* E-distribution $p(X)$ is determined on $2^{\mathfrak{X}}$ only by its values $p^{[m]}(Y_m) = \tau(Y_m)$ on the $m$-subsets $Y_m \subseteq \mathfrak{X}$, whose number equals $C^{m}_{|\mathfrak{X}|}$. The remaining values of the set function $p^{[m]}(X)$ either equal 1, when $|X| < m$, or are computed by formula (3), when $|X| > m$.
For the multiplicative-truncated approximation $\hat p^{[N_0]}$, which is itself an E-distribution, the entropy is defined and computed by the formula from [2]:

$$H_{\hat p^{[N_0]}} = -\sum_{X \subseteq \mathfrak{X}}\hat p^{[N_0]}(X)\ln\hat p^{[N_0]}(X). \qquad (4)$$
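For readers who prefer to experiment numerically, the construction (1)-(4) can be sketched in a few lines of Python. This is only an illustrative sketch: the dictionary representation of an E-distribution and all function names are our own assumptions, not part of the eventological formalism.

```python
# A minimal computational sketch of formulas (1)-(4). It assumes a full E-distribution
# is stored as a dict mapping sorted tuples X (subsets of the set of events) to the
# probabilities p(X) > 0 of the I-st sort; all names here are illustrative.
from itertools import combinations
from math import log, prod

def subsets(events):
    """All subsets of a set of events, as sorted tuples."""
    ev = sorted(events)
    return [tuple(c) for m in range(len(ev) + 1) for c in combinations(ev, m)]

def multicov(p, events):
    """Multicovariances tau(X) = prod_{Y <= X} p(Y)^((-1)^(|X|-|Y|)), cf. (6) below."""
    return {X: prod(p[Y] ** ((-1) ** (len(X) - len(Y))) for Y in subsets(X))
            for X in subsets(events)}

def truncated_approx(p, events, n0):
    """Multiplicative-truncated approximation of order N0, formulas (1)-(3)."""
    tau = multicov(p, events)
    raw = {X: prod(tau[Ym] for m in range(n0 + 1) for Ym in combinations(X, m))
           for X in subsets(events)}
    z = sum(raw.values())                      # normalizing factor Z_{N0}, formula (2)
    return {X: v / z for X, v in raw.items()}

def entropy(dist):
    """Entropy of an E-distribution, formula (4)."""
    return -sum(v * log(v) for v in dist.values() if v > 0)
```

For a full E-distribution `p` stored this way, `entropy(truncated_approx(p, events, 2))` evaluates $H_{\hat p^{[2]}}$, and `entropy(p)` evaluates the entropy of the distribution itself.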
2. Example for a Triplet of Events
Let us consider an eventological space $(\Omega, \mathcal{F}, \mathbf{P})$ and an arbitrary full triplet of events $\mathfrak{X} = \{x, y, z\} \subseteq \mathcal{F}$ chosen from the algebra of events $\mathcal{F}$ of this space.
Any full triplet of events $\mathfrak{X} = \{x, y, z\}$ has a full E-distribution of the I-st sort of the form

$$\{p(\emptyset),\ p(x),\ p(y),\ p(z),\ p(x,y),\ p(x,z),\ p(y,z),\ p(x,y,z)\},$$

in which, by definition, $p(X) > 0$ for $X \subseteq \mathfrak{X}$ and $\sum_{X \subseteq \mathfrak{X}} p(X) = 1$. The E-distribution of the II-nd sort for this triplet, $\{p_\emptyset, p_x, p_y, p_z, p_{xy}, p_{xz}, p_{yz}, p_{xyz}\}$, consisting of probabilities of the II-nd sort, is connected with the E-distribution of the I-st sort by the Möbius inversion formulas

$$p_X = \sum_{X \subseteq Y} p(Y), \qquad p(X) = \sum_{X \subseteq Y} (-1)^{|Y| - |X|}\, p_Y.$$
*The set of events $\mathfrak{X}$ is called full if in its E-distribution of the I-st sort none of the $2^{|\mathfrak{X}|}$ probabilities vanishes: $p(X) > 0$, $X \subseteq \mathfrak{X}$. E-distributions of full sets of events are called full E-distributions.
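For illustration, the Möbius pair above admits a direct computational transcription. The sketch below is ours (the dictionary representation and the names are assumptions, not notation from the paper) and works for a set of events of any size.

```python
# A hedged sketch of the Mobius inversion pair linking the two sorts of probabilities;
# an E-distribution is assumed to be a dict from sorted tuples of events to probabilities.
from itertools import combinations

def second_sort(p, events):
    """Probabilities of the II-nd sort: p_X = sum over Y containing X of p(Y)."""
    all_X = [tuple(c) for m in range(len(events) + 1)
             for c in combinations(sorted(events), m)]
    return {X: sum(v for Y, v in p.items() if set(X) <= set(Y)) for X in all_X}

def first_sort(p2):
    """Mobius inversion: p(X) = sum over Y containing X of (-1)^(|Y|-|X|) * p_Y."""
    return {X: sum((-1) ** (len(Y) - len(X)) * v for Y, v in p2.items() if set(X) <= set(Y))
            for X in p2}
```

Applying `first_sort(second_sort(p, events))` reproduces the original probabilities of the I-st sort (up to rounding), which is exactly the content of the two formulas above.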
Let us write these formulas out for the given triplet:

$$p_\emptyset = p(\emptyset) + p(x) + p(y) + p(z) + p(x,y) + p(x,z) + p(y,z) + p(x,y,z) = \sum_{X \subseteq \mathfrak{X}} p(X) = 1;$$
$$p_x = p(x) + p(x,y) + p(x,z) + p(x,y,z) = \mathbf{P}(x);$$
$$p_y = p(y) + p(x,y) + p(y,z) + p(x,y,z) = \mathbf{P}(y);$$
$$p_z = p(z) + p(x,z) + p(y,z) + p(x,y,z) = \mathbf{P}(z);$$
$$p_{xy} = p(x,y) + p(x,y,z) = \mathbf{P}(x \cap y);$$
$$p_{xz} = p(x,z) + p(x,y,z) = \mathbf{P}(x \cap z);$$
$$p_{yz} = p(y,z) + p(x,y,z) = \mathbf{P}(y \cap z);$$
$$p_{xyz} = p(x,y,z) = \mathbf{P}(x \cap y \cap z).$$

From [1, 4] it is known that any full E-distribution $\{p(X), X \subseteq \mathfrak{X}\}$ of a set of events $\mathfrak{X} = \{x, y, z\} \subseteq \mathcal{F}$ can be connected [1] with the corresponding multicovariances $\{\tau(X), X \subseteq \mathfrak{X}\}$ of the E-distribution $p(X)$ by the Möbius inversion formulas:
$$p(X) = \prod_{Y \subseteq X}\tau(Y), \qquad (5)$$

$$\tau(X) = \prod_{Y \subseteq X} p(Y)^{(-1)^{|X|-|Y|}}. \qquad (6)$$
Formula (5) takes the following form for $X \subseteq \{x, y, z\}$:
$$p(\emptyset) = \tau(\emptyset);$$
$$p(x) = \tau(\emptyset)\,\tau(x); \qquad p(y) = \tau(\emptyset)\,\tau(y); \qquad p(z) = \tau(\emptyset)\,\tau(z);$$
$$p(x,y) = \tau(\emptyset)\,\tau(x)\,\tau(y)\,\tau(xy); \qquad p(x,z) = \tau(\emptyset)\,\tau(x)\,\tau(z)\,\tau(xz); \qquad p(y,z) = \tau(\emptyset)\,\tau(y)\,\tau(z)\,\tau(yz); \qquad (7)$$
$$p(x,y,z) = \tau(\emptyset)\,\tau(x)\,\tau(y)\,\tau(z)\,\tau(xy)\,\tau(xz)\,\tau(yz)\,\tau(xyz).$$

Following [2], formula (5) can be written in the equivalent form
$$p(X) = \prod_{m=0}^{|X|} p^{[m]}(X), \qquad X \subseteq \mathfrak{X}, \qquad (8)$$
where $p^{[m]}(X)$ is defined by formula (3) as the multiplicative-truncated projection of power $m$ of the E-distribution $p(X)$, and $Y_m$ are the $m$-subsets of events from $X$, i.e. $|Y_m| = m$, $m = 0, \ldots, |X|$. In particular, $p^{[0]}(X) = \tau(\emptyset) = p(\emptyset)$ and $p^{[1]}(X) = \prod_{x \in X}\tau(x)$.
According to [2, Table 4], the E-distribution of any triplet of events is defined by 8 interdependent parameters: the multicovariances of powers $m = 0, 1, 2, 3$. The multicovariances of the first and higher powers can take arbitrary positive values, while the multicovariance $\tau(\emptyset)$ is determined by the formula providing the probabilistic normalization:

$$\tau(\emptyset) = \left(\sum_{X \subseteq \mathfrak{X}}\ \prod_{m>0}\ \prod_{Y_m \subseteq X}\tau(Y_m)\right)^{-1};$$

for a triplet of events, taking (2) and (7) into account, we have

$$(\tau(\emptyset))^{-1} = 1 + \tau(x) + \tau(y) + \tau(z) + \tau(x)\tau(y)\tau(xy) + \tau(x)\tau(z)\tau(xz) + \tau(y)\tau(z)\tau(yz) +$$
$$+ \tau(x)\tau(y)\tau(z)\tau(xy)\tau(xz)\tau(yz)\tau(xyz). \qquad (9)$$
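As a numerical illustration of (7) and (9), one may choose arbitrary positive values for the multicovariances of non-zero power (the values below are ours, purely for illustration), recover $\tau(\emptyset)$ from (9) and verify that the resulting probabilities of the I-st sort sum to one.

```python
# A hedged numerical illustration of (7) and (9); the tau values are arbitrary choices.
from itertools import combinations
from math import isclose

tau = {('x',): 0.8, ('y',): 0.6, ('z',): 0.5,
       ('x', 'y'): 1.2, ('x', 'z'): 0.9, ('y', 'z'): 1.1, ('x', 'y', 'z'): 0.95}

def prod_tau(X):
    """Product of tau over all non-empty subsets of X, as in formula (7)."""
    result = 1.0
    for m in range(1, len(X) + 1):
        for Y in combinations(sorted(X), m):
            result *= tau[Y]
    return result

events = ('x', 'y', 'z')
all_X = [tuple(c) for m in range(4) for c in combinations(events, m)]
tau_empty = 1.0 / sum(prod_tau(X) for X in all_X)        # formula (9)

p = {X: tau_empty * prod_tau(X) for X in all_X}          # formula (7)
assert isclose(sum(p.values()), 1.0)                     # global normalization
```

With these values $\tau(\emptyset) \approx 0.225$, and all eight probabilities of the I-st sort are strictly positive, so the triplet is full.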
2.1. Entropy of the E-approximation of Power 0
From (1) we obtain that the E-approximation of power 0 for the E-distribution $\{p(X), X \subseteq \mathfrak{X}\}$ equals

$$\hat p^{[0]}(X) = \frac{p^{[0]}(X)}{\sum_{X \subseteq \mathfrak{X}} p^{[0]}(X)} = \frac{p(\emptyset)}{p(\emptyset)\cdot 2^{|\mathfrak{X}|}} = \frac{1}{2^{|\mathfrak{X}|}},$$

i.e. it coincides with the equiprobable E-distribution on $2^{\mathfrak{X}}$. For the triplet of events $\mathfrak{X} = \{x, y, z\}$ the 0-approximation is the equiprobable E-distribution of the I-st sort
$$\{\hat p^{[0]}(\emptyset), \hat p^{[0]}(x), \hat p^{[0]}(y), \hat p^{[0]}(z), \hat p^{[0]}(x,y), \hat p^{[0]}(x,z), \hat p^{[0]}(y,z), \hat p^{[0]}(x,y,z)\} = \left\{\tfrac{1}{8}, \tfrac{1}{8}, \tfrac{1}{8}, \tfrac{1}{8}, \tfrac{1}{8}, \tfrac{1}{8}, \tfrac{1}{8}, \tfrac{1}{8}\right\}$$
and of the II-nd sort:

$$\{\hat p^{[0]}_{\emptyset}, \hat p^{[0]}_{x}, \hat p^{[0]}_{y}, \hat p^{[0]}_{z}, \hat p^{[0]}_{xy}, \hat p^{[0]}_{xz}, \hat p^{[0]}_{yz}, \hat p^{[0]}_{xyz}\} = \left\{1, \tfrac{1}{2}, \tfrac{1}{2}, \tfrac{1}{2}, \tfrac{1}{4}, \tfrac{1}{4}, \tfrac{1}{4}, \tfrac{1}{8}\right\}.$$
By (4), the entropy of the equiprobable E-distribution of the triplet of events $\mathfrak{X} = \{x, y, z\}$ equals

$$H_{\hat p^{[0]}} = -\sum_{X \subseteq \mathfrak{X}}\hat p^{[0]}(X)\ln\hat p^{[0]}(X) = -\sum_{X \subseteq \mathfrak{X}}\tfrac{1}{8}\ln\tfrac{1}{8} = 8\cdot\left(\tfrac{1}{8}\ln 8\right) = \ln 8.$$
2.2. Entropy of the E-approximation of Power 1
From (1) we obtain that the E-approximation of power 1 for the E-distribution $\{p(X), X \subseteq \mathfrak{X}\}$ equals

$$\hat p^{[1]}(X) = \frac{p^{[0]}(X)\, p^{[1]}(X)}{\sum_{X \subseteq \mathfrak{X}} p^{[0]}(X)\, p^{[1]}(X)} = \frac{p^{[1]}(X)}{\sum_{X \subseteq \mathfrak{X}} p^{[1]}(X)} = \frac{\prod_{x \in X}\tau(x)}{\sum_{X \subseteq \mathfrak{X}}\prod_{x \in X}\tau(x)}.$$

The normalizing factor of the E-approximation of power 1 for the triplet of events $\mathfrak{X}$ is

$$Z_1 = \sum_{X \subseteq \mathfrak{X}} p^{[0]}(X)\, p^{[1]}(X) = p^{[0]}(X)\big(p^{[1]}(\emptyset) + p^{[1]}(x) + p^{[1]}(y) + p^{[1]}(z) + p^{[1]}(x,y) + p^{[1]}(x,z) + p^{[1]}(y,z) + p^{[1]}(x,y,z)\big) =$$
$$= \tau(\emptyset)\big(1 + \tau(x) + \tau(y) + \tau(z) + \tau(x)\tau(y) + \tau(x)\tau(z) + \tau(y)\tau(z) + \tau(x)\tau(y)\tau(z)\big).$$

Then

$$\{\hat p^{[1]}(\emptyset), \hat p^{[1]}(x), \hat p^{[1]}(y), \hat p^{[1]}(z), \hat p^{[1]}(x,y), \hat p^{[1]}(x,z), \hat p^{[1]}(y,z), \hat p^{[1]}(x,y,z)\} =$$
$$= \tau(\emptyset)\left\{\frac{1}{Z_1}, \frac{\tau(x)}{Z_1}, \frac{\tau(y)}{Z_1}, \frac{\tau(z)}{Z_1}, \frac{\tau(x)\tau(y)}{Z_1}, \frac{\tau(x)\tau(z)}{Z_1}, \frac{\tau(y)\tau(z)}{Z_1}, \frac{\tau(x)\tau(y)\tau(z)}{Z_1}\right\}.$$
By (4), the entropy of the E-approximation of power 1 of the triplet of events $\mathfrak{X} = \{x, y, z\}$ equals

$$H_{\hat p^{[1]}} = -\hat p^{[1]}(\emptyset)\ln\frac{\tau(\emptyset)}{Z_1} - \hat p^{[1]}(x)\ln\frac{\tau(\emptyset)\tau(x)}{Z_1} - \hat p^{[1]}(y)\ln\frac{\tau(\emptyset)\tau(y)}{Z_1} - \hat p^{[1]}(z)\ln\frac{\tau(\emptyset)\tau(z)}{Z_1} -$$
$$- \hat p^{[1]}(x,y)\ln\frac{\tau(\emptyset)\tau(x)\tau(y)}{Z_1} - \hat p^{[1]}(x,z)\ln\frac{\tau(\emptyset)\tau(x)\tau(z)}{Z_1} - \hat p^{[1]}(y,z)\ln\frac{\tau(\emptyset)\tau(y)\tau(z)}{Z_1} -$$
$$- \hat p^{[1]}(x,y,z)\ln\frac{\tau(\emptyset)\tau(x)\tau(y)\tau(z)}{Z_1} = \ln Z_1\sum_{X \subseteq \mathfrak{X}}\hat p^{[1]}(X) - \ln\tau(\emptyset)\sum_{X \subseteq \mathfrak{X}}\hat p^{[1]}(X) -$$
$$- \left[\hat p^{[1]}(x) + \hat p^{[1]}(x,y) + \hat p^{[1]}(x,z) + \hat p^{[1]}(x,y,z)\right]\ln\tau(x) -$$
$$- \left[\hat p^{[1]}(y) + \hat p^{[1]}(x,y) + \hat p^{[1]}(y,z) + \hat p^{[1]}(x,y,z)\right]\ln\tau(y) -$$
$$- \left[\hat p^{[1]}(z) + \hat p^{[1]}(x,z) + \hat p^{[1]}(y,z) + \hat p^{[1]}(x,y,z)\right]\ln\tau(z) =$$
$$= \ln Z_1 - \hat p^{[1]}_{\emptyset}\ln\tau(\emptyset) - \hat p^{[1]}_{x}\ln\tau(x) - \hat p^{[1]}_{y}\ln\tau(y) - \hat p^{[1]}_{z}\ln\tau(z) = \ln Z_1 - \sum_{m=0}^{1}\sum_{Y_m \subseteq \mathfrak{X}}\hat p^{[1]}_{Y_m}\ln\tau(Y_m).$$
2.3. Entropy of the E-approximation of Power 2
From (1) we obtain that the E-approximation of power 2 for the E-distribution $\{p(X), X \subseteq \mathfrak{X}\}$ equals

$$\hat p^{[2]}(X) = \frac{p^{[0]}(X)\, p^{[1]}(X)\, p^{[2]}(X)}{\sum_{X \subseteq \mathfrak{X}} p^{[0]}(X)\, p^{[1]}(X)\, p^{[2]}(X)} = \frac{\prod_{x \in X}\tau(x)\prod_{\{x,y\} \subseteq X}\tau(xy)}{\sum_{X \subseteq \mathfrak{X}}\prod_{x \in X}\tau(x)\prod_{\{x,y\} \subseteq X}\tau(xy)}, \qquad \sum_{X \subseteq \mathfrak{X}}\hat p^{[2]}(X) = 1.$$

The normalizing factor of the E-approximation of power 2 for the triplet of events $\mathfrak{X}$ is

$$Z_2 = \sum_{X \subseteq \mathfrak{X}} p^{[0]}(X)\, p^{[1]}(X)\, p^{[2]}(X) = \tau(\emptyset)\big(1 + \tau(x) + \tau(y) + \tau(z) + \tau(x)\tau(y)\tau(xy) +$$
$$+ \tau(x)\tau(z)\tau(xz) + \tau(y)\tau(z)\tau(yz) + \tau(x)\tau(y)\tau(z)\tau(xy)\tau(xz)\tau(yz)\big).$$

Then

$$\{\hat p^{[2]}(\emptyset), \hat p^{[2]}(x), \hat p^{[2]}(y), \hat p^{[2]}(z), \hat p^{[2]}(x,y), \hat p^{[2]}(x,z), \hat p^{[2]}(y,z), \hat p^{[2]}(x,y,z)\} = \tau(\emptyset)\Big\{\frac{1}{Z_2}, \frac{\tau(x)}{Z_2}, \frac{\tau(y)}{Z_2}, \frac{\tau(z)}{Z_2},$$
$$\frac{\tau(x)\tau(y)\tau(xy)}{Z_2}, \frac{\tau(x)\tau(z)\tau(xz)}{Z_2}, \frac{\tau(y)\tau(z)\tau(yz)}{Z_2}, \frac{\tau(x)\tau(y)\tau(z)\tau(xy)\tau(xz)\tau(yz)}{Z_2}\Big\}.$$
By (4), the entropy of the E-approximation of power 2 of the triplet of events $\mathfrak{X} = \{x, y, z\}$ equals

$$H_{\hat p^{[2]}} = -\sum_{X \subseteq \mathfrak{X}}\hat p^{[2]}(X)\ln\hat p^{[2]}(X) = \ln Z_2\sum_{X \subseteq \mathfrak{X}}\hat p^{[2]}(X) - \ln\tau(\emptyset)\sum_{X \subseteq \mathfrak{X}}\hat p^{[2]}(X) -$$
$$- \left[\hat p^{[2]}(x) + \hat p^{[2]}(x,y) + \hat p^{[2]}(x,z) + \hat p^{[2]}(x,y,z)\right]\ln\tau(x) - \left[\hat p^{[2]}(x,y) + \hat p^{[2]}(x,y,z)\right]\ln\tau(xy) -$$
$$- \left[\hat p^{[2]}(y) + \hat p^{[2]}(x,y) + \hat p^{[2]}(y,z) + \hat p^{[2]}(x,y,z)\right]\ln\tau(y) - \left[\hat p^{[2]}(x,z) + \hat p^{[2]}(x,y,z)\right]\ln\tau(xz) -$$
$$- \left[\hat p^{[2]}(z) + \hat p^{[2]}(x,z) + \hat p^{[2]}(y,z) + \hat p^{[2]}(x,y,z)\right]\ln\tau(z) - \left[\hat p^{[2]}(y,z) + \hat p^{[2]}(x,y,z)\right]\ln\tau(yz) =$$
$$= \ln Z_2 - \hat p^{[2]}_{\emptyset}\ln\tau(\emptyset) - \hat p^{[2]}_{x}\ln\tau(x) - \hat p^{[2]}_{y}\ln\tau(y) - \hat p^{[2]}_{z}\ln\tau(z) - \hat p^{[2]}_{xy}\ln\tau(xy) - \hat p^{[2]}_{xz}\ln\tau(xz) -$$
$$- \hat p^{[2]}_{yz}\ln\tau(yz) = \ln Z_2 - \sum_{m=0}^{2}\sum_{Y_m \subseteq \mathfrak{X}}\hat p^{[2]}_{Y_m}\ln\tau(Y_m).$$
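The closed-form expressions obtained in Subsections 2.1-2.3 are easy to cross-check numerically: for each power $N_0$ the entropy computed directly by (4) must coincide with $\ln Z_{N_0} - \sum_{m=0}^{N_0}\sum_{Y_m \subseteq \mathfrak{X}}\hat p^{[N_0]}_{Y_m}\ln\tau(Y_m)$. Below is a minimal self-contained sketch; the multicovariance values are the same arbitrary illustrative ones as before, and the names are ours.

```python
# A hedged numerical check of the entropy formulas of Subsections 2.1-2.3.
from itertools import combinations
from math import log, isclose, prod

events = ('x', 'y', 'z')
all_X = [tuple(c) for m in range(len(events) + 1) for c in combinations(events, m)]

# arbitrary illustrative multicovariances of powers 1, 2, 3
tau = {('x',): 0.8, ('y',): 0.6, ('z',): 0.5,
       ('x', 'y'): 1.2, ('x', 'z'): 0.9, ('y', 'z'): 1.1, ('x', 'y', 'z'): 0.95}
tau[()] = 1.0 / sum(prod(tau[Y] for m in range(1, len(X) + 1)
                         for Y in combinations(X, m)) for X in all_X)   # formula (9)

def approx(n0):
    """Approximation of power n0 and its normalizing factor Z_{n0}, formulas (1)-(3)."""
    raw = {X: prod(tau[Y] for m in range(n0 + 1) for Y in combinations(X, m))
           for X in all_X}
    z = sum(raw.values())
    return {X: v / z for X, v in raw.items()}, z

for n0 in (0, 1, 2):
    phat, z = approx(n0)
    direct = -sum(v * log(v) for v in phat.values())                    # formula (4)
    closed = log(z) - sum(sum(phat[X] for X in all_X if set(Ym) <= set(X)) * log(tau[Ym])
                          for m in range(n0 + 1) for Ym in combinations(events, m))
    assert isclose(direct, closed)
```

For the power-0 case both sides reduce to $\ln 8$, in agreement with Subsection 2.1.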
2.4. Relative Entropy
The entropy of the E-distribution $\{p(X), X \subseteq \mathfrak{X}\}$ relative to its multiplicative-truncated approximation $\hat p^{[N_0]}$ is defined by the formula

$$H_{p/\hat p^{[N_0]}} = \sum_{X \subseteq \mathfrak{X}} p(X)\ln\frac{p(X)}{\hat p^{[N_0]}(X)}, \qquad (10)$$

which after a transformation takes the form

$$H_{p/\hat p^{[N_0]}} = -H_p - \sum_{X \subseteq \mathfrak{X}} p(X)\ln\hat p^{[N_0]}(X), \qquad (11)$$

where $H_p = -\sum_{X \subseteq \mathfrak{X}} p(X)\ln p(X)$ is the entropy of the E-distribution $\{p(X), X \subseteq \mathfrak{X}\}$.
Using the same technique as in the previous subsections, we write the entropy of the E-distribution of the triplet of events $\mathfrak{X} = \{x, y, z\}$:

$$H_p = -\sum_{X \subseteq \mathfrak{X}} p(X)\ln p(X) = -\ln\tau(\emptyset)\sum_{X \subseteq \mathfrak{X}} p(X) - p(x)\ln\tau(x) - p(y)\ln\tau(y) - p(z)\ln\tau(z) -$$
$$- p(x,y)\ln\big(\tau(x)\tau(y)\tau(xy)\big) - p(x,z)\ln\big(\tau(x)\tau(z)\tau(xz)\big) - p(y,z)\ln\big(\tau(y)\tau(z)\tau(yz)\big) -$$
$$- p(x,y,z)\ln\big(\tau(x)\tau(y)\tau(z)\tau(xy)\tau(xz)\tau(yz)\tau(xyz)\big) = -\ln\tau(\emptyset)\sum_{X \subseteq \mathfrak{X}} p(X) -$$
$$- \left[p(x) + p(x,y) + p(x,z) + p(x,y,z)\right]\ln\tau(x) - \left[p(y) + p(x,y) + p(y,z) + p(x,y,z)\right]\ln\tau(y) -$$
$$- \left[p(z) + p(x,z) + p(y,z) + p(x,y,z)\right]\ln\tau(z) - \left[p(x,y) + p(x,y,z)\right]\ln\tau(xy) -$$
$$- \left[p(x,z) + p(x,y,z)\right]\ln\tau(xz) - \left[p(y,z) + p(x,y,z)\right]\ln\tau(yz) - p(x,y,z)\ln\tau(xyz) =$$
$$= -p_\emptyset\ln\tau(\emptyset) - p_x\ln\tau(x) - p_y\ln\tau(y) - p_z\ln\tau(z) - p_{xy}\ln\tau(xy) - p_{xz}\ln\tau(xz) -$$
$$- p_{yz}\ln\tau(yz) - p_{xyz}\ln\tau(xyz) = -\sum_{m=0}^{3}\sum_{Y_m \subseteq \mathfrak{X}} p_{Y_m}\ln\tau(Y_m).$$
Consider the relative entropy of the E-distribution of a full triplet of events $\mathfrak{X} = \{x, y, z\}$ with respect to its approximation of power 0. On the one hand,

$$H_{p/\hat p^{[0]}} = \sum_{X \subseteq \mathfrak{X}} p(X)\ln\frac{p(X)}{\hat p^{[0]}(X)} = \sum_{X \subseteq \mathfrak{X}} p(X)\ln p(X) - \sum_{X \subseteq \mathfrak{X}} p(X)\ln\hat p^{[0]}(X) =$$
$$= \sum_{m=0}^{3}\sum_{Y_m \subseteq \mathfrak{X}} p_{Y_m}\ln\tau(Y_m) - \ln\frac{\tau(\emptyset)}{Z_0}\sum_{X \subseteq \mathfrak{X}} p(X) =$$
$$= \sum_{m=0}^{3}\sum_{Y_m \subseteq \mathfrak{X}} p_{Y_m}\ln\tau(Y_m) - p_\emptyset\ln\tau(\emptyset) + \ln Z_0 = \ln Z_0 + \sum_{m=1}^{3}\sum_{Y_m \subseteq \mathfrak{X}} p_{Y_m}\ln\tau(Y_m),$$

where $Z_0 = \sum_{X \subseteq \mathfrak{X}} p^{[0]}(X) = 2^{|\mathfrak{X}|}\,\tau(\emptyset)$. On the other hand,

$$H_{p/\hat p^{[0]}} = -H_p - \sum_{X \subseteq \mathfrak{X}} p(X)\ln\frac{1}{8} = -H_p + \ln 8\sum_{X \subseteq \mathfrak{X}} p(X) = -H_p + \ln 8 = -H_p + H_{\hat p^{[0]}}.$$
The relative entropy of the E-distribution of a full triplet of events $\mathfrak{X} = \{x, y, z\}$ with respect to its approximation of power 1 is transformed to the form

$$H_{p/\hat p^{[1]}} = -H_p - \sum_{X \subseteq \mathfrak{X}} p(X)\ln\hat p^{[1]}(X),$$

where the same technique as in the previous subsections is used for the second summand:

$$-\sum_{X \subseteq \mathfrak{X}} p(X)\ln\hat p^{[1]}(X) = -p(\emptyset)\ln\frac{\tau(\emptyset)}{Z_1} - p(x)\ln\frac{\tau(\emptyset)\tau(x)}{Z_1} - p(y)\ln\frac{\tau(\emptyset)\tau(y)}{Z_1} -$$
$$- p(z)\ln\frac{\tau(\emptyset)\tau(z)}{Z_1} - p(x,y)\ln\frac{\tau(\emptyset)\tau(x)\tau(y)}{Z_1} - p(x,z)\ln\frac{\tau(\emptyset)\tau(x)\tau(z)}{Z_1} -$$
$$- p(y,z)\ln\frac{\tau(\emptyset)\tau(y)\tau(z)}{Z_1} - p(x,y,z)\ln\frac{\tau(\emptyset)\tau(x)\tau(y)\tau(z)}{Z_1} = \ln Z_1\sum_{X \subseteq \mathfrak{X}} p(X) -$$
$$- \ln\tau(\emptyset)\sum_{X \subseteq \mathfrak{X}} p(X) - \left[p(x) + p(x,y) + p(x,z) + p(x,y,z)\right]\ln\tau(x) -$$
$$- \left[p(y) + p(x,y) + p(y,z) + p(x,y,z)\right]\ln\tau(y) - \left[p(z) + p(x,z) + p(y,z) + p(x,y,z)\right]\ln\tau(z) =$$
$$= \ln Z_1 - p_\emptyset\ln\tau(\emptyset) - p_x\ln\tau(x) - p_y\ln\tau(y) - p_z\ln\tau(z) = \ln Z_1 - \sum_{m=0}^{1}\sum_{Y_m \subseteq \mathfrak{X}} p_{Y_m}\ln\tau(Y_m).$$

Hence

$$H_{p/\hat p^{[1]}} = \sum_{m=0}^{3}\sum_{Y_m \subseteq \mathfrak{X}} p_{Y_m}\ln\tau(Y_m) + \ln Z_1 - \sum_{m=0}^{1}\sum_{Y_m \subseteq \mathfrak{X}} p_{Y_m}\ln\tau(Y_m) = \ln Z_1 + \sum_{m=2}^{3}\sum_{Y_m \subseteq \mathfrak{X}} p_{Y_m}\ln\tau(Y_m).$$
We carry out a similar derivation of the second summand in (11) for the approximation of power 2:

$$-\sum_{X \subseteq \mathfrak{X}} p(X)\ln\hat p^{[2]}(X) = \ln Z_2\sum_{X \subseteq \mathfrak{X}} p(X) - \ln\tau(\emptyset)\sum_{X \subseteq \mathfrak{X}} p(X) -$$
$$- \left[p(x) + p(x,y) + p(x,z) + p(x,y,z)\right]\ln\tau(x) - \left[p(x,y) + p(x,y,z)\right]\ln\tau(xy) -$$
$$- \left[p(y) + p(x,y) + p(y,z) + p(x,y,z)\right]\ln\tau(y) - \left[p(x,z) + p(x,y,z)\right]\ln\tau(xz) -$$
$$- \left[p(z) + p(x,z) + p(y,z) + p(x,y,z)\right]\ln\tau(z) - \left[p(y,z) + p(x,y,z)\right]\ln\tau(yz) =$$
$$= \ln Z_2 - p_\emptyset\ln\tau(\emptyset) - p_x\ln\tau(x) - p_y\ln\tau(y) - p_z\ln\tau(z) - p_{xy}\ln\tau(xy) - p_{xz}\ln\tau(xz) -$$
$$- p_{yz}\ln\tau(yz) = \ln Z_2 - \sum_{m=0}^{2}\sum_{Y_m \subseteq \mathfrak{X}} p_{Y_m}\ln\tau(Y_m).$$
Then the relative entropy of the E-distribution $\{p(X), X \subseteq \mathfrak{X}\}$ of a full triplet of events $\mathfrak{X} = \{x, y, z\}$ with respect to its approximation of power 2, $\{\hat p^{[2]}(X), X \subseteq \mathfrak{X}\}$, is

$$H_{p/\hat p^{[2]}} = \sum_{X \subseteq \mathfrak{X}} p(X)\ln\frac{p(X)}{\hat p^{[2]}(X)} = \sum_{X \subseteq \mathfrak{X}} p(X)\ln p(X) - \sum_{X \subseteq \mathfrak{X}} p(X)\ln\hat p^{[2]}(X) =$$
$$= \sum_{m=0}^{3}\sum_{Y_m \subseteq \mathfrak{X}} p_{Y_m}\ln\tau(Y_m) + \ln Z_2 - \sum_{m=0}^{2}\sum_{Y_m \subseteq \mathfrak{X}} p_{Y_m}\ln\tau(Y_m) = \ln Z_2 + p_{xyz}\ln\tau(xyz).$$
The entropy of the E-distribution of a full triplet of events $\mathfrak{X}$ coincides with the entropy of the E-approximation of power 3:

$$H_{\hat p^{[3]}} = H_p = -\sum_{X \subseteq \mathfrak{X}} p(X)\ln p(X),$$

since

$$\hat p^{[3]}(X) = \frac{\prod_{m=0}^{3} p^{[m]}(X)}{\sum_{X \subseteq \mathfrak{X}}\prod_{m=0}^{3} p^{[m]}(X)} = \frac{p(X)}{\sum_{X \subseteq \mathfrak{X}} p(X)} = p(X), \qquad \sum_{X \subseteq \mathfrak{X}} p(X) = 1.$$

Then

$$H_{p/\hat p^{[3]}} = \sum_{X \subseteq \mathfrak{X}} p(X)\ln\frac{p(X)}{\hat p^{[3]}(X)} = \sum_{X \subseteq \mathfrak{X}} p(X)\ln 1 = 0.$$
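Before turning to the general statements, the results of this subsection can be verified numerically for the same illustrative multicovariance values as above; in particular, $H_{p/\hat p^{[2]}} = \ln Z_2 + p_{xyz}\ln\tau(xyz)$ and $H_{p/\hat p^{[3]}} = 0$. A hedged sketch (all names are ours):

```python
# A hedged numerical check of the relative entropies computed in Subsection 2.4.
from itertools import combinations
from math import log, isclose, prod

events = ('x', 'y', 'z')
all_X = [tuple(c) for m in range(len(events) + 1) for c in combinations(events, m)]

tau = {('x',): 0.8, ('y',): 0.6, ('z',): 0.5,
       ('x', 'y'): 1.2, ('x', 'z'): 0.9, ('y', 'z'): 1.1, ('x', 'y', 'z'): 0.95}
tau[()] = 1.0 / sum(prod(tau[Y] for m in range(1, len(X) + 1)
                         for Y in combinations(X, m)) for X in all_X)   # formula (9)

# the full E-distribution of the I-st sort, formula (5); normalized by construction
p = {X: prod(tau[Y] for m in range(len(X) + 1) for Y in combinations(X, m)) for X in all_X}

def approx(n0):
    """Multiplicative-truncated approximation of power n0 and its factor Z_{n0}."""
    raw = {X: prod(tau[Y] for m in range(n0 + 1) for Y in combinations(X, m))
           for X in all_X}
    z = sum(raw.values())
    return {X: v / z for X, v in raw.items()}, z

def rel_entropy(n0):
    """Relative entropy of p with respect to the approximation of power n0, formula (10)."""
    phat, _ = approx(n0)
    return sum(p[X] * log(p[X] / phat[X]) for X in all_X)

_, z2 = approx(2)
assert isclose(rel_entropy(2),
               log(z2) + p[('x', 'y', 'z')] * log(tau[('x', 'y', 'z')]), abs_tol=1e-12)
assert isclose(rel_entropy(3), 0.0, abs_tol=1e-12)
```

The first assertion reproduces the closed form derived above for the power-2 approximation; the second confirms that the power-3 approximation restores the distribution itself.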
3. Properties of the Relative Entropy of an E-distribution with Respect to its Multiplicative-Truncated Approximation
Theorem 1. The relative entropy of any full E-distribution $p$ with respect to its multiplicative-truncated approximation $\hat p^{[N_0]}$ of any order $N_0 = 0, 1, \ldots, |\mathfrak{X}|$ equals

$$H_{p/\hat p^{[N_0]}} = \ln Z_{N_0} + \sum_{m=N_0+1}^{|\mathfrak{X}|}\sum_{Y_m \subseteq \mathfrak{X}} p_{Y_m}\ln\tau(Y_m), \qquad (12)$$

where

$$Z_{N_0} = \sum_{X \subseteq \mathfrak{X}}\prod_{m=0}^{N_0}\prod_{Y_m \subseteq X}\tau(Y_m)$$

is the factor providing the global normalization of the multiplicative-truncated approximation $\hat p^{[N_0]}$.
Proof. The entropy of the E-distribution $\{p(X), X \subseteq \mathfrak{X}\}$ relative to its multiplicative-truncated approximation $\hat p^{[N_0]}$ is defined by formula (10) together with its transformation (11):

$$H_{p/\hat p^{[N_0]}} = \sum_{X \subseteq \mathfrak{X}} p(X)\ln\frac{p(X)}{\hat p^{[N_0]}(X)} = \sum_{X \subseteq \mathfrak{X}} p(X)\ln p(X) - \sum_{X \subseteq \mathfrak{X}} p(X)\ln\hat p^{[N_0]}(X). \qquad (13)$$

The first summand is, up to sign, the entropy of the E-distribution $\{p(X), X \subseteq \mathfrak{X}\}$, for which we introduce the notation

$$H_p = -\sum_{X \subseteq \mathfrak{X}} p(X)\ln p(X).$$

By (8) and (3),

$$H_p = -\sum_{X \subseteq \mathfrak{X}} p(X)\ln\left(\prod_{m=0}^{|X|} p^{[m]}(X)\right) = -\sum_{X \subseteq \mathfrak{X}}\left(p(X)\cdot\sum_{m=0}^{|X|}\ln p^{[m]}(X)\right) =$$
$$= -\sum_{X \subseteq \mathfrak{X}}\left(p(X)\cdot\sum_{m=0}^{|X|}\ln\Big(\prod_{Y_m \subseteq X}\tau(Y_m)\Big)\right) = -\sum_{X \subseteq \mathfrak{X}}\left(p(X)\cdot\sum_{m=0}^{|X|}\sum_{Y_m \subseteq X}\ln\tau(Y_m)\right).$$

Changing the order of summation gives

$$H_p = -\sum_{m=0}^{|\mathfrak{X}|}\sum_{Y_m \subseteq \mathfrak{X}}\ln\tau(Y_m)\cdot\sum_{X \supseteq Y_m} p(X),$$

where $\sum_{X \supseteq Y_m} p(X) = p_{Y_m}$ is a probability of the II-nd sort of the E-distribution $p$. Hence we obtain the expression

$$H_p = -\sum_{m=0}^{|\mathfrak{X}|}\sum_{Y_m \subseteq \mathfrak{X}} p_{Y_m}\ln\tau(Y_m). \qquad (14)$$

For the second summand we introduce the notation

$$\tilde H = -\sum_{X \subseteq \mathfrak{X}} p(X)\ln\hat p^{[N_0]}(X).$$

By (1) and (3),

$$\tilde H = -\sum_{X \subseteq \mathfrak{X}} p(X)\ln\frac{\prod_{m=0}^{N_0} p^{[m]}(X)}{Z_{N_0}} = \sum_{X \subseteq \mathfrak{X}} p(X)\ln Z_{N_0} - \sum_{X \subseteq \mathfrak{X}}\left(p(X)\cdot\sum_{m=0}^{N_0}\ln p^{[m]}(X)\right) =$$
$$= \ln Z_{N_0} - \sum_{X \subseteq \mathfrak{X}}\left(p(X)\cdot\sum_{m=0}^{N_0}\sum_{Y_m \subseteq X}\ln\tau(Y_m)\right).$$

Changing the order of summation gives

$$\tilde H = \ln Z_{N_0} - \sum_{m=0}^{N_0}\sum_{Y_m \subseteq \mathfrak{X}}\ln\tau(Y_m)\cdot\sum_{X \supseteq Y_m} p(X) = \ln Z_{N_0} - \sum_{m=0}^{N_0}\sum_{Y_m \subseteq \mathfrak{X}} p_{Y_m}\ln\tau(Y_m), \qquad (15)$$

which connects $\tilde H$ with the probabilities of the II-nd sort on subsets of events $Y \subseteq \mathfrak{X}$ up to power $N_0$ inclusive. Further, inserting (14) and (15) into (13), we obtain

$$H_{p/\hat p^{[N_0]}} = -H_p + \tilde H = \sum_{m=0}^{|\mathfrak{X}|}\sum_{Y_m \subseteq \mathfrak{X}} p_{Y_m}\ln\tau(Y_m) + \ln Z_{N_0} - \sum_{m=0}^{N_0}\sum_{Y_m \subseteq \mathfrak{X}} p_{Y_m}\ln\tau(Y_m) =$$
$$= \ln Z_{N_0} + \sum_{m=N_0+1}^{|\mathfrak{X}|}\sum_{Y_m \subseteq \mathfrak{X}} p_{Y_m}\ln\tau(Y_m). \qquad \Box$$
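Formula (12) is easy to test numerically beyond the triplet case. The sketch below generates a random full E-distribution of four events, computes its multicovariances by (6) and checks (12) for every order $N_0$; all names and the random construction are ours, purely for illustration.

```python
# A hedged numerical check of Theorem 1 (formula (12)) on a random full E-distribution.
import random
from itertools import combinations
from math import log, isclose, prod

random.seed(1)
events = ('a', 'b', 'c', 'd')
all_X = [tuple(c) for m in range(len(events) + 1) for c in combinations(events, m)]

w = {X: random.uniform(0.1, 1.0) for X in all_X}            # strictly positive weights
total = sum(w.values())
p = {X: v / total for X, v in w.items()}                    # a full E-distribution

tau = {X: prod(p[Y] ** ((-1) ** (len(X) - len(Y)))          # multicovariances, formula (6)
               for m in range(len(X) + 1) for Y in combinations(X, m))
       for X in all_X}
p2 = {Y: sum(p[X] for X in all_X if set(Y) <= set(X)) for Y in all_X}   # II-nd sort

for n0 in range(len(events) + 1):
    raw = {X: prod(tau[Y] for m in range(n0 + 1) for Y in combinations(X, m))
           for X in all_X}                                   # numerator of formula (1)
    z = sum(raw.values())                                    # Z_{N0}, formula (2)
    lhs = sum(p[X] * log(p[X] * z / raw[X]) for X in all_X)  # relative entropy (10)
    rhs = log(z) + sum(p2[Y] * log(tau[Y]) for Y in all_X if len(Y) > n0)   # (12)
    assert isclose(lhs, rhs, abs_tol=1e-10)
```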
Theorem 2. The relative entropy of any full E-distribution $p$ whose probabilities of the II-nd sort on subsets of events up to power $N_0$ inclusive are fixed, with respect to its multiplicative-truncated approximation $\hat p^{[N_0]}$ of any order $N_0 = 0, 1, \ldots, |\mathfrak{X}|$, equals zero:

$$H_{p/\hat p^{[N_0]}} = -H_p + H_{\hat p^{[N_0]}} = 0. \qquad (16)$$
Proof. We write the relative entropy using the notation introduced in the proof of Theorem 1:

$$H_{p/\hat p^{[N_0]}} = -H_p + \tilde H,$$

where, by (11) and (15), the second summand

$$\tilde H = \ln Z_{N_0} - \sum_{m=0}^{N_0}\sum_{Y_m \subseteq \mathfrak{X}} p_{Y_m}\ln\tau(Y_m) \qquad (17)$$

is expressed through the probabilities of the II-nd sort on subsets of events $Y \subseteq \mathfrak{X}$ up to power $N_0$ inclusive.
Consider the entropy of the multiplicative-truncated approximation $\hat p^{[N_0]}$. Substituting formula (1), and then (3), into formula (4), we obtain

$$H_{\hat p^{[N_0]}} = -\sum_{X \subseteq \mathfrak{X}}\hat p^{[N_0]}(X)\ln\frac{\prod_{m=0}^{N_0} p^{[m]}(X)}{Z_{N_0}} = \ln Z_{N_0} - \sum_{X \subseteq \mathfrak{X}}\left(\hat p^{[N_0]}(X)\cdot\sum_{m=0}^{N_0}\ln p^{[m]}(X)\right) =$$
$$= \ln Z_{N_0} - \sum_{X \subseteq \mathfrak{X}}\left(\hat p^{[N_0]}(X)\cdot\sum_{m=0}^{N_0}\sum_{Y_m \subseteq X}\ln\tau(Y_m)\right).$$

Changing the order of summation gives

$$H_{\hat p^{[N_0]}} = \ln Z_{N_0} - \sum_{m=0}^{N_0}\sum_{Y_m \subseteq \mathfrak{X}}\ln\tau(Y_m)\cdot\sum_{X \supseteq Y_m}\hat p^{[N_0]}(X).$$

Since $\sum_{X \supseteq Y_m}\hat p^{[N_0]}(X) = \hat p^{[N_0]}_{Y_m}$ is a probability of the II-nd sort of the multiplicative-truncated approximation $\hat p^{[N_0]}$, we obtain the expression

$$H_{\hat p^{[N_0]}} = \ln Z_{N_0} - \sum_{m=0}^{N_0}\sum_{Y_m \subseteq \mathfrak{X}}\hat p^{[N_0]}_{Y_m}\ln\tau(Y_m), \qquad (18)$$

which connects the entropy of the multiplicative-truncated approximation with its probabilities of the II-nd sort on subsets of events $Y \subseteq \mathfrak{X}$ up to power $N_0$ inclusive.
Using the theorem of A. Vorobyev [3] and its consequence in the paper of O. Vorobyev [2], that the entropy of E-distributions having fixed probabilities of the II-nd sort on subsets of events up to power $N_0$ inclusive reaches its maximum on the multiplicative-truncated approximation of order $N_0$, we obtain that expressions (17) and (18) coincide: the multiplicative-truncated approximation, belonging to the class of E-distributions with the given fixed probabilities of the II-nd sort, has the same probabilities of the II-nd sort as the distribution itself:

$$\hat p^{[N_0]}_{Y} = p_Y, \qquad |Y| \leqslant N_0.$$

Thus it is shown that

$$H_{p/\hat p^{[N_0]}} = -H_p + H_{\hat p^{[N_0]}}$$

for any full E-distribution $p$ having fixed probabilities of the II-nd sort on subsets of events up to power $N_0$ inclusive, when the entropy of the E-distribution reaches its maximum on its multiplicative-truncated approximation of order $N_0$ [5].
In the paper by A. Vorobyev [3] and in the papers by O. Vorobyev [1, 4] it has been proved that the entropy of E-distributions of the I-st sort $\{p(X), X \subseteq \mathfrak{X}\}$ reaches its maximum on E-distributions for which the probabilities of the II-nd sort of intersections of low-power $m$-subsets of events ($m \leqslant N_0 \leqslant |\mathfrak{X}|$) are fixed and whose multicovariances of the I-st sort of the corresponding high powers, $\{\tau(X), |X| > N_0\}$, are equal to unity. Hence the second summand in formula (12) vanishes:

$$H_{p/\hat p^{[N_0]}} = \ln Z_{N_0} + \sum_{m=N_0+1}^{|\mathfrak{X}|}\sum_{Y_m \subseteq \mathfrak{X}} p_{Y_m}\ln\tau(Y_m).$$

Taking into account that $\tau(Y_m) = 1$ for $|Y_m| > N_0$, we write the normalizing factor as

$$Z_{N_0} = \sum_{X \subseteq \mathfrak{X}}\prod_{m=0}^{N_0}\prod_{Y_m \subseteq X}\tau(Y_m) = \sum_{X \subseteq \mathfrak{X}}\prod_{m=0}^{|X|}\prod_{Y_m \subseteq X}\tau(Y_m) = \sum_{X \subseteq \mathfrak{X}} p(X) = 1.$$

Hence $\ln Z_{N_0} = 0$, and the first summand turns to zero as well. Thus it is proved that
$$H_{p/\hat p^{[N_0]}} = -H_p + H_{\hat p^{[N_0]}} = 0$$

for any full E-distribution $p$ having fixed probabilities of the II-nd sort on subsets of events up to power $N_0$ inclusive. $\Box$
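The mechanism behind Theorem 2 can also be seen computationally: if all multicovariances of power greater than $N_0$ are set to unity, the approximation of order $N_0$ reproduces the distribution itself, $Z_{N_0} = 1$, and the relative entropy vanishes. A hedged sketch for $N_0 = 1$ (the values and names are ours):

```python
# A hedged sketch of the situation in Theorem 2: tau = 1 above power N0 = 1.
from itertools import combinations
from math import log, isclose, prod

events, n0 = ('x', 'y', 'z'), 1
all_X = [tuple(c) for m in range(len(events) + 1) for c in combinations(events, m)]

tau = {X: 1.0 for X in all_X}                               # unit multicovariances ...
tau[('x',)], tau[('y',)], tau[('z',)] = 0.8, 0.6, 0.5       # ... except arbitrary low powers
tau[()] = 1.0 / sum(prod(tau[Y] for m in range(1, len(X) + 1)
                         for Y in combinations(X, m)) for X in all_X)    # formula (9)

p = {X: prod(tau[Y] for m in range(len(X) + 1) for Y in combinations(X, m))
     for X in all_X}                                        # formula (5)

raw = {X: prod(tau[Y] for m in range(n0 + 1) for Y in combinations(X, m)) for X in all_X}
z = sum(raw.values())                                       # Z_{N0} = 1, as in the proof
assert isclose(z, 1.0)
assert all(isclose(raw[X] / z, p[X]) for X in all_X)        # the approximation equals p
assert isclose(sum(p[X] * log(p[X] * z / raw[X]) for X in all_X), 0.0, abs_tol=1e-12)
```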
Conclusion
The eventological theory of multicovariances and the theory of wide dependence are actively used for studying entropy properties in eventology. Based on these theories, theorems on the relative entropy of an E-distribution with respect to its multiplicative-truncated approximation have been formulated and proved in this article as a consequence of the theorems proved in [2, 3]. The relative entropies of the E-distribution $p(X)$ of a set of events $\mathfrak{X}$ with respect to its multiplicative-truncated approximations $\hat p^{[N_0]}(X)$ are illustrated by an example of an arbitrary triplet of events; these illustrations inductively precede the proofs of the general theorems. This work extends the mathematical tools of eventology, and it is planned to continue the study of entropy properties and their use in applications of eventology to modelling humanitarian and socio-economic systems.
References
[1] O.Yu. Vorobyev, A multicovariance of events, Proc. of the VII All-Russian FAM Conf. on Financial and Actuarial Mathematics and Related Fields, 1(2008), 67-81 (in Russian).
[2] O.Yu. Vorobyev, A wide dependence of events and an approximation of eventological distributions by wide-multiplicative set-functions, (2009), 101-122 (in Russian).
[3] A.O. Vorobyev, Multicovariances and multipoint dependent distributions of random sets, Proc. of the I All-Russian FAM Conf. on Financial and Actuarial Mathematics and Related Fields, 1(2002), 21-24 (in Russian).
[4] O.Yu. Vorobyev, Eventology, Krasnoyarsk, SFU, 2007 (in Russian).
[5] N.A. Lukyanova, Entropy of the multiplicative-truncated approximation of eventological distributions, Proc. of the IX Int. FAM Conf. on Financial and Actuarial Mathematics and Eventoconverging Technologies, (2010), 198-201 (in Russian).
Entropy Properties of Multiplicative-Truncated Approximations of Eventological Distributions

Oleg Yu. Vorobyev
Natalya A. Lukyanova

Theorems on the entropy of an eventological distribution relative to its multiplicative-truncated approximation, which extend the toolkit of mathematical eventology, are formulated and proved. For a simple example of an arbitrary triplet of events, the entropies of eventological approximations of various powers are considered, as well as the relative entropies of full eventological distributions of a set of events with respect to their multiplicative-truncated approximations.

Keywords: event, set of events, probability, eventological distribution, multiplicative-truncated projection, multicovariance, multiplicative-truncated approximation, entropy, relative entropy.