РАЗДЕЛ 2 МОДЕЛИ, СИСТЕМЫ, СЕТИ В ТЕХНИКЕ
УДК 004.032.26
Finding the solution of partial differential equations with radial basis function networks
Alqezweeni Mohie Mortadha
Нахождение решения дифференциального уравнения в частных производных при помощи сетей радиальных базисных функций
Алкезуини Мухи Муртада
Abstract. Background. For most boundary value problems described by partial differential equations, no analytical methods of solution exist. The limitations of analytical methods for solving practical problems led to the development of numerical methods. Well-known numerical methods for solving differential equations, such as the Finite Difference Method (FDM), the Finite Volume Method (FVM) and the Finite Element Method (FEM), require the construction of a computational grid. Generating a mesh for practical problems is a difficult task, often more difficult than solving the resulting system of linear algebraic equations. Therefore, meshless (meshfree) methods using radial basis functions and radial basis function networks are currently gaining popularity. Materials and methods. The theory of radial basis function networks is used. Results. The structure of a radial basis function network is described. Gradient algorithms for training radial basis function networks in solving direct and inverse boundary value problems are analyzed. The need for random generation of trial points during network training is shown. Conclusions. The analysis shows that significant results have been achieved in solving partial differential equations with radial basis function networks. Directions for further research are the improvement of network training algorithms for various problems and the solution of problems with fuzzy input data.
Key words: partial differential equations, radial basis function neural networks, network training.
Аннотация. Актуальность и цели. Для большинства краевых задач, описываемых дифференциальными уравнениями в частных производных, отсутствуют аналитические методы решения. Ограничения аналитических методов для решения практических задач привели к развитию численных методов. Существуют различные численные методы для различных типов сложных задач, которые не имеют аналитического решения. Материалы и методы. Использована теория сетей радиальных базисных функций. Результаты. Описана структура сети радиальных базисных функций. Проведен анализ градиентных алгоритмов обучения сетей радиальных базисных функций при решении прямых и обратных краевых задач. Показана необходимость
случайной генерации пробных точек в процессе обучения сети. Выводы. Анализ показал, что достигнуты большие результаты в решении уравнений в частных производных на сетях радиальных базисных функций. Направления дальнейших исследований - совершенствование алгоритмов обучения сетей при решении различных задач, решение задач при неточных (нечетких) исходных данных.
Ключевые слова: дифференциальные уравнения в частных производных, сети радиальных базисных функций, обучение нейронных сетей.
Introduction
For most boundary value problems described by partial differential equations, no analytical methods of solution exist. The limitations of analytical methods for solving practical problems led to the development of numerical methods. Various numerical methods have been developed for different types of problems that have no analytical solution.
Well-known numerical methods for solving differential equations, such as the Finite Difference Method (FDM), the Finite Volume Method (FVM) and the Finite Element Method (FEM), require the construction of a computational grid. Generating a mesh for practical problems is a difficult task, often more difficult than solving the resulting system of linear algebraic equations. Therefore, meshless (meshfree) methods [1] using radial basis functions [2] and radial basis function networks [3] are currently gaining popularity.
1. Radial Basis Function Collocation Methods
To illustrate the process of solving boundary value problems with radial basis functions, consider a boundary value problem given in operator form:
$$Lu(x) = f(x), \quad x \in \Omega, \qquad (1)$$

$$Bu(x) = p(x), \quad x \in \partial\Omega, \qquad (2)$$
where $u$ is the desired solution; $L$ is a differential operator; the operator $B$ specifies the boundary conditions; $\Omega$ is the solution domain; $\partial\Omega$ is its boundary; $f$ and $p$ are known functions.
Consider a method first proposed by Kansa [4]. In the solution domain we choose a set of collocation nodes $\Theta = \{x_i \mid i = 1, \dots, N-M,\; x_i \in \Omega\} \cup \{x_i \mid i = N-M+1, \dots, N,\; x_i \in \partial\Omega\}$,
where $N$ is the total number of nodes and $M$ is the number of boundary nodes on $\partial\Omega$. The solution is sought in the following form:
$$u_{RBF}(x) = \sum_{j=1}^{N} w_j \varphi_j(x), \quad x \in \bar{\Omega} = \Omega \cup \partial\Omega, \qquad (3)$$

where $\varphi_j$ is a radial basis function (RBF) and $w_j$, $w_{N+1}$ are unknown coefficients ($w_j$ is the weight of the function $\varphi_j$), with $\sum_{j=1}^{N} w_j = 0$.
A radial basis function $\varphi(x)$ is a function that depends only on the distance between the point $x$ and a fixed point $c$ in space, often called the center [2, 3].
There are various radial basis functions. The most popular ones are listed below (in the formulas, $c$ is the position of the center of the RBF and $a$ is its width):
Gaussian function:
$$\varphi(\| x - c \|, a) = \exp\left( -\frac{\| x - c \|^2}{2a^2} \right);$$

multiquadric:
$$\varphi(\| x - c \|, a) = \sqrt{1 + \frac{\| x - c \|^2}{2a^2}};$$

inverse multiquadric:
$$\varphi(\| x - c \|, a) = \left( 1 + \frac{\| x - c \|^2}{2a^2} \right)^{-1/2}.$$
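To make these formulas concrete, here is a minimal NumPy sketch of the three radial basis functions; the function names and the example point are illustrative choices and are not part of the article.

```python
import numpy as np

def gaussian(x, c, a):
    """Gaussian RBF: exp(-||x - c||^2 / (2 a^2))."""
    r2 = np.sum((x - c) ** 2, axis=-1)
    return np.exp(-r2 / (2.0 * a ** 2))

def multiquadric(x, c, a):
    """Multiquadric RBF: sqrt(1 + ||x - c||^2 / (2 a^2))."""
    r2 = np.sum((x - c) ** 2, axis=-1)
    return np.sqrt(1.0 + r2 / (2.0 * a ** 2))

def inverse_multiquadric(x, c, a):
    """Inverse multiquadric RBF: (1 + ||x - c||^2 / (2 a^2))^(-1/2)."""
    r2 = np.sum((x - c) ** 2, axis=-1)
    return 1.0 / np.sqrt(1.0 + r2 / (2.0 * a ** 2))

# example: value of a Gaussian RBF centred at the origin with width a = 0.5
x = np.array([0.3, 0.4])
print(gaussian(x, np.zeros(2), 0.5))   # exp(-0.25 / 0.5) ≈ 0.6065
```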
The unknown coefficients in (3) are found as the solution of the system of linear algebraic equations obtained after substituting (3) into (1)-(2):
$$Aw = b,$$
where $A$ is a square matrix of size $N \times N$.
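As an illustration of this collocation scheme, here is a minimal sketch (not from the article) for the one-dimensional model problem $u''(x) = f(x)$ on $(0, 1)$ with $u(0) = u(1) = 0$ and exact solution $u(x) = \sin(\pi x)$, using Gaussian basis functions; the node count and width are arbitrary demonstration values.

```python
import numpy as np

# model problem: u''(x) = f(x) on (0, 1), u(0) = u(1) = 0, exact u(x) = sin(pi x)
f = lambda x: -np.pi ** 2 * np.sin(np.pi * x)

def phi(x, c, a):          # Gaussian RBF
    return np.exp(-(x - c) ** 2 / (2.0 * a ** 2))

def phi_xx(x, c, a):       # its second derivative with respect to x
    return ((x - c) ** 2 / a ** 4 - 1.0 / a ** 2) * phi(x, c, a)

N, a = 21, 0.1                          # number of nodes/centres and RBF width
nodes = np.linspace(0.0, 1.0, N)        # collocation nodes, also used as centres
interior, boundary = nodes[1:-1], nodes[[0, -1]]

# assemble the square N x N system A w = b:
# rows for interior nodes enforce the equation, rows for boundary nodes the BCs
A = np.vstack([phi_xx(interior[:, None], nodes, a),
               phi(boundary[:, None], nodes, a)])
b = np.concatenate([f(interior), np.zeros(2)])
w = np.linalg.solve(A, b)

# evaluate u_RBF(x) = sum_j w_j phi_j(x) and compare with the exact solution
x_test = np.linspace(0.0, 1.0, 201)
u_rbf = phi(x_test[:, None], nodes, a) @ w
print("max |u_RBF - u_exact| =", np.abs(u_rbf - np.sin(np.pi * x_test)).max())
```

With wider basis functions the matrix quickly becomes ill-conditioned, which illustrates the parameter-choice problem discussed next.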
Methods based on radial basis functions do not require the construction of a grid. The use of radial basis functions eliminates the principal defect of the collocation method: the need to use basis functions whose order of smoothness is at least the order of the partial differential equation [5]. The principal shortcomings of the method are the informal choice of the parameters, the number, and the placement of the basis functions.
2. Solving Partial Differential Equations Using Radial Basis Function Networks
The method based on radial basis function neural networks [3] is free from these shortcomings. A radial basis function network [6] contains one hidden layer, each node of which implements a radial basis function (fig. 1).
Fig. 1. The network of radial basis functions (an input layer and a hidden layer of m radial basis function neurons)
The output of a radial basis function neural network is described by an expression of the form (3), so such networks can be used to solve partial differential equations. When radial basis function networks are used, not only the weights but also the parameters of the radial basis functions can be tuned, i.e. the method can be considered a projection method with an adjustable basis. This solves the problem of choosing the parameters of the basis functions.
The problem is solved in the course of training the radial basis function network: the weights and the parameters of the neurons are adjusted so that the error functional, a sum of squared residuals at trial points inside the domain and on its boundary, takes its minimum value:

$$I(w, c, a) = \sum_{i=1}^{M} \left[ L u_{RBF}(x_i; w, c, a) - f(x_i) \right]^2 + \lambda \sum_{i=M+1}^{M+K} \left[ B u_{RBF}(x_i; w, c, a) - p(x_i) \right]^2 \to \min, \qquad (4)$$

where $w = (w_1, w_2, \dots, w_m)$, $c = (c_1, c_2, \dots, c_m)$, $a = (a_1, a_2, \dots, a_m)$; $M$ is the number of trial (control) points in $\Omega$, $K$ is the number of trial points on $\partial\Omega$; $\lambda$ is the penalty factor.
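A minimal sketch of how the functional (4) can be evaluated for the same one-dimensional model problem $u''(x) = f(x)$, $u(0) = u(1) = 0$, assuming Gaussian basis functions; the penalty factor, numbers of trial points and network size are illustrative assumptions.

```python
import numpy as np

f = lambda x: -np.pi ** 2 * np.sin(np.pi * x)          # right-hand side of u'' = f

def phi(x, c, a):                                       # Gaussian RBF
    return np.exp(-(x - c) ** 2 / (2.0 * a ** 2))

def phi_xx(x, c, a):                                    # d^2 phi / dx^2
    return ((x - c) ** 2 / a ** 4 - 1.0 / a ** 2) * phi(x, c, a)

def functional(w, c, a, x_in, x_b, p_b, lam):
    """Error functional (4): squared PDE residuals at interior trial points
    plus a penalized sum of squared boundary residuals."""
    r_in = phi_xx(x_in[:, None], c, a) @ w - f(x_in)    # L u_RBF - f at M points
    r_b = phi(x_b[:, None], c, a) @ w - p_b             # B u_RBF - p at K points
    return np.sum(r_in ** 2) + lam * np.sum(r_b ** 2)

# a network of m = 10 neurons and a few trial points, just to evaluate I(w, c, a)
m = 10
w = np.zeros(m)
c = np.linspace(0.05, 0.95, m)
a = np.full(m, 0.1)
x_in = np.linspace(0.1, 0.9, 15)                        # M = 15 interior trial points
x_b, p_b = np.array([0.0, 1.0]), np.zeros(2)            # K = 2 boundary trial points
print("I(w, c, a) =", functional(w, c, a, x_in, x_b, p_b, lam=10.0))
```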
The methods of training radial basis function networks for solving partial differential equations can be divided into two groups: methods that train only the weights of the radial basis functions [7], with the parameters of the basis functions fixed in advance, and methods in which all parameters of the network are trained. In the first case, if the operators $L$ in (1) and $B$ in (2) are linear, training reduces to solving a system of linear algebraic equations with a rectangular matrix $A$ of size $(M + K) \times m$, assembled in the same way as in the collocation method. This system is solved using the singular value decomposition [8].
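For the first group of methods, a minimal sketch under the same assumptions as above: the centers and widths are fixed, the rectangular system is assembled from the equation and boundary rows, and the weights are obtained by a least squares solve (np.linalg.lstsq, which is SVD-based).

```python
import numpy as np

f = lambda x: -np.pi ** 2 * np.sin(np.pi * x)           # u'' = f, u(0) = u(1) = 0

def phi(x, c, a):                                        # Gaussian RBF
    return np.exp(-(x - c) ** 2 / (2.0 * a ** 2))

def phi_xx(x, c, a):
    return ((x - c) ** 2 / a ** 4 - 1.0 / a ** 2) * phi(x, c, a)

m = 10                                        # neurons: fixed centres and widths
c = np.linspace(0.0, 1.0, m)
a = np.full(m, 0.15)

M, K = 40, 2                                  # trial points: M interior, K boundary
x_in = np.linspace(0.0, 1.0, M + 2)[1:-1]
x_b = np.array([0.0, 1.0])

# rectangular (M + K) x m matrix: PDE rows followed by boundary rows
A = np.vstack([phi_xx(x_in[:, None], c, a),
               phi(x_b[:, None], c, a)])
b = np.concatenate([f(x_in), np.zeros(K)])

# least squares solution of the overdetermined system via SVD
w, *_ = np.linalg.lstsq(A, b, rcond=None)

x_test = np.linspace(0.0, 1.0, 201)
u_rbf = phi(x_test[:, None], c, a) @ w
print("max |u_RBF - u_exact| =", np.abs(u_rbf - np.sin(np.pi * x_test)).max())
```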
In the second approach, a nonlinear least squares problem has to be solved, because the parameters of the basis functions enter the error functional nonlinearly. To solve it, two-stage iterative methods [9] based on the gradient descent algorithm are typically used: the weights and the centers and widths of the basis functions are tuned alternately. Each iteration of the learning algorithm consists of two steps.
Step 1. Fixing the centers and widths, find the weights that minimize the error functional (4):

$$w_k^{(n)} = w_k^{(n-1)} - \eta^{(n-1)} \frac{\partial I\left(c^{(n-1)}, a^{(n-1)}, w^{(n-1)}\right)}{\partial w_k^{(n-1)}},$$

where $n$ is the number of the training cycle and $\eta^{(n-1)}$ is the learning-rate (training speed) factor.

Step 2. Fixing $w^{(n)}$, find the centers and widths that minimize the error functional:

$$c_k^{(n)} = c_k^{(n-1)} - \rho^{(n-1)} \frac{\partial I\left(c^{(n-1)}, a^{(n-1)}, w^{(n)}\right)}{\partial c_k^{(n-1)}},$$

$$a_k^{(n)} = a_k^{(n-1)} - \sigma^{(n-1)} \frac{\partial I\left(c^{(n)}, a^{(n-1)}, w^{(n)}\right)}{\partial a_k^{(n-1)}},$$

where $\rho^{(n-1)}$ and $\sigma^{(n-1)}$ are the corresponding learning-rate factors.
The directions of change of the network parameters are chosen opposite to the gradient of the functional $I$. The components of the gradient of the functional with respect to the network parameters are easy to calculate.
During training it is very important to maintain the relation between the optimal number of neurons and the number of trial points [6]: $m \propto M + K$, where $\propto$ is the proportionality sign. In [10, 11], random generation of trial points at each iteration of network training is proposed, which makes it possible to use a small number of trial points; various algorithms for training radial basis function networks are also considered in those works. A combined sketch is given below.
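The following rough sketch (an illustration only, not the algorithms of the cited works) combines the two-stage gradient updates described above with random regeneration of trial points at each training cycle, for the same model problem $u'' = f$, $u(0) = u(1) = 0$. The step sizes $\eta$, $\rho$, $\sigma$ are fixed, deliberately small and untuned, and the gradients with respect to the centers and widths are taken by finite differences for brevity; practical implementations use analytic gradients and the faster training methods discussed next.

```python
import numpy as np

f = lambda x: -np.pi ** 2 * np.sin(np.pi * x)            # u'' = f, u(0) = u(1) = 0

def phi(x, c, a):                                         # Gaussian RBF
    return np.exp(-(x - c) ** 2 / (2.0 * a ** 2))

def phi_xx(x, c, a):
    return ((x - c) ** 2 / a ** 4 - 1.0 / a ** 2) * phi(x, c, a)

def functional(w, c, a, x_in, x_b, lam=10.0):             # error functional (4)
    r_in = phi_xx(x_in[:, None], c, a) @ w - f(x_in)
    r_b = phi(x_b[:, None], c, a) @ w
    return np.sum(r_in ** 2) + lam * np.sum(r_b ** 2), r_in, r_b

def num_grad(fun, p, eps=1e-6):                           # finite-difference gradient
    g = np.zeros_like(p)
    for i in range(p.size):
        d = np.zeros_like(p); d[i] = eps
        g[i] = (fun(p + d) - fun(p - d)) / (2.0 * eps)
    return g

m = 10
rng = np.random.default_rng(0)
w = np.zeros(m)
c = np.linspace(0.05, 0.95, m)
a = np.full(m, 0.15)
x_b = np.array([0.0, 1.0])
lam, eta, rho, sigma = 10.0, 1e-5, 1e-7, 1e-7             # penalty and step sizes

x_val = np.linspace(0.0, 1.0, 50)                         # fixed points for monitoring
print("I before:", functional(w, c, a, x_val, x_b, lam)[0])

for cycle in range(2000):
    x_in = rng.uniform(0.0, 1.0, 15)                      # fresh random trial points
    # step 1: gradient step for the weights (analytic gradient of (4))
    _, r_in, r_b = functional(w, c, a, x_in, x_b, lam)
    grad_w = 2.0 * phi_xx(x_in[:, None], c, a).T @ r_in \
           + 2.0 * lam * phi(x_b[:, None], c, a).T @ r_b
    w = w - eta * grad_w
    # step 2: gradient steps for the centres, then the widths (finite differences)
    c = c - rho * num_grad(lambda c_: functional(w, c_, a, x_in, x_b, lam)[0], c)
    a = a - sigma * num_grad(lambda a_: functional(w, c, a_, x_in, x_b, lam)[0], a)

print("I after: ", functional(w, c, a, x_val, x_b, lam)[0])
```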
In [12], a modification of the conjugate gradient algorithm for tuning the network weights in the solution of partial differential equations is proposed. In [13], a one-step algorithm for training radial basis function networks for solving partial differential equations was developed. The algorithm uses the authors' modification of the trust region method and is the fastest among the known methods for training networks of this type.
In [14], a method for solving coefficient inverse problems based on parametric optimization of the unknown coefficient is proposed. The essence of the method is illustrated by a coefficient inverse problem given in operator form:
$$L(k(x, u))\, u(x) = f(x), \quad x \in \Omega, \qquad B u(x) = p(x), \quad x \in \partial\Omega,$$

where $k$ is the unknown coefficient, possibly depending on $u$, subject to additional conditions $D u(z) = \varphi(z)$, $z \in Z$; $D$ is the operator specifying the additional conditions; $Z \subset \Omega \cup \partial\Omega$; $\varphi$ is a known function given with an error $\delta$.
According to the method of parametric optimization, the unknown coefficient $k$ is represented in parametric form, and the parameters of this representation have to be determined. As such a representation, [14] proposes to use a radial basis function network
$$k_{RBF}(x, u) = \sum_{m=1}^{m_k} w_m \varphi_m(x, u; c_m, a_m).$$
The solution of the direct problem is also approximated by a radial basis function network. In the course of training, the parameters of both networks are adjusted.
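A minimal sketch of this idea for a simple model problem of the form $k(x)\,u''(x) = f(x)$: the unknown coefficient $k$ and the solution $u$ are each represented by a small Gaussian RBF network, and a single functional combines the equation residual, the boundary conditions and the mismatch with the measured data. The model problem, measurement points, noise level and penalty weights are all illustrative assumptions, not taken from [14].

```python
import numpy as np

def phi(x, c, a):                                   # Gaussian RBF
    return np.exp(-(x - c) ** 2 / (2.0 * a ** 2))

def phi_xx(x, c, a):
    return ((x - c) ** 2 / a ** 4 - 1.0 / a ** 2) * phi(x, c, a)

# model problem: k(x) * u''(x) = f(x) on (0, 1), u(0) = u(1) = 0,
# with noisy measurements of u at points z (the "additional conditions")
f = lambda x: -(1.0 + x) * np.pi ** 2 * np.sin(np.pi * x)   # corresponds to k(x) = 1 + x
z = np.array([0.2, 0.5, 0.8])
u_meas = np.sin(np.pi * z) + 1e-3 * np.array([1.0, -1.0, 1.0])  # data with error delta

# two RBF networks: one for the solution u, one for the unknown coefficient k
c_u, a_u = np.linspace(0.0, 1.0, 10), 0.15
c_k, a_k = np.linspace(0.0, 1.0, 5), 0.3

def u_rbf(x, w_u):    return phi(x[:, None], c_u, a_u) @ w_u
def u_rbf_xx(x, w_u): return phi_xx(x[:, None], c_u, a_u) @ w_u
def k_rbf(x, w_k):    return phi(x[:, None], c_k, a_k) @ w_k

def functional(w_u, w_k, x_in, lam=10.0, mu=10.0):
    """Equation residual + boundary penalty + data-mismatch penalty."""
    x_b = np.array([0.0, 1.0])
    r_eq = k_rbf(x_in, w_k) * u_rbf_xx(x_in, w_u) - f(x_in)
    r_b = u_rbf(x_b, w_u)
    r_d = u_rbf(z, w_u) - u_meas
    return np.sum(r_eq ** 2) + lam * np.sum(r_b ** 2) + mu * np.sum(r_d ** 2)

# the parameters of both networks are then adjusted together, e.g. by the same
# two-stage gradient scheme sketched above; here the functional is evaluated once
print(functional(np.zeros(10), np.ones(5), np.linspace(0.05, 0.95, 20)))
```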
Conclusion
The analysis shows that significant results have been achieved in solving partial differential equations with radial basis function networks. Directions for further research include the improvement of network training algorithms for various classes of problems and the solution of problems with imprecise (fuzzy) input data.
The study was funded by RFBR according to the research projects 14-01-00660 «Methods of constructing neural network and hybrid mathematical models of processes and phenomena in complex technical systems» and 14-01-00733 «Information model based on hierarchical heterogeneous neural networks in the study of the effect of transport infrastructure on environment».
References
1. Liu, G. R. Mesh free methods: moving beyond the finite element method / G. R. Liu. -Florida : CRC Press, 2003. - 712 p.
2. Chen, W. Recent Advances in Radial Basis Function Collocation Methods / W. Chen, Z.-J. Fu, C. S. Chen. - Springer, 2014. - 90 p.
3. Yadav, N. An Introduction to Neural Network Methods for Differential Equations / N. Yadav, A. Yadav, M. Kumar. - Springer, 2015. - 114 p.
4. Kansa, E. J. Multiquadrics - a scattered data approximation scheme with applications to computational fluid dynamics-II. Solutions to hyperbolic, parabolic, and elliptic partial differential equations / E. J. Kansa // Computers and Mathematics with Applications. - 1990. - Vol. 19, № 8-9. - P. 147-161.
5. Mitchell, A. R. The Finite Element Method in Partial Differential Equations / A. R. Mitchell, R. Wait. - London : John Wiley & Sons, Ltd, 1977. - 208 p.
6. Haykin, S. O. Neural Networks and Learning Machines / S. O. Haykin. - Prentice Hall, 2008. - 936 p.
7. Mai-Duy, N. Numerical solution of differential equations using multiquadric radial basis function networks / N. Mai-Duy, T. Tran-Cong // Neural Networks. - 2001. -№ 14. - P. 185-199.
8. Watkins, D. Fundamentals of Matrix Computations / D. Watkins. - Wiley, John & Sons, Incorporated, 2010. - 664 p.
9. Jianyu, L. Numerical solution of elliptic partial differential equation using radial basis function neural networks / L. Jianyu, L. Siwei, Q. Yingjian, H. Yaping // Neural Networks. - 2003. - № 16 (5/6). - P. 729-734.
10. Васильев, А. Н. Нейросетевое моделирование. Принципы, алгоритмы, приложения / А. Н. Васильев, Д. А. Тархов. - СПб. : Изд-во Политехи. ун-та, 2009. - 528 c.
11. Тархов, Д. А. Нейросетевые модели и алгоритмы : справочник / Д. А. Тархов. -М. : Радиотехника, 2014. - 352 с.
12. Артюхина, Е. В. Бессеточные методы и их реализация на радиально-базисных нейронных сетях / Е. В. Артюхина, В. И. Горбаченко // Нейрокомпьютеры: разработка, применение. - 2010. - № 11. - С. 4-10.
13. Горбаченко, В. И. Подходы и методы обучения сетей радиальных базисных функций для решения задач математической физики / В. И. Горбаченко, М. В. Жуков // Нейрокомпьютеры: разработка, применение. - 2013. - № 9. - С. 12-18.
14. Жуков, М. В. Решение коэффициентных обратных задач математической физики с помощью сетей радиальных базисных функций / М. В. Жуков // Нейрокомпьютеры: разработка и применение. - 2014. - № 2. - С. 32-39.
Алкезуини Мухи Муртада Мухи аспирант,
кафедра компьютерных технологий, Пензенский государственный университет E-mail: [email protected]
Alqezweeny Mohie Mortadha Mohie postgraduate student, sub-department of computer technologies, Penza State University
УДК 004.032.26 Alqezweeni, M. M.
Finding the solution of partial differential equations with radial basis function networks / M. M. Alqezweeni // Модели, системы, сети в экономике, технике, природе и обществе. - 2015. - № 3 (15). - C. 91-96.