2017 IEEE International Parallel and Distributed Processing...


Training Many Neural Networks in Parallel via Back-Propagation

Javier A. Cruz-López, Vincent Boyer
Graduate Program in Systems Engineering, Universidad Autónoma de Nuevo León, 66451, Monterrey, Mexico
[email protected], [email protected]

Didier El-Baz
LAAS-CNRS, Université de Toulouse, CNRS, Toulouse, France
[email protected]

Abstract—This paper presents two parallel implementations of the Back-propagation algorithm, a widely used approach for training Artificial Neural Networks (ANNs). These implementations increase the number of ANNs trained simultaneously by taking advantage of the massive thread-level parallelism of GPUs and the multi-core architecture of modern CPUs, respectively. Computational experiments are carried out with time series taken from the product demand of a Mexican brewery; the goal is to optimize the delivery of products. We also consider time series from the M3-Competition benchmark. The results show the benefits of training several ANNs in parallel compared to other forecasting methods used in the competition. Indeed, training several ANNs in parallel yields a better fit of the network weights and allows many ANNs for different time series to be trained in a short time.

Keywords—Product Demand Forecasting, Neural Networks, Back-Propagation, GPU, Multiprocessing.

I. INTRODUCTION

In recent years, artificial neural networks (ANNs) have proven to be a powerful tool for classification and pattern recognition. One of the main reasons is their ability to learn from experience and from general information. ANNs have been used in a wide variety of fields such as science, business, and industry. Back-propagation is an algorithm that has been widely used to train neural networks owing to its simplicity of implementation and its efficiency.
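To make the training procedure concrete, the following is a minimal illustrative sketch of back-propagation for a one-hidden-layer network in plain Python. The network size, learning rate, and the toy task of fitting sin(x) are our own assumptions for illustration, not the configuration used in the paper.

```python
import math
import random

random.seed(0)  # fixed seed so the illustration is reproducible

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class TinyANN:
    """A 1-input, H-hidden, 1-output network trained by back-propagation."""

    def __init__(self, hidden=4):
        self.w1 = [random.uniform(-1, 1) for _ in range(hidden)]  # input -> hidden
        self.b1 = [0.0] * hidden
        self.w2 = [random.uniform(-1, 1) for _ in range(hidden)]  # hidden -> output
        self.b2 = 0.0

    def forward(self, x):
        self.h = [sigmoid(w * x + b) for w, b in zip(self.w1, self.b1)]
        return sum(w * h for w, h in zip(self.w2, self.h)) + self.b2

    def train_step(self, x, target, lr=0.1):
        """One stochastic gradient step on the error E = 0.5 * (y - target)^2."""
        y = self.forward(x)
        err = y - target                            # dE/dy
        for i, h in enumerate(self.h):
            # error back-propagated through the sigmoid; h*(1-h) is its derivative
            dh = err * self.w2[i] * h * (1.0 - h)
            self.w2[i] -= lr * err * h              # output-layer update
            self.w1[i] -= lr * dh * x               # hidden-layer update
            self.b1[i] -= lr * dh
        self.b2 -= lr * err
        return 0.5 * err * err

# Toy task: fit y = sin(x) on a few sample points.
data = [(x / 10.0, math.sin(x / 10.0)) for x in range(30)]
net = TinyANN()
for epoch in range(500):
    loss = sum(net.train_step(x, t) for x, t in data)
```

After a few hundred epochs the summed squared error over the sample points decreases from its initial value, which is the behavior back-propagation is designed to produce.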
This paper deals with the implementation of the Back-propagation algorithm on a Graphics Processing Unit (GPU) and on a multi-core CPU such that several ANNs can be trained simultaneously. The objective is to increase the diversity of the search in order to obtain the best configuration for the problem under study. In particular, the expected benefits of training several ANNs in parallel, compared to other forecasting methods, are a better fit of the network weights and the quick training of many ANNs for different time series.

GPUs are powerful graphics engines but also highly parallel computing accelerators; this characteristic has led the research community to map various computationally complex and demanding problems onto the GPU. A parallel implementation via CUDA of the dynamic programming method for solving the knapsack problem on NVIDIA GPUs is presented in [3], showing a speedup over the sequential implementation for large instances. In [2], a survey of recent advances in GPU computing in Operations Research is presented, which shows that significant work has been done to parallelize meta-heuristics on such architectures. This effort
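The multi-core CPU approach described above can be sketched as follows: each worker process trains an independent ANN from a different random initialization, and the network with the lowest final error is kept. This is an illustrative sketch only; the use of `multiprocessing.Pool`, the tiny 4-hidden-unit network, and the sin(x) toy task are our assumptions, not the paper's actual implementation.

```python
import math
import random
from multiprocessing import Pool

def train_one(seed, epochs=300, lr=0.1):
    """Train one tiny MLP (4 sigmoid hidden units) on sin(x) samples,
    starting from the random initialization given by `seed`.
    Returns (final epoch loss, seed) so the best network can be identified."""
    rng = random.Random(seed)
    H = 4
    w1 = [rng.uniform(-1, 1) for _ in range(H)]  # input -> hidden weights
    b1 = [0.0] * H
    w2 = [rng.uniform(-1, 1) for _ in range(H)]  # hidden -> output weights
    b2 = 0.0
    data = [(x / 10.0, math.sin(x / 10.0)) for x in range(30)]
    loss = 0.0
    for _ in range(epochs):
        loss = 0.0
        for x, t in data:
            h = [1.0 / (1.0 + math.exp(-(w * x + b))) for w, b in zip(w1, b1)]
            y = sum(wo * ho for wo, ho in zip(w2, h)) + b2
            err = y - t
            loss += 0.5 * err * err
            for i in range(H):  # back-propagation updates
                dh = err * w2[i] * h[i] * (1.0 - h[i])
                w2[i] -= lr * err * h[i]
                w1[i] -= lr * dh * x
                b1[i] -= lr * dh
            b2 -= lr * err
    return loss, seed

if __name__ == "__main__":
    # Train 8 independent networks in parallel, one per random seed,
    # and keep the one with the lowest final loss.
    with Pool() as pool:
        results = pool.map(train_one, range(8))
    best_loss, best_seed = min(results)
```

Because the networks are independent, this search is embarrassingly parallel: diversity comes from the different initializations, and the cost of exploring many configurations is amortized across the available cores.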