A cooperative neural network approach for enhancing data traffic prediction

This paper addresses the problem of learning a regression model for the prediction of data traffic in a cellular network. We propose a cooperative learning strategy that involves two Jordan recurrent neural networks (JNNs) trained using the firefly algorithm (FFA) and the resilient backpropagation algorithm (Rprop), respectively. While the cooperative nature of the learning process ensures the effectiveness of the regression model, the recurrent structure of the neural networks allows the model to handle temporally evolving data. Experiments were carried out to evaluate the proposed approach using high-speed downlink packet access (HSDPA) data demand and throughput measurements collected from different cell sites of a universal mobile telecommunications system (UMTS)-based cellular operator. The proposed model produced significantly superior results compared to those obtained on the same problems with the traditional approach of separately training a JNN with FFA or Rprop.
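The defining feature of a Jordan network, as opposed to an Elman network, is that the *previous output* (not the hidden state) is fed back through context units into the hidden layer. The sketch below illustrates only this recurrence for a one-step-ahead regression setting; the cooperative FFA/Rprop training scheme is the paper's contribution and is not reproduced here, and all weight names and dimensions are illustrative assumptions.

```python
import numpy as np

def jordan_forward(X, W_in, W_ctx, W_out, b_h, b_out):
    """Forward pass of a Jordan recurrent network over a sequence.

    X     : (T, n_in) array of input vectors, one per time step
    W_in  : (n_h, n_in) input-to-hidden weights
    W_ctx : (n_h, n_out) context(previous output)-to-hidden weights
    W_out : (n_out, n_h) hidden-to-output weights
    Returns an (T, n_out) array of predictions.
    """
    context = np.zeros(W_out.shape[0])  # previous output, zero at t = 0
    outputs = []
    for x_t in X:
        # Hidden activation mixes the current input with the fed-back output
        h = np.tanh(W_in @ x_t + W_ctx @ context + b_h)
        y = W_out @ h + b_out           # linear output unit for regression
        outputs.append(y)
        context = y                     # Jordan feedback: output -> context
    return np.array(outputs)
```

In a trainable version, the weights above would be the parameters adjusted by FFA (global, population-based search) and Rprop (local, sign-based gradient updates), the two optimizers the paper combines.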
