R. Muhamedyev. Machine learning methods: An overview // Computer Modeling and New Technologies. Scientific and research journal. – Riga, 2015. – 19(6). – P. 14-29.
Abstract
This review covers the broad field of machine learning (ML) as it relates to weak artificial intelligence. It includes a diagram of machine learning problem settings, a formal statement of the ML problem, and several frequently used algorithms (regression, artificial neural networks, k-NN, SVM, LDAC, DLDA). It describes classification accuracy indicators, the use of learning curves for assessing ML methods, and data pre-processing methods, including the elimination of bad values and normalization. It also addresses the application of ML systems to big data management and approaches to solving the resulting problems by means of parallel computation and coarsening of gradient descent (stochastic approximation).
Key words: machine learning, Big Data, regression, MapReduce, artificial neural networks, k-NN, SVM, LDAC, DLDA, accuracy, precision, recall, F1 score.
[http://geoml.info/?p=336]
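The abstract above names classification quality indicators (accuracy, precision, recall, F1 score). As a brief illustration only, and not material from the reviewed paper, the following Python sketch shows one common way to compute these indicators for a binary classifier; the function name binary_classification_metrics and the label vectors are invented for this example.

# Illustrative sketch (not from the reviewed paper): computing the
# classification quality indicators mentioned in the abstract for a
# binary classification task. The label vectors below are invented.

def binary_classification_metrics(y_true, y_pred, positive=1):
    """Return accuracy, precision, recall and F1 score for binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p != positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)

    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return accuracy, precision, recall, f1


if __name__ == "__main__":
    # Hypothetical true and predicted labels for ten objects.
    y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
    y_pred = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]
    acc, prec, rec, f1 = binary_classification_metrics(y_true, y_pred)
    print(f"accuracy={acc:.2f} precision={prec:.2f} recall={rec:.2f} F1={f1:.2f}")

For multi-class problems such as the lithology recognition tasks cited in the references, these per-class values are typically averaged over classes (macro- or micro-averaging).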
References
Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE). DARPA. http://www.darpa.mil/Our_Work/DSO/Programs/Systems_of_Neuromorphic_Adaptive_Plastic_Scalable_Electronics_(SYNAPSE).aspx accessed 10 Aug 2014.
Sandra Blakeslee, Jeff Hawkins. On Intelligence (Russian edition). – Moscow, St. Petersburg, Kiev, 2007. – 128 p.
Weiß G. Multiagent Systems: A Modern Approach to Distributed Artificial Intelligence. – Cambridge: MIT Press, 1999. – 648 p.
Stuart Russell and Peter Norvig. Artificial Intelligence: A Modern Approach. – Upper Saddle River, New Jersey: Prentice Hall, 2010. – 1078 p.
Gorodetsky V.I. Self-organization and multi-agent systems // Izvestiya RAN. Teoriya i Sistemy Upravleniya. – 2012. – No. 2. – P. 92-120.
Gorodetsky V.I. Self-organization and multi-agent systems. Applications and development technologies // Izvestiya RAN. Teoriya i Sistemy Upravleniya. – 2012. – No. 3. – P. 55-75.
Tim Jones. Artificial Intelligence: A Systems Approach. – Hingham, Massachusetts, New Delhi: INFINITY SCIENCE PRESS LLC, 2008. – 500 p.
Guoqiang Peter Zhang. Neural networks for classification: a survey // IEEE Transactions on Systems, Man, and Cybernetics – Part C: Applications and Reviews. – 2000. – V. 30, No. 4.
David Kriesel. A Brief Introduction to Neural Networks. http://www.dkriesel.com/en/science/neural_networks accessed 2 Jun 2015.
Van der Baan M., Jutten C. Neural networks in geophysical applications // Geophysics. – 2000. – V. 65(4). – P. 1032-1047.
Baldwin J.L., Bateman R.M., Wheatley C.L. Application of a neural network to the problem of mineral identification from well logs // The Log Analyst. – 1990. – P. 279-293.
Benaouda B., Wadge G., Whitmark R.B., Rothwell R.G., MacLeod C. Inferring the lithology of borehole rocks by applying neural network classifiers to downhole logs – an example from the Ocean Drilling Program // Geophysical Journal International. – 1999. – V. 136. – P. 477-491.
Saggaf M.M., Nebrija Ed.L. Estimation of missing logs by regularized neural networks // AAPG Bulletin. – 2003. – V. 87, No. 8. – P. 1377-1389.
Tenenev V.A., Yakimovich B.A., Senilov M.A., Paklin N.B. Intelligent systems for the interpretation of geophysical well logging data // Shtuchnyi Intelekt (Artificial Intelligence). – 2002. – No. – P. 338.
Aleshin S.P., Lyakhov A.L. Neural network assessment of a region's mineral resource base from geophysical monitoring data // Novi Tekhnologii. – 2011. – No. 1 (31). – P. 39-43.
Karpenko A.N., Bulmasov O.V. Application of neural network technologies in the interpretation of well logging data. http://oil-gas.platinov-s.com/index.php?name=articles&op=view&id=11&pag=3&num=1 accessed 5 Oct 2015.
Rogers S.J., Chen H.C., Kopaska-Merkel D.C., Fang J.H. Predicting permeability from porosity using artificial neural networks // AAPG Bulletin. – 1995. – V. 79. – P. 1786-1797.
Kostikov D.V. Tools for the interpretation of geophysical well logs based on transformed logging curves using a multilayer neural network. Candidate of Technical Sciences dissertation. – Moscow: RGB, 2007. – 189 p.
R. Muhamedyev, E. Amirgaliev, S. Iskakov, Y. Kuchin, E. Muhamedyeva. Integration of Results of Recognition Algorithms at the Uranium Deposits // Journal of Advanced Computational Intelligence and Intelligent Informatics (JACIII). – 2014. – Vol. 18, No. 3. – P. 347-352.
Amirgaliev E.N., Iskakov S.Kh., Kuchin Ya.V., Mukhamediev R.I. Integration of lithological type recognition algorithms // Problemy Informatiki. Siberian Branch of RAS. – 2013. – No. 4 (21). – P. 11-20.
Amirgaliev E.N., Iskakov S.Kh., Kuchin Ya.V., Mukhamediev R.I. Machine learning methods for rock recognition problems at uranium deposits // Izvestiya NAN RK. – 2013. – No. 3. – P. 82-88.
Joseph A. Cruz, David S. Wishart. Applications of machine learning in cancer prediction and prognosis // Cancer Informatics. – 2006. – V. 2. – P. 59-77.
Shoeb, Ali H., and John V. Guttag. Application of machine learning to epileptic seizure detection // Proceedings of the 27th International Conference on Machine Learning (ICML). – 2010. – P. 975-982.
Mannini, Andrea, and Angelo Maria Sabatini. Machine learning methods for classifying human physical activity from on-body accelerometers // Sensors. – 2010. – V. 10(2). – P. 1154-1175.
Ballester, Pedro J., and John B.O. Mitchell. A machine learning approach to predicting protein-ligand binding affinity with applications to molecular docking // Bioinformatics. – 2010. – V. 26(9). – P. 1169-1175.
Farrar, Charles R., and Keith Worden. Structural Health Monitoring: A Machine Learning Perspective. – John Wiley & Sons, 2012.
Friedrich Recknagel. Applications of machine learning to ecological modelling // Ecological Modelling. – 2001. – V. 146. – P. 303-310.
Clancy, Charles, Joe Hecker, Erich Stuntebeck, and Tim O'Shea. Applications of machine learning to cognitive radio networks // IEEE Wireless Communications. – 2007. – V. 14, No. 4. – P. 47-52.
Ball, Nicholas M., and Robert J. Brunner. Data mining and machine learning in astronomy // International Journal of Modern Physics D. – 2010. – V. 19, No. 7. – P. 1049-1106.
Csaba Szepesvári. Algorithms for Reinforcement Learning. Synthesis Lectures on Artificial Intelligence and Machine Learning. – Morgan & Claypool Publishers, 2009. – 98 p. http://www.ualberta.ca/~szepesva/papers/RLAlgsInMDPs.pdf
Xiaojin Zhu. Semi-Supervised Learning Literature Survey. Computer Sciences TR 1530, University of Wisconsin-Madison. Last modified on July 19, 2008. http://pages.cs.wisc.edu/~jerryzhu/pub/ssl_survey.pdf accessed 6 Nov 2014.
Kohonen, Teuvo. Self-organized formation of topologically correct feature maps // Biological Cybernetics. – 1982. – V. 43(1). – P. 59-69. doi:10.1007/bf00337288
Jain A.K., Murty M.N., Flynn P.J. Data clustering: a review // ACM Computing Surveys. – 1999. – Vol. 31, No. 3. – P. 264-323.
Wesam Ashour Barbakh, Ying Wu, Colin Fyfe. Review of clustering algorithms // Non-Standard Parameter Adaptation for Exploratory Data Analysis. Studies in Computational Intelligence. – 2009. – Vol. 249. – P. 7-28. doi:10.1007/978-3-642-04005-4_2 accessed 16 Jan 2014.
Taiwo Oladipupo Ayodele. Types of machine learning algorithms // New Advances in Machine Learning / Yagang Zhang (Ed.). – 2010. – P. 19-48. ISBN 978-953-307-034-6.
Hamza Awad Hamza Ibrahim et al. Taxonomy of machine learning algorithms to classify real-time interactive applications // International Journal of Computer Networks and Wireless Communications (IJCNWC). – 2012. – Vol. 2, No. 1. – P. 69-73. ISSN 2250-3501.
Dyakonov A.G. Data analysis, learning from precedents, logical games, WEKA, RapidMiner and MatLab systems (computer practicum of the Department of Mathematical Methods of Forecasting): Textbook. – Moscow: Publishing Department of the Faculty of Computational Mathematics and Cybernetics, Lomonosov Moscow State University, 2010. – 277 p.
Martin Fodslette Møller. A scaled conjugate gradient algorithm for fast supervised learning // Neural Networks. – 1993. – Vol. 6, Issue 4. – P. 525-533.
Dong C. Liu, Jorge Nocedal. On the limited memory BFGS method for large scale optimization // Mathematical Programming. – 1989. – Vol. 45, Issue 1-3. – P. 503-528.
Warren S. McCulloch, Walter Pitts. A logical calculus of the ideas immanent in nervous activity // The bulletin of mathematical biophysics, December 1943, Volume 5, Issue 4, pp 115-133.
Rosenblatt, F. The perceptron: A probabilistic model for information storage and organization in the brain // Psychological Review. – 1958. – Vol. 65(6). – P. 386-408. http://dx.doi.org/10.1037/h0042519 accessed 2 May 2015.
Marvin Minsky, Seymour Papert. Perceptrons, expanded edition. The MIT Press, 1987. – 308 p.
Minsky M., Papert S. Perceptrons. – Moscow: Mir, 1971. – 263 p.
Werbos P.J. Beyond Regression: New Tools for Prediction and Analysis in the Behavioral Sciences. PhD thesis, Harvard University, 1974.
Werbos P.J. Backpropagation: past and future // IEEE International Conference on Neural Networks, 24-27 Jul 1988, San Diego, CA, USA. – Vol. 1. – P. 343-353. doi:10.1109/ICNN.1988.23866
David Saad. Introduction // On-Line Learning in Neural Networks / Edited by David Saad. – Cambridge University Press, 1998 (digital version 2009). – P. 3-8. ISBN 978-0-521-65263-6.
G. Cybenko. Approximation by superpositions of a sigmoidal function // Mathematics of Control, Signals and Systems. – 1989. – V. 2(4). – P. 303-314.
Hornik K. et al. Multilayer feedforward networks are universal approximators // Neural Networks, 1989, 2, 359-366.
Galushkin A.I. Solving problems in the neural network logical basis // Neurocomputers: Development, Application. – Moscow: Radiotekhnika, 2006. – No. 2. – P. 49-71. ISSN 1999-8554.
Galushkin A.I. Neural Networks: Fundamentals of Theory. – Moscow: Goryachaya Liniya – Telekom, 2010.
Yasnitsky L.N. Introduction to Artificial Intelligence: Textbook for universities. – Moscow: Akademiya, 2008. – 176 p.
Neurocomputers: Textbook for universities. – Moscow: Publishing House of Bauman Moscow State Technical University, 2004. – 320 p.
Dudani, Sahibsingh A. The distance-weighted k-nearest-neighbor rule // IEEE Transactions on Systems, Man and Cybernetics. – 1976. – Vol. SMC-6, No. 4. – P. 325-327.
K-nearest neighbor algorithm. http://en.wikipedia.org/wiki/K-nearest_neighbor_algorithm accessed 5 Jun 2012.
Support vector machine. http://en.wikipedia.org/wiki/Support_vector_machine accessed 17 Mar 2013.
Linear discriminant analysis. http://en.wikipedia.org/wiki/Linear_discriminant_analysis accessed Nov 2012.
Sandrine Dudoit, Jane Fridlyand, Terence P. Speed. Comparison of discrimination methods for the classification of tumors using gene expression data // Journal of the American Statistical Association. – 2002. – Vol. 97, No. 457. – P. 77-87.
Mirkes E.M. Neurocomputer. Draft standard. – Novosibirsk: Nauka, Siberian Publishing Firm of RAS, 1998. http://pca.narod.ru/MirkesNeurocomputer.htm accessed 19 Jan
Léon Bottou. Large-Scale Machine Learning with Stochastic Gradient Descent // Proceedings of COMPSTAT. — 2010. — pp. 177-186
Léon Bottou. Online learning and stochastic approximation // On-Line Learning in Neural Networks. – Cambridge University Press, 1998 (digital version 2009). – P. 9-43. ISBN 978-0-521-65263-6.
Cheng-Tao Chu et al. Map-Reduce for machine learning on multicore // Advances in Neural Information Processing Systems: Proceedings of the 2006 Conference / John Platt, Thomas Hofmann (Eds.). – MIT Press, 2007. – P. 281-310.
Jure Leskovec, Anand Rajaraman, Jeffrey David Ullman. Mining of Massive Datasets. – Cambridge University Press, 2014. – 476 p. ISBN 9781107077232.
Ghoting, Amol, Rajasekar Krishnamurthy, Edwin Pednault, Berthold Reinwald, Vikas Sindhwani, Shirish Tatikonda, Yuanyuan Tian, and Shivakumar Vaithyanathan. SystemML: Declarative machine learning on MapReduce // Data Engineering (ICDE), 2011 IEEE 27th International Conference on. – IEEE, 2011. – P. 231-242.
Kraska, Tim, Ameet Talwalkar, John C. Duchi, Rean Griffith, Michael J. Franklin, and Michael I. Jordan. MLbase: A distributed machine-learning system // Conference on Innovative Data Systems Research (CIDR). – 2013. – P. 7-9.