{"id":5061,"date":"2018-09-11T22:51:31","date_gmt":"2018-09-11T19:51:31","guid":{"rendered":"https:\/\/hpc-education.unn.ru\/?page_id=5061"},"modified":"2018-10-18T13:10:37","modified_gmt":"2018-10-18T10:10:37","slug":"introduction-to-deep-learning-using-the-intel-neon-framework","status":"publish","type":"page","link":"https:\/\/hpc-education.unn.ru\/en\/trainings\/collection-of-courses\/introduction-to-deep-learning-using-the-intel-neon-framework","title":{"rendered":"Introduction to deep learning using the Intel\u00ae neon\u2122 Framework"},"content":{"rendered":"<h2>Description<\/h2>\n<p>The course examines the construction and the performance analysis of deep neural networks using the Intel\u00ae neon\u2122 Framework.<\/p>\n<p>The following topics are covered:<\/p>\n<ol>\n<li>Introduction to deep learning.<\/li>\n<li>Multilayered fully-connected neural networks.<\/li>\n<li>Introduction to the Intel\u00ae neon\u2122 Framework.<\/li>\n<li>Convolutional neural networks. Deep residual networks.<\/li>\n<li>Transfer learning of deep neural networks.<\/li>\n<li>Unsupervised learning: autoencoders, deconvolutional networks.<\/li>\n<li>Recurrent neural networks.<\/li>\n<li>Introduction to the Intel\u00ae nGraph\u2122.<\/li>\n<\/ol>\n<p>The course is practice oriented. There are 8 lectures (1.5 hours each) and 5 individual consultations in groups of 2-3 people (for each group). Lectures are held in plain lecture or master class (tutorial) form. The presentation of the theoretical material in most lectures\/master classes is supported by examples of developing a deep neural network architecture using the Intel\u00ae neon\u2122 Framework. The problem for which deep models are constructed is comprehensive and covers the entire lecture part, with the exception of an introductory lecture of a survey nature. The practice of the course is structured as follows: students are divided into groups of 2-3 people. 
Each group chooses a separate problem and tries to achieve the best possible quality by constructing different types of deep architectures and modifying their internal structure. Students follow the provided tutorials, which demonstrate step-by-step deep model development using the Intel\u00ae neon\u2122 Framework. This simulates a collaborative development workflow. The final assessment is a presentation of the developed project with a demonstration of the quality\/performance measurements of the proposed deep neural networks.<\/p>\n<p>The course is aimed at engineers, teachers and researchers, as well as postgraduate students and students of higher educational institutions.<\/p>\n<h2>Preliminary requirements<\/h2>\n<p>The course is aimed at students who have basic programming skills in the Python scripting language. In addition, the course requires theoretical knowledge of optimization methods, probability theory, image processing and computer vision.<\/p>\n<h2>Links<\/h2>\n<p>The syllabus is available <a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/!Syllabus_Eng.pdf\">here<\/a>.<\/p>\n<p>Source code for all practical classes, solving the problem considered in the lectures, is available <a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/Src.zip\">here<\/a>. Experimental results are available <a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/README.md\">here<\/a>.<\/p>\n<p>The list of practical tasks for student groups is available <a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/!Tasks_Eng.pdf\">here<\/a>.<\/p>\n<h2>Licence<\/h2>\n<p>The licence is available <a href=\"http:\/\/www.apache.org\/licenses\/LICENSE-2.0.txt\">here<\/a>.<\/p>\n<h2>Authors<\/h2>\n<p><strong>Kustikova Valentina Dmitrievna<\/strong>, PhD, Assistant Professor, department of Computer software and supercomputer technologies, Institute of Information Technologies, Mathematics and Mechanics, Nizhny Novgorod State University. Lead and developer.<\/p>\n<p><strong>Zolotykh Nikolai Yurievich<\/strong>, Dr., Prof., department of Algebra, geometry and discrete mathematics, Institute of Information Technologies, Mathematics and Mechanics, Nizhny Novgorod State University. Scientific adviser.<\/p>\n<p><strong>Zhiltsov Maxim Sergeevich<\/strong>, first-year master's student, Institute of Information Technologies, Mathematics and Mechanics, Nizhny Novgorod State University. Developer.<\/p>\n<p><em><strong>The course development is supported by Intel Corporation.<\/strong><\/em><\/p>\n<h2>The course structure<\/h2>\n<p><strong>LECTURE 1. Introduction to deep learning<\/strong><\/p>\n<p>The notion of deep learning. Biological fundamentals of deep learning. Examples of practical problems. Classification of deep models.<\/p>\n<p>(<a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/Eng\/Lectures\/Presentations\/1_Deep learning intro.pdf\">pptx<\/a>, <a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/Eng\/Lectures\/Annotations\/1_Deep learning intro.pdf\">docx<\/a>)<\/p>\n<p><strong>LECTURE 2. Multilayered fully-connected neural networks<\/strong><\/p>\n<p>The structure of fully-connected neural networks (FCNN), types of activation functions. The training problem for FCNNs, the loss function. 
Backpropagation method.<\/p>\n<p>(<a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/Eng\/Lectures\/Presentations\/2_FCNN.pdf\">pptx<\/a>, <a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/Eng\/Lectures\/Annotations\/2_FCNN.pdf\">docx<\/a>)<\/p>\n<p><strong>PRACTICE 0.&nbsp;Preprocessing and converting data to HDF5 format for the Intel\u00ae neon\u2122 Framework<\/strong><\/p>\n<p>A preliminary practice to prepare the dataset for the subsequent practical classes.<\/p>\n<p>(<a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/Eng\/Practice\/Tutorials\/Practice0_intro.pdf\">docx<\/a>)<\/p>\n<p><strong>LECTURE 3. Introduction to the Intel\u00ae neon\u2122 Framework<\/strong><\/p>\n<p>Introduction to the Intel\u00ae neon\u2122 Framework. Installation. The structure of an application for training\/testing a single-layer fully-connected neural network using the Intel\u00ae neon\u2122 Framework.<\/p>\n<p>(<a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/Eng\/Lectures\/Presentations\/3_Intro to Intel Neon.pdf\">pptx<\/a>, <a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/Eng\/Lectures\/Annotations\/3_Intro to Intel Neon.pdf\">docx<\/a>)<\/p>\n<p><strong>PRACTICE 1. The development of fully-connected neural networks using the Intel\u00ae neon\u2122 Framework<\/strong><\/p>\n<p>Problem statement for the laboratory works. Development of fully-connected network architectures with different numbers of hidden layers and hidden units per layer. Developing scripts for training\/testing the proposed architectures. Carrying out experiments, collecting performance results.<\/p>\n<p>(<a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/Eng\/Practice\/Tutorials\/Practice1_mlp.pdf\">docx<\/a>)<\/p>\n<p><strong>LECTURE 4. Convolutional neural networks. 
Deep residual networks<\/strong><\/p>\n<p>The structure of a convolutional layer and a convolutional network. An example of training\/testing a single-layer convolutional network using the Intel\u00ae neon\u2122 Framework. Deep residual networks, a typical structural block, an example of a residual network.<\/p>\n<p>(<a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/Eng\/Lectures\/Presentations\/4_CNN.pdf\">pptx<\/a>, <a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/Eng\/Lectures\/Annotations\/4_CNN.pdf\">docx<\/a>)<\/p>\n<p><strong>PRACTICE 2. The development of convolutional neural networks using the Intel\u00ae neon\u2122 Framework<\/strong><\/p>\n<p>Development of convolutional network architectures with different numbers of hidden layers and different filter parameters on each layer. Developing scripts for training\/testing the proposed architectures. Carrying out experiments, collecting performance results.<\/p>\n<p>(<a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/Eng\/Practice\/Tutorials\/Practice2_cnn.pdf\">docx<\/a>)<\/p>\n<p><strong>LECTURE 5. Transfer learning of deep neural networks<\/strong><\/p>\n<p>Description of the general approach underlying transfer learning in deep neural networks. An example of applying transfer learning using the Intel\u00ae neon\u2122 Framework.<\/p>\n<p>(<a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/Eng\/Lectures\/Presentations\/5_TransferLearning.pdf\">pptx<\/a>, <a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/Eng\/Lectures\/Annotations\/5_TransferLearning.pdf\">docx<\/a>)<\/p>\n<p><strong>PRACTICE 3. Application of transfer learning to solve a given problem using the Intel\u00ae neon\u2122 Framework<\/strong><\/p>\n<p>Selection of an original problem (related to the given problem) and a corresponding trained model. Modification of the network architecture for the given problem. 
Three training strategies are compared: training the parameters of all network layers from scratch with random initialization; training all network layers with the weights initialized from the model trained on the original problem; and training only the last (modified) layers of the network, with the weights initialized from the model trained on the original problem.<\/p>\n<p>(<a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/Eng\/Practice\/Tutorials\/Practice3_transfer.pdf\">docx<\/a>)<\/p>\n<p><strong>LECTURE 6. Unsupervised learning: autoencoders, deconvolutional networks<\/strong><\/p>\n<p>Unsupervised learning methods. The concept of an autoencoder, a stack of autoencoders, deconvolutional networks.<\/p>\n<p>(<a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/Eng\/Lectures\/Presentations\/6_UnsupervisedLearning.pdf\">pptx<\/a>, <a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/Eng\/Lectures\/Annotations\/6_UnsupervisedLearning.pdf\">docx<\/a>)<\/p>\n<p><strong>PRACTICE 4. Unsupervised pre-training of the weights of the most promising fully-connected network architectures for the subsequent supervised solution of a given problem using the Intel\u00ae neon\u2122 Framework<\/strong><\/p>\n<p>Selection of several fully-connected neural networks. Developing a stack of autoencoders. Training the developed architectures. Using the obtained initial weights to train the network in a supervised manner to solve the given problem.<\/p>\n<p>(<a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/Eng\/Practice\/Tutorials\/Practice4_ae.pdf\">docx<\/a>)<\/p>\n<p><strong>LECTURE 7. Recurrent neural networks<\/strong><\/p>\n<p>The general structure of the model. Unrolling a recurrent network in time. Training recurrent networks. The long short-term memory (LSTM) network. 
An example of training\/testing a simple recurrent network using the Intel\u00ae neon\u2122 Framework.<\/p>\n<p>(<a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/Eng\/Lectures\/Presentations\/7_RNN.pdf\">pptx<\/a>, <a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/Eng\/Lectures\/Annotations\/7_RNN.pdf\">docx<\/a>)<\/p>\n<p><strong>PRACTICE 5. The development of recurrent neural networks using the Intel\u00ae neon\u2122 Framework<\/strong><\/p>\n<p>Development of recurrent neural network architectures with different numbers of hidden layers and hidden units per layer. Developing scripts for training\/testing the proposed architectures. Carrying out experiments, collecting performance results.<\/p>\n<p>(<a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/Eng\/Practice\/Tutorials\/Practice5_rnn.pdf\">docx<\/a>)<\/p>\n<p><strong>LECTURE 8. Efficient execution of neural networks. The Intel\u00ae nGraph\u2122 overview<\/strong><\/p>\n<p>Introduction to the Intel\u00ae nGraph\u2122. The neon\u2122 frontend to Intel\u00ae nGraph\u2122.<\/p>\n<p>(<a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/Eng\/Lectures\/Presentations\/8_nGraph library.pdf\">pptx<\/a>, <a href=\"https:\/\/hpc-education.unn.ru\/files\/courses\/intel-neon-course\/Eng\/Lectures\/Annotations\/8_nGraph library.pdf\">docx<\/a>)<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Description The course examines the construction and the performance analysis of deep neural networks using the Intel\u00ae neon\u2122 Framework. The following topics are covered: Introduction to deep learning. Multilayered fully-connected neural networks. Introduction to the Intel\u00ae neon\u2122 Framework. 
Convolutional neural &hellip; <a href=\"https:\/\/hpc-education.unn.ru\/en\/trainings\/collection-of-courses\/introduction-to-deep-learning-using-the-intel-neon-framework\">Continue reading <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":40,"featured_media":0,"parent":2102,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"ngg_post_thumbnail":0},"_links":{"self":[{"href":"https:\/\/hpc-education.unn.ru\/en\/wp-json\/wp\/v2\/pages\/5061"}],"collection":[{"href":"https:\/\/hpc-education.unn.ru\/en\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/hpc-education.unn.ru\/en\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/hpc-education.unn.ru\/en\/wp-json\/wp\/v2\/users\/40"}],"replies":[{"embeddable":true,"href":"https:\/\/hpc-education.unn.ru\/en\/wp-json\/wp\/v2\/comments?post=5061"}],"version-history":[{"count":3,"href":"https:\/\/hpc-education.unn.ru\/en\/wp-json\/wp\/v2\/pages\/5061\/revisions"}],"predecessor-version":[{"id":5451,"href":"https:\/\/hpc-education.unn.ru\/en\/wp-json\/wp\/v2\/pages\/5061\/revisions\/5451"}],"up":[{"embeddable":true,"href":"https:\/\/hpc-education.unn.ru\/en\/wp-json\/wp\/v2\/pages\/2102"}],"wp:attachment":[{"href":"https:\/\/hpc-education.unn.ru\/en\/wp-json\/wp\/v2\/media?parent=5061"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}