• DocumentCode
    3734282
  • Title
    On the weight sparsity of multilayer perceptrons
  • Author
    Georgios Drakopoulos;Vasileios Megalooikonomou
  • Author_Institution
    Multidimensional Data Analysis and Knowledge Management Lab, Computer Engineering and Informatics Department, University of Patras, Patras 26500, Hellas
  • fYear
    2015
  • fDate
    7/1/2015 12:00:00 AM
  • Firstpage
    1
  • Lastpage
    6
  • Abstract
    Approximating and representing a process, a function, or a system with an adaptive parametric model constitutes a major part of current machine learning research. An important characteristic of these models is parameter sparsity, an indicator of how succinctly a model can codify fundamental properties of the approximated function. This paper investigates the sparsity patterns of a multilayer perceptron network trained to mount a man-in-the-middle attack on the DES symmetric cryptosystem. The notions of absolute and effective synaptic weight sparsity are introduced, and their importance to the network learning procedure is explained. Finally, the results from the training of the actual multilayer perceptron are outlined and discussed. To promote reproducible research, the MATLAB network implementation has been posted on GitHub.
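    The abstract distinguishes absolute from effective synaptic weight sparsity but does not spell out the definitions. A common reading, sketched below in Python/NumPy rather than the paper's MATLAB, takes absolute sparsity as the fraction of weights that are exactly zero and effective sparsity as the fraction whose magnitude falls below a small threshold `tau`; both the function names and the threshold value are illustrative assumptions, not the paper's formulation.

    ```python
    import numpy as np

    def absolute_sparsity(weights):
        """Fraction of synaptic weights that are exactly zero."""
        w = np.concatenate([np.ravel(W) for W in weights])
        return float(np.mean(w == 0.0))

    def effective_sparsity(weights, tau=1e-3):
        """Fraction of synaptic weights with magnitude below tau, i.e.
        weights too small to influence the output appreciably.
        The threshold tau is a tunable assumption, not a value from
        the paper."""
        w = np.concatenate([np.ravel(W) for W in weights])
        return float(np.mean(np.abs(w) < tau))

    # Toy two-layer perceptron weight matrices (illustrative only).
    rng = np.random.default_rng(0)
    W1 = rng.normal(size=(8, 4))
    W1[np.abs(W1) < 0.5] = 0.0           # prune some weights exactly to zero
    W2 = rng.normal(size=(4, 2)) * 1e-4  # near-zero but nonzero layer

    abs_s = absolute_sparsity([W1, W2])
    eff_s = effective_sparsity([W1, W2], tau=1e-3)
    # Every exactly-zero weight is also below tau, so eff_s >= abs_s.
    assert eff_s >= abs_s
    ```

    Under this reading, effective sparsity always bounds absolute sparsity from above, which is why the two notions can diverge during training: gradient descent rarely drives weights exactly to zero, but many may become negligibly small.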
  • Keywords
    "Neurons","Biological neural networks","Multilayer perceptrons","Encryption","Training","Standards"
  • Publisher
    ieee
  • Conference_Titel
    Information, Intelligence, Systems and Applications (IISA), 2015 6th International Conference on
  • Type
    conf
  • DOI
    10.1109/IISA.2015.7388096
  • Filename
    7388096