• DocumentCode
    258104
  • Title
    Communication requirement for distributed statistical machine learning with application in waveform cognition
  • Author
    Husheng Li; Zhu Han
  • Author_Institution
    Dept. of Electr. Eng. & Comput. Sci., Univ. of Tennessee, Knoxville, TN, USA
  • fYear
    2014
  • fDate
    3-5 Dec. 2014
  • Firstpage
    1311
  • Lastpage
    1314
  • Abstract
    Distributed learning is an effective approach to reducing data communication in machine learning when data is stored in a distributed manner, particularly in the era of big data. In the distributed learning procedure, learners send intermediate computation results instead of raw data, thus reducing the communication cost. In this paper, the communication requirement for distributed learning is studied in a scenario with multiple data storage nodes capable of learning and a fusion center. Lower bounds on communication are derived based on the VC-entropy of the model class in machine learning. Numerical results are provided to show the communication requirement for typical learning problems.
  • Keywords
    Big Data; cognition; data mining; entropy; learning (artificial intelligence); VC-entropy; communication requirement; data communications; distributed learning; distributed statistical machine learning; multiple data storage nodes; waveform cognition; Cognitive radio; Distributed databases; Signal processing; Statistical learning
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    2014 IEEE Global Conference on Signal and Information Processing (GlobalSIP)
  • Conference_Location
    Atlanta, GA
  • Type
    conf
  • DOI
    10.1109/GlobalSIP.2014.7032335
  • Filename
    7032335