Assembly output codes for learning neural networks

Philippe Tigréat (1,2), Carlos Rosar Kos Lassance (1,2), Xiaoran Jiang (3), Vincent Gripon (1,2), Claude Berrou (1,2)
1 Lab-STICC_TB_CACS_IAS
2 Lab-STICC - Laboratoire des sciences et techniques de l'information, de la communication et de la connaissance
3 Sirocco - Analysis, representation, compression and communication of visual data, Inria Rennes – Bretagne Atlantique, IRISA-D5 - Signaux et Images Numériques, Robotique
Abstract: Neural network-based classifiers usually encode the class labels of input data via a completely disjoint code, i.e. a binary vector with only one bit associated with each category. We use coding theory to propose assembly codes where each element is associated with several classes, making for better target vectors. These codes emulate the combination of several classifiers, which is a well-known method to improve decision accuracy. Our experiments on datasets such as MNIST with a multi-layer neural network show that assembly output codes, which are characterized by a higher minimum Hamming distance, result in better classification performance. These codes are also well suited to the use of clustered clique-based networks in category representation.
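To make the contrast in the abstract concrete, the sketch below builds one-hot targets and a simple cluster-structured assembly code for ten classes and compares their minimum Hamming distances. The construction (random codewords with one active unit per cluster, the 4x8 cluster layout, and the function names) is an illustrative assumption, not the exact clique-based code design used in the paper.

```python
import itertools
import numpy as np

def min_hamming_distance(codes):
    """Smallest pairwise Hamming distance among the codewords."""
    return min(int(np.sum(a != b)) for a, b in itertools.combinations(codes, 2))

def one_hot_codes(num_classes):
    """Disjoint (one-hot) targets: one bit per class, minimum distance 2."""
    return np.eye(num_classes, dtype=int)

def assembly_codes(num_classes, num_clusters=4, cluster_size=8,
                   target_d_min=4, seed=0):
    """Assembly-style targets: each class activates one unit per cluster,
    so every output unit is shared by several classes.

    Random draws are retried until the requested minimum Hamming distance
    is reached (illustrative construction, not the paper's exact code).
    """
    rng = np.random.default_rng(seed)
    while True:
        codes = np.zeros((num_classes, num_clusters * cluster_size), dtype=int)
        for c in range(num_classes):
            for k in range(num_clusters):
                codes[c, k * cluster_size + rng.integers(cluster_size)] = 1
        if min_hamming_distance(codes) >= target_d_min:
            return codes

if __name__ == "__main__":
    classes = 10  # e.g. the MNIST digits
    print("one-hot  d_min =", min_hamming_distance(one_hot_codes(classes)))
    print("assembly d_min =", min_hamming_distance(assembly_codes(classes)))
```

With these (assumed) parameters the one-hot targets have minimum distance 2, while the assembly targets reach at least 4, which is the property the paper associates with better classification performance.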

https://hal.archives-ouvertes.fr/hal-01502488
Contributor: Philippe Tigreat
Submitted on: Wednesday, April 5, 2017 - 3:51:42 PM
Last modification on: Thursday, December 19, 2019 - 1:23:14 AM
Long-term archiving on: Thursday, July 6, 2017 - 1:34:03 PM

File

PapierISTC.pdf
Files produced by the author(s)

Citation

Philippe Tigréat, Carlos Rosar Kos Lassance, Xiaoran Jiang, Vincent Gripon, Claude Berrou. Assembly output codes for learning neural networks. ISTC 2016 - 9th International Symposium on Turbo Codes & Iterative Information Processing, Sep 2016, Brest, France. pp.285-289, ⟨10.1109/ISTC.2016.7593122⟩. ⟨hal-01502488⟩
