
Semantic Channel and Shannon’s Channel Mutually Match for Multi-label Classification

  • Conference paper

Part of the book series: IFIP Advances in Information and Communication Technology (IFIP AICT, volume 539)

Abstract

A semantic channel consists of a set of membership functions, or truth functions, which indicate the denotations of a set of labels. In multi-label learning, we obtain a semantic channel from a sampling distribution or a Shannon channel. If the sample is large enough, we can directly convert a Shannon channel into a semantic channel by the third kind of Bayes' theorem; otherwise, we can optimize the membership functions by a generalized Kullback-Leibler formula. In multi-label classification, we partition the instance space with the maximum semantic information criterion, which is a special Regularized Least Squares (RLS) criterion and is equivalent to the maximum likelihood criterion. To simplify the learning, we may obtain only the truth functions of some atomic labels and use them to construct the truth functions of compound labels. When a label is learned, instances are divided into three kinds (positive, negative, and unclear) instead of two kinds as in the One-vs-Rest or Binary Relevance (BR) method. Every label is learned independently, as in the BR method; however, a label may be trained without negative examples, and a number of binary classifications are avoided. In label selection, the classifier selects, for an instance, the compound label with the most semantic information, so the correlation between labels is already taken into consideration. As a predictive model, the semantic channel does not change with the prior probability distribution (source) of instances, so it still works when the source changes. The classifier, however, varies with the source and hence can overcome the class-imbalance problem. It is shown that the increase of the old population changes the classifier for the label "Old person" and has been driving the evolution of the semantic meaning of "Old". The Channels' Matching (CM) iteration algorithm for classifying unseen instances is also introduced.
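To make the large-sample case in the abstract concrete, the sketch below (assumed, not code from the paper) shows one possible reading: estimate a Shannon channel P(y_j|x_i) from label counts, max-normalize each label's column to obtain a semantic channel of truth functions T(theta_j|x_i) (one reading of the "third kind of Bayes' theorem"), compute each label's logical probability T(theta_j) = sum_i P(x_i) T(theta_j|x_i), and classify an instance by the label with the most semantic information log[T(theta_j|x_i)/T(theta_j)]. The toy counts, variable names, and restriction to atomic labels are illustrative assumptions, not the paper's definitions.

    # Minimal sketch of the large-sample pipeline described in the abstract
    # (toy data and names are assumptions; the paper's formulas may differ).
    import numpy as np

    # counts[i, j]: how often instance x_i was observed with label y_j (toy data)
    counts = np.array([[30.0,  2.0,  1.0],
                       [10.0, 15.0,  5.0],
                       [ 1.0,  8.0, 25.0]])

    p_x = counts.sum(axis=1) / counts.sum()               # source P(x_i)
    shannon = counts / counts.sum(axis=1, keepdims=True)  # Shannon channel P(y_j|x_i)

    # Semantic channel: each label's truth function, max-normalized over instances
    truth = shannon / shannon.max(axis=0, keepdims=True)

    # Logical probability of each label: T(theta_j) = sum_i P(x_i) * T(theta_j|x_i)
    logical_p = p_x @ truth

    def semantic_information(i):
        """Semantic information (bits) that each label conveys about instance x_i."""
        return np.log2(truth[i] / logical_p)

    def classify(i):
        """Select the label with the most semantic information for instance x_i."""
        return int(np.argmax(semantic_information(i)))

    for i in range(counts.shape[0]):
        print(f"x_{i}: y_{classify(i)}, info = {semantic_information(i).round(2)}")

Note that the truth functions depend only on the sample, while the logical probabilities depend on P(x); if the source changes, only logical_p (and hence the classifier) needs to be recomputed, which mirrors the abstract's remark that the semantic channel is reusable while the classifier adapts to class imbalance. Compound labels, the generalized Kullback-Leibler optimization for small samples, and the CM iteration are outside this sketch.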



Author information


Correspondence to Chenguang Lu.


Copyright information

© 2018 IFIP International Federation for Information Processing

About this paper


Cite this paper

Lu, C. (2018). Semantic Channel and Shannon’s Channel Mutually Match for Multi-label Classification. In: Shi, Z., Pennartz, C., Huang, T. (eds) Intelligence Science II. ICIS 2018. IFIP Advances in Information and Communication Technology, vol 539. Springer, Cham. https://doi.org/10.1007/978-3-030-01313-4_5

  • DOI: https://doi.org/10.1007/978-3-030-01313-4_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-01312-7

  • Online ISBN: 978-3-030-01313-4

  • eBook Packages: Computer Science, Computer Science (R0)
