Classification of Slow and Fast Learners Using Deep Learning Model

  • Conference paper

Part of the book Computational Intelligence, in the book series Lecture Notes in Electrical Engineering (LNEE, volume 968)

Abstract

Cognitive learning strategies focus on improving a learner’s ability to analyze information more deeply and to handle new situations efficiently by transferring and applying existing knowledge. These strategies lead to enhanced and better-retained learning. To cater to the needs of students with different levels of cognitive learning, it is essential to assess their learning ability. In this paper, a deep learning-based method is presented to classify learners according to their past performance. The technique takes as input each student’s past semester marks, total failures in subjects/passing heads, and current semester attendance. The proposed method classifies learners into three categories: slow, average, and fast learners. A deep learning classifier built from multilayer perceptron nodes performs the classification. The proposed method is fully automatic and robust, and a final accuracy of 90% is achieved in classifying learners by their cognitive learning level.
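
The sketch below shows the general shape of such a multilayer perceptron classifier. It is a minimal Keras/TensorFlow example that assumes three input features (past semester marks, count of failed subjects/passing heads, and attendance) and a softmax output over the three learner categories; the layer sizes, hyperparameters, and synthetic data are illustrative placeholders, not the authors’ exact configuration.

```python
# Minimal sketch of an MLP learner classifier.
# Assumptions (not from the paper): Keras/TensorFlow backend, three input
# features, and illustrative layer sizes and hyperparameters.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def build_learner_classifier(n_features: int = 3, n_classes: int = 3) -> keras.Model:
    """Small MLP mapping student features to {slow, average, fast}."""
    model = keras.Sequential([
        keras.Input(shape=(n_features,)),
        layers.Dense(32, activation="relu"),
        layers.Dense(16, activation="relu"),
        layers.Dense(n_classes, activation="softmax"),  # class probabilities
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Illustrative usage: random values stand in for the real student records
# (columns correspond to past marks, failures, and attendance).
X = np.random.rand(200, 3).astype("float32")
y = np.random.randint(0, 3, size=(200,))  # 0 = slow, 1 = average, 2 = fast
model = build_learner_classifier()
model.fit(X, y, epochs=5, batch_size=16, verbose=0)
print(model.predict(X[:5]))  # predicted class probabilities for five students
```

In practice the labels would come from institutional records, and the 90% accuracy reported above refers to the authors’ model and data, not to this sketch.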

Acknowledgements

This work was sponsored by the University of Mumbai Minor Research Grant (Project ID: 1001), sanctioned in December 2019. The hardware for this project was provided by NVIDIA, which donated two Jetson Nano (2 GB) boards for the research work.

Author information

Corresponding author

Correspondence to V. A. Bharadi.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Bharadi, V.A., Prasad, K.K., Mulye, Y.G. (2023). Classification of Slow and Fast Learners Using Deep Learning Model. In: Shukla, A., Murthy, B.K., Hasteer, N., Van Belle, J.P. (eds) Computational Intelligence. Lecture Notes in Electrical Engineering, vol 968. Springer, Singapore. https://doi.org/10.1007/978-981-19-7346-8_39
