Using deep learning to understand students' level of attention and engagement. With privacy ensured, real-time feedback on coursework delivery would help lecturers and presenters make improvements immediately, rather than waiting for this information once or twice a year.
This deep dive is all about neural networks: training them using best practices, debugging them, and maximizing their performance using cutting-edge research.
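The training loop described above can be sketched at its smallest scale. The example below is an illustrative assumption, not material from the deep dive itself: it trains a single sigmoid neuron with gradient descent on an AND-gate dataset, the minimal version of the loop that full neural networks scale up.

```python
import math
import random

random.seed(0)

# Tiny AND-gate dataset: input pairs and binary targets.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

# A single neuron: two weights and a bias.
w = [random.uniform(-0.5, 0.5) for _ in range(2)]
b = 0.0
lr = 0.5  # learning rate

def predict(x):
    """Sigmoid of the weighted sum: output in (0, 1)."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 / (1 + math.exp(-z))

losses = []
for epoch in range(500):
    total = 0.0
    for x, t in data:
        p = predict(x)
        # Binary cross-entropy loss for this example.
        total += -(t * math.log(p) + (1 - t) * math.log(1 - p))
        # For sigmoid + cross-entropy, the gradient w.r.t. the
        # pre-activation z is simply (p - t).
        g = p - t
        for i in range(2):
            w[i] -= lr * g * x[i]
        b -= lr * g
    losses.append(total / len(data))

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

After training, the neuron fires only when both inputs are 1. Debugging real networks starts from the same signals shown here: watching the loss curve and checking gradients against expected behaviour.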
Students in the future will be able to personalise their learning while teachers can monitor their engagement and behaviour, according to ed-tech experts. Opening the EdTechX conference in London today, Benjamin Vedrenne-Cloquet said the future of education lies with artificial intelligence and deep learning, citing the movement towards data and "deep tech" in new ed-tech companies, away from the "lighter tech" of digitisation of content seen at the beginning of the decade.
Footnote: http://www.sciencemag.org/content/313/5786/504.abstract; http://www.cs.toronto.edu/~amnih/cifar/talks/salakhut_talk.pdf. In a strict sense, this work was made obsolete by a slew of papers from 2011 showing that results similar to this 2006 result can be achieved with "simple" algorithms, but it is still true that current deep learning methods are better than the best "simple" feature-learning schemes, and this paper was the first example that came to mind.
Z. Long, Y. Lu, and B. Dong. (2018). arXiv:1812.04426. Comment: 16 pages, 15 figures. arXiv admin note: substantial text overlap with arXiv:1710.09668.
A. Dulny, A. Hotho, and A. Krause. (2022). KI 2022: Advances in Artificial Intelligence, 45th German Conference on AI, Trier, Germany, September 19-23, 2022, Proceedings, volume 13404 of Lecture Notes in Computer Science, pages 75--89. Springer.
J. Lin, R. Nogueira, and A. Yates. (2020). arXiv:2010.06467. Comment: Final preproduction version of volume in Synthesis Lectures on Human Language Technologies by Morgan & Claypool.
R. Wang, E. Durmus, N. Goodman, and T. Hashimoto. (2022). arXiv:2203.11370. Comment: ICLR Oral 2022. Code: https://github.com/rosewang2008/language_modeling_via_stochastic_processes.
Y. Xie, F. Shu, J. Rambach, A. Pagani, and D. Stricker. (2021). arXiv:2110.11219. Comment: Accepted to BMVC 2021; code open source: https://github.com/EryiXie/PlaneRecNet.
W. Merrill, V. Ramanujan, Y. Goldberg, R. Schwartz, and N. Smith. (November 2021). Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 1766--1781. Online and Punta Cana, Dominican Republic: Association for Computational Linguistics.
X. Du, Z. Yu, B. Zhu, X. Chen, and Z. Ma. (June 2021). Proceedings of the International Conference on Acoustics, Speech and Signal Processing, pages 551--555. IEEE.
J. Berner, P. Grohs, G. Kutyniok, and P. Petersen. (2021). arXiv:2105.04026. Comment: This review paper will appear as a book chapter in the book "Theory of Deep Learning" by Cambridge University Press.
J. Sun, Y. Xie, L. Chen, X. Zhou, and H. Bao. (2021). arXiv:2104.00681. Comment: Accepted to CVPR 2021 as Oral Presentation. Project page: https://zju3dv.github.io/neuralrecon/.
S. Wang, L. Hu, Y. Wang, X. He, Q. Sheng, M. Orgun, L. Cao, F. Ricci, and P. Yu. (2021). arXiv:2105.06339. Comment: Accepted by IJCAI 2021 Survey Track; copyright is owned by IJCAI. The first systematic survey on graph learning based recommender systems. arXiv admin note: text overlap with arXiv:2004.11718.
M. Paris and R. Jäschke. (2021). Proceedings of the 14th International Conference on Knowledge Science, Engineering and Management, volume 12816 of Lecture Notes in Artificial Intelligence, pages 1--14. Springer.
S. Löwe, P. O'Connor, and B. Veeling. (2019). arXiv:1905.11786. Comment: Honorable Mention for Outstanding New Directions Paper Award at NeurIPS 2019.