Author of the publication

SP-EyeGAN: Generating Synthetic Eye Movement Data with Generative Adversarial Networks.

ETRA, page 18:1-18:9. ACM, (2023)


Other publications of authors with the same name

Reconstruction-guided attention improves the robustness and shape processing of neural networks. CoRR, (2022)

Characterizing Target-absent Human Attention. CVPR Workshops, page 5027-5036. IEEE, (2022)

SP-EyeGAN: Generating Synthetic Eye Movement Data with Generative Adversarial Networks. ETRA, page 18:1-18:9. ACM, (2023)

Towards Predicting Reading Comprehension From Gaze Behavior. ETRA Short Papers, page 32:1-32:5. ACM, (2020)

Gazeformer: Scalable, Effective and Fast Prediction of Goal-Directed Human Attention. CVPR, page 1441-1450. IEEE, (2023)

Benchmarking Gaze Prediction for Categorical Visual Search. CVPR Workshops, page 828-836. Computer Vision Foundation / IEEE, (2019)

Reading detection in real-time. ETRA, page 43:1-43:5. ACM, (2019)

Predicting Goal-Directed Human Attention Using Inverse Reinforcement Learning. CVPR, page 190-199. Computer Vision Foundation / IEEE, (2020)

Target-Absent Human Attention. ECCV (4), volume 13664 of Lecture Notes in Computer Science, page 52-68. Springer, (2022)