Ikko Yamane, Assistant Professor

Paris-Dauphine University/RIKEN AIP

Publications

Conferences

  • Takashi Ishida, Ikko Yamane, Nontawat Charoenphakdee, Gang Niu, and Masashi Sugiyama.
    Is the performance of my deep network too good to be true? A direct approach to estimating the Bayes error in binary classification.
    In Proceedings of the 11th International Conference on Learning Representations (ICLR 2023), 2023.
    [OpenReview]

  • Ikko Yamane, Yann Chevaleyre, Takashi Ishida, and Florian Yger.
    Mediated Uncoupled Learning and Validation with Bregman Divergences: Loss Family with Maximal Generality.
    In Proceedings of the 26th International Conference on Artificial Intelligence and Statistics (AISTATS 2023), Proceedings of Machine Learning Research, vol. 206, pp. 4768-4801, 2023.
    [paper]

  • Futoshi Futami, Tomoharu Iwata, Naonori Ueda, and Ikko Yamane.
    Skew-symmetrically perturbed gradient flow for convex optimization.
    In Proceedings of the 13th Asian Conference on Machine Learning (ACML 2021), Proceedings of Machine Learning Research, vol. 157, pp. 721-736, 2021.
    [paper]

  • Ikko Yamane, Junya Honda, Florian Yger, and Masashi Sugiyama.
    Mediated Uncoupled Learning: Learning Functions Without Direct Input-output Correspondences.
    In Proceedings of the 38th International Conference on Machine Learning (ICML 2021), Proceedings of Machine Learning Research, vol. 139, pp. 11637-11647, 2021.
    [arXiv version (latest)], [code on GitHub]

  • Tianyi Zhang, Ikko Yamane, Nan Lu, and Masashi Sugiyama.
    A One-step Approach to Covariate Shift Adaptation.
    In Proceedings of the 12th Asian Conference on Machine Learning (ACML 2020), Proceedings of Machine Learning Research, vol. 129, pp. 65-80, 2020.
    [ACML paper, video] (Best Paper Award!)

  • Takashi Ishida, Ikko Yamane, Tomoya Sakai, Gang Niu, and Masashi Sugiyama.
    Do We Need Zero Training Loss After Achieving Zero Training Error?
    In Proceedings of the 37th International Conference on Machine Learning (ICML 2020), Proceedings of Machine Learning Research, vol. 119, pp. 4604-4614, 2020.
    [ICML paper, arXiv version, code on GitHub]

  • Ikko Yamane, Florian Yger, Jamal Atif, and Masashi Sugiyama.
    Uplift Modeling from Separate Labels.
    In Advances in Neural Information Processing Systems 31 (NeurIPS 2018), pp. 9949-9959, 2018.
    [NeurIPS paper, arXiv version, code on GitHub]

  • Ikko Yamane, Florian Yger, Maxime Berar, and Masashi Sugiyama.
    Multitask Principal Component Analysis.
    In Proceedings of the 8th Asian Conference on Machine Learning (ACML 2016), Proceedings of Machine Learning Research, vol. 63, pp. 302-317, 2016.
    [ACML paper, code on GitLab]

Journal Articles

  • Tianyi Zhang, Ikko Yamane, Nan Lu, and Masashi Sugiyama.
    A One-Step Approach to Covariate Shift Adaptation.
    SN Computer Science, vol. 2, no. 319, 12 pages, 2021.
    [paper]

  • Ikko Yamane, Hiroaki Sasaki, and Masashi Sugiyama.
    Regularized Multi-Task Learning for Multi-Dimensional Log-Density Gradient Estimation.
    Neural Computation, vol. 28, no. 6, pp. 1388-1410, 2016.
    [paper]

  • Akinori Kawachi and Ikko Yamane.
    A Fourier-Analytic Approach to List-Decoding for Sparse Random Linear Codes.
    IEICE Transactions on Information and Systems, vol. E98-D, no. 3, pp. 532-540, 2015.
    [paper]