Publications

Showing entries 41 - 60 out of 108
Berner J, Dablander M, Grohs P. Numerically Solving Parametric Families of High-Dimensional Kolmogorov Partial Differential Equations via Deep Learning. In Larochelle H, Ranzato M, Hadsell R, Balcan MF, Lin H, editors, Advances in Neural Information Processing Systems 33: Annual Conference on Neural Information Processing Systems 2020, NeurIPS 2020, December 6-12, 2020, virtual. Cambridge, Mass.: MIT Press. 2020. (Advances in neural information processing systems : ... proceedings of the ... conference, Vol. 33). Available at: https://proceedings.neurips.cc/paper/2020/file/c1714160652ca6408774473810765950-Paper.pdf

Grohs P, Koppensteiner S, Rathmair M. Phase Retrieval: Uniqueness and Stability. SIAM Review. 2020;62(2):301-350. doi: 10.1137/19M1256865

Bauer L, Grohs P, Wohlschläger A, Plant C. Planting Synchronisation Trees for Discovering Interaction Patterns among Brain Regions. In Papapetrou P, Cheng X, He Q, editors, Proceedings - 19th IEEE International Conference on Data Mining Workshops, ICDMW 2019: 8–11 November 2019 Beijing, China. Piscataway, NJ: IEEE. 2019. p. 1035-1036. 8955527. (International Conference on Data Mining workshops). doi: 10.1109/ICDMW.2019.00149

Grohs P, Elbrächter D, Berner J. How degenerate is the parametrization of neural networks with the ReLU activation function? In 33rd Conference on Neural Information Processing Systems (NeurIPS 2019): Vancouver, Canada, 8-14 December 2019. Red Hook, NY: Curran Associates. 2019. p. 7790-7801. (Advances in neural information processing systems : ... proceedings of the ... conference, Vol. 32).

Grohs P, Sander O, Sprecher M, Hardering H. Projection-Based Finite Elements for Nonlinear Function Spaces. SIAM Journal on Numerical Analysis. 2019;57(1):404-428. doi: 10.1137/18M1176798

Alaifari R, Daubechies I, Grohs P, Yin R. Stable Phase Retrieval in Infinite Dimensions. Foundations of Computational Mathematics. 2019;19(4):869–900. doi: 10.1007/s10208-018-9399-7

Berner J, Elbrächter D, Grohs P, Jentzen A. Towards a regularity theory for ReLU networks – chain rule and global error estimates. In 2019 13th International Conference on Sampling Theory and Applications (SampTA). Piscataway, NJ: IEEE. 2019. p. 1-5. doi: 10.1109/SampTA45681.2019.9031005

Wiatowski T, Grohs P, Bölcskei H. Energy Propagation in Deep Convolutional Neural Networks. IEEE Transactions on Information Theory. 2018 Jul;64(7):4819-4842. doi: 10.1109/TIT.2017.2756880

Grohs P, Elbrächter D, Perekrestenko D, Bölcskei H. The Universal Approximation Power of Finite-Width Deep ReLU Networks. arXiv.org. 2018 Jun 5. doi: 10.48550/arXiv.1806.01528

Grohs P, Sander O, Starck J-L, Wallner J. Nonlinear Data: Theory and Algorithms. Oberwolfach Reports. 2018;15(2):1161–1234. doi: 10.4171/OWR/2018/20

Grohs P, Sprecher M, Yu T. Scattered Manifold-Valued Data Approximation. Numerische Mathematik. 2017 Apr;135(4):987-1010. doi: 10.1007/s00211-016-0823-0

Grohs P, Wiatowski T, Bölcskei H. Energy decay and conservation in deep convolutional neural networks. In Kramer G, editor, 2017 IEEE International Symposium on Information Theory, ISIT 2017: Aachen, 25-30 June 2017. Piscataway, NJ: IEEE. 2017. p. 1356-1360. 8006750. (IEEE International Symposium on Information Theory, Vol. 2017). doi: 10.1109/ISIT.2017.8006750

Bölcskei H, Grohs P, Kutyniok G, Petersen PC. Memory Optimal Neural Network Approximation. In Lu YM, Van De Ville D, Papadakis M, editors, Wavelets and Sparsity XVII: 6-9 August 2017, San Diego, California, United States. Bellingham, Washington: SPIE. 2017. 103940Q. (Proceedings of SPIE, Vol. 10394). doi: 10.1117/12.2272490

Wiatowski T, Grohs P, Bölcskei H. Topology Reduction in Deep Convolutional Feature Extraction Networks. In Lu YM, Van De Ville D, Papadakis M, editors, Wavelets and Sparsity XVII: 6-9 August 2017, San Diego, California, United States. Bellingham, Washington: SPIE. 2017. 1039418. (Proceedings of SPIE, Vol. 10394). doi: 10.1117/12.2271761