About me
I am a PhD Student at Jagiellonian University in Kraków, Poland. I am working in the Group for Machine Learning Research (GMUM) led by Prof. Jacek Tabor. My main research interest is the analysis and development of sparse neural networks. In particular, I focus on advancing tools that allow a network to change and adapt during training.
News
- (May 2024) Our work, Sparser, Better, Deeper, Stronger: Improving Sparse Training with Exact Orthogonal Initialization, has been accepted to ICML 2024! See you in Vienna, and many thanks to my co-authors!
- (December 2023) Presenting two of my papers at NeurIPS 2023: Trust Your ∇ : Gradient-based Intervention Targeting for Causal Discovery and Fantastic Weights and How to Find Them: Where to Prune in Dynamic Sparse Training! Be sure to drop by the poster sessions!
- (October 2023) Super excited to start my internship as a Student Researcher at Google DeepMind in Montreal, where I will be working on efficient adaptation in transfer learning!
- (May 2023) See you at the ICLR23 SNN Workshop in Kigali, Rwanda! Many thanks to all other organizers!
- (January 2023) I am joining IDEAS NCBR as a PhD Student Researcher working on efficient deep neural networks and sparse architectures.
- (September 2022) Starting a research internship at the University of Twente, Netherlands, in the VScAIL group led by Dr. Decebal Mocanu.
- (July 2022) Presenting our work, Connectivity Properties of Neural Networks Under Performance-Resources Trade-off, at the Dynamic Neural Networks Workshop at ICML 2022.
- (July 2022) The MLSS^N school is finally here! Thanks to all the other organizers and volunteers for the enormous work and passion put into making this event possible. The lectures are available here.
- (June 2022) Two of my papers, Discovering wiring patterns influencing neural networks performance and On the relationship between disentanglement and multi-task learning, have been accepted to ECML PKDD 2022.
- (April 2022) I presented our work on analyzing neural network architectures based on random graphs at the From Neuroscience to Artificially Intelligent Systems (NAISys) meeting at Cold Spring Harbor Laboratory (CSHL), USA.
- (November 2021) Co-organized the ML in PL Conference 2021.
- (November 2021) I participated in the Google Women in Tech Mentoring Program (online), together with 31 selected students from across Poland.
- (September 2021) Our work, Non-Gaussian Gaussian Processes for Few Shot Regression, has been accepted to NeurIPS 2021.
- (July 2020) Co-organizing the Eastern European Machine Learning Summer School 2020 (EEML 2020).
- (December 2019) Presenting our work on non-linear ICA based on the Cramer-Wold metric at ICONIP in Sydney, Australia.
- (November 2019) Co-conducting workshops on reinforcement learning at the ML in PL conference.
- (October 2019) I started my PhD studies at Jagiellonian University in Kraków, Poland.
Selected Publications

Sparser, Better, Deeper, Stronger: Improving Sparse Training with Exact Orthogonal Initialization
Aleksandra Nowak, Łukasz Gniecki, Filip Szatkowski, Jacek Tabor
ICML 2024
[Paper]

Fantastic Weights and How to Find Them: Where to Prune in Dynamic Sparse Training
Aleksandra Nowak, Bram Grooten, Decebal C. Mocanu, Jacek Tabor
NeurIPS 2023
[Paper]

Trust Your ∇ : Gradient-based Intervention Targeting for Causal Discovery
Mateusz Olko*, Michał Zając*, Aleksandra Nowak*, Nino Scherrer, Yashas Annadani, Stefan Bauer, Łukasz Kuciński, Piotr Miłoś
NeurIPS 2023
[Paper]

Discovering wiring patterns influencing neural networks performance
Aleksandra Nowak, Romuald Janik
ECML PKDD 2022
[Paper]

Neural Networks Adapting to Datasets: Learning Network Size and Topology
Romuald Janik, Aleksandra Nowak
Dynamic Neural Networks, ICML 2022 Workshop
[Paper]

Non-Gaussian Gaussian Processes for Few Shot Regression
Marcin Sendera, Jacek Tabor, Aleksandra Nowak, Andrzej Bedychaj, Massimiliano Patacchiola, Tomasz Trzcinski, Przemysław Spurek, Maciej Zieba
NeurIPS 2021
[Paper]