Bartosz Malman

Hi! My name is Bartosz, I live in Stockholm, Sweden, where I am currently a lecturer in mathematics at KTH Royal Institute of Technology.

I teach and I do research. My mathematical research focuses on operator theory and complex analysis. In particular, I like to work with spaces of analytic functions and to study linear operators defined on them.

Previously, I was based at Matematikcentrum in Lund, where I completed my doctoral work in May 2020 with Professor Alexandru Aleman as my advisor. My PhD thesis can be found here. I have also obtained a Master's degree in computer science and engineering from Lunds Tekniska Högskola.


Address: Småbrukets Backe 14, 141 58 Huddinge


A pdf version of my CV is here.

Publications and preprints

  1. Construction of some smooth Cauchy transforms, Bartosz Malman, preprint (2021). arXiv:2111.14112
  2. Inner functions, invariant subspaces and cyclicity in Pt(mu)-spaces, Adem Limani and Bartosz Malman, preprint (2021). arXiv:2108.08625
  3. On the problem of smooth approximations in de Branges-Rovnyak spaces and connections to subnormal operators, Adem Limani and Bartosz Malman, preprint (2021). arXiv:2108.08629
  4. An abstract approach to approximations in spaces of pseudocontinuable functions, Adem Limani and Bartosz Malman, accepted for publication in Proceedings of the American Mathematical Society (2021). arXiv:2106.09828
  5. On model spaces and density of functions smooth on the boundary, Adem Limani and Bartosz Malman, preprint (2020). arXiv:2101.01746
  6. Generalized Cesàro operators: geometry of spectra and quasi-nilpotency, Adem Limani and Bartosz Malman, accepted for publication in International Mathematics Research Notices (2020). doi
  7. Hilbert spaces of analytic functions with a contractive backward shift, Alexandru Aleman and Bartosz Malman, Journal of Functional Analysis 277.1 (2019): 157-199. doi
  8. Nearly invariant subspaces of de Branges spaces, Bartosz Malman, preprint (2019). arXiv:1902.10450
  9. Spectra of generalized Cesàro operators acting on growth spaces, Bartosz Malman, Integral Equations and Operator Theory 90.3 (2018): 26. doi
  10. Density of disk algebra functions in de Branges–Rovnyak spaces, Alexandru Aleman and Bartosz Malman, Comptes Rendus Mathematique 355.8 (2017): 871-875. doi

Doctoral thesis

Master's thesis

I have coded a small drawn-digit recognition tool in Python, JavaScript and PHP. The idea was to learn a bit about the fashionable concept of artificial neural networks without using any machine learning libraries, and instead to get a feel for the subject by deriving the equations and building every piece from scratch. I used the excellent book by Michael Nielsen as my source. The mathematical theory requires no more than a university course in multivariable calculus, and perhaps a bit of linear algebra, to understand.

I have used the simplest fully-connected neural networks, with the cross-entropy cost function and L2 weight regularization, as well as some hard-to-justify improvisations that I thought might help. The training code was written in Python and can be found here. It is not optimized in any particular way, but it runs fast enough to train the networks in reasonable time on my cheap personal laptop. A trained network was then ported to PHP, and the app below does the necessary pre-processing of the input image in JavaScript.
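To illustrate the ingredients mentioned above, here is a minimal sketch (not my actual code) of one gradient step for a single fully-connected softmax layer trained with the cross-entropy cost and L2 weight regularization; the data and all names here are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax over each row of logits.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def step(W, b, X, Y, lr=0.5, lam=1e-4, n_total=60000):
    """One mini-batch gradient step; lam scales the L2 penalty."""
    P = softmax(X @ W + b)          # predicted class probabilities
    G = (P - Y) / len(X)            # cross-entropy gradient w.r.t. logits
    # L2 regularization shows up as the (1 - lr*lam/n_total) weight decay.
    W = (1 - lr * lam / n_total) * W - lr * (X.T @ G)
    b = b - lr * G.sum(axis=0)
    return W, b

# Tiny demo: 20 random 784-dimensional "images", 10 classes.
X = rng.standard_normal((20, 784))
Y = np.eye(10)[rng.integers(0, 10, 20)]
W, b = np.zeros((784, 10)), np.zeros(10)
for _ in range(50):
    W, b = step(W, b, X, Y)

# Mean cross-entropy on the training batch; starts at ln(10) ≈ 2.30.
loss = -np.log(softmax(X @ W + b)[Y.astype(bool)]).mean()
```

In the real tool the layer above would be stacked with hidden layers and fed actual MNIST images, but the update rule is the same shape.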

I have trained several networks. The first two, which I call B1 and B2, are used if you press the blue buttons below. Both are trained on the classical MNIST database of 60000 scanned images of handwritten digits. B1 consists of two hidden layers, with 100 neurons in the first and 10 in the second; it achieves 98.01% accuracy on the MNIST test set. B2 has three hidden layers of 150, 200 and 100 neurons, and incidentally achieves the same 98.01% accuracy as the simpler B1. The networks B3 and B4 are used if you press the green buttons below; they are trained on the EMNIST database. This data set is much larger than MNIST, consisting of 240000 images of handwritten digits; the corresponding article describes the structure of the data in EMNIST in detail. B3 has one hidden layer of 250 neurons, while B4 is a bit deeper, with three hidden layers of 300, 250 and 200 neurons. I achieved 98.95% and 99.19% accuracy on the EMNIST test set using B3 and B4 respectively.
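For a sense of scale, one can count the parameters in the architectures above. This little sketch assumes 784 inputs (28x28 pixel images) and a 10-neuron output layer, which the text implies but does not state explicitly:

```python
def n_params(sizes):
    # Weights (a*b) plus biases (b) for each consecutive pair of layers.
    return sum(a * b + b for a, b in zip(sizes, sizes[1:]))

b1 = n_params([784, 100, 10, 10])        # B1: hidden layers of 100 and 10
b2 = n_params([784, 150, 200, 100, 10])  # B2: hidden layers of 150, 200, 100
# b1 == 79620, b2 == 169060
```

So B2 has roughly twice as many parameters as B1, which makes it a mild surprise that both land on the same test accuracy.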

I have also trained a convolutional neural network, K1, using the Python library Keras; the architecture can be inferred from my script here. The network was trained in Python and then ported to JavaScript using TensorFlow.js. Unsurprisingly, it is a bit better than my own fully-connected networks, reaching as much as 99.5% accuracy on the EMNIST test set.

In the drawn-digit recognition below, the estimates of the networks sometimes differ. When one fails, another sometimes succeeds, suggesting that an ensemble of networks could probably improve the results. An average of the networks' estimates is displayed after pressing the vote button below. The pie chart appearing on the right represents a sort of confidence in the network's decision, in the sense that it displays the "probabilities" that the input image is a certain digit. Check for yourself how it recognizes digits drawn by you. When I draw, the first two networks seem to have some trouble detecting my "8"s, and often think they are "3"s. My "4" sometimes looks a bit like a "9", and this is then evidenced in the probabilities in the pie chart. I find that if I put minimal effort into drawing the digit, the recognition works great, except for the 8s. Of course, the data used for learning was actual scanned human-handwritten digits, while here we input something drawn on a computer screen, so certain difficulties are understandable.

Draw in the left box.

Here is a talk I contributed to the Fields Institute Focus Program on Analytic Function Spaces and their Applications in October 2021. It presents some of my research on approximation theory in analytic function spaces; this particular work was done jointly with Adem Limani from Lund University.