Thursday, January 16
Manfred Denker, Penn State
2:30pm
Neural networks and labeled trees
ABSTRACT
It has been observed in experiments that the number of active neurons in a neural network obeys a power law: the chance of having L of them active is proportional to L^{-3/2}. I will discuss how this can be derived in a mathematically rigorous way from simple assumptions. It turns out that this question is related to an old theorem of Cayley (from 1889) on the number of labeled trees with a fixed number of vertices.

The mathematical tools used in the talk come from combinatorics (placing balls in boxes, binomial coefficients) and from analysis (differentiation).
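
To make Cayley's count concrete, here is a small sketch of my own (not part of the talk announcement): Cayley's formula says there are n^{n-2} labeled trees on n vertices, and for small n this can be checked directly by decoding every Prüfer sequence of length n-2 into a tree.

    from itertools import product

    def pruefer_to_tree(seq, n):
        # Decode a Pruefer sequence into the edge set of a labeled tree on n vertices.
        degree = [1] * n
        for v in seq:
            degree[v] += 1
        edges = []
        for v in seq:
            leaf = min(u for u in range(n) if degree[u] == 1)   # smallest remaining leaf
            edges.append((min(leaf, v), max(leaf, v)))
            degree[leaf] -= 1
            degree[v] -= 1
        u, w = [x for x in range(n) if degree[x] == 1]          # two vertices are left; join them
        edges.append((min(u, w), max(u, w)))
        return frozenset(edges)

    # Each sequence yields a distinct tree, so the count below equals n**(n-2).
    for n in range(2, 7):
        trees = {pruefer_to_tree(seq, n) for seq in product(range(n), repeat=n - 2)}
        print(n, len(trees), n ** (n - 2))

The last two columns of the output agree, which is exactly Cayley's statement for these small cases.
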
Thursday, January 30
Mihran Papikian, Penn State
2:30pm
Coverings by systems of linear equations
ABSTRACT
Let V be a finite dimensional vector space over a finite field, and let S be a subset of V. What is the minimal number of systems of linear equations such that the union of their solution sets coincides exactly with S? The motivation for this question comes from logic and computer science. I will explain all the basic concepts required for understanding this problem, and then discuss some algebraic and geometric approaches to its solution in special cases.
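
Purely as an illustration of the problem statement (my own sketch, not taken from the abstract): over the two-element field F_2 the solution set of a linear system is an affine subspace, so for very small n one can brute-force the minimal number of affine subspaces of F_2^n whose union is exactly a given set S.

    from itertools import combinations, product

    def affine_subspaces(n):
        # All nonempty affine subspaces of F_2^n, as frozensets of points.
        points = list(product((0, 1), repeat=n))
        add = lambda x, y: tuple((a + b) % 2 for a, b in zip(x, y))
        subspaces = []
        for r in range(1, len(points) + 1):
            for cand in combinations(points, r):
                cset = set(cand)
                # Over F_2 a nonempty set is affine iff x + y + z stays inside it.
                if all(add(add(x, y), z) in cset for x in cand for y in cand for z in cand):
                    subspaces.append(frozenset(cand))
        return subspaces

    def minimal_cover(S, n):
        # Fewest affine subspaces whose union is exactly S; singletons always suffice.
        S = frozenset(S)
        pieces = [A for A in affine_subspaces(n) if A <= S]     # pieces must not overshoot S
        for k in range(1, len(S) + 1):
            for combo in combinations(pieces, k):
                if frozenset().union(*combo) == S:
                    return combo

    # Example: the plane F_2^2 minus the origin needs two affine pieces, not one.
    S = {(0, 1), (1, 0), (1, 1)}
    cover = minimal_cover(S, 2)
    print(len(cover), [sorted(A) for A in cover])
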
Thursday, February 13
Anatole Katok, Penn State
2:30pm
Billiard table as mathematician's playground
ABSTRACT
The motion of an ideal particle ("a billiard ball") inside a bounded plane domain with the usual elastic reflection law ("the angle of incidence is equal to the angle of reflection") is one of the simplest mechanical problems, and attempts to describe this motion lead both to remarkable open problems and to connections with a variety of deep mathematical theories. For example, it is still not known whether inside every triangle there is a periodic billiard motion. It is also not known whether there is a twice differentiable convex curve such that some billiard motion inside it is dense. In this talk I will give a very brief and elementary introduction to the beauty and complexity of this problem that looks so deceptively simple.
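
As a small numerical aside (my own, not from the abstract), the reflection law is easy to iterate in the one case where everything is explicit, the circular table: at each impact, reflect the velocity across the tangent, i.e. flip its component along the outward normal.

    import math

    def reflect(v, normal):
        # Elastic reflection: v - 2 (v . n) n, with n a unit normal.
        dot = v[0] * normal[0] + v[1] * normal[1]
        return (v[0] - 2 * dot * normal[0], v[1] - 2 * dot * normal[1])

    def next_hit(p, v):
        # Next intersection of the ray p + t v (t > 0) with the unit circle:
        # since |p| = 1, solving |p + t v|^2 = 1 gives t = -2 (p . v) / |v|^2.
        t = -2 * (p[0] * v[0] + p[1] * v[1]) / (v[0] ** 2 + v[1] ** 2)
        return (p[0] + t * v[0], p[1] + t * v[1])

    p = (1.0, 0.0)                       # start on the boundary
    theta = 2.0                          # an arbitrary inward direction
    v = (math.cos(theta), math.sin(theta))
    for bounce in range(5):
        p = next_hit(p, v)
        v = reflect(v, p)                # on the unit circle the outward normal at p is p itself
        print(f"bounce {bounce}: point ({p[0]:+.3f}, {p[1]:+.3f})")

In the circle every chord of the orbit stays tangent to a smaller concentric circle (a caustic), so no orbit is dense in the disk; this is one hint of why the density question for a general twice differentiable convex curve mentioned above is so delicate.
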
Thursday, February 27
Victoria Sadovskaya, Penn State
2:30pm
Probability puzzles
ABSTRACT
We will discuss several problems in elementary probability where intuition may give a wrong answer while the correct one seems, at first glance, unlikely or even impossible.
Thursday, March 6
Vadim Kaloshin, University of Maryland, College Park
2:30pm
Can you hear the shape of the drum?
ABSTRACT
Thursday, April 3
Joe Roberts, Penn State
2:30pm
Grouping and rearranging terms in infinite series
ABSTRACT
When studying infinite series of real numbers, it is tempting to expect the familiar properties of addition to hold, and in some cases they do. Series that converge absolutely are as well behaved as finite sums: they are associative and commutative. However, series that are not absolutely convergent behave very differently. The Riemann Rearrangement Theorem states that any conditionally convergent series can be reordered so as to sum to any prescribed real number or to diverge (a failure of commutativity), and there are divergent series whose terms can be grouped so that the resulting series converges to any prescribed real number (a failure of associativity). I will give examples of these unexpected outcomes, eventually building to a construction due to Sierpinski of a single power series whose terms can be grouped to converge uniformly to any continuous function on [0,1] vanishing at 0, a "universal Taylor series".
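
To see the rearrangement phenomenon numerically, here is a sketch of my own (not from the talk): the greedy procedure used in the standard proof of the Riemann Rearrangement Theorem, applied to the alternating harmonic series 1 - 1/2 + 1/3 - ..., drives the partial sums toward any chosen target, here pi.

    import math

    def rearranged_partial_sum(target, n_terms):
        # Reorder terms of 1 - 1/2 + 1/3 - ... greedily toward `target`:
        # add positive terms while below the target, negative terms while above.
        pos = (1.0 / k for k in range(1, 10**7, 2))       #  1, 1/3, 1/5, ...
        neg = (-1.0 / k for k in range(2, 10**7, 2))      # -1/2, -1/4, -1/6, ...
        total = 0.0
        for _ in range(n_terms):
            total += next(pos) if total < target else next(neg)
        return total

    # In its usual order the series sums to ln 2; rearranged, it approaches pi.
    for n in (100, 10_000, 1_000_000):
        print(n, rearranged_partial_sum(math.pi, n))
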
Thursday, April 17
Jason Rute, Penn State
2:30pm
What is a random number?
ABSTRACT
Suppose there were a new lottery, Power Coin. Once a week, the lotto agency flips a fair coin 50 times, converts the heads to ones and the tails to zeros, and publishes the result as the winning lottery ticket. Consider these three lottery ticket numbers:

01010101010101010101010101010101010101010101010101,
11001001000011111101101010100010001000010110100011,
01100011001001110100110110001011100110110000000111.

Do any of these seem more random than the others? (What if I told you the second was the first 50 binary digits of pi?)
Does it even make sense to say that a finite sequence of bits is more random than another? What about an infinite
sequence of bits? In this talk, I am going to explore different definitions of a random number. I will talk about the view that random numbers are a myth, about normal numbers, and about Kolmogorov complexity, a way to measure randomness via computable compression. Finally, I will introduce Martin-Löf randomness.
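
Since Kolmogorov complexity itself is not computable, one rough way to get a feel for it (my own sketch, not from the abstract) is to use an off-the-shelf compressor as a crude stand-in: a string with obvious structure compresses well, while one without exploitable patterns does not.

    import zlib

    # The three lottery tickets from the abstract (ticket 2 is the binary expansion of pi).
    tickets = {
        "ticket 1 (alternating)": "01010101010101010101010101010101010101010101010101",
        "ticket 2 (pi bits)":     "11001001000011111101101010100010001000010110100011",
        "ticket 3":               "01100011001001110100110110001011100110110000000111",
    }

    for name, bits in tickets.items():
        compressed = zlib.compress(bits.encode(), 9)
        print(f"{name:22s} 50 bits as text -> {len(compressed)} compressed bytes")

On strings this short the differences are modest, but the alternating ticket should compress noticeably better than the other two, which look alike to the compressor. Notably, no compressor can see that ticket 2 has a very short description (a program printing the binary digits of pi), and that gap between practical compression and true Kolmogorov complexity is part of the point.
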