(λblog. blog blog) (λblog. blog blog)

diverging since 2008

Entries tagged "math"

2008-07-26 12:00

(Written 2008-07-25 10:30)

I'm writing this somewhere above Warsaw, halfway through our flight to Budapest. From Budapest we take a connecting flight to Sofia, after which a bus will take us 100 km to the town of Blagoevgrad, where the American University of Bulgaria at Blagoevgrad is hosting the International Mathematics Competition for University Students 2008 for the nth consecutive year.

IMC is an individual competition, but contestants are divided into teams originating from universities. We have a three-man team representing the University of Helsinki, whose participation has been intermittent at best. I wouldn't even know of the event if I hadn't overheard an acquaintance of mine talking about forming a team.

A word about my history with competitive math. I did pretty well in the national math competitions during school, often placing in the top 10 or 20. During high school I attended special training that aimed to produce a Finnish team for the prestigious IMO (International Mathematical Olympiad, a high-school-level contest). In the process I attended multiple Nordic and Baltic competitions, but never did make it to the IMO, partly because I attended the coaching for only two years while most do it for three or four, and partly because I seriously underperformed in the final selection tests.

Well anyway, I'm kind of looking forward to the competition. The problems are very different from those of the IMO: gone are geometry and number theory; instead the problems focus on (basic univariate) calculus and linear algebra, with some algebra hiding in the nooks. The competition requires only the very basics of undergraduate math, but the problems offer opportunities for applying advanced techniques. Here's an example problem:

Let A and B be real n×n matrices such that A² + B² = AB. Prove that if BA − AB is invertible, then n is divisible by 3.

On the other hand, I fear my analysis and linear algebra have fallen into a state of disuse, as I've mostly been messing around with logic and algebra for the past year.

I'm finishing this at the airport in Budapest waiting for the connecting flight. Keep tuned for follow-ups.

... finally got an opportunity to post this. It seems all of our accommodation's "IT services" are down. A kind local internet cafe let me plug my laptop in. Rates are pretty low, 1 local currency unit (about 0.5€) per hour.

Tags: math, travel.
2008-07-28 14:30

The second of the two competition days is over. Can't say I'm satisfied with my performance. Day 1: I spent too much time banging my head on problem 2 (concerning polynomials); I had a slightly wrong approach that led to some tricky special cases in an otherwise simple inductive proof. Starting from scratch probably would've gotten me on the right track, as the solution really was pretty simple. After all that hassle, the only points I got were 10 (out of 20) for a solution to problem 4 that I cobbled together in the last 15 or so minutes.

On day 2 I was close to solutions for three problems but just couldn't make it, mostly because I spent too much time on problem 3. I somehow managed not to spot that the combinatorial sum in problem 3 was just the binomial expansion of the analytic solution to the Fibonacci recurrence, a fact that would've instantly provided the solution.
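For reference, the identity I had in mind is Binet's formula expanded binomially (this is my reconstruction of the general fact, not the exact sum from the competition problem); a few lines of Haskell confirm it:

> -- F_n from the recurrence (F_1 = F_2 = 1)
> fib :: Integer -> Integer
> fib n = fibs !! fromIntegral n
>   where fibs = 0 : 1 : zipWith (+) fibs (tail fibs)
>
> choose :: Integer -> Integer -> Integer
> choose n k = product [n - k + 1 .. n] `div` product [1 .. k]
>
> -- Binet: F_n = (phi^n - psi^n) / sqrt 5. Expanding phi^n and psi^n binomially
> -- and cancelling the even-index terms gives, for n >= 1:
> --   F_n = (sum over k of choose n (2k+1) * 5^k) / 2^(n-1)
> fibBinet :: Integer -> Integer
> fibBinet n = sum [choose n (2*k + 1) * 5^k | k <- [0 .. (n - 1) `div` 2]] `div` 2^(n - 1)
>
> -- all (\n -> fib n == fibBinet n) [1 .. 30] === True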

Also, I didn't look too closely at the geometry problem, as geometry tends not to be my forte. I did, however, manage to spot that the points in question must lie on a hyperbola. This fact, combined with the obvious symmetries of the situation, would've shown that the points actually lie on a line, yielding the solution.

Guess it's just been too long since I've solved any real math problems.

Tomorrow: excursion to a monastery. The day after that: prize ceremony. We're arriving in Finland on Thursday night.

I'll link to the problems when they hit the interwebs.

Tags: math, travel.
2009-07-13 12:20

Parallel or

Every (imperative) programmer knows the usefulness of a short-circuiting or operation, that is, a binary operation that doesn't evaluate its second argument if the first one is True. To a functional programmer this is only natural: the function or is lazy in its second argument.

> or True  _ = True
> or False x = x
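For instance (a small usage sketch with the definition above, using undefined as a stand-in for a diverging computation):

> or True  undefined === True        -- the second argument is never inspected
> or undefined True  === undefined   -- or has to pattern match on its first
>                                    -- argument before it can do anything else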

Wouldn't it be nice to have a parallel or that evaluates to True if either of its arguments is True? That is, a function por satisfying

> por :: Bool -> Bool -> Bool
> por True  _    === True
> por _     True === True

even if _ were undefined or a non-halting computation. As it turns out, a function like this cannot be implemented in lambda calculus. Let's prove this.

We will restrict ourselves to simply-typed lambda calculus with three constants, True, False :: Bool and if :: Bool -> a -> a -> a, obeying the following reduction rules:

> if True  M N ==> M
> if False M N ==> N

Recall that the basic reduction rule of lambda calculus is beta-reduction, that is, function application.
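For instance, the ordinary left-to-right or from above is definable with just these ingredients (a quick illustration, using the same === notation for semantic equality):

> or = \x -> \y -> if x True y
>
> or True  N === True    -- by the first if-rule
> or False N === N       -- by the second if-rule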

We will additionally allow for fixpoints, as otherwise we will not be able to define most functions at all. (Strictly speaking, we will need one fixpoint operator per type, but type annotations will be omitted from now on for brevity.) The definition of fix (if you don't understand it, just take for granted that fix lets us do recursion, or see these posts):

> fix_a :: (a -> a) -> a
> fix_a ==> \f -> f (fix f)
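As a quick Haskell aside (not part of the calculus above, just to show the operator is nothing exotic), the same fix lives in Data.Function, and it lets us write recursive functions without recursive definitions:

> import Data.Function (fix)
>
> -- explicit recursion recovered from fix: the "rec" parameter stands for
> -- the function being defined
> factorial :: Integer -> Integer
> factorial = fix (\rec n -> if n == 0 then 1 else n * rec (n - 1))
>
> -- factorial 5 === 120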

Our evaluation order will be normal order, that is: we apply beta-reduction (function application) and our custom reduction rules at the leftmost possible position in the expression. (The calculus is confluent, so normal forms are unique, and normal-order reduction finds a normal form whenever one exists.)
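For example (my own toy reduction, just to illustrate the order), normal order happily discards a diverging argument that is never needed:

> (\x -> \y -> x) True (fix (\z -> z))
>   ==> (\y -> True) (fix (\z -> z))
>   ==> True

An applicative order (evaluate arguments first) would instead loop forever on fix (\z -> z).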

Now, because of its type, por must have the form

> por = \x -> \y -> E

for some E. Let us assume that E is in normal form (if it is not, it can be reduced without changing the semantics of por). We will also assume that both x and y appear in E as this is only natural.

Now then, buried in E there must be some reduction opportunities that can only be taken once something is substituted for x and y. Because x and y are variables, they occur as distinct parts of E; that is, when we substitute something for x and y, these somethings will not overlap. By symmetry we can assume that x occurs to the left of y in E. Let E[A,B] denote E with A substituted for x and B for y. Now if A reduces to A' in one step, then E[A,B] reduces to E[A',B] in one step, because A sits at our leftmost reduction opportunity.

Now we note that fix (\x -> x) reduces back to itself in three steps:

> fix (\x -> x) ==> (\f -> f (fix f)) (\x -> x) ==> (\x -> x) (fix (\x -> x))
>   ==> fix (\x -> x)

And thus

> por (fix (\x -> x)) B === E[fix (\x -> x),B]

can have no normal form regardless of B: normal-order reduction just keeps cycling through the loop above, and it would find a normal form if one existed. In particular, por (fix (\x -> x)) True =/= True, contradicting the specification we started from.
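To see the argument on a concrete candidate (my own illustration, not part of the proof), take the sequential or from earlier as a would-be por, so that E = if x True y with x to the left of y:

> (\x -> \y -> if x True y) (fix (\x -> x)) True
>   ==> (\y -> if (fix (\x -> x)) True y) True
>   ==> if (fix (\x -> x)) True True

Neither if-rule can fire until the condition reduces to True or False, so the leftmost reduction opportunity is the loop fix (\x -> x) itself, and the expression never reaches a normal form, even though the specification demands True.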

A taste of domain theory

Scott models, also known as domains, are a way of obtaining denotational semantics for lambda calculi. Denotational semantics means mapping the structures of the language to mathematical objects (functions, sets) of some sort.

Long story short, we consider a more-defined-than ordering of values. For simple types such as Bool and Nat (the natural numbers), there are just two levels to this ordering: the completely undefined value _|_ (pronounced "bottom"), and the defined values (1, 2, True, ...). The intuitive reason for using these orderings is that a (computable) function must map a more-defined argument to a more-defined result. That is, a function must be monotonic with respect to the orderings of its argument and result types.

An example is in order. Consider a function f :: Nat -> Bool with

> f undefined === True

now clearly f 4 must be True as well, because f couldn't have inspected its argument. Similarly, if

> f 4 === undefined

f undefined must be undefined also.
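In Haskell terms, a minimal sketch of the same point (with Integer standing in for Nat): the only way to achieve f undefined === True is to never inspect the argument at all, and then the argument's value cannot matter.

> f :: Integer -> Bool
> f _ = True      -- ignores its argument, so f undefined === f 4 === True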

To be exact, for simple types (here Bool and Nat) the ordering is defined by

  1. _|_ <= x for all x
  2. distinct values x and y are not comparable if neither is _|_

The monotonicity requirement can now be formulated as: if a <= x then f a <= f x.

The cliff hanger

If we expand the definition of monotonicity to two arguments in the most natural way, that is

> a <= x && b <= y ==> f a b <= f x y

we find out that parallel or is monotonic: this can easily be ascertained either by checking all cases or by reasoning that making either argument of por less defined will never make the result more defined. So, por cannot be told apart from the definable functions in our semantic model.
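For the case check, here is the full behaviour of por over the three-element Bool domain (my own tabulation: the True entries come from the specification, por False False === False is ordinary or behaviour, and monotonicity forces the remaining entries to be _|_):

> por _|_   _|_   === _|_      por _|_   False === _|_      por _|_   True === True
> por False _|_   === _|_      por False False === False    por False True === True
> por True  _|_   === True     por True  False === True     por True  True === True

Moving along any row or column from _|_ to a defined value never makes the result less defined, which is exactly the two-argument monotonicity condition above.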

In the next post we will investigate what happens when we introduce por as a constant into our lambda calculus. Be warned, it might take a while, as there's a lot to handle and I'm leaving for the Alps after the weekend.

Tags: haskell, math.
