Sunday, January 30, 2005

Where did it go?

I have not left behind anything of importance so far on this trip, but it's not for want of trying. I walked out the door at home three times: I came back once because I'd forgotten my poster, once because I'd forgotten the rent check, and once because I thought I'd forgotten my keys. In the airport I had to backtrack when I left my book (with the boarding pass as a bookmark) on the counter. And on opening my bags this evening, I discovered that I'd left behind the little satchel where I usually keep my travel toiletries.

Oops.

On the other hand, I made substantial revisions to two papers: for one, I did something about the technical meat; and for the other, I beat LaTeX into bloody submission using a sequence of tricks. I posted my poster, and I socialized with a variety of people at the reception. So while I may give every evidence that I need my own personal sheepdog to keep herding me away from hazards, I do all right on my own.

Could you teach a sheepdog about vector calculus?

Arrivals, Books, and Complaints

I've already written about the fifth floor printer in Soda Hall, which went out of service some months ago. The problem was fixed by putting a sign on the printer saying that it would be out of service until further notice. As a result, the fifth floor print-and-copy room no longer has a copier or a working printer; the microwave and the sink both still work, but I wouldn't lay odds on their maintenance should they eventually fail. The locks on the third floor main entrance to Soda Hall have been on the fritz for the past month, and at the start of last week a sign appeared on the door directing traffic to the fourth floor entrance at night and on weekends. The copiers seem to have disappeared altogether from the math library, though, and my friends in civil engineering -- now returned to Davis Hall after a semester of earthquake-safety retrofitting -- are still without desks and chairs; relatively speaking, I have little cause for complaint. The urge to complain, though, is nearly as fundamental as the urges toward sex or money; to be ruled by any of the three is unseemly, but to deny any one completely bespeaks either serious illness or a sort of saintly state that -- at best -- I think most of us might admire from a distance. Some might claim that it's all one and the same, and that such a saintly state is a sort of mental illness; but I think that claim can only be given credence to the extent that all psychology is pathological, and that everyone's crazy but me and thee -- and I'm not so sure about thee.

Despite the lurking fear that the remaining printer might also fail, or that the third floor entrance might freeze up at an inopportune time, I was able to visit the office yesterday to print the information I needed for my flight. So here I am in the hotel lobby in Miami; my flight was uneventful, my poster is ready, and my wireless network connection here seems more reliable than the connection at my desk in Berkeley. Later, perhaps, I'll use the network to get some work done -- there are some simulations I'd like to re-run for a nearly-finished paper, and my laptop has too little memory to handle them gracefully -- but right now my brain is telling me that it's 6:30 in the morning, and that I've been up since we landed at 3:20 am PST (6:20 EST). Mathematical work will wait until after I've had coffee.

Web surfing requires much less mental effort than mathematics -- at least for me -- and so I've been catching up with the news. It seems that voter turnout in Iraq is high, thanks in no small part to Al-Sistani's injunction that Iraqi Shiites should vote as a sort of religious obligation. We'll see what happens from here. Contemplating Iraq makes me less inclined to complain about things in my own life, if not so much that I'll cease to do so entirely.

The Sunday Book Review was interesting, too. I have a surfeit of interesting books to read right now. I'm nearly finished with The Vintage Mencken (ed. Alistair Cooke), but I also brought Euler: The Master of Us All (William Dunham), The Way I Remember It (Walter Rudin), and A Stroll with William James (Jacques Barzun); and I have other books to read once I return home. Nevertheless, I would be tempted to add Jared Diamond's Collapse to my list, based on the merits of the Times review; whatever faults he may find with the conclusions, Easterbrook does make it sound like an interesting book. There is also an essay by Steve Johnson on how new software for searching personal documents may actually change the way we think. I don't agree with the conclusions; indeed, I think that unless one defines the verb to think so broadly that the word loses its usual meaning, the idea of a personal search engine changing how we think is preposterous. On the other hand, search tools do influence the chains of things about which we choose to think, and the essay is entertaining.

I think I'll find that cup of coffee now.

Saturday, January 29, 2005

Onward and forward

I will be traveling Sunday through Thursday, and immediately thereafter I'll be dealing with one of those little chores I probably should have dealt with some time ago. Posting for the next two weeks may be sporadic.

Wednesday, January 26, 2005

Amusing Abstracts

The following abstract went out to the CS graduate student mailing list today. I found it highly entertaining. Be sure to read the last line.

EE 298-5
Berkeley Optoelectronics Seminar Series (BOSS)
http://microlab.berkeley.edu/text/298-12.seminar

Friday 28 January 2005
11-12 in the Hogan Room
(521 Cory)

Growth of the Glorious Internet: Technology, Economics & Society

Dr. Ivan Kaminow
EECS/UCB

ABSTRACT

Important new technologies influence society and society, in turn, shapes the technologies. We are in the midst of such an interplay as the Internet develops around us. I will review the basics of the technology, and the history of the Internet and efforts of Wall Street and Washington to control it. I end with a discussion of current research on optical packet switching.

  • Currently drinking: Russian Caravan tea

Bananas for Brains

You know how sometimes you start coding and lose track of time, and then suddenly discover it's the middle of the afternoon and you're a little dizzy from not eating? When one of my compatriots uttered this question this afternoon, I could only nod in sympathy. I've done the same thing all too often myself.

But I was good about eating today. I had a reasonable amount of food for breakfast, lunch, and dinner. So why is it that -- despite eating plenty today, and despite getting little exercise beyond the usual couple miles of walking -- I'm feeling hungry-dizzy again now? Drat. I'm going to have to eat something before I'll be able to sleep. Maybe I still have some apples or bananas at home.

Tuesday, January 25, 2005

Jasmine Pearls

Jasmine pearls are jasmine-scented whole green tea leaves, dried and curled up into balls about the size of large pearls. When you pour hot water over them, they unfurl; they start slowly, maybe a little shyly, but soon turn into perfect little whole leaves. It's a virtual Sargasso tea in there, with the leaves waving gently back and forth whenever I bump the mug. I usually prepare this type of tea in a glass mug, just so I can admire the leaves and the color of the brew. In the first steeping, the tea is intensely green; in later steepings, the color and the taste both become more subtle.

When I get grouchy about life -- which often happens when I'm sick, when I'm editing PowerPoint, or when gadgets cease to work (all of which have happened recently) -- a cup of tea and a little music often serves to restore my cheer. I'm warm; I have interesting work; and I have tea. I even have good books: I started last night on a collection of H.L. Mencken's short pieces, and they're great treats. This is my reward for myself tonight, too, once I finish the edits to this dratted poster.

Best get back to it.

  • Currently drinking: Jasmine pearl tea

Monday, January 24, 2005

The Best-Laid Plans

I planned to make a poster last night. Instead, I made vindaloo.
I planned to write an abstract last night. Instead, I wrote a letter.
I planned to edit two paragraphs. I wrote two new pages.
I planned to leave at 5:00 today. I only became aware of the time at 6:30.
I planned to ask a quick question. The answer took 90 minutes.
I planned to refill my pen. I ended up with ink on my fingers and nose.
I planned to eat lunch. I had a cup of coffee.

I think I should go home now. At least, that's the plan.

Sunday, January 23, 2005

Vince's visit

Vince visited last weekend, and he has now posted a few pictures from his trip.

  • Currently drinking: Coffee again

Credo

I missed quoting Einstein in my post-of-quotes. Fortunately, Mark Zimmermann beat me to it.

In Quotes

I've spent a lot of time recently immersed in books on spectral and pseudospectral methods and related topics (you can think of these methods as numerical tools based on generalizations of Fourier expansions -- and yes, that includes expansions in terms of Bessel functions!). There are technical reasons why I've been reading about -- and coding -- spectral methods recently, which I might write about at some other time. But right now, I want to mention a particular book, Chebyshev and Fourier Spectral Methods by J.P. Boyd.

Boyd's book is a gem. It's available as a PDF file from his web site; if you're interested in numerical ODE/PDE solvers or approximation theory, I recommend downloading it and skimming the table of contents. If you have only the vaguest interest in numerical mathematics, I still recommend that you download it and read the preface (to the first edition); I agree wholeheartedly with the sentiments expressed therein.
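
By way of illustration, here is a toy sketch of my own (not from Boyd's book): for a smooth periodic function sampled on a uniform grid, you can differentiate by transforming to Fourier coefficients, multiplying coefficient m by im, and transforming back. A few dozen points buy you nearly machine precision:

    import numpy as np

    # Toy Fourier spectral differentiation on a periodic grid.
    # Sample f(x) = exp(sin(x)) on [0, 2*pi) and compare the computed
    # derivative with the exact derivative cos(x)*exp(sin(x)).
    n = 32
    x = 2 * np.pi * np.arange(n) / n
    f = np.exp(np.sin(x))

    k = np.fft.fftfreq(n, d=1.0 / n)  # integer wavenumbers 0, 1, ..., -1
    df = np.real(np.fft.ifft(1j * k * np.fft.fft(f)))

    err = np.max(np.abs(df - np.cos(x) * np.exp(np.sin(x))))
    print("max error with %d points: %g" % (n, err))  # near machine epsilon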

One of the things I like about Boyd's book is the quotes that appear at the beginning of each chapter. For whatever reason, such introductory quotes are common in books (and some papers!) on numerics. Several of the quotes looked familiar, and upon a little poking around, I discovered that many were also chapter quotes in Men of Mathematics -- one of my all-time favorite books, as I've mentioned a time or twelve.

So I've decided to collect a few favorite quotes of my own. I've recently spent some time re-reading Poincare's popular works, so the selection is biased; other quotes are taken from Chebyshev and Fourier Spectral Methods (Boyd), The Character of Physical Law (Feynman), Men of Mathematics (Bell), Galileo's Commandment (ed. Bolles), German Essays on Science in the 20th Century (ed. Schirmacher), and The Force of Symmetry (Icke). I'd thought about organizing these quotes under different headings, but that would destroy half the fun.

And here let me insert a parenthesis to insist on the importance of written exercises. Compositions in writing are perhaps not given sufficient prominence in certain examinations. In the Ecole Polytechnique, for instance, I am told that the insistence on such compositions would close the door to very good pupils who know their subject and understand it very well, and yet are incapable of applying it in the smallest degree. I said just above that the word understand has several meanings. Such pupils only understand in the first sense of the word, and we have just seen that this is not sufficient to make either an engineer or a geometrician. Well, since we have to make a choice, I prefer to choose those who understand thoroughly.

-- H. Poincare

A mathematician who is not also something of a poet will never be a complete mathematician.

-- K. Weierstrass

In my opinion a mathematician, in so far as he is a mathematician, need not preoccupy himself with philosophy -- an opinion, moreover, which has been expressed by many philosophers.

-- H. Lebesgue

It is a safe rule to apply that, when a mathematical or philosophical author writes with a misty profundity, he is talking nonsense.

-- A. N. Whitehead

Six months in the lab can save you a day in the library.

-- A. Migliori

It is the increasingly pronounced tendency of modern analysis to substitute ideas for calculation; nevertheless there are certain branches of mathematics where calculation conserves its rights.

-- P.G.L. Dirichlet

Talk with M. Hermite: he never evokes a concrete image; yet you soon perceive that the most abstract entities are for him like living creatures.

-- H. Poincare

A scientist worthy of the name, above all a mathematician, experiences in his work the same impression as an artist; his pleasure is as great and of the same nature.

-- H. Poincare

History shows that those heads of empires who have encouraged the cultivation of mathematics, the common source of all the exact sciences, are also those whose reigns have been the most brilliant and whose glory is the most durable.

-- M. Chasles

Nothing requires a rarer intellectual heroism than willingness to see one's equation written out.

-- Santayana

He studied and nearly mastered the six books of Euclid since he was a member of Congress.
   He began a course of rigid mental discipline with the intent to improve his faculties, especially his powers of logic and language. Hence his fondness for Euclid, which he carried with him on the circuit till he could demonstrate with ease all the propositions in the six books; often studying far into the night, with a candle near his pillow, while his fellow-lawyers, half a dozen in a room, filled the air with interminable snoring.

-- A. Lincoln (Short Autobiography)

In the terminology which you graciously ascribe to me, we might say that the atmosphere is a musical instrument on which one can play many tunes. High notes are sound waves, low notes are long inertial [Rossby] waves, and nature is a musician more of the Beethoven than of Chopin type. He much prefers the low notes and only occasionally plays arpeggios in the treble and then only with a light hand. The oceans and the continents are the elephants in Saint-Saens' animal suite, marching in a slow cumbrous rhythm, one step every day or so. Of course there are overtones: sound waves, billow clouds (gravity waves), inertial oscillations, etc., but these are unimportant and are heard only at NYU and MIT.

-- J. Charney

Physics is beautiful. It makes me sad beyond words to know that so many people think of the physical sciences as barren, boring, bone-dry. Not so: when you lie outside in the grass on a clear dark night and look up at the stars, what you see is splendid. It is also physics. Understanding can lift you off the Earth, safer and faster and further than any rocket. The mind can travel among the stars, even enter them to see what causes those fires inside. To the beauty of seeing, we can add the beauty of understanding. And there is another level of beauty beyond that: the beauty of discovery, of creation, of doing physics. This beauty I love the most.

-- V. Icke

To summarize, I would use the words of Jeans, who said that the Great Architect seems to be a mathematician. To those who do not know mathematics, it is difficult to get across a real feeling as to the beauty, the deepest beauty, of nature. C.P. Snow talked about two cultures. I really think that those two cultures separate people who have and people who have not had this experience of understanding mathematics well enough to appreciate nature once.
   It is too bad that it has to be mathematics, and that mathematics is hard for some people. It is reputed -- I do not know if it is true -- that when one of the kings was trying to learn geometry from Euclid, he complained that it was difficult. And Euclid said, There is no royal road to geometry. And there is no royal road. Physicists cannot make a conversion to any other language. If you want to learn about nature, to appreciate nature, it is necessary to understand the language she speaks in. She offers her information only in the one form; we are not so unhumble as to demand that she change before we pay any attention.
   All the intellectual arguments that you can make will not communicate to deaf ears what the experience of music really is. In the same way all the intellectual arguments in the world will not convey an understanding of nature to those of the other culture. Philosophers may try to teach you by telling you qualitatively about nature. I am trying to describe her. But it is not getting across because it is impossible. Perhaps it is because their horizons are limited in this way that some people are able to imagine that the center of the universe is man.

-- R. Feynman

Play, art and science are the spheres of human activity where action and aim are not as a rule determined by the aims imposed by the necessities of life; and even in the exceptional instances where this is the case, the creative artist or the investigating scientist soon forgets this fact -- as indeed they must forget it if their work is to prosper.

-- E. Schrodinger

You have doubtless often been asked of what good are mathematics and whether these delicate constructions entirely mind-made are not artificial and born of our caprice.
   Among those who put this question I should make a distinction; practical people ask of us only the means of money-making. These merit no reply; rather would it be proper to ask of them what is the good of accumulating so much wealth and whether to get time to acquire it, we are to neglect art and science, which alone give us souls capable of enjoying it, and for life's sake to sacrifice all reasons for living.
   Besides, a science made solely in view of applications is impossible; truths are fecund only if bound together. If we devote ourselves solely to those truths whence we expect an immediate result, the intermediary links are wanting and there will no longer be a chain.

-- H. Poincare

Science knows only one commandment: contribute to science.

-- B. Brecht (The Life of Galileo)

It is only through science and art that civilization is of value. Some have wondered at the formula: science for its own sake, and yet it is as good as life for its own sake, if life is only misery; and even as happiness for its own sake, if we do not believe that all pleasures are of the same quality, if we do not wish to admit that the goal of civilization is to furnish alcohol to people who love to drink.

-- H. Poincare

Others will always ask themselves what use it is. They will not have understood, unless they find around them, in practice or in nature, the object of such and such a mathematical notion. Under each word they wish to put a sensible image; the definition must call up this image, and at each stage of the demonstration they must see it being transformed and evolved. On this condition only will they understand and retain what they have understood. These often deceive themselves: they do not listen to the reasoning, they look at the figures; they imagine that they have understood when they have only seen.

-- H. Poincare

This vain presumption of understanding everything can have no other basis than never understanding anything. For anyone who had experienced just once the perfect understanding of one single thing, and had truly tasted how knowledge is accomplished, would recognize that infinity of other truths of which he understands nothing.

-- Galileo (The Two Chief World Systems)

Would a naturalist imagine that he had an adequate knowledge of the elephant if he had never studied the animal except through a microscope?
   It is the same in mathematics. When a logician has resolved each demonstration into a host of elementary operations, all of them correct, he will not yet be in possession of the whole reality; that indefinable something that constitutes the unity of the demonstration will still escape him completely.
   What good is it to admire the mason's work in the edifices erected by great architects, if we cannot understand the general plan of the master? Now pure logic cannot give us this view of the whole; it is to intuition we must look for it.

-- H. Poincare

What is it about nature that lets this happen, that it is possible to guess from one part what the rest is going to do? That is an unscientific question; I do not know how to answer it, and therefore I am going to give an unscientific answer. I think it is because nature has a simplicity and therefore a great beauty.

-- R. Feynman

Must we therefore say that science should be abandoned, and morality alone be studied? Does anyone suppose that moralists themselves are entirely above reproach when they have come down from the pulpit?

-- H. Poincare

I have heard myself accused of being an opponent, an enemy of mathematics, which no one can value more highly than I, for it accomplishes the very thing whose achievement has been denied me.

-- Goethe

I do not know.

-- J.L. Lagrange

Thursday, January 20, 2005

Editors

I don't like using my web browser as an editor. That's not what it was designed to do. I switch back and forth between vi and Emacs for most of my editing tasks, though in the past I used other editors (e.g. Brief) more regularly. And, of course, I'm often happy to revert to pen and pad. So it isn't that I'm biased against using a different technology for writing -- it's just that my web browser isn't an editor.

I usually compose the pieces I post here on my laptop, using Emacs. To post, I copy and paste into the web form. I've tried using the e-mail interface, but it's spotty, and sometimes either drops posts completely or delays them by as much as a few days. There's also an Emacs mode to post to Blogger using the Blogger 1.0 XML-RPC interface (RPC is short for Remote Procedure Call; it's a way of requesting that a remote server do something on your behalf, such as creating a new post). Leaving aside my opinion of XML-RPC in general, the Blogger interface is going nowhere: Blogger is going forward with a different API called Atom which supports many more features (including things like titles), while much of the rest of the world is going with RSS. Either way, I have better things to do with my time right now than to hack together an Emacs LISP module so that I can have a newer-and-spiffier way to post directly from my editor.
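
For the curious, an XML-RPC call is short enough to show. Here is a sketch in Python -- the server address is a placeholder, and the blogger.newPost argument list is my recollection of the Blogger 1.0 interface, so treat the details as illustrative rather than gospel:

    import xmlrpc.client

    # Placeholder endpoint; a real Blogger 1.0 server publishes its own.
    server = xmlrpc.client.ServerProxy("https://blog.example.com/api/RPC2")

    # The proxy object turns this method call into an XML document, POSTs
    # it over HTTP, and decodes the reply -- that's all an RPC is.
    post_id = server.blogger.newPost("appkey", "blog-id", "user", "password",
                                     "Hello from my editor!", True)
    print("created post", post_id)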

I could, of course, use the web browser built into Emacs. Yes, I use an editor with a web browser built into it; I enjoy using the Emacs web browser almost as much as I enjoy using my web browser as an editor.

Wednesday, January 19, 2005

Teaching

I planned to be a TA this semester. I requested one of the theory courses -- algorithms, perhaps -- but those openings were filled by others. I was told that I'd either be assigned to the compilers course or to one of the courses in the introductory sequence (programming, data structures, and basic architecture). But that was not to be either. My advisor nixed the idea, and convinced me that he was right to do so. I'm trying to juggle too many things at once as it is. I will, however, be a TA once more (in the fall): Jim had me petition not to, but the petition was denied. So it goes.

I don't remember ever having a TA for a math course while I was at Maryland. For most of the courses I took, there wasn't even a grader.

I remember five graduate student instructors for undergraduate computer science courses. The TA for my programming language survey course was competent and pleasant, and had a deep understanding of the material. I believe he won an award for his teaching; he certainly deserved it. The graduate student who taught my first algorithms class -- it was a summer course -- was a Chinese gentleman with a tremendously thick accent. I could not tell the difference between his l and his r, nor between his m and his n. His handwriting was not much clearer than his speech, and I would have been totally lost if I hadn't already been familiar with most of the material he covered. He also wrote false statements semi-regularly, and after the first few corrections he took to calling me Mr. Counterexample. My robotics class was effectively taught by a graduate student; the TA for my operating systems course was a Russian fellow, a very sharp programmer who was almost comically abrupt with his fellow humans; and the TA for my network class left no lasting impression.

I've taught and tutored in the past in a variety of circumstances: martial arts, mathematics, and Matlab. But one of the things I thought was useful about being a TA for parallel computing a couple years ago was the type of feedback I received. Self-deception is easy, but the responses and performance of students provide objective data -- one just has to pay attention. I get easily enthusiastic about my subjects, and know something about a broad range of topics in mathematics and computer science. I've had good luck conveying information to people who are interested and who have adequate background to understand what I'm saying. But I've also bored those who really weren't interested, and snowed those who didn't really understand the required background as well as I thought they did. I've observed both the positive and the negative reactions when I've spoken with people one-on-one, but in the context of a class, the basic patterns were more obvious.

Okay, sometimes the patterns are pretty obvious even in less formal settings. I work with smart people, and sometimes those people astound me by saying something deep about a mutual interest -- and then claiming that they learned about it from me. Eh? On the other side, I've tried to explain things to people who never really got it because they didn't have the background I assumed. And I've certainly dealt with people who appeared to lack any sense of curiosity about the world at large, let alone the particular corner of the world that happened to fascinate me at the time. I know some people just aren't curious; but knowing and understanding are different things, and I find incuriosity terribly frustrating.

I like to explain things, to myself and to others. I'm glad to be in an environment where I can do just that.

Monday, January 17, 2005

Optical delusion

Ever get this feeling?

Markov chains

I said a while back that I would write something about Markov chains. I just finished a tricky debugging session, and I don't think I'll be good for much else for the next half hour; so here goes.

Remember Frogger? Suppose our hero, the intrepid frog, is exploring a major intersection. From any of the corners, median strips, or other safe havens, he can venture forth to explore one of the other havens. The frog rolls a die (badly weighted, since frogs are not good dice-makers) to tell which corner to hop to at each step. Supposing the frog does not get squashed, we might ask: what fraction of his time is the frog likely to spend at each corner?

More generally, we might have some graph that we're moving around on. Each vertex in the graph (each state) is connected to a few other vertices, and at every time step we move to one of the adjacent vertices or stay still based on a roll of the dice. The important thing is that whenever we're at node X, the probability that we'll move to another node Y must be fixed -- it can't change over time or with past history (the latter condition is more important than the former). Typical questions that we might then ask are: if we let the chain run for an arbitrarily long time, will the probability associated with landing on a particular node settle down? If so, how quickly?

Notice that I've changed something fundamental as soon as I start talking about probabilities. If I watch a foolhardy frog hopping around a major intersection, I'll only see the frog at one corner at a time -- barring gruesome accidents, of course. But if I look at the frog's behavior over long time spans, or if I look at the aggregate behavior of a bushel-basket full of frogs that I picked up at Ramses Fish Market and House of Plaques, then I can talk about a distribution over all the corners. That is, I've gone from a discrete value to something continuous.

Notice also that I've made a critical assumption, the one that makes a Markov chain a Markov chain: the transition probabilities do not depend on prior history. One of the profound things about physical systems is that, even for complicated systems, this is often a good assumption! It's almost a scientific article of faith that, underneath it all, the universe plays by a fixed set of rules. When a system does appear to have memory, it's often because we weren't looking closely enough to see some hidden variable. For instance, if I hit a ball with a bat, there's some probability (very small, in my case) that the bat will break. That probability changes over time, but that's not because of the passage of time per se; it's because of the accumulated damage to the bat. The probability that a bat with a particular amount of damage will break does not depend on time. The trick here, which applies in general, is that the memory of the system is encoded in additional variables. So I now have a bigger graph -- not just bat broken and bat not broken, but also bat slightly damaged (or moderately damaged, or heavily damaged) -- but on this larger graph, I can call the process memoryless.

The problem with adding variables to represent memory is that they make the number of possible states in the system much larger. At the logical extreme, you could call the entire history part of the state -- but then you'd have an infinite state space, which is inconvenient. The remarkable thing about the natural world is that the history of very complex systems can often be summarized with such a small number of variables. Human memories are not so easily summarized, which is part of why statistical mechanics produces very useful descriptions of nature, but the analogous statistical mechanics of human behavior (psychohistory in Isaac Asimov's Foundation books) is never likely to be seen outside science fiction.

Back to the matter at hand: Markov chains. If I have a finite state space, I can represent a Markov chain by a matrix, which I will call A. The A_ij entry (row i, column j) represents the probability of moving from state i to state j. If p_k is a row vector whose ith entry is the probability that I'm in state i at the kth step, then I can write the distribution at step k+1 as p_{k+1} = p_k A. The solution to this recurrence, starting at some initial probability distribution p_0, is p_k = p_0 A^k.
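
Here is the recurrence in a few lines of code (the transition matrix below is made up purely for illustration):

    import numpy as np

    # A made-up 3-state chain: A[i, j] is the probability of hopping
    # from state i to state j, so each row sums to 1.
    A = np.array([[0.5, 0.3, 0.2],
                  [0.1, 0.6, 0.3],
                  [0.2, 0.2, 0.6]])

    p = np.array([1.0, 0.0, 0.0])  # start in state 0 with certainty
    for k in range(50):
        p = p @ A                  # p_{k+1} = p_k A
    print(p)                       # approaches the stationary distribution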

What began as a question about randomly hopping frogs has turned into a question of linear algebra -- as many interesting statistical problems ultimately do. The trick to analyzing linear systems is to choose your variables right. If we work with the coordinates of the probability distribution directly, the ith entry of p_k represents the probability of ending up in state i at step k. If we ask instead how some combination of the entries of p_k behaves, then we can solve an easier problem. This is the second time we've relaxed the problem this way: the first time, we went from a discrete set (the frog is at the northeast corner of Fourth and Broadway) to a continuous set of probability distributions. Now we're taking things a step further and looking at coordinates that don't even represent being in a particular state! But by choosing those vectors right, we can (usually) turn a messy matrix equation into a bunch of one-dimensional equations -- which are easy to analyze.

Huzzah for the Jordan canonical form! No, that's not the way Michael Jordan throws a basketball.

At any rate, this sort of analysis tells you a number of things. In particular:

  • There is at least one stationary distribution: that is, a p_s such that p_s A = p_s. If every state in the transition graph can be reached from every other state, then there is only one such distribution.
  • If there is one stationary distribution, then in the long run (as k goes to infinity), there are only two things p_k can do. The first thing is that the chain can slosh back and forth between distributions. For instance, suppose I put a stone on a checkerboard and, at each step, allow myself to move up, down, left, or right by one square. If I start at a white square, then I'll always be on a white square every second move, and on black squares during the moves between. In the long run, I'll have two distributions for how often I visit each square on the board: one for how often I visit the white squares, and one for how often I visit the black squares. The second thing that p_k can do is to converge to p_s -- in my checkerboard example, this is what happens if at each step I allow some nonzero probability that the stone will stay where it is. A Markov chain which always converges to a single stationary distribution is called ergodic -- a simple enough idea with a fancy name.
  • If the chain is ergodic, we can ask how long it takes to get to the stationary distribution. The answer is that you can characterize the speed of convergence in terms of differences between eigenvalues of A (usually between the eigenvalue 1 and the eigenvalue which is second-largest in magnitude). Yet another reason for people to care about eigenvalue calculations! There's a small sketch of this after the list.
  • The finiteness of the chain is not only quantitatively important: it's also qualitatively important. If I allow an infinite number of states, then in general there does not need to be any stationary distribution! In a finite, ergodic Markov chain, the probability of landing in a particular state tends to diffuse from the initial distribution (this is not a false analogy: many diffusion equations can be derived from considering Markovian behavior of particles at the microscale). If the state space is infinite, then the probability can keep diffusing away forever, so that the chain visits ever more states with ever-decreasing probability. Much of what I've said generalizes to some continuous state spaces, but these Markov chains are characterized by a different sort of finiteness condition.
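
To continue the earlier sketch: the stationary distribution is a left eigenvector of A for the eigenvalue 1, and the second-largest eigenvalue magnitude governs the convergence rate.

    import numpy as np

    A = np.array([[0.5, 0.3, 0.2],
                  [0.1, 0.6, 0.3],
                  [0.2, 0.2, 0.6]])   # same made-up chain as before

    # Left eigenvectors of A are ordinary eigenvectors of A transpose.
    vals, vecs = np.linalg.eig(A.T)
    i = np.argmin(np.abs(vals - 1.0))
    ps = np.real(vecs[:, i])
    ps /= ps.sum()                    # scale to a probability distribution
    print("stationary distribution:", ps)

    # The gap between 1 and the second-largest eigenvalue magnitude
    # controls how fast p_k approaches p_s.
    second = sorted(np.abs(vals))[-2]
    print("error decays roughly like %g**k" % second)
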
  • Currently drinking: Gen mai cha

Mathematical styles

Jim asked me this weekend How many types of mathematics are there, anyhow? It's a good question, and it took me a few seconds to say anything at all. I rambled on for a bit about the AMS classification of mathematical topics, and that provided us both with a few minutes of amusement, but I kept thinking about the topic afterward. I'd thought a little about the question earlier in the week, when I took a little time to put my office bookshelves in order. The divisions between branches of mathematics are rather artificial, of course, and I suspect that most of the mathematicians I admire would feel pretty strongly that at least a few divisions are spurious, historical accidents at best and divisive political nonsense at worst. Nevertheless, there are definitely different traditions and different modes of thought within mathematics, and I have some thoughts about what defines them.

I will organize my thoughts according to how I organized my bookshelf:

  1. Computer science

    I already have written that I don't think computer science is properly a scientific discipline -- at least, it isn't mostly. It also isn't entirely a mathematical discipline. Nevertheless, computer science as a discipline includes a strong mathematical component that isn't emphasized so strongly elsewhere. In particular, computer scientists pay a great deal of attention to graph theory, certain areas of combinatorics, automata theory, and both the very pragmatic and the very abstruse characteristics of algorithm performance.

    As mathematicians, computer scientists tend to be strongly combinatorial and algebraic in their thinking. Many of us have some grounding in probability and statistics as well, but it's often restricted to a discrete setting. There are computer scientists who spend time thinking about continuous problems, too: scientific computing folks (like me), graphics and vision folks, and some others. But it's not the most common strength.

    My computer science shelves are organized into books on languages and tools; on core CS topics; and on software engineering. The algorithms and data structures books, the cryptography book, and parts of the compiler book could be considered mostly mathematics.

  2. Mathematical physics

    My physics shelf includes books on general physics, classical mechanics, solid and fluid mechanics, electrodynamics, and a little quantum and statistical mechanics. It also includes a couple books that I nearly classified in geometry -- a closely related discipline.

    Mathematical physics is a frustrating misnomer. Elementary books on mathematical physics rarely include much of real physical interest; rather, they include yet another rehash of the standard linear differential equations, some special functions, Fourier analysis, and perhaps elements of the calculus of variations. They could be re-titled as texts in advanced engineering mathematics, and nobody would be the wiser -- except that most engineers write dx at the end of an integral, and many physicists like to write it at the beginning. There are more advanced books that essentially follow the same line of development, but with more sophistication -- and usually without the name mathematical physics in the title. Suffice it to say that some of the mechanics books on my shelf are full of very physical insights, backed up by occasional calculations; and some of them are mathematics books that happen to have a lot of examples drawn from physics.

  3. Linear algebra

    You knew this was coming, right? Linear algebra is a curious area, partly because it's fundamental to so many disciplines: analysis, statistics, optimization, geometry, algebra, ... The influence goes both ways, too. Despite the name algebra, researchers in linear algebra often spend as much time on analytic work as they do on algebraic work.

    Numerical linear algebra is one sub-discipline of linear algebra. Matrix theory is another (as I've indicated elsewhere, matrices have structure beyond what they inherit as representatives of operators on linear spaces). To some extent, functional analysis is a branch of linear algebra, too, but I've classified it elsewhere for the purposes of shelving.

  4. Integral/differential equations and analysis

    If half the proofs start with the words for all epsilon, there exists..., I'll probably call it analysis. Analysts deal primarily in estimates, inequalities, and notions of convergence. Real and complex analysis are fields that deal primarily with the rigorous development of the real and complex fields, and with different types of real and complex functions and their basic properties. Functional analysis deals with spaces of functions, and is as much a branch of linear algebra as of analysis. The theory of ordinary and partial differential equations is hard analysis, tied closely to the field of functional analysis. I also classify my books on the numerical solution of ODEs and PDEs on this shelf, partly because even the most hard-headedly practical books on numerical solution of ODEs and PDEs will include some discussion of elementary functional analysis -- at least, they will if they're any good.

  5. Geometry and nonlinear systems

    When I speak of geometry, I'm not talking about Euclid. The geometric objects that most interest me tend to come from physics, either classical or modern. They represent the surfaces of objects, or sets of solutions to some interesting equations (e.g. sets of configurations with a given energy), or whatnot. Nonlinear systems can almost never be solved in closed form, but they can be analyzed; and investigating the geometry of their solutions, together with certain analytic properties (particularly in the neighborhood of singularities), is the clearest route to understanding what most nonlinear systems will do.

    Poincare wrote about two basic types of mathematical minds: the geometer and the analyst, he called them, referring to those who proceed by intuition and those who proceed more methodically. At the end of the chapter where he set up this dichotomy, he admitted that it wasn't perfect, but stated that he still thought it was a real division -- and I agree. If I see a proof that involves global qualitative properties of a solution set, I'll probably call it geometry. If I see a proof that involves detailed estimation based on local quantities, I'll probably call it analysis. That's not a very good description of the difference between the fields; maybe it would be more accurate to follow Poincare's description, and say there are those who primarily launch out intuitively, and there are those who insist on the more solid ground of carefully-reasoned proofs. In Poincare's classification, I think I'm more of an analyst than a geometer.

  6. Applied mathematics and statistics

    This is a catch-all category. Some of my statistics books are really about applications; some of them are really about hard analysis. The books about applied mathematics are... eclectic. This is fine by me, of course, but it does make them hard to classify. I include in this category my books on perturbation methods.

    Most of the books in this shelf involve the analysis of problems from some discipline outside of mathematics, be it operations research, meteorology, physics, engineering, or something else entirely. When done well, a paper or book on a topic in applied mathematics should be interesting both because the problem is interesting (to someone other than a mathematician) and because the problem generates interesting mathematics.

    My books on special functions go roughly under this heading, too, though space constraints forced me to put them next to my library books instead.

  7. Nonlinear equation solving, optimization, and numerics

    This is another catch-all classification, basically containing all the numerical mathematics books I have that deal with finite-dimensional phenomena and are not devoted exclusively to numerical linear algebra. Let me mention the book Numerical Methods That Work by Forman Acton: the front cover has the word Usually embossed below the title, and the book is full of cautions which boil down to the command to think before you compute. I agree wholeheartedly.

    Numerical analysis is concerned with the design of algorithms to compute approximations (usually) which are both accurate and inexpensive. It is an art that combines the estimates of mathematical analysis, the algorithmic thinking of computer science, the trickery of applied mathematics, and a certain amount of healthy skepticism.

  8. Algebra

    Algebra involves the study of a wide variety of types of formal structures and the functions that preserve those structures (homomorphisms). Unlike analysis, most of the proofs of algebra do not involve estimation. Algebra is not my strongest subject, though I have the standard background that any math grad student might have (a graduate algebra course, a little number theory, and a fervent appreciation of whatever group-theoretic or ring-theoretic properties might help me better understand algorithmic or physical properties of systems I work with more regularly).

Sunday, January 16, 2005

Bumper sticker

I saw a bumper sticker yesterday that tickled me. In bright red letters, it said:

If these words look blue, you're driving too fast.

Wednesday, January 12, 2005

Peppers and Olives

Stuffed peppers are easy to make, they keep well, and they taste great. What's not to like? One can of refried beans, one can of tomatoes, and a bit of cheese fills six bell pepper halves. Bake at 300 or 350 for 20-30 minutes; I start a pot of rice when I put the peppers in, and take the peppers out when the rice is done. Leftovers for lunch today!

Elena got a jar of olives stuffed with spicy pimientos for the holidays, and when I got back she said please help eat them! I'm trying to keep myself to one at a time, but it's a challenge. Yum!

Quibbling with JoS

I was going to follow up yesterday's comments on matrix representations of graphs and linear operators, with a discussion of Markov chains. I still might. But I think this evening I'll write about something different.

I've already linked to Joel on Software. I think I wrote that I liked the book with the same title (the subtitle is And on Diverse and Occasionally Related Matters that Will Prove of Interest to Software Developers, Designers, and Managers, and to Those Who, Whether By Good Fortune or Ill Luck, Work with Them in Some Capacity). One of the things I like about it, of course, is that the author had the good sense to agree with me on so many issues. I also like his writing style. A few days ago I was happy to see a new article on the JoS site. As usual, I was entertained, and found myself agreeing -- mostly. You might guess where our opinions differ from this snippet:

The trouble is, we don't really have professional schools in software development, so if you want to be a programmer, you probably majored in computer science. Which is a fine subject to major in, but it's a different subject than software development.

If you're lucky, though, you can find lots of programming-intensive courses in the CS department, just like you can find lots of courses in the History department where you'll write enough to learn how to write. And those are the best classes to take. If you love programming, don't feel bad if you don't understand the point of those courses in lambda calculus or linear algebra where you never touch a computer. Look for the 400-level courses with Practicum in the name. This is an attempt to hide a useful (shudder) course from the Liberal Artsy Fartsy Administration by dolling it up with a Latin name.

He's right that software development isn't the same as computer science (and neither software development nor computer science is the same as helping an acquaintance troubleshoot Windows problems, regardless of what the world might think). Software development isn't linear algebra, either, except in a few cases. Software development also isn't reading, writing, arithmetic, or the scientific method. But if you're a software developer and you never learned to read efficiently, to write clearly, to estimate intelligently, or to construct an experiment, then you didn't get as much out of your education as you should have. Of course, you're unlikely to get very far in the most programming-intensive courses I know about -- courses in compilers, operating systems, and computer networks -- without drawing on some fairly sophisticated mathematics. Actually, a compiler course is a really good example of what I mean. Writing a compiler involves a lot of programming, but programming competence is only the start. To write a compiler, you need to know all sorts of things about regular languages and finite state machines, various classes of more complicated grammars and corresponding push-down automata, recursion, graph theory, complexity theory, and some basic logic. For that matter, it's awfully useful to know a little about -- you guessed it -- lambda calculus. None of these topics is any less abstract than, say, a similar list of topics drawn from a first course in linear algebra.
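
To make the connection concrete: the first stage of a compiler, the lexer, is a finite state machine in thin disguise. Here's a deliberately tiny sketch of my own (not from any real compiler) -- a three-state automaton that recognizes integer tokens:

    # A tiny DFA that accepts nonempty strings of decimal digits -- the
    # sort of machine a lexer runs to recognize an integer token.
    TRANSITIONS = {
        ('start', True): 'digits',   # first digit seen
        ('start', False): 'reject',
        ('digits', True): 'digits',  # stay as long as digits keep coming
        ('digits', False): 'reject',
        ('reject', True): 'reject',  # once rejected, always rejected
        ('reject', False): 'reject',
    }

    def accepts_integer(s):
        state = 'start'
        for ch in s:
            state = TRANSITIONS[(state, ch.isdigit())]
        return state == 'digits'

    print(accepts_integer("12345"))  # True
    print(accepts_integer("12a45"))  # False
    print(accepts_integer(""))       # False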

Of course, most of the programmers in the world don't write compilers, any more than most of the programmers in the world write numerical linear algebra software. And, much as I might grouse about it, I do understand that lots of people will program for a living without ever understanding linear algebra. Hey, you can write some pretty interesting programs without really understanding recursion (though a programmer who has never heard of recursion is sort of like a mathematician who has never heard of Gauss -- you have to wonder about his education, and you might think twice before hiring him). The people who really bother me are not those who don't understand the point: it's those who don't understand that there is a point or, worse yet, those who scoff at the notion that something they don't know about might be of practical import. This attitude reminds me of the old quote that if the King's English was good enough for Jesus, it's good enough for me.

Judging from his writing, though, I guess Joel would probably agree with me (maybe not about linear algebra). If you're one of the handful of people reading this, I suspect you agree, too.

Tuesday, January 11, 2005

Freud meets Turing

What happens when a documentary film-maker takes the Parry program too seriously? The last paragraph is particularly choice, since the author basically describes running a system to solve problems which are not solvable by Turing machines -- on something which is Turing-equivalent.

Sadly, this is not a joke.

Graphs and matrices

A graph is a set of vertices connected to each other by edges. The vertices and edges can represent all sorts of things: cities and roads, transistors and wires, computers and network cables, web pages and links, neighborhood gossips and conversations, people and friendships, tasks and scheduling dependencies, blocks and lines in a flow chart, or any of a number of other things. If the edges go one way (one web page links to another, but not vice-versa), the graph is called directed; if the edges have numbers associated with them (distances between cities, say), the graph is called weighted.

Graph theory is one of the main mathematical tools in computer science. Graphs have lots of applications; they're relatively easy to explain; and they can be represented in an utterly intuitive way by a sort of connect-the-dots scribbling that most of us learn to do as youngsters. Is it any wonder computer scientists love them?

There are other ways to represent graphs which -- though they lack the charm of connect-the-dots -- are easier for a computer to manage. They are also, perhaps, more dignified. One of these representations is called an adjacency matrix. For a graph with a finite number of vertices (the usual case), we can number each vertex in turn; then if vertex i is connected to vertex j, we put a nonzero in the ij entry (the entry at row i, column j) of the adjacency matrix.
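
In code, the translation from connect-the-dots to matrix takes only a few lines. A sketch, with a made-up four-vertex graph:

    import numpy as np

    # A made-up directed graph on four vertices, given as a list of edges.
    edges = [(0, 1), (0, 2), (1, 2), (2, 0), (3, 2)]

    n = 4
    A = np.zeros((n, n))
    for i, j in edges:
        A[i, j] = 1      # nonzero in row i, column j: an edge from i to j

    print(A)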

So: a square matrix can represent a graph. It can also represent a linear operator acting on a finite-dimensional vector space. You might ask if weighted graphs and linear operators aren't really the same thing in some sense; the answer is that they really are different, though there is a close connection. Why are they different? If I label the major cities by numbers, it doesn't make much difference if I call San Francisco 1 and New York 2 or vice-versa (except, perhaps, to some residents and sports fans in said cities). But I can't book a flight between the sum of San Francisco and New York and the difference of the two cities, with a stop-over at pi times Chicago. Vertices aren't vectors, even though we can represent them as such; and it takes a certain amount of imagination to go back and forth between the two.

The difference between an adjacency matrix representation of a graph and a matrix representation of a linear operator is really in the legitimate isomorphisms that can be applied to each. I can represent a linear operator by many different matrices, depending on a choice of basis (do I walk a mile north and a mile east, or sqrt(2) miles to the northeast?): the different matrices that can represent the same operator are called similar. I can also represent a graph by many different adjacency matrices, because I can permute the labels of the vertices. But there are far fewer ways for me to rearrange labels than there are for me to change bases in general. That is, any two adjacency matrices representing the same graph are similar -- but just because two matrices are similar doesn't mean they represent the same graph.

This sort of non-obvious connection is one of the main reasons why mathematics is so useful. Seemingly disparate areas of inquiry connect together, and suddenly you can turn information about the connections between web pages into page rankings -- by turning a problem in graph theory into a problem in linear algebra (an eigenvalue problem, to be specific).
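
A sketch of that last idea, stripped of all the refinements a real search engine needs (the graph is made up; the damping constant 0.85 is the one usually quoted for PageRank):

    import numpy as np

    # Adjacency matrix of a made-up four-page web; normalize each row to
    # get the transition matrix for a surfer clicking links at random.
    A = np.array([[0., 1., 1., 0.],
                  [0., 0., 1., 0.],
                  [1., 0., 0., 0.],
                  [0., 0., 1., 0.]])
    P = A / A.sum(axis=1, keepdims=True)

    rank = np.ones(4) / 4
    for _ in range(100):
        # Damping keeps the chain ergodic even for badly-behaved graphs.
        rank = 0.85 * (rank @ P) + 0.15 / 4
    print(rank)          # bigger entries = more "important" pages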

Monday, January 10, 2005

Equivalence and weirdness

As far as mathematical notions go, the idea of a relation is a basic one. Formally, a relation on a set S is a subset of the Cartesian product S × S. If the pair (s,t) is in this subset, we say that s is related to t, and write s ~ t.

An equivalence relation has the following properties:

  • Reflexivity: for all s, s ~ s.
  • Symmetry: if s ~ t, then t ~ s.
  • Transitivity: if r ~ s and s ~ t, then r ~ t.

An equivalence relation cuts up the set S into disjoint subsets, or equivalence classes. Two elements are in the same equivalence class if and only if they are related. This works both ways: for any partitioning of S into disjoint subsets, there is a corresponding equivalence relation. Mathematicians usually use the word modulo (or just mod) to describe this partitioning into equivalence classes: for example, if ~ is an equivalence relation and s ~ t, I might say that s and t are equivalent modulo the relation.
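
Before the weirdness below, a thoroughly tame example: take S = {0, 1, ..., 9} and say s ~ t when s - t is divisible by 3. The classes are just the residues mod 3:

    # Partition {0, ..., 9} into equivalence classes under the relation
    # s ~ t iff s - t is divisible by 3 (the integers mod 3).
    classes = {}
    for s in range(10):
        classes.setdefault(s % 3, []).append(s)

    for rep, members in sorted(classes.items()):
        print(rep, members)
    # Prints:
    # 0 [0, 3, 6, 9]
    # 1 [1, 4, 7]
    # 2 [2, 5, 8]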

Now, I think I know most of the people who read what I write here, and most of you know more than the average bear about mathematics: basic calculus, probability, linear algebra, discrete mathematics, and some programming for most of you (a couple of you might not have had formal courses in all those topics, but you've probably had some exposure). So why do I drag out a definition which is probably familiar to everyone who has actually read this far? Because equivalence relations really are ubiquitous -- and in some cases they correspond to really strange equivalence classes.

Let me give an example: the example, in fact, that led me to thinking about this topic right now. It involves the rational numbers (which are defined in terms of equivalence classes among pairs of integers) and real numbers (which may be defined in terms of equivalence classes among sequences of rational numbers). Let's say x ~ y if x-y is a rational number. We would usually write the equivalence classes under this relation as R / Q (the reals mod the rationals). Now, what if I make a set X by taking one member in the interval [0,1] from each equivalence class? The answer is that I get something weird. In fact, I've pulled a fast one in even describing the set. Taking one member from each equivalence class sounds like a natural enough operation -- probably more natural than dividing up the real numbers this way to start with -- but there are an uncountably infinite number of equivalence classes (R is a lot bigger than Q in a well-defined sense, but I don't want to get distracted by a discussion of power sets and cardinality, so I'll leave it at that). Anyhow, the ability to make this choice in each equivalence class is not granted to me as a consequence of some theorem: it is an axiom of its own, called, appropriately enough, the Axiom of Choice. Some logicians find the Axiom of Choice distasteful, but most of us think it's convenient to assume it anyhow, as I will do here.

Now, R/Q has some structure -- it has a well-defined addition operation, though the fact that 0 ~ 1 messes up the multiplicative structure. But X is a truly awful, wonderful, bizarre set: to paraphrase a line from a book I read recently, Most sets have some redeeming virtue; this one is something special, as it has none. Suppose I define some probability distribution on the real line, and ask for the probability I'll draw an element of X. In general, I can't get a consistent answer to the question! If you don't find this disturbing, think about it some more: it's as though I'd told you that you could throw darts at the interval [0,1], and I could arrange the target so that the probability of your hitting it couldn't even be defined. In the formal development of measure theory -- which is the real underpinning of probability theory -- the first thing we usually do is deliberately discard pathological sets like X, which are called unmeasurable. If I recall correctly, the existence of (Lebesgue) unmeasurable sets is a consequence of the Axiom of Choice; that is, any other way I found to build an unmeasurable set would have to invoke the Axiom of Choice somewhere. Unless you have special training, you probably won't be able to find an unmeasurable set; and even if you could, your intuition for it would fail. The strangeness of X makes it a great test case for sanity-checking statements about sets of real numbers.

Oh, brave new world that has such constructions in it!

Expect further invocations of the definition of an equivalence relation in the coming week or two, as I seem to be firmly planted in random lecture mode.

Sunday, January 09, 2005

Food again

I've been trying to read about duality for the past forty minutes, but I keep getting distracted by thoughts of food. Winnie and I made dinner last night, which was very good and quite filling -- but probably not so filling that I should have skipped lunch. Oops.

Before I go find something to eat, a recap:

  • Black currant tea

    The black currant blend from Peet's is delicious, probably the best of that type that I've had.

  • Salsa-spinach scramble

    Salsa: Saute an onion. Add tomato, garlic, jalapeno, and green peppers to taste. Cook it all together until it smells good. Mix a couple eggs, some milk, salt, basil, and plenty of chopped spinach. Scramble it all together. Add the salsa and serve.

  • Sweet bread

    1-2 cups of warm liquid (I used a cup of water and one of milk)
    A dollop of sugar (1-2 tablespoons)
    A dollop of salt (1-2 teaspoons, maybe?)
    A dollop of oil or butter (1-2 tablespoons or teaspoons)
    A package of yeast
    Flour as needed
    Raisins, walnuts, and honey (or jam or other sweet spreads)

    Mix liquid, sugar, and yeast. Let the yeast get started while you hunt up everything else and dump it in a bowl (start with a couple cups of flour). Pour in the liquid and knead in flour until it quits sticking to things.

Roll the dough into a (thick) sheet before you start letting it rise. Scatter the raisins and walnuts over the sheet, then spread on the honey (or jam). Before you bake, roll everything together. If you're very skilled, you'll manage to roll things up so that you simultaneously have a little bit of filling in every slice but the very heels and have no places where the honey can bubble forth, lava-like, to produce a caramelized mess on the baking sheet. I'm not so skilled, and usually end up with both a caramelized mess and a few slices of plain bread. Since both plain bread and caramelized messes are good to eat, this is not a problem.

Bake about half an hour at 350 degrees Fahrenheit.

Computer science?

Computer science is misnamed. It is not primarily a scientific discipline; though computer scientists do sometimes apply the scientific method, experimental training is not a fundamental component of the usual CS undergraduate curriculum. However, the scientific method does play a role in computer science, and I think it's enlightening to consider that role.

First, the scientific method applies to topics purely within computer science because the systems we build become too complicated to analyze directly. In principle, I could sit down with a program and work out precisely how long the program would take to solve a particular problem on a particular machine (assuming that I can determine in advance that the program will terminate). It might take me a long time, but all the rules are known, and they're all deterministic. But I usually would not do such a thing, because the low-level details are too complicated for convenient reasoning. I don't want to have to analyze all the possible implications of the organization of the memory, the pipeline, the bus, and so forth -- even though those organizational details can make an orders-of-magnitude difference in the performance of my program. So I use a crude mental model that captures the fundamental features of the machine, and then run computational experiments to convince myself that my mental model, though incomplete, includes enough detail to accurately predict how my program runs.
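
As a toy illustration of this model-plus-experiment loop (my own sketch, in Python; nothing below is specific to any particular machine): model the built-in sort as taking time proportional to n log n, predict how the running time should scale as n doubles, and check the prediction by measurement.

    import math
    import random
    import time

    def median_sort_time(n, trials=5):
        """Median wall-clock time to sort a random list of length n."""
        times = []
        for _ in range(trials):
            data = [random.random() for _ in range(n)]
            start = time.perf_counter()
            data.sort()
            times.append(time.perf_counter() - start)
        return sorted(times)[len(times) // 2]

    prev = None
    for n in [100_000, 200_000, 400_000, 800_000]:
        t = median_sort_time(n)
        if prev is not None:
            # Crude model: t(n) = c * n * log(n), so doubling n should
            # scale the time by 2 * log(n) / log(n/2) -- a bit over 2.
            predicted = 2 * math.log(n) / math.log(n // 2)
            print(f"n = {n:>7}: {t:.4f} s, "
                  f"observed ratio {t / prev:.2f}, predicted {predicted:.2f}")
        prev = t

If the measured ratios drift far from the prediction -- because the lists outgrow a cache, say -- that is the experiment telling me my crude model has left out a detail that matters.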

Perhaps the most common use of the scientific method to understand the behavior of a computer program is in debugging. If a program has a bug, that's usually a sign that the programmer's mental model was flawed. To uncover the flaw, the programmer will run a sequence of experiments, guided by knowledge of the consequences of his mental model -- his hypotheses -- until he finds the point where the actual behavior deviates from the behavior he expects. At that point, he has the data he needs to fix his mental model, and from there to fix the program. Savvy programmers work hard to make sure that these tests are easy to run, often even including explicit tests of their hypotheses -- called assertions -- as part of the program text.
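
A minimal sketch of the idea (the routine below is invented purely for illustration): the assertion encodes a hypothesis about an invariant, and a failing run points at the exact place where the mental model and the actual behavior part company.

    def running_means(xs):
        """Return a list whose entry k is the mean of xs[0], ..., xs[k]."""
        means = []
        total = 0.0
        for k, x in enumerate(xs):
            total += x
            means.append(total / (k + 1))
            # Hypothesis: every running mean lies between the smallest
            # and largest values seen so far. If this assertion ever
            # fires, my mental model of the loop is wrong upstream.
            assert min(xs[:k + 1]) <= means[-1] <= max(xs[:k + 1])
        return means

    print(running_means([3.0, 1.0, 2.0]))  # [3.0, 2.0, 2.0]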

Second, the scientific method applies at the intersection of computer science with other disciplines, where a precisely detailed model is not only inconvenient but also unknown. For instance, computer-human interface design must be partly experimental, because we don't really understand how people work. In studying performance trade-offs in the design of a system, there must be some set of sample workloads which one thinks are representative -- usually based on a set of actual workloads -- but the system designer usually doesn't really know enough to definitively describe the typical workload characteristics. And, of course, a large part of computational science involves building a program to make a prediction, and then checking the prediction against reality in order to test the model behind the program.

In the way in which the scientific method is employed, there are similarities between computer science and mathematics. Many theorems start life as conjectures, created out of an incomplete mental model in the mind of the mathematician. The conjecture is tested against various examples that provide insight -- and is modified or rejected if it predicts the wrong thing. With luck, the conjecture will be right, and the examples will provide the insight necessary to devise a proof. When a physical system is described in mathematical terms, the observed behavior of the system may be a source of conjectures which can be turned into theorems; or the mathematical description may be the source of theorems which contradict experimental observation, thus leading to a modification of the basic description.
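
A classic toy instance of this conjecture-and-test cycle (Euler's prime-generating polynomial; the Python below is my own illustration): the quantity n^2 + n + 41 is prime for every small n you try, which suggests a conjecture -- and a short experiment finds exactly where the conjecture breaks down.

    def is_prime(m):
        """Trial division; plenty fast for numbers this small."""
        if m < 2:
            return False
        d = 2
        while d * d <= m:
            if m % d == 0:
                return False
            d += 1
        return True

    # Test the conjecture "n*n + n + 41 is prime for every n >= 0".
    for n in range(100):
        value = n * n + n + 41
        if not is_prime(value):
            print(f"Counterexample at n = {n}: {value} = 41 * {value // 41}")
            break

The failure at n = 40 (where the polynomial yields 41 squared) is exactly the sort of data that forces a modified conjecture -- say, one restricted to n < 40.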

So if computer science is not primarily a scientific discipline, what is it? It's partly engineering, partly mathematics, and partly art...

And it's a lot of fun.

Death at the Supermarket

He's dying, you know.

Yes, you do know. You could tell from the alarmed look on his face, the way his right hand flew to his chest as he started to lean forward. He'll topple any moment now, but the people around don't even seem to notice. He's having a heart attack, there in front of you, and all you can do is watch, appalled, fascinated. Look, even the clerk tallying the cost of his groceries hasn't noticed -- he's looking up the code for chard.

He walked to the store to buy chicken stock, chard, yeast, and apples. It's cold outside, and he felt like something warm and comforting for dinner: chard soup and bread with a baked apple for dessert. The apple is important. He eats two apples every day, one with his lunch and one with his dinner. He's maintained this habit for years, maybe decades now, without fail. But some days he's creative about how he eats his apples: he'll have a little apple sauce, or he'll bake an apple, or fry it with a little butter and bread it in a batter laced with sugar and cinnamon. This evening he decided to bake the apple. Now he'll never get the chance.

If you were behind him in line, you could save him. You're sure of it. But you're stuck. The man in front of you is slowly filling out a check for his groceries, and the woman behind you is trying to wrestle an improbably large bag of cat litter onto the conveyor belt. The bag has a larger-than-life photo of a cat on it, looking smug. You think the cat looks smug because it just put the litter to good use, but cat people know better. That cat only has four facial expressions: asleep, surprised, stalking, and smug. The woman bought that litter despite the advertising, not because of it; she knows the cat's owner uses a different brand of litter altogether, a more expensive brand with a special sort of deodorant. The bag of litter now stubbornly stuck in the woman's cart -- she's nearly lifting the cart now, the bag is so tightly jammed -- that bag of litter is a generic brand with no top-secret deodorants or scientifically determined grain sizes. It works, and she's partial to it despite the smug-looking cat picture.

You cannot go back; you cannot go forward. Could you get to the man, who is now bent nearly double, if you vaulted over the magazine rack? You look at the rack; staring back at you there is an old man with gardening tools; a woman in an impossibly tight dress, an actress whose name you've forgotten; and a cat who must be related to the one on the litter bag. You'll never make it that way, and you know it; if you try, the celebrities on the cover of the Enquirer will look up and mock you. Now who's the fool, they'll say, me with my alien love child, or you with the leg you broke as you tried to save a dying man in the next lane?

You're stuck. You can't get to him. All you can do is stand, and watch, and maybe pray. You know that he went to a seminary, but left after he met the woman he was eventually to marry. He'd appreciate your prayers, if he were in any state to appreciate anything. So would his wife, for that matter. She's waiting for him to come home and make dinner now; she's hungry, and she's starting to get grouchy. In an hour, she'll be furious. He's a little absent-minded, and has gotten lost on the way home from the store before. Even after years in their home, just two blocks away from the store, he still sometimes gets confused. In two hours, she'll be worried, because he's never gotten that lost before.

So you watch, and you think that in those last moments you can see something of this man's history and character. His life flashes before your eyes. As a child, he wanted to be a physicist -- no, a dentist -- no, a fireman -- but then he got a bachelor's degree in history, and from there he went straight to the seminary. He could not tell you today why he thought he would be a good priest, but at one time he was convinced of it.

The man sighs a little and falls to his knees. Then he stands. In his hand he holds the keys to the car; he puts them into his pocket, thanks the clerk, and walks briskly from the store, carrying the chard and the yeast and the chicken stock. The clerk starts scanning the next customer's groceries, beginning with the bag of apples.

Friday, January 07, 2005

Bread

I baked bread last night. It's the first time I've made bread since I moved, and it was a nice change of pace. I used entirely whole wheat flour -- usually I'd mix in some white flour, but I couldn't find the canister -- and a short rise time, and so the loaf is denser than what I usually bake. The density is somewhere between that of an ordinary loaf and a fruitcake.

Thursday, January 06, 2005

Browsing the electric shelves

This is even better than JSTOR. At least for me.

Of course, what will I do with the time that I save by having electronic access to old SIAM journals (and by no longer fighting with the math library photocopiers)? Naturally, I'll spend it finding useful things about the history of Parker pens [1, 2, 3]; then about the history of pens more generally [4, 5]; and then about the madcap adventures of a Hungarian who, among other things, spent time as a door-to-door fountain pen salesman [6].

I've been curious about the history of writing utensils at least since high school; I remember picking up this book from the Bel Air branch library many years back -- it was not only an interesting book, but also my first introduction to Petroski's writing. So I cannot claim that, were it not for the network, I wouldn't from time to time hare off after information about writing utensils, growing conditions for Camellia sinensis, or the most cost-effective way to make table salt. But a computerized card catalogue -- and Google -- does make such quests a lot easier.

Wednesday, January 05, 2005

Poems and proofs

The December issue of SIAM News (which unfortunately won't be online for a few more weeks) includes the article Bursts of Poetic Creativity Enliven Two Mathematical Careers. One of the quotes (of S.J.P. Hillion) tickled me:

Every so often we'd come up with something that particularly pleased us, and then we'd dwell on that phrase for hours or days, sometimes almost meditating on it over dinner at Chez Panisse or La Val's, as if we'd just discovered a profound mathematical truth.

I'm not sure which I appreciated more: the sentiment expressed, or the juxtaposition of Chez Panisse (Alice Waters's expensive, famous restaurant in Berkeley's gourmet ghetto) and La Val's (a cheap, though good, local Mexican place -- I often go to the one just around the corner from Soda Hall).

While you're waiting for the December issue to come online, check out the November book review, Mumbo Math.

Wednesday, Wednesday

The first time I took a plane was late 1998 (I think it was November), and since then I've had consistently good luck. I've been delayed once or twice, and occasionally I spend a few hundred miles with a small child bawling and kicking me in the back through the seat. I've only once had luggage lost or delayed, though that's not such a good record given that I make an effort to travel light, and I've only checked luggage twice. Yesterday's trip was more eventful than most: the first leg was bumpy, and due to freezing rain at the airport, we didn't make it to the intended destination of Kansas City. The plane landed in Chicago, and after a couple hours in line I was given a ticket for an alternate flight. I was in a middle seat, but the adjacent seats were filled by reasonably polite adults, and I was able to take a nap. The plane got in on time, the bus ride to the BART station was unusually prompt and pleasant, and the BART ride home was like any other BART ride. I got in around 11:30.

On longer flights, Southwest gives out snack packs -- some crackers and cookies and chewy candies -- rather than just the roasted peanuts. I think these packages are delightful, not so much because of the food as because of the packaging. This time, both the crackers and the cookies were packaged in air-tight wrappers strong enough not to fail when the plane went up. The pressure differential made them look for all the world like air-filled packaging material, and I was very much tempted to see if I could pop them. As the airplane was crowded -- and as I didn't really want to shatter my snack -- I resisted temptation.

Long plane rides are a great time to ponder questions about the strength of the joins in a piece of food packaging, to mentally estimate the altitude at which a bag of Oreos would pop (supposing it was sealed at sea level), or to work out what the terminal velocity of said bag would be. It's also a good time to ponder the recent increases in laptop battery life, and the etiquette of watching a movie over someone's shoulder. Well -- not exactly a movie. I suspect that I got much more entertainment from watching fragments of Sex and the City over someone's shoulder in an airplane than I would ever have achieved from watching it in other circumstances (e.g. with sound).
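
For the record, here's roughly how I'd set up that Oreo estimate (a back-of-the-envelope sketch in Python: the isothermal atmosphere is a crude model, and the wrapper's burst strength is a flat guess on my part):

    import math

    P0 = 101_325.0    # sea-level pressure, Pa
    H = 8_400.0       # atmospheric scale height, m (isothermal model)
    BURST = 40_000.0  # guessed burst differential for the wrapper, Pa

    # Barometric formula: p(h) = P0 * exp(-h / H). A bag sealed at sea
    # level holds P0 inside, so it pops at the altitude where the
    # difference P0 - p(h) first exceeds the wrapper's burst strength.
    h_pop = -H * math.log((P0 - BURST) / P0)
    print(f"Under these assumptions, the bag pops near {h_pop:.0f} m.")

With these numbers the bag gives way somewhere around 4,200 m; since airliner cabins are pressurized to roughly the ambient pressure at 2,000-2,500 m, the wrappers merely go taut in flight -- which matches what I saw.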

My first day back in the office has so far seen me moving books around, deleting an enormous amount of backlogged spam, cheering for Poland, and rolling my eyes over someone taking Philip K. Dick and William Gibson too seriously. I type faster than I can effectively think of things to type, and doubt that a more direct-to-the-brain interface would do anything except lower the chances of RSI. Maybe I'd feel differently if I had a mind-controlled rocket car, but I doubt it.

A mind-controlled rocket car would be cool, though. I'm sure it would have an intelligent computer to do most of the driving. I could call it Fess.

  • Currently drinking: Black coffee

Monday, January 03, 2005

2005

Tomorrow, I return to California. The holidays are a grand time to catch up with family, friends, and food -- the three Fs, I might call them, if I didn't already know a different trio of activities by the same label. Besides, while I can call cats felines, I can't recall a synonym for books that starts with F, so the idea of classifying my winter break activities by a letter of the alphabet is... fundamentally and fatally flawed, I fear.

I'm now the proud owner of a t-shirt emblazoned with the words Books. Cats. Life is sweet. The shirt is decorated with a picture of a pile of books and a happy, goofy cat sprawled on top. I think I read twelve books this break, most of them science fiction, and I certainly spent many hours observing the cats. The best combination, of course, involves a book, a cat curled up at my feet, and a mug of something hot. There are three cats around the family home, now. Misty is the oldest of the lot -- he's an old grey tom, big but peaceable, who has been part of the family since I was in high school. Pounce and Jasmine are the other two: both of them were born after I left for graduate school, but I know them from previous visits and from hearing about them and seeing the photos. It's no surprise that I get along best with Misty, who is a more familiar figure, and who is also less skittish than the other two.

Among the authors I've read or skimmed this break between bouts of cat-watching are Alistair Cooke, Garrison Keillor, and Andy Rooney. All three write much as they speak, and I can hear their voices in my head when I read particularly characteristic phrases. It's impressive. Few of us speak in prose, at least not with any regularity. We often speak in a garbled mish-mash of sentence fragments, half-thoughts, stutters, and verbal tics which we would find mortally embarrassing were it set to paper. Fortunately, the ear forgives more readily than the eye does, so we overlook most of our verbal miscues. The remarkable thing about these authors, then, is not so much that they write as they speak; it's that they speak in a prose style that appeals to the ear and the eye alike, without seeming stilted or awkward to either one.

Of course, if I admire an author's words, it doesn't always mean I agree with them. In the opening paragraph of the preface to Word for Word by Andy Rooney (a book I swiped briefly from my mother's shelf), Rooney writes

Writing is difficult. That's why there's so little of it that's any good. Writing isn't like mathematics where what you've put down is either right or wrong. No writer ever puts down anything on paper that he knows for certain is good or bad.

and later in the preface, he repeats the point:

The English language is more complex than calculus, because numbers don't have nuances.

Writing is hard. But numbers certainly have nuances, and mathematicians are as capable as anyone else of writing statements which are right but poorly argued and not very useful; or superficially wrong but good, fundamentally right underneath; or otherwise gloriously mixed-up counterexamples to the premise that a mathematical result is simply right or wrong, with no further subtleties. So when I read that, I hemmed and I hawed -- and realized I was getting worked up talking to myself about a nit. This is, I think, a good sign that it's time for me to return to my normal routine.

In particular, it's time for me to go back to my normal work routine. I haven't left aside all computing, of course: I've scribbled recurrences all over the card I'm using as a bookmark, I've read with gusto pieces of the optimization book I got for Christmas, and I've penned a long letter in what my friend who will receive it has dubbed the problem of the month series. It has all been unfocused play; and if I've done none of the work I'd hopefully scheduled for completion while at home, I'll probably finish more in the long run because I've had the time to play. Still: it's a new year. Time to be at it.