Monday, May 30, 2005
Pondering Plurals
The plural of house may be houses,
But the plural of mouse isn't mouses.
Is some sense of what's nice,
Unlike rodents or lice,
Why we do not say spice, but say spouses?
Sunday, May 29, 2005
The Things We Do
I'm back. I spent one week at the Householder meeting in Pennsylvania, where I spoke with interesting people and with them mourned the lack of network access and good coffee. My talk went fine, and I listened to various other talks, some good and some not-so-good, but (as usual) all the most useful work and conversation happened between sessions, or in the hall outside the sessions. I flew to Boston on Friday, spent the night, and came back to California on Saturday. It has been a good two weeks, but I'm glad to be home, with my own bed and my own cooking.
Based on experiences on this trip, I think I may make three investments this summer. First, I should probably get a new pair of glasses, so that if I'm faced with a slide full of equations in a somewhat-too-small font, I will still be able to read them. Toward the end of last week, I would try to read an equation only to find that the fuzzy-looking screen split into two screens, each of them even less distinct than the original. This is suboptimal. Second, I should probably get a new laptop. This one is five years old: it crashes sporadically, the screen fades out at times, the battery lasts about 20 minutes, the plug at the back has to be propped up in order to deliver power, and from time to time the usual high-pitch whine of normal operation turns into a sort of warble. When I worked on my slides in my room, I was constantly worried that the laptop would quit working and leave me in the lurch. This is also suboptimal. Third, I am thinking (again) of getting a cell phone, mostly for when travel mishaps occur.
Comments on any of these things are welcome. (Scott, you told me that not having comments was an offense -- now you can use them to advise me on phones, if any of the advice has changed since last I thought about getting one.)
Now, to the point which led to the title of this post. One of the things I discovered in the past two weeks is that people find it a bit confusing when they try to find the common themes in the things I've worked on. In part, this is because many of the things I've thought about have come directly from questions that I've been asked. Since I know lots of different people in lots of different areas, it makes sense that the collaborations that arise from good questions go in lots of different directions. I am not picky, and am usually happy to spend at least a few minutes to entertain any problem that sounds -- well -- entertaining. However, there are a few directions where I know enough that someone is likely to seek my advice, and there are some directions where I've thought deeply. So there is a pattern.
My undergraduate degrees were in mathematics and computer science. My graduate department is computer science, but I could just as well be called an applied mathematician. Fortunately, Berkeley is a good place for interdisciplinary work, and so nobody objects to me being both an applied mathematician and a computer scientist. Through training since starting graduate school, I'm also reasonably well-versed in continuum mechanics and finite element simulation of continuum behavior, and in MEMS engineering. This puts me in a good position to do problem-driven research in the analysis of physical systems, since I can take descriptions of a variety of continuum-level models as posed (for example) by a colleague interested in engineering widgets; figure out what simplifications can be done to reduce the problem to something of reasonable size; build a discrete model and write or adapt a code to compute a solution to the model; and provide the results in a useful form. At each level, from model-building to discretization to computer analysis, I have some areas of particular expertise. Consequently, I often analyze the behavior of resonant MEMS; I look at locally linearized models of coupled mechanical and electrical equations for these devices in order to compute their behavior near different equilibria; I write finite element codes in order to approximate the partial differential equations in these models; and I write specialized eigenvalue solvers that take advantage of different types of structures that appear on the way. That is why my thesis title is Structured and Parameter-Dependent Eigenvalue Calculations for Resonant MEMS Design, and the body consists of sections describing mathematical structures that I use, simulation software that uses those structures, and applications where it actually makes a difference.
To be more specific, the problems that I'm thinking about most right now have applications in the design of high-frequency MEMS resonators. I've been thinking about ways of studying damping in these devices, and how they might be tuned. The balance of different physical mechanisms for these problems isn't always well understood -- for example, we think that the MEMS literature largely misrepresents the relative magnitudes of fluid damping, damping from radiation of elastic waves away from a support, thermoelastic damping, and damping due to intrinsic material losses (e.g. due to defect dynamics). But we know how to write down continuum models for these effects. This leads to eigenvalue problems involving partial differential and integral equations; usually I know that the solutions to these problems are fairly smooth and are not wildly nonlinear. To approximate these with a not-too-large discrete system, I use scaling analyses or perturbation analyses and then integrate these analytic results into finite-element types of models. I typically code these models either using HiQLab (my own finite element code) or FEAPMEX (which provides MATLAB access to the guts of Bob Taylor's simulator, FEAP). The discrete systems often mimic structures of the continuous system, and I have worked on different types of numerical eigenvalue solvers that can exploit these structures: continuous dependence on some tuning parameter, locality of interactions in the discrete system (sparsity), complex symmetry of the linearized system matrices, separations of time and space scales associated with different physical effects, and special approximately low-rank structures that come from interactions taking place at domain boundaries. At the end of the day, I produce results that show my engineering collaborators useful things like plots of the damped resonant modes in their devices, graphs of how those resonances and damping factors change as the designs change, and programs that allow them to do such analyses for themselves (or at least allow me to do similar analyses in much less time in the future).
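For the curious, here is roughly what the simplest version of that pipeline looks like. This is a toy sketch in Python/SciPy rather than HiQLab or FEAPMEX, and it is much simpler than anything in the thesis; every name and parameter value in it is made up for the occasion. The model is a one-dimensional "resonator on a support," discretized with linear finite elements, with a complex coordinate stretch standing in for a perfectly matched layer so that energy can radiate away. The stiffness matrix comes out complex symmetric, the eigenvalues come out complex, and the imaginary parts measure the damping.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import splu, eigs, LinearOperator

n_el, L = 600, 3.0
h = L / n_el
x_mid = (np.arange(n_el) + 0.5) * h                      # element midpoints
c = np.where(x_mid < 1.0, 1.0, 4.0)                      # stiffer "support" for x > 1
stretch = 1.0 + 3.0j * np.clip(x_mid - 2.0, 0.0, None)   # PML-like complex stretch

# Assemble complex-symmetric stiffness and mass matrices (1D linear elements).
rows, cols, kv, mv = [], [], [], []
for e in range(n_el):
    ke = (c[e] / (stretch[e] * h)) * np.array([[1, -1], [-1, 1]], dtype=complex)
    me = (stretch[e] * h / 6.0) * np.array([[2, 1], [1, 2]], dtype=complex)
    for a in range(2):
        for b in range(2):
            rows.append(e + a)
            cols.append(e + b)
            kv.append(ke[a, b])
            mv.append(me[a, b])
K = sp.coo_matrix((kv, (rows, cols)), shape=(n_el + 1, n_el + 1)).tocsc()
M = sp.coo_matrix((mv, (rows, cols)), shape=(n_el + 1, n_el + 1)).tocsc()
K, M = K[1:, 1:], M[1:, 1:]            # clamp the displacement at x = 0

# Shift-invert about a target frequency and pull out a few nearby modes.
target = np.pi ** 2                    # a value of omega^2 to search near
lu = splu((K - target * M).tocsc())
op = LinearOperator(K.shape, matvec=lambda v: lu.solve(M @ v), dtype=complex)
mu, _ = eigs(op, k=4)
omega = np.sqrt(target + 1.0 / mu)     # complex resonant frequencies
Q = np.abs(omega) / (2 * np.abs(omega.imag))   # one common convention for Q
for w, q in sorted(zip(omega, Q), key=lambda t: -t[1]):
    print(f"omega = {w.real:8.4f} {w.imag:+9.2e}i   Q ~ {q:10.1f}")
```

The real calculations are two- and three-dimensional, involve coupled thermal and electrostatic fields, and lean on solvers that exploit the structures mentioned above; but the shape of the computation -- assemble, shift, factor, extract a few eigenvalues, report frequencies and quality factors -- is the same.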
This seems to me a straightforward approach to research. But universities have departments, and many folks stay within the boundaries of one or two departments. Consequently, one of the things for which I seem most useful is facilitating communication: pointing engineers and physicists at good codes and computing tools, and pointing numerical mathematicians at problems they might not have realized fell into the purview of their tools. And on the way I build theorems and models and codes, and a merry time is had by all.
- Currently drinking: Green tea
Saturday, May 21, 2005
Travels
I will probably be in Boston from May 27 through June 1. My original reason for going there has evaporated, but because of how it fits in with other travel, it makes little sense for me not to spend at least a couple days in Boston regardless. Any suggestions for what I should do while there?
The workshop this past week was enlightening. Most of the attendees had backgrounds in dynamical systems; I know something about dynamical systems, but it's by no means something I know well. But I do have some common background with lots of other applied mathematicians, namely the family of subjects that tend to fall under the heading of "analysis": ODEs, PDEs, functional analysis, complex analysis, geometry, some mathematical physics. So I was able to generally follow most of the discussion.
A big theme of the week was computing or approximating or bounding the spectra of different types of linear operators. There is, I think, a very interesting tradeoff in this sort of analysis. It comes up in other sorts of analyses I've thought about, too. On the one hand, you can pose your problem as something that takes place on an unbounded domain, and get something that's linear (though linear eigenvalue problems actually involve nonlinear equations). The linearity of the problem is nice, but it's sometimes harder to deal with problems on infinite domains (certainly it's harder numerically). The trade is that you can take the same problem and reduce it to something that lives on a nice, bounded, compact set, which is good for computation... but then your problem becomes nonlinear.
Well, I like nonlinear eigenvalue problems, but I don't like them so much that it keeps me from doing tricks in order to turn them into linear problems as quickly as I can. Computers only handle finite things, so an unbounded mesh is sort of a problem, and so I can't deal with the real linear problems. But I can trade off how large a discrete problem I'm willing to look at with how nonlinear a problem I'm willing to tolerate. In fact, I can make that trade in a controlled way: starting from a small, fully nonlinear eigenvalue problem, I can introduce a sequence of approximations, each of which is a linear eigenvalue problem. My approximations get better and better, but at the expense of introducing more and more dummy variables. In the limiting case, at least for some problems, I can show that the approximations tend to the actual nonlinear eigenvalue problem... but they also get infinitely big.
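To give a flavor of that trade, here is a small sketch using a standard textbook device (a Taylor expansion plus a companion linearization), not the specialized machinery from my own work; the function name polyeig and the matrices A0 and A1 are invented for the example. The nonlinearity is an exponential; truncating it at degree d gives a polynomial eigenvalue problem, and linearizing that costs d blocks of dummy variables, so the linear problem has d*n unknowns. Bigger d, better approximation near the expansion point, bigger matrices.

```python
import numpy as np
from math import factorial
from scipy.linalg import eig

def polyeig(coeffs):
    """Eigenvalues of P(lam) x = sum_k lam^k coeffs[k] x = 0 via the first
    companion linearization; each block of the big vector is a dummy
    variable y_k = lam^k x."""
    d = len(coeffs) - 1
    n = coeffs[0].shape[0]
    I = np.eye(n)
    A = np.zeros((d * n, d * n), dtype=complex)   # pencil is  lam*B + A
    B = np.zeros((d * n, d * n), dtype=complex)
    B[:n, :n] = coeffs[d]
    A[:n, :] = np.hstack([coeffs[d - 1 - k] for k in range(d)])
    for k in range(1, d):
        B[k * n:(k + 1) * n, k * n:(k + 1) * n] = I
        A[k * n:(k + 1) * n, (k - 1) * n:k * n] = -I
    lam, _ = eig(-A, B)            # solves (lam*B + A) z = 0
    return lam

# Delay-type example:  T(lam) x = (-lam*I + A0 + exp(-lam)*A1) x = 0.
rng = np.random.default_rng(0)
n = 4
A0 = rng.standard_normal((n, n))
A1 = rng.standard_normal((n, n))
for d in (2, 4, 8, 12):
    # Taylor-truncate exp(-lam) at degree d: a polynomial eigenvalue problem.
    coeffs = [A0 + A1, -np.eye(n) - A1]
    coeffs += [(-1.0) ** k / factorial(k) * A1 for k in range(2, d + 1)]
    lam = polyeig(coeffs)
    near0 = lam[np.argmin(np.abs(lam))]   # should settle down as d grows
    print(f"d = {d:2d}, size = {d * n:3d}, "
          f"lam nearest 0 ~ {near0.real:.6f} {near0.imag:+.6f}i")
```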
This is pretty cool, but the really cool thing is that in similar contexts, I've spent a lot of time thinking about how to get the most out of my added variables. So not only can I make good approximations while remaining linear, I can make good approximations that aren't too big. At least, I can do that for some problems.
This coming week I will be in Pennsylvania for another meeting (this one on numerical linear algebra), and I'll return to California around the middle of the following week. I'll be glad to return to a more ordinary schedule. Traveling is okay, though I don't enjoy it nearly so passionately as some of my friends do; and academic meetings can be extremely interesting. But they're also very fast and intense, and I get worn out after a while. It's also an unfortunate fact that talking about ideas is not the same as developing those ideas, so I very rapidly develop a backlog of things I'd like to play with but for which I've not yet had time.
- Currently drinking: Coffee
Sunday, May 15, 2005
Mathematics and Science
From Saturday's reading:
In the main, it is the applied scientists who call for numbers, numbers, numbers. Although the layman conceives mathematics as being the science of numbers, few mathematicians agree, and most of modern mathematics gives to numbers a role at best ancillary or illustrative in the development of concepts. Mathematics is the science of precise relations. Mathematicians habitually deal with properties that no measurement could verify or controvert, properties that can only be imagined. I refer to functions defined on infinite sets, continuity, passage to limits, relations of inclusion among infinite sets and in regard to membership in such sets, etc, etc. If the hard-headed empiricist replies,
"I don't care about any such things, I just get numbers," in most instances the mathematician can easily see that he is deluding himself, for nearly all experimental results in physical science are explained and correlated by use of the operations of differential and integral calculus, in which the infinite is always present, and no finite set of rational numbers suffices for taking even the beginner's first step in that theory. Mathematics might have greater potential use if those who tried to apply it to the biologies and to social phenomena would learn that measurement provides only one of many possible kinds of quantification. The worst advocate for mathematics is the enthusiast who thinks he understands it but does not. For an informed, sober, and concise survey of the pitfalls in mathematical modelling, especially for the biologies, I refer the reader to Maynard Thompson's Mathematization in the Sciences.
-- C. Truesdell
Saturday, May 14, 2005
Tensor
When I first heard the word "tensor" toward the end of high school, I liked the sound of it. But it took me until a couple years into graduate school to get to my current understanding.
It's not so uncommon for students to come into college with a bit of formal mathematical coursework beyond basic calculus. In my case, I had a second semester calculus course and some differential equations. The math department at UMCP, in a bid to catch early the interest of potential students, invites incoming freshmen with such background to take an introductory real analysis sequence. I took it. It was a shock.
I was "good at math" in high school, which really meant that I was good at setting up and executing various sorts of calculations. I'd had some experience with proofs, particularly in my high school geometry class, but they were mostly very simple things. This was the first time I really engaged in the sort of rigorous reading, writing, and exploration that characterizes the subject. Usually, I might say "I learned about proofs," and leave it at that; but rigorous mathematical reasoning is a skill that extends beyond the ability to turn coffee into proofs, and just as it bothers me when people think mathematicians do nothing but write page after page of formulae, so it bothers me when (more sophisticated) folks think mathematical writing consists of nothing but theorems, lemmas, and proofs connected haphazardly by a few conjunctions and a definition or two.
At the end of the year, we covered some theorems about multi-dimensional integration from a geometric point of view. In particular, we discussed the generalized version of Stokes' theorem, which meant that we had to learn about differential forms. It was my first introduction to tensors, though we didn't actually say "tensor" in class, and I didn't really realize what we were working with until rather later.
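(For the record, the generalized theorem is the one that ties the classical integral theorems together: for a smooth (n-1)-form \(\omega\) on a compact oriented n-manifold \(M\) with boundary,

\[ \int_M d\omega = \int_{\partial M} \omega, \]

and Green's theorem, the divergence theorem, and the classical Stokes theorem all drop out as special cases.)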
I was able to do the calculations, and to prove simple theorems, but I don't think any of us had a real feel for differential forms. The intuitive understanding came later.
In my senior year, I took a graduate Riemannian geometry course. It was another shock. I wasn't ready for it, and at the time it seemed like I was catching only a very little bit of the material. It was by no means the fault of Schwartz, who was an inspired lecturer, but that was the only math course I ever took in which I felt completely lost by the semester's close. We learned about tensors and tensor bundles in that class, but what little luck I had understanding the idea really was a result of learning about tensor products in the graduate algebra sequence (which I took at the same time as the geometry class). I remember talking about it to one of the algebraists in the department, who said "Well, I know what the algebraic definition of a tensor is, but I've always wished I understood how the physicists think of them."
Indeed.
William James said that we learn to swim in the winter and to skate in the summer. I may not have understood what was going on in my Riemannian geometry class, but I did remember it, and over time I started to digest it a little better. I took a course in differentiable manifolds -- usually a prerequisite for a course on Riemannian geometry, but I've always taken prerequisites rather loosely -- and things made a lot of sense; combined with a course on nonlinear finite elements for continuum mechanical problems and a lot of outside reading, I got a context in which all the things I didn't understand when I took the Riemannian geometry course suddenly made sense.
In all, I have seen four distinct presentations of tensors: a purely algebraic presentation, as suitable for groups or rings as for linear spaces; a presentation based on Kronecker products, in which everything is reduced to matrices; the geometric view, which effectively takes the algebraic notion of a tensor product over vector spaces and combines it with the techniques of analysis; and the indicial presentation, which seems common in engineering. In addition, I've seen tensor analysis presented in indicial form with only first- and second-order tensors on three-dimensional spaces; it always seemed to me that such a presentation was subsumed by a good elementary course on linear algebra or matrix algebra, but I kept my mouth shut. At this point, they all seem interchangeable to me, and even fairly intuitive -- but it took a while for me to feel that way.
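As a tiny illustration of the Kronecker-product presentation -- a generic numpy check, not anything from the coursework I mentioned -- one workhorse identity is that a matrix triple product becomes a single big matrix-vector product: vec(A X B) = (B^T kron A) vec(X), where vec stacks columns.

```python
import numpy as np

# Check vec(A X B) = (B^T kron A) vec(X), with column-stacking vec (order="F").
rng = np.random.default_rng(1)
A = rng.standard_normal((2, 3))
X = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 5))
lhs = (A @ X @ B).flatten(order="F")
rhs = np.kron(B.T, A) @ X.flatten(order="F")
assert np.allclose(lhs, rhs)
```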
I still like the sound of "tensor," even if it no longer seems like such a mysterious thing. My last desktop computer -- now running on nine years old, and mostly left to sit while I use my much-faster five-year-old laptop -- was called "Vector," another mathematical word that I think sounds good. After a year or two with Vector, I thought I would upgrade, and call the new box "Tensor." At the time, I secretly hoped that I would actually have a good grip on what tensors are before I named a computer after them. When I got the laptop, it somehow seemed more fitting to call it Mongoose; but now, using Mongoose to type a page full of tensor calculus for part of my thesis while Vector sits quietly by my knee, I wonder -- maybe it's time to get a new machine that I can name "Tensor"?
- Currently drinking: Coffee
Wednesday, May 11, 2005
Idiot's Fugitive Essays
Clifford Truesdell was interested not only in mechanics, but also in the history and philosophy of science. Knowing this, I looked up the list of books he wrote. My intention was to find a reference that might tell me a little about the history of Riemannian ideas in continuum mechanics. I didn't exactly find what I was looking for; what I found instead was An Idiot's Fugitive Essays on Science: Methods, Criticism, Training, Circumstances.
I went to the math library Monday afternoon, picked it up from the shelf, and started leafing through it, starting at the very end. The last article in the book consisted of three paragraphs written in Latin, a language in which I am decidedly deficient. The second-to-last article was entitled "The Computer: Ruin of Science and Threat to Mankind."
Naturally, I checked out the book.
Few of the essays are so provocatively titled as that second-to-last, but Truesdell's pen was sharp and several of the essays are polemical. Even historical comments occasion outbursts, as when, during a review of the works of Stokes, Truesdell writes (p. 234):
The catastrophe that has befallen the language of science in the past hundred years is only the outer dress of the catastrophe to method and thought and taste in natural philosophy...
I wonder whether Mencken and Truesdell ever met? In spite of -- or perhaps because of -- the crotchety attitude of the author, the book is informative, entertaining, and well-written. If you happen to be interested in the history or philosophy of science (and particularly of mechanics), I recommend borrowing a copy from the library.
Monday, May 09, 2005
City CarShare
I used the City CarShare program for the first time today. As the name suggests, this is a program for sharing cars, a sort of non-profit micro-rental program. There are "pods" of cars scattered across the East Bay and the South Bay, usually close to public transportation so it's possible to just take the car for a final leg of a trip after taking public transit.
I don't like driving. I never have. Nonetheless, I do have enough innate appreciation for clever mechanical devices that I admire some car designs. The car I drove today was a hybrid Prius, and it was sort of interesting. When I rode to school every day, I remember that I would sometimes be spooked when a hybrid whizzed past me without my hearing it coming. Unsurprisingly, they're also quiet from the inside. The car controls were more computerized than anything else I've driven, which gave me mixed feelings: turning on the car with a power button rather than with a turn of a key was a little odd, but eventually intuitive enough; the heating and cooling system, though, was a complete mystery. It also took me some time to figure out how to turn on the windshield wipers, so it isn't just the button-controlled things that I found counter-intuitive. On the other hand, I'm not sure I've ever driven anything made after 1990, so perhaps it isn't too surprising that such a recent model is a little unfamiliar.
I'm thankful that I'm able to satisfy most of my transportation needs with a combination of BART, bike, and the Mark I foot. But it's good to know that this alternative is available if I need it.
Computational Fest
The Berkeley-Stanford Computational Mechanics Fest took place this Saturday. Twelve professors, some from Stanford and some from Berkeley, spoke on numerical models of various problems in mechanics; the topics included phenomena such as sedimentation, ductile fracture, dislocation dynamics, granular flows, and anisotropic surface friction. I thought the talks were generally quite good. In fact, I was surprised that out of twelve talks there were none that I would consider snoozers. Because of overestimates of the attendance, there was a lot of food, and so I was well-fed as well as well-entertained. It was a good day.
I think my favorite talk was by Phil Marcus, who spoke about planet formation. Actually, he spoke about vortex formation in proto-planetary dust. The problem is interesting to astrophysicists because vortices provide one possible mechanism for the sort of clumping from which planets may be born. It's also interesting because a number of physical effects balance in ways we're not used to from our terrestrial experience. There were lots of pretty pictures of vortices going unstable and shedding gravity waves which launched other vortices; I've probably made a mistake in my summary, but at least the explanation was sufficiently within my background that I understood what I was seeing pictures of. Good stuff.
Friday, May 06, 2005
Quick Explanations
What makes a good quick explanation?
I can think of several possible criteria. The most obvious are that it should be quick and it should explain something. A lecture is, to most people, not quick; and a sound bite is, as often as not, not very explanatory. Political speeches, marketing pitches, and other such beasts are neither quick nor explanatory.
It's easy to measure the time someone talks, or the amount of paper someone uses, to say something. It's much harder to measure the explanatory quality of what is said. In one of Asimov's Foundation books, an ambassador visits the Foundation planet for a week or two, and everything he says is recorded. Afterward an analysis is done, and it's revealed that once the pleasantries, diversions, and self-contradictory things he said are stripped away, the only information is "We are going to attack soon." It made a good story point, but I have a hard time imagining how such an analysis could be made rigorous. Still, some sentences clearly convey more information than others. As one wit advised technical writers, the number of words you use divided by the number of ideas should not tend toward infinity.
I think one hallmark of a good quick explanation is concreteness. Of course, the extent to which an idea invokes a concrete mental picture varies depending on the person. Still, if I speak of "waves in a pond," most people will be able to imagine what I mean, based on their own experiences. In contrast, the only image (literally meant) that "patriotic duty" brings to my mind involves a tiny American flag. Presumably if someone wants to explain why I should do something and appeals to my "patriotic duty," they would rather not have me think of a tiny American flag on a plastic stick with a little "Made in Taiwan" sticker on it. Patriotic duty is the stuff of good propaganda, but not good explanation.
Thursday, May 05, 2005
Cinco de Mayo
Happy May 5! Today is not actually Mexican Independence Day, but it is the anniversary of a major battle between Mexico and France. Popular confusion over the reason for the celebration won't keep any number of people from celebrating, of course. The weather, on the other hand, might be an impediment. It's raining here. It's also sufficiently close to the end of the semester that a large fraction of those I see on my daily walks about campus look tired, worried, or both.
I last wrote anything here on April 11. Since I'll be traveling for most of the last half of this month, I probably won't write anything then. For a moment, though, I need to write something that involves no formulas and no code. I love continuum mechanics; I think differential geometry is grand; and I think simulations of coupled-field interactions are wonderful. I also think I need a brief respite from trying to simultaneously reference too many papers and texts in which E stands for energy, Young's modulus, the electric field, the Lagrangian relative strain tensor, an orthonormal basis for a referential coordinate system, and a generic measurable set. Epsilons are similarly overused.
There's a classic reference by Truesdell and Noll, The Nonlinear Field Theories of Mechanics, which, near the beginning, has most of a page devoted to which fonts and alphabets are to be used to denote different types of mathematical objects; the following several pages are a glossary of frequently-used symbols. Unfortunately, while I know the Greek and Roman alphabets, I don't know the Hebrew alphabet and I have trouble deciphering characters in ornate Fraktur fonts. The first time I saw a formula involving both a Fraktur F (I think it was an F) and a resh (which looks sort of like a backward capital gamma) subscripted by a capital gamma, I remembered the whole thing as something like "sum of meow-meow of strange-meow-sub-gamma." I forget what happened next, but I probably decided that it was time to take a walk.
I know a few people who feel very strongly that books and papers should be read in a strictly sequential manner, starting at the beginning and proceeding line by line through the end. For my own part, I have no qualms on a first reading with mentally replacing all the formulas in a paper with "meow-meow-meow" (or perhaps something like "meow-meow-meow, which is some complicated function of position but not time"). In areas of sufficient familiarity, I read formulae in the same way I read words, by looking at the shape rather than at the individual characters. But when an unfamiliar formula appears, the words can -- and usually do -- give a sufficient sense of what's going on that it's possible to understand the big picture even with an occasional "meow-meow" substitution.
Similarly, I don't mind overmuch if I tell someone about something I find interesting and they hear something like "eigen-blah-blah" from time to time. That's part of learning. On the other hand, I find it intensely annoying to spend the effort to speak my thoughts, only to realize that the listener turned off his brain immediately after the first unfamiliar word. This is nearly as annoying as trying to inform someone who "knows" you're wrong, and therefore doesn't bother to listen. Was it Will Rogers who said it's not what people don't know that's the problem, it's what they know that just ain't so?
It's a good thing I'm not a politician.
Baa-ck
... said the sheep to the masseuse.
I think this is a good time to end my blogging hiatus. Posting will be light at times, particularly during the weeks when I'm on travel or when I'm making good progress on writing my thesis. But there will be posts. I may even get around to adding a blogroll.