Yesterday was our faculty's postgrad symposium, where science students of all descriptions attempted (usually with some measure of success) to explain to science students of all other descriptions what their research is all about. This necessitates some creativity and has left me thinking (not for the first time) about whether or not QFT is indeed inexplicable. It seems worth writing up a few thoughts here.
QFT is hard.
There's a reason that nobody less than four years out of high school gets QFT (for an approximate value of nobody and allowing 'gets' to be ill-defined) and nobody with less than six years of tertiary education has actually used it in any practical way. (Again, generalisation, but I don't think it's far off the mark.) QFT is a very abstract, mathematical theory and divorcing it from the mathematics is like trying to explain swimming without water. If you want to understand QFT, come back when you have a degree's worth of mathematics. This is perhaps a little too depressing, so let's try another tack.
QFT is not hard.
Sure, the scientists who do QFT research need lots of training and deal with scary equations all day, every day, but that's not the heart of what they're doing. They're actually doing calculations on "god particles" and quarks, which are basically very small marbles glued to elastic bands which are other particles called gluons and the gluon is massless which means it's like a fish that can swim through the god particle molasses slickly, unlike the other particles, which are like whales because they have masses.
Now, I will admit that our building houses not just the physics department, but also the oceanography department and the Marine Research Institute, but we are separate departments. A quark or a Z boson is nothing at all like a whale. In fact, the work I do on a day-to-day basis involves talking about mathematical abstractions that can be experimentally tested using the ideas of such particles, but doesn't really talk about particles at all. So while I do think that non-specialists should be able to get an idea of what's going on in QFT, I'm not convinced that these analogies do much more than make people think they know what's going on. Which is perhaps worth something, but seems suboptimal.
QFT is (not) hard.
QFT is an abstract theory based on abstract mathematics. If you want a genuine feeling for how it behaves, you need a genuine feeling for how maths behaves. To reuse an analogy (since I've decided they're not all bad), you can't understand swimming if you don't know what water is. QFT is related to abstract maths just as intimately and shying away from it doesn't help. But what I think we miss is that all you need is a "genuine feeling for how maths behaves". I say 'all', but of course such a feeling can be hard to come by. However, it's an awful lot easier to come by than a degree in mathematics. It almost has to be.
I don't think you need to be able to calculate a commutator to get a feel for what's going on in QFT. I do think you need to know that sometimes in maths, as in life, order matters. 2 + 3 = 3 + 2. I don't care whether you put on your hat and then scarf or you scarf and then hat. But pq is not the same as qp and you should put on your socks before you put on your shoes, unless perhaps you're going to a fancy dress party.*
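To make "order matters" concrete, here is a tiny sketch of my own (the matrices are arbitrary examples, not the real p and q of quantum mechanics): two 2x2 matrices multiplied in both orders, giving different answers.

```python
# A minimal illustration that matrix multiplication is order-dependent.
# These particular matrices are arbitrary examples, not the physical p and q.

def matmul(a, b):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

p = [[0, 1],
     [0, 0]]
q = [[0, 0],
     [1, 0]]

pq = matmul(p, q)  # [[1, 0], [0, 0]]
qp = matmul(q, p)  # [[0, 0], [0, 1]]
# pq != qp: swapping the order of the factors changes the answer,
# just as swapping socks and shoes changes the outcome.
```

Ordinary numbers never do this; it takes objects like matrices (or operators) before multiplication starts caring about order, which is exactly the mathematical habitat QFT lives in.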
For some reason we don't seem to want to explain QFT this way. Perhaps it's because we've all learned in school that maths is terrifying. Perhaps it feels like it takes us too far off topic. (I feel this every time I try to give a talk about QFT, but I find there's nothing to say if I take out the maths.) Perhaps we just haven't thought about it enough. (We almost certainly don't think about science communication enough.) Perhaps it is happening, but I'm not aware of it (and neither are my scientist friends, which would mean it needs way more publicity).
I'm still working out where this leaves me. It means that when I talk about QFT for a general audience, I don't shy away from the maths. Maybe it means I need to write and/or talk about these ideas more often (she said, guiltily writing her first blog post in months). Maybe it's enough to be aware of it and to talk about it. Maybe I need to pick fights about it. Maybe (definitely) it's not all a problem for me to solve, all on my own, today. But it's a topic that deserves some thought, in the midst of marking and debugging and trying to put in those six years you need to learn to use QFT. We'll see.
____
* I feel like a cheat putting in mathematics without explaining the physical significance. The p and q here represent the tools we use to measure the momentum (which gives us the speed) and the position of a particle, respectively. The maths tells us that the order in which we measure them matters, because pq ≠ qp. This means that making one measurement must somehow disturb the other (it doesn't tell us how that happens, sadly). This in turn means that if we measure the position precisely, we can't also measure the momentum precisely, and vice versa. This is Heisenberg's Uncertainty Principle (which is one of the most obvious and well-known consequences of order mattering, although not the only one). You can talk about thought experiments, like the Heisenberg microscope, to make it seem more intuitive, but it comes from the fact that if we choose maths that gives us the right answers, order matters when you write down p and q.
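In symbols (my addition, not part of the original footnote; this is the standard quantum-mechanics statement, with ħ the reduced Planck constant):

```latex
% Order mattering, stated precisely: the canonical commutation relation,
% and the uncertainty bound that follows from it.
\[
  [\,q, p\,] = qp - pq = i\hbar
  \qquad\Longrightarrow\qquad
  \Delta q \,\Delta p \;\ge\; \frac{\hbar}{2}.
\]
```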
Friday, September 5, 2014
Monday, July 14, 2014
Definitions and Discoveries
At first glance it seems that it would be rather silly to mix up the ideas of a definition and a discovery, but at least in maths (and perhaps especially theoretical physics) it's surprisingly easy to do. Both processes produce rules that allow us to go on and derive new rules and descriptions -- the big difference is where they come from.
Might require a change of maths.
Suppose, for instance, that I want to construct a rule using the word "good". Then I would work out what the word "good" means for the rule by seeing how it works in the world -- I would discover it, rather than defining it. (There might be situations where you would choose to define it differently from how it works in the rest of the world, but you would have to be clear that you were doing something funny then.) This is rather like how we decide to write the description of gravity as pulling things together. Mathematically you could write about it as a force that pushes things apart, but that wouldn't describe the gravity we experience. Likewise, if you want to describe arithmetic as we know it, you can't decide whether 2+3 and 3+2 are the same thing. You discover that you don't get the expected results unless they are. (One might decide to define it differently anyway, but then one would no longer be talking about ordinary arithmetic.)
Things get more complicated if I have a procedure that involves putting "un" at the beginning of words. I can't go out and discover the meaning of "ungood", because it's not a word you can find in the world. Nothing about the way people talk changes if I interpret "ungood" as meaning "butterfly" or "convenient" or "pirouetting", because people don't say "ungood" in the first place. Situations like this tend to crop up in the intermediate mathematical steps of a physics problem. I've translated the situation into equations and now I want to solve them. Solving equations doesn't mean anything physically, although the solutions should be something we can translate back. That means there aren't clear physical requirements on what is or isn't allowed in trying to solve the equation.
If I want to be able to use the "un" procedure, I have to come up with a meaning for "ungood". There's nothing stopping me from picking any definition I like -- but some will be more useful than others. I'd like to pick a definition that means the "un" procedure does the same thing it does to other words. "Decided" becomes "undecided", "expected" becomes "unexpected" and "forgettable" becomes "unforgettable", all of which do mean things already. It makes sense that when "good" becomes "ungood", it means the opposite of "good". We might define "ungood" to mean "bad".
We don't have to define it that way, though. We could go by similar phonetics instead and define "ungood" to mean "unguent". The "un" rule wouldn't work as expected any more, of course. But if I were writing a speech-to-text program, say, that might be less important than the phonetics. Besides, it's not like the "un" rule is infallible anyway -- look at how "til" becomes "until". It's only if I want the definition to work in a system where the "un" rule always produces opposites that defining "ungood" to mean "unguent" would be inconsistent.
The same thing happens mathematically. If something doesn't exactly match up to some physical observable, we need to define what it means. Some definitions make more sense than others when it comes to translating back and those are the ones we choose. But unlike making a discovery, it is a choice. We decide that we want certain rules and procedures to do what they do elsewhere and choose to define things accordingly. The definition isn't forced upon us by the way the world works.
Why do I care about such a subtle difference? If you just want to apply the rules, it doesn't really matter. But if you want to understand where they come from and what makes physics tick, you need to know what's required to describe the world and what's just a helpful way of thinking about intermediate maths. And that's why I'm quite sure I'm neither the first nor the last physics student to say "Oh, is that just a definition? It all makes sense now..."
Savo 'lass a lalaith.
Friday, June 20, 2014
Theme Thursday: Sport?
Boogie at the UCT Ballroom and Latin Dancing Society
Cari at Clan Donaldson runs the Theme Thursday linkup, and since she says there are no rules, I will persist in joining in erratically, even when I'm not really in the target audience. Also I will post-process the heck out of mediocre pictures until I can convince myself that they're arty. And claim that the social dancing when Ballroom lessons are cancelled for the holiday is a sport. (Actually, looking at what other people have linked, this last might be on-trend.)
If I go to black and white, I can convert the motion blur to graininess (yay, unsharp mask) and call it a feature, yes? It was fun to play with, at any rate; and a fun evening to remember. So perhaps I can file this under "further perks of [still] being a student". It fits somewhere below "Science is cooool" and above "No, actually, I don't get to take the university holiday off. I need to learn everything about everything before my funding runs out."
Savo 'lass a lalaith.
Tuesday, June 17, 2014
Of spin and other nonsense
From the Oxford English Dictionary:
spin:
noun 1 A rapid turning or whirling motion
...
1.4 Physics The intrinsic angular momentum of a subatomic particle.
...
noun 3 [in singular] The presentation of information in a particular way; a slant, especially a favourable one
I was going to write something about covering groups, which I've been reading about today, but one of the applications of covering groups in theoretical physics is in linking quantum spin to rotations. Spin is fascinating and weird. I got sidetracked.
I suspect that spin is not really all that weird, if one thinks about it properly. But in the process of discovering what exactly holds the world together, one doesn't always come across ideas in contexts that make it easy to think about them properly. Such is the case with (quantum) spin, which, despite the name, does not involve any rapid twirling or whirling motions. In fact, all the talk about twirling and whirling could probably be classified as spin in the third sense "presentation of information in a particular way; a slant, especially a favourable one" (but not necessarily an accurate one).
Spin is just a property of a particle (I'll disagree with the OED on its technically needing to be a subatomic particle, but admittedly that's overwhelmingly the context where we talk about spin.) It's easy enough to imagine a particle having a position. It can have a mass, which we tend to think about in terms of how much it weighs -- although the ideas aren't quite the same. We're happy to think about a particle having a speed. Other things are harder to imagine.
We know that there's a thing called electric charge. It's what makes lightbulbs shine and computers compute. It's the reason you can rub a plastic ruler on your head and use it to pick up scraps of paper; and the cause of thunder and lightning. It certainly seems to exist. But what exactly is it? Well, it's electric charge. If we describe it as being like something easier to imagine then we're describing it as being something different from itself.
There's another thing called spin. People sometimes try to describe it as whirling and twirling, because that's easy to imagine and the context in which it was first noticed. In fact, it can be linked very closely -- but not identically -- to the idea of rotations using the mathematics of covering groups. However, spin is not about twirling and whirling, so we end up describing it as something other than itself when we take that route. It tends not to end well.
There are two yellow lines in the sodium spectrum, not just one.
Spin is a thing that means splitting the light from a sodium lamp with a prism produces two yellow lines, instead of just one. There's a yellow line for each kind of spin. Spin is the thing that means if you fire a stream of particles into a magnetic field, some will go up and some will go down. It means electron energies are arranged twice as efficiently as you might expect. Like electric charge, spin has noticeable effects. Even if we can't exactly imagine it, it makes sense to talk about it.
That's where the maths comes in handy, of course -- it gives us a way to talk about things like spin, even when we don't have a convenient way to imagine what they 'actually' are.
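As a small taste of what that maths looks like (my own sketch, not something from the post itself): the covering-group link shows up in the fact that a spin-1/2 state comes back to minus itself after one full turn. A rotation by an angle phi about the z axis multiplies the two spin components by e^(-i phi/2) and e^(+i phi/2), so a rotation by 2*pi gives -1 rather than +1.

```python
# Sketch: a spin-1/2 state picks up a minus sign under a full 2*pi rotation --
# the hallmark of SU(2) being the double cover of the rotation group SO(3).
import cmath

def rotate_z(phi):
    """Spin-1/2 rotation about z: the diagonal (e^{-i phi/2}, e^{+i phi/2})."""
    return (cmath.exp(-1j * phi / 2), cmath.exp(1j * phi / 2))

full_turn = rotate_z(2 * cmath.pi)
# Rotating all the way around gives approximately (-1, -1):
# minus the identity, not the identity. Only after two full turns
# does the state return to exactly where it started.
```

No twirling or whirling in sight; just a property of the mathematics that happens to match what electrons actually do.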
Savo 'lass a lalaith.
Wednesday, June 11, 2014
I should be debugging
, but the server's down and sometimes it's good to step off the hamster wheel. I think. Maybe. It's okay to step off the hamster wheel, right? Are we allowed to admit that there is a hamster wheel?
I'm not complaining, mind you. I love my work. Really, laugh-out-loud, love theoretical physics and seeing how ridiculously, beautifully abstract mathematics can describe the real world and how things actually happen. I love C++ debugging somewhat less, but I can accept that it's part of the package. Which isn't to say it's not a bit of a hamster wheel.
On Saturday at a workshop on science communication I told an auditoriumful of people that I was infatuated with Grassmann algebras. That isn't a hamster wheel. It's something to remember and savour. It's a reason to get on the hamster wheel when the wheel needs to be turned, even if I don't seem to be going anywhere.
Grassmann algebras are very neat. See, ordinary numbers commute. That means you get equations like
ab - ba = 0
which is to say
ab = ba.
Five times three is the same as three times five, and for most of the things we want to use maths for, that's awfully convenient. If I switch the length and breadth of a room, I don't want the area to change! But sometimes switching things around does change things. Putting on my shoes and then my socks is not the same as putting on my socks and then my shoes.
It turns out that in particle physics there's a family of particles -- called fermions -- that behave like this. If fermion the first and fermion the second are identical (for instance, they might both be electrons), it still matters which order I put them in. No, that's not intuitive, but it does seem to be the way nature works. If I switch fermion one and fermion two, so that instead I'm looking at fermion two and fermion one, the mathematics I'm using needs to have a minus sign attached. And that's where Grassmann numbers (the things you use in Grassmann algebras) come in. Grassmann numbers don't commute, they anticommute:
ab + ba = 0
which is to say
ab = -ba.
In fact, Grassmann numbers behave just the way fermions seem to. That means that while I might describe the length and breadth of a room using ordinary ("real") numbers, it's more convenient to describe fermions using Grassmann numbers. I have to modify the rules of maths slightly to make sure that they anticommute, but otherwise I can carry on just as before. The kind of number I'm using does most of the work and I don't have to keep accounting for the odd behaviour of fermions. The fact that they do strange things when you swap them around is built in.
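The bookkeeping is simple enough to sketch in a few lines of code (my own toy, with a made-up helper name, not any standard library): represent a product of Grassmann generators by their indices, sort the indices into order, and pick up a minus sign for every swap along the way. A repeated generator gives zero, since ab = -ba forces aa = -aa = 0.

```python
# Toy sketch of Grassmann-number products. A product of generators is
# reduced to (sign, sorted indices); each neighbour swap flips the sign,
# and any repeated generator kills the product entirely.

def grassmann_product(indices):
    """Reduce a product of Grassmann generators to canonical form.

    Returns (sign, sorted_tuple), or (0, ()) if any generator repeats.
    """
    idx = list(indices)
    sign = 1
    # Bubble sort, flipping the sign once per adjacent swap.
    for i in range(len(idx)):
        for j in range(len(idx) - 1 - i):
            if idx[j] > idx[j + 1]:
                idx[j], idx[j + 1] = idx[j + 1], idx[j]
                sign = -sign
    if len(set(idx)) != len(idx):
        return (0, ())  # theta * theta = 0
    return (sign, tuple(idx))

# Swapping two generators attaches a minus sign, just as swapping
# two identical fermions does: theta_2 theta_1 = -theta_1 theta_2.
```

All the strange fermion behaviour lives in that sign flip; everything else proceeds as ordinary algebra.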
I think that's pretty. Just about pretty enough that I might muster the willpower to go and ask C++ why it insists that the solution to my equation is infinity. (The solution is not infinity. Unless I've given it the wrong equation. Or the wrong method for solving the equation. Or I accidentally typed +∞ before printing the answer. Maybe I'll go check that last one.)
-----
Savo 'lass a lalaith.
Monday, March 3, 2014
Patterns
Some correlations without much thought on the mechanisms behind them. Things I've noticed.
Three participants at the Cape Town heats of the FameLab science communication competition prepared their talks for both regional rounds before the day. (The others prepared the second talk during lunch break.)
//
The same three participants were the three sent through to the national stages.
(Did they [we] do better because they [we] were more prepared, is there something that affects both or is it just chance?)
Researchers are evaluated not by what they understand (which is hardly measurable), but by "research output" or publications.
//
Students are told that it's more important to understand the topic than to worry about the grades.
(This is partly to do with what we can measure and partly to do with how we look at things and definitely a question that goes much deeper than what I've written here.)
If I get to bed early, I'm much more capable of being productive in my work the next day.
//
A lot of fun-sounding events are run in the late evening.
//
There's a stereotype about scientists not having social lives.
(Even the most obvious way of linking these doesn't involve social awkwardness. One might argue that it implies it, I suppose.)
Mathematica is probably* the most expensive and widely-used symbolic programming language out there.
//
It also has the best pattern matching capabilities.*
(My supervisor likes to point out that, even so, it's a terrible stand-in for your brain.)
*I've heard so and it sounds plausible, but I haven't looked it up for myself.
Wednesday, February 12, 2014
What exactly counts as open?
I'm currently writing a paper for a journal that accepts submissions only in .doc format. That doesn't sound like a problem – everyone uses Microsoft Office anyway, right? Well, no, I don't. Because MS Office doesn't run under Linux, which I'm using. And even if it did, there are places I'd prefer to throw my resources, given a choice. But aren't there open source alternatives that can produce those files? Well – kind of. LibreOffice will produce a .doc file alright. And if it contains straightforwardly formatted text with the occasional picture, MS Word would handle that file fine. However. I need equations. And while both LibreOffice and MS Word have equation editors, they're not entirely compatible. Nor are they particularly fun to use. Why can't I just use LaTeX?
Well, okay, I know why I can't. It's a general science journal, and unless you're doing fairly mathematical science, the LaTeX learning curve may not seem or be worth the (relative) ease of use. (And, I suppose, not allowing LaTeX forces me to consider if I really need to include that obscure equation in the paper.) LaTeX documents are not, for most people, as easy to read and edit as is a .doc file. It is free, though. Which MS Word is not.
It raises an interesting question about which method is really more "open". Requiring Microsoft formats means I need to buy – or, more realistically, borrow – an expensive piece of software to contribute to the journal. Requiring LaTeX formats forces contributors to pick up a non-trivial skill set before submitting. Neither of these is really entirely open. Perhaps simply offering a choice of format would be a reasonable workaround, although it does still feel like a workaround. Improving the compatibility of LibreOffice (or other open source office suites) with MS Office, while far beyond the reach of the journal, might be a slightly better fix. I think that will happen, but not until long after I've written this paper!
In the meantime, I'll be sulking in the corner because a side effect of using an office suite format is that I can't work in vi.