The University of Waikato - Te Whare Wānanga o Waikato
Faculty of Science and Engineering - Te Mātauranga Pūtaiao me te Pūkaha

August 2010 Archives

I'm back from a fairly brief trip to Sydney, where I spent almost equal amounts of time talking to a collaborator in the School of Physics (Peter Robinson and his colleagues) at the University of Sydney and sitting in traffic jams on buses and taxis (and waiting for delayed trains).  Anyone who thinks Hamilton traffic is bad doesn't know what they're missing...

Anyway, we talked a bit about one of our projects. I shan't of course describe it to you in full detail, because most of you won't be interested (it concerns computer modelling of groups of neurons in the cerebral cortex), but I will describe a little bit of the flavour of this work. (N.B. what I describe is actually closer to the work that has been done in Sydney than to that done in Hamilton, but it's pretty similar.)

A neuron (brain cell) will 'fire' signals to other neurons, at a rate that depends on the signals it receives.  If the neuron receives more signal, it fires more quickly. So, assuming a constant signal in, it will fire at a constant rate. That is one oscillation (an oscillation being simply something that repeats). But a group of neurons can also have other oscillation frequencies. This is group behaviour: the group as a whole might fire a little quicker, then a little slower, then a little quicker, and so on. That is another oscillation.

So we have individual neurons, that have their own firing frequency, that are driven by an oscillating input at another frequency.   What happens?

We do a little bit of this in our undergraduate teaching, but for simpler systems. For example, a car driving over bumps in the road. The suspension system of a car has its own resonance frequency (push down on the bonnet and time how long it takes for the bonnet to rise), and the bumps on the road occur at another frequency. In this case, if you drive over the series of bumps, you'll find that the response of the car 'slaves' to the frequency of the input - i.e. the bumps. The frequency at which you go up and down is the frequency at which the bumps occur, not the natural frequency of the suspension. It's a forced oscillator, which we can solve neatly mathematically.
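The 'slaving' of the car to the bumps is easy to see in a quick numerical experiment. This is a minimal sketch, not research code: a damped oscillator with a natural frequency of 1.2 Hz is driven sinusoidally at 0.5 Hz, and the steady-state response frequency is estimated from zero crossings. All the parameter values here are illustrative.

```python
import math

def drive_response_freq(f_nat=1.2, f_drive=0.5, zeta=0.2,
                        t_max=200.0, dt=0.001):
    """Integrate a damped, driven oscillator
    x'' + 2*zeta*w0*x' + w0^2 * x = sin(wd*t)
    and estimate the steady-state response frequency from
    upward zero crossings (after the transient has died away)."""
    w0 = 2*math.pi*f_nat     # natural angular frequency
    wd = 2*math.pi*f_drive   # driving angular frequency
    x, v, t = 0.0, 0.0, 0.0
    crossings = []
    for _ in range(int(t_max/dt)):
        # semi-implicit Euler step
        a = math.sin(wd*t) - 2*zeta*w0*v - w0*w0*x
        v += a*dt
        x_new = x + v*dt
        # record upward zero crossings in the second half of the run
        if t > t_max/2 and x <= 0.0 < x_new:
            crossings.append(t)
        x = x_new
        t += dt
    # mean period between successive upward crossings
    period = (crossings[-1] - crossings[0]) / (len(crossings) - 1)
    return 1.0/period
```

Run it and the answer comes out at (very nearly) 0.5 Hz, the drive frequency, not the 1.2 Hz natural frequency: the forced oscillator slaves to its input, just as the car does to the bumps.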

So what about the neurons? These are more complicated.  The answer depends on the difference between the two oscillation frequencies. If the two are very different, each neuron will 'ignore' the underlying oscillation in the input, and just fire at its own natural frequency (unlike the car driving over the bumps). But if the two are similar, the neuron will abandon its own preferred natural frequency, and instead take up (nearly) the frequency of the input. In fact, we can get sudden jumps in behaviour: start with the two frequencies similar and slowly increase the input frequency. The neuron starts by following the input and slowly increases its frequency, but then suddenly jumps out of this pattern and reverts to its natural frequency. So it fires at one frequency, or the other, depending on how similar they are.

This complicated behaviour (non-linear) makes it quite a tricky system to study, but a very interesting one. Worth a trip to Sydney for.

My travels continue next week.

Comments (0) | TrackBacks (0)

I'll be flitting off around the globe for much of the mid-semester break (well, away from the university, anyway), so blogging will be a bit light for the next two weeks.

Comments (2) | TrackBacks (0)

I've often talked about how great medical physics is. The MRI scanner, for example, contains some fantastic physics - interaction of atomic nuclei with magnetic fields (which you need quantum mechanics to explain properly) - and is supported by clever mathematics too. And the PET scanner uses anti-matter (specifically anti-electrons from beta plus decay) to help map out your insides. But this technology, and its support staff, don't come cheap.

I was fascinated by Gareth Morgan's article in the NZ Herald on Tuesday, about the cost of healthcare. Why is it that healthcare costs seem to go up and up, even faster than my rates bill? Part of the answer, I think, is technology. There is a whole lot more that can be done for a sick person now than twenty, fifty or two hundred years ago. And the view that people "deserve the best treatment possible" means that it is seen as reasonable to pay these costs. If you want to earn big money with a physics degree there are two choices: first, go into banking and insurance and work the derivatives markets (this one has got a little risky in the last couple of years); second, train as a medical physicist. It's hard to see that there will be an oversupply of them in the next few years.

But is it really reasonable to pay these costs and employ these physicists? (Am I really saying this?) Gareth points out that a huge amount of money is spent on trying to treat sick people, when we are particularly poor at making sure we (I mean the whole population) don't become sick in the first place. Junk food, alcohol, poor quality housing, lack of exercise - most of these things are cheap to fix, but require a major change in mindset to do them. I've often wondered if the health boards here would save money in the long run AND achieve better health outcomes if they spent part of their budgets on insulating people's homes. Of course they won't, because it's not what a health budget is seen as, but it might be a better use of their money.

Finally, a story that is terribly close to my heart. Last April, my father was shovelled through various high-technology tests, including CT and MRI, at I imagine a considerable cost to his insurers and the UK National Health Service, in an attempt to diagnose his pain. The technology, unfortunately, could do no more for him than to tell him he was dying of pancreatic cancer. Two weeks after his diagnosis, he was dead.  In contrast, his last two days in St Catherine's hospice 'achieved' far more - no scans, no operations, no tremendously high technology (though I did think the bed was pretty clever), just some very dedicated nursing care.

Comments (0) | TrackBacks (0)

I was at the NIWA science fair at the Hamilton Gardens yesterday morning, talking to some of the children who had put together displays on their science projects.  I can't say anything specific, not least because the prizes haven't been announced yet, but I will say that, as ever, it is a real privilege to be able to see what these children have been up to and to encourage them further in science (in my case, physics).

Out of three hundred or so exhibits (not all physics, I should say) how does one go about picking the prize winners?  Although there are always difficulties, it is not so difficult as it might sound, because the good science tends to leap out at you.

What do I mean by good science?  A number of things contribute here.

First, is studying something that is of scientific interest.  That usually excludes comparing one brand of X against another brand of X for how Y it is.... you know the kind of thing...which washing powder washes the whitest...which energy-saving light-bulb is the brightest...etc.

Then there is thought behind the method. Is the experiment well controlled?  If you're looking at how the pressure in a football influences how far it will travel when kicked, how are you making sure that it is only the pressure that you are changing? You need to kick with the same force every time, at the same point on the ball. How are you achieving that? What about wind conditions - are your kicks into a headwind one day and a tailwind the next? Temperature and relative humidity of the atmosphere? And so on.

Statistical variations are thought about, and enough trials are done to get decent means (averages). Just doing your experiment once isn't really enough. (I still have to tell my second-year experimental physics class this on a regular basis.)
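The arithmetic behind 'decent means' is simple enough. Here is a small sketch, with made-up kick distances: the standard error of the mean shrinks as the number of trials grows, which is exactly why one trial isn't enough.

```python
import statistics

def summarize_trials(measurements):
    """Mean and standard error of the mean for repeated measurements,
    e.g. how far a football travels on each of several kicks."""
    n = len(measurements)
    mean = statistics.fmean(measurements)
    # standard error of the mean: sample std. deviation / sqrt(n)
    sem = statistics.stdev(measurements) / n**0.5
    return mean, sem

# five kicks at the same ball pressure (invented numbers, in metres)
mean, sem = summarize_trials([31.2, 28.9, 30.4, 29.8, 31.0])
```

Quoting a result as mean plus-or-minus standard error also lets you say whether two sets of trials (say, two ball pressures) are genuinely different or just scattered.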

Then results are presented and discussed well, and appropriate conclusions drawn. When talking with the children, it is usually evident whether they have understood what their experiment means. It's easy to over-estimate the importance of your work. It's unlikely that a football-pressure experiment will result in FIFA changing their regulations, for example, though it might emphasize to you that you should take the effort to pump your ball up to the right pressure before playing in the park with your mates.  Sometimes children say things like "If I were to do this again, I would change XYZ because that would account for ABC....". This shows real thought behind their work and is a sign of a budding scientist at work.

And finally there's the logbook. That's what the scientist uses to write down what he or she is doing as he or she goes along.  (Again, I have to keep telling that to my second years). A good logbook contains thoughts, reflections on work, and oodles of data and graphs, and probably diagrams too, and probably runs over several weeks. What it shows is a record of careful thought, planning, and taking and analysis of data.

So, overall then, a good science project usually speaks for itself, and judging 300 exhibits (not that I had that job this year) isn't so much of a daunting task as it first appears.

Comments (0) | TrackBacks (0)

For the last couple of days, I've been engaged with a student of mine on a computer-modelling problem. Specifically, it's an electromagnetic problem, working out how the electric field behaves between an array of electrodes. It's a useful thing to do, because the outputs of the model will help guide future experimental work, and help us to interpret the results.

Computer models are widely used in science, particularly in physics. I've used lots in my time as a researcher, and they fall into many different varieties (my classification, based on experience).

First there are the computer models that use well-known physics that is described by known equations. My student's work is one of these. Electromagnetism is described by Maxwell's equations. There is no dispute about this (unless you get into quantum effects, etc.). If your computer program is solving Maxwell's equations, it will work out your electric fields reliably (so long as you've specified your problem correctly). There's not a great deal of scope for things going wrong, though that shouldn't mean that you can just take your result for granted as correct.
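To give a flavour of this first kind of model (this is not my student's actual code, and the electrode geometry here is invented purely for illustration): in a charge-free region, Maxwell's equations reduce to Laplace's equation for the electric potential, which a simple Jacobi relaxation scheme will solve on a grid.

```python
import numpy as np

def solve_potential(n=61, iters=5000):
    """Jacobi relaxation for Laplace's equation on an n-by-n grid:
    two strip electrodes held at +1 V and -1 V inside a grounded box.
    (Hypothetical geometry, for illustration only.)"""
    V = np.zeros((n, n))
    fixed = np.zeros((n, n), dtype=bool)
    # walls of the box held at 0 V
    fixed[0, :] = fixed[-1, :] = fixed[:, 0] = fixed[:, -1] = True
    # two electrode strips at fixed potential
    V[n//3, n//4:3*n//4] = 1.0
    V[2*n//3, n//4:3*n//4] = -1.0
    fixed[n//3, n//4:3*n//4] = True
    fixed[2*n//3, n//4:3*n//4] = True
    for _ in range(iters):
        # each free point relaxes towards the average of its neighbours
        Vn = 0.25*(np.roll(V, 1, 0) + np.roll(V, -1, 0) +
                   np.roll(V, 1, 1) + np.roll(V, -1, 1))
        V = np.where(fixed, V, Vn)
    # the electric field is E = -grad(V)
    Ey, Ex = np.gradient(-V)
    return V, Ex, Ey
```

The point is the one made above: given the geometry and the boundary potentials, the field follows with no physics left in dispute; the remaining checks are numerical (grid resolution, convergence), not conceptual.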

Then there are the computer models that use well-known physics, but in problems that are really quite hard to specify. Fluid flow and the movement of particles in fluids fall into this category. I've done a bit of this kind of modelling too - for example looking at the movement of airborne bacteria in a food-production building - with a view to identifying high-risk regions of the building where bacteria might accumulate. Here the equations are fairly well established (e.g. the Navier-Stokes equations for fluid flow) but some parts of them are uncertain. Exactly how does a bacterium respond to moving air? A tricky one - not least because bacteria have a variety of sizes, shapes and textures, and this can influence how they move. There are various sub-models around of how to do this, but there is room for debate.

Moreover, in this kind of model you can have problems specifying the problem.  Do you have to model every tiny piece of machinery (geometry) in the room? Sometimes a small change in geometry can lead to a large change in the behaviour of fluid flow. And the reality is, on a production line, things change all the time, so the poor modeller never knows what his problem really is anyway.

But it can get worse for the modeller. There are those models (such as the models I work with for looking at the electrical currents in the brain) where the equations themselves aren't robustly established. Here the modeller is, in a sense, having to make up his own equations, drawing from what data is known about the brain (and there is a lot). This kind of modelling has a huge uncertainty associated with it, as it is loaded with assumptions. Get your underlying equations wrong, and you might end up with predictions that are just utterly disconnected from reality. A modeller can ask the question 'how close do my equations have to be to reality?' The answer to that one is often 'it depends on what you are going to use the model for'.  Sometimes we need really accurate physical models, based on painstaking experiments, and sometimes we don't. That will control how much effort goes into developing them.

Overall, then, 'computer modelling' is a many-faceted beast which hides a multitude of skills. I would say it is an area of science in itself.

Comments (0) | TrackBacks (0)

I don't usually rant too much on my blog - it's not my style - but the news yesterday about a legal challenge to NIWA's temperature data is just too provoking. Yes, I know it has been well-blogged about already by other Scibloggers, but for good reason: some sensible scientific comment deserves to be made. So here's mine, as a scientist (a physicist), not as a climate specialist. How on earth is taking an organisation to court over its scientific data going to achieve anything positive (other than making sure lawyers don't get made redundant)?  Scientific work is normally scrutinised through peer review - where other scientists look at it and comment. Peer review has its problems, for sure, but it is the best system we have. How can a judge possibly be in a better position than the entire scientific community to say whether a collection of scientific data is valid or not? It is utterly crazy, and any judge who knows anything about science (I mean science as a whole, not just climate change) should throw this case out as being beyond what the law is there to cover. There are better things for my taxes to be spent on; moreover, there are hard political decisions to make over climate change, and the governments of every country should be encouraged to make them appropriately in the face of science, not in the face of legal red tape.

Comments (0) | TrackBacks (0)

Yesterday I went into our tearoom and there was a post-it note on the high-tech tap saying it wasn't working. Why am I not surprised?  Maybe we'll have to replace it with an even smarter one...Have a good weekend, wherever you are...

Comments (0) | TrackBacks (0)

 No, not ice on car windows this time, but ice on aeroplane windows. John Fouhy has sent this question to me, and I don't know the answer. I have a couple of ideas, but what do you think? Below I reproduce John's question in full, along with the picture.  Airpoints Gold Card holders, help us out here...


[John's photo: a small annular patch of ice centred on a metal pin in the window]

 I recently flew into Auckland from Singapore. It was a night flight, so we only opened the windows as we prepared to descend. When we did, I looked outside and noticed ice on the outside window. Well, fair enough, it's cold outside. But if you look at the photo I attached, there is only a small patch of ice. It's in the shape of an annulus, centred on a small metal pin. The pin appears to be attached to neither the inner nor the outer surface of the window (I imagine aeroplane windows have many layers).

It didn't take long for the ice to melt in the morning sun. Drops of water remained on the window for a while after, in the same place. So it is possible (from the absence of water elsewhere) that ice did not form elsewhere.   [Marcus - I've seen this before too - is it something to do with the manufacture process?]

As far as I could see, by craning my neck, the next window up from me had the same feature.

So my questions, if you have time: Why would ice form only near this metal pin? Why would ice _not_ form even closer to the pin?

Comments (1) | TrackBacks (0)

Saturday was one of those days that the Waikato winter is famous for. Cold and damp - by damp I mean humid as well as raining - in fact the kind of weather that reminds you that you are living in an area that used to be one massive swamp. The sort of dampness that makes you feel it is raining inside the house as well as outside and makes you wonder whether you'd be better off weather-wise living in Wellington. Just yukky, soppy, dampness, everywhere.

At about 2pm I'd had enough of it and drove into town to a well-known homeware store (which, surprise, surprise, was having a sale) and bought a dehumidifier. It spent the rest of the day sucking vast quantities of water out of the internal atmosphere, while the heat pumps heated it.

Dehumidifiers are pretty simple in theory. The idea is that since cold air holds less moisture than warm air, all you have to do is to create a cold surface, pass the air over it, and the moisture will condense out. Just collect the moisture and, hey presto, you have dry air left.  There's not a great deal of difference between a dehumidifier and a fridge or freezer - in fact, you could use your freezer as a crude dehumidifier if you left the door open. (Try this on a hot summer's day, like we did once accidentally when we went away for a week, and see how much water has been deposited - in this case as ice.)

Dehumidifiers will also heat the air (not cool it). This is because, to create the cold surface, they need to shift heat elsewhere, and this process, by the second law of thermodynamics, is going to generate more heat. Also, when moisture condenses out of air, it releases heat. (Think of the reverse process: evaporation requires heat.) Feel the air leaving the dehumidifier - it should feel somewhat warm.

However, dehumidifiers in some parts of NZ (of which Waikato is one) have a bit of a problem. They are mostly designed to work in places like Singapore, where it is hot and humid, not in badly insulated NZ homes in winter, where it is cold and humid. That means there's less scope for cooling down the already cold air, and so less moisture can be removed.  It helps to heat the air in the first place (i.e. dehumidifiers are likely to do better when your house is warm). Also, since dry air is easier to heat than moist air, it is easier to heat your house when the air is dry, so heating and dehumidifying in tandem both help each other.
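The 'cold air holds less moisture' point is easy to quantify with the Magnus approximation for saturation vapour pressure (an empirical fit, so the numbers are approximate, and the two temperatures below are just illustrative of 'Singapore' versus 'a cold Waikato house').

```python
import math

def saturation_vapour_density(T_celsius):
    """Approximate mass of water vapour (g per cubic metre) that
    saturates air at a given temperature, using the Magnus
    approximation for saturation vapour pressure and the
    ideal gas law for water vapour."""
    # Magnus approximation: saturation vapour pressure in pascals
    e_s = 610.94 * math.exp(17.625*T_celsius / (T_celsius + 243.04))
    R_v = 461.5                # specific gas constant of water vapour, J/(kg K)
    T = T_celsius + 273.15     # absolute temperature
    return 1000.0 * e_s / (R_v * T)   # convert kg/m^3 to g/m^3

warm = saturation_vapour_density(30.0)   # roughly 30 g/m^3
cold = saturation_vapour_density(10.0)   # roughly 9 g/m^3
```

Saturated air at 30 degrees carries roughly three times the water of saturated air at 10 degrees, so a machine that works by chilling already-cold air has far less moisture available to extract - which is the Waikato winter problem in a nutshell.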

Saturday afternoon certainly felt much more pleasant than Saturday morning.




Comments (2) | TrackBacks (0)

I gave a talk to the Junior Naturalists in Hamilton last Friday. It had some similarity to the talks I gave in June to the Osborne Days (year 12 and 13 school students), but I needed to change a few things because 1. The audience was younger, and 2. I wasn't prepared to cart voluminous apparatus across from the University to Hamilton Gardens and back on a dark night.

The mobile phone in foil experiment works pretty well, and is simpler to do than the mobile phone in water experiment, and this time I extended it to a radio in aluminium foil. (It shouldn't receive anything - the foil reflects all the waves and the radio goes quiet.) Now, I tried to test this a couple of evenings previously at home (to avoid embarrassing situations where your experiment doesn't work). But we'd run out of foil at home, and I had to improvise by sticking my pocket radio in a saucepan and putting the lid on.  This is when I got a bit of a startling result - on FM, the radio went silent, as I expected. But on AM, it still continued to pick up stations while completely surrounded by metal.
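One number worth calculating here is the electromagnetic skin depth, which tells you how far a wave penetrates into a conductor. A quick sketch (the two frequencies below are just representative values for the AM and FM broadcast bands):

```python
import math

MU0 = 4*math.pi*1e-7   # permeability of free space, H/m

def skin_depth(resistivity, frequency):
    """Skin depth of a good conductor: delta = sqrt(2*rho / (mu0*omega))."""
    omega = 2*math.pi*frequency
    return math.sqrt(2*resistivity / (MU0*omega))

rho_al = 2.65e-8                  # resistivity of aluminium, ohm metres
d_am = skin_depth(rho_al, 1e6)    # ~1 MHz (AM band): about 80 micrometres
d_fm = skin_depth(rho_al, 1e8)    # ~100 MHz (FM band): a few micrometres
```

At AM frequencies the skin depth in aluminium comes out at around 80 micrometres, several times the thickness of kitchen foil, while at FM it is only a few micrometres. That is at least suggestive of why the lower-frequency AM signal is so much harder to shield, though gaps and seams in the enclosure (like the saucepan lid) matter too.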

Comments (0) | TrackBacks (0)

I went to a very interesting seminar this morning. Phil Race, from the UK, was presenting about making assessments better in tertiary teaching. There was a lot in his talk (you can download it and other information from ) - I'll just summarise some of the points that are most interesting to me.

1. Assessment started going downhill when, in 1791, the University of Cambridge introduced the first written exam. (Before that, it was purely oral).  Not sure that this is ever likely to change - but I can certainly say that in my experience students seem to appreciate feedback a lot more when it is given in person.

2. Don't put a mark or grade on a student's assignment when you return it to them. The student will become focused on the grade, to the point of ignoring all your written feedback.

3. Instead, let them work out what their grade should be, based on the feedback you give and how their work compares to that of their peers. I tried this out very briefly this afternoon in a lab class. I normally mark student lab reports by spending a few minutes the following week with the student and going through their report together (see point 1). Today I asked my poor unsuspecting students what mark they reckoned they should get.   All but one was spot-on - their assessment was the same as mine. The other one was harsh on himself - I thought his work was of better quality than he did, and I was able to explain why.

4. Never ask a student 'Do you understand?' This is likely to trigger the following train of thought:

What is it he wants me to understand? What if I don't understand it? Will he think I'm stupid? Will my friends think I'm stupid? Will he ask me more awkward questions? How much do I have to understand? Is it a hint that this will be in the exam? etc. etc.

So the student answers, 'Hmmm... I'm not sure', which gets no-one anywhere.

And 5. There is so much literature about what works and doesn't work with assessment that there shouldn't be any excuse for carrying on with the same methods that we know aren't much good. Just go and do what works.   As the Oracle of Delphi is supposed to have said "You know what the problem is... you know what the solution is.... now go and do it"

Comments (0) | TrackBacks (0)

I feel it is about time I commented on the high-technology tap that is in our Faculty tearoom. It was put in several months ago during refurbishment. It's certainly an impressive-looking tap. It has switches for hot and cold water, which you can flip up or down to turn the water on (the only difference I can find is that if you flip one down, it will spring back, whereas if you flip it up, it won't), and blue and red LEDs, which I think are there to tell you whether the cold water and hot water are at their appropriate temperatures.  But the most exciting thing about it is that it keeps you on your toes, because occasionally (well, fairly frequently) it won't do quite what you'd expect.

You put the coffee granules in your mug, hold it under the tap, flick the 'hot' switch, and, as if by magic, the tap opens up full bore on the nicely refrigerated cold water. Why? I have no idea. Monday morning it was refusing to give me cold water at all. It didn't matter whether the switch was flicked up or down, nothing would come out. Until I gave up and walked away, when it responded by switching on, despite the lever being in the 'off' position.  It really does have a mind of its own.

Comments (0) | TrackBacks (0)

Those who saw last night's report on TV One about the lithium reserves in Bolivia might be forgiven for thinking that this is a magic new energy source that the Bolivian president is sitting on. Describing it as 'the new oil' is somewhat misleading.

The application at hand is of course lithium ion batteries, which will be well suited to electric cars. (Though note that it is not the only technology that is possible here - don't discount supercapacitors, which are growing ever smaller.) But a lithium ion battery is not a source of energy as such - rather it is a store of energy.  You would have to plug in your electric car, which charges the batteries (in other words, stores the energy that you have taken from the national electrical grid), and then this energy is converted to the kinetic energy of your car as you drive. Now, overall this would be a reasonably efficient process, because you don't waste energy idling your engine in traffic as with a petrol engine, and your electric car doesn't pump out nasty gases into the atmosphere (not directly, anyway), thus keeping the city cleaner. But you still need to use energy. And where does this energy come from?
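A rough calculation underlines the 'store, not source' point. All the figures below (battery pack size, driving range, charger efficiency) are illustrative round numbers of my own, not anything from the TV report:

```python
def grid_energy_per_charge_kwh(battery_kwh=54.0, charger_efficiency=0.9):
    """Energy drawn from the national grid to fill the battery once,
    allowing for charging losses. Both figures are illustrative
    round numbers, not measured values."""
    return battery_kwh / charger_efficiency

def grid_energy_per_100km_kwh(battery_kwh=54.0, range_km=300.0,
                              charger_efficiency=0.9):
    """Grid energy per 100 km of driving, for the assumed pack and range."""
    per_charge = grid_energy_per_charge_kwh(battery_kwh, charger_efficiency)
    return per_charge / range_km * 100.0
```

Under these assumptions, a 54 kWh pack giving 300 km of range draws about 60 kWh from the grid per charge, or about 20 kWh per 100 km. Every one of those kilowatt-hours still has to be generated somewhere; the lithium just carries it around.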

Comments (0) | TrackBacks (0)