The University of Waikato - Te Whare Wānanga o Waikato
Faculty of Science and Engineering - Te Mātauranga Pūtaiao me te Pūkaha

A while ago now (6 years ago, in fact! How time flies when you're having fun), I wrote a piece about some fairly wild claims made about Neandertals. Rather surprisingly, this post (here, & over on sciblogs.co.nz, where it's mirrored) continues to attract occasional comments from those who firmly believe in the idea that Neandertals were cannibalistic, brutish savages rather than our very close cousins, an idea promoted by the author Danny Vendramini. So I thought I'd re-publish it now, with some edits to update things.

After being asked about the 'killer Neandertal' claims after a Schol Bio workshop, I quickly found a website promoting a book by Danny Vendramini. Called Them and Us: how Neanderthal predation created modern humans, the book supposedly provides "new archaeological and genetic evidence to show [Neandertals] weren't docile omnivores, but savage, cannibalistic carnivores..." - the 'Neanderthal Predation theory'. (I noticed that the author uses the spelling 'Neanderthal' throughout - a bit surprising as the norm these days is to use 'Neandertal', after the correct German spelling for the river valley where the type specimen was found.) Given the lack of any real evidence, and of support for this from the wider scientific community, this position would be better described as an hypothesis...

The website goes on to claim that

Eurasian Neanderthals hunted, killed and cannibalised early humans for 50,000 years in an area of the Middle East known as the Mediterranean Levant. Because the two species were sexually compatible, Eurasian Neanderthals also abducted and raped human females.... this prolonged period of cannibalistic and sexual predation began about 100,000 years ago and that by 50,000 years ago, the human population in the Levant was reduced to as few as 50 individuals.

The death toll from Neanderthal predation generated the selection pressure that transformed the tiny survivor population of early humans into modern humans. This Levantine group became the founding population of all humans living today.

These claims are accompanied by illustrations that make Neandertals appear more akin to gorillas than to modern humans, which is 'interesting' to say the least, given the information we now have on the genomes of sapiens & neandertalensis. We're told that the Neandertal Predation 'theory' "argues that, like modern nocturnal predators, Neanderthals had slit-shaped pupils to protect them from snow blindness" (thus conflating two ideas - not all nocturnal predators live in snow-covered lands - on the basis of zero evidence, since eyeballs don't fossilise). And there's also the statement that Neandertals "had thick body fur and flat primate faces to protect them against the lethal cold."

Now, that last one is just ridiculous. As far as I know there have been no published findings of Neandertal fossils accompanied by evidence of thick body fur. On the other hand, there is tantalising evidence that they may have had the technology to make sewn garments, thus reducing any selection pressure favouring hirsuteness. In addition, Europe was definitely not in a state of constant glaciation during the few hundred thousand years that Neandertals lived there. During interglacial periods temperatures were fairly similar to what they are today - hardly conditions where a thick furry pelt would be selected for (let alone those slit-shaped pupils...).

As for the 'flat primate faces' - if you have a look at a gorilla skull you'll see that the nasal opening is flush with the surface of the facial bones: gorillas do indeed have flat faces & no protruding nose. But a Neandertal skull, like that of a modern human, does have projecting nasal bones & so, by extension, a nose that juts out from the face. In fact, the whole central region of a Neandertal face projects further forward than ours, so it's hard to see where Vendramini gets the idea of a 'flat' face from. He does provide an image of a Neandertal skull, superimposed onto a chimpanzee profile, & claims that the 'perfect' fit is evidence that neandertalensis "more closely resembled non-human primates than a modern humans". What's missing is any recognition that the skull is not in its 'life' position but presented at an angle that conveniently fits the point of view being espoused. If Neandertals really did hold their heads at this angle their posture would be distinctly odd, to say the least. (Similar techniques were used by some illustrators in the 1800s to support the idea that Africans were closer to the apes than to Europeans.)

And the claims of rape and cannibalism are fairly extraordinary. As the late Carl Sagan said, extraordinary claims require extraordinary evidence. So let's go back to some of those statements. How about the supposedly much-diminished group of Levantine humans becoming "the founding population of all humans living today"? How, exactly, does this fit with the fact that the sapiens populations of Africa were not exposed to supposed Neandertal predation? Or with the colonisation of Australia by Homo sapiens around 60-70,000 years ago?

Or the idea of frequent interspecies rape, of sapiens by neandertalensis? By the way, if all this - the brutish images & tales of rape - isn't intended to demonise Neandertals, then I'm not sure what would. Frankly it smacks of the way this species was portrayed in the years immediately following its discovery, before palaeoanthropologists began to expose the details of its life - for example, a reconstruction by Frantisek Kupka, based on work by Marcellin Boule. Something of a dehumanising stereotype, in other words.

By the way, there's an interesting paper by Julia Drell (2000: Neandertals: a history of interpretation) that looks at how portrayals of Neandertal have changed over time, as more evidence has become available - and also as societal attitudes have changed. (NB this may well not be open-access.) Drell also notes that suggestions of cannibalism by Neandertals aren't new, first appearing in the 1860s. She cites an earlier author as saying that "there is no more universally common way of distancing oneself from other people than to call them cannibals."

In fact, there's not a lot of evidence of cannibalism in Neandertals, and what we have - published recently in the journal Science - is evidence of Neandertals eating other Neandertals, not Homo sapiens, for the simple reason that our own species wasn't in that part of the world at the time the eating was done. And that's over the total span of their existence. (I do wonder why they'd turn to cannibalism anyway, given that, going by the butchered remains associated with neandertalensis living sites, they were extremely successful hunters of large game.) There is no published evidence that supports the contention that Neandertals ever ate non-Neandertal hominins, let alone on the scale that Vendramini suggests. On the other hand, there is evidence of Neolithic sapiens eating each other.

Nor is there evidence of frequent interspecies rape in the gene pool of modern humans. Back in 2010, Green et al. announced the sequencing of the Neandertal genome, and the results of a comparison of this and the sapiens genome. Since then the amount of data available on the Neandertal genome has increased enormously. Differences in mitochondrial DNA (mtDNA) suggest the two species diverged 550,000-690,000 years ago. However, comparisons of nuclear DNA do show some introgression, so there clearly was a certain amount of interbreeding going on. (The data did not support the idea that all modern humans are descended from a remnant human population in the Levant, as Them and Us would have it; Neandertal genes are notably absent from African populations. Nor does it support the idea of Neandertal predation, despite claims to the contrary on the book's website.)
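For readers curious about where divergence dates like these come from, here's a toy molecular-clock calculation. The logic is simply that both lineages accumulate mutations independently after the split, so the observed divergence is spread across two branches. Note that the substitution rate and divergence figures below are invented round numbers for illustration only - they are not the values used in the actual Neandertal studies.

```python
def divergence_time(differences_per_site: float, rate_per_site_per_year: float) -> float:
    """Estimate time since two lineages split.

    Each lineage accumulates substitutions independently after the split,
    so the observed divergence d is spread over two branches: t = d / (2 * mu).
    """
    return differences_per_site / (2 * rate_per_site_per_year)

# Hypothetical numbers: 1.5% sequence divergence and a substitution rate
# of 1.25e-8 per site per year (both invented for illustration) give:
t = divergence_time(0.015, 1.25e-8)
print(f"{t:,.0f} years")  # 600,000 years - within the quoted 550,000-690,000 range
```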

The Them and Us website also provides a link to a paper, Neanderthal predation and the bottleneck speciation of modern humans, for the 'academically minded'. Strangely for an academic paper, the pdf contains no publication details (journal name, volume, & so on) & a Google Scholar search doesn't throw up any published papers with that name. So it's a fair bet that this has not been subject to the normal pre-publication process of peer review - something I would expect for an hypothesis that's supposed to turn our understanding of human evolution on its head...

J.R.R. Drell (2000) Neanderthals: a history of interpretation. Oxford Journal of Archaeology 19(1): 1-24
 


I've just come across a most excellent article by the Genetic Literacy Project. In it, Nicholas Staropoli notes that a proportion of the human genome actually has viral origins.

This might sound a bit strange - after all, we tend to think of viruses as our enemies (smallpox, measles, and the human papilloma virus come to mind). But, as Staropoli notes, there are a lot of what are called 'endogenous retroviruses' (ERVs) - or their remains - tucked away in our genome. (An ERV has the ability to write its own genes into the host's DNA.) And he links to a study that draws this conclusion: 

We conservatively estimate that viruses have driven close to 30% of all adaptive amino acid changes in the part of the human proteome conserved within mammals. Our results suggest that viruses are one of the most dominant drivers of evolutionary change across mammalian and human proteomes.

Carl Zimmer writes about one such example in his blog The Loom: it seems that a gene that's crucial in the development of the placenta (that intimate connection between a foetus and its mother) is viral in origin. In fact, one gene encoding the protein syncytin is found in primates - but carnivores have a quite different form of the gene, while rabbits have a different form again, and mice yet another!

This is a very complex evolutionary story indeed. And so you could do much worse than read the two articles, by Staropoli and Zimmer, in their entirety.

 


I've written before about the so-called 'miracle mineral solution', aka MMS (here, for example), but I see that it's hit the news again recently.

MMS is essentially bleach1, but one Jim Humble has made quite a little empire (and a 'church') out of selling the stuff, and has previously claimed that it's a preventative & cure-all for just about anything that might ail you - including malaria, cancer, and HIV. (It isn't, and it won't: the proposed mode of action is preposterous.) While Humble has recently distanced himself from those claims, it appears that one of his church's archbishops continues to promote them. The fevers and gastro-intestinal upsets associated with ingesting a bleach solution? Simply a sign that the 'treatment' is doing its job. </snark>

Now, if adults choose to 'treat' themselves thusly, then that's their decision. However, MMS has also been promoted in some quarters as a 'cure' for autism, with parents advised to administer MMS to their autistic children multiple times over a period of 72 hours - alongside thrice-weekly enemas2 of the stuff. The proponents of this activity claim that autism is due to parasite infection, and that the evidence lies in the dead parasites that can be seen in the poor kids' bowel movements. I say 'poor kids', because those ropey strings of membranous stuff are actually the mucosal lining of the intestines themselves.

I can think of two words that apply here; 'treatment' & 'cure' are not those words.

Over at Respectful Insolence, Orac has once more picked up on this story following a news segment about Humble & his so-called church, and handled it in his inimitable way.

1 MMS is a solution of 28% sodium chlorite in distilled water. When this is added to water containing citric acid (or some other acid), it generates chlorine dioxide, ie a bleach.

2 500mL water + 10-15 drops of MMS, administered and left in the colon for 20-30 minutes


A couple of days ago I did a spot of live radio with the good folks at 95bFM. It was great fun. One of the topics was dog evolution, which I've already written about here; another was the recent publications on human dispersal, covered nicely over on sciblogs.co.nz

The third was a brief discussion of claims made in an article on stuff, in relation to organic farming & its use of pesticides & insecticides. More specifically, the writer (Dr Libby Weaver) said this (my emphasis):

Organic produce is labelled "certified organic" when it has been grown, raised, harvested and packaged without the use of pesticides, insecticides, growth hormones and antibiotics.

Now, that phrase I've emphasised is simply incorrect, and extremely easy to check (as was pointed out fairly emphatically by several commenters on the original article). It would have been correct had the statement included something like 'synthetic' pesticides & insecticides, because organic farming certainly uses chemicals to control pests. Copper sulphate, for example, is widely used as a fungicide, while rotenone & pyrethrum are common insecticides.

There's an interesting post on organic production here. It comments, rightly, that many of the chemicals used in organic production in the US are quite toxic - and then goes on to point out that this need not be a problem if they are used correctly because it's the dose that makes the poison - something that is true for both organic and conventional farming.

But I snuck 'biodynamic' into the title of this post, & here's why. In that same stuff article we find this statement: 

it is so important to support organic, biodynamic and sustainable agriculture.

I doubt anyone will quibble over the need for farming practices - whether organic or conventional - to also be sustainable.

But 'biodynamic'? Here's an NZ website about biodynamics; it did make me wonder how familiar the OP writer was with its contents. For instance, biodynamic practice appears to include the belief that the stars & planets have an influence on crop production - albeit with the disclaimer that this doesn't involve astrology. It would be very interesting to see the scientific data that demonstrates an actual positive impact from the stars on plant and animal health & production. (Note: the actual stars - not regular seasonal changes.) There's some interesting commentary on biodynamics here. And then, of course, there's the implausibility of possum peppering...

 

Incidentally, I was interested to discover that the Bt toxin, produced by a common soil bacterium (Bacillus thuringiensis), has been available as a spray-on insecticide for organic farming, in some jurisdictions, for at least 50 years, and is used in New Zealand. Arguments in favour of this, and against the use of GM crops that express the same toxin, include the suggestion that the latter could lead to widespread resistance to Bt toxin. However, the use of targeted sprays is also an agent of natural selection, & could eventually have the same result.


So, I own a pocket wolf.

...

...

Oh, OK, I own a little black mini-poodle. But, like all dogs, he has the same number of chromosomes as a wolf!

There've been several articles posted recently about the evolution of domestic dogs. While we've tended to think that domestication didn't begin until humans began to settle down & develop agriculture, DNA analysis suggests that wolves and humans may have begun a relationship up to 100,000 years ago. And a paper published in Science back in June presents evidence that there were two domestication events, one in Asia and one in either the Near East or Europe. There's a nice visualisation and explanation of the doggy family tree here.

A few weeks ago, I was discussing domestication with RadioLive's Graeme Hill, and one of the questions he asked was, why do we have so many different body forms in domestic dogs, and so little variation in form in cats? Was it because the dog genome is more 'plastic' & susceptible to change? My answer was that I suspected it had more to do with the length of the species' association with humans. Cats are sometimes described as 'semi-domesticated', and our shared history may go back just 10,000 years (with the possibility, again, of at least two separate domestication events). Whereas 100,000 years gives a lot of time for selective breeding by humans to produce all those different dog breeds.

Which takes us to bulldogs - the subject of a Scholarship Biology question in 2014. Bulldogs were originally bred to drive cattle, and had the strength and the ferocity (and presumably also a high pain threshold!) to subdue an animal by grabbing its muzzle and hanging on. They were subsequently used in bull-baiting, a cruel 'sport' that the UK banned in 1835.

Instead, the dogs were then bred for show. While their physical characteristics remained pretty much the same (short & squat, very muscular, the familiar very short face/muzzle with deeply folded skin), by 1860 they had already begun to develop, as a result of selective breeding, their now-familiar gentle, non-aggressive temperament.

Unfortunately, selection for that very distinctive body form has brought with it a whole host of inherited disorders. You've probably met a bulldog with that snorfling, stertorous breathing - this respiratory problem is called brachycephalic syndrome, due to the very short face & nasal passage. Other heritable disorders include:

  • hip dysplasia (seen too in other breeds, such as labradors), where the hip joint can partially dislocate - this one is due to polygenic inheritance ie there are a number of different genes involved. 
  • a hole between the two ventricles of the heart. This is called ventricular septal defect (VSD), and an animal must inherit 2 copies of a recessive allele to express this disorder. It's autosomal, which means that male and female bulldogs are equally likely to express it.
  • cryptorchidism - one or both testes remains up in the body cavity instead of descending to the scrotum. This one is an autosomal dominant trait - only one copy of the allele is required. 
  • and - with all those wrinkles - dermatitis, also considered to be an autosomal dominant trait. The dermatitis often leads to bacterial infections. 

Poor bulldogs :( 

The actual exam question gave this background information & asked students to discuss two things: how humans may have manipulated the evolution of bulldogs from wolves; and how further selective breeding could be used to try to eliminate EACH of the named disorders AND evaluate how effective this might be. For students intending to sit these exams: remember that when you're answering these questions you need to provide both 'evidential' statements and justifications for them. And to do that, you'll need to integrate the resource materials provided in the exam paper with your own biological knowledge. 

The first part's pretty straightforward: you might consider, for example, why humans would want to select for non-aggressive wolves (eg for help in protecting a human or group of humans from other predators). Or what about the founder effect, which would come into play because of that small population of proto-dogs? The small sample of the wolf gene pool found in those first 'dogs' could mean that particular alleles were simply lost in the dog population, while others could become much more common.

You'd then want to address the sort of things a breeder would focus on to get from generalised doggy form to the highly specialised bulldog: insensitivity to pain; the short, powerful, upwards-facing jaw; the squat, heavily-muscled body. Don't forget the explanation: that the physical features would allow the dog to drive or subdue much larger animals, and that selection for those traits, if they had a genetic component, would see those particular alleles increase in frequency in the bulldog gene pool.

And of course, once bull-baiting was banned, selection for gentleness/docility came into play. Because that called for further inbreeding in the existing bulldog population, it would likely result in a higher frequency of any existing harmful alleles as well.
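The allele-frequency argument in the last few paragraphs can be sketched with the standard deterministic one-locus selection model from population-genetics textbooks. Everything here is invented for illustration - the 'docility' allele, its starting frequency, and the fitness values are not measurements from actual dog breeding:

```python
def next_generation(p: float, w_AA: float, w_Aa: float, w_aa: float) -> float:
    """Frequency of allele A after one generation of selection.

    Standard one-locus model: weight each genotype's Hardy-Weinberg
    frequency by its fitness, then ask what fraction of the surviving
    allele copies are A.
    """
    q = 1 - p
    mean_w = p * p * w_AA + 2 * p * q * w_Aa + q * q * w_aa
    return (p * p * w_AA + p * q * w_Aa) / mean_w

# Suppose breeders favour dogs carrying a hypothetical 'docility' allele A,
# so aa (aggressive) dogs are rarely bred from and Aa dogs somewhat less often:
p = 0.05  # A starts rare in the founding population
for generation in range(50):
    p = next_generation(p, w_AA=1.0, w_Aa=0.9, w_aa=0.5)
print(round(p, 3))  # A has risen close to fixation after 50 generations
```

The same sketch shows why the rare-allele losses mentioned above matter: whether selection can act at all depends on the allele still being present in that small founding sample.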

The second part of the question tests students' understanding of concepts around inheritance. In some cases it's probably possible to remove the deleterious alleles from the bulldog gene pool - but as a result the dogs would diverge from what's currently viewed as the breed standard. For example, selective breeding for a longer, less-wrinkled face could reduce the frequency of brachycephalic syndrome and dermatitis - but the resulting animals would be much less bulldog-like!

The same approach is less likely to be effective for hip dysplasia, however, because the disorder is polygenic: it involves multiple genes rather than a single gene locus.

But selective breeding could help with VSD & cryptorchidism. For example, a dog that doesn't express VSD is either homozygous dominant (with 2 copies of the normal allele) or heterozygous. So at the population level, breeding heterozygous individuals will on average produce 25% of pups with VSD, with the rest not expressing the disorder. Using a test-cross would allow you to breed only from parents homozygous for the normal allele, but it would take more time.
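That 25% figure is just the Punnett square for a carrier-by-carrier cross, which is easy to check by brute force. The allele labels here ('V' for the normal allele, 'v' for the recessive VSD allele) are my own notation for illustration:

```python
from itertools import product
from collections import Counter

def cross(parent1: str, parent2: str) -> Counter:
    """Enumerate the Punnett square: one allele from each parent,
    with heterozygotes normalised so 'vV' and 'Vv' count together."""
    offspring = (''.join(sorted(a + b)) for a, b in product(parent1, parent2))
    return Counter(offspring)

# Cross two non-expressing carriers (Vv x Vv); only vv pups express VSD.
pups = cross("Vv", "Vv")
print(pups['vv'], "of", sum(pups.values()))        # 1 of 4 - ie 25% affected
print(pups['Vv'] + pups['VV'], "unaffected, of which", pups['Vv'], "are carriers")
```

The second line also shows why the problem persists: two-thirds of the unaffected pups from such a cross are themselves carriers.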

And with cryptorchidism, you'd avoid breeding from males that had the disorder (and these would likely have very poor fertility anyway), and from females whose sons had undescended testes (because these females would be 'carriers' of the allele: while cryptorchidism is a dominant trait, you're only going to see it expressed in males).

Of course, you could also try out-breeding with other dog breeds - but then, the resultant pups may well not conform to the bulldog 'standard'. 

I must say, it does bother me that adhering to a breed standard - a human construct - can perpetuate known health problems in a breed such as this.


The National government is proposing a number of amendments to the NZ Education Act. One, which has already received quite a lot of press, sounds rather like a return to bulk funding under another name. But the latest one to hit the news is more like an untried social experiment with the potential for a lot of brown stuff to hit the fan.

And what is this proposal? They've certainly come up with a catchy title: COOLs - Communities of On-line Learning. The NZ Herald covered yesterday's announcement by the Minister of Education, Hekia Parata, with its reporter stating that

any registered school, tertiary provider such as a polytechnic or an approved body corporate be able to apply to be a "community of online learning" (COOL).

Any student of compulsory schooling age will be able to enrol in a COOL - and that provider will determine whether students will need to physically attend for all or some of the school day.

Sounds cool? Not really. I try hard to be a glass-half-full sort of person, but I can see too many fishhooks in this proposal to be in any way confident that it should be rolled out in this fashion. 

Yes, I understand that there are some children for whom regular schooling really, really doesn't work. But we already have a range of alternatives in place for this cohort. Where is the evidence that going on-line is a better option? We also have the Correspondence School, Te Kura - surely we should be looking at how it operates in the digital space and enhance that if needed, before going full open slather?

The Minister is reported as saying 

This innovative way of delivering education offers a digital option to engage students, grow their digital fluency, and connect them even more to 21st century opportunities.

Yet digital options already exist in mainstream schooling & have been used very successfully to engage students, with notable successes - including for students at low-decile schools. So we should be encouraging & supporting teachers in all schools to investigate ways of doing these things, rather than setting up yet another layer of schooling - presumably also funded by the public purse - to 'fix' a perceived problem in an untried way. After all, a range of resources already exist - see here, here, & here, for example. 

There are other reasons for caution. COOLs sound a lot like MOOCs (Massive Open On-line Courses), which offer many good things to their potential users but which also have an impressively high drop-out rate - on average 80-90% of those beginning a course fail to finish it. And that figure includes data from very high-quality options, such as those available through Coursera. Student motivation probably plays a large role in this - it can be quite hard to maintain motivation when contact with tutors and classmates is solely digital. Before the Minister's proposal is implemented, we need to be very sure indeed that any providers are able to maintain student engagement & motivation to succeed.

There's certainly mixed evidence that digital learning, alone, can contribute to learner success. For instance, this study found that on-line learners - especially those where there was also an element of face-to-face contact - did tend to do better, but pointed out that 

conditions often included additional learning time and instructional elements not received by students in control conditions. This finding suggests that the positive effects associated with blended learning should not be attributed to the media per se.

Plus there's evidence that on-line learning suits abstract thinkers more than those who need and use concrete examples in their learning: 

Successful telecourse students also preferred to look for abstract concepts to help explain the concrete experiences associated with their learning. That is, they wanted to know 'why' certain things happened in conceptual or theoretical terms. This more abstract approach clearly favoured success ... [while] those who needed concrete experience and were not able to think abstractly were more high-risk in a telecourse.

There is also a significant social element to successful learning for most students. In fact, learning is about far more than acquiring factual information; there is a wide range of social attributes and what are commonly called 'soft skills' that students also need to gain. Indeed, in the tertiary sector the emphasis is more and more on institutions being able to demonstrate that they are producing work-ready graduates with a range of competencies and capabilities, including communication and teamwork skills, and other social skills that are difficult to come by in a digital context.

And finally, it's difficult these days for many families to cope financially unless both parents are employed. Which leads me to ask: if students are able to spend part or all of their day learning on line and at a distance from their education provider, just who is going to be supervising them?

I'm sorry, Minister, but we need - and our children and students deserve - to see the actual evidence that this proposal works before it's put into action.


There's a lovely, life-size bronze sculpture of a Powelliphanta land snail sitting on my china cabinet. I love it because a friend made it for us - and because snails in this genus are rather special, for they are all carnivorous.

Now, I 'knew' this fact, but I'd never actually seen one feeding. Snails being normally rather slow, sedate creatures, it was hard to imagine how they'd ever catch anything other than even slower prey. That was until I saw this video.

Every earthworm's nightmare!


The semester's begun, teaching has started, admin isn't letting up any time soon, & there are days when I feel like a zombie by home-time. So it seems entirely appropriate to revivify a post I wrote 3 years ago, on that very subject.

Honestly, sometimes I think the zombie apocalypse is already here. Certainly zombies seem to be flavour of the month (& whatever friends say, I still can't bring myself to watch Walking Dead). And I've written about them myself: well, the insect variety, anyway.

But our understanding of how parasites 'zombify' their hosts has been developing since well before the latest iteration of human zombies grabbed the popular imagination. I was reminded of this when I saw the video below (in all its over-the-top hyperbolic glory), for I was first introduced to the concept of zombie snails years & years ago by one of David Attenborough's TV programs. (According to my aging memory, it would have been an episode of Life on Earth.)


Tunicates are more commonly known as 'sea squirts' - little blobby marine creatures that squirt water when you touch them (hence the name). We don't hear about them often, except perhaps when they make the news for all the wrong reasons. But from an evolutionary perspective they are fascinating little creatures - and it's largely due to their larvae.

As an aside: why do we call them tunicates? Because the body of the adult organism is enclosed in an outer sheath, aka a tunic. The majority of tunicate species belong to a group known as ascidians, which as adults live in shallow waters, attached to rocks or maritime structures (including boats). The remainder are planktonic & found out in the open ocean.

The larvae of many ascidians are free-swimming and, because of their body form, are often described as 'tadpole larvae'. (Some of the non-ascidian tunicates have adults with the same morphology.) These little animals have a number of features (shared with creatures such as the cephalochordate formerly known as Amphioxus) that link them with the chordates: a hollow dorsal nerve cord, a post-anal tail, a pharynx with slits in it (which feeds into the gut), and a living cartilaginous rod known as the notochord, against which the animal's muscles work. (The larvae, and the adults of some non-ascidian tunicates, are basically little swimming filtration units.)

In fact, because of their rather simple structure, tunicates have long been viewed as representing the likely common ancestor of both vertebrates (a group that includes us) and the slightly-more-complex cephalochordates like Amphioxus. However, a newly-published & fascinating article by Linda Holland (2016) looks at

the highly derived body plans and life styles of the tunicate classes, their importance in the marine food web and their genomics [with an] emphasis ... on the impact of their especially rapid evolutionary rates on understanding how vertebrates evolved from their invertebrate ancestors.

It turns out that a genomic comparison, using nuclear genes from vertebrates, cephalochordates and tunicates, indicates that it's actually Amphioxus that sits at the base of this particular group. This in turn means that tunicates

have lost a lot of what the long extinct ancestral tunicate once possessed. 

This genomic work is fascinating on a number of levels. For example, the 'textbook wisdom' is that only bacteria (ie Prokaryotes) have their genome organised into operons, where a single mRNA transcript contains several genes. But it turns out that tunicates, which have a rather small genome,

[have] a high percentage of genes in operons

something that Holland states they share with roundworms (nematodes) and some flatworms, which apparently also have "reduced genomes". In tunicates, it seems that among the genes that have been lost are some of the 'Hox' genes - genes that control the development and patterning of body form. 

I learned heaps of new things from this paper: tunicates are able to regenerate most of their bodies, for example (makes sense, I guess, as the sessile adult sea squirt can't exactly avoid being snacked on by predators). Apparently this is achieved by pluripotent stem cells in the animals' blood, though how it's done is still something of a mystery. And I had no idea at all that the animal's 'tunic'

contains cellulose, synthesized by a cellulose synthase that was evidently acquired in an ancestral tunicate by horizontal gene transfer from a bacterium. 

An animal that produces cellulose! Nature never ceases to surprise :)

L.Z. Holland (2016) Tunicates. Current Biology 26(4): R146-R152. DOI: http://dx.doi.org/10.1016/j.cub.2015.12.024


This is an amended re-post of something I first wrote back in 2012.

We're in the lead-up to the start of the A semester & I've spent a lot of time lately advising students on their programs of study. (Consequently I'm a bit short of the time needed to give attention to serious posts on Serious Subjects.) One of the things we often talk about is which major(s) a student should study, where a 'major' is the subject that they will devote most time to over the second & third years of their degree.

This is an important decision for first-year students as it pretty much determines how they're going to spend much of their study time in the ensuing years, and so we take quite a bit of time to talk about the various options, and I often find myself asking 'where do you see yourself in 5 years' time?' It's serious stuff as you don't want to get it wrong, and sometimes I encounter someone who is just a bit confused by the various majors on offer & how they're structured - but happily I have yet to meet anyone with the views parodied by the good folks at xkcd :-) (Thanks to my friends at Number8Network for passing this on, and yes - someone has already had a go at singing it!)

