31 December 2012

2012: waiting and DIY

The first word that came to my mind when thinking of my publishing this year was "waiting."

I didn't publish as much science this year as last. But that's as expected. 2011 may have been my annus mirabilis, with six papers. This year, just two: one journal article and one book chapter. I haven't even seen the book chapter myself in the flesh yet - but it's been shipped, so I'm willing to believe it exists. I'll have a post about the long genesis of the book chapter once I have the printed copy in my hands.

Almost every attempt to publish this year was an exercise in patience. The typesetting of the published manuscript happened at a crawl. Another manuscript that I mentioned in the same post is still on an editor's desk, where it seems to have been for almost a year and a half since I first submitted it. One person asked me why I haven't contacted the editor about it, and my rationale is that any time they spend reading an email from me is time they're not spending getting papers published. Another manuscript went through a much longer review process than I expected. I am hoping that those papers will see the light of day sometime this year.

And it wasn't just the manuscripts, but new projects, too. I had a couple of very promising little projects that just need a few last pieces of information before I can write them up and send them out the door. But these are also taking longer to complete than I was hoping.

But as I thought about it a bit more, I remembered that this was also a year where I dabbled with a couple of self-published experiments. I published a paper here on my blog – not a first, but still unusual enough that the story behind it was my most read post of the year. I also self-published my Presentation Tips ebook in a Kindle version, which has earned me a cool $13 in profit.

Getting a proper research article published through the usual routes feels like a greater accomplishment, just because you have had to go through more barriers. The ease of self-publishing stands in stark contrast, and makes that route look very appealing. It's only been a few months since those self-published projects came out, though. This year may help determine whether I actually made a ripple with those experiments.

Related posts

Good thing I'm not in a hurry
Big in Japan

28 December 2012


There are certain periods that, in retrospect, seem almost impossible in their unbridled creativity. I want to make the case that Stan Lee’s co-creation of the Marvel universe was one of the greatest outpourings of artistic creativity ever. Name me another writer in any medium in the last 50 years who has given us as many characters that are now etched into the public consciousness as Stan Lee.

You have Spider-Man, the Fantastic Four, the X-Men, and almost all of the characters making up the Avengers – Iron Man, the Hulk, Thor.

In comics, one of the things that defines a great comic is not just the hero, but the rogues’ gallery: the villains that the hero faces. Stan Lee not only helped create some of the best rogues’ galleries in all of comics, but he did it in an amazingly short period of time.

Look at the first year of The Amazing Spider-Man. In twelve issues, we get Doctor Octopus, the Lizard, the Vulture, Sandman, and Electro. Go just three more issues, and we get Mysterio, the Green Goblin, and Kraven the Hunter. Other heroes had to wait years, if not decades, to get a line-up of villains that good. Some famous superheroes still don’t have a rogues’ gallery that good. (Quick! Name all the Wonder Woman villains you can.)

In Fantastic Four, Stan created possibly the best villain in all of comics: Doctor Doom.

But I’m not singling out Stan Lee as the greatest writer in comics just because of the characters he created. At their best, Stan’s stories had an amazing economy. It’s no accident that retellings of classic Marvel stories often run many more issues than the original story did. Stan routinely packed more plot and character into a single issue than most comics writers today manage in three or four.

One of the best examples is Fantastic Four #25, the first major battle between the Thing and the Hulk.

This issue is near a high water mark for the superpowered battle. In twenty-odd pages, the fight takes so many twists and turns. The Thing gains the advantage, then loses it, then gains it back. Ben Grimm knows he can’t beat the Hulk in a straight-up contest of strength, but keeps himself in the fight through ingenuity and quick thinking.

It’s just amazing. Fights between superheroes and supervillains are so often used, so cliché, that it’s easy to forget how exciting a good one can be.

Part of the success was no doubt that Stan had great artistic partners. They didn’t call Jack Kirby (penciller of Fantastic Four, above) “The King” for nothing. Much has been written about just who contributed what (not to mention a bunch of court cases), and I don’t want to undervalue the contributions of anyone in creating those books. But those books wouldn’t be as good without Stan’s dialogue between characters, and, more importantly, the inner monologues.

Thought balloons are something of a lost part of comics. The few times I’ve talked to comic writers about them, they seem to think that they’re an unsophisticated storytelling device. I’ve always been intrigued by them, because comics is almost alone among visual media in having a way to show the difference between what a character says and what a character thinks.

Again, nobody did that better than Stan. He gave his characters inner lives that were typically in stark contrast to their current situation. In fights, characters would get surprised and figure out plans, all while retaining a cool exterior – the heroes snapping out the quips that Stan was famous for.

That’s my case. Lots of people have written comics. Many of them have written great comics. But in my mind, Stan Lee, working with the artists he did, is without peer.

Written on the occasion of Stan Lee’s 90th birthday.

Related posts

An appreciation

External links

Stan Lee turns 90
Stan Lee, Jack Kirby et al...The Birth Of The Marvel Universe

25 December 2012

Tuesday Crustie: Playtime

Enjoy your toys this Christmas!

Photo by puuikibeach on Flickr; used under a Creative Commons license.

24 December 2012

To Christmas, and beyond!

Even NASA decorates. Behind the Christmas tree is one of two training mock-ups of the next generation Orion capsules, intended to take up to four passengers at a time to space.

21 December 2012

Level 9

I spent yesterday at the Johnson Space Center in Houston, taking a behind-the-scenes tour they have tagged "Level 9."

So good.

Got to see the neutral buoyancy lab, where they train the astronauts for the International Space Station in water to simulate weightlessness.

Got to go on the floor of the Mission Control center for most of the Apollo missions, including Apollo 11.

Got to see the currently active Mission Control, and talk to one of the guys whose job is to drive the International Space Station.

And a big building where they have full size mock-ups of the ISS and most of the other equipment in use, or planned for use in the near future.

There was too much good stuff to try to type in with a tablet. Just great to see close up.

20 December 2012

Problem solving

For every complex problem there is an answer that is clear, simple, and wrong.

- H.L. Mencken

I've always liked this quote, but didn't know the original source. I usually use a paraphrase that I think I read in Foucault's Pendulum:

As the man said, for every complex problem there’s a simple solution, and it’s wrong.

― Umberto Eco

It has seemed particularly apt the last few days. There seem to be an unusually large number of people suggesting simple solutions to complex problems.

19 December 2012

That looks like it smells: a tale of sensory mash-up

We normally think that each of our senses is more or less distinct. Sure, there’s that condition called synesthesia, where people experience numbers with colours and that sort of thing, but that’s pretty rare, right?

Maybe not. A new paper suggests our different senses may be influencing each other more often than we think. The team looked at how smell, something we normally think of as one of our weaker, less important senses, holds sway over our vision, the sense that most people normally think of as our strongest, most important one.

Zhou and colleagues used a phenomenon called binocular rivalry to test this. Binocular rivalry is not when two binocular stores are competing on price.

Normally, the two halves of our brain get complementary information coming from each eye, which the brain stitches together into one almost seamless visual experience. Using a little bit of visual trickery, it’s possible to get all of the left brain fed one image, and all of the right brain fed a completely different, incompatible image. Faced with two competing sets of information, people see only one image at a time, alternating with the other every few seconds in an unpredictable way.

You can get a sense of it from this picture if you let your eyes cross so that the images are superimposed (a little like a 3-D stereogram).

In the overlapping image in the center, you will tend to see either green circles or red bars, not half-and-half images. The two images will alternate back and forth in an unpredictable way.

In their main experiment, Zhou and colleagues showed people rival pictures of a rose and a banana at the same time. While doing this, they gave their volunteers the smell of a rose, and people became more likely to see the image of the rose.

When they gave them the smell of the banana, they were more likely to see the banana.

They also got this effect with a mix of images and words. In a second experiment (which must have been less fun for the volunteers), the rival images were a male torso and a set of words. When presented with the smell of, um, body odor – good ol’ B.O. (eeewww) – the subjects were more likely to see the person instead of the words... but only if the smell of sweat was given in the right nostril.

Why does the nostril matter? Like the rest of our body, each nostril is wired to one half of the brain, so the input from each nostril has a different effect on one side of the brain than the other.

The “nostril” effect can be broken fairly easily, though. If you show a picture of a banana, with a rival image being the word “rose”, the scent of the rose still makes you more likely to see the word “rose,” but it no longer matters which nostril you smell the rose-like scent through.

The one thing I can’t quite understand is why this paper is in The Journal of Neuroscience. There is no neuroscience in this paper. No brightly lit brain blobs, no EEGs, no neurons, nothing. This is a straight sensory perception paper.


Zhou W, Zhang X, Chen J, Wang L, Chen D. 2012. Nostril-specific olfactory modulation of visual perception in binocular rivalry. Journal of Neuroscience 32(48): 17225-17229. DOI:

Binocular rivalry image from here. Nose by Caro's Lines on Flickr; rose and banana by cproppe on Flickr; both used under a Creative Commons license.

18 December 2012

Tuesday Crustie: Ornamental

A little green and a little red... it looks like something that could be on any Christmas tree!

According to the tag, this is Neocaridina heteropoda, red cherry shrimp.

Picture by PKMousie on Flickr; used under a Creative Commons license.

17 December 2012

Comments for first half of December, 2012

Prof-like substance has good advice for people seeking NSF funding. Even good advice bugs me sometimes, though. Comments there escalated to another post.

I make a cameo in Biochem Belle’s Ever On and On blog because of my dislike of LinkedIn.

Doc Free-Ride considers the role of interviews in hiring academics. Are they just adding noise?

Identifying a crayfish on the Life Traces of the Georgia Coast blog.

OdysseyBlog examines sending reviews of scientific articles directly to the author before the editor does.

United Academics fumbles basic neuroscience.

12 December 2012

When British and Canadians are angrier than Americans

This was the scene in England in 2010:

This was at the Science is Vital rally, organized to protest planned budget cuts in the U.K.

This was the scene in Ottawa, Canada in July of this year:

This was at the Death of Evidence rally, organized to protest several changes, including budget cuts, in Canada.

Today, faced with the looming threat of sequestration (the so called “fiscal cliff”), which could cut basic research budgets something like 8%, this is the scene in Washington:

I am surprised that American scientists do not seem to want to demonstrate publicly that they are worried about how much funding cuts could hurt them. Instead, I just see the same low level drone of emails in my inbox from scientific societies asking scientists to contact congress and support research funding. But I’ve seen those emails for years, and there doesn’t seem to be any greater urgency this time. Maybe the leadership of those societies is convinced that sequestration won’t happen, just like the capping of the debt ceiling didn’t happen a while back.

What would it take to get scientists waving placards in the Mall in Washington, DC? I’ve speculated this might just be due to geography (the UK is smaller), but there is a high enough concentration of science on the eastern seaboard to make a decent showing.

And I know it’s not because Americans are more restrained than the Canadians or the British.

Science is Vital picture from here, Death of Evidence picture from here.

11 December 2012

Tuesday Crustie: Peduncular

Nice image showing why goose barnacles are also sometimes called stalked barnacles.

Photo by flythebirdpath~}~}~} on Flickr; used under a Creative Commons license.

10 December 2012

Transitions to a new South Texas university

They just put up these new signs around my campus.

And they’re going to have to change them in a year and a half.

As I expected, the big announcement last week was for a medical school. What I didn’t completely expect was that my institution, The University of Texas-Pan American, is sort of going away.

There is going to be a new University of Texas institution in South Texas in August, 2014. It still has to be approved by the Texas legislature, but I can’t imagine it will not pass.

 I had suspicions that something was up when I saw news stories like this one:

Higher education institutions in the Rio Grande Valley could be reorganized as part of a proposal laying out plans for a comprehensive medical school in the region, state Sen. Juan “Chuy” Hinojosa said Tuesday.

When I saw that, I thought, “We’re getting Brownsville.” The University of Texas at Brownsville (UTB) was always a small campus, with only a couple of thousand students. It had, for a long time, been associated with a community college. They split in 2011, and from what I had heard, UTB had been suffering since then. They had no physical space of their own; that was all given to the college. And, as I noted, they were small. Bringing UTB into the fold made sense to me.

I was aware that the Regional Academic Health Centers (RAHC) had been struggling, too. Despite one branch being located on the UTPA campus (for example), it was controlled by the University of Texas Health Sciences Center at San Antonio. That distance caused problems. The RAHC faculty had expectations to teach, but there were effectively no teaching spaces and no students there.

That aspect of the proposal makes a lot of sense to me.

A lot of time on Thursday and Friday went into meetings surrounding this announcement. I went to a town hall meeting on Friday, and it was filled to capacity, and people were excited, though it was hard to tell about what, exactly.

The thing that surprised me was that I expected the first thing that would be talked about would be the medical school, given how long it had been talked about by so many people. But it wasn’t. The first thing that came out was PUF.

It is strange to hear all these administrators talking about this, because they don’t say, “pee you eff,” they say “puff.” One of them even joked, “I thought ‘PUF’ was a magic dragon.”

I don’t think I had ever heard about “puff” in over a decade at this university, but it’s an acronym for “Permanent University Fund.” This is a fund set up by the state that is worth around $11 billion. For various reasons, the universities in the region were never eligible to tap into this fund, but a new university would be.

This change to create a new university, with access to PUF, requires two-thirds approval from the Texas legislature. But – and this is a critical “but” – it doesn’t require the legislature spend a dime of money. Besides, announcements of this scale usually don’t go forward unless those involved are extremely confident of getting the votes.

And it was clear that back room deals are the secret to things like this. At the town hall meeting on Friday, it was very clear that this motion to make a new university with a medical school happened because a lot of the key players, like Chancellor Cigarroa, were from the Rio Grande Valley. Old boys’ network in action.

Fusing these institutions is going to be a pain. But my institution, UTPA, is probably going to suffer the least upheaval.

I was interviewed by university affairs about this, and they seemed to want to talk about the medical school. I understand why: there is a long standing need in an underserved community.

But the Rio Grande Valley is not about to become one big hospital.

The commitment to creating an emerging research university is, over the long haul, going to have a bigger impact on the area than just the medical school.

External links

Chronicle of Higher Education

Puff from here.

05 December 2012

Bold transformational synergies

Earlier today, my colleagues and I started getting carpet-bombed with emails about an announcement tomorrow.

A “bold new plan,” the little graphic says.

The emails we got included admin speak like this:

(W)e are finally at a transformative place...

Everyone, please align your chakras.

(T)he constellations are perfectly—and finally—aligned to bring significant resources to the Valley and create new and powerful synergies...

If this were NASA teasing a forthcoming press release, this is the point where we’d all be going, “OMG, aliens!”

...a plan that we believe will forever transform not only the Valley, but also the State.

Wait, are we going to secede from the union?

This is not mere hyperbole...

Whew. That’s good to know. I agree, it’s not mere hyperbole, it’s exceptional hyperbole.

I’m predicting that the translation of all that will be:

Medical school.

This has been batted about for a good long while now. Some news reports (which I didn’t look at until I read the above) bear this theory out.

Fun with program assessment

The state of Texas has asked for a new program assessments from all universities starting this year. In my university, the biology master’s program gets to be among the first.

My favourite parts? The many sections that ask us to provide information on things we were never told before that we should have been collecting.

Some bits are useful to compile, but it’s a huge amount of work. And I wager like most of these assessments, it will vanish into the void and we will never get any feedback about it.


04 December 2012

Tuesday Crustie: White with black stripes or black with white stripes?

These zebra crabs (Zebrida adamsii) like to live on urchins of various sorts. There are a mere three species in the genus.

Photo by PacificKlaus on Flickr; used under a Creative Commons license.

03 December 2012

Nominees for the Newton of neuroscience

Gary Marcus wrote in The New Yorker:

Neuroscience has yet to find its Newton, let alone its Einstein.

Really? Here’s a small selection of individuals who could be candidates for the gig of “the Newton of neuroscience.”

Luigi Galvani, who discovered that there is electricity in living organisms. To me, if you are to draw a single line between how the ancients understood brains and behaviour and how we understand brains and behaviour today, I would draw it at Galvani and bioelectricity. That changed everything.

Santiago Ramón y Cajal, who established that the nervous system is composed of individual cells.

Otto Loewi, who proved that neurons release chemical neurotransmitters after having the idea for the critical experiment in a dream – twice.

Alan Hodgkin and Andrew Huxley, who established how neurons send electrical signals along their length. (Surprisingly, this is the only picture I can find with the pair in the same frame, despite their historic association. Hodgkin is on the left.)

All of these are seminal, fundamental advances in our understanding of how nervous systems work. These are findings that apply almost universally across the animal kingdom, and even in cases where they are not true – nonspiking neurons, electrical synapses – we probably couldn’t understand those without the knowledge gained by these people. Like Newton’s findings, these are unifying principles of the field. And also like Newton’s work, they are not the final word.

I think one reason none of these individuals is recognized as a “Newton of neuroscience” is because their discoveries do not address the primary preoccupation of modern neuroscience.

The goal of neuroscience today is not to understand the nervous system; it is to understand the human mind.

You can see this in Marcus’s article. Marcus equates “neuroscience” to human brain imaging of cognitive function. The first example is speech perception. He describes how bright blobs of brain “became a fixture in media accounts of the human mind”. Every example is about human thinking.

When someone says, “Neuroscience has not found its Newton,” it may be because they are looking for someone who will start to explain human consciousness. And by that standard, no, we’re nowhere close to understanding that. But that’s like declaring, “Physics has not found its Pythagoras, let alone its Euclid,” because Newton didn’t develop a grand unified theory or a theory of everything.

Responding to Marcus, Scicurious thinks neuroscience will not have a Newton because brains are complicated.

Sci writes:

Physics, math, these are simple, elegant fields.

These may look simple and elegant to someone outside the field, but research is fractal: every level is messy and disorganized when viewed from inside the thick of it. Chemistry looks simple to people in biology. Biology looks simple to people in psychology. Individual psychology looks simple to people in social psychology. And so on.

Of the sort of scientists I nominate here, Sci writes:

But even these are not really neuroscience Newtons. They are not because as we gain more and more knowledge of the brain, we are able to see: there is no unifying theory of the brain.

Newton proposed no unifying theory of motion, either. He gave us three independent laws of motion and one law of gravity. And needless to say, physicists still have their hands full with new phenomena (dark matter, dark energy) that were not even predicted by their theories. And gravity still stubbornly refuses to be integrated with the other fundamental forces in the Standard Model.

Physics might be more like neuroscience than we’d like to admit.

Related posts

When is neuroscience not neuroscience? When it’s neurobiology

External links

Does neuroscience need a Newton? by Scicurious. It’s already clear she’s got a winner in that post.

Prime times for survival

How long can an insect live? Cicadas might be up near the top. Some cicadas are famous for remaining in the larval stage for thirteen or seventeen years. That makes them pretty long lived insects, even if they spend most of that time as larvae underground, out of sight.

A lot of cicadas are synced up in these thirteen and seventeen year cycles, so that in peak years, huge numbers of these insects emerge. Then they are everywhere, singing to attract mates so they can get the next brood of baby cicadas on their long road to maturity.

Now, these two times – thirteen and seventeen years – are notable because they are both prime numbers. As I understand it, the leading explanation is that lots of things in nature tend to cycle, but most of those cycles are fairly short. One possible advantage of cycling with a prime number is that it's unlikely that any other short cyclic event will consistently coincide with the emergence of the new adult cicadas.

Imagine cicadas emerged on a twelve year cycle. Any predator that was on a roughly two, three, four, or six year cycle could sync up with the food feast of cicada emergence – provided there was a little give in their cycles so they could line up in the first place. But that sort of synchronization between predators and prey is much harder to do with a prime number. Thus, cicadas never face large numbers of predators just waiting for them to come out from their long larval stage.
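You can check this arithmetic yourself: the number of years between coincidences of two cycles is just their least common multiple. Here is a quick back-of-the-envelope sketch (the specific predator cycle lengths are the hypothetical ones from the example above, not data from the paper):

```python
# Back-of-the-envelope check of the prime-cycle argument: how many years
# pass before a predator on a short cycle coincides with a cicada emergence?
from math import lcm  # Python 3.9+

def years_between_coincidences(cicada_cycle: int, predator_cycle: int) -> int:
    """Least common multiple = years between overlapping peak years."""
    return lcm(cicada_cycle, predator_cycle)

predator_cycles = (2, 3, 4, 6)

# Hypothetical 12-year cicada: every short predator cycle divides 12 evenly,
# so a synced-up predator can hit every single emergence.
print([years_between_coincidences(12, p) for p in predator_cycles])  # [12, 12, 12, 12]

# A 13-year (prime) cicada: coincidences are two to six times rarer.
print([years_between_coincidences(13, p) for p in predator_cycles])  # [26, 39, 52, 78]
```

With a prime cycle, no short predator cycle can line up with more than a fraction of emergences, which is the heart of the argument.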

A new paper suggests that the cicadas might even reap a bigger advantage than that.

Koenig and Liebhold did a new analysis estimating how many birds there are in years when cicadas emerge in large numbers, and how many in years when the cicadas don't. They have population estimates for fifteen predatory bird species over 45 years. Their data set is as old as I am.

Surprisingly, there are routinely fewer birds in the years when cicadas emerge. The authors propose that this indicates that the long cycle has somehow allowed the cicadas to emerge during years that are safer than usual.

The authors do briefly mention alternative hypotheses. Cicadas are famously loud insects. Maybe the cicadas are so abundant and noisy that they actually drive birds away from their normal habitats. The authors say this is unlikely, because the bird counts go down even in places where the cicadas are not calling.

Koenig and Liebhold suggest that it's more or less coincidence that the cicada broods last for a prime number of years. They suggest that the emergence of these huge numbers of insects has some sort of knock-on effects, such that when they occur, the bird populations are affected, and go through booms and busts of their own - and the birds' low point comes around again in about thirteen or seventeen years.

The details of how this might happen aren't clear.

I suppose that the good news about being a cicada researcher is that you have time to plan new studies. The bad news is that it probably doesn't take thirteen years to plan those projects... or seventeen years.


Koenig WD, Liebhold AM. 2012. Avian predation pressure as a potential driver of periodical cicada cycle length. The American Naturalist: in press. DOI:

Photo by fmerenda on Flickr; used under a Creative Commons license.

01 December 2012

Comments for second half of November 2012

Ed Yong has a story of a species that waited over twenty years on museum shelves before being described.

Eva Amsen writes about crowdfunding science. I deal with now familiar stock criticisms in the comments.

At Thoughomics, Lucas Brouwers examines the evolution of vision.

Genegeek ponders professional identity of a scientist.

I make a cameo at The OpenHelix Blog.

SV-POW! and Dynamic Ecology discuss the randomness of reviewer recommendations. Yesterday’s news, my friends.

The Cellular Scale discovers graphical abstracts.

30 November 2012

And now, unicorns

Apparently this is cryptozoology week. We started the week with the news about the claim of Sasquatch DNA. We end the week with the announcement of the discovery of a unicorn lair.

In North Korea.

How did the researchers find this? There was a sign:

Archaeologists of the History Institute of the DPRK Academy of Social Sciences have recently reconfirmed a lair of the unicorn rode by King Tongmyong, founder of the Koguryo Kingdom (B.C. 277-A.D. 668).

The lair is located 200 meters from the Yongmyong Temple in Moran Hill in Pyongyang City. A rectangular rock carved with words "Unicorn Lair" stands in front of the lair. The carved words are believed to date back to the period of Koryo Kingdom (918-1392).

That the sign is about a millennium younger than the purported event, the riding of the unicorn by King Tongmyong, is not discussed as problematic.

What is weirder to me is that they are putting out a press release for something that is being “reconfirmed.” I could totally get into that press release scheme. “Scientists reconfirmed today that water is still wet.”

And you know, since this means there were unicorns less than 2,000 years ago, that is probably young enough that you could get ancient DNA to sequence, if the preservation conditions were right.

Update, 5 December 2012: Maybe things aren't quite as surprising as they originally seemed. A grad student explains that the translation of the press release was bad. Incredibly, amazingly, bad. For one, the beast was better described as a kirin rather than a unicorn.

Hat tip to Bess Lovejoy.

Hat tip to College Guide blog.

You need an online presence, scientists

Two examples of people giving advice about why you need to get online.

Earlier this week, we hosted a presentation by Sheri Graner Ray, game designer. She is a South Texas native, who attended our university for a couple of years, and now lives in Austin.

Sheri’s advice on getting into the gaming industry? Get on Facebook. Clean it up to a point where you wouldn’t be embarrassed if a potential employer saw it. Then friend people in the business, and update your status once a day. Get on Twitter and tweet twice a day, and retweet and reply to people already in the business.

She asked students in the audience how many were on Facebook. A good chunk of the hands went up. She asked how many were on Twitter. Very few hands went up. Students, you are missing out.

Given her digital emphasis, I was a little surprised that she also stressed the importance of having a business card, calling it another “golden ticket” to networking and employment.

She said so far, four students took her advice. The number she was able to help land jobs in the gaming industry? All four.

Sheri has a series of “Networking 101” posts on her blog that, while geared towards the game industry, have a lot of good tips for students, too.

(The picture at right is my very primitive attempt to sketchnote her presentation. I was also playing with the newest update of Adobe Ideas on my iPad. The update included different pens, brushes, and a fill tool, all of which make it much more useful and fun than before.)

The Action Potential blog at Nature also talks about this. Much of the article is about how the editors are trying to be more inclusive, especially of women scientists. But there’s also this (my emphasis):

The greater challenges are deciding when to follow up on a suggestion to try out someone we don’t know, and identifying potential new reviewers on our own – particularly young scientists. The first stop in the process is usually a Google and/or Pubmed search to check infer expertise. Recommendations from trusted reviewers play a part. An informative lab or personal website helps. We also invest a fair amount of energy in “scouting” personally. We take note when we have had constructive and productive exchanges with authors. When we go to conferences, we make mental notes when presentations impress us, or we have interesting scientific conversations while in line for coffee. Our ears perk up when a veteran referee touts the critical faculties of a senior postdoc. The occasional find has even come through blogs – I have contacted people on the basis of the careful, rigorous thinking evident in their posts.

And here are the top two suggestions:

  1. Have some form of web presence. If someone can’t find you, they can’t follow up on you. This is less of an issue in the US, but I can’t tell you the number of times I’ve looked for someone in Asia and had trouble.
  2. Keep the information on your website current, or if your department is maintaining your web entry, make sure someone’s keeping on top of it. Ensure that your publications, key research interests, and technical expertise are easily accessible.

When called a jackass

When one person calls you a jackass, that person is obviously an ignorant fool.

When ten people call you a jackass, maybe you ought to think about getting fitted for a saddle.

—Source unknown

Photo by plone on Flickr; used under a Creative Commons license.

29 November 2012

A map of pain

Motor homunculus

There are maps of your body in your brain. Some maps represent the control over your muscles. Other maps show the input coming in from your senses. One of the best known sensory maps is the one for touch.

While we might think of everything we feel with our skin as one sense – touch – it is actually several separate senses. We feel pressure. We feel changes in temperature, and different neurons handle warmth and chills.

And we feel pain.

While Wilder Penfield published the famous maps of the somatosensory cortex over 60 years ago, it hasn’t been clear if the neurons that we use to pick up pain from tissue damage, nociceptors, make maps in the brain the way other senses do. There are fewer nociceptors in the skin than other sensory neurons.

A new paper by Mancini and colleagues set out to test this. They gave their volunteers either innocuous little puffs of air on their hands, or...

They shot their volunteers with frikkin’ laser beams.

This hurt. Not much, but enough to set off the nociceptors in the volunteers’ fingers. The authors describe it as a “pinprick.”

While they were doing this to the hands, Mancini and company were taking brain scans using functional magnetic resonance imaging (fMRI).

If you look at your hand, the middle finger is well, in the middle, flanked on either side by the ring and index fingers.

If there’s a map of nociceptors in the cortex, you should find that same order in the parts of the brain that respond to being shot with lasers. Using the colour scheme above, the blue should always be flanked by red on the one side and green on the other.

And that’s what you see. Check the area surrounded by the dotted white line in the picture:

The team also shows that the responses for the control puffs of air also map out in the same way.

Strictly speaking, the authors only show that there’s a map of the nociceptors of the fingers. To assert that this means there is a full body map of the sort that gets shown in textbooks is a bit like saying that because you have a decent map of the Mediterranean, you also have a decent map of Australia. It’s plausible, but they haven’t mapped the entire nociceptive globe, so to speak.

It’s a nice demonstration that these neurons follow some of the same patterns of organization as other sensory systems. Which does lead to a bigger question: why does the nervous system tend to make these maps instead of some other form of organization?


Mancini F, Haggard P, Iannetti GD, Longo MR, Sereno MI. 2012. Fine-grained nociceptive maps in primary somatosensory cortex. The Journal of Neuroscience 32(48): 17155-17162. DOI:

Related posts

Classic graphics #3: The somatosensory cortex

28 November 2012

Fallout from “GM food causes cancers in rats” paper

Remember, a few months back, when there was a paper claiming genetically modified food caused cancers in rats?

A whole bunch of letters to the editors, and a response from the authors, are available as pre-prints in Food and Chemical Toxicology.

Criticisms (mostly)


(We feel) compelled to point out weaknesses in the paper by Séralini et al. (2012), the number and importance of which make the study reported very difficult to interpret scientifically.


(R)eporting and analysis of the study as presented in Food and Chemical Toxicology are inadequate and that this contribution is of insufficient scientific quality to be relevant in the safety assessment process

Wagner and colleagues:

(T)his study does not provide sound evidence to support its claims. Indeed, the flaws in the study are so obvious that the paper should never have passed review

de Souza and Oda:

This paper has some relevant flaws from the experimental design, through the statistical analysis and the way the data is presented. In addition, it lacks of some crucial information for the proper understanding and full assessment of the work.


I am writing to ask that the paper by Séralini et al. be retracted(.)


I was much involved with the problems that followed a similar failure of scrutiny of a paper about the MMR vaccine. This led to real injury to many and emphasized that careful consideration of the validity of results that appear to be outliers, is vital.

Le Tien & Le Huy

(P)resented here below are the three points that may constitute a sufficient ground to request an editorial retraction of the paper.


Analysis of the data suggests that no statistically significant findings of GMM toxicity were presented in the first place.

Grunewald and Bury:

Because of these fundamental flaws, the conclusions of Séralini et al. are not substantiated in any way.


I think that the readers of this important journal should have these answers from the authors to better understand and evaluate the results obtained in this work.

Hammond et al.:

(T)he study cannot be used to support any conclusions regarding the safety of NK603 glyphosate tolerant maize and Roundup® herbicide.


(O)ne can see that no value reaches the threshold of 7.815 needed for declaring the differences among treatments statistically significant at the 5% level.
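
That 7.815 figure, for the record, is the standard 5% critical value for a chi-square test with three degrees of freedom (which would fit a comparison across four treatment groups; I’m assuming that’s the test the letter writers mean). A quick sketch in Python, using only the standard library and the closed-form chi-square tail for three degrees of freedom, shows where the number comes from:

```python
import math

def chi2_sf_3df(x):
    """Upper-tail probability (survival function) of a chi-square
    distribution with 3 degrees of freedom, in closed form."""
    return math.erfc(math.sqrt(x / 2)) + math.sqrt(2 * x / math.pi) * math.exp(-x / 2)

# The upper-tail probability at 7.815 is ~0.05: a test statistic must
# exceed 7.815 to be significant at the 5% level with 3 degrees of freedom.
p = chi2_sf_3df(7.815)
print(round(p, 3))  # → 0.05
```

Any treatment difference with a statistic below that threshold, as the letter writer says, doesn’t clear the conventional bar for significance.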

Trewavas (With shout-out to science blogger Emily Willingham and her re-plot of the data):

(T)his paper and this journal have dealt the value of evidence-based knowledge a serious blow and it can only be rectified if the paper is withdrawn by the authors with an apology for misleading the public and the scientific community alike


The problems lie at several levels and bring into serious question the quality and standard of the editorial processes in your journal.

Heinemann, noteworthy for its positive tone:

(I)t is my view that the recent study is a valuable contribution to the scientific literature, debate and process of evaluating technologies


Since I last wrote to you, the scope and seriousness of the international scientific criticisms of the Séralini (2012) paper appearing in your journal has made me realise that my comments about the paper do not adequately describe the serious failures that have occurred in the peer review process at FCT.


The widely publicised Séralini paper does not survive such scrutiny.

Editor’s response

Wallace Hayes

Peer review does not end with publication. In the event that an accepted manuscript is questioned by the scientific community on the basis that the authors acted unethically, plagiarized, or where there are queries relating to the data or interpretation of the data, the editors will contact the authors to investigate unethical/fraudulent/plagiarized works or the journal editor will invite or accept letters to the editors.

Authors’ response

Séralini and colleagues

This may explain why 75% of our first criticisms within a week, among publishing authors, come from plant biologists, some developing patents on GMOs, and from Monsanto Company owning these products. ...

We encourage others to replicate such chronic experiment, with more statistical power. Now, the burden of proof has to be obtained experimentally by studies independent from industry. This was recommended by regulatory agencies that have assessed our work in France, even if it their objective is more to regulate products than to review research. GM NK603 and R cannot be regarded as safe as long as their safety is not proven by further investigations.

Related posts

What did you think those film crews were doing in the lab?
Why not retract the rat cancer / GM corn paper?

Building memory by not building molecules

Honeybees are clever wee beasties. If you give a honeybee a scent, then give her food, she can quickly learn to extend her mouthparts when she smells the scent alone. And she can remember this for at least a whole 24 hour day. This is a classic learning test, made famous by Pavlov’s dogs. So honeybees are at least as smart as dogs – for this test, anyway.

What’s going on in that tiny little head as they learn that some arbitrary smell means food? Usually, neurons need to make new “stuff” to form a memory. Making proteins, for instance, is usually needed for long term memory, but not short term memory.

Actin is a protein that is best known as half of the machinery that powers muscles (myosin is the other), but actin is also a more general component of a cell’s skeleton. In rats and mice and other furry mammals, you need to make actin to get long-term potentiation (LTP), which is a strengthening of the connections between two neurons.

Ganeshina and colleagues injected honeybees with chemicals that blocked the making of actin. You would expect that this would mess up the poor little honeybee’s memory.

But expectations were dashed. These actin-inhibiting drugs made the honeybees remember better, not worse.

The authors aren’t sure what’s going on here, but they have a guess.

The parts of the honeybee’s nervous system that learn smells are called the mushroom bodies. These mushroom bodies grow a little as the honeybee gets older, adding in new connections between neurons all the time, regardless of whether the honeybee learns anything or not. These new connections, because they aren’t related to anything the bee learns, would mostly add noise to the neural pathway. And that could drown out some of the connections between neurons that are formed or strengthened as the honeybee learns.

The authors seem to think that knocking out the actin production prevents “random” new connections that would form just during normal aging. As a result, the honeybee gets more memory signal and less noise.

This is a story of diversity. This paper reminds us that even when animals can learn the same kinds of tasks, they may not be learning them in the same ways.


Ganeshina O, Erdmann J, Tiberi S, Vorobyev M, Menzel R. 2012. Depolymerization of actin facilitates memory formation in an insect. Biology Letters 8(6): 1023-1027. DOI:

Photo by BugMan50 on Flickr; used under a Creative Commons license.

26 November 2012

Sasquatch DNA?

On Sunday night, I spotted an article in Neil Gaiman’s Twitter feed: A lab claiming that it has sequenced DNA from Sasquatch.

Well. That would be interesting, if it were true.

It’s not just the subject matter of the press release that is strange, though. There’s the little fact that it’s for a paper that is in review, not one that has been published. Usually, papers in review don’t get press releases, because goodness knows Reviewer Number 2 has taken a lot of manuscripts out of contention and they never see the light of day.

In fact, I have to admit: I am so pulling for Reviewer Number 2 to take this manuscript down. Preferably with sniper-style precision and finality. As Adam Goldstein indicated on Twitter, this is something that most journal editors would not even send out for review.

A quick search on Google Scholar revealed one article on animal DNA co-authored by the researcher mentioned in the press release, Melba Ketchum: Recommendations on animal DNA forensic and identity testing. This morning, I’ve found another: A low-cost, high-throughput, automated single nucleotide polymorphism assay for forensic human DNA applications.

That Ketchum is a published author on DNA techniques makes me think this is not a hoax. And I’ve smelled sasquatch hoaxes before (see related posts at bottom). This feels much more like... overly enthusiastic interpretation, if I’m being charitable about it.

More details emerged this morning courtesy of @mem_somerville.

The source of the DNA appears to have been from a woman in Michigan who claims to feed blueberry muffins and bagels to Sasquatches on her property. The researcher, Melba Ketchum, also appears to claim to have DNA from angels. This longer article has more details.

I would love some other science blogger to do a post on, “If this were true, this is what the DNA would be like, and these are the reasons someone could get misled.” On the latter, I can say: I lived through the rush to find dinosaur-era DNA back in the 1990s. There were a lot of papers published in Glamour Mags claiming to have DNA tens of millions of years old. It didn’t replicate. Lots of cases of contamination. This taught me that DNA is much trickier to work with than you might think.

I think Neil himself has a good summary of this story so far:

I do not care if this is true or not. It makes the world a cooler place & it delights me(.)

While I am extremely skeptical of the results scientifically, this is shaping up to be one fascinating glimpse into fringe science.

Update: Apparently, this story has been bubbling in the sasquatch community for some time now. This post is interesting in that it looks at the business that Melba Ketchum is in. The Better Business Bureau has several complaints lodged against her business for failing to deliver results.

More updates: Back in January, Melba Ketchum applied for copyright for media around “The Sasquatch Project.” (Hat tip to The OpenHelix Blog.) This is not surprising, as we have often seen people with sexy scientific projects try to make money with documentaries (e.g., the documentary on Darwinius).

Another report from back in January notes that Ketchum says she has seen Sasquatch personally.

Update, 27 November 2012: I don’t like either of these two news stories. This one is too credulous (“actually proves the existence of Sasquatch”). This one is too mocking (“Like OMG!”). Hat tip to Leonid Kruglyak for spotting both.

Update, 28 November:  Corrections and additional information from Robert Lindsay in the comments.

An attention-grabbing headline:

Boffin claims Bigfoot DNA reveals BESTIAL BONKING

...at odds with a nuanced final paragraph:

El Reg awaits it with interest. While it's easy to chortle at such stories, the scientific method demands that disbelief be suspended until peers have reviews and retested. Maybe it is possible that someone had the one-night stand from hell and we ended up with a near relative – but great claims demand great evidence.

Update, 29 November: Here’s an article from someone else who was responsible for testing sasquatch DNA back in 2005, which I blogged about at the time.

(W)hat exactly might Ketchum have sequenced? Coltman doesn’t know for sure, but he said that it’s easy to pick up human mitochondrial DNA because of contamination, and that the nuclear DNA could represent environmental noise, more contamination of yeast, fungi, or other microorganisms–very common occurrences with any forensic sample. “There are big piles of DNA sequence that come out of any environmental sample that don’t line up to anything,” he said.

Update, 1 December: Here’s an interview with Dr. Ketchum on Houston television.

Related posts

Another cryptozoology disappointment
Smell the popcorn, carny’s coming to town
Hype, hoax, or hope?
More sasquatch honesty than expected
About where I expected we’d end up with sasquatch

Temporal harassment

A few other people were writing about career and training practices at the end of last week.

First, I want to shine a light on something that Dr. 24 Hours said on Twitter, in conversation with Dave Bridges. Dr. 24h was commenting on the expectation of many senior academics that their trainees put in unreasonably long work hours, and offered this solution:

Harrassment to keep longer hours. Policed the same as other sorts of harrassment.

It’s one of those things that is so obvious in retrospect, I can’t believe I didn’t frame it that way before. A senior scientist demanding that a grad student or post-doc work 80 hours a week, for years, is creating a hostile work environment, like telling racist jokes, or a man pinching a woman’s bum, or making innuendo-laden comments.

Realizing that it is harassment gives me hope, for two reasons. First, we have made progress on getting rid of other kinds of harassment. Second, institutions like universities usually have people and procedures to deal with harassment.

Second, I recommend PalMD’s post, “Call me a commie, I dare you.”

The system itself devalues labor, and thereby the people who perform the labor. We perpetuate the idea that medical schools and grad schools must be cut-throat-competitive. This may or may not be true, but this creates a system where laborers (medical and science trainees) are told they are “lucky to be here”, that “there’s a dozen others ready to take your place should you fall.”

Once again, this may or may not be true, but it helps perpetuate a feeling among laborers that their position is always at risk, that they should be thankful for their abusively long hours and any other mistreatment they receive. And they should thank the boss that they get paid anything at all.

Third, in the “Pay us” department, is a New York Times article, “Skills don’t pay the bills,” that shows manufacturers who claim they can’t hire skilled workers... are offering crappy wages.

At GenMet, the starting pay is $10 an hour. Those with an associate degree can make $15, which can rise to $18 an hour after several years of good performance. From what I understand, a new shift manager at a nearby McDonald’s can earn around $14 an hour.

25 November 2012

One meth lab ready to go

Congratulations to Ethan Perlstein. Last night, he accomplished something that I didn’t think would be possible:

Over twenty-five thousand dollars for science!

Ethan ran one of the most ambitious crowdfunding campaigns for a single scientific research project to date. Technically, he was working on the neuroscience of drug addiction, but non-technically, he was asking for help to run a meth lab.

If you’d asked me at the start about how much money I thought Ethan would raise, I probably would have guessed about $10,000.

I’ve gone through two rounds of #SciFund myself, and I’d worked very hard to hit my targets ($1,750 combined). With #SciFund, the sweet spot for success seemed to be about $1,000. Bigger projects with higher targets were much less likely to succeed. No single project in #SciFund had ever raised more than about $10,000.

But last night, when the last minute donations started coming in for Ethan’s project? Exciting. When I went to bed, Ethan had less than $1,500 to raise. To be completely honest, I had dreams about checking Ethan’s project on RocketHub. In some dreams he made it, in some, he didn’t.

I was surprisingly anxious to check the tally when I got up this morning.

How did he do it? Ethan hit the bricks, and did a superb job of promoting this project. He got himself in Nature, Scientific American, Talking Points Memo, and others.

Ethan also had the advantage of a great hook. An academic running a meth lab? Shades of Breaking Bad.

This is another important moment for science crowdfunding. One of the most common arguments against science crowdfunding is that crowdfunding will never be able to raise enough money to do cutting edge science. Ethan’s project exemplifies my first* response to that:

No, crowdfunding can’t raise enough money. Yet.

When you look at crowdfunding for the arts and games, it started slowly. It took a few years to grow before there were the big break-out successes.

Science crowdfunding has not had its breakout moment yet. We still don’t have a Pebble or Double Fine. But Ethan’s project has moved the dial on what is possible in crowdfunding science. Again. That’s important. Every time, we change the conversation about what is possible. The success of Ethan’s campaign helps us take another step forward in convincing skeptics that crowdfunding is part of the future of science.

The only problem is that now Ethan is going to have to repaint his door:

Post script

I would be remiss if I didn’t point out that #SciFund round 3 is running right now. There are 35 projects. Three have made their targets, and seven are more than 50% funded. (My favourite? A crustacean biologist who needs the money to help him travel to a field site in Florida. Sound familiar? I sympathize!)

You should go to RocketHub and support science!

If you can’t contribute money, you can still help by spreading the word about a favourite project! Like it on Facebook, tweet it, tell your friends and family about it!

* My second response to that argument is lots of perfectly respectable science is cheap. Not zero dollars, but much less than typical grant proposals.

23 November 2012

Kill the “scientist as monk” meme

Is The Guardian trolling us?

Yesterday, it was a piece that tried to justify the existence of for-profit scientific publishers, and flailed like a beached fish. Today, it’s an article about scientific careers that asks researchers to just accept that society will treat them like crap.

Steve Caplan says that academic research looks like a Ponzi scheme. This charge is so well known that Ph.D. Comics has parodied it:

In the next paragraph, Caplan anticipates my reply: entry level positions always outnumber the managerial positions at the top. Nobody calls this a Ponzi scheme when it occurs outside academia. Perhaps the problem is that outside academia, you can make a comfortable career in middle management. In academia, there are fewer opportunities to have a long term career in the middle of the pyramid.

Yet despite saying academic research looks too much like a criminal scam, Caplan won’t bite the bullet and say that we are producing too many graduate students. According to Caplan, the problem is that we aren’t doing a good enough job at getting people to leave academia.

The problem... is... general failure to inform students (as well as post-doctoral fellows) of their career options and train them for a wide variety of scientific careers, including the many opportunities that exist outside academia.

The reason that people typically embark on doctorates, though, is to become professors. To join academia. To be a working scientist. It’s no surprise that they don’t want to leave because that’s what they set out to do. It seems pointless and a little cruel to get people into programs, then spend a lot of time telling them they will probably have to take on jobs that they didn’t sign up for.

Are other professional programs worrying about this? Are medical schools running workshops on what career options their med students have for when they fail to become physicians? Do law programs train their students for the many career opportunities outside of the legal system?

My first idea... is to provide far better training for students. Many universities are already employing career development plans to help their graduates prepare for a wide range of science-related jobs.

My question is whether that “wide variety of scientific careers” are careers that need a Ph.D. to do. I suspect not. Instead of putting people through an academic wringer that was designed to create professors, let’s create new programs and training that are not doctoral programs. Let’s get those people out in those science-related careers faster and more efficiently.

It’s Caplan’s second recommendation that makes me mad, though.

I am of the opinion that despite dwindling academic job prospects, this country and the world needs more scientists with PhD degrees, not fewer. Although for the most part careers in science are unlikely to lead to high-paying salaries, society benefits greatly from churning out more scientists with advanced degrees. Critical thinkers who have a working knowledge of the intricacies of scientific research can be the very best ambassadors for science. Whether they become politicians, businesspersons or leaders in any other occupation, their support for science could be the key to the future of science. So in some respects, I almost view a graduate degree in science as a form of national (or international) service – poor pay, but something to be proud of and with great benefits for society as a whole.

Screw. That. Noise. I am so sick of the “Joining the monastery of science” and “Science is a calling, you shouldn’t do it for the money” memes. This “scientist as monk” meme is hurtful and deserves to die a flaming death.

As I’ve mentored students, and watched them consider graduate school and scientific careers, it’s become clear to me that a major reason that they don’t want to go into graduate school is because they want to live their lives. They see continuing in university as something that will interfere with them from meeting people (including potential partners), travelling, having families, and enjoying themselves on their own terms.

In other words, I’ve seen that many bright, hard-working students who could get doctoral degrees do not want to be monks and nuns for science. I don’t want to be a monk for science. I set out on this path because I thought it was a career that could offer me some long-term stability and a way to keep a roof over my head and food on my plate.

We cannot simultaneously:

  1. Call for more people trained in science, and;
  2. Say people trained in science should be willing to leave the profession they want to join, and accept a crummy standard of living regardless of whether they join the profession. 

These two things are not compatible. You want more scientists? Then FUCKING PAY US. Other professions are not stupid enough to fall for self-immolation, and scientists shouldn’t be, either.

Photograph of Charles Ponzi, originator of Ponzi scheme.

All I needed to know about the Universe, I learned from Doctor Who

What’s the point in being grown-up if you can’t be childish sometimes?!

You can’t change history – not one line. (The First Law of Time.)

Aim for the eyestalk.

Military intelligence is a contradiction in terms.

Nothing is rubbish if you’ve got an inquiring mind.

Have a jelly baby.

A straight line may be the shortest distance between two points… but it is by no means the most interesting.

Logic merely allows one to be wrong with authority.

Of course we should interfere; always do what you’re best at.

You can always judge a man by the quality of his enemies.

When you make a plan, plan in depth.

Computers are really very sophisticated idiots.

Scientists earn their right to experiment at the cost of total responsibility.

There are no other colours without the blues.

Reverse the polarity of the neutron flow.

Somewhere there’s danger, somewhere there’s injustice, somewhere else the tea’s getting cold.

Happy 49th anniversary, Doctor Who!

By me, apparently around 30 January 1994 (that’s the date on the computer file, anyway).

22 November 2012

Academic publishers need better defenders

A new article by Alexander Brown in The Guardian tries to argue that scientific publishers do add value to research manuscripts. But Brown does not help the publishers’ cause.

Let’s see what he lists as services that scientific publishers provide to authors.

Editors help “ensure that research can be universally understood.” By that criterion, editors are failing miserably. I’m a working scientist, and I have problems reading many journal articles in my own field. I have never had a journal editor who has recommended, or made, substantial changes to the text of one of my articles for readability, and particularly not to the point where it could be understood by someone who was not a professional scientist. Any suggestions for improving my manuscripts have come from reviewers, not editors.

Editors help “to recognise emerging fields.” Researchers can do that themselves.

Publishers “create new journals.” That is valuable to publishers, not to authors. There is no shortage of venues to publish in.

Publishers “build and maintain the brands and reputations of journals.” The “brand” of a journal is more valuable to a publisher than an author.

“Developing systems and platforms” to “get the right research into the hands of those who need it most.” And the platforms that I hear most researchers in biology use to find their research are Google Scholar and PubMed, neither of which was created by publishers. arXiv wasn’t created by publishers, either.

Adding metadata, XML generation, and tagging. I’ll spot Brown that one. I love DOI numbers, for instance. But he may be overstating the value. My impression is that if you have machine readable text, just the number of times key words are used in the text will accomplish much of what tags are supposed to accomplish.
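
To illustrate what I mean, a crude term-frequency count over machine-readable text already surfaces plausible topic tags. Here’s a toy sketch; the sample abstract and the tiny stopword list are made up for illustration:

```python
import re
from collections import Counter

STOPWORDS = frozenset({"the", "of", "a", "in", "is", "and", "to"})

def top_terms(text, n=5):
    """Crude stand-in for manual tagging: the most frequent content
    words in a text often approximate its topic tags."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [word for word, _ in counts.most_common(n)]

abstract = ("Nociceptive maps in somatosensory cortex: nociceptive "
            "stimulation of the fingers reveals fine-grained maps "
            "in primary somatosensory cortex.")
print(top_terms(abstract, 3))
```

For this sample, the top terms are words like “nociceptive,” “maps,” and “somatosensory” – not far off what a human tagger would pick.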

Bringing old print archives online. Yes, I’m glad publishers have made their “back catalogue” available. But that is mainly a benefit to scientific readers, not current and future authors.

Depositing works into institutional archives. No publisher has ever even offered to do that for me. I don’t doubt that it happens, but how much does that matter for how many authors?

It’s kind of astonishing that Brown’s listing of ways publishers add value misses almost every major thing that I, as an author, value.

Organizing peer review and fact checking. But there is so little difference in how journals do this that I think no journal can brag about how much better its reviewing process is. Many entries in Retraction Watch show that journal reviews are often not very thorough. I would love it if there were journals that boasted of having a dedicated fact-checking staff, or advertised that they checked every manuscript for plagiarism, or that routinely sent papers to five reviewers instead of two, or that guaranteed a 48 hour review turnaround.

Professional typesetting. Journals do make things prettier than I can do on my own.

Promotion. I have never had a paper that a journal decided was sexy enough to promote. But I see what happens when a Glamour Mag gets behind promoting an article and pushing it to the press. It’s like watching a lion take down a zebra: a display of unfettered power. Seeing a hot article appear again and again over the course of a few weeks shows that this is something that publishers are supremely good at.

Archiving. Institutions do a better job of this than individuals, and publishers have a decent record of this (see “Bringing old print archives online” above). But the fact remains that for profit publishers are not guaranteed to be around forever. Many publishers have been bought up by other companies. Publishers could go bankrupt. Publishers are certainly not the only ones interested in, or charged with, archiving. Google Scholar, PubMed, and university libraries all do this. I am not sure publishers are doing a better job than those entities.

Publishers, if Brown’s giving the best arguments in your favour... you’re in worse trouble than you think.

Lion picture from here.