Friday 29 November 2013

The moral dilemma of the Christmas tree



A tree can be a moral dilemma and worthy of quite a bit of discussion. I am crowdsourcing thoughts.


For Christmas, should we:

a) Go down to the Christmas tree farm and pick a live, healthy tree which is happily sequestering CO2 in the sunshine, then ask a strapping young local lad gainfully employed at the farm to chainsaw the tree down, and then haul said tree carcass home for a mere five weeks' decoration, watching it drop needles on the carpet and decompose, only to finally park its skeleton in a corner of the garden for months as a reminder of our shameful act, or else squeeze it into the green waste bin for the good folks at the council tip to dispose of.

or

b) Get a plastic Christmas tree, made from either polyvinyl chloride (PVC) or polyethylene (PE). The former exudes surface and gaseous chlorine when heated, as in a closed room on a scorching hot summer's day, and the latter contains phthalates, chemicals which mimic the sex hormones testosterone and estrogen, and whose long-term exposure effects are unknown but suspected to be quite negative. No, I don't like the smell of plastic trees.

or

c) Get a potted conifer for about 4 times the price of a) or b), which would be knee-high for the first Christmas and (working on the assumption that it does actually survive) a respectable Christmas tree size for the next 3 to 5 years, after which it would require a permanent, towering spot in my otherwise-native-treed garden while we source a knee-high successor. I have seen the gardens of those who believe in live Christmas trees. They generally have a comically stepped row of old Christmas trees up behind the old shed. The people in the picture have clearly only been living there for 8 years. Another decade or two and they will have a windbreak.

or

d) Not get a Christmas tree. Bah. Humbug.


Your thoughts?

ETA
D has suggested a wire tree. I love it. Attractive, postmodern, phthalate-free and, broadly speaking, sustainable.



Tuesday 19 November 2013

Malleability, and Tibetan-Buddhist engineering


I wandered further in my reading and connected another fragment to this idea.


In my travels, I got to thinking about malleability. Much of the world which we perceive as fixed is actually highly malleable, in the right circumstances, with the right heavy machinery or scalpel and bone saw, and with a little bit of know-how. Unless you are in a profession which directly causes a particular kind of change, you are unlikely to notice it and you are likely to assume that whatever-it-is is fixed.

When I was a wet-behind-the-ears new graduate engineer, spending a lot of energy dealing with being one of about 3 women on a worksite of several hundred men, I had the good fortune to be supervised and mentored by a guy called Craig. He was just the right combination of expert on everything rail-related, patient and attentive teacher, and curious, friendly human being.
During the commissioning there was a track machine running; it looked a bit like this. It was huge and floodlit and it trundled back and forth on the track. Craig knew the driver. (Craig knew everybody.) We got to climb up into the cabin and have a ride.


The "business" end of the tamper machine is in the middle. It reminded me a little of the maw of the Alien Queen, with all those toothy bits and hanging cables.
[Image: BTM-73909 Saturn-05.jpg]
This is what the tamping machine does:
  • Two claws pick up the track like spaghetti strands.
  • The four-clawed maw then pokes coarse gravel (known as ballast) down between the sleepers, which are still attached to the rail; they've been hoisted up too.
  • By doing this the machine can raise the track by up to 30cm in a single pass. 
  • It moves 1.5m down the track and repeats the process. 
  • It can move the track left and right too, to straighten out kinks. 
  • It has laser levels and all kinds of straight line sensors to do this.
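For the fellow engineers, the numbers in those dot points imply a tidy little calculation. This is my own toy sketch, not the machine's actual control logic, and the names are mine; it just assumes each full pass over a section can use the whole 30 cm lift.

```python
import math

STEP_M = 1.5        # the machine moves 1.5 m down the track between tamping cycles
MAX_LIFT_M = 0.30   # it can raise the track by at most 30 cm in a single pass

def passes_needed(total_lift_m):
    # Each full pass over a section lifts it by at most MAX_LIFT_M,
    # so a big lift has to be built up over several passes.
    return math.ceil(total_lift_m / MAX_LIFT_M)

print(passes_needed(0.7))  # 3 passes to lift a sag by 70 cm
```

Which is presumably part of why the job is so repetitive: big corrections mean trundling over the same section again and again.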
There are three men in the control box, one driver, one tamper, one supervisor. I thought it would be an interesting job. Craig replied: "No, it sucks. It's bloody boring. And really noisy. These guys are good because they're all a bit obsessive. They just really like getting the track absolutely straight and level. You need a particular kind of mind to work the tamper."

It took a couple of hours for my perception of the railway to readjust. I mean, I did realise that we were constructing a new rail junction. I could cope with a new building project. But once it was built, it had not occurred to me that a railway line was still malleable. That you could get a machine and hoik the whole thing two feet to the left and up a bit in a couple of passes. Craig then told me that heat expansion is a real problem, and that to prepare for summer, the civil team stretches out the rail and then cuts out a couple of metres every km, welding the ends back together. At the start of winter, they weld the cut sections back in again. So if you're ever looking out a train window in summer and you see a couple of metres of rail sitting out trackside, it's not thrown away or forgotten, it will get welded back in later. These days, I think of a railway as malleable, flexible, stretchy string-lines rather than fixed immutable permanent infrastructure.

People with limited experience of earthworks or building projects vaguely assume that the terrain is fixed, and that building structures are solid and immovable. When they see large-scale earthworks or even some renovating acquaintance knocking out an internal wall, it jars for a while until they readjust to the new landscape, and then it's 'how it always was'. The driver of the giant excavator has a surveyor to tell him where and how to dig, and measurement equipment to tell him when to stop. With the right tools, any quantity of earth can be dug, it's only a matter of planning it, persisting, and having the right mind to prevent the task being boring.

A friend's father had a hip replacement last year. He described the procedure and it sounded really quite creepy and a bit extreme. Bone saws and a modified eggbeater for the socket, and he was aware of the procedure because he had it done under spinal anaesthetic and heavy sedation. When I see him these days, my brain forgets very quickly that he ever had the surgery, that the socket and ball are now ceramic and all set to give some archaeologist a hell of a shock in a thousand years' time. But for the orthopedic surgeon, a bone hip joint is not fixed, it is quite malleable. I looked up "hip replacement surgery" on YouTube (despite my rant on Expertosis) and, just like the tamping machine and the guy driving the big excavator, the surgeon has equipment which provides levels and reference points and straight lines to follow. I wondered if other medical specialists look sidelong at orthopedic surgeons and mutter "Bloody boring work. Twelve hips a day, it's like being a car mechanic. But clearly he's got the right mind for it."


So on to this matter of perception of malleability.

Kahneman (in Thinking Fast & Slow, still one of my favourite books) and Tversky identified the mental shortcut (heuristic) of "What You See Is All There Is". By default, our minds exclude anything outside the present, the recent past, and a future with very limited change from the present. This influences our decision making enormously - it's why it's so hard to stretch your paycheck to the end of the month, or buy only the groceries you need when you're shopping hungry. As to the world around us, individuals often have this silly idea of things being fixed, when there is so much more malleability in systems. And as a species, we are getting very good at coordinating to change just about anything, by developing specialists to deal with small areas of malleability and wrap their brains around them, developing tools, following straight lines and laser levels, and turning the utterly revolutionary mind-blowing power of massive change into a relatively boring, routine job.

Another established mental heuristic is the distinction between "being" and "doing". What parts of what you do are intrinsic to your being, and what are just transitory behaviours? In general, we assume a lot more is fixed, when in reality our choices are so context-driven, we are fabulously malleable, and in particular self-malleable just by changing the context.
I have spent most of the last year experimenting on this at my work. The prevailing negative language in the office, especially when managers talk about people, was the "being", fixed, intrinsic language. "He's not smart enough for that problem." ... "The project leadership team is crap." ... "That department is incapable of meeting a deadline." ... "Whoever wrote this document is an idiot." ... etc. There was a perception that the culture was fixed, and a very strong "What You See Is All There Is" habit of persisting with technical and cultural approaches that had never ever worked.

I had come across the idea in my reading (which I would reference here if I could remember where) that fixed-language descriptors inhibit an individual's capacity to change.
I figured my team would be happier if they were not crap/incapable/idiot etc. I had to find a way to frame it as malleable.
So a year ago, I set about shielding them from the worst of this language, and constantly trying to reframe the "being" into "doing" ("... yeah, we did do a bit of a rubbish job on that site, didn't we... well we need to work out what to do about it now."). I maneuvered into a spot where my job was to focus persistently and deliberately on what we do, rather than what we are. In due course, I 'became the expert' on project process and continuous improvement. (I'm a bit stuck with intrinsic language there!)
And the language was malleable. While we are still struggling with deadlines and rework from projects where the contract finished years before, the big projects from the last year have unfolded so smoothly our head of engineering cannot believe the change. We have gone from 50 commissioning logs being a good result, to 5 logs being a substandard result. We have had to rework all our bidding metrics because the labour costs have shrunk - and shifted to earlier design phases. Staff turnover has reduced, staff mood has improved enormously.
The change has been noticed around the company. Maybe I am enjoying being a little bit smug.

I found a book about the Tibetan Buddhist practice of Mahamudra, which requires rather a lot of reflection on the impermanence of your self and of the world, meditating on time and death:
"In Buddhist logic it is said that all concepts are based on exclusion. As soon as we affirm something by saying, 'It is this', we automatically exclude so many other possible identifications, or things that might have been. By imposing a conceptual limitation, we create or fabricate an idea..."
"Wisdom will only arise if we realise that the things we take to be real and substantial are not real and substantial at all. All of our negative views and habits come from failing to understand how things really are and concentrating instead on how they appear... This fiction gives rise to the belief in our psychophysical constituents as a 'self' and to the misapprehension of objects... as real and substantial... If we want to put an end to the dissatisfaction of samsara [the suffering of being alive and unenlightened], we have to put an end to our delusions..." 

I like the thought of doing Tibetan-Buddhist engineering; teaching others about the misapprehension of self, and releasing them from suffering by helping them embrace the impermanent and malleable nature of the project environment and the corporate culture. I don't think the negative-language managers or some of my more literal, technically focused colleagues would take this kind of talk particularly well. If I run the experiment, I will post on it.....


There might be another post brewing about malleability - it is a potentially powerful idea which I need to roll around a bit more in my head- but it is not tonight. I need to be kind to my precious human body, and take it off to sleep.

Tuesday 12 November 2013

Hair, and our War against it

I've been quite sick recently. I am convalescing. I have been noticing how important my hair is in my perception of how I am doing. On a good day, I brush my hair and plait it up and feel a little stronger. On a bad day, I can't control how it falls around my eyes, lustreless and sad. Maybe I noticed this because I have been reading Malcolm Gladwell's compilation of articles "What the Dog Saw", and there was a nice one called "True Colors: Hair Dye and the Hidden History of Postwar America". It's also about the revolution of marketing specifically to women. Apparently Clairol blondes are the girl-next-door home-grown apple-pie American teens & young women, and L'Oreal blondes are sassy, powerful, older women who choose the more expensive brand for themselves ("Because I'm Worth It").

I have never been a blonde. I don't think I could do it. I could not tolerate how blondes are portrayed and differentially treated. But I very much liked the underlying observation in his article.

A person's identity and social position - especially a woman's - is very caught-up in her hair: colour, length, style of cut or treatment, type of care; and the layers of meaning embedded therein. Blondes, for example, are well studied. Gladwell references a guy called Grant McCracken and his "Blondeness Periodic Table", which pegs six different images for bottle-blonde women:
The Bombshell, eg. Marilyn Monroe or Pamela Anderson
The Sunny Blonde, eg. Goldie Hawn or Cameron Diaz
The Brassy Blonde, eg. Sarah Michelle Gellar in Buffy the Vampire Slayer
The Society Blonde, eg. Paris Hilton or Keira Knightley
The Dangerous Blonde, eg. Sharon Stone or Meryl Streep
The Cool Blonde, eg. Patricia Arquette or Cate Blanchett

Caucasian brunettes and redheads also carry layers of meaning - and I am sure if I find Grant's book there will be some mention of them. But clearly we are more invisible than blondes. Brunettes are permitted - and perceived to carry - more authority and capability than blondes, and redheads are almost expected to have a temper.
In Asia, where hair is almost always black and straight to Caucasian eyes, there are actually clear distinctions between hair in terms of shade and lustre (blue-black, red-black, purple-black, green-black, silver-black, etc.), as emphasised in female Anime characters. Additionally, the choice of style is critical: sleek, straight, untied long hair is a youthful trait; a low-maintenance short cut is a sign of practicality and good sense.
I have insufficient information on African-American hairdos, so if anyone who knows would like to post a comment, please do!

I notice cross-cultural congruences, though. Around the world, obvious chemical treatment can be the sign of a tart, whether it is bleaching red streaks into a young HK-Chinese woman's otherwise long straight locks, or a frizzy boofy middle-aged perm. Around the world, haircuts mostly get shorter as women get older, as a nod to practicality and also acknowledging the gradual slide towards the androgyny of old age (old men and old women are hormonally and neurochemically extremely similar). A haircut can also signify a life change: let go of an ex, then cut your hair short or dye it a completely different colour. Across many disparate places and times, short hair on a woman has been a sign of mourning, or a new start.





But hair is not just head-hair (although this gentleman clearly likes his very much!). Hair is eyebrows, eyelashes, facial hair, noticeable body and pubic hair that men and women often try to remove, the downy invisible hairs all over our bodies that prickle in the cold or a scary movie. Human hair has evolved with our species over millions of years to be mostly-bare in some spots, and to grow unregulated on our heads, and to be a naturally self-regulating length in other areas.






And yet, in this late-20th and early-21st century, women are at war with their hair.
Again.
Some proportion of the population was at war with hair during most of the great empires of history. This page has a nifty summary (although I haven't cross-checked each one); the short version is, Society people of both genders from the Egyptian, Greek, Roman, Moghul, Manchu and Qing Chinese, Victorian British and modern American empires insisted on extensive or complete hair removal. Heads were shaved in order to wear wigs on them, or in order to be shiny and bald. Eyebrows, body hair, pubic hair, leg hair - these have been the Enemy for a long long time.


There were of course exceptions.
Roman Emperor Hadrian is famous for quite liking Greek statues with beards, and growing a beard himself, turning the established Roman fashions upside-down.
When the Qing dynasty issued an edict that men's heads be shaved but for a queue, rebellions were extensive and bloody. Truly a guerrilla war over hair. Also, Google indicates that Incan hair-removal was probably not a particularly important custom, which is handy, because that's how archaeologists have worked out the extent to which child sacrifices were drugged.







But body hair is useful. It reduces chafing when you work at repeated physical tasks (hoeing a field, running after a bison, etc.).
It absorbs sweat and stops it beading on the skin. This ad would not be able to create a problem if such women weren't so inexplicably keen on removing their pubic hair. 

Body hair keeps you warm - to a surprising extent.
It repels dirt. Eyebrows also direct rainwater, snow and sleet off the forehead to the sides, away from your eyes.
Beards and moustaches warm the face and inhaled air.
Body hair can be a good instant indicator of age - particularly the beard-fuzz of a boy at the end of adolescence growing into a proper man's beard, and older adults going grey.
Body hair can also indicate something about hormonal health: thyroid conditions and certain gynecological issues such as PCOS can cause masculine-pattern body hair growth; anorexia and other eating disorders can cause hormonal disruptions that trigger long fine downy hair all over the face and body.

So why do we go to war against hair? My theory connects two pieces of established thought.

1) Humans are still primates. We have evolved from a group of animals in which every single species has a complex social structure which is maintained through grooming, and every single one of us - whether rhesus monkey or Romanian orphan - has a psychological need for physical contact and touch. Without loving contact as children, our brains simply don't grow - the orbital frontal gyrus, prefrontal cortex, and the deep brain (amygdala, hippocampus, brain stem) are all compromised in size and function. Such individuals can't regulate their emotions, they can't interact 'normally' with others in their species, and they can't manage the tasks of finding a mate and parenting young. A grooming culture is actually a critical neurological prerequisite for reliable transfer of the genetic material of the individual to the next generation. When a primate colony's food supply is good, much of the spare time is spent grooming. So the condition of a troupe's hair may indicate to others the prosperity of that troupe.

2) Modern humans have lots of spare time (defined by waking time not directly occupied with the business of survival). An archaeologist friend of mine has a (not explicitly published) theory that spare capacity in a society is turned to "goofy stuff", ie the development of culture and custom. Goofy stuff can be construction (like the Easter Island statues or the Mayan plaster-coated temples which deforested their lands and caused micro-climate-change, contributing to the fall of their empire), the development and refinement of art and music, and these days Hollywood blockbusters and Pomeranian shows and kinder coffee mornings and writing blog posts are all clearly goofy things to put resources into.
Considering this in the context of the War on Hair, a large proportion of culture and custom is devoted to grooming and socialisation (the fashion industry, fitness industry, and aforementioned coffee morning, for example). But another proportion is devoted to establishing and maintaining social hierarchy in terms of acquiring possessions, and grooming those possessions as a simulacrum for self-grooming and allogrooming (grooming other people). This is what the consumer culture hinges on: using our "extra spare time", above and beyond our primate cousins' "grooming spare time", to shop for objects that make our house look better ('grooming' the house), or working overtime to save up for a renovation (more house 'grooming'), or cleaning and washing things that our primate cousins wouldn't bother with, like dishes and underpants. And this self-pride and house-pride is actually a social indicator of good mental health too: if you let your dishes stack up and don't wash your undies, you're one shopping-trolley away from being the batty old bag lady who talks to herself.


So don't go to war with your own hair. That's just goofy. It hurts, and hair has some very practical uses.
Don't go to war with the war on hair either. The war has been running for at least 4,000 years. And if you win a skirmish, and encourage a community to be comfortable with their natural body hair, another type of grooming will bubble up to fill the spare "grooming time". For example, the twirling of dreadlocks. (Hi, happy dancing lady with the hairy armpits and dreadlocks.)


I think we should all return to the original purpose of primate 'grooming time': human touch. Give your partner a massage. Rumble with your sons or go play your local type of football with your friends. Have a hot shower then get yourself a ludicrously soft towel. Race on the grass barefoot. Stand in a shopping mall next to a sign that says "Free Hugs". Use touch to look after your brain.

Sunday 3 November 2013

Expertise or Expertosis

Mistaking Expertosis for Expertise


Before the mid-20th century, in most places and times, experts were those very rare people to whom a particular body of information was available, and who had been taught how to sift through libraries and who to correspond with in order to have all the relevant stuff in their heads.
Much of the population was occupied and somewhat isolated in their everyday lives (yes, a gross generalisation, but roll with me please), and few felt entitled to call themselves expert on anything outside their direct experience (barrel making, farming, building things, bringing babies into the world alive, etc.).

From the 19th century in Europe (and in fits and starts in other places and times), vocational and lay experts were people who were hooked on a particular thing, spending all their time wondering about it, running tests, studying, trying something new and making observations. Metallurgy, chemistry, engineering and geology became fields of knowledge because of lay-expert experiments. For example, synthetic dye chemistry started with William Perkin faffing about with coal-tar waste products until he made mauve aniline dye, and a weatherman called Alfred Wegener, who had a couple of world maps, a pair of scissors and a wacky idea, came up with continental drift (the tectonic plates came later).

Universities have always (okay, since the 9th century) liked to style themselves as storage places for knowledge, and an important crucible for any aspiring (upper class) expert. They would hire those fabulous lay experts who had made a Significant Contribution. But for anyone else, to 'read' law or history or archaeology or natural sciences meant literally that - to spend a number of years on your backside in a library with books open in front of you. If you managed it, you were an expert - a Bachelor, a Master or a Doctor. And you got to wear a robe and lord it over people who weren't experts.


This century, the exclusivity of information-based expertise is rapidly unravelling. In ten minutes, a fast-reading and halfway competent 15-year-old can tell you most things your orthopedic surgeon would about hip replacements. And you can watch YouTube tutorials on how to do a hip replacement. In fact, well-informed and questioning patients are the bane of many doctors' working days! All those tricky questions - having to justify your professional position over and over again, client after client! Ideally this would motivate the diligent professional to bone up on the most recent research, and have good counter-arguments for the large amounts of swill available on the web.


Expertise now lies in your ability to evaluate and work with the data you can find. To have the background and analytical techniques to decide which information is flat wrong. To hold the scalpel, to advocate in a court, to drive an excavator on a steep slope, to design the election-winning advertising campaign. Also, to know where the data stops and where your own knowledge stops. To pick the outliers, the particular problems which can't be answered by WikiHow and a decent Youtube video. And to excel in the complex, ambiguous, grey areas.


Expertosis is the syndrome where you think you know lots but you don't. Just go listen to a student political group yakking about "They Should..." (publicly fund all undergrad places/close the student union/ban umbrellas/force everyone to study a second language - and that was all in 10 minutes!)
Teenagers are prone to expertosis. So are the middle class in their forties and fifties. Not that it's a new phenomenon - everyone's met an annoying uncle who tells you how it Ought to Be at the Christmas barbie, or an obnoxious teenager who says "It's all so clear. The answer is obvious. You're just idiots."

Their opinions must be right, because when they think them, they feel warm and fuzzy and right.
Everyone else just hasn't seen it yet.


But in essence, feeling right is only that. A feeling. Not actually correlated with whether or not you are right in any kind of physical or moral sense. In fact, the more right you feel, the less you may have evaluated the problem and the more likely it is that you could be wrong.

Cognitive ease is when the answer is obvious and comes to you effortlessly.
In Thinking Fast and Slow (which is still one of my favourite books) Kahneman summarises research on cognitive ease and cognitive strain - book extract in this link.
You feel cognitive ease if the thought is repeated, or you're in a good mood, or something has primed you for the idea, or if it's easy to take in (eg. small words, clear font, simple, clean, structured, apparently congruent).

So some examples:

Do you like apples?

YES I like apples.
Easy answer. Feels right. Apple. Crunch. Yum. Good for you. Like.

In the recent election campaign, the Liberal campaign was expertly crafted to bring cognitive ease.

Short words.
Four dotpoints.
Clear font. Big print.

Must be right.

Cognitive strain, on the other hand, feels uncomfortable. It happens with unfamiliar problems, happens more when you're in a bad mood (or even just frowning with a pencil in your mouth), or when the problem is not easy to take in (eg. long words, small print, poor structure).

It's the thing that is triggered when you're asked to do this sum in your head:

158 x 14 = ?

Did you even try?
Bet you didn't.
Bet your pupils dilated and you frowned and went "Oh that's haaaard" and gave up.
I gave up first time, and I'm an engineer.
Go get a bit of paper and give it another shot. See, I will too.

(158 * 10 = 1580) +
(158 * 4 = oh that's haaard)

(158 * 10 = 1580) +
(150 * 4 = 600) +
(8 * 4 = 32)

1580 + 600 + 32 = oh that's haaaard, no hang on a tick, I can do this, 2180 + 32 = 2210 + 2 = 2212 HOORAY! That feels good and right, too.
(Now I've got the length of cladding I need for the kids' cubby front wall. Thanks.)
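Incidentally, that break-it-into-easy-pieces trick is exactly how you'd write long multiplication in a few lines of Python (the function name is mine, just for illustration, and this version only splits a two-digit second factor):

```python
# Long multiplication the way you'd do it on paper: split one factor
# into tens and units, do the two easier multiplications, then add.
def multiply_in_your_head(a, b):
    tens, units = divmod(b, 10)          # 14 -> (1, 4)
    return a * tens * 10 + a * units     # 1580 + 632

print(multiply_in_your_head(158, 14))  # 2212
```

Same partial products, no frowning required.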

Another example, to give your poor sore brain a rest.
John Hewson was politically sunk when he gave this very famous and slightly funny interview. Poor guy. Bet he feels wistful when he gets a birthday cake.

Cognitive ease and cognitive strain, and expertise and expertosis

Here is another interesting fragment of research.

If a person is making a conclusion about something familiar, or has a habit of trusting their gut reactions, or likes to express opinions, cognitive ease brings a greater certainty. In Fast & Slow, religious evangelicals are cited as an example.

If a person is working with something unfamiliar, has a habit of trusting thoughtful responses or thinking carefully about their opinions, cognitive strain brings a greater certainty. Graduate students are cited as the counter-example.



So sitting on your backside surrounded by books in a university library for a number of years is actually a good way to train a person to trust the answer that comes after cognitive strain. They may not end up truly expert in the material, but they do end up at least trying to think harder about everything they come across.
In contrast, being able to google "hip replacement" and instantly knowing lots is a good way to train the 15-year-old to trust cognitive ease.

This is what the limitless availability of information may be doing:
  • Replacing many experts with expertoids.
  • Training us to trust cognitive ease, not cognitive strain, to bring the Right Answer; fostering a habit of seeking the congruent, familiar, easy answer.
  • Giving a voice to those who would not have had the confidence to say they know anything (eg. mad old uncle Tim) - on an equal footing with those who have trained themselves to strain.
  • Equipping us to handle the 90% of simple problems (build a retaining wall, write a novel) and then giving us the contact details and web reviews of a good orthopedic surgeon when we need it.
Four dotpoints, see. I can do it too.
Gotta work on those short words, though.

Halloween food





So I got a bit ... well... enthusiastic at Halloween this year.

Not posting pictures of the 12 kids under 6 years old who came to my house, in case one of them wants to be police commissioner in some future where Halloween is a moral outrage. Or photos of my giant sparkly fake eyelashes, which were fun! but if electronically public they might stymie my own ambitions to be a police commissioner in the future (or something). Or the bit where the lizard kid tried to wrap the skeleton kid in toilet paper to make him a mummy. That was fun. Although I should have supplied cheaper toilet paper. The supersoft paper just kept breaking.
There were monster drawing games, much screaming and running and hide and seek and BOO!, but all my other game plans dwindled when the little boys (and my daughter) started chasing the chooks around the block screaming, and the other little girls sat in their princess outfits and asked politely for some textas and started drawing butterflies and writing long words like "because".


I shall post pictures of my WATERMELON BRAIN
complete with hemispheres and the main lobes - OK, not enough sulci and the gyri are too fat, but it's not bad for 20 mins from an engineer, not a neurosurgeon.
I challenge any neurosurgeons reading this to make a better one. And post a comment.
Not my idea. I googled Halloween Food and found this. In the spectrum of brains posted in the comments, mine comes out pretty nice looking. 
 
Am I being too competitive? Is competitive brain carving in the Mummy Olympics?


Here is all the food.
Brain top right.
Dragon dropping cream puffs with chocolate, top left.
Banana chips, supposed to look like sliced vertebrae.
Eyeball eggs - which were very tasty, but nobody was game to eat them at first.
Ghost bananas - how cute!
Mandarin pumpkins.
Slices of apple cut into mouths with teeth made of almond flakes.
Fish fingers with almond flake fingernails.

And there were also baby frankfurts wrapped like tiny mummies, instructions here,
which disappeared between putting them out and getting my camera. (I think the lizard boy ate them.)
Although if you are making them, don't cut the puff pastry into strips. Cut it into one continuous spiral, wrap each frankfurt, making sure there's a spot for the eyes later, and break it off. Much less fiddly than all those little strips.


What fun.

I shall have to do it again next year.