Tuesday, 12 November 2013

Hair, and our War against it

I've been quite sick recently. I am convalescing. I have been noticing how important my hair is in my perception of how I am doing. On a good day, I brush my hair and plait it up and feel a little stronger. On a bad day, I can't control how it falls around my eyes, lustreless and sad. Maybe I noticed this because I have been reading Malcolm Gladwell's compilation of articles "What the Dog Saw", and there was a nice one called "True Colors: Hair Dye and the Hidden History of Postwar America". It's also about the revolution of marketing specifically to women. Apparently Clairol blondes are the girl-next-door home-grown apple-pie American teens & young women, and L'Oreal blondes are sassy, powerful, older women who choose the more expensive brand for themselves ("Because I'm Worth It").

I have never been a blonde. I don't think I could do it. I could not tolerate how blondes are portrayed and differentially treated. But I very much liked the underlying observation in his article.

A person's identity and social position - especially a woman's - is very caught-up in her hair: colour, length, style of cut or treatment, type of care; and the layers of meaning embedded therein. Blondes, for example, are well studied. Gladwell references a guy called Grant McCracken and his "Blondeness Periodic Table", which pegs six different images for bottle-blonde women:
The Bombshell, eg. Marilyn Monroe or Pamela Anderson
The Sunny Blonde, eg. Goldie Hawn or Cameron Diaz
The Brassy Blonde, eg. Sarah Michelle Gellar in Buffy the Vampire Slayer
The Society Blonde, eg. Paris Hilton or Keira Knightley
The Dangerous Blonde, eg. Sharon Stone or Meryl Streep
The Cool Blonde, eg. Patricia Arquette or Cate Blanchett

Caucasian brunettes and redheads also carry layers of meaning - and I am sure if I find Grant's book there will be some mention of them. But clearly we are more invisible than blondes. Brunettes are permitted - and perceived to carry - more authority and capability than blondes, and redheads are almost expected to have a temper.
In Asia, where hair is almost always black and straight to Caucasian eyes, there are actually clear distinctions between hair in terms of shade and lustre (blue-black, red-black, purple-black, green-black, silver-black etc.), as emphasised in female anime characters. Additionally, the choice of style is critical: sleek, straight, untied long hair is a youthful trait, while a low-maintenance short cut is a sign of practicality and good sense.
I have insufficient information on African-American hairdos, so if anyone who knows would like to post a comment, please do!

I notice cross-cultural congruences, though. Around the world, obvious chemical treatment can be the sign of a tart, whether it is bleaching red streaks into a young HK-Chinese woman's otherwise long straight locks, or a frizzy boofy middle-aged perm. Around the world, haircuts mostly get shorter as women get older, as a nod to practicality and also acknowledging the gradual slide towards the androgyny of old age (old men and old women are hormonally and neurochemically extremely similar). A haircut can also signify a life change: let go of an ex, then cut your hair short or dye it a completely different colour. Across many disparate places and times, short hair on a woman has been a sign of mourning, or a new start.





But hair is not just head-hair (although this gentleman clearly likes his very much!). Hair is eyebrows, eyelashes, facial hair, noticeable body and pubic hair that men and women often try to remove, the downy invisible hairs all over our bodies that prickle in the cold or a scary movie. Human hair has evolved with our species over millions of years to be mostly-bare in some spots, and to grow unregulated on our heads, and to be a naturally self-regulating length in other areas.






And yet, in this late-20th and early-21st century, women are at war with their hair.
Again.
Some proportion of the population was at war with hair during most of the great empires of history. This page has a nifty summary - although I haven't cross-checked each one - but the short version is, Society people of both genders from the Egyptian, Greek, Roman, Moghul, Manchu and Qing Chinese, Victorian British and modern American empires insisted on extensive or complete hair removal. Heads were shaved in order to wear wigs on them, or in order to be shiny and bald. Eyebrows, body hair, pubic hair, leg hair - these have been the Enemy for a long long time.


There were of course exceptions.
Roman Emperor Hadrian is famous for quite liking Greek statues with beards, and growing a beard himself, turning the established Roman fashions upside-down.
Where the Qing dynasty had an edict of shaving men's heads but leaving a queue, rebellions were extensive and bloody. Truly a guerrilla war over hair. Also, Google indicates that Incan hair-removal was probably not a particularly important custom, which is handy because that's how archaeologists have worked out the extent to which child sacrifices were drugged.







But body hair is useful. It reduces chafing when you work at repeated physical tasks (hoeing a field, running after a bison, etc.).
It absorbs sweat and stops it beading on the skin. This ad would not be able to create a problem if such women weren't so inexplicably keen on removing their pubic hair. 

Body hair keeps you warm - to a surprising extent.
It repels dirt. Eyebrows also direct rainwater, snow and sleet off the forehead to the sides, away from your eyes.
Beards and moustaches warm the face and inhaled air.
Body hair can be a good instant indicator of age - particularly the beard-fuzz of a boy at the end of adolescence growing into a proper man's beard, and older adults going grey.
Body hair can also indicate something about hormonal health: thyroid conditions and certain gynaecological issues such as PCOS can cause masculine-pattern body hair growth; anorexia and other eating disorders can cause hormonal disruptions that trigger long fine downy hair all over the face and body.

So why do we go to war against hair? My theory connects two pieces of established thought.

1) Humans are still primates. We have evolved from a group of animals where every single species has a complex social structure which is maintained through grooming, and every single one of us - whether rhesus monkey or Romanian orphan - has a psychological need for physical contact and touch. Without loving contact as children, our brains simply don't grow - the orbital frontal gyrus, prefrontal cortex, and the deep brain (amygdala, hippocampus, brain stem) are all compromised in size and function. Such individuals can't regulate their emotions, they can't interact 'normally' with others in their species, and they can't manage the tasks of finding a mate and parenting young. A grooming culture is actually a critical neurological prerequisite for reliable transfer of the genetic material of the individual to the next generation. When a primate colony's food supply is good, much of the spare time is spent grooming. So the condition of a troop's hair may indicate to others the prosperity of that troop.

2) Modern humans have lots of spare time (defined by waking time not directly occupied with the business of survival). An archaeologist friend of mine has a (not explicitly published) theory that spare capacity in a society is turned to "goofy stuff", ie the development of culture and custom. Goofy stuff can be construction (like the Easter Island statues or the Mayan plaster-coated temples which deforested their lands and caused micro-climate-change, contributing to the fall of their empire), the development and refinement of art and music, and these days Hollywood blockbusters and Pomeranian shows and kinder coffee mornings and writing blog posts are all clearly goofy things to put resources into.
Considering this in the context of the War on Hair, a large proportion of culture and custom is devoted to grooming and socialisation (the fashion industry, fitness industry, and aforementioned coffee morning, for example). But another proportion is devoted to establishing and maintaining social hierarchy in terms of acquiring possessions, and grooming those possessions as a simulacrum for self-grooming and allogrooming (grooming other people). This is what the consumer culture hinges on: using our "extra spare time", above and beyond our primate cousins' "grooming spare time", to shop for objects that make our house look better ('grooming' the house), or working overtime to save up for a renovation (more house 'grooming'), or cleaning and washing things that our primate cousins wouldn't bother with, like dishes and underpants. And this self-pride and house-pride is actually a social indicator of good mental health too: if you let your dishes stack up and don't wash your undies, you're one shopping-trolley away from being the batty old bag lady who talks to herself.


So don't go to war with your own hair. That's just goofy. It hurts, and hair has some very practical uses.
Don't go to war with the war on hair either. The war has been running for at least 4,000 years. And if you win a skirmish, and encourage a community to be comfortable with their natural body hair, another type of grooming will bubble up to fill the spare "grooming time". For example, the twirling of dreadlocks. (Hi, happy dancing lady with the hairy armpits and dreadlocks.)


I think we should all return to the original purpose of primate 'grooming time': human touch. Give your partner a massage. Rumble with your sons or go play your local type of football with your friends. Have a hot shower then get yourself a ludicrously soft towel. Race on the grass barefoot. Stand in a shopping mall next to a sign that says "Free Hugs". Use touch to look after your brain.

Sunday, 3 November 2013

Expertise or Expertosis

Mistaking Expertosis for Expertise


Before the mid-20th-century, in most places and times, experts were those very rare people to whom a particular body of information was available, and who had been taught how to sift through libraries and who to correspond with in order to have all the relevant stuff in their heads.
Much of the population was occupied and somewhat isolated in their everyday lives (yes, a gross generalisation, but roll with me please), and few felt entitled to call themselves expert on anything outside their direct experience (barrel-making, farming, building things, bringing babies into the world alive, etc.).

From the 19th century in Europe (and in fits and starts in other places and times), vocational and lay experts were people who were hooked on a particular thing, spending all their time wondering about it, running tests, studying, trying something new and making observations. Metallurgy, chemistry, engineering and geology became fields of knowledge because of lay-expert experiments. For example, industrial chemistry started with William Perkin faffing about with coal-tar waste products until he made mauve aniline dye, and a weather man called Alfred Wegener, who had a couple of world maps, a pair of scissors and a wacky idea, came up with continental drift - the precursor to plate tectonics.

Universities have always (okay since the 9th century) liked to style themselves as storage places for knowledge, and an important crucible for any aspiring (upper class) expert. They would hire those fabulous lay experts who had made a Significant Contribution. But for anyone else, to 'read' law or history or archaeology or natural sciences meant literally that - to spend a number of years on your backside in a library with books open in front of you. If you managed either, you were an expert - a Bachelor, a Master or a Doctor. And you got to wear a robe and lord it over people who weren't experts.


This century, the exclusivity of information-based expertise is rapidly unravelling. In ten minutes, a fast-reading and halfway competent 15 year old can tell you most things your orthopedic surgeon would about hip replacements. And you can watch Youtube tutorials on how to do a hip replacement. In fact, well-informed and questioning patients are the bane of many doctors' working days! All those tricky questions - having to justify your professional position over and over again, client after client! Ideally this would motivate the diligent professional to bone up on the most recent research, and have good counter-arguments for the large amounts of swill available on the web.


Expertise now lies in your ability to evaluate and work with the data you can find. To have the background and analytical techniques to decide which information is flat wrong. To hold the scalpel, to advocate in a court, to drive an excavator on a steep slope, to design the election-winning advertising campaign. Also, to know where the data stops and where your own knowledge stops. To pick the outliers, the particular problems which can't be answered by WikiHow and a decent Youtube video. And to excel in the complex, ambiguous, grey areas.


Expertosis is the syndrome where you think you know lots but you don't. Just go listen to a student political group yakking about "They Should..." (publicly fund all undergrad places/close the student union/ban umbrellas/force everyone to study a second language - and that was all in 10 minutes!)
Teenagers are prone to expertosis. So are the middle class in their forties and fifties. Not that they are a new phenomenon - everyone's met an annoying uncle who tells you how it Ought to Be at the Christmas barbie, or an obnoxious teenager who says "It's all so clear. The answer is obvious. You're just idiots."

Their opinions must be right, because when they think them, they feel warm and fuzzy and right.
Everyone else just hasn't seen it yet.


But in essence, feeling right is only that. A feeling. Not actually correlated with whether or not you are right in any kind of physical or moral sense. In fact, the more right you feel, the less you may have evaluated the problem and the more likely it is that you could be wrong.

Cognitive ease is when the answer is obvious and comes to you effortlessly.
In Thinking, Fast and Slow (which is still one of my favourite books) Kahneman summarises research on cognitive ease and cognitive strain - book extract in this link.
You feel cognitive ease if the thought is repeated, or you're in a good mood, or something has primed you for the idea, or if it's easy to take in (eg. small words, clear font, simple, clean, structured, apparently congruent).

So some examples:

Do you like apples?

YES I like apples.
Easy answer. Feels right. Apple. Crunch. Yum. Good for you. Like.

In the recent election campaign, the Liberal campaign was expertly crafted to bring cognitive ease.

Short words.
Four dotpoints.
Clear font. Big print.

Must be right.

Cognitive strain, on the other hand, feels uncomfortable. It happens with unfamiliar problems, happens more when you're in a bad mood (or even just frowning with a pencil in your mouth), or when the problem is not easy to take in (eg. long words, small print, poor structure).

It's the thing that is triggered when you're asked to do this sum in your head:

158 x 14 = ?

Did you even try?
Bet you didn't.
Bet your pupils dilated and you frowned and went "Oh that's haaaard" and gave up.
I gave up first time, and I'm an engineer.
Go get a bit of paper and give it another shot. See, I will too.

(158 * 10 = 1580) +
(158 * 4 = oh that's haaard)

(158 * 10 = 1580) +
(150 * 4 = 600) +
(8 * 4 = 32)

1580 + 600 + 32 = oh that's haaaard, no hang on a tic, I can do this, 2180 + 32 = 2210 + 2 = 2212 HOORAY! That feels good and right, too.
(Now I've got the length of cladding I need for the kids' cubby front wall. Thanks.)
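And if you'd rather make the computer do the straining, the same paper decomposition fits in a few lines of Python:

```python
# Break 158 x 14 into easy chunks, the same way as on paper.
tens = 158 * 10       # 158 x 10 = 1580
by_four_big = 150 * 4  # 150 x 4 = 600
by_four_small = 8 * 4  # 8 x 4 = 32
total = tens + by_four_big + by_four_small
print(total)  # 2212 - same answer, no frowning required
```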

Another example, to give your poor sore brain a rest.
John Hewson was politically sunk when he gave this very famous and slightly funny interview. Poor guy. Bet he feels wistful when he gets a birthday cake.

Cognitive ease and cognitive strain, and expertise and expertosis

Here is another interesting fragment of research.

If a person is making a conclusion about something familiar, or has a habit of trusting their gut reactions, or likes to express opinions, cognitive ease brings a greater certainty. In Fast & Slow, religious evangelicals are cited as an example.

If a person is working with something unfamiliar, has a habit of trusting thoughtful responses or thinking carefully about their opinions, cognitive strain brings a greater certainty. Graduate students are cited as the counter-example.



So sitting on your backside surrounded by books in a university library for a number of years is actually a good way to train a person to trust in the answer that comes after cognitive strain. They may not end up truly expert in the material, but they do end up at least trying to think harder about everything they come across.
In contrast, being able to google "hip replacement" and instantly knowing lots is a good way to train the 15 year old to trust cognitive ease.

This is what the limitless availability of information may be doing:
  • Replace many experts with expertoids. 
  • Train us to trust cognitive ease, not cognitive strain, to bring the Right Answer. Foster a habit of seeking the congruent, familiar, easy answer.
  • Give a voice to those who would not have had the confidence to say they know anything (eg. mad old uncle Tim) - on an equal footing with those who have trained themselves to strain.
  • Equip us to handle the 90% of simple problems (build a retaining wall, write a novel) and then give us the contact details and web reviews of a good orthopedic surgeon when we need one.
Four dotpoints, see. I can do it too.
Gotta work on those short words, though.

Halloween food





So I got a bit ... well... enthusiastic at Halloween this year.

Not posting pictures of the 12 kids under 6 years old who came to my house, in case one of them wants to be police commissioner in some future where Halloween is a moral outrage. Or photos of my giant sparkly fake eyelashes, which were fun! But if electronically public they might stymie my own ambitions to be a police commissioner in the future (or something). Or the bit where the lizard kid tried to wrap the skeleton kid in toilet paper to make him a mummy. That was fun. Although I should have supplied cheaper toilet paper. The supersoft paper just kept breaking.
There were monster drawing games, much screaming and running and hide-and-seek and BOO!, but all my other game plans dwindled when the little boys (and my daughter) started chasing the chooks around the block screaming, and the other little girls sat in their princess outfits, asked politely for some textas, and started drawing butterflies and writing long words like "because".


I shall post pictures of my WATERMELON BRAIN
complete with hemispheres and the main lobes - OK, not enough sulci and the gyri are too fat, but it's not bad for 20 mins from an engineer, not a neurosurgeon.
I challenge any neurosurgeons reading this to make a better one. And post a comment.
Not my idea. I googled Halloween Food and found this. In the spectrum of brains posted in the comments, mine comes out pretty nice looking. 
 
Am I being too competitive? Is competitive brain carving in the Mummy Olympics?


Here is all the food.
Brain top right.
Dragon-dropping cream puffs with chocolate top left.
Banana chips supposed to look like sliced vertebrae.
Eyeball eggs - which were very tasty, but nobody was game to eat them at first.
Ghost bananas - how cute!
Mandarin pumpkins.
Slices of apple cut into mouths with teeth made of almond flakes.
Fish fingers with almond-flake fingernails.

And there were also baby frankfurts wrapped like tiny mummies, instructions here,
which disappeared between putting them out and getting my camera. (I think the lizard boy ate them.)
Although if you are making them, don't cut the puff pastry into strips. Cut it into one continuous spiral. Wrap each frankfurt, making sure there's a spot for the eyes later, and break it off. Much less fiddly than all those little strips.


What fun.

I shall have to do it again next year.

Wednesday, 23 October 2013

Why cemeteries are happy places

In my cultural background, when you go visit relatives, you go to visit the living ones and the dead ones. The living relatives are also supposed to make regular trips to visit the dead, clean the gravesite, weed around the headstones, have a chat, have a picnic, bring the kids. But in Australia, the council has paid gardeners and groundspeople who keep the cemeteries looking nice. They are often empty on weekends. People say cemeteries are ooky places, and some hold their breath driving past, almost as if death itself is contagious.
The living typically only show up under duress, and stand around silently in black on the muddy verge of the pit when the next grandparent/aunt/uncle dies. They get out of there as soon as possible.

At work, I was talking about going to visit my great-aunt in the cemetery in Poland. "Oh, that's morbid," one of the Steves said when it came up in conversation. "I hate cemeteries. They're depressing places."

It might be morbid in the sense of being to do with death, but it's not depressing. Cemeteries are lovely places.
Everyone buried in a cemetery had family or relatives who cared about them, who for whatever reason wanted to make sure that the deceased's body was given a spot, a reference point, a marker to say that they had lived. Everyone with an inscription had somebody who cared to write it. Everyone with an alabaster pot for flowers on their headstone had people who expected to come to drop off flowers. Everyone who had "beloved of" written on their stone had somebody who loved them.

Relationships in life are complicated things - there are whole professions to help us deal with them. We can love people and can't stand to be with them; we can be supported by our parents but feel freedom when they die; and family duty or financial obligation can make a mess of an otherwise straightforward relationship. Love and death are tied together through grief, which has a depressing physiological effect on us - our endocrine, immune, autonomic nervous and cardiovascular systems are all affected through mechanisms which are only starting to be understood. Live humans are walking bundles of contradictions, and a dead body is not. So the living are left with all those contradictory emotions from the relationship, and they have to digest them alone. The focal point for this transition is often at the cemetery, watching the coffin getting lowered into the ground. Whether the grief has surfaced by this point is irrelevant - that is the last image of that person burned into the retina of the survivor.

So if you only show up to a cemetery when somebody you know and love has just died, you're not going to develop an easy and comfortable association with the place, huh.



In broader terms, cemeteries are fascinating places. Archaeologists are always excited about grave digs. They are particularly interesting because for a cemetery to exist, the society must be stable, prosperous, healthy, and emotionally interconnected enough to find a spot to mark the resting place of their dead.
  • Cemeteries don't get populated in times of disaster or war - corpses usually get either a mass grave, or a spot on open earth or ocean to rot and get eaten by birds and eventually bleach and turn to dust. 
  • Cemeteries don't get populated in times of contagious disease - corpses get burned. Only in the 20th century did we really seal coffins adequately to routinely bury those dead from contagion. 
  • Cemeteries don't get populated by the extremely poor, or when food and money and energy are tight - when concentrating on survival, working 16 physical hours a day to subsist, or trying to keep alive your children who are too hungry to cry anymore, burying and marking the dead is a nicety that you can't afford.
  • Socially disconnected individuals don't normally get put in cemeteries. Digging a grave by hand requires substantial effort. Paying for a spot in a graveyard, and marking it with a stone, are the acts of someone who cares for the dead person. If there is nobody who cares enough about you to put you in a grave, you end up on the heap with the paupers, or cremated.
So when I notice that many suburban cemeteries are full, and the outer suburban ones are filling up, and gravestones a hundred years old are still cared for (and in some country towns the shop owners have the same surname as the oldest graves - I love it when I discover that!), I feel really happy about the modern world I live in.

What a privilege it is to bury your loved ones in a cemetery. To know where their bones are. To show your own kids and have a picnic there. To have a 3-high stack with 2 spaces left, and to ponder that this could well be a rare thing in human history, that people from all strata of society (... poor in developing nations and modern slaves notwithstanding, but that post is still coming...) can plan where to put the corpses of their loved ones ten, twenty, fifty years from now.

Please reconsider your local cemetery. Go check it out on a sunny weekend afternoon. Read the stones and think of all the care and love and joy and human connection that went into each one. It's actually a very happy place.

Saturday, 19 October 2013

Bechamel-ish sauce: dairy free, wheat-free, get the right flour and it's gluten-free

The key to a dairy free Western diet is finding substitutes. Every ingredient in traditional Bechamel is off our list. Here is a yummy alternative. Last time I cooked it in a bake, it was so convincing that DH asked if I was trying to poison him with dairy. I swear there's not a drop of milk protein in it.




Here is my best substitute for cheese - it's quite awesome in a lasagna, a gratin, or a creamy potato bake. It's based on a Bechamel recipe but carries not one iota of any of the actual ingredients in Bechamel sauce.

1 cup soymilk - fridge or room temp to start with
3 tbsp canola oil
1/3 cup of the relevant flour (I use spelt flour cos we're not gluten-free, just wheat-free)
1 egg, whisked
Pinch of salt

Mix the milk and canola in a small pot. Whisk in the flour and start to heat it on the stove.
Mix the egg and salt in while the whole thing is still mostly cold.
Whisk vigorously and heat it until it has thickened. This takes a minute, but when it happens it happens fast. Don't leave it on the stove.
Use it in lasagna (smeared on top of super-thin slices of salted eggplant, you'd swear it was parmesan)
or use it on a sliced potato bake (sprinkling rye breadcrumbs on top gives it crunch and texture)
or use it on leeks/tomatoes/zucchini/your gratin of choice.











The places it does not work so well are where you look more closely at the sauce. For example, I need to come up with a better option for eggs benedict, fondue, pizza, and nachos.
But it is truly delicious in lasagna.

Getting discovery to stick: 1421, 1434, and why the Great Scientists we adore are actually not that adorable.

  • Newton Discovered Gravity.
  • Columbus Discovered The Americas.
  • Einstein Discovered Relativity.

The people above are those credited with the description of the phenomenon that has stuck in the present canon of history. But for someone to get credit for a discovery, it actually needs to get discovered a number of times, over many years, and then for some reason one version 'sticks' in the historical record and that makes that person a Great Man (or more rarely Great Woman).
I conjecture that the Great Discoveries, and more specifically our adoration of the 'genius' people who made such Great Discoveries, are simply the most recent version.

On an individual basis, we all need to learn things several times before they stick. Ever tried to teach yourself to juggle? Try it, it's fun, and apparently good for your peripheral vision and thought speed. You will 'get the hang of' juggling several times, and then weeks or months later you will have forgotten it and have to re-discover it. I remember needing to learn how to dive about 4 times in consecutive summers before it stuck. Our brains need to learn, and practice, and lose it, and learn again, and lose it again, and learn it again and again before the new thing is integrated into our being, before the synapses are established and co-opted properly into their new job.


I don't think discovery in the collective mind is any different. I think we have to learn things many times, each generation, in order to bring it into the collective body of knowledge. And the collective contribution is only what the collective mind can tolerate.

I have just finished a pretty torrid but nevertheless interesting book called 1434: The Year a Magnificent Chinese Fleet Sailed to Italy and Ignited the Renaissance.
Poor Gavin Menzies is the punching bag of a series of well-read academic types in a number of fields. His book is not a properly peer-reviewed, cross-referenced and footnoted dissertation, and he doesn't clearly differentiate between speculation and well-evidenced historical factoid. He likes maps, and astronomical navigation, and Leonardo da Vinci, and talks a lot about all of them. I found some of his 'minor' evidence quite compelling: genome haplotypes on certain Dalmatian islands, sudden changes in the depiction of stars on chapel roofs etc. But I got bored and bogged down in all the map/globe stuff: endless comparisons between long-shelved drawings of river locks and extracts of ancient metallurgy texts, repetitive map footnotes talking about 47 carts of pepper per day consumed in Canton (now Guangzhou), and a series of Renaissance Men.

His main point, however, is that Chinese travellers to Italy caused the Renaissance. His theory is the conjunction of three historical factoids.
  • In 1434, the Chinese Ming dynasty had maps and navigation and astronomical and mathematical knowledge way ahead of the Europeans (this is more or less historically established); 
  • The Europeans (specifically Florentines) made massive rapid leaps in science and technology at about the same time as a substantial number of sexy Asiatic slave girls messed with the Florentine social order; 
  • Just about all of the major documented technological, agricultural and military advances of the early Renaissance can be cross-referenced to some Chinese texts which are about 100 years older. To illustrate, he traces every one of Leonardo da Vinci's inventions back to an old Chinese booklet, which he alleges was copied by the Sienese engineer Mariano Taccola.
Given this - and scraps of genetic evidence and folk stories - Menzies proposes that one particular Chinese admiral took his fleet to Europe, handed out a whole bunch of booklets, traded slave girls, and left in his wake the explosion of the Renaissance.

Whether or not Menzies is "right" about this Chinese fleet, he raises an interesting point. The history of discovery is quite different to the reality of discovery.


Chris Stringer, in The Origin of Our Species (which I found very much more engrossing than Gavin's book, sorry mate), talks about genetic evidence for wave after wave of migration, conquest, decline and elimination worldwide, of pockets of humanity. Technology and architecture develops and grows, and then in spots it collapses, and then it is sparked again and grows again. We rebuild on genetic ruins as well as the ruins of old cities. Mitochondrial (maternal) DNA mutations show dramatically different migration patterns to Y-chromosome (male) DNA mutations. For example, in New Zealand, evidence suggests that Polynesian conquerors came in from a different direction to the previous immigration wave from Australia, and killed many of the men and took many women as wives, and now a high proportion of Maori carry mitochondrial Australian Aboriginal DNA, but Polynesian Y-chromosomes. Their genes carry the 'sins' of the long-forgotten forefathers. As do we all.

But back to Gavin, the Renaissance, and Menzies' long-winded book. Menzies seems to want to 'set History right' and give the Chinese fleet more credit. So credit is the real issue of discovery.

I propose that the history of discovery is more about claiming the credit. That is not to say that the discoverer didn't do anything; just that they overstated their case. When Google Scholar's startup page offers that you can "Stand on the Shoulders of Giants", they are recognising that this is what Great Discoverers have done for all of recorded Western history. And they are offering you the opportunity to do it too.

So to my original conjecture.

Newton did not discover gravity. Every child who has ever dropped food off their high chair has discovered gravity and is running experiments.
Newton did describe gravity mathematically for the first time in Western historical memory, and this was an important contribution. He also claimed the glory with great gusto, by publishing the Principia Mathematica through the Royal Society (isn't that a somewhat arrogant title?) and by standing over the next generation of physicists as President and Grand Old Man of the Royal Society, knighted by the Queen, member of Parliament, standardising the amount of silver in the currency, and prolifically corresponding with anyone on anything.
However, the astrophysicist who conceptualised the gravity well, and split space and time into more dimensions, is not remembered by name at all.

Columbus is credited with the discovery of the Americas because he was the key person in the last wave who claimed the discovery. He had a mindset, not of curiosity, but of claiming credit in order to get the Spanish royalty to make him Admiral of the new world and give him a knighthood. In contrast, if Gavin Menzies is indeed "right" about China discovering the Americas in 1421, perhaps the reason they didn't attempt to claim it is that they didn't frame it as a discovery. They assumed that it was already ruled by someone else, and didn't plan to interfere, knowing the administrative nightmare of taking it on. There was no big social advantage in China to conquest, so why bother?

Einstein is put forward as the ultimate genius of the modern age. As a child, I remember being fascinated by a documentary about a pathologist who had dissected Einstein's brain and found an extra fold in the right frontal lobe. This was supposed to make him the genius that he was. My modern reading on neuroscience shows how woefully inadequate this explanation is.
So what did Einstein actually do?
He had a very clever original idea about the photoelectric effect, which was indeed revolutionary. In and around other scientists' discoveries, Einstein also came up with some maths to reframe space and time as the same thing - except we can only travel in one direction in the time dimension. But it was a very generative time in physics in general, and Einstein's big interest was in claiming a great discovery. Even Wiki says he wrote 300 technical papers and 150 non-technical works in his effort to make a comfortable living from simply being a smart fellow. Titles included "On the General Molecular Theory of Heat", "A New Determination of Molecular Dimensions", "On Science and Religion". They don't sound like humble scientific papers. They sound like blog titles. In any case, I am sure you can see that he was a big fan of his own genius.

So how come, in the 21st century, we can 're-discover' the 'truth' of prior discovery, and bring Da Vinci, Newton, Columbus, and Einstein down a peg or two in the collective firmament?


I propose that any society is very careful to forget or ignore known historical achievements that are beyond what we can conceptualise doing. We hate to feel inferior to the long-dead. We only look at the technology of the past when we can safely say "Oh look, they were still grunting and spit-roasting rats and picking each others' fleas". So Menzies' proposal only gets published because we now have widespread use of Newton's calculus, and GPS, and nobody needs to calculate longitudes using stars or clocks anymore.




So the tricks to being a Great Discoverer:
1) Find a new-ish idea and have a really good reason to take credit for it. Like a knighthood or a Nobel Prize. Market it to the scientific masses. Iterate the idea a few times, and draw a fabulous picture.
2) Quietly forget to mention your sources. Pretend it all came from your own brain.
3) Target your market. If in science, pick an idea just a little bit more complicated than the current cutting edge. Don't try to propose anything too radical - the scientific community might feel inferior and come after you with pitchforks. For example, don't try to propose planetary orbits to 12th-century Papists. Don't try to propose hygiene to 19th-century Viennese obstetricians. Don't try to propose energy medicine and reiki to 21st-century orthopaedic surgeons.
4) Be a bit more fluent in academic structures, paradigms and language than Gavin Menzies. Newton had equations. Columbus had maps and charts - the academic structure of the time. And Einstein got very slick at producing papers.
5) Live a long time, get important, and dump on anyone younger than you who challenges your ideas too much. History is not written by the victor so much as the survivor.

Dear Gavin,
Your books 1421 and 1434 have certainly achieved 1) and 2). You are not comfortably established in your target market of academia, but you have great traction in the fiction-reading web-researching public (says me haha). But you do need to get your 4) sorted out: your peer-reviewed academic credentials. The real test of whether you get to keep your discovery is whether you live long enough to defend it. Good luck with that, buddy.
Sincerely
Lexskigator.

Friday, 4 October 2013

Talking to an amateur currency trader at lunchtime

"So I've been working on a program, it's been getting data from the FOREX portal and I've been doing dummy trades with it, and I sent it live three weeks ago, and I earned enough to take my wife to a country club for the weekend."
"You want to use it to quit your job?"
"Yeah. I want to earn enough to not have to show up here anymore."

He brings up the portal and his data. Over the day, he's "earned" about $500. He is stoked.

"You want to know the best thing? It's self-reinforcing. This pattern works better week by week. The best thing is that when some human spots a pattern, then he publishes the pattern, and people start to believe it, they start to follow the pattern and they get all excited when they recognise it in the data. Then as more people follow it, it is self-reinforcing. It gets stronger when people buy and sell on the pattern."

It's not fair, I say. Small nations get hit very hard by currency bubbles, people starve, you know.

"It's not a good system. But it's a big system. And when the whole planet is trading like this, you should expect a lot of inertia. It's like a mountain, it's just there.
And worrying about the morality isn't going to put dineros in my pocket."
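The feedback loop he is describing can be sketched as a toy simulation: each week the published pattern attracts more followers, and the price move the pattern "predicts" scales with the volume of people trading on it. All the numbers here are my own illustrative assumptions, not market data or real trading logic.

```python
# Toy sketch of a self-reinforcing trading pattern: followers grow as the
# pattern spreads, and their own buying strengthens the very move the
# pattern predicts. Illustrative numbers only - not market data.

def simulate(weeks=10, followers=100.0, growth=1.5, impact=0.001):
    """Return (week, follower_count, pattern_move) tuples, where the
    'move' each week is simply proportional to how many traders act."""
    history = []
    for week in range(weeks):
        move = followers * impact     # price move caused by the followers' trades
        history.append((week, int(followers), round(move, 3)))
        followers *= growth           # publication / word of mouth attracts more
    return history

for week, count, move in simulate(5):
    print(f"week {week}: {count} followers -> pattern move {move}")
```

Of course the real bubble pops eventually; the sketch only shows why, for a while, "it gets stronger when people buy and sell on the pattern" is literally true.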


I think the reason I posted it is because I find this quite a viscerally confronting moral position. This particular individual is totally refusing to recognise his part in the system - in fact he is pleased and satisfied to float on the bubble, and regards this as a worthwhile goal for his life. He is ready to trade in a generative, creative career for a shuffle-piles-of-money-at-the-expense-of-the-vulnerable career.
I am probably demonstrating my strangeness in the degree to which I disapprove. Youse all probably don't think this is a problem.

But how is this different to:
Pyramid selling schemes of any kind
Marketing superfoods or supplements for weight loss which are highly doubtful, or knowingly fraudulent
Running an energy futures market a la Enron
Packaging tranches of debt for on-selling as assets a la Freddie Mac and Fannie Mae
Selling FOREX pattern recognition software? or books? 


At least with shark fins or illegal drugs, there is actually a product which makes someone somewhere feel happier about buying.

So in those nations whose production has been left plundered by FOREX traders chasing 'candlesticks' or 'pennants' or 'head & shoulders' profiles in the data (not that FOREX is the only culprit, of course; many other 21st-century globalisation-related trade phenomena contribute), the otherwise unemployed seek out another currency. Like shark fins or poppy heads. This picture of small nations' production capabilities left rotting and finless lets me post my disapproval in a way everyone can see.


Quit your FOREX trading, lunch friend of mine. Build infrastructure or sell drugs or get a job as a personal trainer. That way I will be able to keep my own lunch down better.