Sunday, November 16, 2008

Scientist Ian Frazer to trial skin cancer vaccine

By Clair Weaver

Disease prevention ... Professor Ian Frazer is trialling the world's first vaccine for skin cancer.
  • Skin cancer vaccine trials in Australia
  • Pioneer scientist heading testing
  • Available "within ten years"

THE pioneering Australian scientist who discovered the cure for cervical cancer is on the verge of creating the world's first vaccine for skin cancer.

Professor Ian Frazer, former Australian of the Year, has revealed the vaccine could be ready within the next five to 10 years.

As with the jab now given to millions of young girls each year to prevent cervical cancer, children aged between 10 and 12 would be given the vaccine to prevent skin cancer later in life, Professor Frazer envisages.

Testing on animals has shown the vaccine to be successful and human trials will start next year.

Australia has the world's highest rate of skin cancer, with more than 380,000 people diagnosed with the disease and 1,600 dying from it each year.

Professor Frazer will reveal this ground-breaking skin-cancer work at the Australian Health and Medical Research Congress to be held in Brisbane tomorrow.

He said it would be rewarding to develop a vaccine for a cancer that was so prevalent in Australia with its hot climate.

"It's an important challenge with a very major health benefit if it works," Professor Frazer told The Sunday Telegraph.

"If we get encouraging results we will try and push it on as fast as we can. It's really a given that we try to focus on health problems which are significant ones.

"When you're looking at treatments, your focus needs to be on diseases that are most common."

The new skin-cancer vaccine works by targeting papillomavirus, a common skin infection that affects most people and can linger in the body, turning abnormal cells into cancer.

Prof Frazer and his team from the Diamantina Institute at the University of Queensland are focusing on preventing squamous-cell skin cancer, which is strongly linked to papillomavirus.

Squamous-cell cancer is the second most common skin cancer, affecting 137,600 people in Australia this year and killing 400.

It's not yet known if melanomas, the most deadly form of skin cancer, are also caused by papillomavirus.

"My entire career has been focused on understanding the interaction between papillomavirus and the cancers they affect," Prof Frazer said.

"We know it causes at least five per cent of all cancers globally so one in 20 of the cancers that people get is caused by papillomavirus. It's a huge issue."

The new vaccine is part of a two-pronged approach to tackle skin cancer.

The other approach involves "switching off" one of the skin's controls to allow killer cells to destroy potentially cancerous cells.

"Getting the vaccine is the easy part," Prof Frazer said.

"We need to introduce this other component to change the setting in the local environment.

"The skin has a number of defences against the body's own immune system.

"What we're learning is the nature of those controls and how to turn them off.

"We can turn them off in animals and if we turn them off, the vaccine does its job."

Ovary transplant mother speaks of her "indescribable" joy after giving birth

Exclusive by Gordon Rayner and Rebecca Smith

Susanne and Stephan Butscher with their new baby Maja Charlotte Shasa Photo: Heathcliff O'Malley

Maja, appropriately named after the Roman goddess of fertility, is a symbol of hope to millions of infertile women around the world who could benefit from the same pioneering procedure which enabled her mother Susanne to conceive naturally.

Mrs Butscher, 39, who went through an early menopause, fell pregnant a year after being given an ovary by her identical twin sister.

Recovering from the birth at the Portland Hospital in London, she said: "Being a mother at last is an indescribable feeling. It's been hard to take my eyes off her since she was born.

"I'm so lucky to have had this wonderful opportunity which has given me a sense of completeness I would never have had otherwise.

"Being the first woman in the world to give birth after a whole ovary transplant hasn't sunk in yet, but I'm just so grateful to the doctors who enabled this to happen and to my sister, of course.

"I'm happy to be sharing my story with the world to give other women hope who might have similar problems."

Doctors believe the pioneering transplant treatment which Mrs Butscher underwent in the US last year will not only benefit women who suffer an early menopause, but could also help women who undergo chemotherapy or radiotherapy for cancer and who could freeze one of their ovaries before beginning treatment.

Mrs Butscher, whose primary reason for the transplant was to halt the advance of osteoporosis which she was suffering as a result of her early menopause, began ovulating naturally for the first time in her life after receiving the ovary from her sister Dorothee.

She said she feared her transplanted ovary had failed when she missed her period eight months ago.

"It was a little bit worrying, but something inside me told me this was different," she said. "For the first time in my life, I went out and bought a pregnancy testing kit. When it showed up positive, I couldn't believe it, so I went out and bought another one to check."

Mrs Butscher, who is originally from Hamburg but has lived in London for the past six years with her husband Stephan, who is also German, said: "Ever since I found out I was pregnant it has been a magical journey.

"At the same time I was super-nervous. Every time we went over a bump or pothole in the road I was worried."

After a straightforward pregnancy, baby Maja Charlotte Shasa Butscher, whose third name means "precious water", was born by elective caesarean at 2.42 on Tuesday afternoon, weighing 7lbs 15oz.

Doctors at the Portland decided to perform a caesarean because Mrs Butscher had reached full term with no signs that she was about to go into labour.

"When I saw her for the first time I just cried," said Mrs Butscher. "You can't really put into words that feeling when you see your daughter for the first time. I heard her scream first, as she was delivered, and then I saw her. She really is a little miracle."

Mr Butscher, 40, said: "I don't think anyone has invented the right words to describe what it feels like to become a father."

Mrs Butscher, an acupuncturist and complementary therapist, was diagnosed as being infertile 12 years ago following years of tests on her ovaries and hormone levels.

She said: "I never had periods when I was younger, whereas my twin sister had regular periods. I was very slim and the doctors said that when I put weight on my periods would start.

"No-one realised at the time that my ovaries weren't working, they just said my hormone levels weren't normal, so I was put on the Pill to compensate.

"I had all sorts of blood tests, genetic tests, DNA tests, but it wasn't until we moved to Boston in America in 1996 that I was diagnosed with premature ovary failure.

"I was also told I had osteoporosis and that it would be very, very difficult for me to have children. It was hard to take on board."

Mrs Butscher and her husband Stephan, whom she had married that year, discussed the possibilities of egg donation and adoption, but, said Mrs Butscher: "We decided in the end we wouldn't go for any of this. We had a very full life and we were happy. We came to terms with the fact that we wouldn't have children."

Mr Butscher, a management consultant, said: "It was just part of what we were. It was never a massive issue because we had happy lives."

Mrs Butscher was put on hormone replacement therapy but was concerned about the long-term side-effects and began to look for other ways of treating her osteoporosis.

Her gynaecologist suggested she should contact Dr Sherman Silber, who had carried out pioneering ovary transplant procedures at the Infertility Centre of St Louis in Missouri. He suggested she might be a suitable candidate for a whole ovary transplant if her twin sister could be the donor.

"I wanted to make sure it wouldn't harm my sister, because it's such a big thing for someone to donate an organ," said Mrs Butscher. "She said she was happy to go along with it if it was what I wanted.

"At the time my primary concern was to treat my osteoporosis, but at the back of my mind it was also about fertility, even though I had been told so many times I couldn't have children. Dr Silber said it was possible I might start to ovulate, and that was what happened."

Mrs Butscher received her sister's right ovary in a four-and-a-half-hour operation in January 2007.

"It was really emotional because I'm very, very close to my sister and I knew I was the one putting her through this, so it was difficult from a physical point of view and from an emotional one," she said.

"After the surgery there was this tiny flame of hope that I might have a child, but it was difficult trying to balance hope with realistic expectations."

The moment Mrs Butscher had never dared hope for came 13 months after her operation, with confirmation that she was pregnant.

She said: "My husband was away at a conference in Dubai at the time and I didn't want to tell him until I was sure, so I said nothing when I spoke to him that evening and it was only the next day, after it was confirmed by my doctor, that I told him he was going to be a father."

Stephan Butscher said: "I was standing on a platform just about to make a speech in front of 50 people when Susanne rang me and said: 'I have to tell you something.'

"She asked me if I was sitting down, then said 'I'm pregnant.' It was the most fantastic news, and it was difficult to keep the grin off my face as I made my presentation."

Mrs Butscher, who now hopes to have more children, said: "Maja has been absolutely fantastic, she is a good feeder and she sleeps really well. She's got her own personality and she loves to observe everything that's going on around her."

As for her status as a world first, Mr Butscher said of Maja: "She is very calm and relaxed about the whole thing. She's just such a good baby."

Obama’s Favorite Pizza At The Inauguration

Michael Pollan wondered what the next President-elect would do about the food problems in our country. Well, we're now getting our first taste of food as it relates to Obama. Unfortunately, what we're being told is a fluff piece of little consequence. It seems that the pizza shop which makes our next President's favorite pizza will be on hand to provide pie for the inauguration.

Patti Harris-Tubbs and her husband will fly out with their secret ingredients to help the Ritz Carlton recreate the pie from Italian Fiesta Pizzeria. The pizza won't be part of the elegant dinner party, but rather handed out at the less glamorous locations. When talking to the Joliet Herald News, Harris-Tubbs said that her immediate response to being asked for her pizza at the inauguration was, "Is this a joke?"

I don’t know if it’s a joke, but it’s quaint. And I guess that’s OK. But does anyone else feel that the PR spin is already at work? Bush was supposedly the President you’d have a beer with – personally, I think he was the President you needed to get drunk to even believe the stuff coming out of his mouth – and now Obama will be the President you’d spend the afternoon sharing a whole pizza pie with.

Italian Fiesta Pizzeria
1400 East 47th Street, Chicago IL 60653

A gift or hard graft?

We look at outrageously talented and successful people - the Beatles, Mozart, Rockefeller, Bill Gates - and assume there is such a thing as pure genius. Not necessarily, argues Malcolm Gladwell...

Malcolm Gladwell

Malcolm Gladwell outside his home in New York last month. Photograph: Annie Collinge

The University of Michigan opened its new computer centre in 1971, in a low-slung building on Beal Avenue in Ann Arbor. The university's enormous mainframe computers stood in the middle of a vast, white-tiled room, looking, as one faculty member remembers, "like one of the last scenes in 2001: A Space Odyssey". Off to the side were dozens of key-punch machines - what passed in those days for computer terminals. Over the years, thousands of students would pass through that white-tiled room - the most famous of whom was a gawky teenager named Bill Joy.

Joy came to the University of Michigan the year the computer centre opened, at the age of 16. He had been voted "most studious student" by his graduating class at North Farmington high school, outside Detroit, which, as he puts it, meant he was a "no-date nerd". He had thought he might end up as a biologist or a mathematician, but late in his freshman year he stumbled across the computing centre - and he was hooked.

From then on, the computer centre was his life. He programmed whenever he could. He got a job with a computer science professor, so he could program over the summer. In 1975, Joy enrolled in graduate school at the University of California, Berkeley. There, he buried himself even deeper in the world of computer software. During the oral exams for his PhD, he made up a particularly complicated algorithm on the fly that - as one of his many admirers has written - "so stunned his examiners [that] one of them later compared the experience to 'Jesus confounding his elders'".

Working in collaboration with a small group of programmers, Joy took on the task of rewriting Unix, a software system developed by AT&T for mainframe computers. Joy's version was so good that it became - and remains - the operating system on which millions of computers around the world run. "If you put your Mac in that funny mode where you can see the code," Joy says, "I see things that I remember typing in 25 years ago." And when you go online, do you know who wrote the software that allows you to access the internet? Bill Joy.

After Berkeley, Joy co-founded the Silicon Valley firm Sun Microsystems. There, he rewrote another computer language, Java, and his legend grew still further. Among Silicon Valley insiders, Joy is spoken of with as much awe as Bill Gates. He is sometimes called the Edison of the internet.

The story of Joy's genius has been told many times, and the lesson is always the same. Here was a world that was the purest of meritocracies. Computer programming didn't operate as an old-boy network, where you got ahead because of money or connections. It was a wide-open field, in which all participants were judged solely by their talent and accomplishments. It was a world where the best men won, and Joy was clearly one of those best men.

Sport, too, is supposed to be just such a pure meritocracy. But is it? Take ice hockey in Canada: look at any team and you will find that a disproportionate number of players will have been born in the first three months of the year. This, it turns out, is because the cut-off date for children eligible for the nine-year-old, 10-year-old, 11-year-old league and so on is January 1. Boys who are oldest and biggest at the beginning of the hockey season are inevitably the best. And so they get the most coaching and practice, and they get chosen for the all-star team, and so their advantage increases - on into the professional game. A similar pattern applies to other sports. What we think of as talent is actually a complicated combination of ability, opportunity and utterly arbitrary advantage.

Does something similar apply to outliers in other fields, such as Bill Joy? Do they benefit from special opportunities, and do those opportunities follow any kind of pattern? The evidence suggests they do.

In the early 90s, the psychologist K Anders Ericsson and two colleagues set up shop at Berlin's elite Academy of Music. With the help of the academy's professors, they divided the school's violinists into three groups. The first group were the stars, the students with the potential to become world-class soloists. The second were those judged to be merely "good". The third were students who were unlikely ever to play professionally, and intended to be music teachers in the school system. All the violinists were then asked the same question. Over the course of your career, ever since you first picked up the violin, how many hours have you practised?

Everyone, from all three groups, started playing at roughly the same time - around the age of five. In those first few years, everyone practised roughly the same amount - about two or three hours a week. But around the age of eight real differences started to emerge. The students who would end up as the best in their class began to practise more than everyone else: six hours a week by age nine, eight by age 12, 16 a week by age 14, and up and up, until by the age of 20 they were practising well over 30 hours a week. By the age of 20, the elite performers had all totalled 10,000 hours of practice over the course of their lives. The merely good students had totalled, by contrast, 8,000 hours, and the future music teachers just over 4,000 hours.

The curious thing about Ericsson's study is that he and his colleagues couldn't find any "naturals" - musicians who could float effortlessly to the top while practising a fraction of the time that their peers did. Nor could they find "grinds", people who worked harder than everyone else and yet just didn't have what it takes to break into the top ranks. Their research suggested that once you have enough ability to get into a top music school, the thing that distinguishes one performer from another is how hard he or she works. That's it. What's more, the people at the very top don't just work much harder than everyone else. They work much, much harder.

This idea - that excellence at a complex task requires a critical, minimum level of practice - surfaces again and again in studies of expertise. In fact, researchers have settled on what they believe is a magic number for true expertise: 10,000 hours.

"In study after study, of composers, basketball players, fiction writers, ice-skaters, concert pianists, chess players, master criminals," writes the neurologist Daniel Levitin, "this number comes up again and again. Ten thousand hours is equivalent to roughly three hours a day, or 20 hours a week, of practice over 10 years... No one has yet found a case in which true world-class expertise was accomplished in less time. It seems that it takes the brain this long to assimilate all that it needs to know to achieve true mastery."

This is true even of people we think of as prodigies. Mozart, for example, famously started writing music at six. But, the psychologist Michael Howe writes in his book Genius Explained, by the standards of mature composers Mozart's early works are not outstanding. The earliest pieces were all probably written down by his father, and perhaps improved in the process. Many of Wolfgang's childhood compositions, such as the first seven of his concertos for piano and orchestra, are largely arrangements of works by other composers. Of those concertos that contain only music original to Mozart, the earliest that is now regarded as a masterwork (No9 K271) was not composed until he was 21: by that time Mozart had already been composing concertos for 10 years.

To become a chess grandmaster also seems to take about 10 years. (Only the legendary Bobby Fischer got to that elite level in less than that time: it took him nine years.) And what's 10 years? Well, it's roughly how long it takes to put in 10,000 hours of hard practice.

Ten thousand hours is, of course, an enormous amount of time. It's all but impossible to reach that number, by the time you're a young adult, all by yourself. You have to have parents who are encouraging and supportive. You can't be poor, because if you have to hold down a part-time job on the side to help make ends meet, there won't be enough time left over in the day. In fact, most people can really only reach that number if they get into some kind of special programme - like a hockey all-star squad - or get some kind of extraordinary opportunity that gives them a chance to put in that kind of work.

So, back to Bill Joy. It's 1971 and he's 16. He's the maths wiz, the kind of student that schools like MIT, Caltech or the University of Waterloo attract by the hundreds. "When Bill was a little kid, he wanted to know everything about everything way before he should've even known he wanted to know," his father William says. "We answered him when we could. And when we couldn't, we would just give him a book." When he applied to college, Joy got a perfect score on the maths portion of the scholastic aptitude test. "It wasn't particularly hard," he says, matter-of-factly. "There was plenty of time to check it twice." He could have gone in any number of directions. He could have done a PhD in biology. He could have gone to medical school. He could easily have had a "typical" college career: lots of schoolwork, football games, drunken fraternity parties, awkward encounters with girls, long discussions with roommates about the meaning of life. But he didn't, because he stumbled across that nondescript building on Beal Avenue.

In the 70s, when Joy was learning about programming, computers were the size of rooms. A single machine - which might have less power and memory than your microwave - could cost upwards of a million dollars. Computers were hard to get access to, and renting time on them cost a fortune. This was the era when computer programs were created using cardboard "punch" cards. A complex program might include hundreds, if not thousands, of these cards, in tall stacks. Since computers could handle only one task at a time, the operator made an appointment for your program and, depending on how many other people were ahead of you in line, you might not get your cards back for several hours. And if you made even a single error in your program, then you had to take the cards back, track down the error and begin the whole process again. Under those circumstances, it was exceedingly difficult for anyone to become a programming expert. Certainly becoming an expert by your early 20s was all but impossible. "Programming with cards," one computer scientist from the era remembers, "did not teach you programming. It taught you patience and proofreading."

That's where the University of Michigan came in. It was one of the first universities in the world to abandon computer cards for the brand-new system called "time-sharing". Computer scientists realised you could train a computer to handle hundreds of tasks at the same time. No more punch cards. You could build dozens of terminals, link them all to the mainframe by a telephone line, and have everyone programming - online - all at once.

This was the opportunity that greeted Bill Joy when he arrived on the Ann Arbor campus in the autumn of 1971. "Do you know what the difference is between the computing cards and time-sharing?" Joy says. "It's the difference between playing chess by mail and speed chess." Programming wasn't an exercise in frustration any more. It was fun.

According to Joy, he spent a phenomenal amount of time at the computer centre. "It was open 24 hours. I would stay there all night, and just walk home in the morning. In an average week in those years I was spending more time in the computer centre than on my classes. All of us down there had this recurring nightmare of forgetting to show up for class at all, of not even realising we were enrolled."

Just look at the stream of opportunities that came Joy's way. Because he happened to go to a far-sighted school, he was able to practise on a time-sharing system, instead of punch cards; because the university was willing to spend the money to keep the computer centre open 24 hours, he could stay up all night; and because he was able to put in so many hours, by the time he was presented with the opportunity to rewrite Unix, he was up to the task. Bill Joy was brilliant. He wanted to learn - that was a big part of it - but before he could become an expert, someone had to give him the opportunity to learn how to be expert.

"At Michigan, I was probably programming eight or 10 hours a day," he says. "By the time I was at Berkeley, I was doing it day and night... " He pauses for a moment, to do the maths in his head which, for him, doesn't take long. "It's five years," he says, finally. "So, so, maybe... 10,000 hours? That's about right."

Is this a general rule of success? If you scratch below the surface of every great achiever, do you always find the equivalent of the Michigan Computer Centre or the hockey all-star team - some sort of special opportunity for practice? Let's test the idea with two examples: the Beatles, one of the most famous rock bands ever, and Bill Gates, one of the world's richest men.

The Beatles - John Lennon, Paul McCartney, George Harrison and Ringo Starr - came to the US in February 1964, starting the so-called "British Invasion" of the American music scene. The interesting thing is how long they had already been playing together. Lennon and McCartney began in 1957. (Incidentally, the time that elapsed between their founding and their greatest artistic achievements - arguably Sgt Pepper's Lonely Hearts Club Band and the White Album - is 10 years.) In 1960, while they were still a struggling school rock band, they were invited to play in Hamburg, Germany.

"Hamburg in those days did not have rock'n'roll music clubs. It had strip clubs," says Philip Norman, who wrote the Beatles' biography, Shout! "There was one particular club owner called Bruno, who was originally a fairground showman. He had the idea of bringing in rock groups to play in various clubs. They had this formula. It was a huge nonstop show, hour after hour, with a lot of people lurching in and the other lot lurching out. And the bands would play all the time to catch the passing traffic. In an American red-light district, they would call it nonstop striptease.

"Many of the bands that played in Hamburg were from Liverpool," Norman continues. "It was an accident. Bruno went to London to look for bands. But he happened to meet a Liverpool entrepreneur in Soho, who was down in London by pure chance. And he arranged to send some bands over. That's how the connection was established. And eventually the Beatles made a connection not just with Bruno, but with other club owners as well. They kept going back, because they got a lot of alcohol and a lot of sex."

And what was so special about Hamburg? It wasn't that it paid well. (It didn't.) Or that the acoustics were fantastic. (They weren't.) Or that the audiences were savvy and appreciative. (They were anything but.) It was the sheer amount of time the band was forced to play. Here is John Lennon, in an interview after the Beatles disbanded, talking about the band's performances at a Hamburg strip club called the Indra: "We got better and got more confidence. We couldn't help it with all the experience playing all night long. It was handy them being foreign. We had to try even harder, put our heart and soul into it, to get ourselves over. In Liverpool, we'd only ever done one-hour sessions, and we just used to do our best numbers, the same ones, at every one. In Hamburg we had to play for eight hours, so we really had to find a new way of playing."

The Beatles ended up travelling to Hamburg five times between 1960 and the end of 1962. On the first trip, they played 106 nights, of five or more hours a night. Their second trip they played 92 times. Their third trip they played 48 times, for a total of 172 hours on stage. The last two Hamburg stints, in November and December 1962, involved another 90 hours of performing. All told, they performed for 270 nights in just over a year and a half. By the time they had their first burst of success in 1964, they had performed live an estimated 1,200 times, which is extraordinary. Most bands today don't perform 1,200 times in their entire careers. The Hamburg crucible is what set the Beatles apart.

"They were no good on stage when they went there and they were very good when they came back," Norman says. "They learned not only stamina, they had to learn an enormous amount of numbers - cover versions of everything you can think of, not just rock'n'roll, a bit of jazz, too. They weren't disciplined on stage at all before that. But when they came back they sounded like no one else. It was the making of them."

Let's now turn to the history of Bill Gates. His story is almost as well-known as the Beatles'. Brilliant young maths wiz discovers computer programming. Drops out of Harvard. Starts a little computer company called Microsoft with his friends. Through sheer brilliance, ambition and guts builds it into the giant of the software world.

Now let's dig a bit deeper. Gates' father was a wealthy lawyer in Seattle, and his mother was the daughter of a well-to-do banker. As a child Gates was precocious, and easily bored by his studies. So his parents took him out of public school, and at the beginning of seventh grade sent him to Lakeside, a private school that catered to Seattle's elite families. Midway through Gates' second year, the school started a computer club. "The Mothers' Club at school did a rummage sale every year, and there was always the question of what the money would go to," Gates remembers. "That year, they put $3,000 into buying a computer terminal down in this funny little room that we subsequently took control of. It was kind of an amazing thing."

Even more remarkable was the kind of computer Lakeside bought: it was an ASR-33 Teletype, a time-sharing terminal with a direct link to a mainframe computer in downtown Seattle. "The whole idea of time-sharing only got invented in 1965," Gates says. "Someone was pretty forward looking."

From that moment on, Gates lived in the computer room. He and a number of others began to teach themselves how to use this strange new device. The parents raised more money to buy time on the mainframe computer. The students spent it. As luck would have it, Monique Rona, one of the founders of C-Cubed - a company that leased computer time - had a son at Lakeside, a class ahead of Gates. Would the Lakeside computer club, Rona wondered, like to test out the company's software programs on the weekends in exchange for free programming time? Absolutely!

Before long, Gates and his friends latched on to another outfit called ISI, which agreed to let them have free computer time in exchange for working on a piece of software that could be used to automate company payrolls. In one seven-month period in 1971, Gates and his cohorts ran up 1,575 hours of computer time on the ISI mainframe, which averages out at eight hours a day, seven days a week.

"It was my obsession," Gates says of his early high school years. "I skipped athletics. I went up there at night. We were programming on weekends. It would be a rare week that we wouldn't get 20 or 30 hours in. There was a period where Paul Allen and I got in trouble for stealing a bunch of passwords and crashing the system. We got kicked out. I didn't get to use the computer the whole summer. This is when I was 15 and 16. Then I found out Paul had found a computer that was free at the University of Washington. They had these machines in the medical centre and the physics department. They were on a 24-hour schedule, but with this big slack period so between three and six in the morning they never scheduled anything." Gates laughed. "That's why I'm always so generous to the University of Washington, because they let me steal so much computer time. I'd leave at night, after my bedtime. I could walk up to the university from my house. Or I'd take the bus." Years later, Gates' mother said, "We always wondered why it was so hard for him to get up in the morning."

Through one of the founders of ISI, Gates landed a secondment programming a computer system at the Bonneville Power station in southern Washington State. There, he spent the spring of his senior year writing code.

Those five years, from eighth grade to the end of high school, were Bill Gates' Hamburg, and by any measure he was presented with an even more extraordinary series of opportunities than Bill Joy. And virtually every one of those opportunities gave Gates extra time to practise. By the time he dropped out of Harvard, he'd been programming nonstop for seven consecutive years. He was way past 10,000 hours. How many teenagers had the kind of experience Gates had? "If there were 50 in the world, I'd be stunned," he says.

If you put together the stories of hockey players and the Beatles and Bill Joy and Bill Gates, I think we get a more complete picture of the path to success. Joy, Gates and the Beatles are all undeniably talented. Lennon and McCartney had a musical gift, of the sort that comes along once in a generation, and Joy, let us not forget, had a mind so quick that he could make up a complicated algorithm on the fly that left his professors in awe. A good part of that "talent", however, was something other than an innate aptitude for music or maths. It was desire. The Beatles were willing to play for eight hours straight, seven days a week. Joy was willing to stay up all night programming. In either case, most of us would have gone home to bed. In other words, a key part of what it means to be talented is being able to practise for hours and hours - to the point where it is really hard to know where "natural ability" stops and the simple willingness to work hard begins.

What is so striking about these success stories is that the outliers were the beneficiaries of some kind of unusual opportunity. Lucky breaks don't seem like the exception with software billionaires, rock bands and star athletes; they seem like the rule.

Recently Forbes Magazine compiled a list of the 75 richest people in history. It includes queens and kings and pharaohs from centuries past, as well as contemporary billionaires such as Warren Buffett and Carlos Slim. However, an astonishing 14 on the list are Americans born within nine years of each other in the mid-19th century. In other words, almost 20% of the names come from a single generation - born between 1831 and 1840 in a single country. The list includes industrialists and financiers who are still household names today: John Rockefeller, born in 1839 (the richest of the lot); Andrew Carnegie, 1835; Jay Gould, 1836; and JP Morgan, 1837.

What's going on here is obvious, if you think about it. In the 1860s and 1870s, the American economy went through perhaps the greatest transformation in its history. This was when the railways were built, and when Wall Street emerged. It was when industrial manufacturing started in earnest. It was when all the rules by which the traditional economy functioned were broken and remade. What that list says is that it was absolutely critical, if you were going to take advantage of those opportunities, to be in your 20s when that transformation was happening.

If you were born in the late 1840s, you missed it - you were too young to take advantage of that moment. If you were born in the 1820s, you were too old - your mindset was shaped by the old, pre-Civil War ways. But there is a particular, narrow nine-year window that was just perfect. All of the 14 men and women on that list had vision and talent. But they also were given an extraordinary opportunity, in the same way that hockey players born in January, February and March were given an extraordinary opportunity.

Let's do the same kind of analysis for software tycoons such as Bill Joy and Bill Gates.

Veterans of Silicon Valley will tell you that the most important date in the history of the personal computer revolution was January 1975. That was when the magazine Popular Electronics ran a cover story on a machine called the Altair 8800. The Altair cost $397. It was a do-it-yourself contraption that you could assemble at home. The headline on the story read: Project Breakthrough! World's First Minicomputer Kit To Rival Commercial Models. To readers of Popular Electronics, then the bible of the fledgling software and computer world, that headline was a revelation. Computers up to that point were the massive, expensive mainframes of the sort sitting in the white-tiled expanse of the Michigan computing centre. For years, every hacker and electronics wiz had dreamed of the day when a computer would come along that was small and inexpensive enough for an ordinary person to use and own. That day had finally arrived.

If January 1975 was the dawn of the personal computer age, then who would be in the best position to take advantage of it? If you're a few years out of college in 1975, and if you have had any experience with programming at all, you would have already been hired by IBM or one of the other traditional, old-line computer firms of that era. You belonged to the old paradigm. You have just bought a house. You're married. A baby is on the way. You're in no position to give up a good job and pension for some pie-in-the-sky $397 computer kit. So let's also rule out all those born before, say, 1952.

At the same time, though, you don't want to be too young. You can't seize the moment if you're still in high school. So let's also rule out anyone born after, say, 1958. The perfect age to be in 1975, in other words, is young enough to see the coming revolution but not so old as to have missed it. You want to be 20 or 21, born in 1954 or 1955.

Let's start with Gates, the richest and most famous of all Silicon Valley tycoons. When was he born? Bill Gates: October 28 1955. The perfect birthdate. Gates is the hockey player born on January 1.

Gates's best friend at Lakeside was Paul Allen. He also hung out in the computer room with Gates, and shared those long evenings at ISI and C-Cubed. Allen went on to found Microsoft with Gates. Paul Allen: January 21 1953.

The third richest man at Microsoft is the one who has been running the company on a day-to-day basis since 2000 - one of the most respected executives in the software world, Steve Ballmer. Steve Ballmer: March 24 1956.

And let's not forget a man every bit as famous as Gates, Steve Jobs, the co-founder of Apple Computer. He wasn't from a rich family, like Gates, and he didn't go to Michigan, like Joy. But it doesn't take much investigation of his upbringing to realise that he had his Hamburg, too. He grew up in Mountain View, California, just south of San Francisco, which is the absolute epicentre of Silicon Valley. His neighbourhood was filled with engineers from Hewlett-Packard, then, as now, one of the most important electronics firms in the world. As a teenager he prowled the flea markets of Mountain View, where electronics hobbyists and tinkerers sold spare parts. Jobs came of age breathing the air of the very business he would later dominate. He picked the brains of Hewlett-Packard engineers and once even called Bill Hewlett, one of the company's founders, to request parts. Jobs not only received the parts he wanted, he managed to wangle a summer job. He worked on an assembly line to build computers and was so fascinated that he tried to design his own... Steve Jobs was born on February 24 1955.

Another of the pioneers of the software revolution was Eric Schmidt. He ran Novell, one of Silicon Valley's most important software firms, and in 2001 became the chief executive officer of Google. He was born on April 27 1955.

I don't mean to suggest, of course, that every software tycoon in Silicon Valley was born in 1955. But there are very clearly patterns here, and what's striking is how little we seem to want to talk about them. We pretend that success is a matter of individual merit. That is not the whole story. These are stories about people who were given a special opportunity to work really hard and seized it, and who happened to come of age at a time when that extraordinary effort was rewarded by the rest of society. Their success was not of their own making. It was a product of the world in which they grew up. Their success, in other words, wasn't due to some mysterious process known only to themselves. It had a logic, and if we can understand that logic, think of all the tantalising possibilities that opens up.

By the way, let's not forget Bill Joy. Had he been just a little bit older and had to face the drudgery of programming with computer cards, he says he would have studied science. Bill Joy the computer legend would have been Bill Joy the biologist. In fact, he was born on November 8 1954. And his three fellow founders of Sun Microsystems - one of the oldest and most important of Silicon Valley's software companies? Scott McNealy: born November 13 1954. Vinod Khosla: born January 28 1955. Andy Bechtolsheim: born June 1955.

© Malcolm Gladwell 2008.

• This is an edited extract from Outliers: The Story Of Success, by Malcolm Gladwell, to be published on November 27 by Allen Lane at £16.99. Malcolm Gladwell: Live In London is on November 24 at 5.45pm and 8.30pm at the Lyceum Theatre, London. Tickets from £13.50 to £26.50. To book, call 0844 412 1742. There will be an interview with Malcolm Gladwell in tomorrow's Observer.


The End of the World As We Know It—Not

By: Allie Firestone (View Profile)

With so much doom and gloom all around us lately, I feel like none of us would be the least bit surprised if frogs started raining from the sky and the news started talking about alien invasions. With the recent end of the presidential election—finally!—we seem to be torn between being sure of impending doom and being positive that the worst is behind us. To cheer myself up a little, I started digging around for past and future end-of-the-world scenarios. What I found was a whole lot of failed Armageddon and second coming predictions. Yes, there are many. Yes, they are quite hilarious, but maybe more importantly, they’re a good reminder that this, too—whatever we think of the newest president and state of our nation—shall pass.

1. The First Second Coming: 30 CE
The Bible’s New Testament has many predictions by Jeshua of Nazareth (a.k.a. Jesus Christ) claiming that God’s kingdom would arrive very soon after Jesus’s crucifixion, or that it was already in the process of arriving. For example, the book of Matthew states, “There shall be some standing here which shall not taste of death, till they see the Son of Man coming in his kingdom.” This is followed by: “This generation shall not pass, till all these things be fulfilled.” In the first century, life expectancy was around thirty years, so this means JC was expected to reunite with his old friends within a few decades. Good news for the sinners, bad news for his followers: looks like they got stood up.

2. First Round-Numbered-Year Panic: 500
We were trying to do the math on the world’s expiration well over a thousand years ago. For the first big numerical transition, a theologian named Hippolytus predicted that the anti-Christ would make its appearance that year, which would be followed by the second coming of Christ—and subsequent end of the world. What did he base this on? He calculated that the world would last about 6,000 years and, since Adam, the first man according to the Bible, was born 5,500 years before Christ (you knew that, right?), this had to be the big day.

3. Y1K Hysteria: 1000
The approaching end of the first millennium caused mass hysteria (hmmm, sounds familiar) across the medieval Christian world. Folks believed that Christ’s return would happen on this monumental day—that Jesus would be the guest of honor at all their New Year’s Eve bashes. According to historians, during the year 999, there was a huge burst of religious fervor as believers prepared to be taken up to heaven. They began selling belongings, donating possessions to the church (though I have to wonder, what use would the church have for them if the world was about to implode?), releasing people from prison, and neglecting earthly duties like planting crops. When the predictions turned out to be a bust, the church didn’t give back the gifts, which caused some criticism of the church to follow. That criticism was silenced right quickly with the execution of the heretics.

4. You Can’t Blame Her for Trying: 1814
Hey, at least there was one woman who got in on all these predictions. Joanna Southcott, an English mystic, preached that she had supernatural powers, declaring herself the woman spoken of in the King James Version of Revelation: “There appeared a great wonder in heaven; a woman clothed with the sun, and the moon under her feet, and upon her head a crown of twelve stars.” She told followers that she would give birth to the second coming of Christ, marking the world’s end on October 19, 1814. She wasn’t as lucky as the Virgin Mary, failing to give birth to anyone on that date. Followers held on for a while, though—even though she died two months later, they kept her body hoping she’d return from the dead. (Hey, maybe her biblical events were mixed up—if not a virgin birth, perhaps a resurrection.) Once she started to decay, however, they handed her over to authorities.

5. Picking a Specific Year Is Never a Good Idea: 1890
Joseph Smith—a.k.a. the founder of the Mormon Church—reported hearing a voice while praying. He wrote, “I was once praying very earnestly to know the time of the coming of the Son of Man, when I heard a voice repeat the following: ‘If thou livest until thou art eighty-five years old, thou shalt see the face of the Son of Man.’” He would have hit eighty-five in 1890. Not only was there no second coming that year, but he’d also been dead for nearly twenty-five years at that point. Some claim there was ambiguity in his prophecy that accounts for this slight road bump in the prediction—it could mean that Jesus would return in 1890 (which he did not), or maybe it meant that 1890 would pass without the return of Jesus. If seeing the face of the son of man means not seeing his face, then, yes, this prophecy has come true. Maybe I should start making prophecies like this.

6. He’s Already Here and We Missed It: 1982
On April 25, 1982, Londoners drinking their morning cup of coffee opened up their newspapers to come face to face with a rather disturbing headline: “The Christ Is Now Here!” It topped an article announcing that Christ had already returned to Earth in 1977 and had been living among a group of Pakistani immigrants in South London. Fortunately, for all of us sinners not quite prepared for Judgment Day, it turned out just to be a series of full-page ads that were placed by a local religious group. Whew.

7. Tenth Time’s a Charm? 1984
Jehovah’s Witnesses have been in on the second coming predictions, too, which they say derive from the Book of Daniel. The most recent one hypothesized that life as we know it would come to an end in 1984. However, they said the same thing about 1874, 1878, 1881, 1910, 1914, 1918, 1925, and 1975. Not to be found wrong again, the Jesus of Burien (né William E. Peterson) went with it anyway, stating that Armageddon did come that year and that Christ returned and is already well into his reign. Talk about anticlimactic.

8. Said Pope Leo X: 2014
In 1514 this beacon of the Catholic Church wrote, “I will not see the end of the world, nor will you my brethren, for its time is long in the future”—wait for it—“500 years hence.” Another specific prediction?! This one leaves us at the year 2014 (add 500 to 1514). For some reason, followers have made guesses as to where this prophecy might come to fruition, including Niagara Falls, where the base will turn into a lake of fire. Look out, New York.

Throughout history, a select few have tried to divine when the end is coming. But it seems we’ve moved beyond just a select few worrying about the end; a Pew Research Center poll found that most regular Americans have a dire outlook on our future. More than a third believe that the U.S. will be involved in a nuclear war during the next fifty years; 56 percent think overpopulation will be a major problem and cause a strain on food and resources; and about the same number think there will be an epidemic worse than AIDS in that period. Almost two-thirds think there will probably be a major terrorist attack on this country involving biological or chemical weapons. Still, 70 percent say they’re hopeful about life this century thanks to their faith in science, technology, medicine, and higher education. How could they be so optimistic amid such dire predictions? Maybe they’ve already read these predictions and remember how they ended ... or didn’t.
