Friday, 30 September 2016

Tobacco, petroleum and sugar: what do they have in common?

All three are industrial concerns that have, over the last hundred years, enjoyed huge international markets. And all three pose a significant hazard to human health.

The tobacco industry peaked in the late 20th century, although the hazards of smoking had been known since the 1930s. Eventually, of course, the health hazards of smoking became indisputable. The battle against petroleum is still being fought, with most of us stuck as ‘users’ of a product that causes environmental damage through excess CO2 in the atmosphere (driving global warming and oceanic acidification). Then there are the direct health hazards of petroleum exhaust, especially from diesel engines, which is carcinogenic.

It has not been enough simply to inform people of the hazards. The tobacco and petroleum industries have fought back with disinformation campaigns that seek to throw doubt on the science identifying these hazards. In fact, a major secondary industry has been created to generate doubt about the actual hazards of smoking and the reality of man-made climate change. In my earlier post, engineering-consent, I looked into the techniques used by the tobacco and petroleum industries to protect their market share.

There’s another industry which is determined to maintain market share despite clear health hazards associated with its consumption. That is the sugar industry.

Over the last 50 years obesity has become a major health hazard. The obesity epidemic started in the USA and has spread across the world as different societies have adopted American dietary habits. Now, even third world countries that are barely producing enough food to support their populations have obesity problems.

Much of the trouble started in the 1970s when nutritionists were persuaded that foods containing fat were bad. In response, the food ‘manufacturing’ industry started to produce low fat products. 

However, these new ‘manufactured’ food products needed to be made palatable, because low fat foods are not tasty. Low fat foods generally have sugar added to restore some taste. Additionally, manufactured foods usually have the fibre removed, to give them a longer shelf life. The combination of added sugar and low fibre is catastrophic.


In order to understand the problem with sugar and refined carbohydrates we need to look at how the body digests food. The digestive process turns food into sugar, which is carried around our body by the blood stream. The blood stream is the distribution system for the energy that we take on board as food. Between meals our blood sugar level drops slowly as the normal process of maintaining the muscles and other organs is carried out.

The body has a mechanism to regulate the blood sugar level. When the blood sugar level falls sufficiently we feel hungry and go looking for food. 

It is now known that many of the problems of obesity come from so-called refined carbohydrates. These are the easily digested sugars and starches, from which the natural fibre, and thus much of the nutrient value, has been removed. Because the fibre is gone, these foods are digested very quickly.

When refined carbohydrates are consumed our blood sugar increases abruptly. Some of the sugar in the blood is immediately used by the muscles and other organs, but the pancreas responds to the abrupt increase by producing large amounts of insulin. The insulin converts some of the blood sugar into new fat. This is the natural process. But because the blood sugar level rose so abruptly, rather more insulin is produced than is needed merely to restore the level to normal. The blood sugar falls below normal.

The ‘control system’ for the blood sugar level has overreacted. The new, low blood sugar level causes loss of energy and renewed feelings of hunger. This is very undesirable! The body has just converted some of our meal into fat and yet we are still hungry! So we crave more food, and if we try to satisfy this with more refined carbohydrates the ensuing spike in blood sugar will cause a further burst of insulin from the pancreas. More fat will be stored, the blood sugar level will fall again, and another low-energy, hungry cycle will be induced.

And so it goes on. The cycles of low blood sugar keep causing feelings of hunger and lack of energy. The final result is that we take on more calories than we need.
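The boom-and-bust cycle described in the last few paragraphs can be sketched as a toy feedback loop. This is purely illustrative code, not a physiological model: the normal level of 100, the gains and the recovery rate are invented numbers chosen only to show the qualitative behaviour.

```python
# Toy model of the blood sugar overshoot described above.
# All numbers are illustrative, not physiological.

def simulate(meal_spike, response_gain, steps=10):
    """Track blood sugar after a meal that lifts it `meal_spike`
    above a notional normal level of 100. Sugar is cleared in
    proportion to the excess, scaled by `response_gain`; a high
    gain stands in for the insulin surge that a spike of refined
    carbohydrate provokes."""
    level = 100 + meal_spike
    history = [level]
    for _ in range(steps):
        excess = level - 100
        if excess > 0:
            level -= response_gain * excess  # insulin clears sugar (stored as fat)
        else:
            level += 2                       # slow recovery between meals
        history.append(round(level, 1))
    return history

# A gentle rise (unrefined food): the level settles back without undershoot.
print(simulate(meal_spike=20, response_gain=0.5))
# A sharp spike (refined carbs): the strong response overshoots, the level
# dips below normal, and hunger returns even though fat has been stored.
print(simulate(meal_spike=80, response_gain=1.3))
```

With the high gain the trace drops below 100 on the very first step: exactly the fat-stored-yet-still-hungry state the overreacting control system produces.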


All this was understood many years ago. In the 1970s a British scientist called John Yudkin was one of the first critical voices speaking out against the dangers of sugar. He wrote a book called Pure, White and Deadly.

It was at this time that the sugar and manufactured food industries started to promote the position that it was not sugar but fat that was the primary cause of heart problems. In 1967 three Harvard scientists were sponsored by something called the Sugar Research Foundation (a lobby group for the North American sugar industry) to publish a review of research on sugar, fat and heart disease. This review played down the effects of sugar and emphasised the role of fat in heart disease. In 1970 the Seven Countries Study was published, another highly questionable report aimed at deflecting the blame for obesity away from sugar.

These and other sponsored studies ignored the science of how sugar is metabolised and turned into fat. They were all highly critical of foods containing fat. 

Today standard governmental health advice is still to reduce caloric intake by eating less and to increase caloric expenditure by exercising more. This ignores the role of refined carbohydrates which stimulate the production of extra insulin and, as I've described above, cause the production of fat even under conditions of regular exercise and reduced food consumption. (It should be said that regular exercise is valuable as it does reduce stress and this reduces the desire to over eat.)

Yudkin’s work from the 1970s, and more recent, firmly based scientific accounts, have still not been fully endorsed by the great public health institutions. Britain’s NHS and the public health advisors in the USA and other countries are influenced by the powerful food industry, which uses sugar and related products such as corn syrup in huge quantities. Thus, as we have seen with cigarettes and hydrocarbon fuels, solid scientific arguments highlighting a danger to public health are being challenged by powerful vested interests, using much the same techniques.

See this very comprehensive analysis from the University of California of the causes of obesity in general and, in particular, the role of sugar in the diet.

Tuesday, 23 August 2016

Hover Slam

One of the most fascinating developments in space technology is the new art of controlled recovery of first stage launchers. A new technique, developed by SpaceX, promises to reduce the cost of access to space considerably. It's got the Big Aerospace competition worried, and it's also put the USA back on top when it comes to being demonstrably masters of cutting-edge technology.

The spaceflight pioneers of the 1940s and onwards had no use for recoverable launchers. Manned spaceflight technology was derived from nuclear missile technology, and there was never any question of reusing a launcher in the case of a nuclear missile.

When the Space Shuttle was developed, the two solid fuel boosters were recovered by parachute and had to be fished out of the sea and then subjected to an expensive refurbishment before the parts could be reused.

Now, machines that land vertically have been around for quite a while. Hanna Reitsch, the German test pilot, demonstrated the Fw 61 helicopter at the 1938 Berlin Motor Show, and the ability to hover was also a feature of the first operational jet with vertical landing capability, the Harrier.

The Lunar Lander could also hover, and this hovering business is quite a trick. It requires a propulsion system with precise and responsive throttle control. By this I mean that the thrust can be adjusted in very fine increments and changes quickly in response to pilot commands.

The above video, BTW, shows Neil Armstrong flying something called the Lunar Landing Test Vehicle, a rig designed for Apollo crews to practise their moon landings in. He runs into difficulties but still ends up demonstrating why he was picked to be the first man on the moon.

In order to hover a machine such as the Lunar Lander or a Harrier with precision, it must be possible to reduce the amount of thrust as fuel is consumed. The thrust produced must equal, and not exceed, the weight of the machine plus its remaining fuel. And on the Harrier and the Lunar Lander the fuel was used up pretty quickly, hence the throttle setting has to be continually reduced for as long as the hover is maintained.
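The balancing act can be put in one line of arithmetic: at every moment the thrust needed for a perfect hover is exactly the weight of what is left of the machine. Here is a minimal sketch; the dry mass, fuel load and burn rate are invented figures, not real Harrier or Lunar Lander data.

```python
# Hovering: the required thrust must fall as fuel burns off.
g = 9.81            # m/s^2
dry_mass = 6000.0   # kg, machine without fuel (assumed)
fuel0 = 1500.0      # kg of fuel at the start of the hover (assumed)
burn_rate = 15.0    # kg of fuel burned per second (assumed)

def required_thrust(t):
    """Thrust in newtons needed to hold a perfect hover t seconds in:
    exactly the weight of the machine plus its remaining fuel."""
    mass = dry_mass + fuel0 - burn_rate * t
    return mass * g

for t in (0, 50, 100):
    print(f"t = {t:3d} s   required thrust = {required_thrust(t) / 1000:.1f} kN")
```

Over this 100-second hover the required thrust drops by about a fifth, which is why the throttle must be eased back continuously.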

But engine specifications tend to emphasise power over controllability. And with spacecraft launchers the main issue is thrust: the engines are started up for lift-off and tend to run flat out until all fuel is exhausted. And it turns out that the engines designed by SpaceX for the Falcon 9 launcher cannot be throttled successfully to less than 50% of maximum thrust.

It's a characteristic of combustion engines that they have a relatively constrained range that they can operate reliably in. Reduce the fuel flow too much, by reducing the throttle setting, and they stand a good chance of quitting completely, or surging (producing too much thrust in pulses).  In order to produce consistent thrust a certain minimum power setting has to be maintained. 

In what one might consider a characteristic SpaceX approach, they have come up with a software solution to a hardware limitation.

With helicopters, the Lunar Lander and the Harrier, the pilot first establishes a stable hover and then reduces power gradually in order to touch down smoothly. But for SpaceX the lack of throttle-ability means that hovering, in the Harrier/helicopter/Lunar Lander sense, is simply impossible. By the time the returning first stage has got to the touch-down point, after following the profile above, it is so light that even 50% thrust from one of its nine engines would cause it to ascend.

In order to achieve the very neat trick we see in the video at the top, SpaceX have done something pretty audacious that no human pilot has ever been asked to do: the Hover Slam manoeuvre.

After the second stage and its payload have separated, the first stage must be turned around and slowed down; at this point it is still travelling at many times the speed of sound. Some of its engines are restarted and, after slowing, the first stage begins its descent. It is still supersonic as it re-enters the atmosphere, and some heat shielding is necessary to protect the main engines, which are now at the front in the direction of travel.

As it re-enters the atmosphere, steerable grid fins, which can be seen in the video at the top of the page, provide some aerodynamic control. Grid fins are also used on guided 'smart' bombs; they are robust and effective. At this point the launcher is basically falling out of the sky, but the grid fins allow the tube fuselage of the launcher to be angled so that a little aerodynamic lift is produced. This allows the stage to track horizontally towards the landing point.

As the last few hundred feet of altitude are lost, the grid fins position the launcher vertically. One of the engines is restarted, the descent slows, and the vertical velocity reaches zero just as the stage arrives at the deck height of the recovery barge.

Of course, everything must be just right. Start the engine too early and the launcher reaches zero rate of descent too soon, while it is still above the landing deck. It would not touch down; it would ascend again until all the fuel was used up. Start the engine too late and it would still be coming down too quickly at deck level and would smash into the landing platform. In either case it ends in what Elon Musk likes to call a RUD: Rapid Unscheduled Disassembly.
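The 'just right' moment can be estimated with schoolbook kinematics. Ignoring drag and the mass lost during the burn, a stage falling at speed v with a net deceleration of a = T/m - g must light its engine at an altitude of h = v^2 / 2a to arrive at the deck with zero velocity. The thrust, mass and descent speed below are invented for illustration; they are not actual Falcon 9 figures.

```python
# When must the engine light for a hover slam? (Toy numbers.)
g = 9.81              # m/s^2
v = 200.0             # m/s, descent speed at ignition (assumed)
thrust = 420_000.0    # N, one engine at roughly half throttle (assumed)
mass = 25_000.0       # kg, a nearly empty first stage (assumed)

a_net = thrust / mass - g        # net deceleration while braking, m/s^2
h_ignite = v**2 / (2 * a_net)    # altitude at which the burn must start
t_burn = v / a_net               # how long the burn lasts

print(f"net deceleration : {a_net:.2f} m/s^2")
print(f"ignition altitude: {h_ignite:.0f} m")
print(f"burn duration    : {t_burn:.1f} s")
```

Light the engine above h and the velocity hits zero while still airborne; light it below h and the stage is still moving at the deck. The same equation also shows why the slam is fuel-efficient: the later and harder the braking, the less time gravity has to add speed that must then be cancelled.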

In fact, as the video below shows, SpaceX are getting pretty good at getting things, just right!

Look closely at the video and at the end of the sequence you'll see the launcher pop up, the engine shut off and the launcher drop back down to the pad.

BTW, saving all the deceleration until the very last instant before touchdown, as in the Hover Slam, also happens to be the most fuel-efficient way of landing.

Technical details and more background on Hover Slam can be found on these links.

Tuesday, 12 July 2016

Folly, Farce and Hope

It's impossible to think about the UK's recent EU referendum without trying to find some positive aspect to the situation. The referendum decision to leave the EU is probably the most serious self-inflicted harm the nation could impose on itself short of declaring war. In the aftermath, the Tory leadership race turned into a fiasco, with none of the Brexit side getting the top job. While this was well deserved, it somehow poured farce on top of folly. So is there anything positive that we can find in the outcome? I think the answer to that is yes.

The full consequences of the vote to leave the EU are not yet clear. Can the UK disentangle itself from the European Union without doing serious damage to its economy? London generates a third of all the income tax raised in the UK, and if the City of London financial district loses its access to the European financial markets, which is quite possible, this will have major consequences for the rest of the country.

The question of Scottish independence has reappeared. The three hundred year old union between Scotland and England is in question. Northern Ireland may yet seek to leave the UK and find a new accommodation with Eire. The possibility that in twenty years’ time we might be looking at a broken United Kingdom, consisting just of England and Wales, must now be considered.

So what did the people who voted leave think would happen? 

In a recent opinion poll by Lord Ashcroft it was determined that: 

More than three quarters (77%) of those who voted to remain thought “the decision we make in the referendum could have disastrous consequences for us as a country if we get it wrong”. More than two thirds (69%) of leavers, by contrast, thought the decision “might make us a bit better or worse off as a country, but there probably isn’t much in it either way”. 

Why is there such a difference between attitudes of LEAVE and REMAIN?

Some have argued that the Remain campaign didn’t do enough to inform people of the likely consequences of leaving the EU. David Cameron would probably say that he did. Expert advice was wheeled in from all over the world, but it didn’t have the expected effect. Michael Gove, for Leave, made the memorable statement that he thought the British people had had enough of experts; ordinary people could make up their own minds.

So expert opinion no longer carries weight. We wouldn’t expect an amateur pilot to be in charge when we fly off on holiday, but on many crucial topics people seem much less convinced of the value of experts. Topics such as economics and climate change seem to have been moved into a zone where they are considered matters of personal opinion, as if they were of no greater consequence than a preference for marmalade over strawberry jam. A notion of absolute truth or falsehood doesn't enter into it. There’s the view, to quote Isaac Asimov, that 'my ignorance is just as good as your knowledge.'

Quite how we got to the stage where people unschooled in complicated subjects such as economics or the chemical composition of the atmosphere consider their views of equal value to those of academics with years of study is remarkable. It was not always thus.

The fact that the Earth orbits the Sun, even though this is outside direct human experience, has long been beyond question. The Earth’s place in the universe was accepted by those who couldn’t themselves prove it. This was knowledge we accepted as something science had determined; no one seriously questioned the idea. So why did we get to the point where, on certain topics, lay people feel that their view is as good as that of a Nobel prize-winning economist or physicist?

Arron Banks is a multimillionaire businessman who financed UKIP. Banks put at least £3 million of his personal fortune into the Leave.EU campaign. He hired an American political advocacy firm, Goddard Gunster, to develop and implement the campaign.

Arron Banks, in an interview, explained that, "What [Goddard Gunster] said early on was 'facts don’t work' and that's it. The remain campaign featured fact, fact, fact, fact, fact. It just doesn’t work. You have got to connect with people emotionally. It’s the Trump success."

The campaign against EU membership had many strands to it. There was the matter of immigration. All those East Europeans who would steal jobs while simultaneously hanging around on benefits. There was the spectre of Turkey joining the EU. This, despite the fact that all existing members may veto the admission of new members. To say nothing of that £350 million that was going to be reallocated to the NHS the moment Britain left the EU. That promise didn’t last beyond the first morning after the referendum.

In fact none of the Leave promises were worth anything at all because the Leave side never had a plan for how the United Kingdom would extricate itself from the EU or what its new trade relationship with the EU would be. They never could have such a plan. Such terms and conditions as may apply in the future were not and could not be arranged in advance. 

The nearest that LEAVE ever got to a post-Brexit plan was a suggestion that the UK could adopt the so-called 'Norway' model. This means being part of the EEA, the European Economic Area, but not the EU. It is by no means certain that the United Kingdom would be admitted to such a group. There are currently some 30 states in the EEA, any of which could veto the admission of the UK.

In any case, the Norway model still requires Norway to make significant contributions to the EU budget and to accept free movement of people. And Norway has no say at all in the EU regulatory process; it simply has to accept whatever Brussels decides, without debate. The irony here is that the key LEAVE slogan, devised by Goddard Gunster, was ‘Take Back Control’. It’s a slogan. In the context of post-Brexit Britain it will mean no more than ‘BEANZ MEANZ HEINZ’.

Andrea Leadsom was a major player in the Brexit campaign and she has a very casual way with the truth. By the time the campaign was over Leadsom seems to have convinced herself she could get away with anything. In an interview with the Times newspaper she suggested that as a mother she had more of a stake in the future than rival Theresa May. This was considered bad form as May had publicly stated that she was physically unable to have children.

Well, we've all made remarks that we've lived to regret. Leadsom, rather than apologising, then accused the Times of misquoting her. She went on to call the Times report gutter journalism of the worst kind.

The Times then produced a tape proving that Leadsom had said exactly what they'd printed. Thus proving Leadsom an idiot as well as a liar. And that was the end of Leadsom's attempt to become Prime Minister.

But I said there was a little hope. So where is it? 

The lies of Leadsom and the steady stream of misinformation are becoming recognised. And now there's an anger following the EU referendum that has energised people. A narrow margin of voters changed the destiny of the UK. That the campaign was dirty and deceitful, and tuned to people on an emotional level, is clear.

We've known for years that tabloid papers like the Sun and the Daily Mail print what they like with scant regard for the truth. But this has been largely ignored, as celebrity gossip and much of their other coverage is so trivial. This time, though, the stakes were higher, and the tabloids were seen to be complicit with an opportunistic political class that shared the same low ethical standards. We saw how the steady drip, drip of misinformation directly influenced something we all had a stake in.

I think that the next General election will be won and lost on different terms. When that election is fought the politicians will be dealing with an electorate that has informed itself better, has insight into the games being played with the truth and is much more aware of the consequences of their votes. 

And that, I believe, gives cause for hope.

Sunday, 26 June 2016

Len Deighton's SS-GB

BBC television is producing a mini-series based on the novel SS-GB. It will likely be sold to the USA and many other regions.

I re-read Len Deighton's novel recently when I first heard the news of the upcoming TV version. Over the years I’ve read long passages from SS-GB out to friends as I’ve tried to share with them my love for this tense and moving book. It was no chore to pick it up again.

SS-GB is in one sense a straightforward detective story, with one crucial difference: it is set in an alternate universe where the Nazis occupied London in 1940. But Britain has not yet lost everything; it still has some secrets that the Germans would like to get their hands on.

Len Deighton, in a memoir on film making, remarks on how a film director can build tension by revealing things that the hero cannot know. In historical fiction too the reader can have knowledge superior to the protagonists'. In SS-GB Len Deighton exploits the reader’s knowledge of the atomic bomb and how it will fundamentally change warfare. The reader knows, as the characters cannot, what an atomic bomb means.

The tension thus generated keeps the reader engaged throughout. In fiction I think there are three legs, world, character and plot. In a sense the plot and characters are only there to keep you turning the pages while you visit the world the writer has created. And so we stick with the hero, Douglas Archer as he makes his way through the horrific world of a German occupied London.

Yet some things are the same. The traditional British social structure is intact and somehow nostalgically cherished. The ex-soldiers, officers and men, belong to that special club that somehow sustains the British class system: ‘Our rulers may be bastards, but at least they’re our bastards’. British aristocrats such as Mayhew are just as ruthless as Nazis like Huth, yet it is only since the Nazi occupation that Britain has had concentration camps and execution squads. And the England revealed here is one where life for the aristocracy seems to have continued almost unchanged; they defy the Germans while retaining a comfortable life. It is the middle-class professionals like Douglas Archer, and the working people, who bear the brunt of the suffering.

In a sense Len Deighton knew this world. He was a boy in London when these events were/would have been happening. Britain’s soldiers, though finally defeated, fought valiantly. We see the compassion for men at arms that runs through all of Len Deighton’s work. The scene in the artificial limb factory where the young man, probably an ex-soldier, struggles with his new artificial leg never fails to move me. When I was a child in England in the 1950s people missing arms and legs were a relatively common sight, and here too in Germany.

At the heart of the book is another mystery, one that Len Deighton does NOT reveal in the final chapter. Just how did Britain come to lose the war and be occupied by Germany? He reveals a few snippets. There’s a German amphibious landing at Dover. Tanks on Wimbledon Common and plucky resistance by civilians. The Royal Navy engineers tried to destroy certain port facilities at Portsmouth.

We need to go further back, back to the Battle of Britain. My understanding of the importance of that battle is that Germany would have needed to have defeated the RAF before launching an amphibious invasion. Germany would have needed air superiority over the English Channel to suppress the Royal Navy who would otherwise have repelled the invasion force.

In our world, after failing to destroy the RAF in the Battle of Britain, Hitler turned eastwards, he broke his secret pact with Stalin and the invasion of Britain was consigned to the ‘alternate’ history books. Did the RAF lose the Battle of Britain in the world of SS-GB? Yes, says Len in the new introduction. But he doesn't say why.

The primary German fighter aircraft used in the Battle of Britain was the Bf109E. Taking off from France this variant could only spend about 15 minutes over British territory. One suggestion is that by fitting the aircraft with extended range fuel tanks they could have spent longer over England engaging Britain’s Spitfires and Hurricanes and shooting more of them down.

Certainly, later versions of the Bf109 had extended range fuel tanks. Could such a technical change have swung the Battle of Britain and thus the war? This is a very intriguing question, potentially a real butterfly effect. Indeed, losing the Battle of Britain might have led to another possibility. Rather than face invasion Britain might have thrown in the towel and sought terms with Germany somewhat as discussed in another of Len Deighton’s novels, XPD.

Such are the questions we find in SS-GB. I first discovered Len Deighton’s writing about 45 years ago. At the time I was a huge science fiction fan. After I’d read Len Deighton I wished that he would turn his hand to science fiction. Then along came SS-GB and it seemed he had.

Yet SS-GB springs from a wealth of real-world research that gives it a verisimilitude no science fiction story has ever had. The world of SS-GB so nearly might have been. This book, with its tension and compassion, is a wonderful and powerful work; let’s hope it becomes a great TV series.

Tuesday, 7 June 2016

The Tunnel Under the World.

Bertrand Russell produced a handy definition of philosophy: philosophy is the discipline where ideas can be discussed that are not scientifically provable. Well, there's a theme that fits that category, and science fiction fans are all too familiar with it: the concept of the universe as a simulation.

Fred Pohl's story The Tunnel Under the World, which you can read here (Pohl, Tunnel), was probably my first encounter with the idea. Most likely I found it in one of those yellow-jacketed collections of science fiction short stories produced by the publisher Victor Gollancz. I probably got it after school from the village library and read it instead of doing my homework. This, of course, was long before Hollywood discovered 'proper' science fiction. The movie franchise The Matrix is a better known expression of the concept of the universe as a simulation.

But it was not until 2003 that the topic became worthy of the attention of philosophers, when Nick Bostrom published a paper in the Philosophical Quarterly, 'Are You Living in a Computer Simulation?'. It has produced considerable discussion. You can read that paper here.

The premise is that, with a sufficiently advanced technology, it would be possible to simulate our world, and all the people in it, on a sufficiently powerful supercomputer. The workings of the brains of billions of sentient beings, the environmental effects, all the weather systems, the changes in atmospheric chemistry, even the motion of the tectonic plates would be modelled. But not, perhaps, what happens in deep space. Beyond our solar system the rest of the universe could be faked with a few simulations just good enough to fool the telescopes.

But why go to all the trouble of simulating something that has a real existence? Well, who wouldn't want to examine the past? Without the possibility of a time machine an Ancestor Simulation might be the next best thing. In fact, a simulation permits such tricks as replaying history after changing a small thing. How would things have turned out if JFK had not been shot? Would we still have landed on the moon? You could find out by pausing the simulation and concocting a clue to cause the police to search that book depository in Dallas. The simulation need not run in real time, the conscious entities being simulated, we humans, would not be aware of it being stopped. 

There is a base-level physical universe where the human population has advanced to what Bostrom calls the post-human level, and these are the people who can harness sufficient resources to build the computer to run an Ancestor Simulation. But, and this is the main point, the probability of our being part of this one physical universe, rather than part of one of the simulations, is minuscule. This is because the one real, physical universe could host numerous simulated universes, and a simulated universe could itself host further simulated universes once its technology was sufficiently developed. So the probability of our existence being part of a simulation is much higher than of our being part of the one physical universe.
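The counting behind that claim is simple enough to sketch. Suppose, purely for illustration, that every universe able to do so runs n ancestor simulations, and the nesting goes d levels deep; the simulated universes then outnumber the single physical one by a rapidly widening margin.

```python
from fractions import Fraction

def odds_of_being_real(n, depth):
    """Probability of being the one physical universe, if every
    universe hosts n simulations, nested `depth` levels down."""
    simulated = sum(n**k for k in range(1, depth + 1))  # count every simulated universe
    return Fraction(1, 1 + simulated)

print(odds_of_being_real(n=10, depth=1))  # 1/11
print(odds_of_being_real(n=10, depth=3))  # 1/1111
```

The values of n and depth are arbitrary; the point is only that the odds of being 'real' shrink geometrically with every level of nesting.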

The notion that we are actually part of a simulation has been proposed as an answer to the Fermi Paradox. The Fermi Paradox asks why, in such a large universe, have we yet to encounter aliens? Perhaps the original, physical manifestation of the human race, did encounter aliens but the simulation hardware cannot manage to simulate the alien solar system too. So the simulated universe, our universe, excludes aliens because there just isn't enough computer power to model them.  This may even be why the base universe is bothering to run Ancestor Simulations - to find out how humans might have turned out had they not had their culture corrupted by an alien encounter.

A simulated universe has also been suggested as an explanation for one of the more far out aspects of particle physics, quantum uncertainty. This is where either the position of a sub-atomic particle or its momentum can be known with accuracy, but not both together. At this level of granularity, it is suggested, the simulation models particle behaviour only sufficiently well to support one or other measurement but not both simultaneously. 

And so, if we are all part of a simulation, what then? If we behave badly we cause, and possibly experience, pain. It just happens that the ultimate nature of our universe will forever be unknown to us. So how should we behave? Here is an interesting answer: simulation-problem.

But what if we are not part of a simulation but part of the real universe? Well, somebody has to be part of that initial physical universe. In fact that's a very scary thought. Let's consider how we might have beaten the odds.

If humanity will inevitably advance to the point of being capable of running numerous Ancestor Simulations, and has not (or, if you prefer, will not), then something has gone very wrong. The likelihood is then that, at some point in our future, the human race becomes extinct before such Ancestor Simulations are technically possible. As a result we are part of a physical universe because that's the only one available: all those simulations will never be created, so we are left with the only option possible, the physical universe.

So our condition, as part of the physical universe or as part of a simulation, is determined by an event in the future, by whether or not the human race survives! 

We may not like it but if we are not part of a simulation the most likely case is that the human race will get wiped out. 

Tuesday, 5 April 2016

Information as a Product

At one time life's essentials were limited to shelter, fuel and food. But new technologies arrive, and some become essentials. Sewage systems and electrical power have become essentials in relatively recent times. And, for the last 50 years or so, so have personal transport and the fuel to run it.

The successful new technologies tend to become products or commodities and once something has become commodified capitalism takes over. It is the general goal of capitalism to maximise the market for a product. Suppliers want to roll their product out to the widest possible customer base. So it is ideal if a new product becomes an essential. 

As new products are introduced they tend to find favour with the wealthy first and then expand into less affluent markets. Then, if the product changes the way large sectors of the population live it eventually becomes an essential. Just since 1950, in rural areas in England, domestic electricity changed from being the privilege of the wealthy to an absolute essential of life. A civilised society without electricity has become, in just 60 years, inconceivable.

Given the preponderance of commodities that are essentials yet are also damaging to health and environment (largely but not exclusively petroleum-fuelled products) it's reassuring to see a new product emerging that is not of itself environmentally damaging. Its existence gives opportunities for concentrations of capital, which wield power and influence, to be invested in areas that are not environmentally damaging.

So what is this new commodity? There is a product that has existed since the beginnings of human written language but only within very recent times has it become a commodity. This product - information - derives from a remarkable attribute of humans, the ability to create and manipulate written language. And written language is the earliest example of a disruptive technology.  

It's believed that writing started about 6,000 years ago, at the beginning of the Bronze Age, and this is probably no coincidence: writing became an enabling technology for other technologies. But for most of those 6,000 years writing has been a specialised skill, and the information preserved and transmitted through writing was limited. Yet despite this, information in the form of written works was soon recognised as having a value. The Great Library of Alexandria, as well as being a huge receptacle of knowledge, was a considerable financial investment. But the investment paid dividends and the library profited by selling copies of the works that it preserved.

Copying the library's texts was limited by the availability of writing materials. The development of papyrus (which superseded clay and stone), parchment and eventually wood-pulp paper led to a decline in the cost of writing. But, in antiquity, all copying was done by hand and copies remained relatively expensive. A 'Scriptorium', where a skilled workforce painstakingly and laboriously hand-copied texts, and sometimes translated them, was a remarkable kind of factory. But the economics kept books (and knowledge) in short supply. And, as depicted in Umberto Eco's novel The Name of the Rose, such information as did exist could be kept out of circulation if authority deemed it so.

The situation changed with the development, in 1440, of the printing press. Written works became more common, but printing required capital and its owners could be influenced by authorities, who could thus maintain control of information.

In Britain the government set up means to restrict the number of newspapers, in the form of a tax called stamp duty, in an effort to make publishing more amenable to control. In the twentieth century further new media appeared: radio and television. But these were centralised, expensive and also controllable. The BBC, which started out as a commercial organisation, was swiftly nationalised so that the government could keep it in check.

The new electronic media established itself in the mid-1990s when widespread access to the Internet started. Now almost anyone could become a publisher or broadcaster (though the sheer abundance of information has also served to limit access to it). But it would take a few more engineering developments, effective search engines, smartphones and high-speed wireless cellular networks, before pure information could become a product in its own right, rather than the book or newspaper being the product. After 6,000 years information has entered a new phase and become, once more, a disruptive technology.

The new technology permits instant, perfect copying that would have astonished the toiling clerks in a monastic scriptorium. They'd be equally astonished by the quality of automatic language translation. Moreover, the ease of creating new websites has opened the floodgates to information, and even the most authoritarian regimes struggle to silence all dissenting voices.

The changes to the way people live and work brought on by the new age of instant universal information are still in progress. But the signs are showing. For the first time in decades private car usage, in certain western countries, is declining. Old public transport technologies are becoming more usable through the use of smartphones. If you are on a business trip or holidaying, Google your destination and have the system find you a way. It's become much easier to plan a cross-country trip when the smartphone can offer you alternative buses, trains and Uber cabs as you make your journey. Once a rental car was considered an essential on a foreign trip, but now, with all the world's travel timetables accessible and automatically searchable, the old travel infrastructure, at no cost to itself, has just got more usable.

Travel information, Wikipedia, music, continuous communication with our loved ones, shopping, and that very hard to quantify utility of knowing how to do things: in fact, all the world's knowledge has meant that pure information has finally made it into the big time. The network providers and phone manufacturers have given us access to it and companies like Google have organised it for us. The same information that has always existed just became more useful, and it has become a real commodity, sold by the megabit.

My relatives in the Yorkshire Dales, who sixty years ago had no domestic electricity, now complain that their internet is not fast enough. This newly commodified version of information is now secured by being both an essential and a product of the market. And while Authority would still like to control the content and flow of information, there is always one thing that Authority must defer to - Authority must always defer to the market.

Tuesday, 5 January 2016

The man who rediscovered the future

Recently I encountered a Margaret Atwood quote. She said that during the slavery era in the USA many people deplored the situation but felt that nothing could be done as slavery seemed an essential part of the economy. And so it is today with our current attitude towards fossil fuels. Most educated people accept that through our use of hydrocarbons we are creating an environmental time bomb. And, moreover, in order to feed our oil habit we must trade with states which are at best non-democratic and at worst active supporters of extremist fundamentalist regimes. And yet it’s felt that, as in the era of slavery, we just have to accept this because this is how our economy works.  

Even without the environmental damage the fact that the supply of fossil fuel is finite demands that sooner or later we have to find alternatives. But the apparent lack of options supports denial of the consequences. The result? A near future doomed to be much like the present and a more distant future that people just don’t want to think about.

It was not always thus. The future, as seen from the 1960s, seemed very different. The huge technical achievements of the Second World War and the early 1950s seemed set to continue. When the movie 2001 was new, in 1968, Stanley Kubrick's vision, with nuclear-powered spacecraft out roaming the rings of Saturn, seemed a reasonable extrapolation of existing technology.

Project Apollo, and its counterpart in the Soviet Union, was ongoing and funded by huge amounts of taxpayer money. An engineering culture developed that was as generously funded as the wartime Manhattan Project. Space technology became, because it could be, an enormously expensive business. Yet the technology that largely underpinned it had been devised quickly, and often on shoestring budgets, during the war. Small teams of clever young people had been set very difficult targets and created jet engines, radar systems and the beginnings of digital technology.

When 2001 actually dawned, progress had slowed considerably. By then the Space Shuttle was the pinnacle of space achievement. It had been created, at enormous expense, as a way of reducing the cost of spaceflight. Yet the cost of access to space remained very high. By 2008, following a second fatal accident, it was heading for retirement and the USA had no serious replacement. Space was costly and dangerous and the USA had no particular need to go there. And, except when there was a little government funding available, the aerospace industry had effectively abandoned spaceflight.

Worldwide, the automotive industry was equally moribund. A few big multinationals were churning out a range of largely equivalent models. There were a few outliers: Toyota with their hybrid Prius managed to become a taxi drivers' favourite, but the closest the mainstream came to innovation was the environmental absurdity of diesel-powered private cars. As mentioned in an earlier blog, an electrified future for cars had seemed possible as early as 1910, but despite a few private and taxpayer-funded initiatives electric vehicles seemed permanently parked on hold.

One might be excused for asking: what happened to the future? How had the shining future that had seemed so plausible in 1968 got lost?

In fact many bright young people, who were the spiritual children of the technological wizards of the 1940s, were hard at work. They had been busy engineering the internet and the new tools that grew out of information technology. And out of these ranks stepped someone who was ready to take on the dinosaurs of the aerospace and automobile industries. And he would use the culture and practices of Silicon Valley.

That man is Elon Musk. Barely 15 years ago he was just another rich geek who'd made millions out of the internet. Musk sought out areas that were overdue for innovation. An early endeavour was his take on online banking, which eventually became PayPal. Then Musk moved into an area of technology where previously only companies backed by huge amounts of taxpayers' money had dared to tread: the field of space technology. Soon afterwards he got in on the beginnings of the Tesla car company.

Musk did more than finance two companies. He had grand goals for both of them. SpaceX was intended to radically reduce the cost of access to space. And the Tesla car company was to make electric cars. Musk has stated on a number of occasions that his goal is to reach Mars. He has another goal: to make personal transportation independent of fossil fuels.

Musk brought with him a different kind of engineering culture. The aerospace and automotive worlds had a certain way of doing things. Traditionally they had deep management hierarchies with clearly drawn divisions between workers, engineers and management. They'd been structured like this for decades. But things had started out differently in the software world. The first software engineers had to design systems, write code and get it all working. And the Silicon Valley software engineering culture had a much shorter management chain. The chief executives often came up through the ranks having started out writing code. Musk himself did this.

The aerospace and the automotive industries had taken to 'outsourcing' a lot of their essential engineering and manufacturing. Such a strategy reduces risk and spreads the development effort. In any case, the strict traditional divisions between engineering and manufacture mean that the actual hardware can be built almost anywhere. For a lot of the big names of manufacturing that means China. And outsourcing manufacturing means outsourcing jobs.

But the outsourcing approach means the supplier companies don't always produce the optimum solution. Moreover, in the automobile industry, where rival manufacturers share a common 'automotive supply chain', it tends to steer all the competitors down the same path. A company such as Bosch may supply a similar product to different, competing manufacturers. Over in the area of space technology, outsourcing is justified on the grounds of spreading those taxpayer-created jobs more widely.

When Musk started SpaceX and got involved with Tesla he specifically picked engineers who were 'hands on'. It was the Scrapheap Challenge/Junkyard Wars type of people that Musk recruited. An early photograph from Tesla shows JB Straubel, now Tesla's Chief Technical Officer, assembling a battery pack for the prototype Tesla Roadster. And not only were the engineers expected to get their hands dirty; the men managing the companies were expected to have, as Musk has, a complete understanding of the technical issues.
And it wasn't just the organisational structure that Musk changed with SpaceX and Tesla. It is also about manufacturing 'in-house'. This is what is referred to as vertical integration, and it is exactly the opposite of outsourcing. Vertical integration has Tesla and SpaceX creating, within their own premises, progressively more of their subsystems. They are even designing and making their own circuit boards. This has reduced costs and ensured that the product does exactly what the engineers want it to do. For Tesla this also means that cars like the Model S have a system architecture that is completely under the control of Tesla's engineers. It lends itself to an approach where software-derived improvements to performance and economy can be, and have been, downloaded overnight to the car. (This isn't completely impossible for other car companies, but it is much more difficult when vehicle functionality is distributed across a number of third-party-supplied subsystems.)

Musk has spent the last 15 years innovating in areas where big aerospace and big automobile had ‘proven’ that no other ways were possible or even desirable. Musk has shown that the difficult art of rocket science is still within the grasp of the USA and what’s more that there are ways of doing it that are cheaper than even the Russians and Chinese can find.

Musk has proven that not only can electric cars be practical, they can compete at the luxury-sedan level with the best in the world. And there's plenty more to come. Musk's strategy of vertical integration is now being extended to include battery manufacture, which is essential in order to bring down the marginal cost of batteries and enable his stated goal of a mass-market electric car with a range of better than 200 miles.

In SpaceX another great milestone was recently passed when an orbital launcher was recovered and landed vertically under autonomous control. This was a challenge that has never even been attempted by 'big aerospace'. And the Tesla Model S is a luxury automobile which can have the performance of a supercar and include all the latest automated gadgets.

Musk's efforts have been a catalyst for change. Now the automobile big names are finally rushing to bring electric vehicles to market. Mercedes have announced four new electric vehicle projects for 2017. And General Motors have a new electric car design, the Chevrolet Bolt, which is set to sell for $30,000 with a 200-mile range.

Here is a video of the Falcon 9 landing.  

The work of Elon Musk has meant that, once again,  innovation means real change. He is a great engineer and a great visionary and more than that, Elon Musk has rediscovered the future.