JenLab’s nanotechnology solutions speed up skin cancer detection

More and more people are developing malignant melanomas, the aggressive black skin cancer. Ultraviolet (UV) light causes such skin tumours and ages the skin, but we also need it to produce vitamin D and to boost our mood. Half an hour of UV exposure a day is recommended: further UV exposure should be blocked by sunscreens.

There are chemical sunscreens that absorb the UV light: they penetrate the skin and undergo chemical transformations that make them effective. Because of this, most lotions should be applied 30 minutes before sun exposure – but it also means some will lose their protective qualities after as little as an hour.

More stable are ‘physical barrier’ sunscreens, sometimes called sunblocks. They contain titanium dioxide and zinc oxide, which reflect UV and work immediately – but particles larger than 250nm tint the skin white and are therefore unpopular with consumers. Nanoparticle (NP) formulations, which are reasonably transparent on the skin, solve this cosmetic drawback. NP properties vary with size, shape and coating: particles around 30nm provide greater UVB but less UVA protection than 200nm particles.

Nanocosmetics is becoming a major industry and sunscreen sales are worth billions of euros a year. Nanoscale zinc was recently approved for use in European sunscreens, but not in sprays or powders, as there is still uncertainty about how much of a risk it poses to the environment and consumers. Most NPs are assumed to remain on the skin’s surface until washed away, without penetrating the skin; any that do enter the epidermis are thought to dissolve into harmless zinc ions. However, most nanosafety studies have only been performed on skin biopsies or on pigs, not on human volunteers.

JenLab’s recently developed multiphoton tomograph allows an exact view of what is going on. The particles can be tracked on top of and within the skin; their accumulation in the hair shafts and their interaction with living cells can both be monitored. JenLab’s researchers work with Professor Michael Roberts, who has carried out multiphoton studies at the Princess Alexandra Hospital in Brisbane, the ‘world capital of melanoma’, where sunscreens are part of daily life. The researchers tracked NPs in healthy volunteers of different skin types and found that 99 percent of NPs stayed on the skin’s surface. However, a significant number of NPs were found in the skin of patients suffering from dermatitis. Nowadays, many cosmetic companies test their products with JenLab’s certified femtosecond laser multiphoton tomographs – an achievement for which the company was awarded the 2014 New Economy Award for Best Medical Diagnostics Company.

Extremely precise sub-100nm cuts through a blood cell

Nanosurgery
When working with near-infrared femtosecond lasers and NPs, the researchers at JenLab found another astonishing effect: gold nanoparticles served as light antennae and nanolenses, collecting, focusing and amplifying the light. Furthermore, under certain parameters, they induced highly localised destructive effects. Together with scientists from the Institute of Photonic Technology, JenLab optimised the nanoprocessing particle with two coatings: a silver layer to change its absorption behaviour, and artificial DNA to functionalise it and force it to bind to a specific genomic region of human chromosome 1.

When the whole chromosome was exposed to intense laser light, nothing happened to the DNA without the nanoparticle. With the nanoparticle, however, a 40nm hole appeared precisely at the site where the NP was bound to the chromosome – a hole 20 times smaller than the laser wavelength, and the smallest ever drilled with a laser beam. This new technology opens the way to highly precise optical molecular surgery and high-speed parallel nanoprocessing with functionalised nanoparticles. JenLab owns the patents for this novel technology.
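
To get a sense of how remarkable that is, compare the hole with the Abbe diffraction limit, the smallest spot a conventional lens can focus light into. A rough sketch, assuming an 800nm near-infrared laser and a high-NA objective (typical values for multiphoton systems; neither figure is given in the article):

```python
# A rough sense of scale (assumptions: 800 nm wavelength and an NA 1.3
# objective -- typical for near-infrared multiphoton systems, but not
# stated in the article).
wavelength_nm = 800.0          # near-infrared femtosecond laser
numerical_aperture = 1.3       # high-NA oil-immersion objective

# Abbe diffraction limit: smallest spot a lens can focus light into.
abbe_limit_nm = wavelength_nm / (2 * numerical_aperture)

hole_nm = 40.0                 # hole size reported above
print(f"Diffraction-limited spot: ~{abbe_limit_nm:.0f} nm")
print(f"Reported hole: {hole_nm:.0f} nm, "
      f"{abbe_limit_nm / hole_nm:.1f}x below the diffraction limit")
```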

Femtosecond lasers can even work as nanotools without the use of nanoparticles. JenLab developed the first femtosecond laser nanoprocessing microscope, ‘FemtOcut’, which is able to cut, drill and ablate with sub-100nm precision. This novel microscope has been used to perform completely non-invasive nanosurgery inside living cells by dissecting metaphase chromosomes, and to ‘knock out’ single mitochondria without destroying the surrounding membrane. The cell survived.

Targeted transfection
A very promising application of femtosecond laser nanosurgery is targeted transfection. The femtosecond laser beam drills a transient sub-100nm hole in the cell membrane, allowing molecules such as microRNA or DNA to enter the cell’s cytoplasm. Typically, the cell closes the membrane within five seconds through its own self-repair mechanism. In this way, it is possible to transfect cells without using viruses, or electrical or chemical means.

In order to obtain high transfection efficiencies in sensitive stem cells, JenLab developed the compact, transportable ‘FemtOgene’ microscope, which uses ultrashort laser pulses. Pulses as short as 12 femtoseconds, in combination with ZEISS optics, are employed to realise safe laser transfection at low milliwatt laser powers. JenLab’s software is able to recognise cells, focus on the membrane and ‘shoot’ transient holes. Up to 200 cells per minute can be transfected with a survival rate of more than 90 percent.
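
Those figures hint at why such short pulses matter: a femtosecond pulse packs its energy into so brief an instant that even milliwatt average powers produce kilowatt peak powers at the focus. A back-of-envelope sketch, assuming a 5mW average power and an 80MHz repetition rate (typical of femtosecond oscillators, but not stated by JenLab):

```python
# Back-of-envelope peak-power estimate. Only the 12 fs pulse duration
# comes from the text; the 5 mW average power ("low milliwatt") and the
# 80 MHz repetition rate are assumptions for illustration.
avg_power_w = 5e-3      # assumed average power
rep_rate_hz = 80e6      # assumed pulse repetition rate
pulse_s = 12e-15        # 12 femtosecond pulses (from the text)

pulse_energy_j = avg_power_w / rep_rate_hz    # energy delivered per pulse
peak_power_w = pulse_energy_j / pulse_s       # that energy crammed into 12 fs

print(f"Energy per pulse: {pulse_energy_j * 1e12:.1f} pJ")
print(f"Peak power: ~{peak_power_w / 1e3:.1f} kW from milliwatts of average power")
```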

In order to realise higher cell numbers, femtosecond laser transfection can also be performed in a special flow cytometer.

Currently, groups in the US, Canada, the UK, South Africa and Germany are optimising the transfection protocols. JenLab owns the worldwide patents for femtosecond laser transfection.

The FemtOgene microscope can obtain high transfection efficiencies in sensitive stem cells

Optical reprogramming and stem cells
The 2012 Nobel Prize in Physiology or Medicine was awarded to researchers Sir John B Gurdon and Shinya Yamanaka for the generation of induced pluripotent stem cells. The laureates employed viruses to transfect skin cells with a cocktail of four genes, forcing them to ‘re-develop’ into pluripotent stem cells that are able to differentiate into different cell types and tissues, including skin, heart and blood. This reprogramming of a patient’s own adult cells has major advantages, such as avoiding the use of embryonic and adult stem cells from other donors. A major disadvantage so far is the use of viruses, which may prevent clinical applications. JenLab’s femtosecond laser transfection technology may overcome this problem. Currently, a team from the Universities of Saarbrücken and Tübingen is trying to realise virus-free femtosecond laser reprogramming and direct programming. If successful, this would be a major step towards clinical application of this groundbreaking work.

JenLab’s award-winning multiphoton tomograph can be used for the early detection of skin cancer. Single cancer cells, such as strongly fluorescent melanocytes, can be easily recognised with the femtosecond laser tomograph. What about destroying a cancer cell the moment it is detected? Researchers are working on a future two-mode operation of the femtosecond laser device. In the first mode, cancer cells would be imaged in three dimensions, with sophisticated image processing recognising the tumour borders. In the second mode, the intensity of the laser beam would be significantly increased, turning it into a highly precise nanoscalpel that destroys the cells of interest while leaving the surrounding cells alive. The realisation of this surgical tool is still, however, a long way off.

Ebola brings West African economies to an ‘alarming’ standstill | Video

With the death toll in the thousands and an international emergency declared by the World Health Organisation, all eyes are on West Africa – which faces not only a health crisis but an economic one as a result of the Ebola outbreak. The New Economy speaks to Robtel Neajai Pailey, PhD candidate at SOAS, University of London and Liberian native, about how investors need to be held to Corporate Social Responsibility commitments.

The New Economy: In a Guardian piece, you spoke very vividly about dead bodies being left on the streets, people being afraid to even go to the health centres to get help, hiding instead in their homes with their families, because they just didn’t trust the infrastructure that was put in place. That is a serious crisis of confidence in the country – how do you even move forward when you’ve got that sort of situation, and a highly infectious disease that we’re talking about?

Robtel Neajai Pailey: It’s incredibly demoralising for the Government to announce on the radio that it’s an international health crisis, that people must listen to the Government, that they must follow the safety precautions and take patients to the hospital – and then, when patients arrive, they are turned away.

The New Economy: Now of course, we have these sobering statistics that have just come out from the World Bank. We know that this is not just Africa’s problem, but a problem that the world really needs to take issue with. According to the World Bank, the low Ebola scenario puts lost GDP for West Africa at $2.2bn in 2014 and $1.6bn in 2015. The high scenario is even worse: $7.4bn in 2014 and $25.2bn in 2015.

Now according to the World Bank statement, the problem is not just keeping up with the immediate issues on the ground – it’s the larger business implications that we’re talking about: aversion behaviour by people who are investing in the country. I mean, they don’t even want to go in, they don’t want to leave with any of the products from the country. We are seeing a continent that is more interconnected than ever before in terms of its investment behaviour. Now let’s talk about the ripple effect that this situation is going to create for markets all over the world.

Robtel Neajai Pailey: Countries in the sub-region are sort of realising that ostracising and completely isolating these countries isn’t going to help them, because it has implications globally, beyond just the three countries that are affected – concession companies or investors are saying: well, this is a West African crisis, and therefore we can’t trade with these countries because there might be Ebola cases.

The New Economy: But you know, we’re talking beyond the region, as you said, about the broader implications. If we see a rupture in financial markets that are integrated, then as a result, you know, it’s all over the world in just a few days. I’m not trying to fearmonger, but what I am trying to say is: these broader implications then mean what? Does that mean foreign governments have to pour more money in to solve this situation, to try to mitigate the risk? Is that what you’re saying?

Robtel Neajai Pailey: Well I think that’s a part of it, and I think even the World Bank has introduced somewhat of a stimulus package for Guinea, Liberia and Sierra Leone, so that the broader impact of the outbreak on their economies doesn’t lead to complete collapse and fragility – that’s a welcome solution. And the World Bank is even saying that we’ve got to do a lot more than the initial $20m that we talked about, because the economies have pretty much come to a standstill; we have to find other mechanisms to ensure that there’s not complete economic collapse.

I mean, even talking to family members on the ground now, people who are petty traders involved in informal work can’t sell at the market just because they are so fearful of going out, and people are not even buying their wares. Farmers are not going to the farm, commodity prices are increasing, and I’m thinking about the instability this is causing for local households who aren’t able to even sustain themselves. I mean, it’s quite alarming.

The New Economy: It is, it is. Let’s go back to the idea of building this infrastructure, health infrastructure – what obligation do these foreign firms have, given that they are coming in and helping to reap the economic benefits for the Government, as well as for themselves? Those people on the ground – we have to make sure they’re taken care of. If the governments aren’t doing a satisfactory enough job, whose responsibility is it to make sure healthcare reform is in fact enacted swiftly?

Robtel Neajai Pailey: Sure. It is the government’s responsibility. The private companies, they can only follow the government’s lead.

The New Economy: So do you think the Government should hold them more accountable? And say: you guys are coming in, so let’s talk about these deals we’re making with you, and we want to make sure that these deals are more balanced.

Robtel Neajai Pailey: Absolutely. And I think the Liberian Government, as I mentioned, with the county development funds and social development funds – this is something the Liberian Government introduced. It wasn’t the concession companies saying, oh, we have a corporate social responsibility and therefore we will. The Government said: no, this is our expectation of you; if you are going to export our natural resources, then you have to invest back in the people. Again, though, it hasn’t held their feet to the fire in ensuring that they do invest in the people, and in healthcare and education.

The New Economy: Now let’s think about some of the FTSE companies that of course subscribe to the belief that they should be engaging in Corporate Social Responsibility tactics, and that’s part of the boxes they tick off when they’re promoting themselves internationally. What does that do to enhance the ability of local governments to hold these companies accountable?

Robtel Neajai Pailey: Well I think part of it is that CSR, corporate social responsibility, has become sort of rhetorical; it’s not actually a reality on the ground, and it’s not having the impact it should.

The New Economy: But companies are using it as a measure of ‘look we’re going into your region, your continent and this is what we’re giving back’. So you’re saying that it’s not making any difference on the ground?

Robtel Neajai Pailey: Well, I’m saying that in terms of reality, from what I’ve experienced in Liberia, those investments are actually not trickling down.

There was a conference in Abuja, Nigeria, in 2001, where all the African Union governments decided to sign a declaration called the Abuja Declaration. It basically implored them, and sought to ensure, that they would invest in healthcare, and it also asked donors to ensure that most of the aid coming in would go towards healthcare financing. There have been 26 countries that have increased healthcare financing in the past 11 years, and Liberia is among them. Obviously more still needs to be done.

The New Economy: Robtel, you are a smart, dynamic woman; obviously you were able to leave the country, get educated, and then, you know, impart wisdom on the Western world about what needs to happen locally. What is it like for you to see your brothers and sisters back home dealing with these systemic issues, unable to move forward?

Robtel Neajai Pailey: It’s very disheartening, actually. I mean, I think the only thing that is keeping me from wallowing in despair and disillusionment is the fact that I speak to my father every day, and he’s a glass-half-full kind of individual; he always says, ‘we’re OK, we’re praying, we’ll get through this, it’s just a matter of time’. And my question is: how many people have to suffer in the process before we get through this? We have so much more work to do post-Ebola.

The New Economy: Well you are part of that change. Robtel, thank you so much for joining me today.

Robtel Neajai Pailey: Thank you very much for having me.

The Brits: great at inventing, not so successful at monetising

For a country as small as it is, the UK’s contribution to the world of science and technology is astounding. Some of history’s greatest innovators have come from the British Isles, helping usher in a wave of staggering technological advancements that have benefited the world. However, for all the scientific breakthroughs made by British researchers, few have translated into the sort of riches seen in places such as Silicon Valley.

With notable and respected research centres across the country, it is surprising how few Britons over the last century have garnered both praise and monetary rewards for their work. While there are some, such as James Dyson, who have been rightly praised for their inventive minds, all too often it seems a small UK technological innovation is snatched away in the early stages of development by a foreign rival. By contrast, the US has seen fortunes made for a vast number of people across Silicon Valley, thanks in large part to its commitment to both investing in ideas and helping researchers get the necessary funding, advice and support they need to realise their dreams.

The valley of death
Last year, British science journalist Michael Mosley wrote that the country had a chronic problem with monetising its inventions: “Perhaps because we are used to getting there first, we constantly fail to commercialise British inventions. Tim Berners-Lee, the father of the worldwide web, is rightly applauded for giving his invention to the world – yet on another level it would have been nice if he could have benefited from his work in the way Google’s founders have done.”

In a report released in 2013 titled Bridging the Valley of Death: Improving the Commercialisation of Research, the UK parliament’s Science and Technology Select Committee outlined the need for transforming the country’s world-class research into commercially viable products. The report said: “There exists the concept of a valley of death that prevents the progress of science from the laboratory bench to the point where it provides the basis of a commercially successful business or product.

“The future success of the UK economy has been linked to the success of translating a world class science base to generate new businesses with the consequent generation of UK jobs and wealth. For decades, governments have sought to promote technological innovation and ensure that the UK benefits from its world-class science base.”

The report added that, while many technology companies launch in the UK and do much of their development at home, they are soon snapped up by foreign firms. When the technologies they have invented become profitable, all the jobs and wealth are created abroad, and do not benefit the British economy. “These businesses take time to develop and to become profitable in an environment where financing is focusing more on quick returns and less on risky investments,” the committee’s report stated.

Words without action
However, there are signs the current UK Government wants to harness the inventiveness of its best scientists for the good of the country’s economy. In 2011, a year after two Manchester-based scientists won a Nobel Prize for discovering the super-strong material graphene, Prime Minister David Cameron announced the government would be giving the researchers further money to develop and commercialise the technology. At the time, Chancellor George Osborne said it was part of the government’s efforts to turn British ideas into commercially viable products: “We will fund a national research programme that will take this Nobel-Prize-winning discovery from the British laboratory to the British factory floor. We’ve got to get Britain making things again. Countries like Singapore, Korea, and America are luring researchers with lucrative offers to move their research overseas.”

Unfortunately, it seems the UK has yet to realise this ambition. In August, news emerged that NASA had successfully tested an “impossible” electric space drive that relied on microwave technology. The drive had been in testing for a year, inspired by a paper published by the Chinese Academy of Sciences. However, the research stemmed back to a design by British inventor Roger Shawyer from 2001. Shawyer’s EmDrive had been widely derided at the time by critics, and failed to receive any funding from either private or public bodies in the UK. Now, if NASA’s research proves accurate, the drive could provide a far more efficient, longer-lasting and quicker way of powering spacecraft. If successfully scaled up, it could deliver a vessel to Mars in a matter of weeks, rather than months. Shawyer, sadly, is unlikely to see much of the credit for it, let alone the money.

Whether it is scepticism from peers, neglecting to patent an idea, or a surprising level of altruism, history is littered with British inventors who have failed to monetise their revolutionary ideas…

An inventor hall of fame

The worldwide web, Tim Berners-Lee

While former US Vice President and environmental campaigner Al Gore has been heralded by many as the father of the web, that accolade actually belongs to an altruistic British computer scientist called Tim Berners-Lee. It was while working as an independent contractor at Europe’s particle physics laboratory, CERN, in 1980 that Berners-Lee first developed the ideas that would become the worldwide web.

Often confused with the internet, the worldwide web is a platform through which people can access and share huge amounts of information: the internet is the global system of interconnected computers on which the worldwide web operates, and has its origins in US Government work during the 1960s.

Berners-Lee devised a system of sharable information through links of ‘hypertext’ that could be spread through the internet. It was initially created as a useful and easy way in which physicists around the world could share their data with each other. Throughout the 1980s, Berners-Lee proposed widening the scope of the platform, suggesting in 1989 a “large hypertext database with typed links”. Eventually his boss at CERN, Mike Sendall, urged him to further develop the system, which resulted in the creation of the ‘World Wide Web’.

While Berners-Lee was at work in Europe, Gore was encouraging the US Government to invest in a network of connected computers known as Arpanet, which would later become the internet. All this work would establish a huge platform on which Berners-Lee’s system could thrive. Whereas the internet had previously been something used by governments, academics and technology enthusiasts, Berners-Lee’s contribution laid the foundations for it to be used on a far greater scale and in a far simpler form.

Berners-Lee would unveil his World Wide Web project in 1991, in a short post on an internet newsgroup. He made it publicly available at no cost, offering his system to the world to enhance and collaborate with. He wrote: “The WorldWideWeb (WWW) project aims to allow all links to be made to any information anywhere… The WWW project was started to allow high-energy physicists to share data, news, and documentation. We are very interested in spreading the web to other areas, and having gateway servers for other data. Collaborators welcome!” He could not have known that his invention would go on to transform the world in a way not seen since the Industrial Revolution.

The steam engine, Richard Trevithick

An invention that paved the way for the industrialisation of the world, the steam engine can be claimed by a number of people. Spanish inventor Jerónimo de Ayanz y Beaumont first patented a steam engine design in 1606, before Thomas Savery patented a steam pump in 1698. However, the first commercial use of a steam engine didn’t occur until 1712, when English inventor Thomas Newcomen developed one that could be used to pump water out of a mine. While the steam engine was put to use in a number of ways during the 18th century, it wasn’t until the turn of the 19th that Richard Trevithick, an inventor and mining engineer from Cornwall, devised a contraption powerful enough to revolutionise the world. His work on steam power proved crucial to the development of the technology for mass transportation, but is rarely spoken of nowadays.

It is widely agreed George Stephenson invented the steam railway, building the first inter-city rail line in the world to use steam locomotives in 1830. However, Trevithick’s work decades before acted as a crucial precursor to the dawn of rail travel. Towards the end of 1801, Trevithick tested a steam car that could climb a steep hill, making him the first person to successfully power a piston with high-pressure steam. In 1803, Trevithick was summoned to Samuel Homfray’s ironworks in South Wales to look at developing the high-pressure engines he had used in road vehicles. Homfray pushed Trevithick to devise a way of converting the engines into rail locomotives, and thereby help transport the heavy iron from Homfray’s factory across the country. Such was the scepticism that a local rival, Richard Crawshay, bet Homfray 500 guineas the journey wouldn’t be completed. A year later, however – and thanks to the work of Richard Trevithick – the first steam-powered rail journey in the world ran nine miles from Penydarren to the Merthyr-Cardiff Canal in Wales. Sadly, while the journey was successful, it was not one that could be easily replicated, and Trevithick’s engine was considered too heavy for the rails.

Trevithick’s work on high-pressure steam engines would be taken by other inventors in the years that followed – including such celebrated 19th century engineers as George Stephenson and Isambard Kingdom Brunel – and refined to a point that made the machines lighter, more efficient and easier to transport.

Microchip, Geoffrey Dummer

The impact the microchip has had on the world cannot be overstated. An integral component of a staggering array of devices – including ATMs, digital watches, computers, smartphones and video game consoles – the microchip is what binds all these technologies together and allows them to operate.

Also known as an integrated circuit, the microchip is a set of electronic circuits made from semiconductor material. Credit for its invention can again be given to a number of people. In 1949, German engineer and Siemens AG employee Werner Jacobi filed a patent for a semiconductor that resembled an integrated circuit and was used as an amplifying device. However, it was never given a commercial application. Three years later, British radar scientist Geoffrey Dummer conceived the first integrated circuit, which he presented to the general public at a conference in Washington DC. Unfortunately for Dummer, nobody took his plans seriously and he was unable to obtain any funding to build the chip. Dummer was later hailed as a pioneer and “the Prophet of the Integrated Circuit”, but his campaigning for the UK Government to invest heavily in the technology was met with indifference. Six years later, US firm Texas Instruments designed a similar microchip and stumped up the money to construct it. Texas Instruments employee Jack Kilby patented the integrated circuit in 1959 and was awarded a Nobel Prize in 2000 for his troubles.

Dummer put his lack of funding down to “war-weariness”. “The plain fact is that nobody would take the risk,” he later wrote. “The Americans took financial gambles, whereas this was very slow in this country.”

The match, John Walker

You would think something so commonly used as the match would have made its inventor an absolute fortune. However, the man responsible for creating the first friction match considered the invention – created by accident – “too trivial” to warrant patenting.

John Walker, an English chemist working in the early 19th century, had developed a keen interest in creating fire more easily than with previous methods. It had already been established that a number of chemical combinations would result in a sudden explosion of fire, but no method had yet been discovered for transferring the flame to wood. In 1826, Walker accidentally dropped a wooden splint coated with a combination of sulphide of antimony, chlorate of potash, gum and sulphur. This combination resulted in a sudden flame that remained burning on the splint.

Walker began producing the matches (which he called ‘Congreves’ after the inventor Sir William Congreve) without patenting them. The matchboxes came with a piece of double-folded sandpaper that the match could be lit against. Walker only sold 250 boxes of his matches, but he probably didn’t worry too much as he was already quite rich. The credit for his invention only arrived once he had passed away.

Walker’s mistake in not patenting his invention was soon repeated by another inventor. Three years later, Scottish inventor Sir Isaac Holden came up with a similar match. Unfortunately for him, he too neglected to patent the idea, and a man named Samuel Jones spotted the mistake at a demonstration. Jones quickly patented the idea himself and began selling his own ‘Lucifer’ matches, which would become hugely popular.

Had either Walker or Holden patented their ideas, they would have made a fortune. While Jones made a considerable amount of money from his quick thinking, his design, which suffered from an unfortunate smell, was soon superseded by others. Today, matches are an easy, cheap and widely available source of fire: 500 billion are used each year.

Catecar fights against climate change with Dragonfly vehicle

Airports, especially city airports, are prodigious business centres, close to millions of people around the world. Hundreds of companies and thousands of workers make airports work, day after day. By showing their capacity to support, promote and spread innovation that encourages environmental protection, they will play a prominent role in the race for sustainability.

By showing the political and business vision to find solutions, the airport business community can help to compensate for the current negative impact of air transportation. Our own electric vehicle, Dragonfly, is a small stake in comparison with the sustainability challenge in general. It’s a challenge that calls for a quick expansion of innovative concepts and business models throughout the community. The airport network could be a leading force for innovation, technology promotion and sustainability. This is, in itself, an exciting goal.

Airport, business and image
The name ‘Dragonfly’ reflects the vision of the innovative business implemented by Catecar. Prehistoric dragonflies had wingspans of up to 74cm. Over time, confronted by falling oxygen levels, they evolved into the smaller, quicker and highly efficient insects we know today. Becoming smaller was necessary to adapt to a changing environment: becoming more energy efficient is today’s challenge for the transportation industry.

Dragonfly is a thermic/solar-powered electric vehicle that addresses the challenges of climate change and booming energy costs. It weighs only 380kg thanks to state-of-the-art technologies developed in some of the best research centres in Switzerland: the Swiss Federal Institutes of Technology in Lausanne and Zürich.

Its key features include a ‘cabin in flax’, a Swiss-patented way of weaving vegetable flax fibres, moulded like a composite material. The cabin weighs only 35kg and absorbs shocks four times better than steel bodies, while being cheaper and more resistant than carbon or aluminium.

The Dragonfly’s solar autonomy ranges from 5km to 7km per hour of sunshine. It carries only 35kg of lead batteries thanks to a highly efficient and light (5kg) solar roof rated at around 300W, meaning each hour of sunshine yields roughly 300Wh of charge. The vehicle has a thermic autonomy of 1,000km in any weather conditions, thanks to a petrol-driven range extender that recharges the batteries as needed. This means the Dragonfly has no need for charging stations.
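
As a rough consistency check, the quoted figures imply an energy consumption of roughly 43-60Wh per kilometre, which is plausible for a 380kg vehicle. A minimal sketch (reading the 300W roof as delivering about 300Wh per hour of sunshine is our interpretation, not a figure Catecar states):

```python
# Consistency check on the quoted figures. The 5-7km solar autonomy is
# from the text; ~300Wh per sunny hour is our reading of the 300W roof.
roof_wh_per_sunny_hour = 300.0

for km_per_sunny_hour in (5.0, 7.0):
    implied_consumption = roof_wh_per_sunny_hour / km_per_sunny_hour
    print(f"{km_per_sunny_hour:.0f} km per hour of sun -> "
          f"~{implied_consumption:.0f} Wh/km")
# ~43-60 Wh/km is plausible for a 380kg vehicle, so the figures cohere.
```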

Catecar has not only developed a new product but also an innovative business model: producing locally through a network of small production units, then selling locally, with airports, ports, military bases and islands as its first clients. Catecar is willing to explore any opportunity with airports and/or enterprises to speed up their interconnected business goals and joint interests.

For further information email catecar@catecar.ch, or visit catecar.ch

Tidal bore energy: how it’s harnessed

Ever since the Victorian era, engineers have attempted to harness the energy potential of the UK’s Severn Bore tidal range. It wasn’t until the 1960s that French engineers actually implemented the technology that proved it could be done, with their facility near Saint-Malo capable of generating an estimated 240MW as the tide comes in.

With the Severn boasting one of the strongest tides on Earth, a tidal bore power station on the estuary floor could provide five percent of the UK’s overall energy needs. With government officials scrambling to meet an internationally agreed carbon reduction target of 80 percent by 2050, the implementation of carbon-neutral production facilities would help the nation bring down emissions and increase green contributions to the national grid by 15 percent.

Because tidal systems are dictated by the moon’s gravitational pull, energy can only be harnessed for 10 hours per day. That is less production time than wind turbines and solar photovoltaics, but the predictability of those tides means engineers know exactly when extraction systems need to be in operation. Furthermore, water has 1,000 times the density of air, which makes it possible to generate electricity from the tide even at very low speeds.
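
The density argument can be made concrete with the standard fluid-power relation: the kinetic power passing through a turbine of swept area A in a flow of speed v is P = ½ρAv³. A short illustrative sketch (the 2m/s current is our example figure, not one from the article):

```python
# Kinetic power through a rotor of swept area A in a flow of speed v:
# P = 0.5 * rho * A * v**3. Densities are standard values; the 2 m/s
# tidal current is an illustrative assumption.
rho_air = 1.225       # kg/m^3, air at sea level
rho_water = 1025.0    # kg/m^3, seawater

def power_density(rho: float, v: float) -> float:
    """Power per square metre of swept area, in W/m^2."""
    return 0.5 * rho * v ** 3

v_tide = 2.0
w_tide = power_density(rho_water, v_tide)
# Wind speed carrying the same power per square metre:
v_wind_equal = (rho_water / rho_air) ** (1 / 3) * v_tide

print(f"Tide at {v_tide} m/s: {w_tide:,.0f} W/m^2")
print(f"Wind must blow at ~{v_wind_equal:.0f} m/s to match that")
```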

Once installed, bore systems require only occasional maintenance, so there are very few overhead costs. Meanwhile, typical installation depths of 30-50m below the surface ensure minimal disruption to the area’s aesthetics and to local shipping.

Unfulfilled potential
That being said, there are quite a few reasons tidal power has yet to reach its full potential. First and foremost, it’s far too costly to build. In 2011, the UK government concluded a two-year feasibility study analysing 10 proposals to utilise the tidal power of the Severn Bore. The largest proposal, which would be installed between Cardiff and Weston, would cost at least £34.3bn to build. According to government researchers, a lack of demand among private investors means the public purse would be responsible for nearly the entire cost of installation.

There’s a lack of interest because the risks associated with bore power have yet to be financially mitigated. Strong ocean storms and salt-water corrosion can all but destroy the devices, instantly wiping out billions of pounds of investment. Considering the largest proposed system in the Severn would be able to produce just 4.6TWh per year, such an investment simply isn’t worth the risk of further costs.

Two new nuclear reactors proposed in Somerset will cost just £8bn each, and will provide enough power to supply seven percent of UK homes for the next 60 years. Proposed bore projects, on the other hand, would take up to a decade to reach full capacity, and could actually hinder Europe in reaching its mid-term carbon reduction targets.
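
A crude way to compare the two options is capital cost per unit of annual output. The barrage figures come from the article; the nuclear output below is our assumption (two roughly 1.6GW reactors at a 90 percent capacity factor), and the comparison ignores fuel, lifetime and decommissioning entirely:

```python
# Crude capital-cost comparison per unit of annual output. Barrage cost
# and output are from the article; the nuclear output is an assumption
# (two ~1.6GW reactors at a 90 percent capacity factor).
barrage_cost_bn = 34.3
barrage_twh_per_year = 4.6

nuclear_cost_bn = 2 * 8.0                           # "just £8bn each"
nuclear_twh_per_year = 2 * 1.6 * 8760 * 0.9 / 1000  # GW x hours x CF

print(f"Barrage: £{barrage_cost_bn / barrage_twh_per_year:.1f}bn per annual TWh")
print(f"Nuclear: £{nuclear_cost_bn / nuclear_twh_per_year:.1f}bn per annual TWh")
```

On those rough terms the barrage is more than ten times as capital-intensive per unit of yearly output, which goes some way to explaining investors’ reluctance.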

Ecological ramifications
Aside from cost-effectiveness, numerous tidal bore energy proposals have come under fire due to unanswered questions surrounding the potential ecological ramifications. While such schemes do create calmer waters and provide protection against rising sea levels, they fundamentally alter the habitats of the area’s sea life. Ecologists have warned that any one of the Severn’s 10 proposed schemes could chase away up to 30 species of birds – and would also lead to local extinctions of certain fish. Until such a time as those biological risks can be mitigated, government officials have ruled tidal bore development in the area is more trouble than it’s worth.

That doesn’t mean developments aren’t being pursued more successfully elsewhere. Experimental harvesting technologies being pioneered across Scotland, Northern Ireland and Canada are already showing that investment in smaller-scale tidal projects can be more viable. Yet, for now, the vast majority of officials remain hesitant to support high investment in a method of energy production that is still in its infancy.

How it flows: a tidal barrage in action

Hundreds of lives transformed by SynCardia’s artificial heart

Cardiovascular disease, which includes heart failure, is the leading cause of death in the developed world, claiming more lives than all cancers combined. For years, heart transplants have been used to save patients dying from end-stage heart failure, in which the heart no longer pumps enough blood to sustain the body.

1,374

Longest (in days) a patient has used the Total Artificial Heart

3,400

Patients in the EU waiting for a donor heart in 2012

2,004

Heart transplants conducted in the EU in 2012

The need for donor hearts is growing as heart disease continues to claim an ever-greater percentage of the world’s population. However, the number of donor hearts available for transplant has been flat in some countries and is declining in others.

Among EU countries, 3,400 patients were on waiting lists for a donor heart in 2012. According to the European Commission’s Department of Health and Consumers, only 2,004 transplants were conducted that year. According to the US Department of Health and Human Services, about 3,800 people wait for a donor heart transplant on any given day, while the supply of approximately 2,300 donor hearts annually has been flat in the US for more than 20 years.

Total Artificial Heart: a life-saving bridge
The SynCardia temporary Total Artificial Heart is the only FDA, Health Canada and CE approved total artificial heart in the world. It has been implanted more than 1,350 times and has provided more than 400 years of support for patients in Europe, Canada and the US.

According to data from the 10-year pivotal clinical study which led to FDA approval, published in the August 2004 New England Journal of Medicine, 79 percent of patients who received the Total Artificial Heart were bridged to transplant. This is the highest bridge to transplant rate for any approved heart device in the world.

Like a heart transplant, the SynCardia Total Artificial Heart replaces both failing heart ventricles and the four heart valves. It is able to provide blood flow of up to 9.5 litres per minute through each ventricle – more blood than an average human heart can pump. The SynCardia Heart is the only approved medical device that eliminates the source of end-stage biventricular heart failure. Soon after implant surgery, doctors and family members often see patients turn from a sickly grey to a healthier pink as blood flow is restored to their body and vital organs.

Returning freedom following cardiovascular disease
Because of its elegant design, the Total Artificial Heart doesn’t require sensors, motors or electronics of any type inside the body. All of these “wear components” are located safely outside the body in the hospital driver and the Freedom portable driver. There is never a need to re-operate to repair or replace faulty electronics.

As soon as a patient becomes clinically stable, they can be switched to the Freedom portable driver. The Freedom can be worn in a backpack, carried in a shoulder bag or pulled on a rolling cart. Patients with the Freedom portable driver can be discharged from hospital to live at home and in their communities while they wait for a matching donor heart.

Freedom driver patients exercise, eat at home, socialise, enjoy recreation and even go back to work. All these activities help the patient get stronger and healthier, which helps lead to better outcomes when the patient receives a matching donor heart transplant. Hospital discharge can also save patients, hospitals and insurance companies thousands of dollars by eliminating most in-hospital costs for this portion of patient care.

Heart transplants graph (source: International Society for Heart and Lung Transplantation)

The Freedom portable driver empowers SynCardia Total Artificial Heart patients to get on with living their lives while they wait for their heart transplants: Christopher Larsen boxes to stay in shape; Randy Shepherd wore the Freedom driver in his backpack while he walked a 4.2-mile course in an annual running/walking event that honours a fallen US soldier; at 16 years old, Nalexia Henderson became the youngest SynCardia Total Artificial Heart patient to be discharged from the hospital on the Freedom portable driver; Pietro Zorzetto, the man who has lived the longest (1,374 days) on the SynCardia Total Artificial Heart, spent all but two months of that time discharged from hospital.

More than 175 patients worldwide have used the Freedom driver, totalling more than 100 patient years of support and allowing practically unlimited mobility. SynCardia Systems is seeking approval from the FDA for a study into using the SynCardia Total Artificial Heart as a permanent heart replacement, known as destination therapy. The company has also applied for FDA approval of a study into using a smaller 50cc version of the Total Artificial Heart. The smaller heart is designed to fit all women, smaller men and many adolescents.

For further information tel: +1-520-955-0660, or visit syncardia.com

New study reveals the truth about economics and wellbeing

According to a new report titled How Was Life? Global Well-being since 1820, countries across the globe – with the exception of sub-Saharan Africa – have become more equal to one another in terms of wellbeing than in terms of per capita GDP, particularly in recent years.

The study uses historical data to examine the degree to which people’s wellbeing is linked to the economic circumstances they face. It uses metrics such as income, per capita GDP, educational attainment, life expectancy, personal security, political institutions, environmental quality, gender inequality and even height (used as a measure of health and nutrition).

It revealed that manual workers’ overall income, when adjusted for inflation, has increased eight times globally since 1820, while per capita GDP has risen 10 times over the same period. Income inequality fell from the end of the 19th century until 1970, when it began to rise again, with globalisation being a large contributor to higher income inequality within countries while simultaneously helping to reduce the income gap between nations, the study says.
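
Those multipliers sound dramatic but translate into modest annual rates. A quick calculation, assuming the report’s series runs from 1820 to roughly 2010 (the endpoint is our assumption):

```python
# Compound annual growth rates implied by the multipliers above.
# We assume the series runs from 1820 to roughly 2010.
years = 2010 - 1820

for label, multiple in [("Real wages", 8), ("GDP per capita", 10)]:
    cagr = multiple ** (1 / years) - 1
    print(f"{label}: x{multiple} over {years} years = {cagr:.2%} per year")
```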

In 1820 less than 20 percent of the world’s population was literate. Literacy rates increased dramatically after 1945, reaching around 80 percent by 2000.

Worldwide life expectancy has risen from less than 30 years in 1880 to nearly 70 at the turn of the 21st century. Due to advances in healthcare, life expectancy has improved even where GDP per capita has remained stagnant.

The study found little correlation between how wealthy a nation is and the personal security that it affords its citizens, as measured by homicide rates and exposure to conflict, with murder rates in the US being relatively high throughout the past 200 years, while in large parts of Asia the rates are comparatively low.

How Star Wars changed the special effects industry

A long time ago, in a galaxy far, far away, a young filmmaker called George Lucas changed the way film was made. The year was 1977; the faraway galaxy in question was an inhospitable, barren land known as Hollywood. The young Lucas had a very clear vision and aesthetic in mind when he wrote his space fantasy magnum opus, but the technology available was still woefully inadequate to produce such challenging visuals. Lucas had created – in his storyboards at least – a rich environment filled with weird and wonderful creatures, space battles and hunk-of-junk spaceships leaping through hyperspace. But at a time when computer-generated imagery (CGI) was still in its infancy (having debuted in Richard T Heffron’s Futureworld only a year earlier), it quickly became clear Lucas was going to have to build his universe by hand.

In our universe, almost four decades later, that young Jedi’s creation remains a big deal. Without going into the merits of plot, character and storylines – a debate almost as controversial as arguments over whether Han Solo or Greedo shot first – the way Star Wars was made set a high bar for the future of special effects. The original trilogy (Episodes IV, V and VI) produced between 1977 and 1983 set the gold standard for visual effects. John Dykstra, the special effects designer who worked on the original Star Wars and was chosen by Lucas to head up the director’s visual effects company, Industrial Light & Magic, has described sitting with a group of friends and building models and robots from scratch in order to achieve the realism Lucas was after.

“Back in the days of Star Wars, we kind of walked into an empty warehouse and sat on the floor and went ‘How are we going to do this?’” he told Den of Geek. “The producers, Gary Kurtz of Fox and George Lucas, took an incredible risk by listening to what myself and my collaborators had to say with regard to how to do this, because we were inventing this stuff from scratch. We have a basic toolkit now, in the form of a computer, and so much of it now is programming.”

UK tax rebates

25%

For films spending £20m or less

20%

For films spending over £20m

71%

Growth of UK film production due to tax incentives

1,050

Films claimed rebates between January 2007 and May 2013

£1.1bn

Total tax relief claimed in that time

£22m

Rebate claimed for Thor: The Dark World (believed to be the largest yet)
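
For illustration, here is a toy version of the headline rates in the box above (the real UK Film Tax Relief rules are more nuanced than a single flat rate; this sketch simply applies the quoted rate to total qualifying spend):

```python
# Toy illustration of the headline rates quoted in the box above.
def film_rebate_gbp_m(qualifying_spend_m: float) -> float:
    """Rebate in GBP millions for a given qualifying spend."""
    rate = 0.25 if qualifying_spend_m <= 20 else 0.20
    return qualifying_spend_m * rate

for spend_m in (15, 20, 110):
    print(f"Spend £{spend_m}m -> rebate £{film_rebate_gbp_m(spend_m):.1f}m")
# A £110m production would claim £22m at the 20 percent rate, matching
# the Thor: The Dark World figure quoted above.
```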

Dykstra and Lucas used extremely detailed miniatures, animation and a pioneering system of computer-controlled motion photography to create special effects (SFX) that still look fresh today. Star Wars was the first big-budget blockbuster to rely on realistic action scenes and explosions, and essentially invented the techniques to achieve this. But a mere two decades after Dykstra destroyed the Death Star in A New Hope using nothing but a cardboard box and titanium shavings, Lucas turned his back on animatronics and practical effects in favour of expanding his Star Wars universe digitally in the prequel Episodes I, II, and III, released in the late 1990s and early 2000s.

I’ve got a bad feeling about this
For all the grace and balance of Lucas’s first trilogy, in which he and Dykstra expertly combined pioneering miniatures, motion-control photography and animatronic special effects, his last three Star Wars films incited the ire of fans – despite being just as groundbreaking. Star Wars Episode I: The Phantom Menace, released in 1999, featured some of the same elements as A New Hope had 22 years before: the planet Tatooine, and the famous gliding landspeeder driven by Luke Skywalker in the original. The difference was that, in the more recent films, the effects were created digitally and added in post-production by a talented team of VFX compositors. Across those two trilogies, Lucas revolutionised the industry twice with his approach to visual effects: once when he virtually invented the tools he needed to make his universe come to life, and again when he pushed digital effects further than ever before by building fully rendered characters, cities and spaceships out of computer animation. Now, as Disney prepares to launch the first Star Wars film free from Lucas’ control, the series is set to continue innovating as it turns away from flashy VFX and opts for a more analogue approach to filmmaking, true to its roots.

When Industrial Light & Magic pioneered the process of building whole environments digitally, and then inserting them into the original shots, it opened up a range of opportunities for art directors and filmmakers. Suddenly they were no longer constrained by the impracticalities of having to build every aspect of a scene from scratch, and could bend or break the laws of physics in order to make shots more powerful. But there was a weightlessness to the images created, which cinemagoers found hard to connect with. Jar Jar Binks – one of the key characters in Lucas’s prequel trilogy – is the finest example of this; the character was both annoying and at odds with his environment, and was widely ridiculed by audiences. However, just a few years later, The Lord of the Rings films would introduce audiences to Gollum, and he would become one of the most popular characters of the franchise – despite being created almost entirely through CGI.

Visually, the landscape, cities and characters of the Star Wars prequels were dazzling, but there were issues with how the actors contrasted with the CGI environment, detracting from the films’ stunning visual fabric. Inadvertently, as Lucas was pioneering a new way of making films, and a set of techniques that have become increasingly popular over the last decade and a half, he was both creating a new niche in the visual effects market and making audiences acutely aware of the limitations of the medium at the same time.

“When they introduced the Jar Jar Binks character, they were confident they could completely satisfy the audience with a full CGI character, whereas we have since learned that was not the case,” explains Owen Jackson, a London-based VFX compositor. “Directors today are certain they can tell a story with a full CG character: take Avatar for instance, which was carried by a series of CGI characters. Though that was only in 2009, and it was a tremendous success, audiences of today would already note the difference in the quality of the CGI available in 2014.”

A bright centre to the universe
In the years since the Star Wars prequel trilogy was launched, CGI technology has come on in leaps and bounds, with London emerging as the capital of the special visual effects (VFX) sector. It is a game-changing niche as far as making films is concerned: VFX, even more than conventional special effects and animatronics did in the 1970s, has opened up new worlds and galaxies to be explored in film. It has also helped cement science fiction as a credible genre, rather than a market for teenagers. Take 2014’s Oscar- and BAFTA-winning picture Gravity, which is primarily set in orbit: it is utterly realistic and has, perhaps more than any film since Avatar, helped dispel audiences’ fear of CGI-heavy pictures.

VFX has become a rich industry, innovative both in terms of artistic creativity and cutting-edge technology. In the mid-2000s it was JK Rowling’s contractual agreement with Warner Bros that the Harry Potter movie franchise be primarily produced in the UK with British cast and crew that boosted the sector. This surge of activity by industry-leading companies led the UK Government to secure a slew of tax credits in order to incentivise the growth of the sector and compete on a par with other leading industry capitals. According to a recent study by Oxford Economics, film production in the UK would be up to 71 percent smaller if it were not for the government’s tax incentives.

“London and the UK is experiencing an extraordinary amount of film, TV and animation activity as a result of tax credits,” Adrian Wootton, Chief Executive of Film London and the BFI told the Financial Times. “There’s been a huge amount of R&D and expansion. We now have the biggest concentration of VFX in the world, with six of the eight top companies in this sector in London.”

The planet Tatooine appeared in multiple episodes. CGI enhanced the physical sets in later entries

Do or do not
As a result of this unexpected shift to Europe, the production of blockbusters and high budget showstoppers has been fragmented. Where once a film would have been commissioned, shot, cut and launched from a base in Hollywood, today tax incentives and the emergence of tech hubs such as London mean the process of making a film is no longer centralised. This means films can be cheaper to make, but creatively they are much more complicated products. The advent of modern CGI technology means changes can be made to the plot and art direction of a film up to the last seconds of post-production: a process that would have been extremely costly and time-consuming before digital images invaded the shots.

“Directors have gotten used to the ability to continue to make changes throughout production rather than going in with a clear direction,” explains Jackson. This has led to a slew of CGI-heavy but incoherent features that have cost production companies billions of dollars. Take Disney’s 2012 flop John Carter: an epic reworking of a 1917 adventure novel set on Mars, the film left Disney $200m out of pocket.

The film mostly consisted of epic battles with computer-generated beasts, but lacked a discernible plot or emotional focus for the audience to connect with. A CGI environment that is too extensive can leave audiences feeling alienated from the story and the characters, with nothing recognisable to identify with. It is no coincidence Avatar – the highest-grossing film of all time – expertly blends live-action sequences with those created digitally.

Given the damage done to the reputation of CGI by the likes of John Carter and Jar Jar Binks, it wasn’t coincidental that JJ Abrams, the man tasked by Disney to helm the upcoming Star Wars Episode VII, gleefully allowed images of a full-scale Millennium Falcon model and a fleet of X-wing fighter craft to circulate online. Abrams is drawing a line under Jar Jar and embracing the old school sets, props and animatronics that made the original trilogy so popular with fans and critics.

By combining analogue techniques with CGI effects, directors such as Abrams are looking to make up for some of the inevitable shortcomings of both technologies. However, some in the industry believe that, despite Abrams’ high-profile return to analogue, CGI will inevitably take over for good.

“When you watched Transformers, did it feel like the robot characters were real and did it look like they were alive in the same world as [the film’s star] Shia LaBeouf?” wrote Alex Billington, a film critic and industry commentator on firstshowing.net. “When you watched The Dark Knight, did you ever notice one second of CGI, even in Harvey Dent’s burned face? If Hellboy II is out of the question in the argument for practical effects, the next great example is Iron Man, where the late Stan Winston’s armour was completely hand-built and used while shooting. However, could you tell when his armour changed from practical to CGI? I know I never knew the difference and that’s a testament more to the CGI artists than anyone else. That’s not to say that Winston’s work wasn’t amazing, as it certainly was what gave the CG artists a design to work off of, but it shows that the use of CGI is not a bad thing by any means.”

A technician shows some of the design work behind key scenes from the hugely successful film Gravity in the London offices of a leading visual effects company

But there is still a long way to go for the industry, especially when it comes to human or humanoid CGI characters. “Creating a fully digital CGI human face is the holy grail of visual effects, and we are nowhere near that yet,” says Jackson. “That is why studios like Pixar have very stylised human faces, because they cannot achieve a realistic enough final product. The realism that you naturally get from an animatronics face is more believable because it’s an analogue photographic technique that is easier for audiences to believe. People are so used to looking at human faces that they can pick up on even the most minute discrepancies and nuances of the animation: if anything is off, audiences will spot it even if they cannot put their finger on why it looks wrong.”

It is not a coincidence then that the most successful CGI characters – such as Gollum in The Lord of the Rings trilogy and the Na’vi people in Avatar – were created by blending real actors with many layers of CGI images, rather than by animating a character from scratch. Andy Serkis, who played Gollum, has helped pioneer these types of performance capture roles, which have allowed CGI characters to become more realistic.

The fact of the matter is that directors no longer have to choose between traditional special effects and animatronic models on the one hand and going fully digital – filming sequences against a green screen and building the shots later – on the other, although many still do. The most successful productions combine the two disciplines. This world is not one of light and dark; like the mystical Force in Lucas’ sci-fi epic, it is in need of balance.

Airports around the world compete to become more sustainable

In the US, it was Chicago’s O’Hare airport that started it all off. In 2003, a time when few of the nation’s big airports were paying much attention to sustainability, Chicago’s Department of Aviation embraced green aviation as a major priority, on the grounds that the future viability of the city’s airports would otherwise be threatened by quality-of-life lawsuits and other citizen-led litigation.

Ever since, sustainability has underpinned every aspect of operations at O’Hare and the city’s other airports. Today, the department’s sustainable airport manual is the bible for the nearly 20,000 airports operating in the US.

Although only the most green-minded passengers will see the evidence, all US airports are now obliged to improve their sustainable management of a panoply of issues. These range from the reduction of noise levels and improvements in air quality to the proper use of natural resources – an obligation that includes ‘light pollution’ – and the protection of threatened wildlife and vegetation. Visual pollution is another target: the aesthetics of any new construction are a paramount concern.

20,000 – the number of airports operating in the US

Blueprint for sustainability
Universally known as ‘SAM’, Chicago’s sustainable airport manual is the blueprint by which all kinds of operations are certified. These include maintenance procedures, management of buildings, compliance of airlines and their staff, use of kitchens, design and care of landscaping, parking, and hangars. Even minor changes in the way spaces in and around the airport are utilised must be ticked off. Nor does SAM apply only to ground-based activities: airlines using the airport are also assessed under a five-tier Green Airplane rating.

Now other airports are following O’Hare’s lead. “Mounting evidence about climate change brings with it an urgency for all of us to protect the environment for future generations,” argues Robert Nicholas, Airport Manager at busy Ithaca in the scenic Finger Lakes region of New York, who is overseeing an FAA-funded green project. “Through our sustainable master plan, I hope we can do our bit and perhaps set an example for other airports to follow.”

Much of the pressure for greener airports comes from local residents. All over the world, people living near airports, big and small, are forcing local authorities to get tough on their behalf – and this includes taking action against noise-polluting aircraft.

In Italy, the Piaggio Avanti EVO (the third generation of the aerospace group’s highly successful corporate turboprop) will be much quieter as a result of this pressure when the first models come off the assembly lines later this year. Now 98 percent owned by Abu Dhabi sovereign wealth fund Mubadala, Piaggio knew it had to produce an acoustically kinder aircraft. “We were not doing well on our noise levels”, says Giuliano Felten, Piaggio’s Global Head of Sales, of the outgoing model. As a result, the latest Avanti EVO will be 68 percent quieter, thanks to advanced engineering of the propellers and exhaust ducts.

One way airports meet the protests of residents in built-up areas is to impose curfews – and that’s especially true of heliports. Monaco, which has one of the highest rates of helicopter ownership per resident, imposes a curfew between 10pm and 7.30am at its heliport at Fontvieille alongside the Mediterranean (though insiders say the restrictions are broken by Prince Albert).

The only municipal heliport in Paris, Issy-les-Moulineaux near the Eiffel Tower, has a strict curfew between 9pm and 7am – and even that has been under fire from residents of the heavily populated 15th arrondissement, who, after years of negotiations, have managed to have the number of movements per year reduced to around 3,000 (that’s down from 36,000 a year, 20 years ago). At Issy-les-Moulineaux, penalties apply for light pollution: if the heliport has to switch on the lights in poor visibility or at night to facilitate a landing or take-off, it charges an extra fee.

Politicians and businesspeople have long been pleading for more heliports in the French capital – three are under discussion – but they are being fought all the way.

Terrific terminals
In 2006, Delta Air Lines set a new standard with its $350m Terminal A at Boston’s Logan International Airport. Developed with Massport, the local port authority, the building showed it was possible to create a vast structure that did a minimum of environmental damage, and its lessons are being copied elsewhere. Terminal A was the first in the US to earn LEED (Leadership in Energy and Environmental Design) certification from the Green Building Council. Until its construction, the vast impervious surfaces of runways and parking aprons, car parks and roofs bigger than football fields had triggered environmentally hostile effects such as ‘heat islands’ and torrents of stormwater run-off. Terminal A got around these challenges with a heat-reflecting roof and windows, ‘low-flow’ taps, waterless urinals, self-dimming lights and stormwater filtration.

Montreal’s Trudeau International could be a step ahead of even Terminal A. Managed by Aeroports de Montreal, the facility began to take a close interest in sustainability more than a decade ago and has since come up with several ground-breaking technologies. Underground parking, for instance, is heated by low-temperature hot water. Automated, sensor-triggered blinds in the airport’s jetties respond to natural light, saving on heating and air-conditioning. And the thermal plant uses an electric boiler that is 70 percent more efficient than previous models and significantly cuts greenhouse gas emissions.

The main driver of sustainable aviation on the ground is Airports Going Green. A global body, it encompasses airports and their consultants across North America, Europe, Asia and the Middle East – and its awards are highly prized. Last year’s winners included Phoenix Sky Harbor International for its PHX Sky Train, an automated, electrically powered transport system that earned a LEED gold rating for cutting the airport’s greenhouse gas emissions by nearly 6,000 tons a year.

The hottest topic in airport sustainability – fittingly – is CO2 emissions. In the North Sea region, another group, known as Green Sustainable Airports, has been aggressively monitoring emissions at small and medium-sized airports in six countries, including the Netherlands. “CO2 monitoring is a substantial element in the overall sustainable strategy”, the group says.

The project was launched by Groningen Airport Eelde, an historic facility in the north-eastern Netherlands that sees about a quarter of a million passengers a year. The airport authorities have been through the environmental mill, and their experience reflects that of hundreds of similar facilities around the world. In April 2000, the Dutch government approved a runway extension (from 1,800m to 2,500m), in part because low-cost carriers wanted to land there. But a wave of environmental legislation – plus civil litigation – embroiled the project for the next 12 years, during which time low-cost airlines came and went: Ryanair, for example, cited the short runway as a reason for abandoning its route. The extension was not started until 2012. Although more airlines now use the airport, Groningen is still working to make the facility more environmentally acceptable.

Wildlife woes
In the UK, airport sustainability has run headlong into the wildlife lobby. Although most countries prefer to sanction the construction of airports along coastlines, where the noise is less likely to disturb residents, this is anathema to a powerful political lobby, the Royal Society for the Protection of Birds (RSPB). For years, the RSPB battled plans to extend a runway at Lydd Airport in Kent on the grounds that the facility was surrounded by protected habitats. At Lancashire’s Warton Aerodrome, the bird lovers even fought a cull of gulls that, it was feared, could be sucked into engines and endanger passengers. The RSPB opposed the cull despite the plans complying fully with the EU Habitats Directive.

Although the airports eventually won those battles, the scars left their mark on sustainability assessments. The main reason given for the rejection in September of bold, long-term plans to construct a giant new airport in the Thames Estuary near London, which would have taken pressure off overused Heathrow, was that the project posed environmental threats, especially to a teeming population of wading birds.

Right or wrong, the decision illustrates how quality of life has become a decisive factor in making airports work for the community at large.

Dreams nobody wanted: when massive infrastructure projects go wrong

Imagining what the world will look like in the future is a fool’s game. Many of the most carefully laid plans have been rendered obsolete by sudden innovations that nobody saw coming, so deciding where to invest vast sums of public money for future generations is a staggeringly hard task. Nowhere is the problem more apparent than in major infrastructure projects that are supposed to serve many generations.

While many large-scale projects have stood the test of time and greatly enhanced the lives of millions of people – from the works of Britain’s pioneering Victorian engineers to the grand projects that came out of Franklin D Roosevelt’s New Deal – many others either never got built or were deemed obsolete shortly after completion. Yet others have continued to operate, but could hardly be called successful once value for money is taken into account.

Throughout history there have been grand infrastructure projects enthusiastically touted by governments or private backers as ways by which countries can be transformed

Not just pipe dreams
As governments slowly emerge from the economic catastrophes of recent years, many are banking on large-scale infrastructure schemes as a way of boosting employment and modernising their economies. However, they face plenty of problems in getting these schemes off the ground. The projects are staggeringly expensive to pay for, and the disruption they cause often proves hugely unpopular with people who don’t want to live near a building site. The long-term planning needed is frequently dropped when politicians start worrying about getting re-elected: the projects get kicked into the long grass for a later generation to deal with.

In a rare sign of selflessness and foresight, Mayor of London Boris Johnson recently announced his London Infrastructure Plan 2050, laying out plans for vast new road networks, rail and Tube lines, airports, housing, energy and water supplies, and internet networks. Johnson said long-term planning was essential if the needs of future generations were to be met: “This plan is a real wake-up call to the stark needs that face London over the next half century. Infrastructure underpins everything we do and we all use it every day. Without a long-term plan for investment and the political will to implement it, this city will falter. Londoners need to know they will get the homes, water, energy, schools, transport, digital connectivity and better quality of life that they expect.”

While such forward thinking is admirable from a politician, there must always be a level of caution when approaching large-scale infrastructure projects. Throughout history there have been grand infrastructure projects enthusiastically touted by governments or private backers as ways by which countries can be transformed. Shiny new roads, dizzyingly fast rail networks, and impressively vast new cities have all been trumpeted as essential for the growth of a country’s economy. But sometimes these projects come undone due to political bickering, poor planning and bad budgeting, resulting in white elephants and massive bills to pay. We have taken a look at a few projects that never quite worked out, as well as some that look to be going that way…

London Ringways

Londoners frequently bemoan the outdated transport network they have to use on a daily basis, forgetting that the city once boasted an infrastructure that was the envy of the world. Years of underinvestment have left many of the roads and rail networks woefully inadequate for the city’s ever-rising population, while politicians have avoided making big decisions about expensive transport schemes in order to placate the masses of voters who would be upset by the upheaval.

One scheme that could have cured the UK capital’s chronic traffic problems was the London Ringways. Proposed in the 1960s, the four ring roads would have circled central London with motorways at various distances from the centre. Designed to alleviate congestion by offering a high-speed route around the city, the scheme was popular with motorists but deeply hated by those unfortunate enough to live near its path.

Heavy opposition was organised by the pressure group Homes Before Roads, which pointed out that not only would the roads blight the environment and residents’ quality of life, they would also be wildly expensive. Ringway One – the most central of the routes – was estimated in 1970 to cost around £1.7bn, roughly £22.9bn in today’s money.

In the end, the project was scrapped in 1973, but not before parts of it had been constructed. The North Circular route was built at great expense, ultimately blending into existing roads once it was decided not to continue the scheme. Similarly, what is now known as the M25 was formerly part of Ringway Four. The only section of Ringway One to be built is the raised Westway dual carriageway.

Eurotunnel

While many in Britain and France might claim they want a greater gap between their two countries, the idea of creating a direct link to boost trade and access had been proposed for nearly 200 years before it eventually became a reality (it was first proposed in 1803 and the earliest geological surveys were carried out in the 1830s).

In 1985, a consortium of British and French banks and construction companies – known as Eurotunnel – came together and successfully lobbied their two governments to build a rail link underneath the English Channel.

Based upon the 1979 ‘Mousehole Project’, later taken up by UK Prime Minister Margaret Thatcher and French President Francois Mitterrand, the Channel Tunnel scheme would create a 31.4-mile rail tunnel from Folkestone in southern England to Calais in northern France. Construction started in 1988 and the tunnel eventually opened in 1994, after mammoth boring machines had drilled beneath the seabed from both coasts. Described by some as one of the greatest engineering achievements in modern history, the project has also been lambasted as vastly overpriced.

While the Eurotunnel group had sourced private funding for the construction of the tunnel, it had done so through public share offerings, which placed huge pressure on the company to deliver. When the cost of the project spiralled to £4.65bn – thanks in large part to tougher safety and security demands than expected – it represented a cost overrun of 80 percent. This in turn forced Eurotunnel into a series of debt-restructuring deals throughout the 1990s, with the company losing a record £925m in 1995. A fire in the tunnel in 1996 caused even more trouble for the group. The company eventually managed to stabilise the running of the tunnel and, by 1999, finally posted a modest profit.

LAPSSET Corridor

Africa’s need for an extensive and modern infrastructure network has led to a number of grand pronouncements from governments across the continent. While many of these projects seem fanciful, some huge schemes are already under construction. Perhaps the largest of them is the Lamu Port-South Sudan-Ethiopia Transport (LAPSSET) Corridor, which is set to drastically transform the economy of Kenya and its neighbours.

The transport and infrastructure corridor will stretch from the Kenyan border with South Sudan in the northwest of the country to the south-eastern island of Lamu. Announced in 2012, the project will encompass a brand new port at Manda Bay in Lamu, a standard-gauge railway line to Juba in South Sudan and Addis Ababa in Ethiopia, a road network, oil pipelines, an oil refinery, and three airports. It will also include three new resort cities, at Lamu, Isiolo and Lake Turkana.

However, while all this seems like the sort of transformative scheme the region desperately needs, it has been beset by problems. An initial budget of $16bn has jumped to around $30bn, and the completion date remains unclear: it’s hoped the bulk of the project will be constructed by the end of the decade.

There have also been a number of disputes over the project’s impact on the environment, with concerns raised about the effect on marine life, forests and Lamu’s Old Town, which is listed as a UNESCO World Heritage Site. South Sudan’s government has complained about the slow pace at which Kenya is constructing the project, while political conflicts and violence in the region have severely hampered efforts to get the scheme off the ground.

Sydney Opera House

One of the most recognisable buildings in the world, the Sydney Opera House is the pinnacle of Australia’s cultural achievements. Overlooking Sydney Harbour, the building acts as a central hub for the city’s arts, with four key resident companies situated within it: the Sydney Theatre Company, the Sydney Symphony Orchestra, the Australian Ballet and Opera Australia. Danish architect Jørn Utzon was selected as the designer of the stunning arts centre back in 1957, but the building wasn’t completed until 1973 – 10 years later than originally planned.

While the building has proven hugely popular with tourists and locals, it was steeped in controversy while under construction. Scheduling overruns, spiralling costs, and a scaling down of the design all led to the architect resigning from the project before it was completed, while the government that approved it was severely criticised for claiming it would cost just AUD6.5m to construct. By the time it was finally completed, the Sydney Opera House had cost the taxpayer AUD94m – an overrun of 1,400 percent, which is a record for an infrastructure project.

Utzon’s resignation came as a result of political bickering over the cost and design, as well as the government’s reluctance to pay for the building. While in the aftermath politicians and the press presented Utzon as a fantasist unconcerned with the realities of such a major project, he has more recently been praised for his work. In 2002, the building underwent an AUD42m refurbishment that brought it more into line with Utzon’s original design, and in 2004 the Utzon Room was opened inside as a tribute to the designer.

California High-Speed Rail

America’s lust for the road means it hasn’t adopted high-speed rail with the enthusiasm seen across Europe and Asia, though rapid rail transport has certainly had plenty of advocates. One of the highest-profile – and most controversial – schemes is the California High-Speed Rail project.

First proposed by charismatic Governor of California Jerry Brown in the early 1980s, the scheme has been backed, scrapped and relaunched countless times since. A link between Los Angeles and San Francisco was proposed in 1992, and a formal planning authority was established in 1996 to develop the project. Indecision meant the project kept being pushed back until voters approved it in 2008, at a projected cost of $33bn.

The approval came at a time when President Obama was trumpeting the need for New Deal-style infrastructure projects to boost the US economy, but the price of the link quickly rocketed. Estimates in late 2011 put the total cost of the scheme at an eye-watering $65.4bn (although that was later revised down to around $56bn).

The scheme has faced considerable opposition from those alarmed by its soaring costs, while others think it will damage the environment. There are concerns that, by the time it is completed (supposedly in 2017), the trains will not run as fast as those on high-speed networks being built elsewhere, and that, with California sitting atop a major fault line, the link could be vulnerable to earthquakes.

Others are concerned the line may cost as much as $82bn by the time it is completed. At a time when state budgets in the US are particularly constrained, many question whether it is appropriate for the government of California to be spending such sums on a rail line.

The Ghost City of Ordos

Over the last decade, China has invested its colossal wealth in catching up with the rest of the developed world. Huge new cities have sprung up across the country, with high-speed rail, modern airports and towering skyscrapers all built to provide China’s 1.3 billion citizens with the infrastructure to spur on the country’s economy. However, there have been signs the government has got ahead of itself, investing vast sums in projects that aren’t needed and don’t get used.

The prime example of this is the city of Ordos, which sits in the deserts of Inner Mongolia. This region of northern China had traditionally been a remote outpost of the country, but now holds one of its most futuristic metropolises. Unfortunately, while Ordos was designed to cater for more than half a million people, just two percent of its buildings have been occupied. It has been described as China’s ‘Ghost City’ because of the vast disparity between the scale of its infrastructure and its lack of citizens.

The planning began in 2003, when a group of property developers devised a large new urban district on the outskirts of the existing city of Ordos, with the aim of transforming it into a thriving commercial hub. However, building the city proved far more difficult than predicted, and continued construction delays and massive overspends resulted in few people choosing to live there. Many of those who did reportedly abandoned their expensive apartments, and the city now sits largely neglected, littered with empty and incomplete buildings.

A huge amount of government money was pumped into building Ordos, yet its office blocks, shopping centres, lavish monuments and parks all sit unused by the few people who actually live there.