Google buys stake in Bloomberg rival

Google, now trading under its new name, Alphabet, has bought a stake in Symphony Communication Services – bringing the value of the messaging company up to an estimated $650m.

Google joins a host of formidable co-investors, including Goldman Sachs, Morgan Stanley, Credit Suisse and Bank of America, which have teamed up to dislodge Bloomberg from its dominant position in Wall Street messaging.

Founded as Perzo in 2012 and renamed last year, Symphony officially launched its services on September 15. The cloud-based messaging company was created as an alternative to the trading floor terminals provided by Bloomberg and Thomson Reuters, which can cost tens of thousands of dollars every year. Symphony’s services, by contrast, are available for just $15 per month per user, a far cry from Bloomberg’s monthly fee of $1,850 per user. Furthermore, businesses with fewer than 50 users can use Symphony’s centralised communications platform for free.

In another move to rival Bloomberg and Reuters, Symphony signed a deal with Dow Jones in September to provide its news content. As such, news stories from Dow Jones’s flagship news publication, The Wall Street Journal, and Dow Jones Newswires will appear on the messaging platform.

A key selling point for Symphony’s service is its unique end-to-end encryption technology, which offers protection of sensitive material and a highly secure communication channel – an invaluable asset on Wall Street. As well as enhancing cybersecurity, the service also promises to simplify the workflow for financial institutions, while meeting compliance and regulatory obligations.

Symphony’s latest round of funding is expected to close this week.

Male – not female – jobs will be taken by robots

Since the early days of the Industrial Revolution, technological advance has eliminated certain forms of work at breakneck speed. If public interest is anything to go by, however, automation has entered a new, accelerated stage. Books such as Nicholas Carr’s The Glass Cage and Erik Brynjolfsson and Andrew McAfee’s The Second Machine Age, as well as TED Talks with titles like “Will Automation Lead to Economic Collapse?”, tap into the idea that humans are on the verge of being replaced. Various studies have projected how many jobs are at risk, but one aspect of this supposed auto-pocalypse that is less often addressed is how it will affect men and women differently.

Heavy gender biases still exist within lower-paid occupations: women typically dominate people-facing jobs such as nursing and customer service, while men dominate physically labour-intensive roles such as construction and trucking. Within our gendered division of labour, it is the latter that are likely to be eliminated through automation most quickly and most thoroughly. This raises the question of whether men, bereft of the jobs they typically would have held, will re-enter the workforce in roles previously deemed ‘for women’, and what implications that will have for gender roles in society.

Gendered division of labour
A recent article in The Atlantic noted, “women typically work in more chaotic, unstructured environments, where the ability to read people’s emotions and intentions are critical to success” while “many of the jobs held by men involve perception and manipulation, often in conjunction with physical exertion, such as swinging a hammer or trimming trees”. For instance, over 90 percent of nurses in the US are female, while over 90 percent of truck drivers are male. Women dominate in customer service roles, while men are still far more likely to work in heavy industry.

According to Stephanie Coontz, Professor of History and Family Studies at Evergreen State College, and Director of Research and Public Education at the Council on Contemporary Families, there exists “an association of women with care work and service work, stemmed in some part from the fact that such jobs were more compatible with multitasking and rearing children, but perhaps even more from a set of stereotypes that were actually very late arrivals on the historical scene”.

Gendered divisions of labour are nothing new; as Coontz pointed out, gender divisions stretch back to hunter-gatherer societies. “Every society in history has had a division of labour by gender, but the tasks assigned to each gender have been varied enough that it’s difficult to make a case that there is some invariant arrangement dictated by our natural desires”.

At one time, she explained, “men were intimately involved in domestic matters, community affairs, kin networks etc”. However, “as work moved out of the home”, starting in the 18th century, a “new set of stereotypes about men as providers and public figures” emerged, accompanied by the idea of “women as nurturers and private caregivers”. These social stereotypes were further reinforced by work patterns: “Women tended to go to work while single but then to retire upon marriage, so the new stereotypes combined with that pattern to create the sense that women were peripheral workers, most suited for jobs that were an extension of, or preparation for, their caregiving at home.”

Factory automation has been ongoing since the start of Britain’s Industrial Revolution, but other occupations, which required more detailed human perception, were often thought of as more resistant to such trends. As sensory technology improves, however, these roles are increasingly at risk. Robotic sensors can now make more precise cuts to trees, eliminating tree surgeons, while the pioneering work on driverless cars may make America’s long, lonely highways even lonelier, rendering truck drivers redundant.

A 2013 Oxford University study listed the occupations most likely to face automated elimination, with those near the top end tending to be stereotypically male, lower-paid jobs. For instance, cement masons and concrete finishers have a 94 percent chance of automation, while cargo and freight agents have a 99 percent likelihood of being replaced by machines. Truckers face a near 80 percent chance, with the study bluntly noting: “Truck driving will soon be automated”. A host of other jobs (such as machine setters, rock splitters, locomotive engineers and vending machine repairers) face around a 90 percent chance of automation. At the same time, roles such as nurses and various social work positions have a low likelihood of automation.

End of divisions
As male-dominated jobs disappear, the question arises of where men will find alternative forms of employment. Unless men who once held those jobs simply drop out of the workforce, the obvious answer is for them to re-enter the job market in search of roles traditionally thought of as feminine. Jobs dominated by women are not only persisting; they are also those projected for the highest growth in coming years.

According to Hanna Rosin in her somewhat sensationally titled book The End of Men, of the 30 job roles projected to grow at the fastest rates in the coming years, 20 are dominated by women, including nursing, accounting, child care and food preparation. “The list of working-class jobs predicted to grow is heavy on nurturing professions, in which women, ironically, seem to benefit from old stereotypes”, said Rosin.

Indeed, across America’s rust belt, once the heartland of heavy lifting and heavy industry, large university healthcare centres have become the mainspring of economic life. Johns Hopkins University and Hospital – the best medical school in America according to US News & World Report – is the largest private employer in Baltimore. Further up the East Coast, in New Haven, Connecticut (once a centre of heavy industry), Yale University (with its medical school) is the largest employer. The same is true in Cleveland, where University Hospitals and various other medical centres are now the largest employers, and in Indianapolis, where Indiana University Health was the second-largest employer in 2014; in Iowa City, University of Iowa Hospitals and Clinics was the largest employer in 2011.

Whether men successfully turn to ‘women’s work’ may depend on them overcoming their own prejudices. According to Coontz, men “are either going to have to learn that care work can be manly (nursing requires a lot of strength after all) or they will be marginalised. And they are learning, but they have been slower to adapt to what was traditionally defined as ‘women’s work’ than women have been to adapt to and enter what was traditionally defined as ‘men’s work’. The very things that privileged men in the past – the higher prestige of male occupations, the contempt [for] or dismissal of ‘women’s work’ – are now a major barrier to their adaptation to the new economy”.

Since the gains made by feminism in the 1960s and 1970s, Coontz said, “we have been helping women go beyond the limiting messages and options of ‘the feminine mystique’”. Society must now “offer the same kinds of opportunities, encouragement, social support and training for men, so that they can expand… beyond the constraints of the masculine mystique”.

Others, however, are not so optimistic. Even if men re-enter the workforce in this expanding economic sector, divisions of labour could still reappear.

Joyce Jacobsen, Provost and Vice President for Academic Affairs and Professor of Economics at Wesleyan University, argued that “there is still gender segregation in which caretaking occupations the men enter”. New job categories within sectors “can be created and thus continue the fundamental pattern of widespread occupational sex segregation”. For instance, “men may be orderlies but women [are] nurse assistants. Men do appear quite reluctant to enter female-stereotyped occupations”. She pointed out that gender segregation has remained fairly stable in most occupations, with the historical pattern being a switch from male to female (or vice versa) domination in a role, rather than integration.

Humans or robots
The economy “has become more amenable to women than to men”, wrote Rosin. Yet gender roles are not set in stone: they are, as the true-yet-clichéd phrase of the social sciences goes, “socially constructed”. They are constantly subject to change, depending on larger forces in society – the structure of the workforce being one such force. As automation and the closure of stereotypically male jobs become more widespread, a change in attitudes may follow. Men who grew up expecting to follow their fathers into the factory, only for the plant to close, may have ingrained perceptions of certain jobs not being for them. But boys who grow up with no expectation of obtaining such jobs may be more willing to enter the workforce in what are currently seen as feminine roles.

As more men enter these jobs, the gender stereotypes may gradually chip away. Divisions of labour will hopefully no longer be between men and women, but between robots and humans.

Neural networks shed light on autism

Right now, the scientific world is very much in the dark about autism. This is because the genetic and environmental factors that give rise to the disorder are extremely diverse. The behaviour exhibited by those suffering from it also varies considerably. Two individuals who sit at roughly the same place on the autistic spectrum still have very different personalities underneath and will often not display symptoms of the disorder in the same way.

This poses a unique obstacle for scientists to overcome and has meant that, for a long time, not only has their ability to accurately classify the various gradations of the condition been hindered significantly, but so too has their capacity to treat it. However, according to a recent study published in Proceedings of the National Academy of Sciences of the United States of America, the complexity of the disorder could actually provide the key to understanding it.

Ari Rosenberg, a computational neuroscientist at the Baylor College of Medicine and co-author of the study, asserted that, because autism is pervasive throughout the brain, rather than simply affecting individual systems such as affection or vision, the disorder may “broadly alter neural computation”. This means autism may be the result of an alternate algorithm that has somehow been implemented across multiple areas of the brain, disrupting communication between those various systems. In an attempt to identify and correct this corrupt neural algorithm, Rosenberg and his team turned to artificial neural networks (ANNs), which provide a computational perspective on autism.

Autism and ANNs
ANNs allow scientists to create models that mimic biological neural networks, including the central nervous systems of animals and, more specifically, their brains. These models help researchers understand how networks of interconnected neurones exchange messages with one another, and how disruptions to the network impact this process.

“Our understanding of the brain basis of autism has advanced significantly over the last 10 years and we are learning that there may be a wide array of neurological differences in autism”, explained James Cusack, Research Director of the UK-based charity Autistica. “By modelling neural networks using state of the art computational modelling, we may well learn about some of the differences in brain connectivity and how this could be adjusted with tailored interventions.”

In Rosenberg’s study, he and his team used ANNs to measure how alterations to conflict-ridden neural networks could give rise to symptoms exhibited by patients with autism. The team is also hoping their research can help resolve debates within the field and aid in identifying physiological pathways that can be targeted in the treatment of the disorder.

“There is a lot going on and it is very complex, and so we stepped back and looked at the level of complexity, both on the etiology of the disorder and the behavioural manifestations of the disorder”, said Rosenberg. “What we noticed was that there is a lot of behavioural data, biochemical data, and a lot of genetic data, but there wasn’t really much connecting them all. So what occurred to us was that we could look at this problem in terms of dysfunctions of neural circuits, because the brain is really just made up of a lot of circuits, which are designed to solve particular problems.”

Every day, people take in a massive amount of visual information (stimuli), which the brain processes in order to make sense of the world around them. This information is then converted into appropriate motor actions for interacting with the environment. From previous studies, the team knew that, early in life, individuals with autism process visual stimuli very differently to those without the disorder. Armed with this knowledge, the scientists from Baylor College set out to build a neural network model of the visual cortex that would allow them to understand how neurones in the early visual system break down the information entering the eyes. They then altered the connectivity in the network to see whether it gave rise to behavioural predictions that matched those seen in autism.

“The nice thing about having a neural network model is that you know the parameters that control how it behaves, and so it gives you knobs that you can turn to change the connectivity within the network, allowing you to see how behaviour changes as you dial in or dial back connectivity”, said Rosenberg. “This gives you very nice, clean predictions, so, if I increase the excitation or I decrease the amount of inhibition in the connections, it helps us understand how this changes the behaviour of the model. It gives us a fascinating insight into what might physiologically be going wrong in the human brain.”
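The “knobs” Rosenberg describes can be illustrated with a toy model. The sketch below is not the study’s actual code; it is a hypothetical example of divisive normalisation, a standard description of cortical computation, in which each unit’s response is divided by the pooled activity of the population. Scaling a single inhibition gain parameter, as with turning a knob, changes every unit’s response, illustrating how a network’s behaviour can be read off as connectivity is dialled up or back. All function names and parameter values here are illustrative.

```python
import numpy as np

def population_response(stimulus_orientation, inhibition_gain=1.0):
    """Responses of 12 orientation-tuned visual units under divisive
    normalisation. inhibition_gain is the 'knob' controlling how strongly
    the pooled activity of the population suppresses each unit."""
    preferred = np.linspace(0, 180, 12, endpoint=False)  # preferred orientations
    delta = np.abs(preferred - stimulus_orientation)
    delta = np.minimum(delta, 180 - delta)               # orientation is circular
    drive = np.exp(-(delta ** 2) / (2 * 20.0 ** 2))      # Gaussian excitatory drive
    sigma = 0.1                                          # semi-saturation constant
    # Each unit's drive is divided by the (gain-scaled) pooled population drive
    return drive / (sigma + inhibition_gain * drive.sum())

normal = population_response(90, inhibition_gain=1.0)
weakened = population_response(90, inhibition_gain=0.3)  # dial back inhibition
# With inhibition weakened, every unit is less suppressed, so responses grow
print(weakened.max() > normal.max())  # prints True
```

The point of such a model is exactly the one Rosenberg makes: because the experimenter controls the parameters, a change in behaviour can be traced back to a specific change in connectivity.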

Tailored treatment
The hope is that, once the team has gathered as much information as it can from ANN models, it will one day be able to create more targeted interventions for those diagnosed with autism. This would allow mental healthcare professionals to target the specific physiological pathways implicated by the neural network modelling. By identifying these pathways, individualised treatment plans can be crafted, combining drugs and behavioural therapies.

“Better classification could lead to better tailored support and diagnosis for people with autism”, said Cusack. “Adults with autism often discuss how challenging it is to obtain a diagnosis. Many families also felt that the ‘cupboard was bare’ in terms of evidence-based services once a diagnosis had been made.

“Our ‘one in 100’ report shows that people affected by autism feel that autism research needs to start to yield practical benefits. So it is a priority for the field to undertake autism research that can translate into clinical practice and ultimately benefit those affected by autism.”

The simple fact is that the more the scientific community knows about the specifics of any given individual’s manifestation of autism, the better the treatment that person will get. And while Rosenberg admits a cure is a long way off, ANN modelling could one day help find it.

“That’s really our hope. That is something that we are really pushing hard for”, he said. “Autism manifests itself differently depending on the individual, and so, hopefully, the neural network modelling will provide us with a way of identifying unique or at least distinct clusters of different types of autism.

“I don’t know at what point there really will be a cure, but if we are fortunate enough to find a way to cure autism, it will not be a single cure.”

We should worry about the developing world’s growing numbers

The United Nations’ latest population projections suggest that Japan’s population could fall from 127 million today to 83 million by 2100, with 35 percent of the population then over 65 years old. Europe and other developed economies are ageing as well, owing to low fertility rates and increasing longevity.

But those who warn that huge economic problems lie ahead for ageing rich countries are focused on the wrong issue. Population ageing in advanced economies is the manageable consequence of positive developments. By contrast, rapid population growth in many poorer countries still poses a severe threat to human welfare.

In 2008, the UN projected that the world’s population would reach 9.1 billion by 2050 and peak at about 10 billion by 2100. It now anticipates a population of 9.7 billion in 2050, and 11.2 billion – and still rising – by 2100, because fertility rates in several countries have fallen more slowly than expected (in some, notably Egypt and Algeria, fertility has actually risen since 2005). While the combined population of East and Southeast Asia, the Americas and Europe is projected to rise just 12 percent by 2050 and then start falling, sub-Saharan Africa’s population could rise from 960 million today to 2.1 billion by 2050 and almost four billion by 2100. North Africa’s population will likely double from today’s 220 million.

Such rapid population growth, on top of even faster increases over the last 50 years, is a major barrier to economic development. From 1950 to 2050, Uganda’s population will have increased 20-fold, and Niger’s 30-fold. Neither the industrialising countries of the 19th century nor the successful Asian catch-up economies of the late 20th century ever experienced anything close to such rates of population growth.

Such rates make it impossible to increase per capita capital stock or workforce skills fast enough to achieve economic catch-up, or to create jobs fast enough to prevent chronic underemployment. East Asia has gained a huge demographic dividend from rapid fertility declines: in much of Africa and the Middle East, the dividend is still missing.

In some countries, sheer population density also impedes growth. India’s population may stabilise within 50 years, but, with the number of people per square kilometre 2.5 times that of Western Europe and 11 times that of the contiguous United States, disputes over land acquisition for industrial development create serious barriers to economic growth. In much of Africa, density is not a problem, but in Rwanda competition for land, driven by high and rising density, was among the root causes of the 1994 genocide. By 2100, Uganda’s population density could be more than twice India’s current level.

Ageing in comfort
The demographic challenges facing advanced economies are slight in comparison. Greater longevity poses no threat to economic growth or pension-system sustainability as long as average retirement ages rise accordingly. Population stabilisation reduces pressure on environmental assets such as unspoiled countryside, which people value more as their incomes increase.

To be sure, rapid population decline would create difficulties. But if writers like Erik Brynjolfsson and Andrew McAfee are right that information technology will create new opportunities to automate jobs, gradual population decline could help offset falling demand for labour, which otherwise would generate unemployment and/or rising inequality.

On the other hand, increased automation could be a huge barrier to economic development for countries still facing rapid population growth. By making it possible to manufacture in almost workerless factories in advanced economies, automation could cut off the path of export-led growth that all of the successful East Asian economies pursued. The resulting high unemployment, particularly of young men, could foster political instability. The radical violence of ISIS has many roots, but the tripling of the population of North Africa and the Middle East over the last 50 years certainly is one of them.

Continued high unemployment throughout Africa and the Middle East, and political instability in many countries, may in turn make unrealistic the UN’s projection that Europe’s population will fall from 730 million today to 640 million by 2100. With Africa’s population likely to increase by more than three billion over the next 85 years, the European Union could be facing a wave of migration that makes current debates about accepting hundreds of thousands of asylum seekers seem irrelevant. The UN assumes net migration from Africa of just 34 million over the century – only one percent of the population increase. The actual figure could be many times that.

As a result, Europe’s population – unlike, say, that of East Asia or even the Americas – may well continue to rise throughout the century. This, some will say, will help “solve Europe’s ageing problem”. But, given that the ageing “problem” is overstated and solvable by other means, mass migration may instead undermine Europe’s ability to reap the benefits of a stable or gently falling population.

The gender issue
Both increased longevity and falling fertility rates are hugely positive developments for human welfare. Even in the highest-fertility countries, rates have fallen – from six or more children per woman in the 1960s to three or four today. The sooner fertility rates reach two or below, the better for humanity.

Achieving this goal does not require the unacceptable coerciveness of China’s one-child policy. It merely requires high levels of female education, the uninhibited supply of contraceptives, and freedom for women to make their own reproductive choices, unconstrained by the moral pressure of conservative religious authorities or of politicians operating under the delusion that rapid population growth will drive national economic success. Wherever these conditions prevail, and regardless of supposedly deep cultural differences – in Iran and Brazil as much as in Korea – fertility is now at or below replacement levels.

Sadly, this is not true in many other places. Ensuring that women are educated and free is by far the most important demographic challenge facing the world today. Worrying about the coming population decline in advanced countries is a meaningless diversion.

Adair Turner is Chairman of the Institute for New Economic Thinking

© Project Syndicate 2015

Five of the biggest names in fintech

Arguably no other area has had more of an impact on financial services than technology, and much has been said about the role of fintech firms in bringing the industry kicking and screaming into the digital age. More than a mere buzzword, ‘fintech’ encapsulates that which has reshaped financial services over the years, and key innovations have forced leading financial powers to overhaul their operating models and reconfigure their strategies.

Writing in the report The Rise of Fintech in Finance, Kantox’s CEO, Philippe Gelis, said: “Fintech is changing the finance sector just like the internet changed the written press and the music industries. In what is a stagnant sector monopolised by banks, finance is ripe for innovation and fintech is unquestionably the catalyst needed for change.”

Major banks and brokers no longer enjoy the monopoly over traditional revenue streams they once did, and the technology-first mind-set adopted by fintech firms has succeeded both in lowering costs and levelling the playing field. Having done much to advance the industry’s cause, fintech has captured the imagination of the investment community, and Accenture figures show investment in the sector last year was three times greater than the year before, at $12.2bn.

By plugging into the fast-evolving fintech ecosystem, major banks and insurers have been able to adapt more quickly to the changes sweeping financial services, and path-breaking technological advances mean many have been able to cut costs and boost compliance. Though it represents both a threat and an opportunity, the fintech revolution has done little to unseat the powers that be; its significance in bringing disruptive ideas to the table, however, can scarcely be ignored.

Here The New Economy takes a look at some of the biggest names in the business and their impact on financial services.

Stripe

Five years on from its formation, the San Francisco-based online payments firm Stripe announced a partnership with Visa, the world’s leading credit card company. In doing so, Stripe lifted its valuation to $5bn. Already in cahoots with Apple, Facebook and Twitter, the startup has made a habit of forming famous friendships, and has so far equipped each of its partners with the tools to manage and process online payments.

Stripe is confident its infrastructure is the surest way to accept payments online. Its stated ambition is to expand internet commerce by making online transactions more manageable.

The integration of social media and e-commerce has given those at the company reason to feel optimistic, and so too has Apple’s decision to work alongside it and a handful of partners for the launch of Apple Pay. The emergence of the ‘buy button’ on social media means Stripe is uniquely positioned to capitalise on the changing payments landscape, and its leading role in facilitating social media’s foray into e-commerce looks assured.

DueDil

The London-based startup pulls together data from thousands of sources so users can more easily contextualise and digest private company information. Short for ‘due diligence’, DueDil is little over four years old but already counts KPMG, Unilever and Dell among its users.

Created as a means of providing due diligence before a takeover deal goes through, the platform has since come to be seen as an effective means of finding new customers. According to the site: “DueDil Advanced Search enhances every step of the B2B lead generation process, because we believe we should be accountable for your numbers, just like someone on your team.”

A DueDil survey of 6,000 professionals found that finding new customers was the biggest ongoing challenge of 2015; the company’s Advanced Search function allows users to apply upwards of 40 filters to segment leads and find the right client.

DueDil’s ambition is to do for private companies what Bloomberg has done for public ones, and comparisons between the two are common. The similarities do not stop there, however; the appointment of Lex Fenwick, former CEO of Dow Jones and Bloomberg, to the board suggests DueDil will likely tread much the same path as his former employer.

Kabbage

The Atlanta-based startup this year posted a three-year growth rate of more than 6,700 percent and in doing so underlined its status as one of the fastest growing companies in the fintech sector. Kabbage “puts the power of business back in your hands by giving you instant access to funds”, according to its site.

Little over four years on from issuing its first loan, the company has handed out $1bn in capital and come to be seen as a leading provider of small business finance in both the UK and US. Kabbage has extended a welcome lifeline to a growing number of small businesses without the requisite revenue or legal structure to qualify for a traditional loan.

Small businesses can qualify for loans ranging from £1,000 to £40,000 simply by signing into their eBay and PayPal accounts, and allowing Kabbage to review and approve past transactions in real time. By employing more intelligent methods to determine whether businesses are eligible for a loan, funding has become much less of an issue.

Tipalti

Having sealed a series of new partnerships and rolled out product upgrades, Tipalti this year extended its support to an additional 58 countries, underlining its status as the world’s leading partner payments automation platform.

Rarely is there a simple, common denominator when it comes to making payments around the globe, and the ability to streamline the payments management process means users can easily improve their efficiency and regulatory compliance. The firm automates and streamlines payments for more than 300,000 payees and processes more than $1.5bn on an annual basis.

A study conducted by Tipalti recently pinpointed the operational failures concerning affiliate payment and communication processes, and showed that almost 41 percent of respondents had stopped working with an affiliate network due to payment issues. What’s more, 65 percent of the remaining sample said they would do much the same if confronted with consistent payment problems.

The report serves as a reminder that proper and timely payment is crucial to maintaining a healthy working relationship: 99.5 percent of the 250-partner sample agreed the issue was an important one.

TransferWise

Arguably the best-known fintech firm of the moment, TransferWise was founded in 2011 by Taavet Hinrikus and Kristo Käärmann to bring a greater measure of transparency to the small matter of sending money abroad. Recognised by the World Economic Forum earlier this year as one of the world’s most innovative companies, TransferWise allows users to avoid banking fees and make savings of up to 90 percent on their transfers.

Whereas banks often charge five percent in hidden fees, TransferWise charges only 0.5 percent, and for a more efficient service at that. Richard Branson spotted the startup’s potential and, last year, invested $25m in the so-called “anti-bank” and its “mission to make the world of money transfers a fairer place” for both individuals and businesses.
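The 90 percent figure follows directly from the rates quoted: 0.5 percent is one tenth of five percent, so the fee on any given transfer falls by 90 percent. A quick sketch, using the article’s rates and a hypothetical transfer amount:

```python
def transfer_fee(amount, fee_rate_percent):
    """Fee charged on a transfer at the given percentage rate."""
    return amount * fee_rate_percent / 100

amount = 10_000                               # hypothetical transfer (any currency)
bank_fee = transfer_fee(amount, 5.0)          # 500.0 in hidden bank fees
transferwise_fee = transfer_fee(amount, 0.5)  # 50.0
saving = 1 - transferwise_fee / bank_fee      # fraction saved on fees
print(f"Saving: {saving:.0%}")  # prints "Saving: 90%"
```

The saving is a fixed ratio of the two rates, which is why it holds for any transfer size; in absolute terms, of course, frequent high-volume business transfers benefit most.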

Infrequent users stand to save a sizeable percentage on any transfers they make, though the biggest beneficiaries by far are businesses that regularly transfer money overseas. Some have even gone so far as to suggest foreign exchange between banks could soon become a thing of the past.

The history of the dotcom bubble – and how it could happen again

Over the last couple of years, the tech industry has seen a number of colossal deals. From the $8.5bn Microsoft paid for Skype in late 2011, to the $22bn acquisition of messaging app WhatsApp by Facebook last year, tech start-ups have become hugely sought-after among investors. While many have been shocked by these huge valuations, the reality is the technology industry has been here before – and the last time it didn’t end happily.

From 1997 until early 2000, tech companies were the most exciting new investment to be made. As the internet increasingly became a tool for both businesses and consumers, and began to play a part in people’s everyday lives, a huge number of companies emerged, touting dazzling services that harnessed this brave new digital world.

Dotcom fever

The number of IPOs that doubled in price on their first day of trading:

1997: 2
1998: 12
1999: 107
2000: 67
2001: 0

Source: PBS

Investors scrambled to give these new web-based companies money, hoping the returns when the start-ups floated would make them incredibly wealthy. In many cases, that proved to be the case, with the leaders of companies such as eBay, Amazon and Google gaining vast wealth and becoming titans of the tech community. Many others, however, weren’t so lucky.

When the dotcom bubble burst at the turn of the millennium, shareholders lost approximately $5trn. For such a new and exciting industry, the collapse was spectacular and brutal. Countless companies had come to market with supposedly revolutionary new technologies, all clamouring for attention: huge amounts of the investment funds venture capital firms were chucking at them were spent on marketing, expansion and, in some cases, lavish lifestyles. In other cases, they were simply not ready for market, with attempts to meet wildly unrealistic expectations doomed to failure.

When the collapse happened, between 1999 and 2001, large numbers of companies went bust and ceased trading. Others saw big declines in value, but somehow managed to cling on.

Is it happening again?
The concern in the industry that another bubble is forming has grown in recent months, with disappointing IPOs suggesting a slowdown in enthusiasm for tech stocks. However, if there is a bubble, it has been growing for longer than the two years it took for things to spiral out of control at the end of the last century. Many of the firms involved have been around for considerably longer than their late-90s counterparts, and have had time to build up more sustainable followings and infrastructure.

There is another key difference: many of the companies that became massively inflated towards the end of the 90s were publicly listed, meaning many of those funding them were ordinary investors seeking to profit from the tech boom. Today, the majority of the tech companies receiving big investment are privately held, and, while they are likely to eventually go public and give investors a return, they will probably have their valuations downgraded before doing so. And if they are overly leveraged and unable to get the IPO valuation and subsequent investment they require, then any collapse will be relatively isolated to the private markets.

Indeed, bigger players – be it Google, Facebook or Apple – snap many of today’s exciting new tech companies up long before they can reach their own IPO. This has been seen recently with Facebook’s acquisition of both WhatsApp and Instagram for seemingly inflated prices, as well as Apple’s purchase of Beats for $3bn.

Stephen Stanton-Downes, Head of Digital Implementation at consultancy firm Moorhouse, said many of these deals are defensive: “Facebook’s value proposition is dependent on it being the go-to for social conversations. At the time of the acquisition, mobile (smartphone) was taking off fast and Facebook’s own Messenger product was weak. Also, as can be seen from Facebook’s recent revenue growth, mobile advertising is pivotal and further strengthens the desire to buy out mobile social communications platforms that might present a credible threat.”


While some big firms tend to swallow up their new acquisitions, Facebook’s recent strategy has been to keep their brands distinct. “Leaving the brand of the acquired company intact attempts to minimise the likelihood of a mass exodus of customers”, said Stanton-Downes.

As the value of these deals increases and many tech firms look to IPOs to raise finance, many in the industry worry whether the strategy is sustainable. Stanton-Downes said it depends on the underlying business. “Some are sustainable and others not. Facebook’s IPO debuted at $38 per share in 2012 – the third largest in history. Today its share price is roughly $91. On the other hand, Zulily just sold to QVC for less than its IPO price – having spiked at near twice the initial offer price.”

Not all companies that emerged during the dotcom bubble collapsed; many of the most recognisable technology brands around today successfully navigated the choppy waters of that period. However, a number of firms that survived the crash did so with valuations much lower than those they had enjoyed at the height of the boom.

MarketWatch

$17

Opening share value

$97.50

Value at end of first day

$1.26

Share value in 2001

A recent Bloomberg article cited the example of MarketWatch. The business news website went public on January 17, 1999, with trading starting at $17 a share, before soaring to $97.50 by the close of the day. By 2001, however, the company’s share price had slumped to $1.26. MarketWatch survived the crash, but only after a hefty correction in valuation: it was ultimately sold to Dow Jones & Co in 2005 at $18 a share.

Avoiding pitfalls
If today’s industry is to avoid a similarly painful collapse, the potential of new businesses is going to have to be realistically appraised based on market chances, rather than excitement over the technology they might offer.

“The internet allows for low transactions costs and low global distribution costs and hence huge leverage”, said Stanton-Downes. “Having said this, there are many tech companies attracting significant multiples based on the assumption they can use their same platforms to bring adjacent services to market; assumptions [that] may prove not to be valid in the long run. Uber, for example, is now valued at $50bn, even though it’s fighting a number of regulatory battles around the world.”

Investors need to take a similar stance when selecting which tech stocks to back. While a company’s technology might seem full of potential, without a business case, it is unlikely to deliver healthy returns, said Stanton-Downes. He cited the example of Twitter’s recent struggles to transform its vast user base into a profitable business. “Avoid the companies that aren’t making money or whose private equity multiples are crazy. Companies often use money from an IPO to scale, but if the business case is not there when the company is smaller, it’s not going to be there just because a company becomes bigger.”

While bubbles tend to be a natural feature of a market, the tech industry knows all too well just how damaging they can be. While it seems unlikely that the likes of Facebook and Twitter are due a collapse in value, history has shown that previously valuable stocks can collapse unless there is a strategy in place to ensure profitability.

Boo.com

Founded in 1998 by a group of Swedish entrepreneurs, Boo.com was one of the highest-profile casualties of the dotcom bubble. Based in the UK, the company was set up with the intention of creating a global online fashion store.

However, having secured considerable venture capital funding in 1998, Boo.com’s official website launch was repeatedly delayed; it eventually went live in the autumn of 1999.

Between its conception in 1998 and its eventual collapse in May 2000, the company managed to spend around $135m of its venture capital funding, expanding from 40 employees in a small London office to 400 employees dotted around eight global offices.

Many reasons have been given for the collapse of the firm, from an overly aggressive expansion plan to constant complaints about the user experience of the website, but ultimately Boo.com was doomed because it got ahead of itself. It spent all its funding in a short space of time – just as investors were getting cold feet about tech firms – while the all-important sales failed to materialise. Investors lost millions, with just $2m recouped from the sale of its remaining assets.

Flooz.com

Flooz.com, a New York-based start-up that went online in early 1999, was the brainchild of iVillage co-founder Robert Levitan, and one of the first companies to try to crack the online payment market. The platform worked by allowing consumers to use ‘Flooz’ credit on other e-commerce sites, such as clothing store J.Crew and bookshop Barnes & Noble.

Flooz.com spent considerable amounts on promotion, with expensive TV adverts featuring Hollywood actress and comedian Whoopi Goldberg frequently airing around its launch. The company managed $3m worth of credit card sales in the first year and $25m in its second.

Disaster struck, however, when the FBI informed the company it was being used by a group of Russian criminals to launder money. Levitan later admitted fraudulent activity at one point accounted for as much as 19 percent of all the transactions the site was processing. The damage to the firm’s reputation was done, and Flooz.com collapsed in August 2001, having burned through nearly $50m in venture capital funding during its brief time in operation.

Pets.com

One of the first online retailers targeting pet owners, Pets.com launched in 1998 to considerable fanfare. A number of big-name investors backed the business, including Amazon, which bought a 54 percent stake in the firm. Large sums were spent on a high-profile marketing campaign (featuring a sock puppet), including a $1.2m advert during the 2000 Super Bowl. Thanks to the wave of publicity, Pets.com managed to attract large numbers of customers, even though the management had done no market research prior to launching.

However, strong sales figures couldn’t make up for the huge amount of money the company had invested in infrastructure, including warehousing. With a revenue target of around $300m just to break even, the management expected the company would take between four and five years to achieve profitability. Generous discounts were offered to customers in an attempt to generate loyalty, but it meant the business was selling items at a 27 percent loss.

Despite an IPO in February 2000, enthusiasm for the stock collapsed over the following six months. The slump coincided with the collapse of a number of other dotcom stocks, heightening pessimism towards the business, and Pets.com went into liquidation in early November 2000.

Inktomi

A software development firm launched in 1996 by UC Berkeley professor Eric Brewer and graduate student Paul Gauthier, Inktomi began life as an extension of a search engine designed at the university. It eventually became a provider of software to internet businesses, powering the then-popular search engine HotBot, and would later develop the Traffic Server caching software, which was adopted by many big internet service providers (ISPs).

A series of acquisitions in 1998 grew Inktomi’s user base, including shopping search engine C2B Technologies, as well as Impulse Buy Network and Webspective. The company also helped develop the ‘pay per click’ payment model that is now used across the web. However, such an aggressive expansion strategy left the company exposed when the dotcom bubble burst in 2000, hitting ISPs hard and wiping out much of its user base.

At the height of the bubble, Inktomi was worth a staggering $25bn. This vastly inflated valuation was not sustainable, however, and merely reflected the overexcitement of the time. Inktomi was eventually sold to Yahoo in 2002 for just $235m – a dramatic collapse in valuation in just two years.

TheGlobe.com

Launched in 1994, TheGlobe.com was one of the first forms of online social media. Founded by Stephan Paternot and Todd Krizelman, it was initially aimed at fellow college students as a chat-room-style communication platform. Soaring popularity led to the company securing $20m in venture capital funding in 1997, with the 23-year-old founders awarding themselves generous salaries of over $100,000 a year.

The firm became famous thanks to its IPO in November 1998, when it posted the largest first-day increase in value in stock market history. It raised $27.9m, and its share price climbed 606 percent, giving it a market capitalisation of $840m.

Perhaps the lasting image of TheGlobe.com was one of a shiny-leather-trousered Paternot filmed in a Manhattan nightclub with his model girlfriend, boasting to a CNN journalist: “Got the girl. Got the money. Now I’m ready to live a disgusting, frivolous life.” Mocked as the “CEO in the plastic pants” by many, he became the poster-boy for dotcom millionaires.

In 2000, the bubble burst and the company’s share price was hit dramatically, with its value tumbling 95 percent to just $4m in 2001. The founders were forced out, cutbacks were made, and the rot set in. The company still exists as a shell corporation, but has no assets or operations.

Pixelon

Another company that got ahead of itself at the height of the bubble was Pixelon, a supposedly high-quality video streaming service. The company became notorious for its lavish launch party, iBash ’99, held in October 1999 at the luxury MGM Grand Las Vegas hotel and casino. The $16m event featured performances from rock band KISS, country trio the Dixie Chicks, and singers Tony Bennett and Faith Hill, and even convinced The Who to reform. It was supposed to be broadcast over the internet via Pixelon’s streaming technology, but the attempt failed, severely harming the company’s reputation.

Pixelon collapsed just one year later after it became apparent its technologies were not what its founder Michael Fenne had claimed. In fact, Fenne was actually called David Kim Stanley, and had previously been convicted of a series of stock scams. Stanley was reported to have arrived in California and been living out of the back of his car just two years before the launch of Pixelon. The company started to sack employees in 2000, before being forced to file for bankruptcy.

OECD: poor digital security poses serious economic risk

In recent years, the risks posed by large-scale digital security threats – which can carry massive economic consequences for businesses – have risen, not just in frequency, but in sophistication.

Last year, Sony had its systems breached by hackers

Digital technologies have become central to the modern economy, but as a result they are at greater risk from those who stand to gain from exploiting flaws in their security.

In a recent OECD Recommendation on Digital Security Risk Management, the international economic organisation warns world leaders and CEOs in both the public and private sectors that they must make digital security a primary concern.

“Digital risk cannot be eliminated, and a totally secure digital environment is impossible if you want to reap the economic potential it opens up,” said OECD Science, Technology and Innovation Director Andrew Wyckoff. “But digital risk can be managed effectively. The leaders of an organisation are best-placed to steer the cultural and organisational changes needed to reduce this risk to an acceptable level.”

Last year, Sony had its systems breached by hackers who managed to steal everything from employment and salary records to private emails and personal documents – a breach that was not only embarrassing for the company, but caused it considerable financial damage as well.

Sony is not the only company to find out what can happen when online security protocols are found wanting, and it certainly won’t be the last. The episode serves as a warning to businesses and governments alike that digital security must be made a priority.

Crowdfunding could create a greener world

“It’s a way for you to create a more secure future for your finances, your family and the world you live in”, according to Abundance Generation. Operating under the banner of “democratic finance”, the site is one of a growing band of alternative investment platforms making waves in the renewables business, and, more importantly perhaps, sending a signal to investors and policymakers.

“Crowdfunding has opened up renewables to a broader investment community by showing that it isn’t just something for ‘greens’ who are happy with lower returns; it’s in fact a sound financial investment that is also able to benefit our society and environment too”, said Karina Sidenius, Marketing Executive at Abundance.

Established in 2011 with a view to democratising investment in renewables, the platform has attracted close to 2,000 backers and over £11m in investment so far, all while giving supporters a reason to believe that crowdfunding has a part to play in the low carbon economy.

2,000

Backers on Abundance Generation

£11m

Investment raised

In an interview with BusinessGreen, Greg Barker, the UK’s Minister for Climate Change, called crowdfunding an “incredibly powerful” way to create a “decentralised energy system, and help achieve the goal of turning the Big Six into the Big 60,000”. There is arguably no greater impediment to renewables than a lack of financing, and the emergence of dedicated investment platforms marks an important first step on the path to renewable and decentralised power.

Crowd Energy, Mosaic, Trillion Fund: each has done a great deal to underline the appetite for community-scale renewables projects and demonstrate that crowdfunding can ignite interest in the sector.

“Crowdfunding platforms have created a new way for individuals, families and communities to support and take a stake in clean energy projects”, said Joss Garman, Associate Director for Energy, Transport and Climate Change at the Institute for Public Policy Research. “Given the returns available on many of these energy projects, and the willingness of many people to invest small – or even larger – sums in them, crowdfunding facilities provide the means by which to connect them with one another. By broadening the pool of people who are able to own a part of our energy system, crowdfunding is helping to democratise the sector and to provide a much-needed new source of financing.”

Benefits to the investor
The premise is simple: investors need only choose from a list of projects and the amount they want to invest, and they will receive a return on the project once it’s operational.

“Unlike traditional investing via funds, which can demand starting amounts of £1,000 plus from investors, crowdfunding typically requires very low minimum investments, from as little as £5 (although £50 or £500 minimums are more normal)”, said Rebecca O’Connor, Content and Communications Director of Trillion Fund. “This means people who have smaller amounts of spare cash to put aside (but are not less savvy investors) can get involved. Crowdfunding also appeals to the growing band of ‘self-directed’ investors; people who do not have a high enough net worth to afford financial advice, but do have enough spare cash to save and invest.”

In addition, the risks are relatively minor; operational projects generate steady, long-term revenue flows, supported by government incentives linked to inflation, which means they function much like income investments that deliver a dividend every few months. O’Connor added that the framework gives investors some degree of certainty they will get a decent risk-adjusted return at a level that beats savings rates – albeit with greater risks – if they are refinancing existing assets.

Renewables and crowdfunding are well suited: campaigns often have a social or environmental strand, and routinely target areas that escape the attention of traditional investors, for whom quick returns take precedence.

“Community projects benefit local people because they give them more energy security and fewer carbon emissions, as well as lower bills in some locations”, said O’Connor. “Furthermore, there is a sense that renewable energy, which is a free and limitless resource, should be owned by ordinary people, and not companies hungry for pure profit (who are disinclined to develop it anyway because in general it is not profitable enough for them). Giving people a stake in the development of renewables ultimately means that more projects will be approved and built.”

Community and company
This renewables crowdfunding movement is particularly notable in the UK, though it’s in Germany that the fruits of community-funded renewables can best be seen, with over 50 percent of renewable capacity community-owned. With community investment having channelled some €63bn into the sector in the decade up to 2013, there are now predictions that approximately 20 percent of UK renewables could be people-powered by 2020. However, the phenomenon is not necessarily exclusive to individuals, and corporate demand for renewable energy, as well as a willingness to participate in community-scale projects, is on the up.

Businesses can help locate the best opportunities and connect them with investors

“Businesses stand to gain huge cost savings and new revenue streams from the development of renewable energy on their land or premises, or even just in the local area”, said O’Connor. “Supporting local projects is great marketing; offering local people a stake in an installation on your building or in a local business park is even better. Businesses can match fund projects, putting in half of the loan or investment and offering the rest to the crowd. The sense of a shared interest and a common goal between businesses and local people is very positive, and the presence of a commercial interest can give individuals more confidence in the project. Businesses that have installed their own projects, from IKEA to small holdings in the Dales, have seen energy bills fall to virtually zero in some cases.”

Garman added: “Since many renewable projects are relatively small-scale and dotted all over the country, businesses can help locate the best opportunities and connect them with investors. Businesses can also offer their expertise and understanding of what is and is not likely to be a successful development to screen out the riskiest projects. Given the scale of expansion foreseen in low-carbon energy, there should be ample opportunities for businesses to challenge the dominance of the big corporations that currently dominate the market, and you can see precedents for this already happening in more advanced renewable energy economies like Germany, Denmark and California, where the ownership of the energy system is much less concentrated in the hands of a few.”

According to Sidenius, businesses could play a part either by working alongside developers to install renewable measures themselves, or by investing in projects to secure a long-term income for both the business and its workers. Already, Ecology Building Society has made one investment on Abundance and expects to make more in the years ahead in order to meet its costs, pay returns to savers, and grow its capital.

While it’s true that crowdfunding hands the impetus to the individuals who make a community, these projects can reduce energy costs for businesses too, and promise valuable reputational points for participating companies. Responsible companies are always looking for an opportunity to demonstrate their social contribution, and involvement in a crowd- or community-funded project not only allows them to do this, but also to build a relationship with consumers. Crowdfunding has received attention precisely because it allows consumers to play more of a role in the investment process, although businesses stand to benefit a great deal from the shift.

What’s stopping carbon capture and storage?

In 2006, Norway’s then-prime minister Jens Stoltenberg loudly proclaimed carbon capture and storage (CCS) would be his country’s equivalent of the “Moon landing”, and ploughed billions of kroner into a plan to greatly reduce man-made emissions in that way.

Though often hailed as a leading method in the fight against climate change, CCS has time and again failed to deliver on its promise, perhaps most dramatically in 2013, when Norway called an end to its moonshot. After the country spent a mammoth $1.2bn on projects at Kårstø and Mongstad, and hundreds of millions more on research, the Norwegian Government was effectively forced to pull the plug on its large-scale carbon-capture plan, and in doing so curb mounting enthusiasm for CCS.

Project overruns were one thing, but the eye-watering costs associated with the technology proved to be the final nail in the coffin – not just for Norway, but throughout the EU and US as well. The sad truth of the matter is that investment in CCS has failed to get off the ground for nigh on a decade, and will continue to struggle for as long as cheaper options are available.

“The technology could – if governments commit to specific policies – account for nearly one fifth of the emissions reduction required to cut GHG emissions from energy use in half by 2050”, said an optimistic International Energy Agency (IEA) in 2012. However, the considerable doubts expressed in the same report have grown over the last three years, and many experts have been quick to label the technology – or at least the couple of dozen projects dedicated to it – a disappointment.

22

CCS projects in operational or construction phase

34

CCS projects in the planning stage

30

IEA target for operational plants by 2020

Despite this, a commitment now could restore CCS’ credentials on the global energy stage and save countless coal plants from closure. Speaking on whether the technology has fallen short of its promise, Paul Fennell, Reader in Clean Energy at Imperial College London, gave a firm “No. The technology works as well as anyone could hope – it is fairly basic engineering. What has failed has been global appetite to do anything about climate change”.

With rosy predictions for CCS still in place, and targets for the near future not yet out of reach, optimists are betting on the backing of industry and government to jolt this dormant technology into action.

“As long as fossil fuels and carbon-intensive industries play dominant roles in our economies, CCS will remain a critical greenhouse gas reduction solution”, said the IEA’s Executive Director, Maria van der Hoeven. “After many years of research, development, and valuable but rather limited practical experience, we now need to shift to a higher gear in developing CCS into a true energy option, to be deployed in large scale.”

Failed projects
Analysis conducted by Simon Evans and Rosamund Pearce of Carbon Brief showed that, as of the end of 2014, there were 22 CCS projects either in the operational or construction phase. Going by IEA estimates, the number of operational plants must reach 30 and capture 50 million tonnes of CO2 by 2020: a target that is not altogether unreachable, given there are another 34 projects in the planning stage. Less achievable is the mid-century target of 7,000 million tonnes, which looks to be little more than a pipe dream unless the political and corporate will to curtail emissions grows bolder.

The premise of the technology is simple, and so too are its attractions: the process of carbon sequestration strips CO2 from fossil fuels before or after they are burnt and pumps it back into the ground. Professor David King, former UK chief scientist, called it “the only hope for mankind”, and a string of academics have made similarly ambitious claims about the technology, only to come away feeling short-changed.

Asked whether he thought CCS would play a major part in the fight against climate change, Fennell said: “Without it the cost of decarbonisation is more than twice as much.” He said there were widespread cost overruns and delays “because the world is not serious about decarbonisation yet – if it were, there would be a sufficiently high price on carbon emission that would allow the support of CCS via an efficient market mechanism”.

The Kemper County Energy Facility in Mississippi encapsulates much of what has kept CCS from going mainstream. Three years behind schedule and almost three times over budget, even those responsible for getting the project off the ground have expressed doubts about its viability.

Utilities eyeing the Kemper project are unlikely to acquire a taste for CCS, with renewables and low-carbon alternatives showing themselves to be both cheaper and more accessible. Essentially, the two-thirds reduction – in the case of Kemper at least – equates to approximately the same emissions reduction as switching to natural gas, and there are few – if any – justifiable reasons operators would choose CCS over the more mature and cheaper alternatives.

Encouraging adoption
Those optimistic about the technology’s future, meanwhile, insist a string of failed projects allows latecomers to succeed where others have not, now that the data from early-stage experiments is more readily available and some of the more obvious mistakes have been ironed out.

Professor David King called CCS “the only hope for mankind”

Further encouragement comes from Saskatchewan’s Boundary Dam, which last year became the world’s first full-scale CCS facility to enter the operational stage. Speaking about the project, Brad Wall, the Premier of Saskatchewan, said: “There is only one way we can square this circle of slashing greenhouse gases, while ensuring economic growth continues, and a big part of that, absolutely, is CCS.”

Boundary Dam was born largely as a result of emissions caps imposed on coal plants; plans to do much the same in the US and Europe could pave the way for future projects, giving CCS the impetus it needs. Despite much of what has been said about the process, the issue lies not with the technology itself but the failure of governments to introduce a system in which polluters are appropriately penalised for their contributions.

Once those caps come into play, operators will likely show much more enthusiasm for the technology, and an ability to reduce emissions – and in turn financial penalties for exceeding the cap – might be reason enough for plant operators to invest in CCS. For now, at least, the incentives are not enough to warrant the investment.

The fact remains that fossil fuels still constitute a considerable chunk of the global energy mix, and, for as long as this is the case, CCS has a part to play in slashing emissions. Utilities might struggle to see the benefits here and now (and understandably so), but once the issue of carbon pricing is appropriately accounted for, governments and corporations alike could be looking at making a huge saving from CCS.

Retailers must go offline to improve their bottom line

Alibaba has taken time out this last year to pour $4.6bn into bricks-and-mortar electronics chain Suning and strike up a partnership with KFC China, all with a view to beefing up its offline presence. It’s proof that even dominance of China’s online marketplace cannot guarantee consistent returns.

Close rivals Baidu and Tencent have mirrored Alibaba’s online-to-offline shift, marking a turning point not just for China but for retail overall. While online sales have been accused of strangling bricks-and-mortar’s development, the relationship is becoming a more complex – and complementary – affair. Internet companies are only now beginning to realise that the offline experience brings benefits to shoppers that online simply cannot.

Offline shift
“The long-promised multichannel or omnichannel environment is finally upon us”, said Ray Hartjen, Director of Marketing and Public Relations at analytics company RetailNext, “and it’s a blend of various branded touchpoints – stores, online sites, kiosks, mobile apps, catalogues, call centres and more – literally every way a retailer can connect and engage with shoppers.”

The dawn of e-commerce saw countless bricks-and-mortar retailers take to the web for survival, but until recently there had been few instances of online retailers doing the opposite.

76% of CIOs put the integration of selling channels in their top three priorities

“Smart online retailers see the opportunity to reach shoppers who value personal service, the ability to see or try products, or the convenience of buying a product and taking it home right away”, said Bill McCarthy, CEO of EMEA at ShopperTrak.

Greg Girard, Programme Director of World-Wide Omni-Channel Retail Analytics Strategies at IDC Retail Insights added: “They want to give customers more choices for shopping, experience, and fulfilment, greater exposure to the brand, and direct interaction with the product.”

Even Google opened a physical store earlier this year, complete with a ‘doodle wall’ and interactive map. Amazon also opened its own shop, albeit with an emphasis on student services. All things considered, however, the shift is in its early stages. Retailers would do well to temper their enthusiasm for brick-and-mortar, as any decision to focus on the physical and only the physical is to ignore the larger transformation at hand.

“I think that a proportion of online retailers will want a few stores, but the idea that lots of online retailers will go offline as well is a bit of a fantasy”, said Professor Joshua Bamfield, Director of the Centre for Retail Research. “Apart from anything else, it ignores the role of culture in the people who run online businesses. Alibaba has some logistics purpose in its purchase of Suning, and I would be surprised if they extended much further into offline – but of course it does have a lot of money to spend and you don’t need to read Oliver E Williamson to know that cash can produce non-rational decisions.”

More than a return to offline, this newfound receptiveness to physical stores marks a response to weak consumer demand, changing consumer behaviour, rising occupancy costs and the fracturing of the business model. It has taken some time to set in, but it has finally hit.

“Retailers are coming to the realisation that each channel has its own inherent advantages”, said Hartjen, “and that, when tied together to deliver a seamless, branded experience, they deliver value to the shopper throughout their entire shopping journey.”

Omnichannel experiences
Analysts once made all sorts of predictions about the death of brick-and-mortar, and a string of studies into changing consumption patterns and shrinking demand appeared to suggest the physical was falling foul of digital.

Not content to lie down and concede these losses, bricks and mortar retailers have made a decent fist of improving their competencies online, while also incorporating elements of the digital experience in-store. The integration of on- and offline platforms has brought a new breed of retail to the masses, and in today’s hyper-connected marketplace, anything less than an omnichannel experience is inadequate. “The ultimate goal of a retailer is to realise their brand promise no matter how they interact with a customer”, said McCarthy.

Going by the results of a recent Forrester survey, conducted jointly by the market research company and the National Retail Federation, 76 percent of CIOs felt the integration of selling channels was among their top three priorities for 2015 (up from 64 percent in 2013).

A shopper’s best experience anywhere sets his or her minimum expectation everywhere

“Consumers don’t shop channels, they shop retailers”, said Girard. “Omnichannel retail represents the mature capabilities to converge all of a retailer’s channels into a singular consistent experience. The fundamental truth is that today’s shopper (remember millennialism is a lifestyle and a set of habits, not an age cohort) expects every retailer to engage him or her in an omnichannel sort of way. A good way to think about this is that a shopper’s best experience anywhere sets his or her minimum expectation everywhere. They see something and ask: ‘Why can’t (retailer X) do that?’ So omnichannel is table stakes.”

In-store innovation
It’s no longer enough to offer customers the anytime, anywhere approach, and in place of a focus on digital has come another mode of retail, where the winners are the ones who bring something closer to a multichannel approach. Bricks and mortar businesses have spent years playing catch up and online retailers must now boost their offline competencies.

“Technology has enabled both the retailer and the shopper”, said McCarthy. “Shoppers have much more information on products and prices before they even arrive at a physical store. That makes shoppers more efficient with their time and money. Fortunately, retailers are also more knowledgeable and efficient, as their technology solutions help them understand their customers better. This in turn means that they can deliver the right level of service, offer the right products, and provide convenient shopping choices both in the store and online.”

Meanwhile, the influence of technology can be seen in store, and, as a growing number of retailers work towards an omnichannel experience, we’ll likely see a host of changes sweep the in-store environment.

“We’re only in the early stages in how technologies will fundamentally change the in-store experience, so it’s really more about how technology ‘is changing, will change’ not ‘has changed’ the store experience”, said Girard. “A few areas stand out: 
engagement of consumers via location-based, in-context text messages and alerts;
insight into store operations and performance;
insight into customer behaviour in the store;
better execution of in-store processes (customer-facing and operations, e.g. keeping products on the shelves); and
improving customer-associate interactions.”

Expect to see the word ‘omnichannel’ crop up a fair few times in the months and years ahead, as online retailers, for whom growth has come easily up till now, get to grips with the offline experience. “Retailers are now able to develop deep insights on shoppers and their shopping behaviours, and as a result are able to make data-driven decisions on everything from store design and merchandising, to store operations and staffing”, said Hartjen. More than that, however, the introduction of tech-savvy retailers to physical retail could give brick-and-mortar the pick-me-up it so desperately needs, as digital-first brands pull the physical experience into the digital age.

Blockchain’s technology has far-reaching implications

Some time in 2013, online currency Bitcoin emerged from the stranger corners of the internet and began attracting mainstream attention. It was actually created in 2009 and was for a long time championed almost exclusively by cyber-libertarians and online drug dealers. While interest has waned since the halcyon days of two years ago, the technology underpinning Bitcoin, known as blockchain, is increasingly being considered for other applications, from music streaming to historical record keeping.

Blockchain should streamline transactions made on the NASDAQ exchange

Blockchain is a giant public ledger. Its use for Bitcoin has been to track who owns how much. The ownership of a bitcoin is essentially ownership of a piece of information, recorded on a long, ever-expanding chain of blocks; currency units are transferred from one person to the next as part of a new transaction block, which is appended to the previous blocks in the chain. A new block is added to the chain roughly every 10 minutes. So far, according to The Economist, the Bitcoin blockchain is “already over 8,000 times the length of the Bible”.

This technology, decentralised and encrypted, records every detail in long chains of data. Thousands of computers – known as miners – update the system, so no centralised authority is needed to approve or oversee the validity of the information (i.e. the currency of Bitcoin) on the ledger. At the same time, the way the technology is designed makes it immune to tampering by any single user; each miner has to agree to each transfer.

This distributed blockchain network, which creates a permanent, public record of every single Bitcoin transfer (essentially just information updates), could be used to record far more. The hope is the block building technology can be applied to other areas where permanent and secure records with an open and clear ledger of ownership are needed.
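The chaining mechanism described above can be sketched in a few lines of code. The following is an illustrative toy, not Bitcoin’s actual implementation: each block stores the hash of its predecessor, so altering any earlier block breaks every link after it. All names and the transaction format are invented for the example.

```python
import hashlib
import json
import time

def hash_block(block):
    """Deterministically hash a block's contents with SHA-256."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(transactions, prev_hash):
    """Create a new block that records the hash of the previous block."""
    return {
        "timestamp": time.time(),
        "transactions": transactions,
        "prev_hash": prev_hash,
    }

# Build a tiny chain: a genesis block plus one transfer block.
genesis = make_block([], prev_hash="0" * 64)
chain = [genesis]
chain.append(make_block([{"from": "alice", "to": "bob", "amount": 5}],
                        prev_hash=hash_block(chain[-1])))

def chain_is_valid(chain):
    """Verify every block points at the hash of its predecessor."""
    return all(chain[i]["prev_hash"] == hash_block(chain[i - 1])
               for i in range(1, len(chain)))

print(chain_is_valid(chain))          # the untouched chain validates
chain[0]["transactions"].append("x")  # tamper with history...
print(chain_is_valid(chain))          # ...and validation now fails
```

In the real network, thousands of miners independently perform this validation, which is why no single participant can rewrite the record.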

Branching out
In June, NASDAQ agreed to start trials using blockchain for trading IPO shares. In a press release, the organisation said it would be working with the blockchain developer company Chain to determine how “to leverage the blockchain platform to facilitate the secure issuance and transfer of shares of privately-held companies. Chain, currently privately held, plans to be the inaugural company to use the blockchain technology on NASDAQ Private Market”. Blockchain, it is hoped, will allow stockholders to “seamlessly transfer securities between entities, and companies and their affiliates can be provided with a complete historical record of issuance and transfer of their securities. Importantly, the use of a blockchain-based distributed ledger can also offer integrity, auditability, issuance governance and transfer of ownership capabilities”.

It is time-consuming and expensive to keep track of the ownership structure of a company. As MIT Technology Review noted, often “companies manage their own data in a spreadsheet program like Microsoft Excel, and pay lawyers to validate the information every time the table changes”. In theory, blockchain should streamline transactions made on the NASDAQ exchange and make ownership record keeping cheaper and more accurate.

Meanwhile, in quite another market, the music streaming industry has come under a lot of pressure for how it remunerates artists. A recent report by the Berklee College of Music’s Institute for Creative Entrepreneurship, Rethink Music, claimed “of the $15bn in global recorded music revenue for sound recordings reported by the IFPI for 2014, only a small portion of the money beyond the initial recording advances ultimately makes its way to artists as ongoing revenue”. Due to “faster release cycles, proliferating online services, and creative licensing structures”, it continued, “finances and revenue [are] even more complex to understand and manage”.

The complexity of the system means that, according to Coin Telegraph, “hundreds of millions of dollars are lost under the current system, in which musicians cannot get hold of their revenues and the owners of a song are not accurately identified”. The most high-profile instance of underpayment of artists is the case of Taylor Swift, who in November 2014 received $500,000 for a year’s worth of streaming. While this figure seems high (it is the equivalent of selling 50,000 albums), the CEO of Big Machine, Scott Borchetta, pointed out it was pitifully low for an artist of Swift’s popularity.

Blockchain could offer a more thorough way of recording history

Blockchain, with its open ledger chains of data, offers a solution. As Coin Telegraph argued, “by keeping a public ledger of transactions, the deals between labels and streaming companies would not require third party organisations to take a cut of the revenues. By implementing blockchain technology, the payments would be automatically separated and distributed instantly”.

Making history
Ultimately, blockchain is a keeper of records. Beyond being used to divvy up payments and record ownership, it is also being touted as the ultimate record keeper of human history. Civilisations have been making and keeping records for the benefit of future generations since the advent of writing, starting with Sumerian cuneiform tablets way back in the 31st century BC. The further back in recorded history we go, usually, the patchier the records get; fires, weather, nihilistic conquerors and other misfortunes have resulted in the loss of (obviously) unknown amounts of historical documents and chronicles. In more modern times too, governments and businesses have attempted to alter, fabricate and expunge politically inconvenient records. Blockchain could offer a more thorough way of recording history, without the possibility of manipulation or damage to material records.

The first Bitcoin transaction was made in January 2009, shortly after the genesis block was mined. According to David A Johnston, writing on the website Medium, this was “the first time, in the whole of human history, a truly persistent and immutable ledger of record began operating”. If blockchain does prove to be such a historic piece of record keeping, this will perhaps be its greatest legacy, beyond its usefulness for business operations in the present. The historians of the future will have a trove of endless information to sift through – assuming the private companies using it do not seal off blockchain records. Cultural historians will be able to meticulously study the structure of the early 21st-century music industry, while stock ownership patterns could be fully reconstructed by economic historians.

As The Economist said: “Asked to name an event that has reshaped finance in recent years, bankers will point to the collapse of Lehman Brothers on September 15, 2008… Fintech types are more likely to mention something that happened six weeks later.” They mean the first use of blockchain, in the form of Bitcoin. Yet even if blockchain merely proves to be a minor technology fix for a few problems such as music streaming, the accuracy of the information it records could well change the way history is recorded.

Facebook changes tack with Messenger for Business tool

As businesses fully embrace the social media frenzy of the 21st century, Facebook has again upped its game in the world of commercialisation. The latest tool offered by the social media giant is Messenger for Business. As Facebook’s over one billion monthly users will know too well, Messenger is a live chat mechanism that was introduced in order to allow ‘friends’ to exchange messages in real time, for free.

Messenger for Business lets companies interact with consumers quickly and efficiently, allowing them to bolster their customer service offerings, and is estimated to cost around half as much as operating a call centre. Through the app, communication history between the individual and the company is easily recorded and can be reread by both parties each time a conversation restarts, offering a smoother experience.

The service will be particularly useful for customers who need to ask a question about a product they are viewing online, saving them the lengthy wait for an email response or an expensive phone call. In fact, Craig Borowski, Market Researcher for CRM research and evaluation firm Software Advice said: “From what we’ve seen in our surveys, consumers prefer live chat because it’s very easy to use and it provides immediate answers to their questions. These qualities really can’t be matched by other service channels.”

Quick time
Consumers demand high-quality customer service and greater levels of engagement from companies than ever before, regardless of size or industry, while on social media immediate responses and instant communication have become the norm.

Customer satisfaction with service channels:

77% – Live chat
61% – Email
44% – Smartphones

“Consumers can communicate and engage with businesses within the same messaging app they already use every day with friends and contacts, on their own terms”, said Nick Peart, EMEA Marketing Director for Zendesk, a cloud-based customer service platform that has partnered with Facebook to help businesses communicate with the messaging app. “This allows consumers to have a more personal and ongoing conversation with the business.”

Companies are getting better at meeting these evolving consumer expectations, but the lag itself causes its own reaction. “Even if the business in question answers emails immediately and always has phone operators standing by, that’s somewhat secondary, because most consumers expect that an email will take time to answer and that calling a company nearly always requires waiting on hold or some other inconvenience”, said Borowski. “These expectations determine consumer behaviour, even if they’re not correct for every situation.”

According to eDigital’s Customer Service Benchmark consumer survey, live chat has the highest level of satisfaction of any customer service channel, at 77 percent (followed by email with 61 percent, and smartphones with 44 percent). As a result, it is reasonable to assume more companies will adopt the service in the near future, particularly if and when a live chat option becomes a determining factor in which sites customers choose to shop on.

“Right off the bat, they’re engaging better with the customers by meeting their preferences”, said Borowski. “With live chat, more of those questions will get asked and answered, and more customers will move beyond that purchase barrier. The end result is that the business offering live chat will have more purchasing customers.”

Facebook era
What makes Messenger for Business so smart and also so promising is the fact Facebook is the second most visited website in the world; according to the company’s Q2 2014 report, on average, users spend around 40 minutes a day on the site. By using it as a commercial platform, companies can harness some of its mammoth user base and highly frequent patronage. “It also offers powerful analytics and reporting capabilities, so a business can identify customer satisfaction ratings, customer service agent performance, and help identify problems before they escalate”, said Peart.

That being said, it is not enough for companies to simply employ Messenger or alternative live chat tools – they must do so effectively. Implementation will be a challenge, particularly deciding on which web pages and in which situations the service will be offered. This will require in-depth research to clearly define what customers are trying to achieve at various points on the site(s), while also ensuring they are not frustrated by overzealous and intrusive pop-ups.

Borowski advised companies “map out the various customer journeys, understand where customers are more apt to have difficulty or questions. Then use live chat to help them along their journey, towards their goal – whether that’s making a purchase, comparing products, or getting general customer service questions answered. The more strategically live chat is implemented, the more success it will bring”. It is also crucial to train agents to effectively handle multiple conversations simultaneously.

In time, companies will improve the service they offer through Messenger as they develop the tools and personnel needed to provide more helpful and efficient assistance, while also bolstering the number of direct communications that can take place at the same time. As Borowski explained, this may involve a first tier level of support consisting of automated responses for frequently asked questions, and then a second tier in which an agent steps in to deal with more complex queries.
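The two-tier flow Borowski describes can be sketched as follows. This is a hypothetical illustration, not Facebook’s API: the FAQ entries, function names and routing logic are all invented for the example. Tier one matches a message against known topics and answers automatically; anything unmatched escalates to a human agent.

```python
# Illustrative FAQ table for tier-one automated responses.
FAQ = {
    "opening hours": "We are open 9am-6pm, Monday to Friday.",
    "delivery time": "Standard delivery takes 3-5 working days.",
}

def handle_message(message):
    """Return (handler, reply): an automated answer if the message
    matches an FAQ topic, otherwise an escalation to a human agent."""
    text = message.lower()
    for topic, answer in FAQ.items():
        if topic in text:
            return ("bot", answer)
    return ("agent", "Connecting you to a customer service agent...")

print(handle_message("What are your opening hours?"))
print(handle_message("My order arrived damaged"))
```

A real deployment would use fuzzier matching and track conversation state, but the division of labour – cheap automated answers first, scarce human attention second – is the same.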

The transition may not take all that long, particularly as an increasing number of consumers turn to Facebook for their needs. Messenger for Business just adds another facet to Facebook as a platform for e-commerce; it is already used by some of the world’s biggest brands, including Disney, Coca-Cola and Starbucks. Through the continued development of its platform for commercial activities, the site is able to offer a far more integrated online experience for users – a new norm that can be expected to progress in the coming years. In the meantime, Messenger is a tool that will benefit consumers and businesses alike.