The seemingly unstoppable shift towards online retailing over the last decade or so has meant that high street stores have faced increasingly tough conditions. With many established retail brands going out of business, it has seemed as though shopping in the future will be conducted largely online.
RadioShack is said to be considering selling half of its 4,000 US-wide stores to telecoms firm Sprint
This move has been spearheaded by US web giant Amazon, but the company might be about to make a tentative reversal towards having physical retail stores of its own. According to two sources that spoke to Bloomberg, Amazon is looking to acquire some of ailing electronics chain RadioShack’s stores as a way of showcasing its products and offering customers a place to pick up purchases.
RadioShack is currently in the process of filing for bankruptcy after it saw its position as America’s top electronics retailer collapse during 2014. In September it replaced its Chief Financial Officer with a bankruptcy specialist, while defaulting on a $250m loan received from two private equity companies the previous year. In January, the New York Stock Exchange issued a warning to the company that it would be delisted if its market capitalisation remained below $50m for more than 30 consecutive days.
By contrast, Amazon has grown over the last two decades from an online bookseller into the world’s leading internet retailer, as well as a provider of all manner of other digital services, including music and television, plus its own hardware. It is thought the stores it wants to buy from RadioShack will primarily be used to showcase its range of Fire tablets and smartphones.
According to other reports, RadioShack is said to be considering selling half of its 4,000 US-wide stores to telecoms firm Sprint, while shutting the rest. If Amazon were to acquire these remaining stores, it would gain a considerable high street presence, helping it compete with its new hardware rival Apple.
Apple has announced plans for a $2bn investment in order to convert a failed factory in Arizona into a new data and command centre. The hub will control traffic for Apple services, such as iCloud, Siri and iTunes.
A host of engineering and construction jobs will be created for the area
The plant, which was a joint endeavour between Apple and GT Advanced Technologies (GTAT), was established in 2013 to make phone screens for the tech powerhouse. Despite plans to incorporate the factory into Apple’s supply chain and high hopes for the sapphire glass industry, the plant was unable to meet quality requirements and GTAT filed for bankruptcy in October.
Apple originally invested in the factory in order to create jobs and promote production within the US, thereby making a shift away from Asian manufacturing. The new data centre will employ 150 full-time staff, in place of the 700 employees axed at the end of last year. “I got the impression from Apple that they wanted to do the right thing and we’re excited that they are doing it here in Arizona”, the state’s governor, Doug Ducey, told Bloomberg.
In statements, both Ducey and Apple said a host of engineering and construction jobs will be created in the area, with plans unveiled to power the facility entirely with renewable energy. According to Apple, its new command hub will incorporate a solar farm with enough capacity to power 14,500 homes.
As part of the agreement forged between the two companies, Apple will allow GTAT to use the 1.3 million square foot facility until December this year.
The new World Bank loan is great news for the city of Guilin: tourism accounted for nearly 20 percent of its economic output in 2012, with most visitors coming to gaze at the iconic river and surrounding karst formations in southwest China’s Guangxi Zhuang autonomous region.
The Guilin municipal government has worked tirelessly to maintain the water quality and prevent pollution
The Guilin municipal government has worked tirelessly to maintain the water quality and prevent pollution, using methods such as relocating industries, building wastewater treatment plants and landfills, and rehabilitating tributaries.
Implemented between 1998 and 2007, the earlier project focused on the environmental conditions in Guilin’s core urban area, improving wastewater collection and treatment, as well as effluent regulations and solid waste management.
The new Guilin Integrated Environment Management Project, supported by the latest loan from the World Bank, will be implemented from 2015 to 2020 and will focus on improving the water supply system and the collection and treatment capacity of Guilin’s wastewater treatment plants.
It will also enhance sludge management, and strengthen water quality monitoring and pollution management, benefiting local communities as well as tourists.
“The new project will support these efforts, building on our past experience from the World Bank-supported Guangxi Urban Environment Project”, said Sing Cho, the World Bank’s Urban Specialist and task team leader for the project.
Big data is a rapidly growing industry and it certainly has its benefits for businesses. This app, available on iOS, interprets all-important data and transforms it into simple-to-read graphs, giving companies eager to get in on analytics the chance to do so at the mere touch of a (smart) button.
Bizzabo
This app, available on both Android and iOS, offers members of a conference the opportunity to connect with fellow attendees via all the usual social media sites – from LinkedIn to Facebook – through one simple platform. To get it going, the conference organiser just needs to download and use the app themselves.
MailChimp
For businesses looking to develop email marketing, MailChimp is one to treasure. It helps users to monitor the success of email campaigns by showing how many people have opened and clicked on mail sent out via the app or website. It’s also useful for keeping up to date with the number of people on mailing lists, and it can help companies to focus, target and modify their campaigns accordingly.
Evernote
The era of pen and paper seems to be slowly dying out, and it’s little surprise given apps like this handy one, which offers a range of helpful features besides the simple (yet effective) digital note-taking staple. It lets users create, save and share audio notes, images and web pages with colleagues and across other devices, forming a useful platform for businesspeople on the go.
HopStop
For those embarking on a business trip and unsure of how to navigate around using the local transport, HopStop has the answer and it works in a large number of European and American cities. The app displays accurate information on all viable transport options, from biking it to hiking it, bus-ing to taxi-ing, as well as a map and a feature allowing users to track down nearby stations.
Norway’s largest pension fund Kommunal Landspensjonskasse (KLP) recently announced it had decided to pull out of its investments in companies that derive a large proportion of their revenues from coal. The announcement came hot on the heels of a similar move by the Swedish fund Second AP, which announced in October that it will pull out of its investments in 12 coal and eight oil and gas production companies – together accounting for a divestment of holdings with a total market value of about SEK840m (£72m).
And it is not just pension funds that are pulling the plug on fossil fuel investments. A number of institutions in the higher education sector, including Stanford University and Glasgow University, have joined the trend, with organisations as diverse as local authorities, medical and religious institutions, and the Rockefeller Foundation following suit.
According to Nathaniel Bullard, Director of Content at Bloomberg New Energy Finance and author of a recent Bloomberg white paper, Fossil fuel divestment: a $5trn challenge, there are “numerous motivations” for divestment, including a desire to reduce exposure to the growing financial risks associated with holding fossil fuel assets – resulting from the regulation of emissions, higher potential prices or falling demand – and an inclination to remove capital from investments that investors see as ‘wrong’ in some way. “Objectives are again numerous, and include portfolio diversification or investment security on the part of professional investors and a moral duty on the part of others”, he adds.
£72m
Total market value of Second AP’s divested holdings
Ben Caldecott, Programme Director at the University of Oxford’s Smith School of Enterprise and the Environment, and Founder and Director of its Stranded Assets Programme, agrees there are both ethical and investment arguments for fossil fuel divestment. He points out fossil fuel assets could eventually be vulnerable to “material environment related risks”, which investors are keen to manage and mitigate over the long term: “I think these investment arguments are gaining increasing traction among a wide range of investors.”
Stranded assets
In addition to environmental concerns and a commitment to adhere to ethical investment strategies, many divestment strategies are driven by a desire to avoid being left with assets that may become ‘stranded’ as a result of emissions regulations. As Caldecott explains, stranded assets are those that have “suffered from unanticipated or premature write-downs, devaluations, or conversion to liabilities”, and can be caused by a variety of risks.
“Increasingly, risk factors related to the environment are stranding assets and this trend is accelerating, potentially representing a discontinuity able to profoundly alter asset values across a wide range of sectors”, he says.
In Caldecott’s view, there are several useful strategies investors and businesses could adopt to manage the potentially negative consequences of stranded assets, including the accurate monitoring of risk exposure and a commitment to “managing it over the long term”: “This might involve investing in new products less exposed to climate change risks, engaging more proactively with fossil fuel companies, or divesting.”
More broadly speaking, Bullard believes “careful and constant assessments of supply, demand, regulation and changing behaviours” are important strategies to adopt for those investors in energy equities who are keen to manage the potential financial risks of holding fossil-fuel industry assets. “Fortunately for those holding public equities, their investments are liquid and, for the most part, investors are free to buy and sell stocks at any time as their assessment of risks and rewards dictate”, he says.
As far as larger pension funds are concerned, Bullard also argues that, if institutional investors decide to divest from fossil fuel holdings, it will not occur in response to pressure from divestment campaign groups, but rather as a result of a considered “reallocation of assets”. He says: “I believe some investors may reweight their portfolios towards other assets, but in so doing they will not necessarily adopt the rhetoric of divestment, as such.”
Market impact
In the short term at least, Jeanett Bergan, Head of Responsible Investments at KLP, says it is “difficult to conclude” that such divestment decisions will be of “a significant enough scale to negatively impact this market”. However, if divestment “goes mainstream” over the long term, she expects the trend will “ultimately impact fossil fuel companies’ cost of capital, all things being equal”. “Hopefully, sensible global climate regulation will enter into force and affect market conditions long before we can test that thesis out in practice”, she says.
Bullard agrees that more attention will continue to be placed on the long-term viability of energy assets exposed to decarbonisation of the energy sector. For him, exactly how this will manifest is not yet clear – although he stresses that, “as potential future risks in an asset class rise, so does its cost of capital”, meaning the net effect is likely to be “a rising cost of capital for companies deemed higher risk or environmentally unsustainable”.
In the future, Caldecott predicts there will be a “growing realisation” across the investment market that fossil fuel industries are undergoing “profound structural changes” – and that many of the drivers are increasingly related to environment and climate change issues. “The other factor is changing social norms – fossil fuels are already becoming a stigmatised industry and this looks set to continue”, he says.
For Bergan, it is important that pension fund managers such as KLP take the lead in adopting divestment strategies because investors are subject to “ever greater scrutiny on how their investments contribute to or detract from a low carbon future”.
“Having a climate change policy is a part of what is expected of a responsible investor”, she says. “That said, an investor’s toolbox encompasses more than divestment. At KLP, we see the divestment from coal as a way of minimising our involvement in the problem, but through our direct investments in renewable energy, we are attempting to be a part of the solution as well.”
Kommunal Landspensjonskasse
In late November, Norway’s largest pension fund KLP decided to commit an extra NOK500m (£45.7m) to increased renewable energy capacity. It simultaneously announced its intention to pull investments out of companies that “derive a large proportion of their revenues from coal”. According to Bergan, the move forms part of the fund’s desire to contribute to efforts to keep global temperature rises within 2°C of pre-industrial levels. “We believe this can best be done by boosting investments in new renewable energy production, reporting on the carbon footprint of our investments, and by engaging in active ownership with portfolio companies to reduce their emissions”, she says.
For Bergan, coal divestment is “compatible” with KLP’s risk and return requirements for its mostly passive investments. She believes it sends a strong signal that “more financing needs to be shifted from fossil fuels to renewable energy in order to achieve the two degree target”. The divestment will affect companies in the coal mining and coal power sectors that derive at least 50 percent of their revenues from coal.
In advance of making the divestment decision, KLP also launched an internal analysis into possible fossil fuel divestment at the request of Norway’s Eid municipality, whose pension assets are invested through KLP. Bergan says: “Our objective was to review possible divestment from a financial, ethical and environmental perspective to determine how KLP can best contribute to limit global warming, while managing the pension assets of our customers and owners in a financially responsible manner.”
KLP also broadened the internal analysis to include active ownership strategies and a possible increase in KLP’s direct investments in renewable energy. Bergan points out the process determined that increasing KLP’s direct investments in renewable energy “would have the greatest impact in contributing to a new low carbon economy”. She says: “Moreover, excluding coal companies from our portfolio would not represent a significant financial risk for KLP’s investments, but would send a powerful signal that a major switch from fossil fuels to renewable energy is needed to achieve the two degree target. At the same time, KLP’s analysis is clear that divestment of the entire fossil fuel industry would not be financially sustainable at the present time. There is a shift underway, but this will not happen overnight.”
For those unfamiliar with the term, the Internet of Things (IoT) refers to the ability of everyday objects to connect to the internet, allowing them to send and receive data. Nowadays, we all possess an abundance of personal gadgets that are constantly connected to the net, with practically everyone carrying a smartphone around in their pocket.
But other forms of technology that are constantly connected to the web are also on the rise, ranging from home automation systems that turn on your kettle when you wake up in the morning to activity trackers that share with your friends how many miles you’ve run that day.
Six years ago the number of “things” connected to the internet surpassed the number of people
Six years ago the number of “things” connected to the internet surpassed the number of people, with experts estimating that in 2015, there will be more than 25 billion connected devices, and by 2020, that number will increase to 50 billion.
It is for this reason that regulators such as Ofcom in the UK and the Federal Trade Commission (FTC) in the US are taking steps to create a regulatory framework that will protect consumers, though they are aware that any such framework must be designed so that it does not harm innovation in the technology sector.
In the UK, Ofcom wants to ensure that the necessary tools and infrastructure are in place to allow the IoT to develop unhindered. To support this, Ofcom has already released spectrum for machine-to-machine uses – making the UK among the first countries in Europe to do so.
Ofcom plans to work closely with the UK government, other regulators and the technology industry in order to strike a balance that allows IoT products to be developed without barriers, while at the same time ensuring that issues such as data protection are properly addressed.
“The Internet of Things will bring benefits to a range of sectors and could change the way we live our lives”, said Steve Unger, Acting Ofcom Chief Executive. “As a result of this growth, we have listened closely to industry and want to develop a framework for this technology to evolve in a way which will ultimately benefit citizens and consumers.”
In the US, the FTC admitted in a recent report that, while the IoT will bring many benefits to consumers, it is likely to meld the virtual and physical worlds together in ways that are currently difficult to comprehend.
The US regulator shared similar sentiments to Ofcom: it is reluctant at this stage to propose any legislation, for fear of harming the country’s highly profitable technology sector, but it recognises the need to protect consumers. For now, the FTC has recommended self-regulatory efforts on the IoT, along with the enactment of data security and broad-based privacy legislation.
“We’re now in a world where data is being collected all the time”, said FTC Commissioner Edith Ramirez at the State of the Net conference. “We’re bringing these devices into our homes, into what used to be private spheres, and the data that is being generated is increasingly much more sensitive.
“It’s really in my mind fundamental that consumers continue to be in the driver’s seat, that they have a say in their own information and how it’s being used.”
Just seven years ago, had you asked someone how many apps they had, they’d most likely have looked at you blankly and accused you of being a colossal nerd. Now the term has become part of everyday language, with people owning smartphones that house hundreds of the things. These ‘things’ (i.e. mobile applications) have transformed the way in which people interact with their phones, and revolutionised the world of software development.
Smartphone platforms have opened up a world of opportunity for software developers, with individuals, small start-up companies, and major designers all clamouring to get in on the action. However, opinion is split on how to actually make money from mobile apps. While some favour one-off payments for their work, others employ a range of different methods, such as advertising or a ‘freemium’ service (where the initial app is free but upgrades or additional services must be paid for).
Direct sales of apps can prove attractive to companies looking to see a guaranteed return from every copy downloaded. It is a popular method for apps that aren’t going to continually offer new features, and these often carry a higher initial price; popular games such as Angry Birds are a case in point. However, for all apps offered on Apple’s iOS platform – the biggest revenue-generating system on the market – Apple takes a 30 percent cut.
30%
Apple’s cut from each App Store sale
Other developers choose to offer sponsorship around their apps. Time Out magazine’s app has done this with vodka maker Smirnoff, while The Economist entered into a similar deal with Lloyds Bank. This is particularly attractive as the sponsors’ logos don’t impact on the user experience (a problem many people have with ad-based apps).
Annoying pop-up adverts that have to be watched or closed before an app’s content can be accessed may bring in money to the developer, but they detract from the user’s enjoyment. Twitter is yet to work out a way in which it can generate revenues that match its vast user base: attempts to insert discreet promotional tweets into people’s timelines have received furious responses from many users.
Other firms (such as music-streaming firm Spotify) offer two versions of their apps: a free, lighter version that comes without many features and is littered with adverts; and an ad-free, paid-for version with a full range of features.
Finding the money
The way in which app designers try to monetise their work depends on the type of service being offered, but also on the current trend. Initially, advertising was a popular method, but soon proved unpopular with users who became frustrated with the amount of garish pop-up marketing that got in the way of the app they were trying to use. The next trend was the freemium model, but this has turned out to be hugely controversial: countless stories have emerged of children making in-app purchases on games their parents had presumed were free, racking up huge bills. Currently, there doesn’t seem to be a consensus on how best to monetise mobile apps, but a few of the biggest developers are coming up with new models.
News emerged in October that mobile dating app Tinder would soon start to offer a premium service. Tinder has become hugely popular since its launch in 2012, in part, presumably, because it has been free to use. As a result, however, it has not been able to generate much in the way of revenue. Now the service is beginning to look at ways it can turn all that activity into money.
Having initially toyed with the idea of adding advertising to the app, Tinder has instead gone for the freemium model. Sean Rad, Tinder’s CEO, told tech website The Drum: “We are adding features users have been begging us for. They will offer so much value we think users are willing to pay for them. We had to get our product and growth right first. Revenue has always been on the road map.”
Another firm to take a different approach is messaging service WhatsApp. The service was initially free when it launched back in 2009, but quickly adopted a paid model so it could handle the surge in users. By mid-2013 – six months before Facebook paid an astonishing $19bn for it – the company had implemented an annual subscription fee. For $0.99, users are now able to message their friends to their hearts’ content.
Trial and error
In September, Tim Rea, the CEO of global messaging app Palringo, told technology website TechRadar: “It goes without saying that the service needs to be good quality and engaging. I use the word ‘service’ rather than ‘app’ because in many cases I think it is dangerous to think and talk about ‘an app’ as though it is a one-off piece of development that a developer throws out there. A service is different.”
Palringo now employs a free service with in-app purchases, but the core use of the app remains free for users. Getting to this point took a while, says Rea: “We went through various stages, starting with a naive view that we’d have a cool messaging app, generate some ad dollars and then charge people for some element of usage. We then went through a phase of thinking maybe we shouldn’t worry about getting money from consumers, but instead try to deliver parallel services to telecoms (white label) and to enterprise and just charge them, maybe justifying the consumer service on the basis that it is a great test bed. Then, a little over a couple of years ago, we took a cold, hard look at the situation and decided that we needed to focus down onto a particular niche and build a suitable, sustainable model that we could then expand up into a viable business.”
Offering incentives to users to pay for additional content is the model Rea believes works best: “It focuses attention on what users are doing, what they are interested in and what they want. It generates some money versus free-to-use apps and is a basis for a real business versus ad-based approach.
“As I said, an ad-based approach can be viable but it’s tricky. Generally in the mobile world it just doesn’t work well because we have got into a view of the world that has most publishers thinking ‘if my users don’t pay I will show them ads’ so they are basically saying: ‘Hey, I’ve got a bunch of people who I know will not pay and I’ll show them your ads!’ That is why they are not worth much. Beyond that, a service needs decent, coherent volumes of users and an ability to properly characterise their audience and package them for sale. Not so easy.”
Which method app-makers choose ultimately depends on the type of service they are offering. Controversies over micropayments and annoyance with intrusive advertising will likely mean new services will have to go down the same route as Tinder and offer a free service in order to capture users. The question is what to do when those users are secured, and how to avoid alienating them while also taking their money.
Long-distance communication was the result of one man’s agonising heartbreak. In 1825, while working away from home in Washington DC, Samuel Morse received a letter. In it, his father explained that Morse’s wife had suddenly become unwell and passed away. On hearing the news, Morse left for his home in New Haven, Connecticut, but by the time he arrived his wife had already been buried. Morse’s anguish over the fact that he was left in the dark while his beloved’s health waned led him to passionately pursue alternative methods for relaying information. Eventually, his imagination stumbled upon the idea of what would become the telegraph: the world’s first tool for high-speed communication.
89
Countries in which the Tor Network has servers
Humans are social animals. We desire connection with each other. We crave it. It is this need that has driven many of our greatest advances in communication. It is why, in such a relatively short time, our means of relaying information has developed from the humble telegraph in the middle of the 19th century to the inception of the internet at the end of the 20th. When Morse created the telegraph he ensured others would not have to feel the pain he felt. It was an invention that improved our world for the better. In the same way, when the internet was first conceived, it was seen as a force for good: one that would enable greater freedom.
“The dream behind the web is of a common information space in which we communicate by sharing information”, says Sir Tim Berners-Lee, the man who invented the World Wide Web. “[But] there was a second part of the dream too, dependent on the web being so generally used that it became a realistic mirror (or in fact the primary embodiment) of the ways in which we work and play and socialise.” He believed that, once our interactions with each other made it online, “we could then use computers to help us analyse [those exchanges], make sense of what we are doing, where we individually fit in, and how we can better work together”. He got what he wanted, it would seem, but definitely not in the form he envisioned.
Tapping the web
Today, computers analyse our every move on the web in order to make sense of our actions. These machines once did this in secret, but thanks to the whistleblower Edward Snowden we now know that the US National Security Agency (NSA) and the British Government Communications Headquarters (GCHQ) are the ones behind this mass analysis, with the NSA’s massive data centre in Utah capable of collecting and storing every last byte of our personal information on its huge hard drives. As a result, the internet is now embroiled in the greatest controversy of its short life, as governments turn a technology that was meant to bring us greater freedom into a tool for keeping an ever-watchful eye over their citizens.
Due to the way most people connect to the internet – via a Wi-Fi connection – many do not completely understand how the data they send gets to its destination. Playing video games on your Xbox One or posting a picture of your night out on Facebook requires a vast physical network of cables that provides the backbone of the internet’s infrastructure.
Modern fibre optic submarine cables are the foundations of this information superhighway. Installed along the seabed and linked together by an array of land-based stations, they carry digital information to its designated destination. One such destination is Cornwall, in the southwest of England. The cables that come ashore there carry approximately 25 percent of all internet traffic, and, unsurprisingly, the county is also home to GCHQ Bude. This satellite ground station is responsible for intercepting and analysing all the data that flows along these cables. It does so using what is known as fibre tapping: a technique that allows intelligence services to extract any and all information passing through the optical channels without interfering with the signal, thus avoiding detection. That was the case until June 2013, when The Guardian, with information supplied by Snowden, revealed the extent of the surveillance efforts by US and UK authorities. What made the revelations even more shocking was the method by which data analysis was being carried out.
Running the numbers
When the average person imagines the secret services engaging in surveillance, they picture a team of people working meticulously day in, day out to analyse and make sense of all the data that flows through the internet’s fibre optic pipelines. In reality, that would be impossible because of the sheer size of the datasets being dealt with. Each day we create 2.5 quintillion bytes of data, a number hard to wrap your mind around. In order to make some sort of sense of it, consider this: we now produce such an abundance of information on a daily basis that 90 percent of all the information in existence has been created in the last three years. So just how do the NSA and GCHQ manage to sift through all that? The answer is algorithmic surveillance.
Computer algorithms, not human beings, are responsible for sifting through all the bits and bytes. They are programmed to flag individuals based on certain criteria. For example, an algorithm may flag an individual who is caught searching certain terms the government deems dangerous or warranting further investigation. But there is a big problem with allowing computers to be responsible for picking out potential enemies of the state.
Data-rich algorithmic models such as the ones employed by the NSA have some serious limitations when attempting to correctly identify possible criminal or terrorist activity, because of the way in which they are programmed to screen data. Algorithmic surveillance systems look at all our data and metadata, watching everything from our emails to the words and phrases we search for on Google. They do so in order to build a profile based on parameters the government decides are potentially dangerous or worthy of suspicion. If a pattern that matches those criteria is found, it is flagged and passed on to senior officials, who can then take a closer look. But algorithms have limitations: they are good at mapping what, when and where we are doing things online, but not necessarily very good at figuring out why. The fact that algorithms have the potential to make false correlations from the mountain of information they screen should be a concern.
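To picture, in a deliberately crude way, the kind of pattern-matching described above, consider the short sketch below. It flags any record whose metadata contains a term on a watchlist; the watchlist, field names and sample records are all invented for illustration, and real surveillance systems are incomparably larger and more sophisticated.

```python
# Illustrative sketch only: a toy keyword-matching filter of the kind described
# in the text, not any agency's actual system. Watchlist terms and record
# fields are hypothetical.
WATCHLIST = {"pressure cooker", "wire transfer", "encryption"}

def flag_record(record):
    """Return True if any watchlist term appears in the record's fields."""
    text = " ".join(str(value).lower() for value in record.values())
    return any(term in text for term in WATCHLIST)

records = [
    {"user": "alice", "search": "best pressure cooker recipes"},
    {"user": "bob", "search": "weather in cornwall tomorrow"},
]

flagged = [r for r in records if flag_record(r)]
print(flagged)  # alice's innocent cookery query is flagged: a false correlation
```

Even this toy example makes the point about false correlations plain: the filter can see what was searched for, but knows nothing about why.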
If you log onto Facebook or browse eBay there will be a host of advertisements in web banners that display products and services tailored to you. They are based on the information gleaned by the same types of algorithms employed by government surveillance services. While many may be relevant, we have all encountered advertisements that make tenuous assumptions based on our online activities: ones that in no way relate to our consuming habits.
This may be harmless, because it is just marketing. But algorithms are also used to make decisions that have a greater impact on our lives. Individuals’ credit scores depend on algorithms; so does the manner in which people are treated when they travel through customs at the airport. But most worryingly of all, the US Government uses algorithmic surveillance to determine drone targets. Innocent people have died, in part, because of these complex mathematical models. In December 2013, the UN demanded the US disclose what its targeting protocol was after a fatal error killed 16 civilians in the al-Bayda province of Yemen. They were attending a wedding. Even though algorithms get it wrong, they are still used because there is no other way to satisfy the appetite the US Government has for sifting through the vast petabytes of information flowing through the internet.
Off the grid
Governments justify their obsession with our personal information by claiming it is a necessary evil that prevents acts of terrorism – but mistakes are clearly made. The other defence for this intrusion into our personal lives is that it is vital in combating cyber-crime, but activity of this kind does not take place on the same version of the internet that harbours cat pictures and viral videos. It occurs on what is commonly known as the ‘dark net’, where users are anonymous and the data they exchange with each other is almost impossible to track.
Access to the dark net is permitted through a piece of software known as a Tor Browser, which works in much the same way as Google Chrome or Mozilla Firefox, with one small difference: “It’s actually a browser to access the full internet – you just do so anonymously”, explained Andrew Lewman, Executive Director of the Tor Project, in an interview with the BBC, “and it puts the user in control of if they want to deanonymise themselves, to log into places like Google and Facebook. The Tor Network is a network of about 6,000 relays, which are servers spread around 89 countries or so. And what we do is relay your traffic through three of these relays in sort of a random order, so that where you are in the world is different to where you appear to come from.”
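As a rough sketch of the three-hop relaying Lewman describes, the snippet below picks an entry, middle and exit relay at random from a pool and wraps a message in one layer per hop, so that each relay only ever learns its immediate neighbours. The relay names are placeholders; real Tor relies on directory authorities, guard nodes and genuine layered encryption, none of which this toy code attempts.

```python
# A deliberately simplified, hypothetical sketch of Tor-style three-hop routing.
import random

RELAYS = [f"relay-{i:04d}" for i in range(6000)]  # roughly 6,000 relays worldwide

def build_circuit(relays, hops=3):
    """Pick a random circuit of distinct relays."""
    return random.sample(relays, hops)

def onion_wrap(message, circuit):
    """Wrap the message so it is peeled one hop at a time, exit relay innermost."""
    wrapped = message
    for relay in reversed(circuit):
        wrapped = {"deliver_to": relay, "payload": wrapped}
    return wrapped

circuit = build_circuit(RELAYS)
print("circuit:", circuit)
print("wrapped:", onion_wrap("a message that says nothing about its sender", circuit))
```

Because the sender chooses the circuit, no single relay sees both who is talking and what is being said, which is what makes traffic so hard to trace back to a particular location.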
Due to the anonymity the dark net provides, it is a hotbed for illegal activity. It is the home of child pornography and black markets selling everything from fake IDs to stolen credit cards. There are even areas of it where users can pay to have someone killed. The largest markets, however, are concerned with the buying and selling of illegal drugs, and there are a number of reasons for this. The most obvious is the virtual market is far safer than its physical counterpart: both sides are able to remain anonymous and the deal is not done on the street, reducing the potential for violence. Direct transactions also cut out the middleman, which ensures the product’s purity remains high.
The Silk Road 2.0 was one of the largest marketplaces for the buying and selling of illegal drugs, until authorities closed it down in November. Those using the site thought they were beyond the grasp of law enforcement, until the UK’s National Crime Agency released a statement saying it had dealt a major blow to the dark web: it claimed to have located the technical infrastructure of a host of illegal sites, enabling them to be shut down. But rather than delivering a knockout blow, this proved to be only a small setback.
Anyone who is familiar with illegal video streaming sites will know that, as quickly as the authorities can locate the servers and shut one down, within a few days an identical site will have popped up to take its place. The authorities are fighting a losing battle. The dark web is a game of cat and mouse in which law enforcement will always be reacting and, therefore, always one step behind.
Why we wear a mask
But the dark web and the Tor network are not just a cesspit for criminals; they are also a safe house for whistleblowers, journalists, activists, and those who just want to control their personal data. “Tor is spun out to be some big bogeyman to scare people”, Lewman told the BBC. “However, the average person is mostly worried about spreading their information online, unscrupulous advertisers taking advantage of leaked data… and for the same reason that you close the door to go into your house, some people just don’t want to leave a trail of data where they’ve been on the internet.”
Anonymity is a mixed bag. On the one hand it leads to drug dealers and paedophiles being able to get away with their illegal activities. But on the other, as in the case of the documents Snowden leaked, it acts as a safeguard, protecting our civil liberties, or at least allowing us to be aware that they are being threatened. The vast majority of us do not use the internet for malicious purposes, but just because a number do, does that justify mass surveillance programmes such as the ones carried out by the NSA and GCHQ?
“If we want to change the way that mass surveillance is done, encryption, it turns out, is one of the ways we do that”, said computer security researcher Jacob Appelbaum in the BBC documentary Inside the Dark Web. “When we encrypt our data, we change the value that mass surveillance presents.” But this cryptographic technology will only be developed and refined if we create a market for it. It is time to make our voices heard. The sanctity of the internet is something we all have a stake in. It is time we hid ourselves.
Leading toymaker Mattel has announced that Bryan Stockton, its chief executive of three years, has resigned after the company racked up disappointing figures over the holiday season. The decision follows a succession of similarly shaky results, and the company today faces a fifth consecutive quarterly decline in sales.
The company today faces a fifth consecutive quarterly decline in sales
Mattel’s profits for the 2014 holiday season were down 59 percent on the previous year, to $149.9m, while sales slumped a further six percent to marginally below the $2bn mark. With smaller competitors such as VTech gaining market share and Lego succeeding in markets outside the traditional toy space, old dog Mattel must learn some new tricks or risk falling by the wayside.
Mattel’s quarterly statement shows that Stockton’s efforts to guide the toymaker through a time when children are choosing consumer electronics products ahead of the more traditional Barbie and Fisher-Price brands have come to nothing. Whereas in 2009, Barbie accounted for a quarter of the American market for dolls and accessories, the figure fell below 20 percent in 2013 and has been on the slide since, due largely to a lack of innovation and changing cultural sensibilities.
The changes to the executive leadership team will take effect immediately, with Chief Brands Officer and President Richard Dickson and Tim Kilpin taking on expanded responsibilities, and longtime board member Christopher Sinclair taking the reins as Chairman and Interim CEO. “Mattel is an exceptional company with a great future but the Board believes that it is the right time for new leadership to maximise its potential”, said Sinclair in a company statement. “I look forward to engaging with the entire Mattel community as we work to deepen our connections with children and parents through expanded product innovation and improved retail execution. We will be working during the coming months to revitalise the business and to identify the right leadership for Mattel as it enters its next phase of growth and value creation.”
Breaking up the banks has been a topic of heated debate for the last seven years, ever since financial institutions deemed too big to fail exacerbated the global financial crisis of 2008. The subsequent bailouts led to years of belt tightening by governments and considerable resentment from taxpayers towards the financial industry.
Despite a period of relatively strong growth for financial institutions in the US, there have been calls in recent weeks for the biggest banks to be split up, so that any potential meltdown will not require such a heavy state intervention. According to a recent poll by the Progressive Change Institute, 58 percent of voters would like to see the larger banks broken up into smaller institutions.
Banks have responded vociferously against further regulation
A spokeswoman for Citigroup, the third-largest bank in the country, responded to the poll in a statement: “Since the financial crisis, Citi has returned to the basics of banking, has sold over 60 businesses and divested of more than $700bn in assets.”
The idea is attractive to taxpayers who want to see their deposits in banks unaffected by the higher-risk operations that the bigger banks also conduct. Some politicians, including Democratic Senator and potential presidential candidate Elizabeth Warren, have called for Dodd-Frank regulations to be tightened and for the influence of banks over government to be curtailed, citing Citigroup specifically.
In a speech on the floor of the Senate in December, Warren said, “Many Wall Street institutions have exerted extraordinary influence in Washington’s corridors of power, but Citigroup has risen above others. Its grip over economic policymaking in the executive branch is unprecedented.”
However, banks have responded vociferously against further regulation, with JPMorgan Chase’s CEO Jamie Dimon describing the industry as being “under assault” recently. He also said moves to split up its operations would “damage the franchise”.
The world of television is undergoing a belated transformation thanks to the internet. While many other industries have been quick to utilise online services to change the way they operate, television networks around the world have been resistant to change. However, with the advent of services such as Netflix, as well as the apparently imminent launch of an Apple-branded smart television, media networks are starting to realise they need to embrace the online world if they are to survive.
The traditional model for making money from television through advertising has been shaken by the advent of subscription services. At the same time, maintaining an accurate analysis of people’s online viewing habits has proven difficult. The dominant player in audience metrics has been US firm Nielsen, which has aided both television networks and advertising agencies in their quest to know the habits of viewers, allowing for targeted adverts during specific programmes. However, with the advent of online viewing, it has become harder to tailor specific adverts around shows as people choose to watch them at different times.
Nielsen has traditionally measured audience information through a mixture of diaries kept by, and devices attached to the televisions of, a sample of viewers. Across the US, the devices record what a range of people watch, giving television networks an idea of which shows are most popular, and in turn providing ad agencies with vital information about the sorts of programmes their target audiences watch.
50%
Share of media purchases IPG hopes to automate by 2016
Online shakeup
While the way in which people watch television has remained largely the same over the last half-century, the shift online over the last few years has presented the industry with a problem. With people watching shows on demand and on new devices, there hasn’t been a unified platform that allows traditional networks to measure the ways people are engaging with their content.
Services such as Netflix and Amazon Prime now allow viewers far greater flexibility in how they watch programmes. Not only are people watching shows on demand, but the traditional form of a family sitting down to watch the same shows has also declined. People are far more likely to slip off to their own rooms with their various internet devices to watch what they want, than to put up with the democratic, middle-of-the-road choices that so often occur in a shared living room. Whereas a simple television was once the single method of watching programmes, now viewers are able to use tablets, laptops, desktops and their phones.
Nielsen has therefore come up with a new platform, in partnership with software giant Adobe, that will measure digital video across these different devices. The partnership is one that seems a decent fit, with Nielsen being the dominant television ratings agency and Adobe having large amounts of information on online users.
Both companies have stressed how important the new platform is. In a statement announcing the news, Megan Clarken, Nielsen’s Executive Vice President for Global Product Leadership, said: “This alliance is expected to accelerate the adoption of consistent and comprehensive measurement in digital. By integrating our technologies, together we’ll be able to offer our customers a more seamless and efficient way to plan and deliver against their audiences.”
Adobe’s Senior Vice President and General Manager of Digital Marketing, Brad Rencher, added: “Online TV consumption is at an all time high and Adobe and Nielsen are two leaders coming together to standardise audience measurement for digital content. Major media companies and broadcasters already depend on Adobe to bring TV across screens and better understand digital viewer engagement. Once complete, our partnership with Nielsen will provide analytics tied with ratings – benefitting advertisers, media companies and consumers alike.”
Long overdue
Analysts believe that the move to a digital platform for Nielsen was long overdue, with Luca Paderni of research firm Forrester telling Reuters: “A lot of Nielsen clients were complaining and pushing them over the past years that its methodology was not for digital devices.” The move has been widely supported by the industry. Artie Bulgrin of leading US sport channel ESPN said in a statement: “One of the challenges in digital measurement has been the lack of alignment between site analytics and syndicated measurement data, and we will be working with Nielsen and Adobe to help resolve this.”
Turner Broadcasting’s Chief Research Officer Howard Shimmel was also enthusiastic about the need for a platform that matched the new ways in which people were watching media content: “As consumers expand their video consumption across screens, the media industry needs stronger digital and cross-platform measurement to accurately track consumers and better monetise cross-screen audiences. Adobe’s strength in analytics and history in bringing together video and complementary content across platforms, combined with Nielsen’s strong audience measurement capabilities, will accelerate development and adoption of a single digital currency, which is what the industry needs.”
It was also warmly embraced by ad agencies, with IPG Mediabrands’ CEO Matt Seiler saying: “The ability to provide metrics to measure audiences accurately, allowing IPG Mediabrands to better allocate marketing dollars, across every major IP device is an important step in our quest to automate 50 percent of our media buys by 2016.”
Streaming services
It is clear the television industry is undergoing a dramatic change, with online being the key battleground for networks and programme makers. Networks are seeing upstarts such as Netflix and Amazon encroach on their territory, offering original content. At the same time, Apple is rumoured to be on the verge of entering the smart TV market, overhauling the way in which people choose the shows they want to watch.
A big marker of this shift is the recent news that cable network HBO will be making its popular HBO Go digital app independently available. Until recently, the company had been opposed to allowing people to access on-demand digital versions of its programmes without a subscription to the cable service. However, after considerable demand from users – as well as the dramatic surge in popularity of standalone services such as Netflix – HBO has decided to cut the cord. CBS has since followed suit, while other networks are thought to be considering similar moves.
Services that already exist purely online have welcomed Nielsen’s move towards measuring digital content. Crackle, an ad-funded streaming television and film service, was particularly keen. The firm’s Eric Berger said in a statement: “Crackle is the only premium ad-supported network that lives purely on over-the-top devices and there is tremendous value in understanding how people are consuming content. Being a part of the initial rollout will enable us to present real-time engagement metrics, allowing for advertisers to understand the true return on their investment and match our growing audience on Connected TV with blue chip brands at scale.”
It may seem to have been a long time coming, but the television industry is finally getting round to realising the potential of the internet. Networks that have traditionally relied on advertising revenues for their funding have found that standalone subscription services that live solely online are presenting viewers with a far more compelling and flexible service. Nielsen’s move online could be the thing that helps networks fight back.
Tackling Climate, Development and Growth
Chaired by Christine Lagarde and attended by a number of leading names in industry and academia, the panel looked at what resources and commitments are needed to tackle climate change, development and growth. “With an aggressive move towards clean transport and greater energy efficiency policies, we could [boost] the global economy by up to $1.8tn to $2.6tn per year”, said Jim Yong Kim, President of the World Bank. And all in attendance agreed that a commitment to sustainability on all fronts would bring development and inclusive growth.
François Hollande
The French president took to the stage to speak about the country’s response to the Paris attacks and to highlight the country’s unity and tolerance in the days and weeks that followed. “It is the very foundation of our society which has found itself under assault”, he said. “Every time the world lets a conflict linger, terrorism flourishes.” Hollande also took the time to welcome the ECB’s decision to introduce a QE programme, and said: “It compels us to be more daring – to release the brakes on growth.”
India’s Next Decade
India has recently regained some of its former promise, and leading experts gathered at Davos to discuss how the world’s most populous democracy can realise its potential. “The Indian tortoise can take over the Chinese hare. If reforms are implemented India can achieve a growth rate of six percent to seven percent in the next two years”, said Nouriel Roubini, Professor of Economics and International Business at the Leonard N. Stern School of Business. The panel agreed that growth will only come if the government stays the course and does not compromise on its agenda.
The Russia Outlook
Russia’s economy has been hit by western sanctions, a flailing currency and falling oil prices of late, and only by introducing much-needed structural reforms and shifting towards a more liberalised political system will the country avert a crisis. “The government and the state shouldn’t work in the oil and gas market as it does today”, said Alexei Kudrin, Professor and Dean of the School of Liberal Arts and Sciences, Saint Petersburg State University. “One of the problems is that state corporations were not successful in creating an innovation economy.”
The Global Economic Outlook
Leading figures in world finance gathered on the last day of Davos 2015 to discuss what should feature at the top of the agenda in the coming year. Panellists agreed that the economy is better positioned today than it was a year ago and started off by welcoming the ECB’s QE programme, though highlighted the importance of reform in Europe. Falling oil prices also featured in the discussion, as did Japan’s stimulus package, US economic performance and the role of technology in the real economy and in the financial sector.