DevOps Live: "DevOps is not a checklist"

By Tom Allen | News | 18 September 2018
Panellists were Henrik Lynggaard (Maersk), Natasha Wright (Accenture), Ryan Bryers (Worldline) and Rick Allan (Zurich Insurance)

Defining DevOps as a term is hard; defining the process is even harder

Just like there is no single definition for the term ‘DevOps' - or not one that everyone agrees on - there is also no one approach to its implementation.

During a session dedicated to DevOps tools and strategies at Computing's DevOps Live North event in Manchester today, panellists spoke about their own experiences of beginning the DevOps process - agreeing that trying to follow a rigid checklist is a futile exercise.

Ryan Bryers, CTO and CIO at Worldline, admitted that, "As much as pure techies would like to, we haven't taken the purest approach and made everyone move to DevOps."

Worldline has debated DevOps rollouts at many levels, and has spent several years seeing what works.

"Now we're pausing to say ‘What is the dev strategy across the business?'" Bryers continued. "We could roll out one way of working and seven per cent of people would adopt it, so we know we need to put out three or four options that we're happy with for people to adopt. It certainly isn't going to be ‘This is the one way'."

Natasha Wright, DevOps manager at Accenture, had a similar experience. She said that one of the first things the company did when moving into DevOps was to create a list that they thought contained all of the essential components. "Actually, as that conversation and journey has progressed," she said, "we've realised that completing things in a checklist isn't really what DevOps is."

Accenture is a professional services firm, and has also experienced this tendency among its clients:

"We worked with a partner recently who had this really great Powerpoint deck of all of these things that should be automated, and they said, ‘We're not doing this so we're not doing DevOps'. Actually, they were completely forgetting that DevOps is all about doing things quickly, it's not necessarily about automating everything or having the right tools.

"When we talk about the things that we won't do [around DevOps], it's not a case of ‘You must do these things in this checklist to conform to this DevOps agenda'."

Making better use of your DevOps rock stars

By John Leonard | News | 18 September 2018
Early success can go to the head

DevOps teams need to share the secrets of their successes before the methodology is rolled out, says Rob Vanstone of XebiaLabs

The introduction of DevOps in an organisation tends to follow a pattern similar to the familiar Gartner hype cycle. Early promise gives rise to inflated expectations, which inevitably lead to a fractious descent into the trough of disillusionment before, hopefully, lessons are learned and acted upon to bring real productivity.

In DevOps terms, the first step is often a sudden local increase in productivity in one part of the technology department, brought about by the introduction of CI/CD, followed by the automation of low-hanging fruit and a doubling or tripling of the frequency of releases.

Suddenly you have a team of rock stars on your hands with egos to match. They are going to transform this dinosaur business into a sleek, modern, flexible and agile entity fit for the 21st century.

Then, as the methodology is scaled up, the problems start. Some teams are resistant to having their operations disrupted, legacy applications prove hard to change, methodologies turn out to be less transferable than had been thought, automation actually results in more work, people leave, the compliance and security people start pushing back - and now you're back where you started, or worse.

Avoiding this scenario, or at least smoothing the curve to avoid hitting the ‘scalability wall', was the subject of a talk by Rob Vanstone, technical director UK at DevOps intelligence and automation vendor XebiaLabs, during Computing's DevOps Live event in Manchester on Tuesday.

There are four common factors that adversely affect DevOps and Agile transformations, Vanstone said: a ‘Waterfall' mindset across the business which is out of step with Agile; a tendency in management to prefer incremental change over transformation; a false sense of security among the rock stars; and low priority being given to QA and testing.

These issues are frequently only faced once the trough of disillusionment has been plumbed, but is there a shortcut to this organisational learning? Could, perhaps, the initial stage of creating a specialist DevOps team to tackle quick wins and demonstrate the potential of DevOps and Agile be bypassed "in favour of involving all teams and doing it properly from the off?" asked a member of the audience.

This depends very much on the leadership, said Vanstone.

"Winning breeds winning so I completely understand the pattern of the rock star team, but sometimes that rock star team becomes the new unicorn within its own organisation," he said.

Instead of having the DevOps team focused solely on creating highly visible success stories, they should be encouraged to share their knowledge at an early stage, Vanstone said.

"Implementing the change in other areas is the key thing to do here. If you don't do that you end up with another silo, the DevOps silo, so there needs to be a visible way of getting that information out. So there's certainly a role for showing off these poster children, showing it's not just the realm of the Amazons and Etsys but you really need the structure and leadership to make it work."

Above all, an engineering, problem-solving mindset needs to be promoted across the organisation, Vanstone said.

"Build, measure, learn. That's the mantra that works everywhere."

DevOps: 'Test everything, everything is testable!'

By Stuart Sumner | News | 18 September 2018
DevOps can help organisations leap over some development hurdles

Aubrey Stearn, interim head of DevOps at the Solicitors Regulation Authority, explains how to ensure your DevOps culture results in secure releases

"Test everything, everything is testable!"

That was one of the opening statements made by Aubrey Stearn, interim head of DevOps at the Solicitors Regulation Authority, in her talk at today's DevOps Live North event from Computing.

Stearn explained how to engender a DevSecOps strategy, discussing the entire development cycle and finishing with some general tips on how to foster a DevOps culture.

"Over the last ten years we've seen a huge increase in the adoption of open source software, with more people coding, and more people writing software. And there's a huge focus on doing a better job, and trying to tackle security earlier," she said.

Stearn described the large number of different attack vectors in any piece of software, starting with the machine on which the code is first written.

"It all starts with your machine. Even if it's a brand new machine and you've just got it out of its box, it may already be compromised. Then there's your development package like NPM, NuGet or GEMS, we've seen each of them compromised."

She also discussed Docker, stating that in 2016 nearly 80 per cent of images hosted on the service had high vulnerabilities.

She said that the first chance for coders to "save themselves" is in testing code pre-push.

"You have to test pre-push. You can't push code unless testing coverage is 100 per cent. Then your second chance is when pushing it to GitHub. You can run SNKY or Greenkeeper for vulnerability scanning. And lock down your packages so they're a fixed version. They should be immutable. If you're constantly pulling in new stuff, you don't know if you're safe."

The next opportunity to find problems, she added, is in the pipeline.

"You can do a code analysis, look at the quality, testability and maintainability of the codem and if there are any known vulnerabilities in it. Clair for example is a Docker scanner. If you've got containers running around with known vulnerabilities, it'll tell you."

She also advocated Detectify once code is released into what Stearn described as "the scary real world" of AWS, Azure, Google or others. "Detectify will tell you if there are any deltas between what should be there, and what's actually there," she said.

Stearn continued: "One of the fundamental problems with Docker is the lack of mantra in their journey to have testing in the container. Some work really needs to be done around that."

Firms should also look at a content security policy, she argued.

"That lets you say what content should be pulled and from where, and all the major browsers support it. If you're in a position to say where data should be coming from, you absolutely should be using a content security policy."

Stearn went on to give some general tips around implementing a DevOps and agile culture.

"Agile is a way of accruing credit based on the things you build. You can't do transformation without trust, it's like a currency. You build confidence in the things you're running and you can use that on experiments. So you'll then have the autonomy to do the things you want to do."

She recommended smaller builds where possible.

"I like the deconstruct challenge, so you do one week rather than two week sprints. You need to tell a story, so you get something new out every day, and by the end of the week you have the full story told. Devs get off on that, they don't want to get to the weekend with things still hanging over them."

Stearn added that having successful releases to show the rest of the business can be invaluable.

"If you want to make progess, and if building something like a Docker container, this is a piece of work I can popularise and PR the hell out of. My team built a Docker container, which means I can show the business that it's something we can consistently deploy over and over again. That gives me credits I can use to go do something else."

And when showing progress to the business, prioritise the C-suite, she said.

"C-level buy-in is incredibly important. Sometimes you need smaller projects to earn trust with your company, to show you've been successful working at a smaller scale.

"The key is to have the execs come down to you, to look at way you're working and see why it's succesful. Also imagine how powerful it is for your team to have someone from the C-level come down to your team to say 'Hey I've heard you're being successful I want to see it for myself.' That's so powerful," she concluded.

Switching to the cloud has lowered wait times, increased visibility and freed up resources at Stockport County Council

By Tom Allen | Interview | 18 September 2018
In the age of austerity, self-service and automation are key

The SCC service desk gets about 5,000 tickets a month, and going online has saved time for customers and agents

 

Like many local government bodies across the UK, Stockport County Council (SCC) has a significant installed base of legacy technology. In the past, that also translated into legacy thinking - but the Council is on a journey to refresh its systems and its approach to IT.

Dan Simmonds is the IT service desk manager at SCC. Although much of his career has been spent in the public sector, at organisations including the Science Museum and the Metropolitan Police, he is applying his private sector experience at firms like the Louis Vuitton Group to his current role.

Trying to get the Council thinking in "a more private sector way" has been "a real challenge," he says, but things have been changing:

"The workforce now is a lot more fluid…[and] we're able to move people around more easily. It's never going to quite the same, but [SCC] is moving a little bit more towards the private sector [way of working]."

Our system was so complex… It just wasn't fit for purpose any more

A major project has been upgrading the service desk to use Freshservice, an entirely new system that replaced Marval.

"We'd let it get out of date," says Simmonds. "We'd never applied any updates to it, or seen any of the more recent versions of the product.

"Our system was so complex - you couldn't get any data from it. It just wasn't fit for purpose any more. That became obvious from about a month in. It was clear that I wasn't able to really tell how we were doing or what we were doing."

After meeting several vendors, Simmonds and his team chose to move forward with Freshservice, which they could use to enable self-service.

"We didn't have enough staff to deal with every single ticket within an SLA, so we wanted to push self-serve more, and we wanted to get better data on the ticket that was coming in; as it came in we wanted to have all of the information available for us to do the job, not to have to go backwards and forwards to a user to ask more questions.

"Because that was the big driver for it, we felt that the UI was one of the most important things, and that's why we went with it. It was very easy to set up and get out there for the users, and cost-wise it was where it needed to be."

Freshservice now routes service desk tickets directly to the correct team, based on the information that a user gives when filing it.

As well as self-serve, the service desk team wanted to adopt automation. This is perhaps more important in the resource-limited public sector than it is in private industry: government austerity is a big challenge for Simmonds.

"We get 5,000 or so tickets coming in a month, and we need to have the staff to be able to resolve them all. We need to have the resources in place to get users to self-serve, or we need to be a lot slicker with our automation."

I was working blind and we were trying to make decisions based on what we believed to be the case

After WannaCry ("It certainly focused the mind a little bit!") last year, the service desk team rolled out several new automations. One is designed to handle emergencies: if a user ticks the bespoke ‘major incidents' field, "a whole series of events go off and email various heads of service and senior managers in IT." Others will track and report on filed security incidents.
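The shape of such an automation is simple. A hypothetical sketch follows - invented field names and addresses, not Freshservice's actual webhook format:

```python
"""Hypothetical sketch of the 'major incident' automation described above:
a ticket event arrives as JSON and, if the major-incident flag is set,
an email goes out to senior IT managers. All names are invented."""
import json
import smtplib
from email.message import EmailMessage

ESCALATION_LIST = ["head.of.service@example.gov.uk",
                   "it.senior.manager@example.gov.uk"]

def handle_ticket_event(payload: str) -> None:
    ticket = json.loads(payload)
    # Invented field standing in for the bespoke 'major incidents' tick-box
    if not ticket.get("major_incident"):
        return
    msg = EmailMessage()
    msg["Subject"] = f"MAJOR INCIDENT: {ticket.get('subject', 'no subject')}"
    msg["From"] = "servicedesk@example.gov.uk"
    msg["To"] = ", ".join(ESCALATION_LIST)
    msg.set_content(ticket.get("description", ""))
    with smtplib.SMTP("localhost") as smtp:  # needs a local SMTP relay
        smtp.send_message(msg)

if __name__ == "__main__":
    handle_ticket_event(
        '{"major_incident": true, "subject": "Network outage", '
        '"description": "Building A offline"}'
    )
```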

The tracking, reporting and analytics tools of Freshservice have hugely increased visibility into SCC's systems: a big change from the Marval days.

"We couldn't get any real data from our old system, so I was working blind and we were trying to make decisions based on what we believed to be the case. Now, using the analytics tools... I can break it down by team, by ticket type, by priority, by time of day: anything I want, really… It's given us a better understanding of our workload."

The reaction to the new system, which is cloud-based and hosted on AWS, has been very positive. Users raise about 70 per cent of service desk tickets through the online portal, which has only been live for about 10 months - a big shift from the old system, and something that Simmonds calls "a massive positive."

SCC plans to continue its digitisation journey, with a move to Office 365 and Sharepoint. That is going to impact the service desk, but Simmonds is undeterred: he is already planning to integrate Microsoft System Center Configuration Manager and the existing Cisco phone system into Freshservice.


5G news: 5G could mean the return of carrier-locked phones

By Tom Allen | News | 17 September 2018

The fragmentation of 5G will make it difficult to move between networks

5G is the latest networking technology for connected devices, promising speeds up to 100 times faster than 4G, and 10 times faster than home broadband.

It's not all about speed, though: a more ubiquitous and more responsive network (with latency between 1ms and 10ms, compared to 4G's 40-60ms) is also planned, enabling connected devices like autonomous vehicles, drones and even artificial limbs.

That network will be enabled by new infrastructure, which will need to be installed across the country over the next few years. This technology will operate at a higher frequency than existing networks, enabling the speed and bandwidth of 5G but limiting its range - requiring many more base stations to cover the same area as a single 3G or 4G mast.
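The range penalty follows directly from free-space path loss, which grows with frequency. A back-of-the-envelope calculation (our illustration, not a figure from the article) shows the gap:

```python
"""Back-of-the-envelope free-space path loss, showing why higher-frequency
5G signals need many more base stations to cover the same area."""
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    # Standard free-space path loss: 20log10(d) + 20log10(f) + 20log10(4*pi/c)
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

# The same 500 m link at a typical 4G band and a millimetre-wave 5G band
for label, f in [("1.8 GHz (4G)", 1.8e9), ("28 GHz (5G mmWave)", 28e9)]:
    print(f"{label}: {fspl_db(500, f):.1f} dB")
# Roughly 24 dB more loss at 28 GHz over the same distance - before even
# counting the poorer wall and foliage penetration at millimetre-wave bands.
```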

5G phones won't be launched until next year, and the infrastructure isn't expected to be in place before 2020 at the earliest, although there are plenty of trials taking place, as well as a spectrum bidding war.

Check back here frequently for all the latest 5G news.


17/9/18 - 5G might mean a return to carrier-locked devices, at least in the short term, says AT&T. Gordon Mansfield, VP of radio networks and device design, told PCMag that the first wave of 5G phones will not be transferable between networks.

That is because 5G, as a launch technology, is still very fragmented, with carriers using a variety of frequencies for their signals. Unless two networks use the same frequencies, or early smartphones support multiple bands (unlikely), then customers won't be able to leave a carrier and take up with a rival using the same device.

Thankfully, Mansfield doesn't expect that situation to last long.

12/9/18 - EE is switching its 3G signal in more than 500 mobile towers to 4G, mainly in cities like London, Birmingham and Edinburgh. 5G sites will be built on top of the converted sites, preparing the way for the company's launch of 5G services next year.

EE says that 3G use is falling ‘rapidly', with customers now making more calls on 4G than 3G for the first time.

12/9/18 - Orange Poland and Huawei have launched a station in Gliwice that supports 5G technology. The station will be used for 5G tests - the first to be done in Poland outside lab conditions.

4/9/18 - The West Midlands is to host the UK's first large-scale 5G test-bed, with connectivity hubs planned for Birmingham, Coventry and Wolverhampton as part of the government's Urban Connected Communities Project. The West Midlands Combined Authority (WMCA), which will oversee the work along with the Department for Digital, Culture, Media and Sport (DCMS), said that the trials will support the work of organisations in the health, construction and automotive sectors.

29/8/18 - Juniper Research predicts that there will be 1.5 billion 5G connections worldwide by 2025, with initial growth driven by fixed wireless access to replace or complement current broadband connectivity.

The firm expects almost half (43 per cent) of 5G connections to be in Japan and South Korea, whose operators are leading in testing, partnerships and innovation, by 2025.

28/8/18 - The European Union has granted Nokia a €500 million loan to work on 5G R&D. The money comes from the European Investment Bank (EIB) and the European Fund for Strategic Investments (EFSI).

Nokia has not yet announced what it will use the money for. However, EIB president Alexander Stubb told Finnish newspaper Helsingin Sanomat that Europe must stay level with China and the USA: the world's other major 5G players.

21/8/18 - The UK government is looking for companies that will take part in a trial of 5G in the transport sector. Specifically, the Trans-Pennine Initiative (TPI) will evaluate use of the technology on the Manchester-Leeds rail route. Companies that take part will be able to install their radio equipment on the route without investing in track-side infrastructure.

The government wants connectivity on all trains in the UK to be above 1Gbps by 2025.

20/8/18 - Nokia and Verizon have successfully completed a trial of 5G NR mobility call technology in a vehicle. The call, on the 28GHz spectrum, was sent from a Nokia building to a receiver on a moving vehicle, which travelled between two 5G radios on the building using the 3GPP New Radio standard. The companies say that the trial was 'seamless'.

6/8/18 - Three is preparing for its 5G rollout, with plans for trials later this year in partnership with Huawei. In its recent financial results, the company said that it had invested £125 million into upgrading its network and IT systems in the first half of 2018. Based on the company's current growth rate and Ofcom's predictions about mobile data use in the UK, Three is going to need to increase its capacity 20-fold by 2025.

2/8/18 - Audi and Ericsson have agreed on plans to work on 5G for automotive production. The companies have signed a Memorandum of Understanding, and experts from both will run field tests at a technical centre of the Audi Production Lab in Gaimersheim, Germany, over the coming months.

As well as the 'smart factory' using 5G, Audi and Ericsson are exploring the use of the technology at other plants.

31/7/18 - The AutoAir project, by Airspan Networks, is deploying 'advanced' 5G networks at the Millbrook Proving Ground in Bedford to test connected and autonomous vehicles (CAVs). Airspan says that the work will deliver pervasive 4G and 5G through a network of base stations and will focus on issues such as cell tower handoffs for CAVs.

The project was one of six to receive funding from the government earlier this year. 

27/7/18 - Nokia is confident that 5G will save its financial bacon. The company has said that its poor Q2 results (with operating profit down 42 per cent YoY) will recover following accelerating 5G network rollouts in the USA later this year.

26/7/18 - O2, through its parent Telefonica UK, has invited every company in the FTSE 100 to take part in its 5G Testbed trials. O2 wants to understand the processes and use cases where these firms could use 5G - and, of course, it would help the operator manoeuvre them towards its own 5G network. Sky is the first company to accept the offer.

23/7/18 - Huawei has claimed that continuous large bandwidth (100MHz per operator) is the "golden spectrum" for 5G business success. It says that countries with insufficient C-band spectrum can allocate 100 MHz of continuous large bandwidth on TDD 2.6/2.3 GHz to each operator, which will 'improve investment efficiency, while helping to prepare for an evolution towards high bandwidth 5G'.

The company was speaking at the fourth annual Asia-Pacific Spectrum Management Conference.

19/7/18 - Orange has extended its practical 5G trials in France. As part of the trials, Orange will install a demo area in its Opéra megastore in Paris; give its partners an opportunity to test 5G services like 4K video streaming in Châtillon; and work with 5G in vehicles at a facility in Linas-Montlhéry.

9/7/18 - Three has signed an agreement with SSE Enterprise Telecoms to connect its network to 170 BT exchanges, and an additional 270 in the future. This will mean that Three's mobile network reaches thousands more masts, and lays the groundwork for a future 5G rollout.

3/7/18 - Orange Romania, with technology partners Samsung and Cisco, is performing a 5G multi-vendor Fixed Wireless Access (FWA) test in real environment conditions in Florești, Cluj county. Fifteen residential customers, plus the Carrefour Market and City Hall in Florești, are using Samsung 5G terminals (with 26GHz mmWave technology) and Cisco routers for internet access. The results have been 'similar or better' than those of the existing installed systems.

2/7/18 - Dave Dyson, CEO of Three UK, has said that he wants to focus on providing a reliable and high-quality service to customers, rather than a fast 5G rollout.

"Being first doesn't matter to me," he said to City A.M. "What matters is that when we do launch 5G we give customers a reason to join us. We won't be out this year making claims about a 5G launch, which some operators have done.

"But we will spend between now and the middle of next year making sure whatever we launch has the right quality of service."

He dismissed 5G trials by other operators as "a PR exercise."

29/6/18 - Huawei will launch a 5G phone in 2019, it announced at MWC Shanghai this week. The company will use its own modem and processor in the handset, which will be launched before the end of June next year, instead of one made by an external firm like Qualcomm.

29/6/18 - Finnish carrier Elisa is launching the 'world's first commercial 5G network' - joining Qatar's Ooredoo, which claimed the same thing last month.

There are, however, a couple of differences between Elisa's and Ooredoo's implementations. Elisa is already selling subscriptions and used a Huawei device to make a 5G phone call between Tampere and Tallinn, the two locations where the network is active. Ooredoo is offering 5G broadband hardware now and mobile devices next year.

27/6/18 - Qualcomm and Chinese phone maker Vivo are working together on new 5G antenna technology, integrating both sub-6GHz and 28GHz (mmWave) antennas into a commercial smartphone. They tested performance at Qualcomm's antenna lab in San Diego.

Low frequencies (such as sub-6GHz) provide wide coverage, while high frequencies enable higher speed and lower latency.

20/6/18 - Vodafone has announced that it will start 5G trials later this year, at more than 40 sites in Birmingham, Bristol, Cardiff, Glasgow, Liverpool, London and Manchester. The company is also talking to enterprise customers, to test 5G applications like AR and VR in offices, factories and hospitals. The trials will begin between October and December.

19/6/18 - Only just over half (53 per cent) of mobile internet users in the UK are satisfied with the speeds they are getting, a survey by IHS Markit company RootMetrics has shown. These findings 'counter industry cynicism about the need for 5G investment', says the firm.

Almost 80 per cent of respondents said that they would pay more for higher speeds and reliability.

14/6/18 - 3GPP has approved the standalone (SA) specifications for 5G Release 15, complementing the non-standalone release from last November. The SA release enables independent deployment of 5G New Radio, but does not include the extremely low 0-5ms latency that has been hyped as one of the technology's main draws.

13/6/18 - Scientists at the Tokyo Institute of Technology have developed a tiny 28GHz transceiver, designed to be capable of establishing a stable high-speed 5G connection. The device, in a 4mm x 3mm circuit, uses beam steering to guide the main lobe of the radiation pattern, instead of RF phase shifters.

12/6/18 - Ericsson, in its new Mobility Report, claims that 5G networks will be launched this year, but that Europe will lag behind the rest of the world. Half of all mobile connections in the US will be 5G by 2023, it says, along with a third in North East Asia - but only a fifth in Western Europe.

The company says that regulation and competition will hold 5G back in Europe, while others have blamed market fragmentation and a prolonged focus on 4G.

Also today, Ofcom has announced the radio spectrum bands that it thinks should be prioritised for global 5G use. They are the 26GHz, 37-43.5GHz and 66-71GHz bands, as well as potentially 32GHz. The issue will be discussed at the WRC-19 conference next year.

6/6/18 - Deloitte has released its report, conducted for the Department for Digital, Culture, Media and Sport, on the economic and social impacts of 5G. The report concludes that mobile broadband, especially at 5G speeds, has provable economic and transformative benefits, but also that the technology needs further development.

6/6/18 - EE will launch a live trial of 5G services this year. The company will push the scheme live at 10 locations in London's Tech City area this October. Marc Allera, CEO of EE owner BT's consumer business, said that the trial will be "a big step forward in making the benefits of 5G a reality for our customers".

31/5/18 - EE has responded to O2's claims, below, calling them "misleading". Head of network communications Howard Jones told us that the telco's 5G network will feature much faster speeds and lower latency than existing 4G, but will need further development to reach the speeds and 0-5ms latency necessary to enable applications like driverless cars.

29/5/18 - An O2 UK spokesperson has warned that any version of 5G launched prior to 2020 would lack essential capabilities. This is because it would be based on an earlier version of the standard (Release 15, due in June), with the final version (Release 16) not scheduled until December 2019.

"Any UK operator launching ‘5G' before 2020 would be using a ‘lite version' of 5G," the spokesperson told 5G.co.uk. Release 15, for example, does not include super-low latency or the capabilities to support autonomous driving.

24/5/18 - Qualcomm has announced what (it says is) the industry's first 5G New Radio solution for small cells, which are expected to be important to 5G networks. OEMs can use the FSM100xx solution to reuse their software and hardware designs across new sub-6GHz and mmWave products. Qualcomm will begin sampling next year.

23/5/18 - ComSenTer, a US-based multi-university research project, is looking at a successor for 5G, despite the industry insisting that a sixth-generation standard will not be needed.

The group is examining very high frequency transmissions: up to 330GHz, much higher than the 3.4-3.8GHz used for 5G. That means more speed, but even more pronounced problems with range, signal blocking and packet loss. Speeds, though, could reach up to 10Tbps.

21/5/18 - A consortium of 20 companies has completed 5G Crosshaul, a three-year EU Horizon 2020 project to develop an integrated 5G backhaul and fronthaul transport network.

The network is able to flexibly interconnect distributed 5G radio access and core network functions, hosted on in-network cloud nodes. It has been trialled in experiments across Europe and Taiwan.

17/5/18 - A trial by BT, Verizon, Ericsson and King's College London has shown how 5G technology can be used in emergencies. In the first test, a drone was sent instructions over a 5G connection, using Ericsson's network slicing technology across BT's and Verizon's networks; it then proceeded to the supply drop location, guided by a software control system and ground-based markers.

In a second demonstration, the drone was shown providing real-time video surveillance to a remote location, enabling people at that location to make more informed decisions. The drone used the same 5G radio connection as in the first demo, but connected to Verizon's US networks to receive mission details. This demonstrates the possibility of temporarily deploying networks abroad in emergency situations.

15/5/18 - BT CEO Gavin Patterson plans to get the drop on rivals by launching 5G in 2019, a year ahead of the government's commercial rollout plans. He told analysts in an earnings presentation, "We will look to have a commercial 5G product launched in the next 18 months."

BT's owner EE had an early start in 4G, but UK competitors like Vodafone are unlikely to want a repeat of that situation in 5G; so we may see a revision to their plans, too.

14/5/18 - UK cities are competing to win a £100 million infrastructure investment from the Department for Digital, Culture, Media and Sport as part of its 5G Testbed and Trials Programme.

Recently discussed at the 5G Summit in Glasgow (see below), the fund will be used to install infrastructure to support 5G trials. Applications close on the 5th of June, with a winner chosen by the end of July.

14/5/18 - The University of Surrey's 5G Innovation Centre has announced its first seven SME technology partners, chosen through their involvement in its testbed programme. The University says that the move aims to widen the 5G opportunities available to smaller enterprises.

The companies are Blu Wireless, CBNL, Cardinality, Eptomical, Estatom, Lime Microsystems and Paramus.

9/5/18 - As part of a consortium investigating autonomous vehicles in Leeds, the city will begin testing self-driving pods in the South Bank area later this year. Consortium member Aql, which holds 5G test licences in the UK, will build the required communications infrastructure. Other members include Gulf Oil, the University of Leeds and Leeds and Bradford city councils.

The pods will be based on the Westfield POD, which Heathrow Airport has used since 2011.

8/5/18 - While England was basking in sunny Bank Holiday weather, 5G progress continued on the continent. Deutsche Telekom has turned on 5G mobile data in Berlin, deploying six antennae in the city centre using the New Radio (NR) standard. A further 70 cells are planned for the summer.

In France, Nokia has been testing 5G calls on the 3.5GHz frequency band with telco SFR. The trial was also accomplished using the NR standard, with Nokia technology like the AirScale radio platform.

3/5/18 - The Institute of Electrical and Electronic Engineers (IEEE) has announced that it will hold its 5G Summit in Glasgow on the 14th of May, where delegates will discuss the potential and challenges of the technology. Speakers will focus on the latest trials, standards and deployments, as well as how 5G can be used in verticals like farming and health.

Cisco, Nokia, Vodafone, Qualcomm, University of Strathclyde, Surrey 5GIC (5G Innovation Centre), the DCMS and Ofcom are among the attending speakers.

Equifax report uncovers unencrypted usernames and passwords and security equipment that wasn't working

By Graeme Burton | News | 17 September 2018

Official report into Equifax breach reveals dysfunctional IT department that didn't even know how much data had been stolen

A US government report into the Equifax breach in 2017 (PDF), in which the sensitive personal details of more than 145 million people were stolen - including millions of UK accounts - has uncovered evidence of a dysfunctional IT department that failed to take security seriously.

The report, by the Government Accountability Office (GAO), reveals that the attackers initially breached Equifax's online dispute portal, and were able to maintain access to the organisation's databases for almost three months thanks to poor security practices, such as failing to identify software that needed patching and failing to ensure that critical security equipment was functioning properly.

The initial breach was due to the company failing to patch the Apache Struts 2 framework running on the online dispute portal. Evidence of a breach was only noticed by IT staff after a long-expired certificate on security equipment, which was supposed to be monitoring outbound encrypted traffic, was updated. The lack of a valid certificate meant that the device had effectively not been working for ten months.
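A lapsed certificate of this kind is trivial to detect in advance. As an illustration only (not how Equifax's monitoring equipment actually worked), here is a standard-library Python check that could feed an expiry alert:

```python
"""Illustrative certificate-expiry check: connect to a host, read the
certificate's notAfter field and report days remaining. Standard library
only; the host below is a placeholder."""
import socket
import ssl
from datetime import datetime, timezone

def days_until_expiry(host: str, port: int = 443) -> float:
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    # notAfter looks like 'Jun  1 12:00:00 2025 GMT'
    expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    delta = expires.replace(tzinfo=timezone.utc) - datetime.now(timezone.utc)
    return delta.total_seconds() / 86400

if __name__ == "__main__":
    # In practice this would run on a schedule and alert well in advance
    print(f"certificate expires in {days_until_expiry('example.com'):.0f} days")
```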

"On March 10, 2017, unidentified individuals scanned the company's systems to determine if the systems were susceptible to a specific vulnerability that the United States Computer Emergency Readiness Team (US-CERT) had publicly identified just two days earlier. The vulnerability involved the Apache Struts Web Framework and would allow an attacker to execute commands on affected systems," the report explains.

It continues: "Equifax officials stated that, as a result of this scanning, the unidentified individuals discovered a server housing Equifax's online dispute portal that was running a version of the software that contained the vulnerability.

"Using software they obtained from an unknown source and that was designed to exploit the vulnerability, the unidentified individuals subsequently gained unauthorised access to the Equifax portal and confirmed that they could run commands. No data was taken at this time…

"Beginning on 13 May 2017, in a separate incident following the initial unauthorised access, attackers gained access to the online dispute portal and used a number of techniques to disguise their activity."

The attackers used encryption to disguise their activity and surreptitiously exfiltrate the records; had the outbound encrypted traffic been inspected as intended, this activity should have been picked up.

"Equifax officials added that, after gaining the ability to issue system-level commands on the online dispute portal that was originally compromised, the attackers issued queries to other databases to search for sensitive data.

"This search led to a data repository containing PII [personally identifiable information], as well as unencrypted usernames and passwords that could provide the attackers access to several other Equifax databases."

As a result of Equifax storing usernames and passwords unencrypted on the network, the attackers were easily able to extend their attack beyond the three databases supporting the dispute resolution portal to 48 other, unrelated databases.

"After reviewing system log files that recorded the attackers' actions, Equifax officials determined that the attackers then ran a series of queries in an effort to try to extract PII from the databases they had located. Altogether, the attackers ran approximately 9,000 queries, a portion of which successfully returned data containing PII…

"After successfully extracting PII from Equifax databases, the attackers removed the data in small increments, using standard encrypted web protocols to disguise the exchanges as normal network traffic. The attack lasted for about 76 days before it was discovered."

The attack was only uncovered by a network administrator "conducting routine checks of the operating status and configuration of IT systems [who] discovered that a misconfigured piece of equipment allowed attackers to communicate with compromised servers and steal data without detection".

Specifically, as a result of that misconfiguration - a security certificate that had expired ten months earlier - encrypted traffic was not being inspected as it should have been, and the malicious traffic was therefore not detected.

"After the misconfiguration was corrected by updating the expired digital certificate and the inspection of network traffic had restarted, the administrator recognised signs of an intrusion, such as system commands being executed in ways that were not part of normal operations."

It was only then that the company belatedly started to address the attack, but didn't know how much data had been taken, nor how the breach had occurred. However, log files hadn't been altered by the attackers and IT forensics experts were able to piece together the attack, step-by-step. On 2 August 2017, Equifax finally got round to notifying the FBI of the breach.

According to the report, the Apache Struts vulnerability was not properly identified as running on the online dispute portal when patches for the vulnerability were installed throughout the rest of the company. Furthermore, databases were not properly segmented, enabling the attackers to access multiple databases during the attack.

And data governance was also woeful - especially the storage of unencrypted credentials for internal databases.

"Equifax officials noted one other factor that also facilitated the breach. Specifically, the lack of restrictions on the frequency of database queries allowed the attackers to execute approximately 9,000 such queries - many more than would be needed for normal operations."

The fallout from the attack has cost Equifax an estimated $439 million, although it doesn't appear to have significantly damaged the company's business. Earlier this year, the company was accused of not disclosing the full extent of the attack.

Impact of automation on jobs has been overstated, claims Bank of England's Mark Carney

By Tom Allen | News | 17 September 2018
Carney expects automation to replace about 10 per cent of jobs in the UK

The head of the Bank of England rejected earlier research on automation, some done by the BoE's own economists

Mark Carney, governor of the Bank of England, has said that the threat of job losses from automation is being blown out of proportion, because some jobs are simply not suited to being replaced by machines.

Economists believe that the impact of automation on jobs - part of the trend commonly known as Industry 4.0 - will be faster and more disruptive than in previous industrial revolutions, due to the pace of change in modern technology.

"Every technological revolution mercilessly destroys jobs and livelihoods - and therefore identities - well before the new ones emerge," Carney said at an event in Dublin on Friday, according to the FT.

However, the prevailing feeling among technologists is that new jobs will emerge to replace those that are lost, and jobs that remain will be more productive.

Carney said the worst estimates of job losses are too extreme - even highlighting research that the BoE's own in-house economists have used.

Oxford professors Carl Frey and Michael Osborne famously predicted, in their 2013 paper 'The Future of Employment', that almost half of jobs in the USA could be replaced by automation. Two years later, BoE chief economist Andy Haldane said that as many as 15 million jobs could be at risk in the UK, and 80 million in the USA.

Haldane reiterated his warning last month, when he said that AI and automation could disrupt the jobs market "on a much greater scale" than anything that has come before.

However, Carney cited research predicting a much more modest impact of around 5 million jobs: about 10 per cent of the UK workforce. Rather than replacing jobs, he expects automation to change them, instead.

In addition, the research states that automation does not always make economic sense: a human bartender could be cheaper and more versatile than a machine, as well as able to build a rapport with customers.

There will be some impact on the economy, Carney admitted. Even with a fairly smooth transition, the total number of hours worked could fall by two per cent over a 30-year period: equivalent to a two percentage point rise in unemployment.

Central banks cannot prevent these changes, but will adjust monetary policy to suit. The BoE is aware that, while automation could boost productivity, it may also cause disinflation - as job losses would affect demand and workers would have lower bargaining power.

Carney's comments came just before the release of a report by the World Economic Forum, which said that automation will create about 133 million jobs worldwide over the next 10 years, compared to 75 million that will be replaced.

However, the WEF also sounded a warning, saying that white collar jobs - such as those in accounting and data entry - are most at risk. Eight out of 10 businesses surveyed in the UK said that they are likely to begin automating work within the next five years.

Have we arrived at a public cloud duopoly?

By John Leonard | News | 17 September 2018
Public cloud is becoming a two-horse race

The two IaaS/PaaS leaders are pulling away from the rest

There are plenty of examples of industries dominated by two key players: Visa and MasterCard, Boeing and Airbus, Coca-Cola and Pepsi; and in technology there's Intel and AMD, and iOS and Android. Other mobile operating systems are available, including the free and open-source LineageOS, but they enjoy only a tiny share of the market, while attempts to create new viable mobile OSes, such as Firefox OS and Ubuntu Touch, have failed.

Are we seeing the same thing with public cloud IaaS and PaaS? One CIO we spoke to during our recent research programme certainly thinks a duopoly is possible.

"When you look at AWS it's massive, it's very sophisticated and it's got loads and loads of different products. They pride themselves on how many new products they bring to market every year. It's like going into a toy shop where there's always a new toy," the CIO said.

"Microsoft, on the other hand, have a much simpler product offering. They're not bringing so many products to market but they're packaging things more tightly so that you're not having to do so much understanding. It's more standardised."

With these two different approaches, with Microsoft's installed base, and with both moving rapidly to fill the gaps where the other is strong, it's easy to see how enterprise incumbents such as IBM and Oracle, and relative newcomer to the enterprise Google, may struggle to achieve anything more than a distant third in the public cloud space.

"You could see a world where AWS and Azure will take the lion's share of the activity," the CIO said.

Indeed, these two players dominated the results of our survey of IT professionals, in terms of both usage and perception of leadership and innovation. Interestingly, it was Microsoft's cloud that led the way in both areas, despite AWS being the larger by most measures.

We asked machine analytics firm Sumo Logic, a third-party vendor that works across most of the main cloud platforms, for its opinion.

"AWS is clearly the 800-pound gorilla in the market - it was first and has a huge range of services and options today that help a wide variety of technologists from developers to security analysts tackle trends such as serverless, microservices and DevSecOps," said Mukesh Sharma, vice president EMEA.

"Azure is doing well in companies that are looking at taking a multi-cloud approach. So many companies are Microsoft shops, so Azure is a good platform for them to use as part of their enterprise agreements," he went on, adding support for the duopoly theory. "Azure is being used alongside AWS as a way for CIOs to keep control over their cloud strategies."

Sharma said he does not see much of the IBMs and Oracles among his client base, instead placing Google Cloud Platform in third place for public cloud IaaS/PaaS, in part because of the growth of Kubernetes, which was originally a Google project.

"Kubernetes is getting a lot more attention now, and I think part of this is because there are a lot of IT leaders leaning on containers as a way to avoid lock-in. Kubernetes can help in that," he said.

However, GCP was not found to be making much headway among our research respondents for IaaS/PaaS, except in the education sector. "Too late to the party", was one comment.

One to watch in the future may be Alibaba Cloud, which has the potential to beat AWS on price - and possibly on functionality in some key areas - but few of our respondents had chosen to back that particular horse as yet.

 


Linus Torvalds apologises for 'unprofessional' rants over Linux

By Graeme Burton | News | 17 September 2018

Torvalds' blunt but entertaining emails likely to become a thing of the past

Linus Torvalds, the creator of Linux, has apologised for his ‘unprofessional' emails and promised to get help. Torvalds will take a break from leading the development of Linux in the meantime.

The admission was made in a message to the Linux mailing list over the weekend, headed: "Linux 4.19-rc4 released, an apology, and a maintainership note".

In addition to the usual small talk about driver fixes and kernel updates, Torvalds also wrote at length about "maintainership and the kernel community" in which he admitted that he had "been ignoring some fairly deep-seated feelings in the community".

He continued: "I am not an emotionally empathetic kind of person and that probably doesn't come as a big surprise to anybody. Least of all me. The fact that I then misread people and don't realise (for years) how badly I've judged a situation and contributed to an unprofessional environment is not good."

The apology came after people in the Linux community had confronted Torvalds about his attitude, he went on to admit.

"My flippant attacks in emails have been both unprofessional and uncalled for. Especially at times when I made it personal. In my quest for a better patch, this made sense to me. I know now this was not okay and I am truly sorry.

"The above is basically a long-winded way to get to the somewhat painful personal admission that, hey, I need to change some of my behaviour, and I want to apologise to the people that my personal behaviour hurt and possibly drove away from kernel development entirely."

In response, he added, he will be taking some time off and will "get some assistance on how to understand people's emotions and respond appropriately". Greg Kroah-Hartman will finish up the 4.19 release in Torvalds' absence.

However, Torvalds asserted that he wasn't planning on walking away from Linux.

"This is not some kind of ‘I'm burnt out, I need to just go away' break. I'm not feeling like I don't want to continue maintaining Linux. Quite the reverse. I very much do want to continue to do this project that I've been working on for almost three decades.

"This is more like the time I got out of kernel development for a while because I needed to write a little tool called ‘git'. I need to take a break to get help on how to behave differently and fix some issues in my tooling and workflow.

"And, yes, some of it might be ‘just' tooling. Maybe I can get an email filter in place so at when I send email with curse-words, they just won't go out. Because, hey, I'm a big believer in tools, and at least some problems going forward might be improved with simple automation.

"I know when I really look ‘myself in the mirror' it will be clear it's not the only change that has to happen, but hey... You can send me suggestions in email."

This came to a head after this year's Linux Maintainers' Summit was moved from one continent to another entirely because Torvalds had accidentally booked a family holiday in Edinburgh at the same time.

While Torvalds suggested the event go ahead without him in Vancouver, Canada, the Program Committee decided to move it to Edinburgh instead. The Maintainers' Summit is an invitation-only workshop where Linux process and development issues - nothing technical - are discussed among around 30 leading Linux luminaries.

Computing ran a three-part interview with Torvalds in 2012, shortly after he picked up the Millennium Technology Prize.

The hybrid cloud is both freedom and challenge, says Scalr

By Tom Allen | Sponsored | 17 September 2018
The hybrid cloud combines private and public clouds - with both benefits and challenges

Everyone will eventually end up in the hybrid cloud

One of the most interesting parts of working in IT is the rapid pace of change. Tools are constantly evolving or being developed to meet new business needs, and what is at the heart of your business one year might be considered legacy just a few years down the road.

It's hard to believe that that might ever happen when it comes to the cloud. It is increasingly the backbone of modern business, used for everything from lowering costs to increasing agility.

Despite its rapid entrance into our work lives, though, modern cloud computing has only existed since the mid-2000s. In its early days, most companies were moving to the cloud for cost reasons - and while costs scaling with consumption is still a big factor, there are others that are just as, if not more, of a draw.

The cloud increases agility and flexibility. It makes the collection and analysis of massive data sets feasible; connects systems, people and customers; increases response rates; and lowers time to market. In short, companies in the cloud have a significant edge over their competitors.

Evan Klein, head of product marketing at hybrid cloud management platform Scalr (a sponsor of Computing's Cloud & Infrastructure Live event this week), said that businesses often use the cloud differently depending on their size:

"SMEs, due to the nature of their smaller infrastructure footprint and, in some cases, lack of IT capabilities, have found that cloud services have allowed them to concentrate on their core business. The adoption of SaaS applications, which do not require the up-front investment and ongoing costs for infrastructure, increases their bottom line and improves business agility.

"For larger enterprises, the legacy of building physical data centres and heavy investment in infrastructure means cloud adoption has needed to be phased in. This usually starts with private cloud environments…[but] public cloud adoption has started to increase recently, with enterprises using public cloud for areas such as test and dev and disaster recovery."

Many firms, especially those large enterprises that began with on-premise private clouds, are now moving to hybrid and multi-cloud environments. This isn't only an issue of cost, but of feasibility. Applications on legacy infrastructure are notoriously difficult to migrate to the public cloud, and some simply aren't built to work in that environment: they need to be re-factored before a firm can even begin the migration process.

However, most companies will eventually find themselves operating in the hybrid cloud, if only because different teams adopt technologies at different rates. This can add its own challenges, like decentralised management and network complexity.

One answer is the idea of a ‘hyper cloud': an abstraction layer that sits above multiple cloud providers. Partly mitigating the problem of decentralised management, companies can use a hyper cloud to monitor cloud usage; set controls for the type of infrastructure that is provisioned; set corporate policy, security, and compliance controls; and gain visibility into costs.

However, most current solutions lend themselves poorly to abstraction. "Rather than thinking about whether everything should be abstracted or not, it may be better to break down that thinking into what lends itself well to abstraction and what does not", said Klein. "For example, service authentication does lend itself well, as does the library/SDK layer. That is one of the reasons infrastructure as code has been so popular."
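A minimal sketch of what abstraction at the library/SDK layer can look like - stubbed provider classes standing in for real AWS or Azure SDK calls, purely as an illustration:

```python
"""Illustrative abstraction layer: application code provisions through one
interface while provider-specific classes do the work. The providers here
are stubs, not real cloud SDK calls."""
from abc import ABC, abstractmethod

class CloudProvider(ABC):
    @abstractmethod
    def provision_vm(self, name: str, size: str) -> str:
        """Create a virtual machine and return its identifier."""

class AwsProvider(CloudProvider):
    def provision_vm(self, name: str, size: str) -> str:
        # Real code would call the AWS SDK here; the stub records intent
        return f"aws:{name}:{size}"

class AzureProvider(CloudProvider):
    def provision_vm(self, name: str, size: str) -> str:
        return f"azure:{name}:{size}"

def deploy(provider: CloudProvider) -> str:
    # Written once, against the abstraction - the provider is swappable
    return provider.provision_vm("web-01", "small")

print(deploy(AwsProvider()))
print(deploy(AzureProvider()))
```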

Security is often perceived as a challenge of using multiple clouds, but Klein sees the association of multi-cloud environments with decreased security as a fallacy: risks are more often the result of bad processes, which in turn stem from insufficient visibility.

The GDPR, which makes data sovereignty such an important part of any data strategy, is also increasing the importance of visibility. The regulation is a particular issue for large, enterprise-scale firms with multiple data centres and a global presence. Control of data is crucial in such an environment.

"If organisations require data sovereignty it is important that the provider used has the resources to offer local retention of data. If there is no control over the users' access to the cloud service providers they choose, there is a danger that data can be stored outside of the designated region.

"This can be mitigated by using a private cloud infrastructure, but for those using public cloud infrastructure it can also be mitigated by controlling access to the data centres that are used by public cloud providers.

"The number one problem companies face here is lack of visibility into what developers are doing. Without that, they cannot have control, much less compliance. Effective management of cloud environments can dramatically reduce the risk of data being mismanaged and enterprises falling out of compliance."