Following Transport for London’s decision against Uber, Cliff Saran looks at the role of professionalism and ethics in software development.

Uber is the latest company to be caught using software to evade official audits and tests. Among the reasons Transport for London (TfL) gave in September 2017 for not renewing Uber’s licence to operate in London was the software that the app-based taxi firm allegedly developed to avoid officials inspecting its drivers.

While newspaper commentary has largely focused on the Licensed Taxi Drivers’ Association, which represents London’s black cab drivers, lobbying TfL against Uber, an important part of TfL’s decision was Uber’s alleged use of stealth software.

This is not the first time a company has been found to have written software explicitly to get around official tests and audits.

In September 2015, Volkswagen was found to have modified its engine management software to detect when its diesel cars were being run on an official emissions test so that it could dial down the emissions for the duration of the test.

The carmaker effectively wrote software specifically to cheat, according to the New York Times, which reported: “Volkswagen admitted 11 million of its vehicles were equipped with software that was used to cheat on emissions tests.”

The newspaper reported that a May 2014 on-road test conducted by West Virginia University had found some cars emitted almost 40 times the permitted levels of nitrogen oxides. This led to the California Air Resources Board’s investigation of Volkswagen.
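To make the pattern concrete, the logic of such a “defeat device”, as described in press reports, can be sketched in a few lines of Python. This is purely illustrative: the sensor names, signals and thresholds below are invented, and bear no relation to Volkswagen’s actual engine management code. The idea is simply that the software infers from sensor readings that the car is on a test rig, and only then enables full emission controls.

    # Hypothetical illustration of a "defeat device" pattern.
    # All names, signals and thresholds are invented for this sketch;
    # none come from Volkswagen's actual engine management software.
    from dataclasses import dataclass

    @dataclass
    class SensorReadings:
        wheel_speed_kmh: float     # speed measured at the driven wheels
        steering_angle_deg: float  # how far the steering wheel is turned
        gps_speed_kmh: float       # speed of the car over the ground

    def looks_like_dyno_test(s: SensorReadings) -> bool:
        """Guess that the car is on a test rig: the driven wheels are
        turning, but the car is not actually moving or being steered."""
        return (s.wheel_speed_kmh > 20
                and s.gps_speed_kmh < 1
                and abs(s.steering_angle_deg) < 2)

    def select_emissions_mode(s: SensorReadings) -> str:
        # The unethical step: full emission controls only during the test,
        # a more polluting "road mode" the rest of the time.
        return "full_nox_controls" if looks_like_dyno_test(s) else "road_mode"

What the sketch shows is how little code the deception requires: the ethical failure lies in a single, deliberate branch.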

Among TfL’s concerns in deciding not to renew Uber’s licence to operate in London was the firm’s use of so-called Greyball software, which geofences government and official buildings.

The software reportedly presents an alternative version of the app to customers wishing to book a ride from outside those buildings, and is used to prevent officials from booking an Uber ride.
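The geofencing at the heart of this is straightforward to picture. The Python sketch below checks whether a booking request originates within a circular zone around a flagged building and, if so, serves a decoy view of the app. The zone data, function names and thresholds are invented for illustration; reports suggest the real system combined location with other signals to identify officials.

    # Hypothetical sketch of the geofencing technique attributed to Greyball.
    # The zone data and all names are invented for illustration only.
    import math

    # Geofenced zones: (latitude, longitude, radius in metres)
    RESTRICTED_ZONES = [
        (51.4934, -0.1446, 250),  # e.g. an official building in London
    ]

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points, in metres."""
        r = 6_371_000  # mean Earth radius
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = (math.sin(dp / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
        return 2 * r * math.asin(math.sqrt(a))

    def in_restricted_zone(lat, lon):
        return any(haversine_m(lat, lon, zlat, zlon) <= radius
                   for zlat, zlon, radius in RESTRICTED_ZONES)

    def app_view_for(lat, lon):
        # A rider flagged inside a zone sees a decoy view with no real cars.
        return "decoy_view" if in_restricted_zone(lat, lon) else "normal_view"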

Other cities have also been concerned about the use of Greyball software. In a blog post, Gerald Gouriet and Charles Holland of barristers’ chambers Francis Taylor Building described Uber’s Greyball program as a way of identifying regulatory staff who use the customer app, and thereby avoiding regulatory activity. They highlighted the case of New York.

“Uber initially robustly defended the program, but after six days, announced it would be withdrawn,” the pair wrote. The US city of Portland recently published an audit looking into the use of Greyball software at Uber, in which the transport company admitted using such software.

“In a letter dated 21 April 2017, Uber’s counsel provided their second response. In this response, the company admits to having used the Greyball software in Portland for a two-week period, from 5 December to 19 December 2014 against 17 individual rider accounts,” the audit report said.

The records provided by Uber show three of those individual riders actively requested and were denied rides on the Uber platform, the court filing stated. The company said it would never engage in a similar effort to evade regulators in the future. But as Computer Weekly’s sister title, TheServerSide, notes, the company’s record of unethical practices in software development appears to reveal a culture of contempt among managers.

On her blog about sexual harassment at Uber, Susan Fowler wrote about a “toxic culture” at the company, in which managers refused to cooperate with one another. “I remember a very disturbing team meeting in which one of the directors boasted to our team that he had withheld business-critical information from one of the executives so he could curry favour with another,” she wrote.

There is also the case of Uber’s God View tool, which infringed users’ privacy by collecting data about their location even when the Uber app was not being used.

Overcharging clients

Beyond Uber and Volkswagen, examples of unethical coding include overcharging clients, producing poor-quality code and stealing intellectual property.

In a post on the open source repository GitHub, one developer has been trying to raise the profile of coding ethics. The developer described how, on one occasion, an employer asked them to change the value of refund vouchers on an e-commerce site to make the refunds worth less.

The coder wrote: “I think we need to establish a code of ethics for programmers. Doctors, social workers, and even lawyers have a code of ethics, with tangible consequences for skimping on them. Why not programmers as well?

“I want to live in a world where a programmer who hasn’t agreed to follow our code of ethics has a hard time getting employed. It is simply not acceptable to write code that is harmful to users. What the hell is wrong with these people?”

The Association for Computing Machinery’s ethics statement says: “Software engineers shall approve software only if they have a well-founded belief that it is safe, meets specifications, passes appropriate tests, and does not diminish quality of life, diminish privacy or harm the environment. The ultimate effect of the work should be to the public good.”

Ethics in software engineering is also an area that has been looked into by the BCS, The Chartered Institute for IT. The BCS’s code of conduct tells its members: “You shall have due regard for public health, privacy, security and well-being of others and the environment.”

Improve human wellbeing

David Evans, BCS director of policy and community, believes an overriding outcome in the domain of computing should be to benefit society and improve human wellbeing.

For organisations that value customer relationships, ethics is very important, he says: “In the academic world, ethics is top of the checklist.” But working in an ethical manner can be challenging.

“The idea of public benefit or human wellbeing turns ethics into a misplaced concept,” says Evans. “You can lose the reason why you do it. We want professionals who do things that do not cause harm to others, and we also want our IT team to understand the effects of what they do.”

The value of working ethically should, says Evans, be ingrained into the corporate culture, including IT and software development.

Rewrite the rules to win the cyber arms race, says McAfee

Cyber defence cannot be effective unless it becomes more automated and proactive, says Raja Patel, vice-president and general manager of corporate security products at McAfee.

Patel told the Mpower Cybersecurity Summit in Las Vegas: “We are in the same evolutionary race against cyber predators, who are constantly evolving to resist our defences. We can’t change our pace – we have to change the race.”

Source: Computer Weekly, 7-13 November 2017