Sustainability-In-Tech : All iPhones To Be Powered By Renewable Energy By 2030 Says Apple

Apple’s Vice President of Environment, Lisa Jackson, has announced that all iPhones will be powered by renewable energy by 2030.

Acquiring Renewable Energy From Wind Farm 

The announcement was made in Australia while celebrating the company’s 40th anniversary there. It is understood that, in line with this announcement, Apple will be acquiring renewable energy from a new Australian wind farm in Queensland, which could supply 80,000 homes with electricity.

Lisa Jackson said of this latest green target, “At Apple, we recognise the urgent need to address the climate crisis, and we’re accelerating our global work to ensure our products have a net-zero climate footprint across their entire lifecycle.” 

Entire Business Carbon Neutral By 2030 

Having already achieved carbon neutrality for its corporate activities (e.g. retail stores, offices, and travel) two years ago, Apple now plans to make its entire business, including its supply chain and customer products, carbon neutral by 2030. This means that iPads, Macs, and iPhones will need to run entirely on renewable energy by that date.

How? 

The reason Apple can say this, even though you plug your iPhone charger or Mac into your normal electric socket at home as usual, is that it has examined usage patterns across its 1.8 billion devices and knows that charging them accounts for 22 per cent of the company’s overall carbon footprint. Therefore, if Apple can offset that percentage with renewable energy from projects like the massive Australian wind farm, it will be able to say that it has reached carbon-neutral status for its users’ devices and is powering all iPhones with renewable energy.
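The offsetting arithmetic above can be sketched very simply. Note that only the 22 per cent device-usage share comes from the article; the total-footprint figure below is a made-up number purely for illustration, not Apple's actual reported footprint.

```python
# Illustrative sketch of the offsetting arithmetic described above.
# Only the 22% device-usage share is from the article; the total
# footprint figure is hypothetical.

def renewable_offset_needed(total_footprint_mt: float, device_usage_share: float) -> float:
    """Emissions (megatonnes CO2e) that renewable procurement must
    offset to neutralise device charging."""
    return total_footprint_mt * device_usage_share

# Hypothetical total footprint of 22.5 Mt CO2e:
print(round(renewable_offset_needed(22.5, 0.22), 2))  # ~4.95 Mt to offset
```

In other words, offsetting the device-usage slice alone, rather than metering each customer's socket, is what lets Apple make the claim.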

Global Facilities Powered By Clean Energy Since 2018 

Apple has long been committed to reducing its carbon footprint. For example, as far back as April 2018, Apple announced that, as part of its commitment to combat climate change and create a healthier environment, its global facilities were powered with 100 per cent clean energy. At the time, this included its retail stores, offices, data centres and co-located facilities in 43 countries, including the UK.

What Does This Mean For Your Organisation? 

Apple is one of many big tech companies looking seriously at reducing the carbon footprint of their entire supply chain and making sure that this is widely communicated to customers. Critics could point to how most of an iPhone’s lifetime carbon emissions occur in the production phase, and to news stories such as a 2020 lawsuit against a recycler that appeared to have instead diverted old phones to China. Also, the company’s drive to sell new devices inevitably has environmental consequences and, as organisations like Greenpeace have said, offsetting projects don’t deliver what’s actually needed, which is “a reduction in the carbon emissions entering the atmosphere.” That said, Apple has been focusing for many years on how it can become a much greener company, and it is good news for all users of its products, and for wider society, that a massive global business is setting itself some quite challenging environmental targets.

Tech News: Apple To Scan Phones For Inappropriate Content

Apple has announced that all iPhone photos will be scanned for any evidence of Child Sexual Abuse Material (CSAM) to protect children and to help stop the spread of CSAM online.

How?

Apple’s new versions of iOS and iPadOS, due to be released later this year, will include a new system designed to detect CSAM using a cryptographic technique called private set intersection. The system performs on-device matching against a database of known CSAM image hashes provided by the National Center for Missing & Exploited Children (NCMEC) and other child safety organisations. The system uses its own unreadable, securely (on-device) stored hashes and safety vouchers to encode any matches that it finds. Apple says that the system’s threshold is set to provide “an extremely high level of accuracy”, which should ensure that there is less than a one in one trillion chance per year of incorrectly flagging a given account.

The system means that an automatic on-device matching process against known CSAM hashes is performed on any photo before it enters iCloud Photos storage.
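The on-device matching and threshold flow described above can be sketched in heavily simplified form. This is not Apple’s actual implementation: the real system uses a perceptual hash (NeuralHash) and private set intersection, whereas the sketch below uses plain SHA-256, a made-up threshold value, and a stand-in hash database, all purely for illustration.

```python
import hashlib

# Simplified sketch of threshold-based on-device matching.
# NOT Apple's real protocol: real system uses NeuralHash + private
# set intersection; SHA-256, the threshold, and the hash set here
# are illustrative stand-ins only.

KNOWN_HASHES = {hashlib.sha256(b"example-known-image").hexdigest()}  # stand-in database
THRESHOLD = 3  # hypothetical: matches required before manual review is possible

def photo_hash(photo_bytes: bytes) -> str:
    """Hash a photo for comparison against the known-hash database."""
    return hashlib.sha256(photo_bytes).hexdigest()

def scan_before_upload(photos: list) -> dict:
    """Match each photo on-device before it enters cloud storage;
    only past the threshold could a human review take place."""
    match_count = sum(photo_hash(p) in KNOWN_HASHES for p in photos)
    return {
        "matches": match_count,
        "review_allowed": match_count >= THRESHOLD,
    }

result = scan_before_upload([b"example-known-image", b"holiday.jpg"])
print(result)  # one match, below the review threshold
```

The key design point the sketch mirrors is that a single match reveals nothing and triggers nothing; only an accumulation of matches past the threshold makes any human review possible.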

Manually Reviewed

Apple says that only when a certain threshold in the safety vouchers is exceeded (i.e. the automated system is sure of a match) can a photo be manually reviewed by Apple.

If There’s A Match

If Apple’s system confirms that there is a match (i.e. the photo contains evidence of CSAM), Apple says that it will disable the user’s account, and send a report to NCMEC.

What If There’s A Mistake?

Apple says that if a user feels that their account has been mistakenly flagged, they can file an appeal to have their account reinstated.

Criticism

The announcement of the new system has drawn criticism on the grounds that allowing a system to scan users’ private photos for prohibited material has general privacy implications and could even pave the way for government or other surveillance.

Apple Says…

Apple says that the system has “significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account. Even in these cases, Apple only learns about images that match known CSAM.”

What Does This Mean For Your Business?

There is no doubt that any innovations that can genuinely help in the fight against child sexual abuse have to be a good thing, and it’s a bold move from Apple to announce the introduction of this system. Apple has gone to great lengths to publicise the fact that the system is very accurate and appears to go as far as it can to protect privacy. Despite Apple’s good intentions, however, there are fears that this kind of system could be misused in future to allow agencies, authorities, and governments a ‘back door’ into surveillance of the wider population, in the same way that governments have long wanted back doors into end-to-end encrypted apps like WhatsApp. WhatsApp itself, for example, has just introduced a ‘View Once’ disappearing-pictures feature that has drawn criticism that it could be misused in a way that enables CSAM to be shared more easily on the app. Another benefit for Apple of using its new system is that it can ensure that its file storage areas don’t contain illegal material and can, therefore, help keep its own house in order legally, professionally, ethically, and morally.