Computing exams, AI, water and datacentres – Computer Weekly Downtime Upload podcast

In this episode of the Computer Weekly Downtime Upload podcast, Clare McDonald, Brian McKenna and Caroline Donnelly discuss the 2021 A-level and GCSE computing results, what algorithms are and are not good for, and the water consumption habits of datacentres and their environmental impact.

Clare opens the episode with an account of the most recent A-level and GCSE results in England, Wales and Northern Ireland – a bumper crop of exam results which are normally a bit more spaced out.

This year’s results have been algorithm-free, unlike, initially, those of 2020. They have not, however, been free of the usual class bias: private school students did even better than usual.

Clare notes the stressful Covid context for this year’s students. Despite that, grades in A-level computing were higher than before, and the number of students taking the subject rose: 13,829 students in the UK took computing at A-level, up from 12,428 entrants the previous year.

There was a year-on-year increase in the number of girls taking A-level computing but also, worryingly, a drop in the number taking the GCSE. On the other hand, girls are doing better: 25.7% of girls achieved an A* grade, up from 17.8% last year, whereas only 18.9% of boys did, up from 13.1% last year. For the first time, girls also outperformed boys in mathematics at both A-level and GCSE.

Algos – what are they good for?

The episode then moves on to a related discussion about what algorithms and artificial intelligence (AI) are good for, if they are not good for exams.

Brian touches on the A-level and Scottish Highers debacle of 2020, when algorithms got themselves a bad name. It was discussed on the podcast almost exactly a year ago.

The whole fiasco was mentioned in a more recent BCS report, Priorities for the national AI strategy, written by Bill Mitchell, the BCS’s director of policy.

The report says the UK can take an international lead in AI ethics if it cultivates a more diverse workforce, including people from non-STEM backgrounds. It refers back to the Ofqual algorithm that was used to estimate GCSE and A-level grades in 2020. In the BCS author’s words, this led to a “widespread public mistrust in algorithms making high stakes decisions about people”.

The report registers that “public trust in AI and algorithmic systems in general has been seriously eroded by events during the pandemic”, as shown in two national surveys by YouGov commissioned in 2020 by BCS. These found that:

  • Over half (53%) of UK adults have no faith in any organisation to use algorithms when making judgements about them, in issues ranging from education to welfare decisions.
  • 63% of UK adults disagree with the statement “Students graduating with a computer science university degree are qualified to write software that makes life decisions about people”.

The BCS report itself will feed into the government’s AI strategy, which will be published later this year.

On the podcast, Brian says the headline finding for him was the idea that for the public to trust AI systems, we need a broader set of people making them in the first place. Not that it is all bleak for algorithms: where would we be without the recommendation algorithms of the streaming services we have depended on so much over the pandemic?

Brian then poses the question: are there areas of human life that should just be free of algorithms?

Caroline cites one area that should be off-limits to algorithms and data analytics: the career of Queen of Pop Beyoncé, as illustrated in an interview in Harper’s Bazaar. Beyoncé’s refusal of data analytics-based advice regarding the 2008 album I am…Sasha Fierce is an exemplar of keeping “the human feeling and spirit and emotion in my decision-making”, she says.

The team discuss some other areas, such as recruitment, where AI could be inappropriate but could also be beneficial. We could, despite the massive hype, still be in the early days of how AI will transform human lives. That is, if humans survive climate catastrophe.

Water and datacentres: a matter under-discussed in climate change debates

In the third section of the episode, Caroline discusses some work in progress on the water consumption habits of datacentres, and what those could mean for the environment in the context of the growing climate crisis.

She reveals how her attention was originally drawn to this theme by a remark at a trade show in 2016 – that it is not known how much water is used by datacentre operators in their cooling systems.

Datacentres have come under scrutiny for their environmental impact and sustainability efforts along three dimensions, explains Caroline: how much power datacentres use, how much of that power is renewable, and how big the carbon footprint of datacentres is. Progress has been made on those three fronts.

There is a widely used industry metric for measuring the energy efficiency of datacentres: PUE, the power usage effectiveness score, which is the ratio of the total energy a facility consumes to the energy delivered to its IT equipment.

And while datacentre operators are happy enough to disclose their PUEs, they are oddly silent about another measure: their water usage effectiveness. There is a metric, published in 2010, that the datacentre industry could use to report the water efficiency of its facilities – WUE (water usage effectiveness), the litres of water a site uses per kilowatt-hour of IT energy. But it is either not being used, or the scores are not being published. Facebook is a partial exception, but that is of little moment to enterprise IT buyers.

Caroline cites an Uptime Institute survey from 2020 that said half of all datacentre operators do not measure how much water they use. Could this be because operators are using masses of water to deflate their PUE scores?
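
To make that trade-off concrete, here is a minimal sketch in Python of how the two metrics are calculated, using the standard definitions (PUE is total facility energy divided by IT equipment energy; WUE is site water use in litres divided by IT equipment energy in kilowatt-hours). All figures are hypothetical, for illustration only, and are not drawn from any operator mentioned on the podcast.

    # Standard datacentre efficiency metrics, as commonly defined by the industry.
    def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
        """Power usage effectiveness: total facility energy over IT energy (ideal is 1.0)."""
        return total_facility_kwh / it_equipment_kwh

    def wue(site_water_litres: float, it_equipment_kwh: float) -> float:
        """Water usage effectiveness: litres of water used per kWh of IT energy."""
        return site_water_litres / it_equipment_kwh

    # Hypothetical annual figures for a single facility.
    it_kwh = 10_000_000  # energy delivered to IT equipment, in kWh

    # Scenario A: chiller-based mechanical cooling (more electricity, little water).
    print(pue(14_000_000, it_kwh), wue(1_000_000, it_kwh))    # PUE 1.4, WUE 0.1 L/kWh

    # Scenario B: evaporative cooling (less electricity, far more water).
    print(pue(11_500_000, it_kwh), wue(18_000_000, it_kwh))   # PUE 1.15, WUE 1.8 L/kWh

In this sketch, scenario B looks better on the widely reported PUE number while consuming many times more water, which is exactly the blind spot left open when WUE scores go unpublished.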

Water is a precious resource, especially so in regions that are increasingly subject to drought, such as California, or which suffer a general stress on the availability of drinking water. Those include countries such as Spain and Singapore. And that roster of water-stressed areas could get bigger in years to come – datacentres are long-term features of the landscape.

Fresh air cooling could be part of the solution to the over-use of water. Another could be the datacentre-on-a-barge approach being pioneered by Nautilus Data.

It is early days for this topic, says Caroline: “Water is something that datacentre operators are not talking about, but with climate change, and the threat of that becoming even more apparent and real than it previously was, that will have to change. And there needs to be more pressure on operators to be more transparent about how what they do affects water supplies.”

Podcast music courtesy of Joseph McDade

Source is ComputerWeekly.com
