Data Digest #6: Swimming Pools and Robot Therapists

April 14, 2023

The cogs of the data world are perpetually turning. Data never sleeps. Brace yourself for an exciting overview of some of the top data news stories that have been gracing our screens over the past month.

CNN: Italy bans ChatGPT

Unless you’ve been living under a rock, you’ve likely heard of ChatGPT. The revolutionary AI tool has been making major waves in the tech scene since its release in November last year: Ask it to write a Shakespearean sonnet about EastEnders or a Kendrick Lamar-style rap about peanut butter, and prepare to be blown away. However, it’s not all fun and games. ChatGPT may appear to be an exciting new development with limitless capabilities, but it also raises serious questions around privacy and data laws.

We’ve never seen anything quite like ChatGPT before, so it floats in a legal grey area: There is currently no legal foundation to underpin it. Many countries have approached the new technology with caution, and Italy’s data protection authority recently banned it pending an investigation into OpenAI, the company behind ChatGPT. There are fears that ChatGPT breaches privacy laws, and that users don’t fully understand the implications of handing their data over to the tool.

Italy has issued an ultimatum: OpenAI must explain exactly how ChatGPT will comply with Italian data laws, or pay a hefty fine of €20 million. This could set the tone for how other countries choose to approach ChatGPT.

CNN: Italy bans ChatGPT over privacy concerns

Guardian: Computer-generated models to boost ‘inclusivity’

The modelling industry has a real problem with diversity, inclusivity and representation. This isn’t news to anyone. You’d think the solution to this dilemma was glaringly obvious: Create more opportunities for models from diverse backgrounds. However, some major fashion brands have found a rather unconventional way to promote inclusivity on their ecommerce platforms: By creating custom, computer-generated models to showcase their products.

Brands such as Levi’s and Calvin Klein are incorporating virtual models on their websites. This isn’t a particularly new development, however: PrettyLittleThing sparked an explosion of controversy in February last year, when they hard launched their latest addition to the PLT team on social media, a virtual avatar, inviting followers to come up with a name for her. The comments section was flooded with outrage from less-than-enthused customers. “Please can you just pay people to be models instead of replacing them with virtual ones to lower your costs and increase your margins”, said one user, while another branded the development “horrifying”. Admittedly, there is something quite Black Mirror-esque about computer-generated models.

A spokesperson for Levi’s defended their introduction of computer-generated models, claiming that they were intended to serve as a “supplement” to reflect different “body type[s], age[s], size[s], [and] race[s]”.

How about, I don’t know, just hiring more diverse and realistic models? They’re not exactly in short supply: Many real people would jump at the opportunity to represent major fashion brands. The issue of inclusivity will continue to linger and stagnate until it is addressed head-on, rather than skirted around. This doesn’t feel like innovation: It feels like a step backwards.

The Guardian: Computer-generated inclusivity: fashion turns to ‘diverse’ AI models

BBC: Data centre powers swimming pool

Picture this: On a crisp spring morning, you go for a refreshing swim in the local public pool. You submerge yourself in the welcoming, warm water. Believe it or not, this pool is being heated by the excess heat from a data centre the size of a washing machine.

This is not a drill: A public swimming pool in Exmouth is actually being heated in this way. As we all know, the UK is currently caught in the throes of a cost-of-living crisis, so this development couldn’t have come at a better time: We’ve already seen many gyms shutting down or limiting access to their swimming pool facilities due to the cost of heating them. This efficient, money-saving innovation is saving Exmouth Leisure Centre thousands of pounds a year, keeping the pool at a steady thirty degrees Celsius around sixty percent of the time.

And it’s a win-win: Data centres generate large amounts of excess heat, which normally has to be removed with costly cooling, so it’s welcome news that this heat has found a positive use. Anyone for a swim?

BBC: Tiny data centre used to heat public swimming pool

BBC: Chatbot therapists

“Alexa, play notifications”. “Hey Google, what’s the time?”. “Siri, what’s the weather like today?”

Chances are, you’ve put these kinds of practical questions to AI assistants before. But have you ever tried to dig deeper, tapping into their ‘human’ side? Their answers can sometimes be unexpectedly empathetic, funny, or insightful.

Well, computer programmer Eugenia Kuyda is a firm believer that AI chatbots can function as worthy companions. Kuyda is the founder of Replika, a chatbot app with one single purpose: To provide users with someone to talk to, someone who is “always on your side”. While this may seem straightforward enough, the ways users choose to engage with the app vary dramatically: Autistic children may use it to rehearse conversations before having them with other people, while married couples may use it as a relationship counsellor. It’s a chameleon that shapeshifts depending on what you want it to be.

According to the World Health Organization, there are almost one billion people with a mental health disorder. And while a medical professional should always be the first port of call for people who feel like they’re struggling, not everyone has straightforward access to this kind of support. This kind of technology is, to an extent, democratising wellbeing, providing people with the opportunity to open up about their feelings to a companion, albeit a virtual one. It’s a form of talking therapy that can, admittedly, help in a limited way, though it is no substitute for real therapy.

However, it’s not without risks. There have been instances of inappropriate behaviour, for example sexually explicit conversations between users and their chatbots. The founder of Replika has underlined that the app is intended as a conversational companion rather than a mental health tool, but the reality is that it opens up a can of worms: You can’t control how others choose to use it. It’s a controversial new development that warrants further research, and regulations and safety standards need to be put in place to prevent the misuse of this kind of technology.

BBC: Would you open up to a chatbot therapist?
