
Unpacking data: GDPR requests as a way to increase algorithmic transparency

Authors:

Aarni Tuomi

lecturer, hospitality business
Haaga-Helia ammattikorkeakoulu

 

Visiting Research Fellow
University of Surrey

Mário Passos Ascenção

principal lecturer, service business development and design
Haaga-Helia ammattikorkeakoulu

Published: 25.04.2023

For several years now, analysts have declared that “data is the new oil”. Just as oil fuelled the global economy in previous decades, data powers modern businesses and enables them to

  • make better decisions,
  • understand and serve their customers better,
  • optimise their operations and resource usage, and
  • develop innovative new products and services.

Data can be collected from a wide range of sources, including customer or employee interactions, social media, website traffic, and IoT devices.

Data is also a key ingredient for providing AI-driven digital service experiences the modern consumer has grown to expect.

  • Personalised content across multiple channels
  • Search and ranking algorithms to curate the most relevant content
  • Real-time order tracking
  • Anomaly detection to highlight sudden changes in behaviour, e.g. due to a stolen account
  • Ratings and learning analytics

In the context of algorithmic management, AI-processed data plays a central role in controlling how work tasks (e.g. deliveries) are conducted on digital labour platforms such as Uber, Wolt or Upwork.

Leveraging GDPR to access data

The EU’s General Data Protection Regulation (GDPR) gives users greater rights over their data. GDPR is designed to protect the privacy rights of individuals and to ensure that organisations that collect and process personal data are accountable and transparent in their practices. Notably, GDPR grants individuals various rights, including the right to access, rectify, erase, and restrict the processing of their personal data.

As part of Haaga-Helia’s AlgoAmmatti project, we requested different types of user data from one of Finland’s leading delivery platforms to analyse the kind of digital footprint that accumulates from user interactions on the company’s platform. The experiment is a continuation of a previous article, in which we examined the transparency of digital platform companies’ algorithmic management practices.

Analysis of the GDPR data

Our analysis of the GDPR data showed exactly when and how many orders or delivery tasks the user had completed, as well as specific details about each order or task, including for example the following (see the sketch after this list):

  • search terms used
  • ads shown
  • specific items put into or removed from the shopping basket
  • items ordered
  • device and operating system used for accessing the platform
  • all conversation history with customer support, along with the resolution of each issue
  • ratings given for retail partners or for deliveries
  • keywords or tags the platform uses to give users recommendations
  • estimated minimum and maximum delivery times per venue
  • venue ratings and venue pricing
  • delivery task pick-up and drop-off times
  • delivery method
  • calls made to customers and their duration
  • login and logout times
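
To give a concrete feel for how such an export can be explored, below is a minimal sketch in Python. It assumes a hypothetical single JSON file called gdpr_export.json with top-level categories such as “orders”; the file name, field names and structure are illustrative assumptions, not the platform’s actual export format.

```python
import json
from collections import Counter

# Minimal, hypothetical sketch: load a GDPR data export (assumed to be a
# single JSON file) and summarise which data categories it contains.
with open("gdpr_export.json", encoding="utf-8") as f:
    export = json.load(f)

# Print each top-level category and how many records it holds.
for category, records in export.items():
    size = len(records) if isinstance(records, (list, dict)) else 1
    print(f"{category}: {size} records")

# Assumed structure: "orders" is a list of dicts with fields such as
# "search_terms", "items", "venue_rating" and "venue_pricing".
orders = export.get("orders", [])
field_counts = Counter(field for order in orders for field in order)
print("Most common fields across orders:", field_counts.most_common(10))
```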

Based on these, a user’s digital footprint seems to be quite extensive, covering all aspects of interaction on the platform. How that information is used is less clear to an individual user. It is, for example, unclear how items labelled “venue_rating” or “venue_pricing” are determined and used by the platform’s algorithms, e.g. its search and ranking algorithms. There have been well-publicised examples of platforms prioritising some brands or products (e.g. their own) in search results over independent sellers (Yin & Jeffries 2021).

Giving all types of users a clear justification for what data is collected, why, and how it is used could help build trust and loyalty between users and the platform. Such data openness policies could also be a value-adding differentiation strategy. One of the authors of this post, for example, learned that they had used the delivery company’s platform roughly 100 times over the last two years, almost once a week. This data could easily be shown to users in a concise and easy-to-access way to increase loyalty, to help users try new things, or to help them take better control over their consumption.
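
The usage figure above can be read straight off the order timestamps in the export. Below is a minimal sketch, assuming each order record carries an ISO 8601 timestamp in a field called “created_at” (an illustrative assumption, not the platform’s actual field name).

```python
import json
from datetime import datetime

# Hypothetical sketch: estimate how often the platform was used, based on
# an assumed ISO 8601 "created_at" timestamp on each order record.
with open("gdpr_export.json", encoding="utf-8") as f:
    orders = json.load(f).get("orders", [])

timestamps = sorted(datetime.fromisoformat(o["created_at"]) for o in orders)

if len(timestamps) > 1:
    span_weeks = (timestamps[-1] - timestamps[0]).days / 7
    if span_weeks > 0:
        rate = len(timestamps) / span_weeks
        print(f"{len(timestamps)} orders over {span_weeks:.0f} weeks, "
              f"roughly {rate:.1f} orders per week")
```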

Our analysis of the GDPR data also provided new evidence of the algorithmic control and precarity of delivery couriers’ work on digital labour platforms (Cano et al. 2021; Tuomi et al. 2023). We discovered that several retail partners give couriers explicit instructions regarding order pick-up and drop-off. Many restaurants and retail operators gave instructions regarding parking or delivery, e.g. “you can park behind the restaurant”, “do not enter through the main entrance to the store”, “keep the pizzas horizontal please” or “the cold items should be delivered outside the heat bag since they include drink”.

However, several also gave instructions that had more supervisory connotations, despite couriers being self-employed entrepreneurs: “do not enter without heatbag”, “please do not accept tasks from this venue if you can’t wear a mask inside the premises”, or “please show staff the order number if/when you are asked to do so!”. Perhaps the most striking finding was a courier instruction note given by one vendor: “Please note that the usage of [the vendor’s] restrooms is not allowed”.
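
One way to examine such instruction notes at scale is simple keyword tagging. The sketch below is our own illustrative example; the two categories and the keyword lists are assumptions made for this post, not the project’s actual coding scheme.

```python
# Hypothetical keyword-based tagging of courier instruction notes into
# "logistical" vs. "supervisory" categories; keywords are illustrative only.
LOGISTICAL = ("park", "entrance", "horizontal", "heat bag", "cold items")
SUPERVISORY = ("do not enter without", "do not accept tasks",
               "not allowed", "show staff")

def tag_note(note: str) -> str:
    text = note.lower()
    if any(keyword in text for keyword in SUPERVISORY):
        return "supervisory"
    if any(keyword in text for keyword in LOGISTICAL):
        return "logistical"
    return "other"

# Example notes paraphrased from the export discussed above.
notes = [
    "You can park behind the restaurant",
    "Please note that the usage of restrooms is not allowed",
]
for note in notes:
    print(tag_note(note), "-", note)
```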

Implications and next steps

As data grows in importance for businesses, it should also become increasingly important for users of digital services to know how and why their data is collected and how it is used, particularly if an automated decision-making system is involved. We strongly encourage users interested in data privacy to request their own data through the GDPR mechanism. Most online platforms and other service providers offer a relatively straightforward process for doing so. Requesting your data through GDPR is surprisingly easy and may yield interesting insights, both from an individual user’s perspective and about the transparency of the business itself.

However, we also note an inherent problem with user-led data privacy policies like GDPR: it is up to the individual user to go through the effort of requesting their data. In the case of digital labour platforms, some users (e.g. freelancers, retail partners) might also fear retaliation for being too “nosy”, e.g. through account suspensions or bans. Bucher et al. (2021) dub this kind of self-censorship “anticipatory compliance”.

It should also be noted that the relative proportion of users who request their data from companies, or who even read through consent policies or other privacy documents before clicking “I agree”, is minuscule. The Finnish cyber security company F-Secure famously tested how much attention users pay to consent-related corporate text. The company set up an experiment in which it offered free wi-fi in a busy public space in London (Feltman 2014). All users had to do to connect was read through and accept the service provider’s terms and conditions. Little did they know that the legal text included a clause stating that, in return for using the free wi-fi, the user grants the service provider permanent ownership of their firstborn child. It turned out that surprisingly many were ready to give up their kid for free wi-fi.

Experiments like this demonstrate the issues with current, user-led consent systems. For managing something as complex as AI, novel forms of consenting to data collection and processing are needed. In the AlgoAmmatti project, we propose involving platform users more actively in the governance of platforms, drawing inspiration from citizens’ assemblies, citizens’ juries, and other similar deliberative mechanisms.

Haaga-Helia’s AlgoAmmatti (Algorithmic Management and Professional Growth in the Platform Economy) project seeks to understand algorithmic management practices and their impact on workers’ day-to-day experience in the context of digital labour platforms, e.g. Yango, Wolt, or Skillshare. The aim of the service design project is to develop a worker-centric model for conceptualising algorithmic management in the context of professional growth. We seek to create new value for service companies by shedding light on the broader impacts of algorithmic management on digital labour platforms and thus help companies proactively develop their services. From a worker perspective, the goal is to facilitate and manage service work in a more human-centric and socially sustainable manner, focusing on creating balanced and fulfilling careers.

The project is funded by the Finnish Work Environment Fund for 03/2022–12/2023 and conducted by Haaga-Helia’s Service Experience Laboratory LAB8.

References

Bucher, E. L., Schou, P. K., & Waldkirch, M. 2021. Pacifying the algorithm – Anticipatory compliance in the face of algorithmic management in the gig economy. Organization, 28(1), 44-67.

Cano, M. R., Espelt, R., & Morell, M. 2021. Flexibility and freedom for whom? Precarity, freedom and flexibility in on-demand food delivery. Work Organisation, Labour & Globalisation, 15(1), 46-68.

Tuomi, A., Jianu, B., Roelofsen, M., & Ascenção, M. P. 2023. Riding against the algorithm: Algorithmic management in on-demand food delivery. In: B. Ferrer-Rosell, D. Massimo, & K. Berezina (Eds.), Information and Communication Technologies in Tourism 2023. Proceedings of the ENTER 2023 eTourism Conference, January 18–20, 2023 (pp. 28-39). Cham: Springer.

Picture: www.shutterstock.com