Wearable devices and the law: when the body becomes data
The gradual introduction of wearable devices – such as smartwatches, fitness trackers, and biosensors – has transformed the relationship between technology and the individual to an unprecedented degree: the human body is no longer just a subject of rights but also a constant source of data. Heart rate, sleep patterns, geolocation, and levels of physical activity feed a continuous stream of information that, in legal terms, lies at the very core of privacy.
From the perspective of European law, this phenomenon clearly falls within the scope of the General Data Protection Regulation (GDPR), which recognizes the protection of personal data as a fundamental right and requires that its processing comply with principles such as data minimization, purpose limitation, and security by design. However, wearable devices put these principles to the test, as data collection is massive, persistent, and, in many cases, difficult for the average user to understand.
One of the main legal challenges lies in the nature of the data collected. Wearable devices record not only identifying information but also biometric and health data, which fall within specially protected categories. Their improper processing can lead to serious consequences, ranging from discrimination in the workplace or insurance sector to the creation of highly intrusive behavioral profiles. This is compounded by these systems' capacity for inference: by combining physiological data with contextual data (location, routines, timings, mobility patterns), it is possible to reconstruct a person's lifestyle and habits with remarkable accuracy and, in some cases, even anticipate their future behavior.
In this context, informed consent, the traditional pillar of data protection, reveals its limitations: users rarely grasp the full extent of the processing or its potential future reuses, so consent tends to become a mere formality rather than an effective guarantee of informational self-determination.
To these structural challenges are added practical risks that illustrate the true magnitude of the problem. A paradigmatic example is the recent case, reported by both national and international press, in which negligent use of devices during military training allowed the exact location of the French aircraft carrier Charles de Gaulle to be leaked after users uploaded their data to the sports application Strava. The incident shows how the combination of geolocation and habits recorded by personal devices can compromise not only individual privacy but also vital collective interests, including national security.
This is not an isolated anomaly but the extreme manifestation of a logic inherent in these systems: every piece of captured data is potentially data that can be published, reused, or correlated with other data to generate additional insights about the individual.
From a regulatory perspective, the European Union has begun to respond by expanding security obligations applicable to connected devices, including wearable devices, requiring manufacturers to implement technical measures designed to prevent unauthorized access and data manipulation.
However, the pace of technological innovation often outpaces regulatory capacity, creating grey areas where legal responsibility is diluted among manufacturers, application developers, service providers, platforms, and third parties accessing the data.
In practice, the debate does not end with security but requires a review of the legal bases for processing and their coherence with the logic of these devices. The GDPR requires that specific purposes be defined, but many wearable device ecosystems combine functions related to well-being, personalization, product improvement, and commercial exploitation, increasing the risk of purpose creep and making it difficult for the data subject to anticipate subsequent uses.
This is reinforced by the data minimization principle: even when data can technically be captured, the legal question is whether they are necessary for the service provided and whether a less intrusive alternative exists (for example, processing them locally on the device, limiting temporal granularity, or allowing use without geolocation).
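The less intrusive alternatives mentioned above can be made concrete. The following is a minimal, hypothetical sketch (the field names and bucket size are illustrative assumptions, not any vendor's actual schema) of on-device minimization: fine-grained sensor readings are aggregated into coarse time buckets and geolocation is stripped before anything leaves the device.

```python
from statistics import mean

def minimize_samples(samples, bucket_minutes=60):
    """Aggregate raw wearable samples into coarse time buckets, dropping geolocation.

    Each input sample is a dict such as
      {"minute": 7, "heart_rate": 72, "lat": 48.85, "lon": 2.35}.
    Only the bucket index and the average heart rate survive; coordinates
    and fine-grained timing are discarded before any upload.
    """
    buckets = {}
    for s in samples:
        buckets.setdefault(s["minute"] // bucket_minutes, []).append(s["heart_rate"])
    return [
        {"bucket": b, "avg_heart_rate": round(mean(v), 1)}
        for b, v in sorted(buckets.items())
    ]

# Illustrative raw data (hypothetical values)
raw = [
    {"minute": 0,  "heart_rate": 70, "lat": 48.85, "lon": 2.35},
    {"minute": 30, "heart_rate": 80, "lat": 48.86, "lon": 2.36},
    {"minute": 65, "heart_rate": 90, "lat": 48.87, "lon": 2.37},
]
print(minimize_samples(raw))
# → [{'bucket': 0, 'avg_heart_rate': 75.0}, {'bucket': 1, 'avg_heart_rate': 90.0}]
```

The point of the sketch is legal rather than technical: if the service only needs hourly averages, capturing and transmitting per-minute geolocated readings is hard to justify under the necessity test.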
The obligation of proactive accountability also takes center stage. Systematic and continuous collection of health or biometric data, along with the possibility of profiling, often brings this processing within the scope of data protection impact assessments, especially when large-scale processing, systematic monitoring, and special categories of data are involved.
This approach requires the identification of specific risks (reidentification, unauthorized access, sensitive inferences, secondary use), documentation of mitigation measures, and implementation of safeguards by design, such as encryption, pseudonymization, and strict retention policies. In complex environments, it is also essential to clarify the roles and responsibilities (data controller, joint controllers, processors), as well as the chains of access to data that may include third-party applications, analytics providers, or cloud services.
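Among the safeguards by design listed above, pseudonymization is worth illustrating. This is a minimal sketch under stated assumptions (the key name and record fields are hypothetical): the direct identifier is replaced with a keyed hash, so re-identification is possible only for whoever holds the key, which the controller stores separately from the dataset.

```python
import hashlib
import hmac

# Hypothetical key; in practice held by the controller, apart from the data
SECRET_KEY = b"demo-key-held-separately"

def pseudonymize(record, key=SECRET_KEY):
    """Replace the direct identifier with a keyed HMAC-SHA256 token.

    The result is pseudonymized, not anonymized: with the key, the
    controller can still link records back to the individual, which is
    why the GDPR continues to treat such data as personal data.
    """
    token = hmac.new(key, record["user_id"].encode(), hashlib.sha256).hexdigest()
    out = dict(record)
    out["user_id"] = token[:16]  # shortened token for readability
    return out

rec = {"user_id": "alice@example.com", "sleep_hours": 6.5}
print(pseudonymize(rec))
```

The keyed construction is deliberate: a plain unsalted hash of an email address can often be reversed by dictionary attack, whereas linking the token back to the person requires the separately held key.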
Finally, the challenges are compounded when data crosses borders or is integrated into decisions with significant implications for individuals. The international transfer of information generated by wearable devices, which is common in global infrastructures, requires compliance with GDPR mechanisms and effective safeguards, especially when it comes to sensitive data. And, in contexts such as employment, insurance, or finance, the temptation to use activity, sleep, or stress metrics as indicators of performance, risk, or reliability raises issues of proportionality, transparency, and non-discrimination, as well as limitations associated with automated decision-making and profiling.
In all these cases, the promise of user control is only real if it translates into understandable and functional options, such as prudent default settings, clear management panels, simple revocation, and effective access to rights such as deletion or portability.
Ultimately, wearable devices pose a fundamental question: can the law continue to treat personal data as a static category when it is continuously generated, automatically, and deeply integrated into daily life? The answer requires rethinking not only the mechanisms of consent but also the governance of data and its lifecycle, reinforcing transparency, proactive accountability, and above all, effective control of the individual over their information.
Because, ultimately, the true challenge is not technological, but legal: to ensure that in the age of constant quantification, the individual continues to be more than the simple sum of their data.
Read the full article by Carlota Parra, senior associate at ECIJA Madrid, here.