The concept of ‘Femtech’, or technology “which addresses women’s biological needs”, is so interlinked with women’s health apps that the term itself is credited to the Danish entrepreneur Ida Tin, the founder of “Clue”, a period and fertility management app.
As with the penetration of technology into any sector, women’s health is experiencing a massive boom and receiving global attention; it has even been touted as the next big market disruptor by many experts. The sector received close to $800 million in funding in FY 2019-20, and is estimated to grow into a $50 billion market by 2025. In India, the market is still in its nascent stage, yet is estimated to be worth around $310 million.
The boom in femtech products began in late 2013, when two hugely successful period tracking apps found their way into the market: Glow, a period and ovulation management app, and Clue were among the first successful femtech start-ups. Femtech now extends well beyond period and fertility management, finding its way into fertility solutions, women’s sexual wellness, and pregnancy and nursing care.
These advances in personalised, discreet women’s health have also helped shed the stigma usually linked to accessing information on health and wellbeing, especially in the case of women’s reproductive needs.
However, while innovation is always around the corner, there are few conversations about how data-intensive the sector has become, or how this affects consumer rights. Despite their popularity, many femtech products and applications lack basic data privacy standards, and fail to guarantee the safety of the massive troves of health-related data being collected.
Though the initial focus was mainly on period tracking apps, femtech has revealed the potential for women’s personal health data to carry obvious personal and political consequences, such as exposure to surveillance. More often than not, surveillance capitalism poses a greater threat to women, both personally and professionally.
Can my menstrual health data really be used for surveillance?
The primary concern of any user-centric privacy debate these days is: what data is being collected, and what is happening to it?
Given the USP of most apps in this sector, the data collected is highly sensitive and personal in nature. A typical period and ovulation tracking app extracts intimate information about women’s bodies in exchange for the ability to digitally track their menstrual cycles. Users give away sensitive information such as their sexual behaviour, and the duration of, and physical and emotional symptoms related to, their menstrual cycles. This information is coupled with data that identifies the user, such as their name, age, and location, along with email addresses and phone numbers. In most cases, such apps offer users an easy sign-in option by linking their accounts to pre-existing social media or email accounts.
The user-interface of Flo, a period management app.
However, the problem is that though most users understand that free applications collect this information in exchange for a service, they are unaware of the specifics of these business models or the privacy policies that govern them.
More often than not, companies fulfil their legal obligations without truly enhancing the user’s knowledge of how their information is being used, or effectively protecting their right to privacy. This is possible thanks to jargon-loaded yet comprehensive privacy policies that quietly ensure that data is both collected legally and then shared with third parties. These business models often prioritise profit over user privacy.
For example, a 2020 study by the Norwegian Consumer Council examined 10 popular femtech apps, including Clue and Flo. It found that such apps were collectively transferring personal information to at least 135 third-party companies, or ‘data brokers’: firms that collect, aggregate, and combine personal information about you from a variety of sources to create a digital profile of you.
The real-life implications of such practices are broad, as the databases of information collected are huge and include sensitive information. Data ranging from a user’s location to their personal health information and sexual orientation is sold to data brokers for a host of functions: from providing targeted advertising to enhance retail sales, to ascertaining life insurance eligibility and the price of the premium charged, to facilitating workplace discrimination.
It is no secret that employers track the ‘corporate wellness’ (details pertaining to personal health) of their employees using wearable health apps, to save on employee health insurance. The expansion of this trend to more intimate details is inevitable. Shockingly, apps have even been designed for men to track their women colleagues’ menstrual cycles, so they can avoid interacting with them on the premise that ‘women are irrational’ during this period. Women already face discrimination during hiring and promotions owing to their marital and maternal status, a problem exacerbated by the availability of apps designed to track such intimate data.
‘Trakher’ — a period app designed for men to discreetly track the menstrual cycles of their partners, available for download on the Google Play Store.
Though these examples seem extreme, in reality, such information in the wrong hands could clearly prove detrimental to an individual’s interests, especially if that individual is a woman. While men face similar threats from healthcare technology, women face greater harms due to menstrual or reproductive surveillance, a nuanced subset of surveillance capitalism. More often than not, these apps also enrich men, marketers, and big pharmaceutical companies: of the billions invested in the sector, “only 10% of all femtech investment funds go to women-led enterprises.”
So, what are governments doing about this?
Developed digital jurisdictions such as the European Union (EU) and the United States have begun to recognise the perils of privacy and data security in the femtech sector.
The EU’s General Data Protection Regulation (GDPR) provides strong protections governing how businesses and organisations should handle the sensitive digital personal information of EU residents. The GDPR prevents the dissemination of data to third parties without users’ informed consent, and provides them with the right to delete their data. The same protections are not present in other jurisdictions.
In the US, where a majority of femtech companies are based, a sectoral approach to data protection is followed. The primary piece of legislation at play here is the Health Insurance Portability and Accountability Act (HIPAA), a 1996 federal law that limits how healthcare providers can share your health information. However, femtech apps find themselves outside the scope of such laws, as they are rarely owned, conceptualised, or distributed by healthcare professionals; nor do they have the requisite technology infrastructure that serves as a conduit for professional health information.
Instead, more often than not, they are merely treated as private companies that provide specialized services pertaining to women’s health and collect sensitive data.
This leaves users with fewer protections and allows femtech apps to do more with the data they collect without facing legal repercussions. Privacy International’s 2019 analysis of 36 menstruation apps raised serious concerns even about compliance with the GDPR’s obligations of informed user consent and transparency over data-sharing practices: more than 60% of the apps tested were found to be transferring data, including sensitive health and behavioural data, to third parties.
India is also emerging as a large market for women’s healthcare, and in turn, for femtech’s many grey areas. India’s leading mobile-based period tracking app, “Maya”, has previously come under the scanner for selling collected data to third parties. Maya denied the allegations; however, the episode shed light on India’s general lack of a strong data governance framework that protects and preserves privacy rights online.
Though the Government is striding towards a digitised healthcare system through the National Digital Health Mission, the same cannot be said of frameworks that would enhance individuals’ digital awareness and literacy, or make them more cognizant of their data hygiene. As a result, nascent femtech initiatives in India can continue to operate in a vacuum of user-centric privacy laws.
How should governments regulate how femtech companies use my data then?
At the outset, it is evident that creating a safer femtech environment requires companies to enhance consumer knowledge of their right to control their data. Yet, though some femtech companies have made an effort to help consumers better understand their privacy policies and business models, most have not adopted such ‘privacy by design’ measures.
India, which is in the precarious position of developing its first comprehensive data governance framework from scratch, can take these issues by the horns by learning from the hurdles faced by developed jurisdictions. The first steps should include incorporating the basic principles of data protection: obtaining ‘informed’ consent from the user, respecting the boundaries of purpose limitation, and providing users with the right to access and edit the data held on them by data processors. Developing and promoting strong certification standards for data security infrastructure can also help win back consumer trust, while widespread digital awareness programmes can create more educated and informed users of femtech apps.
Femtech has revolutionised women’s health, be it in terms of access to information, or solutions to pressing health issues. However, without reimagined legal understandings of the services these companies provide, there is a long way to go before femtech lives up to its promise of empowering women as opposed to commodifying them.
Shefali Mehta is the strategic engagement and research coordinator at The Dialogue. This article was first published in The Bastion.