By: Linzi Penman, Naomi Pryde, Sarah Cunningham, and Kirsty McKay
Investment in female health and wellbeing has grown rapidly: Forbes and Dealroom reported that 2023 saw USD 1.14 billion raised collectively across 120 deals in ‘Femtech’. The term refers to technology products and services that address health needs and concerns experienced disproportionately or solely by women. With increased awareness of women’s health issues, evolving societal perspectives, and the development of artificial intelligence, the Femtech Landscape Report estimates that the industry will be worth over a trillion dollars by the end of 2027. In Israel, which ranks second globally in Femtech investment and has a thriving sector, startups like AIVF are leading the way in applying AI to female health.
In the UK, several global household brands and emerging growth companies are looking to empower female users with data and technology, helping them make more informed choices about their health and removing barriers to appropriate healthcare arising from, for example, gender bias, stigma, and a lack of funding. Less than 5% of publicly funded research in the UK is dedicated to reproductive health, despite it causing health issues for a third of women. Part of this systemic issue may be attributed to a lack of insight or awareness: women have historically been excluded from clinical trials because of fluctuations in their hormones.
Femtech is helping to address these challenges by providing free or low-cost subscription-based access to female health and wellbeing information. Examples include period-tracking apps, such as Clue; virtual online clinics, like Maven; and fertility-tracking bracelets, such as Ava. Some of these technologies can measure and track stress levels, weight, hormonal changes, and menstrual cycles. This generates large volumes of data that could help to redress the legacy of female health being comparatively under-researched. Of course, this data is considered “special category” or “sensitive” and is therefore subject to enhanced data protection requirements.
A poll conducted in September 2023 by the UK’s data regulator, the Information Commissioner’s Office (ICO), revealed that more than 50% of the women surveyed said that:
- transparency about how their data is used; and
- the security of their data,
are of greater concern than the cost or ease of use of Femtech apps. For example, it has been reported that some women attempt to conceal their pregnancies from their phones by not buying baby clothes online or using pregnancy apps, to avoid being monitored and potentially subjected to targeted marketing. Companies in this space therefore have a significant trust gap to overcome in encouraging women to continue using their online services. The ICO is investigating technologies in this space to identify whether the services negatively impact users from a privacy perspective, for example through confusing privacy policies, the storage of unnecessary volumes of data, or distressing advertising targeted at users without valid consent.
Statistics published by Google Ads showed that conversion rates can be up to five times higher for consented users, something Femtech companies ought to be aware of. This emphasises the importance of user-centric design, for example by embedding clear privacy language in user journeys at the point of data capture, outlining what data will be collected, what it will be used for, and whether it will be fed to data brokers in the advertising ecosystem. Many users are uncomfortable with the idea of their sexual health being labelled, so companies might also consider whether they can conduct advertising without analysing sensitive personal data, recognising that there is a balance to strike between brand loyalty and revenue driven from ads. Period-tracking app Flo appears to be alert to these issues: after reproductive privacy gained global attention following the landmark 2022 US Supreme Court decision overturning Roe v Wade, it launched an ‘anonymous mode’ feature that allows individuals to use the app without inputting personal data such as their name, email address or other identifiers.
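By way of illustration, the sketch below shows one way an ‘anonymous mode’ and consent-gated data sharing might be structured: accounts are keyed by a random pseudonym rather than a name or email, and sensitive data is shared with advertisers only on explicit opt-in. This is a minimal sketch, not Flo’s actual implementation; all names (`UserProfile`, `record_cycle_event`, `share_with_ad_partner`) are hypothetical.

```python
import uuid
from dataclasses import dataclass, field


@dataclass
class UserProfile:
    """Profile keyed by a random pseudonym rather than a name or email."""
    pseudonym: str = field(default_factory=lambda: uuid.uuid4().hex)
    ads_consent: bool = False        # opt-in, never pre-ticked
    analytics_consent: bool = False


def record_cycle_event(profile: UserProfile, event: dict, store: dict) -> None:
    """Store health data under the pseudonym only; no direct identifiers."""
    store.setdefault(profile.pseudonym, []).append(event)


def share_with_ad_partner(profile: UserProfile, event: dict) -> bool:
    """Sensitive data leaves the app only with explicit, recorded consent."""
    if not profile.ads_consent:
        return False                 # nothing enters the advertising ecosystem
    # ...send a minimised, pseudonymised payload to the partner here...
    return True


# Usage: no name or email is ever requested.
store: dict = {}
user = UserProfile()
record_cycle_event(user, {"day": 1, "flow": "light"}, store)
assert share_with_ad_partner(user, {}) is False  # consent defaults to off
```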
The UK Government appreciates the need to boost consumer confidence in buying and using tech products. On 29 April 2024, the UK Product Security and Telecommunications Infrastructure Act 2022 (“the Act”) came into effect, requiring manufacturers, importers, and distributors of UK consumer connectable (“smart”) products to meet minimum security requirements. It applies UK-wide. Many Femtech products are likely to be captured by this new law in one way or another, particularly given the number of Femtech applications available via internet-connected smartphones. The Act aims to reduce security vulnerabilities that could result in cyber-attacks; it introduces requirements regarding password complexity, minimum security update periods, and closer engagement between users and manufacturers on the reporting of security issues.
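As a simple illustration of the password requirement, a manufacturer might reject universal default passwords at device setup. The sketch below is illustrative only; the deny-list and minimum length are hypothetical choices, not values prescribed by the Act.

```python
# Hypothetical deny-list of factory defaults; the Act targets universal
# default passwords but does not prescribe this particular list.
COMMON_DEFAULTS = {"admin", "password", "12345678", "0000"}


def is_acceptable_device_password(password: str, min_length: int = 12) -> bool:
    """Reject factory defaults and trivially short passwords at setup."""
    if password.lower() in COMMON_DEFAULTS:
        return False
    return len(password) >= min_length


assert is_acceptable_device_password("admin") is False
assert is_acceptable_device_password("correct-horse-battery") is True
```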
Meanwhile, generative artificial intelligence (“GenAI”) is an inescapable buzzword. In the Femtech space, it is poised to reinvent the industry, as it can analyse vast amounts of unstructured data and identify patterns. One of the more prominent use cases of artificial intelligence is chatbots. Whilst chatbots have been used since the 1960s – ELIZA famously convinced some users they were conversing with a human – GenAI can “create new content” and could be leveraged, for example, to power AI virtual health advisors that raise awareness of female health concerns. There are unprecedented opportunities for this technology to increase health equity, particularly as the UK government rejected the proposal to roll out mandatory menopause training for GPs last year – despite menopause being estimated to cost billions annually in productivity loss and healthcare costs.
Some companies are already exploring the use of large and small language models in this space. Any company looking to fine-tune a model would have to ensure that it had permission to use the health data inputted. The model itself is unlikely to constitute personal data; however, to fully leverage AI-powered solutions, companies are likely to add wider datasets to improve accuracy and efficiency. Individuals could then input their own data and receive personalised recommendations that adapt to each woman’s needs throughout the distinct stages of her life. For women suffering from conditions such as polycystic ovary syndrome (PCOS), a widely reported but underfunded health issue, the predictive capabilities of data-driven systems could forecast risks and empower women to take a more proactive approach to their healthcare, based on convenient access to real-time insights provided through an app. Importantly, Femtech does not just revolve around menstruation and family planning. AI analysis of mammograms can help with the early detection of breast cancer; personal nutrition plans can be tailored specifically to a woman’s health and nutrition needs; and the technology can help to predict the risk of diseases and genetic issues predominantly faced by women.
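To ground the point about predictive capabilities, the sketch below shows the simplest form such forecasting can take: estimating the next cycle start from historic start dates. Real apps use far richer models; this is a minimal, hypothetical illustration, and `predict_next_cycle` is an assumed name.

```python
from datetime import date, timedelta
from statistics import mean, stdev


def predict_next_cycle(start_dates: list[date]) -> tuple[date, float]:
    """Forecast the next cycle start from chronologically ordered start dates.

    Returns the predicted date and the spread of past cycle lengths (in days)
    as a rough measure of uncertainty.
    """
    lengths = [(b - a).days for a, b in zip(start_dates, start_dates[1:])]
    avg = mean(lengths)
    spread = stdev(lengths) if len(lengths) > 1 else 0.0
    return start_dates[-1] + timedelta(days=round(avg)), spread


history = [date(2024, 1, 3), date(2024, 1, 31), date(2024, 2, 29)]
predicted, uncertainty = predict_next_cycle(history)
print(predicted, f"+/- {uncertainty:.1f} days")
```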
Of course, ensuring accountability, transparency, and the safeguarding of fundamental rights (including privacy) from an ethical standpoint is critical. This processing would certainly require a data protection impact assessment. For Femtech AI solutions that incorporate medical devices or in vitro diagnostics and are deployed in the EU, additional obligations will apply under the EU AI Act, as these are considered ‘high-risk’. Historically, women have been erased from, or incorrectly accounted for in, medical studies, which has limited the medical advice that can be provided around female health issues. To avoid similarly restrictive outcomes, and to avoid falling foul of the EU AI Act, manufacturers must ensure that Femtech products are free from bias (or are clear about what bias may remain).
Bias is not limited to gender: it can also be present in relation to race, ethnicity, religion, sexual orientation, and socioeconomic and educational background. The medical advice relevant for one woman will not necessarily be relevant for another whose background differs in one (or more) of these respects. Avoiding bias is no mean feat. Data is king – or perhaps that should be “queen.” Extensive, diverse, and representative datasets will be required to feed any Femtech solution incorporating AI and to account for potential bias. That will require sufficient funding to allow the data to be properly obtained, assessed, and utilised, as the results are only ever as good as the data inputted. Done correctly, a well-funded Femtech industry that demonstrates to consumers that its products can help achieve health equity could hugely benefit the wider economy.
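One concrete way to make “accounting for bias” actionable is a subgroup audit: comparing a model’s outcome rates across demographic groups before deployment. The sketch below is a minimal, hypothetical example of such a check, not a complete fairness methodology; the function names and the 10% tolerance are assumptions.

```python
from collections import defaultdict
from statistics import mean


def subgroup_rates(records: list[tuple[str, bool]]) -> dict[str, float]:
    """Compute the rate of a model outcome (e.g. a 'high risk' flag)
    per demographic group from (group, outcome) pairs."""
    counts: dict[str, list[int]] = defaultdict(lambda: [0, 0])
    for group, outcome in records:
        counts[group][0] += int(outcome)
        counts[group][1] += 1
    return {g: pos / total for g, (pos, total) in counts.items()}


def flag_disparities(rates: dict[str, float], tolerance: float = 0.1) -> list[str]:
    """Flag groups whose rate deviates from the cross-group mean by more
    than the tolerance - a prompt for human review, not a verdict."""
    overall = mean(rates.values())
    return [g for g, rate in rates.items() if abs(rate - overall) > tolerance]


rates = subgroup_rates([("A", True), ("A", False), ("B", True), ("B", True)])
print(flag_disparities(rates))  # both groups flagged: 0.5 vs 1.0 diverge
```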
The opportunities in this sector are rapidly evolving and, deployed correctly, artificial intelligence can accelerate progress in bridging disparities and improving equal access to healthcare and education. However, the regulations in this space are complex and emerging on a global basis, so care must be taken to ensure that the data processed is adequately safeguarded.
Next Steps
You can find more views from the DLA Piper team on technology, regulation, and related legal issues on our blog, Technology’s Legal Edge.
If your organisation is deploying AI solutions, whether as part of Femtech or otherwise, you can download DLA Piper’s AI Act App and our AI Report, a survey of real-world AI use.
If you’d like to discuss any of the issues discussed in this article, get in touch with Linzi Penman, Naomi Pryde or your usual DLA Piper contact.