In another day and age, most love stories began with a glance across a crowded room, a chance encounter at the workplace, or an introduction through mutual friends. Today, the landscape has shifted dramatically. Roughly one third of U.S. adults now say they have used a dating site, a number that jumps to 53% for 18–29-year-olds.[1] Dating apps have allowed access to people you’d never otherwise meet. Yet this promise came with a hidden cost.

In its infancy, online dating was a static process: users filled out questionnaires and websites showcased profiles. Back in the 2000s, these platforms generated revenue through subscription fees.[2] Then the business model evolved. Corporations discovered something far more valuable than monthly payments: user data. They began deploying machine learning models to optimize matches, not for love, but for engagement.[3] With every swipe, additional data was collected. The more data was collected, the more behavior could be predicted, potentially for profit through premium features or the sale of data to third parties.[4] These business practices, often conducted without the user’s explicit consent, created cascading problems: privacy breaches, cybersecurity vulnerabilities, and an erosion of user autonomy.[5]

Generative AI has only accelerated this trajectory. In 2024, Tinder launched Photo Selector, an AI tool that analyzes photos on a user’s device and selects the ones most likely to generate matches.[6] In 2025, Hinge followed by deploying AI-driven prompt feedback, purportedly developed by PhD behavioral scientists, to help users craft authentic profiles.[7] Grindr went further: the corporation is testing an AI agent that functions as a dating concierge and matchmaker in one. It scouts potential matches and converses with other users’ AI agents to assess compatibility before humans even meet.[8] According to Grindr CEO George Arison, this automation saves time by identifying deal-breakers through bot-to-bot interactions.[9] But the convenience masks a troubling reality: machines now mediate human intimacy, learn users’ desires, and predict their choices, all while users remain largely unaware of how their data is being used.[10]

These privacy risks predate AI. Civil society groups have sounded the alarm for years.[11] A 2024 study found that six out of fifteen dating apps leaked users’ exact locations, exponentially increasing personal safety risks.[12] The study identified a range of vulnerabilities, such as leaks in API traffic, that exposed not only the personal data users provide at profile creation but also metadata about user behavior, including likes, recent activity, and search filters.[13] And here lies the legal vacuum: while certain federal statutes restrict the sale of specific categories of data, such as health or financial information, there is no overarching federal prohibition on buying consumer data that companies share with aggregators or brokers.[14]

This data vulnerability has spawned a troubling countermovement: safety-focused apps that operate in legal grey zones. One such app, Tea Dating Advice, uses AI to verify users’ identities through facial recognition on an uploaded selfie. It then invites users to upload photos of others to run reverse image searches and background checks.[15] Images are scanned, identities verified, and reputations rated, all without the individuals’ knowledge, let alone consent. Safety, a noble goal, is undermined by questionable practices that lack adequate protections and can, in extreme cases, prove harmful to individuals.

Corporate accountability remains elusive. The Communications Decency Act, enacted in 1996, shields technology companies from most lawsuits by granting them broad immunity under Section 230 for user-generated content, and in many cases even for algorithmic recommendations that amplify such content.[16] This legal shield has proven nearly impenetrable.[17] The inadequacy of this framework is evident in the Matthew Herrick case. After an ex-boyfriend created a fake Grindr account in his name and used it to solicit hundreds of men, Herrick sued Grindr.[18] Both the Southern District of New York and the Second Circuit read Section 230 as barring every type of claim against Grindr, including products liability, relying on Section 230(f)(2)’s definition of an interactive computer service.[19] When his counsel petitioned for certiorari in 2019, the Supreme Court declined to review the case.[20] Justice Thomas has since cited the case in a statement respecting the denial of certiorari in another matter, signaling openness to reconsidering Section 230, a statute he views as too broadly construed.[21] Today, recourse for plaintiffs remains sparse. Legal experts have suggested reforming Section 230[22] and supplementing privacy tort law with traditional tort claims to help courts account for the ways internet services amplify privacy harms.[23]

Small changes, such as continuous opt-in consent for AI use, should be mandatory.[24] Data deletion policies should be more rigorous, to prevent indefinite data retention and minimize exposure during potential cybersecurity breaches. Europe offers a glimpse of what accountability could look like. In June 2025, a complaint was filed against Bumble over its Icebreaker feature, an AI tool aimed at facilitating conversations that had been rolled out to millions of European users without consent.[25] The complaint alleged that Bumble had failed to properly inform users about data processing, lacked a valid legal basis for processing (including sensitive data such as sexual orientation), and responded inadequately to data access requests.[26] Although the case remains pending, it illustrates how European regulators might use the GDPR to confront the growing use of personal data in AI systems on dating platforms.

“The right to be let alone,” as Justice Brandeis wrote, is the foundation of privacy.[27] In romance, this principle becomes existential. Yet we voluntarily surrender that privacy, trusting corporations to protect our tender moments in our hope for connection. Courts will continue to face questions of liability when algorithms fail, when data breaches expose intimate secrets, and when AI companions blur the line between product and assistant. Technology should serve human connection, not replace it. As we swipe through modern romance, we must ensure that the law protects our right to genuine intimacy: unpredictability, vulnerability, and perpetual becoming. After all, love has always required a leap of faith, not just an optimized algorithm.

[1] Emily A. Vogels & Colleen McClain, Key findings about online dating in the U.S., Pew Research Center (2023), https://www.pewresearch.org/short-reads/2023/02/02/key-findings-about-online-dating-in-the-u-s/.

[2] Robert L. Mitchell, Online dating: The technology behind the attraction, Computerworld (2009), https://www.computerworld.com/article/1580886/online-dating-the-technology-behind-the-attraction.html.

[3] Bobby Allyn, Study: Tinder, Grindr And Other Apps Share Sensitive Personal Data With Advertisers, NPR (2020), https://www.npr.org/2020/01/14/796427696/study-grindr-tindr-and-other-apps-share-sensitive-personal-data-with-advertisers.

[4] Id.

[5] Evan Michael Gilbert, Antitrust and Commitment Issues: Monopolization of the Dating App Industry, 94 N.Y.U. L. Rev. 862 (2019).

[6] Tinder® Unveils “Photo Selector” AI: Feature to Make Choosing Profile Pictures Easier, Tinder Newsroom (2024), https://www.tinderpressroom.com/Tinder-R-Unveils-Photo-Selector-AI-Feature-to-Make-Choosing-Profile-Pictures-Easier.

[7] Hinge Launches Prompt Feedback to Help Daters Create Unique and Authentic Profiles, Hinge (2025), https://hinge.co/newsroom/prompt-feedback.

[8] Belle Lin, Grindr Aims to Build the Dating World’s First AI “Wingman,” Wall St. J. (2024), https://www.wsj.com/articles/grindr-aims-to-build-the-dating-worlds-first-ai-wingman-8039e091.

[9] Id.

[10] Siân Boyle, AI “wingmen” bots to write profiles and flirt on dating apps, The Guardian (2025), https://www.theguardian.com/lifeandstyle/2025/mar/08/ai-wingmen-bots-to-write-profiles-and-flirt-on-dating-apps.

[11] Civil Society Open Letter Urges Bumble to Take Privacy Seriously, Mozilla Foundation, https://www.mozillafoundation.org/en/campaigns/civil-society-open-letter-urges-bumble-to-take-privacy-seriously.

[12] Karel Dhondt et al., Swipe Left for Identity Theft: An Analysis of User Data Privacy Risks on Location-based Dating Apps, USENIX (2024), https://www.usenix.org/system/files/usenixsecurity24-dhondt.pdf.

[13] Id.

[14] Id.

[15] Elliot Williams, Is this even legal? Answers to every question about the Tea app drama, CNN (2025), https://www.cnn.com/2025/07/25/us/tea-app-dating-privacy-cec.

[16] See Alan Z. Rozenshtein, Interpreting the Ambiguities of Section 230, 41 Yale J. Reg. Bulletin 60 (2024) (on how courts have grappled to decide Section 230 issues on social-media platforms and content-amplification algorithms); Vincent Dumas, Enigma Machines: Deep Learning Algorithms as Information Content Providers under Section 230 of the Communications Decency Act, 6 Wis. L. Rev. 1581 (2022); Max Del Real, Breaking Algorithmic Immunity: Why Section 230 Immunity May Not Extend to Recommendation Algorithms, Wash. L. Rev. Online 1 (2024).

[17] Id.

[18] Herrick v. Grindr LLC, 765 F. App'x 586 (2d Cir. 2019).

[19] “The term ‘interactive computer service’ means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.” 47 U.S.C. § 230(f)(2).

[20] Carrie Goldberg, Winning Through Losing, Americanbar.org (2020), https://www.americanbar.org/groups/diversity/women/publications/perspectives/2021/december/winning-through-losing/.

[21] Malwarebytes, Inc. v. Enigma Software Group USA, LLC, 141 S. Ct. 13 (2020).

[22] Danielle Keats Citron, How To Fix Section 230, 103 B.U. L. Rev. 713 (2022).

[23] Danielle Keats Citron, Mainstreaming Privacy Torts, 98 Calif. L. Rev. 1805 (2010).

[24] Paige Collings, Dating Apps Need to Learn How Consent Works, Electronic Frontier Foundation (2025), https://www.eff.org/deeplinks/2025/07/dating-apps-need-learn-how-consent-works.

[25] Bumble’s AI icebreakers are mainly breaking EU law, noyb.eu (2025), https://noyb.eu/en/bumbles-ai-icebreakers-are-mainly-breaking-eu-law.

[26] Id.

[27] Samuel D. Warren & Louis D. Brandeis, The Right to Privacy, 4 Harv. L. Rev. 193, 193 (1890).