
Fighting dark patterns - regain your free will online


Author: Marie Potel

About this content

Have you ever struggled to resist sites or apps that prompt you to share ever more personal data? Who hasn’t casually clicked the big green “I agree” button instead of the little grey link below? And who hasn’t lost patience trying to unsubscribe from a website, when it took only one click to “try for free”?


These features are called “dark patterns” or “deceptive patterns”: interfaces that deceive or manipulate users into acting without realizing it, or against their own interests.


Why should we care? Because…


✅ dark patterns harm the autonomy, welfare, and privacy of individuals, and are psychologically detrimental


✅ a number of laws already prohibit dark patterns, and the global legislative framework is shifting towards clarity, transparency, accessibility and fairness by design


✅ dark patterns affect competition and trust in brands, and even put our democratic models at risk


And because, ultimately, there is nothing sustainable about tricking users.


Dark patterns are pretty much everywhere online, and yet being deceived or manipulated is not inevitable. Once a month on this podcast, Marie Potel, founder of the legal design agency Amurabi and of the platform fairpatterns.com, explores with her guests every aspect of this dark and broad subject: regulation, ethics, marketing, user experience, and much more.


A question or a need for support? Visit fairpatterns.com or contact Marie Potel on LinkedIn!


This podcast is created and presented by Marie Potel, produced by Amicus Radio, and directed by Leobardo Arango.


Original music: Alexis Mallet.

Hosted by Acast. Visit acast.com/privacy for more information.

Marie Potel
Management & Leadership, Marketing, Marketing & Sales, Leadership, Social Sciences, Science, Economics
Episodes
  • Beyond the Interface: Building Safe and Fair Experiences with Stephanie Lucas
    2025/05/16

    In this episode of Fair Patterns, we sit down with Stephanie, a leading voice in ethical design and founder of the Virtuous Designers League, to explore how we can rebuild digital spaces rooted in integrity and trust.

    Stephanie shares her journey from advertising to shaping trust and privacy experiences at LinkedIn, offering unique insights into why deceptive design practices—also known as dark patterns—persist despite growing awareness and regulation. Together, we examine how companies can break free from short-term manipulative tactics and instead build long-term, meaningful relationships with their users through fairness, transparency, and value-driven design.

    From the power of respecting user agency to the business case for ethical design, this conversation is packed with practical advice, industry anecdotes, and inspiring ideas for designers, product leaders, and anyone committed to creating a more honest digital future.

    Join us as we uncover why respecting users isn’t just the right thing to do—it’s the foundation for sustainable business success. Don’t miss this powerful episode on designing with conscience and turning trust into a true competitive advantage.


    39 min
  • The Future of Fairness: Navigating AI’s Ethical Landscape with Luc Julia
    2025/01/14

    In this episode of Fair Patterns, we dive deep into the ethical dimensions of artificial intelligence with none other than Luc Julia, a visionary computer scientist, innovator, and co-creator of Siri. Together, we explore the critical challenges and opportunities in designing AI systems that prioritize fairness, transparency, and user empowerment.


    Luc shares his insights on the misuse of AI through dark patterns, the environmental impact of AI technologies, and the importance of balancing innovation with ethical responsibility. From his groundbreaking work in human-computer interaction to his perspectives on the future of AI, Luc emphasizes the role of education and collaboration in creating technology that truly serves humanity.


    Join us as we uncover how AI can be a force for good—augmenting human intelligence, fostering trust, and elevating ethical standards in technology. Don’t miss this enlightening conversation with one of the leading minds in AI today.


    26 min
  • Dark Patterns and Consumer Well-Being: Insights from Chandni Gupta
    2024/10/16

    In this episode of Fair Patterns: Regain Your Freedom Online, we welcome Chandni Gupta, Deputy CEO and Digital Policy Director at the Consumer Policy Research Centre (CPRC), Australia’s leading consumer policy think tank. Chandni shares insights from CPRC’s 2022 report, Duped by Design, which revealed that 83% of Australians have experienced negative consequences from dark patterns—deceptive online designs that manipulate users.


    We discuss how dark patterns not only lead to financial losses but also affect privacy, mental well-being, and consumer trust. Chandni highlights the legal gaps in Australia, where 8 out of 10 dark patterns remain legal, and contrasts this with stronger protections in the U.S. and Europe.

    Chandni also emphasizes the role of businesses in improving the online experience by rethinking design practices to build trust. Finally, she shares insights from her Churchill Fellowship, which focuses on finding global solutions to protect consumers from digital harm.


    For more insights, check out Chandni’s research and fellowship:

    • Duped by Design – Manipulative online design: Dark patterns in Australia
    • Let Me Out – Subscription trap practices in Australia
    • Cost of managing your privacy
    • Chandni’s Churchill Fellowship


    Subscribe to CPRC’s newsletter at cprc.org.au, and don’t forget to subscribe to Fairpatterns' newsletter: Unlock the Trust.


    32 min