Designing for (and Against) Control: Ethical UX in the Age of Surveillance

We live in an era where nearly every interface, every app, and every “frictionless” interaction comes with a hidden cost: control.

Whether it’s a social media feed engineered for maximum attention, a smart home assistant listening to every word, or a seemingly benign login screen nudging you to share just a little more data, design choices shape power dynamics between companies and users.

As a user experience consultant and designer with a background in technology, I think constantly about these dynamics. Imagine a world where surveillance is pervasive, augmentation alters agency, and technology promises empowerment while embedding dependence and control. This isn’t just fiction—it’s the context we design for today.

In this article, I want to examine how UX design can be used both to enable and to resist control in the age of surveillance. We’ll explore:

  • How design patterns shape user behaviour and power asymmetries

  • Examples of “dark patterns” and surveillance-enabling design

  • Strategies for ethical, user-centred alternatives

  • How speculative thinking can help designers anticipate unintended consequences

If you’re a designer, developer, strategist, or founder, this is a call to think more deeply about the ethical dimension of what you build.

The Problem: Design as an Instrument of Control

UX design is about making things easy to use. But easy for whom? And easy to do what?

When we make interfaces seamless and “frictionless”, we reduce users’ cognitive load, but sometimes we also reduce their ability to make deliberate choices. This is especially true in contexts where:

  • Companies benefit from gathering personal data

  • Behavioural economics is weaponised to maximise time-on-site or in-app purchases

  • Choices are framed to obscure or discourage privacy-protective options

Surveillance capitalism depends on these design decisions. It’s not just the code that determines how invasive an app is, but the interface. The button placement, the copy, the modal windows, the visual hierarchy—these shape what users see, choose, and consent to.

Examples of design as control include:

  • Default opt-in to data sharing

  • Confusing or hidden privacy settings

  • Endless scrolling to maximise engagement

  • Interfaces that intentionally delay or obscure account deletion

  • Consent banners designed to push “Accept All”

These aren’t mistakes. They’re deliberate design choices serving business models dependent on control over user data and behaviour.

Real-World Examples: Dark Patterns and Surveillance Enablers

Dark patterns are well-studied, but they remain widespread. Let’s look at a few:

1. Privacy Zuckering

Named after Facebook’s founder, this pattern tricks users into publicly sharing more than they intended. Default settings often prioritise visibility over privacy.

2. Roach Motel

Easy to get in, hard to get out. Account sign-up is one click. Account deletion is a maze of confirmation screens and hidden links.

3. Confirmshaming

Using guilt to push consent. For example, “No thanks, I don’t care about my privacy” as the decline option.

4. Forced Continuity

Users hand over card details for a free trial, get charged automatically when it ends, and find that cancelling requires a phone call or a maze of confusing menus.

5. Obscured Opt-Outs

Designers may bury the “Decline” or “Manage Preferences” options behind modals, tiny text, or extra steps, making it far easier to “Accept All”.

These patterns are effective precisely because they exploit cognitive biases. They’re designed to remove user agency under the guise of convenience.

Why Ethical UX Matters (and Why It’s Hard)

Many designers don’t set out to manipulate users. Instead, they’re working under business pressures that reward certain metrics:

  • Engagement time

  • Conversion rates

  • Data capture rates

Ethical design often feels like swimming against the tide. It may reduce short-term KPIs in favour of long-term trust. It can be difficult to sell to stakeholders who want immediate results.

Yet ignoring the ethical dimension risks long-term consequences:

  • Regulatory penalties and user backlash (think GDPR fines or public scandals)

  • Erosion of trust in the brand

  • Personal reputational risk for designers

  • Contributing to a surveillance society that limits human agency

In short, designers hold real power—and with that power comes responsibility.

Designing for Agency: Principles for Ethical UX

If you want to design against control—helping users retain meaningful agency in a surveillance-prone world—there are practical strategies you can adopt.

1. Make Privacy the Default

Instead of shipping with tracking switched on, make privacy the baseline:

  • Default settings that minimise data collection

  • Easy-to-understand explanations of what data is collected and why

  • No forced consent (genuine choice)

This aligns with both ethical practice and modern regulations like GDPR.
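To make this concrete, here’s a minimal sketch in TypeScript of what privacy-by-default can look like in code. The names (UserPrivacySettings, recordConsent, and so on) are illustrative, not from any particular framework:

```typescript
// A minimal sketch of privacy-by-default, assuming a hypothetical
// UserPrivacySettings shape; adapt the fields to your own product.
interface UserPrivacySettings {
  analyticsEnabled: boolean;   // behavioural analytics
  personalisedAds: boolean;    // ad targeting
  locationSharing: boolean;    // precise location
  crashReports: boolean;       // diagnostics
}

// The zero-state opts the user OUT of everything that isn't
// strictly necessary to run the product.
const defaultPrivacySettings: UserPrivacySettings = {
  analyticsEnabled: false,
  personalisedAds: false,
  locationSharing: false,
  crashReports: false,
};

// Any move away from the baseline is an explicit, user-initiated
// change, recorded with a timestamp so consent can be demonstrated.
function recordConsent(
  current: UserPrivacySettings,
  change: Partial<UserPrivacySettings>
): { settings: UserPrivacySettings; consentedAt: string } {
  return {
    settings: { ...current, ...change },
    consentedAt: new Date().toISOString(),
  };
}
```

The point is structural: the default state collects nothing optional, and every departure from it is an explicit, auditable user action.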

2. Reduce Cognitive Load without Removing Choice

Good UX aims to reduce friction. But friction isn’t always bad—some friction is necessary for informed consent.

Example: Instead of one big “Accept All” button and a buried “Manage Preferences,” provide clear, accessible controls.

Design for clarity over speed when it comes to consent and settings.
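As a rough sketch, “clarity over speed” can be encoded directly in the consent component: accept and reject share the same style and the same number of clicks. Everything here (ConsentChoice, renderButton) is hypothetical:

```typescript
// Sketch: accept and reject rendered by the same component, with the
// same style and the same single click. All names are illustrative.
type ConsentChoice = "accept-all" | "reject-all" | "manage";

interface ConsentButton {
  label: string;
  choice: ConsentChoice;
  style: "primary"; // deliberately identical for accept and reject
}

const consentButtons: ConsentButton[] = [
  { label: "Accept all", choice: "accept-all", style: "primary" },
  { label: "Reject all", choice: "reject-all", style: "primary" },
  { label: "Manage preferences", choice: "manage", style: "primary" },
];

// Each choice is one click from the banner; none is buried in a modal.
function renderButton(btn: ConsentButton): string {
  return `<button class="consent-${btn.style}" data-choice="${btn.choice}">${btn.label}</button>`;
}
```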

3. Be Transparent about Data Use

People don’t want walls of legalese. They want:

  • Plain-language explanations

  • Clear icons or visuals

  • Contextual prompts (“We need your location for X”)

Transparency helps users make informed decisions—and builds trust.
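A contextual prompt might look something like the sketch below, assuming a browser context. window.confirm stands in for whatever consent UI you’d actually build, and requestLocationForDelivery is a hypothetical name:

```typescript
// Sketch of a contextual permission request, assuming a browser
// context. window.confirm stands in for a real consent UI.
async function requestLocationForDelivery(): Promise<GeolocationPosition | null> {
  // Context first: one plain-language sentence before the system dialog.
  const userAgreed = window.confirm(
    "We use your location once to estimate delivery time. " +
      "It is not stored or shared. Share your location?"
  );
  if (!userAgreed) return null; // declining is a first-class path

  return new Promise((resolve) => {
    navigator.geolocation.getCurrentPosition(
      (position) => resolve(position),
      () => resolve(null) // treat denial as a normal outcome, not an error
    );
  });
}
```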

4. Avoid Manipulative Language

Copy matters. Avoid guilt trips, shaming, or scare tactics to push consent or purchases.

Example:

  • Bad: “No, I don’t care about saving money”

  • Better: “Decline offer”

Ethical design respects user autonomy.

5. Empower Users to Leave

Make it easy to:

  • Delete accounts

  • Withdraw consent

  • Export data

This signals respect for the user and their ownership of their data.
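In code, that respect shows up as exit paths being first-class API surface rather than afterthoughts. A hedged sketch, with hypothetical endpoint paths and response shapes:

```typescript
// Sketch: exit paths as first-class API surface. The endpoint paths
// and response shapes here are hypothetical.
async function deleteAccount(userId: string): Promise<void> {
  // One authenticated call; no retention flows, no mandatory phone call.
  const res = await fetch(`/api/users/${userId}`, { method: "DELETE" });
  if (!res.ok) throw new Error(`Deletion failed: ${res.status}`);
}

async function exportUserData(userId: string): Promise<Blob> {
  // A machine-readable export, in the spirit of GDPR's data
  // portability right.
  const res = await fetch(`/api/users/${userId}/export`);
  if (!res.ok) throw new Error(`Export failed: ${res.status}`);
  return res.blob();
}
```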

6. Consider Edge Cases and Vulnerable Users

Surveillance harms often fall hardest on marginalised communities.

When designing, ask:

  • Who could be most harmed by this data collection?

  • Could this be used to discriminate or target?

  • How can we mitigate those risks?

Inclusive design is ethical design.

Anticipating Failure Modes: Lessons from Speculative Thinking

As an aspiring novelist writing about augmentation, AI, and surveillance cartels, I’ve spent a lot of time imagining how tech can go wrong. This is a valuable design skill.

Speculative design is about anticipating unintended consequences before they happen. It’s asking:

  • How could this interface be misused?

  • What happens if the company is acquired by someone with different values?

  • What if the data falls into the wrong hands?

This approach isn’t about fear-mongering—it’s about responsibility. Just as engineers design safety features into bridges, designers should build safeguards into interfaces.

Example Exercise: Pre-Mortem for UX

  • Gather your team.

  • Ask: “Imagine this feature is being called out in the media for unethical design. Why? What went wrong?”

  • Use these answers to refine your design before launch.

Designing for Trust in the Age of AI

AI complicates these questions. Modern interfaces increasingly:

  • Predict user needs

  • Personalise content

  • Automate decisions

While these can improve UX, they can also introduce opacity and control:

  • Black-box recommendations

  • Emotional manipulation

  • Automated surveillance

Ethical AI design means:

  • Explainability (help users understand why a decision was made)

  • Control (allow users to opt out or adjust preferences)

  • Privacy (minimise unnecessary data collection)

Designers must hold the line on these principles—even as business incentives push towards maximum personalisation and data extraction.
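One way to hold that line is to make explanation and control part of the data contract itself, so the UI cannot ship without them. A speculative sketch, not any vendor’s actual API:

```typescript
// Speculative sketch: the recommendation payload carries its own
// explanation and controls, so the UI cannot ship without them.
interface Recommendation {
  itemId: string;
  // Plain-language reason shown beside the item,
  // e.g. "Because you read three articles on privacy law".
  explanation: string;
  controls: {
    canDismiss: boolean;                 // "not interested"
    canDisablePersonalisation: boolean;  // link to settings
  };
}

const example: Recommendation = {
  itemId: "article-42",
  explanation: "Recommended because you read three articles on privacy law.",
  controls: { canDismiss: true, canDisablePersonalisation: true },
};
```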

Real-World Inspiration: Companies Doing It Better

Not every company relies on dark patterns. Some prioritise ethical design:

  • Apple: Emphasises on-device processing and privacy labels

  • Signal: Collects minimal user data

  • DuckDuckGo: Privacy-focused search with clear settings

These choices aren’t purely altruistic. They differentiate brands in a market where user trust is increasingly valuable.

Designers as Stewards of User Agency

If there’s a single idea I want to emphasise, it’s this:

Designers shape power dynamics.

  • We can design to maximise control for companies.

  • Or we can design to maximise agency for users.

Ethical UX isn’t just about complying with regulations. It’s about respecting people as autonomous, thinking beings. It’s about resisting the temptation to turn design into manipulation.

Bringing Fictional Thinking into Real-World Practice

I write worlds where people lose control—where interfaces and systems are built to dominate, deceive, or surveil. That’s not just entertainment. It’s a warning.

But it's also a tool. By imagining extreme futures, we can better understand the risks of today’s design decisions.

Questions to ask your team:

  • What would an unethical version of this feature look like?

  • Could this design be misused under authoritarian regimes?

  • Are we normalising surveillance in subtle ways?

This isn’t paranoia. It’s professional responsibility.

Conclusion: Choosing the Future We Design

Design is never neutral. Every choice reflects values—whether we acknowledge them or not.

In the age of surveillance, where incentives often push us towards user manipulation and data extraction, designers, developers, and strategists have a choice:

  • Design for control.

  • Or design against it.

By adopting ethical UX principles, practising transparency, respecting user agency, and thinking speculatively about consequences, we can build products that serve people—not exploit them.

If you're a founder, designer, or developer reading this: ask yourself what kind of future you want your work to enable. Because one way or another, we are the ones shaping it.


About the Author

I’m Abi Fawcus, a user experience consultant, graphic and web designer, programmer, and long-time observer of the tech industry. My work blends design strategy with future-focused thinking, helping clients create better, more ethical products.