Friday, May 14, 2021

Lawmakers target insidious digital ‘dark patterns’



In 2010, British designer Harry Brignull coined a handy new term for an everyday annoyance: dark patterns, meaning digital interfaces that subtly manipulate people. It became a term of art among privacy activists and researchers. Now, more than a decade later, the term is taking on new legal weight.

Dark patterns come in many forms and can trick a person out of time or money, or into forfeiting personal data. A common example is the digital obstacle course that appears when you try to delete an online account or subscription, such as for a streaming TV service, and are asked repeatedly whether you really want to cancel. A 2019 Princeton study of dark patterns in e-commerce identified 15 types, including obstacles to canceling subscriptions and countdown timers that rush consumers into hasty decisions.

A new California law approved by voters in November bans certain dark patterns that steer people into giving businesses more data than they intend. The California Privacy Rights Act aims to strengthen the state’s landmark privacy law. The section of the new law defining user consent states that “consent obtained through the use of dark patterns does not constitute consent.”

It is the first time the term dark patterns has appeared in US law, but probably not the last, says Jennifer King, a privacy specialist at the Stanford Institute for Human-Centered Artificial Intelligence. “It’s probably going to proliferate,” she says.

State senators in Washington introduced their own privacy bill this month, a third attempt to pass a law that, like California’s, is motivated in part by the absence of broad federal privacy rules. This year’s bill copies California’s ban on using dark patterns to obtain consent verbatim. A competing bill unveiled Thursday and backed by the ACLU of Washington does not include the term.

King says other states, and perhaps federal lawmakers emboldened by the Democrats’ takeover of the US Senate, could follow suit. A bipartisan duo of senators took aim at dark patterns with the unsuccessful Deceptive Experiences To Online Users Reduction Act of 2019, although the bill’s text does not use the term.

There is one catch to California’s status as the country’s premier dark patterns regulator: it’s unclear exactly which dark patterns will become illegal when the new law takes full effect in 2023. The rules are to be worked out by a new California privacy agency that won’t begin operating until later this year. The law defines a dark pattern as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice, as further defined by regulation.”

James Snell, a privacy partner at the law firm Perkins Coie in Palo Alto, California, says it’s not yet clear whether, or what, specific rules the new privacy agency will develop. “It’s a little troubling for companies trying to comply with the new law,” he says.

Snell says clear limits on what’s acceptable, such as restrictions on how a business obtains consent to use personal data, could benefit both consumers and businesses. California’s law may also prove more notable for catching legal language up with privacy jargon than for dramatically expanding regulatory power. “It’s a cool name, but it just means you’re being misleading or deceptive, and there’s a host of statutes and common law that deal with that already,” Snell says.

Alastair Mactaggart, the San Francisco real estate developer who powered the CPRA and also helped create the law it revises, says dark patterns were added in an attempt to give people more control over their private lives. “The playing field isn’t level, because you have the smartest minds on the planet trying to make this as difficult as possible for you,” he says. Mactaggart believes the dark patterns rules should eventually allow regulators to act against slippery behavior that currently escapes scrutiny, such as making it easy to opt in to web tracking but extremely difficult to use the opt-out that California law requires.

King, of Stanford, says that’s plausible. Enforcement by US privacy regulators typically centers on cases of outright deception; California’s dark patterns rules could enable action against clearly harmful tricks that fall short of that. “Deception is instilling a false belief, but dark patterns are more often a business leading you down a predetermined path, more like coercion,” she says.

