Your Brain Waves Are Up for Sale. A New Law Wants to Change That. (April 17, 2024)


By Jonathan Moens
April 17, 2024, NYT

Consumers have grown accustomed to the reality that their personal data, such as email addresses, social contacts, browsing history and genetic ancestry, are collected and often resold by the apps and digital services they use.

With the advent of consumer neurotechnologies, the data being collected is becoming ever more intimate. One headband serves as a personal meditation coach by monitoring the user’s brain activity. Another purports to help treat anxiety and symptoms of depression. Another reads and interprets brain signals while the user scrolls through dating apps, presumably to provide better matches. (“‘Listen to your heart’ is not enough,” the manufacturer says on its website.)

The companies behind such technologies have access to the records of the users’ brain activity — the electrical signals underlying our thoughts, feelings and intentions.

On Wednesday, Governor Jared Polis of Colorado signed a bill that, for the first time in the United States, tries to ensure that such data remains truly private. The new law, which passed by a 61-to-1 vote in the Colorado House and a 34-to-0 vote in the Senate, expands the definition of “sensitive data” in the state’s current personal privacy law to include biological and “neural data” generated by the brain, the spinal cord and the network of nerves that relays messages throughout the body.

“Everything that we are is within our mind,” said Jared Genser, general counsel and co-founder of the Neurorights Foundation, a science group that advocated the bill’s passage. “What we think and feel, and the ability to decode that from the human brain, couldn’t be any more intrusive or personal to us.”

“We are really excited to have an actual bill signed into law that will protect people’s biological and neurological data,” said Representative Cathy Kipp, Democrat of Colorado, who introduced the bill.

Senator Mark Baisley, Republican of Colorado, who sponsored the bill in the upper chamber, said: “I’m feeling really good about Colorado leading the way in addressing this and to give it the due protections for people’s uniqueness in their privacy. I’m just really pleased about this signing.”

The law takes aim at consumer-level brain technologies. Unlike sensitive patient data obtained from medical devices in clinical settings, which are protected by federal health law, the data surrounding consumer neurotechnologies go largely unregulated, Mr. Genser said. That loophole means that companies can harvest vast troves of highly sensitive brain data, sometimes for an unspecified number of years, and share or sell the information to third parties.
An electrode cap, part of a stress EEG assessment system, on display at a Hipposcreen Neurotech trade booth. Credit: Steve Marcus/Reuters
A Neuralink surgical robot used for implanting neural devices. Credit: Woke Studios/Neuralink, via Reuters

Supporters of the bill expressed their concern that neural data could be used to decode a person’s thoughts and feelings or to learn sensitive facts about an individual’s mental health, such as whether someone has epilepsy.

“We’ve never seen anything with this power before — to identify, codify people and bias against people based on their brain waves and other neural information,” said Sean Pauzauskie, a member of the board of directors of the Colorado Medical Society, who first brought the issue to Ms. Kipp’s attention. Mr. Pauzauskie was recently hired by the Neurorights Foundation as medical director.

The new law extends to biological and neural data the same protections granted under the Colorado Privacy Act to fingerprints, facial images and other sensitive, biometric data.

Among other protections, consumers have the right to access, delete and correct their data, as well as to opt out of the sale or use of the data for targeted advertising. Companies, in turn, face strict regulations regarding how they handle such data and must disclose the kinds of data they collect and their plans for it.

“Individuals ought to be able to control where that information — that personally identifiable and maybe even personally predictive information — goes,” Mr. Baisley said.

Experts say that the neurotechnology industry is poised to expand as major tech companies like Meta, Apple and Snapchat become involved.

“It’s moving quickly, but it’s about to grow exponentially,” said Nita Farahany, a professor of law and philosophy at Duke.

From 2019 to 2020, investments in neurotechnology companies rose about 60 percent globally, and in 2021 they amounted to about $30 billion, according to one market analysis. The industry drew attention in January, when Elon Musk announced on X that a brain-computer interface manufactured by Neuralink, one of his companies, had been implanted in a person for the first time. Mr. Musk has since said that the patient had made a full recovery and was now able to control a mouse solely with his thoughts and play online chess.

While some of these technologies sound eerily dystopian, others have led to breakthrough treatments. In 2022, a completely paralyzed man was able to communicate using a computer simply by imagining his eyes moving. And last year, scientists were able to translate the brain activity of a paralyzed woman and convey her speech and facial expressions through an avatar on a computer screen.

“The things that people can do with this technology are great,” Ms. Kipp said. “But we just think that there should be some guardrails in place for people who aren’t intending to have their thoughts read and their biological data used.”

That is already happening, according to a 100-page report published on Wednesday by the Neurorights Foundation. The report analyzed 30 consumer neurotechnology companies to see how their privacy policies and user agreements squared with international privacy standards. It found that only one company restricted access to a person’s neural data in a meaningful way and that almost two-thirds could, under certain circumstances, share data with third parties. Two companies implied that they already sold such data.

“The need to protect neural data is not a tomorrow problem — it’s a today problem,” said Mr. Genser, who was among the authors of the report.

The new Colorado bill won resounding bipartisan support, but it faced fierce external opposition, Mr. Baisley said, especially from private universities.

Representative Cathy Kipp, Democrat of Colorado, center, introduced the new bill. Credit: David Zalubowski/Associated Press

Testifying before a Senate committee, John Seward, research compliance officer at the University of Denver, a private research university, noted that public universities were exempt from the Colorado Privacy Act of 2021. The new law puts private institutions at a disadvantage, Mr. Seward testified, because they will be limited in their ability to train students who are using “the tools of the trade in neural diagnostics and research” purely for research and teaching purposes.

“The playing field is not equal,” Mr. Seward testified.

The Colorado bill is the first of its kind to be signed into law in the United States, but Minnesota and California are pushing for similar legislation. On Tuesday, California’s Senate Judiciary Committee unanimously passed a bill that defines neural data as “sensitive personal information.” Several countries, including Chile, Brazil, Spain, Mexico and Uruguay, have either already enshrined protections on brain-related data in their state-level or national constitutions or taken steps toward doing so.

“In the long run,” Mr. Genser said, “we would like to see global standards developed,” for instance by extending existing international human rights treaties to protect neural data.

In the United States, proponents of the new Colorado law hope it will establish a precedent for other states and even create momentum for federal legislation. But the law has limitations, experts noted, and might apply only to consumer neurotechnology companies that are gathering neural data specifically to determine a person’s identity, as the new law specifies. Most of these companies collect neural data for other reasons, such as for inferring what a person might be thinking or feeling, Ms. Farahany said.

“You’re not going to worry about this Colorado bill if you’re any of those companies right now, because none of them are using them for identification purposes,” she added.

But Mr. Genser said that the Colorado Privacy Act protects any data that qualifies as personal. Given that consumers must supply their names in order to purchase a product and agree to company privacy policies, neural data collected this way qualifies as personal data, he said.

“Given that previously neural data from consumers wasn’t protected at all under the Colorado Privacy Act,” Mr. Genser wrote in an email, “to now have it labeled sensitive personal information with equivalent protections as biometric data is a major step forward.”

In a parallel Colorado bill, the American Civil Liberties Union and other human-rights organizations are pressing for more stringent policies surrounding collection, retention, storage and use of all biometric data, whether for identification purposes or not. If the bill passes, its legal implications would apply to neural data.

Big tech companies played a role in shaping the new law, arguing that it was overly broad and risked harming their ability to collect data not strictly related to brain activity.

TechNet, a policy network representing companies such as Apple, Meta and OpenAI, successfully pushed to include language focusing the law on regulating brain data used to identify individuals. But the group failed to remove language governing data generated by “an individual’s body or bodily functions.”

“We felt like this could be very broad to a number of things that all of our members do,” said Ruthie Barko, executive director of TechNet for Colorado and the central United States.

Comments:

justgimmesometruth
New York, 2h ago

Nice that Colorado is worried about companies stealing your brain waves (which I doubt could be of much use to anyone). Would be good to live in a state like this.

New Yorkers, on the other hand, need to be worried about being shot or stabbed in the subway. Streets crowded with the homeless. Meth zombies.

How nice to live in Colorado where you can worry about people stealing your brain waves. New Yorkers worry about people stealing their cash at gunpoint.
DS
LA, 2h ago

It was only a matter of time before we could trade in vibes.
Rich Ermoian
Evanston, Illinois, 2h ago

While I respect your view, and several of your points are well taken, your ignorance of tech capitalism lies in your claim that “in general, so far it seems, this will be rolled out very very slowly.” If you’re planning on working as a computer engineer for the next 10 years, you might look over your shoulder and notice the AI gaining on your nearing obsolescence. Your industry is founded on overreaching, morally bankrupt and disguised data gathering that is insatiable but required by you and your field. Monetizing my every move, action and choice is obscene. Does the industry want my data? Then pay me for it.
Computers are, of course, necessary.
But you need to bone up on the humanities. And the realities of exploiting humans for profit.
Money and power corrupt. Have you had the chance to experience the dark web?
Throw that into your algorithms.
just saying~
CT, 2h ago

I wonder how much such info is worth in “dollars.” Could, and should (!?!) it be sold is obviously important too…
I wonder if someone wants to make an offer on the rights to mine my mind and what such an offer might be –
Are we going to have to wear tin foil helmets to prevent unwanted brain peeks?
It seems more and more like we are actually reaching the future~
M
Colorado, 2h ago

Colorado and Arizona, so close but so far apart in how they are governed.
Nino
Austin, 2h ago

This is like a bandaid trying to patch a river from flowing.
Sarah
San Francisco, 2h ago

Thank you Colorado Democrats for trying to protect citizens from the corporate theft of their personal data.
MikeinSonoma
California, 2h ago

Colorado, my home state, I am so proud of you, even all you bad drivers who moved there in the late ’70s and early ’80s; I even forgive you! …But really, get out of the left lane. 😉
akamai
New York, 2h ago

Letting private industry eventually know your thoughts. What a “brilliant” idea.
CALVIN AND HOBBES
MONTREAL, 2h ago

If brain wave data are sold then this opens the door to all our biometric data – most creepy
-dz
Wyiot Ancestral Territory, 2h ago

The logic of capitalism.
Johnny
New York, 2h ago

I read that they tried this on a group of MAGA folks and the computers flashed “NO DATA FOUND!”
The Raven
Flourtown, PA, 2h ago

Sadly, it’s not Gene Roddenberry’s vision of the future we’re moving toward; it’s Philip K. Dick’s.
Glen
Pleasantville, 2h ago

In 1984, “Nothing was your own except the few cubic centimetres inside your skull.”

Winston turned out to be mistaken, but at least in the novel, the regime had to work at it. They couldn’t just buy a list of his thought crimes from a data broker.
JoJav
USA, 2h ago

I was a beta tester and was paid a small sum. The recipient was a fellow in Tibet. He had a nervous breakdown after receiving my brain data and now wants to sue me. Read the fine print, Bub: “What I think … is what you get.”
Common sense labs
Germany, 2h ago

You can’t read thoughts from an EEG, for now. But with AI accelerating everything, I can see this being reality in 10 years. And it will be used by the bad guys first.
Jason McDonald CD
Fremont, 2h ago

Literally everything is for sale in our capitalist system.
VJ
Flyoverland, 2h ago

Talk about boundary issues!
ms.u.
usa, 2h ago

One way to jump into the lives of everyday people… a little toddler cap that transmits a beep to the mommies’, daddies’,and/or nannies’ cellphones and enters their chat or game session and lets them know that it’s potty-time.
alec
miami, 2h ago

don’t wear a head thingy and no need to worry about your neuro data … it’s pretty basic to avoid this one
Joe
Seattle, 2h ago

In 2050: “Introducing SoulMate(tm), a new app that lets you upload your soul and receive a personalized Soulnalysis(tm) that finds your match based on 25 unique SoulFactors(tm) such as what Karmic stage you’re in. Simply point your camera at your third eye to get started for free!”
Moon Pie
Colorado, 2h ago

Looking forward to my $20 settlement check for the first inevitable data breach. ¯\_(ツ)_/¯
Angus D
San Francisco, 2h ago

If big tech were a person, it would be a predator, dangling candy treats so that it can suck the blood of its hosts. Bravo to Colorado and Utah for reining in big tech’s insatiable vampire appetite.

Where is California’s legislation? Federal legislation? Nowhere. Enabling their pet predator, we can only assume.

Go Colorado!
LTR
New York State, 2h ago

How long before thoughts become crimes?
McX
Orbis Tertius, 2h ago

The opposition to this law by Open AI and Meta shows what the tech industry has in store for us.
Amelia
Denver, 2h ago

Proud of my home state government and Governor Jared Polis for supporting this bill to protect our mental privacy. It’s very proactive and future oriented. I hope other states, our federal government, other countries follow suit. Thank you Colorado!!!!
RainyDaySprite
Seattle, 2h ago

Hey man, paranoid as it sounds, how is this not already happening through other devices? And I’m not talking about predicting that which I might think about by way of existing technology. More than once over recent years, and just last week, a friend and I discussed how bizarre it was that online ads were too accurately tailored to thoughts that we’d never expressed out loud.

Around 2006-8, I told a computer-savvy friend that Google was creepy because my Gmail would show ads based on obscure things I’d mention in my Gmail correspondence. What triggered this mention to my friend was a very specific ad for something to do with the Beatles, after I mentioned them in a Gmail message.
My friend told me I was paranoid. We now know that I was not.
How do YouTube and Google know what I am saying when all my known devices’ microphones are turned off? And I mean, how do they know stuff that I never Google or write about? It’s not just predictive.
The House dog
Seattle, 2h ago

The Technology business is officially out of control. Our lives are not better, products are not better – everything is more complicated and costs too much money. Fix that with all your great ideas, why don’t you?
lkl
Yonkers, NY, 2h ago

This is an extremely sensitive frontier in privacy regulation. What could be more intrusive and unacceptable than the sale of data about your brain activity, your thoughts?
Mike
Seattle, 2h ago

Do they get that they are acting like villains?
Sam
I Am, 2h ago

If my wacky brain activity destroys these devices does my homeowners insurance cover me for any claims by the manufacturers?
Stadia
Santa Fe, NM, 2h ago

Digital business enterprises are and always have been about theft.
Wally Cox
Los Angeles, 2h ago

After reading several articles touting the need for more data to “train” AI it strikes me that NeuraLink is the perfect device to give AI models “all the inputs”. No synthetic AI sludge, just raw, sensory pure human data.
Average man
NYC, 2h ago

That anyone could fall for a meditation headband that is data-mining them shows how naive and overly optimistic Americans generally are today.
mattaret
New York, 2h ago

Who voted AGAINST this and WHY? This should be a literal no-brainer.
CV Danes
Upstate NY, 2h ago

This harvesting and selling of data needs to stop.
Ace
La, 2h ago

Modern civilization is so dreadful. There’s not a day that goes by I don’t wish I lived some time, any time, in the past. There is no life being lived today. A world turned into mindless screen slaves for profit and marketing. It is the least interesting, most disgusting time in civilizational history, and it will be our last act as a species. What an unfathomably ignoble way to go out. Even more unbelievable is that this could have all been stopped if we hadn’t been such zombies allowing the horrific technology pushed on us by Silicon Valley over the last twenty years. All it would have taken would have been a little integrity and the ability to say, “no thanks, get that nonsense out of my face.” But we just couldn’t do it. We couldn’t resist the new shiny things. So pathetic, so lame, such a waste.
OldSchoolTechie
Upstate NY, 2h ago

The military application of patterning neural networks with robotics is significant here. Map in the vagaries of cultural adaptation, and it should work very well.
In-the-NC-mountains
NC, 2h ago

This is a ‘scare’ article, missing lots of information.

To collect EEG data, you need to wear a headset or cap of surface electrodes. How many people do that on a daily basis? To collect fMRI data, you need to have people lie in a long tube and be very quiet. How many people do you know who do that on a regular basis?

There is lots of data you should worry about being collected and sold. But brain activity is the least of your worries.
Louis
FL, 6h ago

Not nearly enough. Our rights in the US are “inalienable”; the writers of our founding documents knew that some people would sell or offer their rights away (one cannot legally be a “slave” in the US), so in America this has been defined. Not all commercial transactions are allowed by law merely because somebody is “a willing participant.” Furthermore, the only reason Elon uses mice is obviously that direct human control would rightly outrage society and ignite a PR firestorm… and legislation. But we shouldn’t harbor any delusions about what species is the ultimate target of control.
More than that
US, 2h ago

@Louis

It’s not just mice. Musk’s Neuralink has already used monkeys, a pig, and at least one human participant.

Citing his desire to be able to call his Tesla telepathically, Musk said his company was “targeting disable people” (sic – people with disabilities). It’s much worse and moving much faster than the legislature is.

If it’s any indication how people are reacting to control over/regulation of TikTok, we’re all in quite a lot of trouble. Shoshana Zuboff anticipated this (“You are now remotely controlled” and Surveillance Capitalism), and this takes it to a much more intense level.
Blanche White
South Carolina, 9h ago

This kind of stuff is so scary and intrusive.

We’re all Henrietta Lacks’.

There used to be, as technology developed, great concerns about privacy. As it evolved and kids got more and more comfortable with gadgets, games and convenience, that receded into the background.

But this country needs to have a serious discussion on what our computer masters can take from us.

There should be no “opt in” default option and no company should be able to treat their customers like sitting ducks and biological gold mines!
AJS
Whereabouts Unknown, 10h ago

As it seems unlikely that we will stop the inexorable move further into Surveillance Capitalism, I would like to see laws put in place that stop these private companies from sharing their collected data with the government.

If the government gets ahold of all this information, the move to autocracy and a 1984 world will take nothing more than the flip of a switch.
ABC
bw, 10h ago

At what point did we start to believe this kind of data would be able to “read” a person’s mind? There is no such thing as a mind reader. All the data of this planet will not help to provide one. It only helps to create statistics and wild speculations based on those statistics. So, in order not to help provide these statistics, we can simply try to meditate without a headband that measures brain waves. Or without a watch that measures heartbeats. No one forces anyone to use these gadgets.
Shane Lynch
New Zealand, 8h ago

@ABC

Some are so naive.

We will be forced to by stealth.

As it becomes more prevalent in our lives, more and more apps etc will require us to participate in mind stuff.

How many of us now can’t do a job without a phone constantly attached to us? Or can’t do anything without a computer because we have no choice?

These crept up on us without our knowing.

This will too – the skeptical among us (I am one) will not be adopting this unless I have no other choice, but the younger generations will think it’s great.

They already don’t think about how much of themselves they put on the cloud, into app info and other things that require their privacy to be invaded without them knowing.

I would go so far as to say that this tech can be reverse engineered to control minds as much as it can be to “assist” us.

Combine this tech with AI, and the VR glasses, and where will it stop?

Will we even know where our lives start and stop anymore?
More than that
US, 2h ago

@Shane Lynch
Not “will be forced” – we’re already very much there.
Kemal A. Delic
Paris/Grenoble, April 17

Intellectual property laws originated in the 15th century in Tuscany, for a world that was physical. As technology has changed the landscape, creating parallel, virtual meta-worlds, the laws should be fundamentally changed. Content published on the Internet should be owned by authors and not exploited by big techs. Our trails/crumbs on the Internet are harvested and sold. Also, the sensitive area of privacy should clarify who owns health, financial and social data/information. This should be fundamental for the entire Digital Economy, I think.
Shane Lynch
New Zealand, April 17

Are we able to opt out of this? Or will we be forced into it by stealth?

The whole opt out thing is indeed a joke – opt out for simple emails doesn’t work, they just come from another address. Or the option is ignored.

I don’t see this being any different, especially as the stakes and monetary value are so high.

The danger will be when we don’t even know we are being scanned. Now they need headsets and wires; how long before Apple and Android are scanning retinas wirelessly without us even knowing?

Apple watches already scan heart rates etc into a phone app.

Scanning retinas etc is just a small step from that.

Next logical step is to put a complete system of these scans and into a robot or synthetic body.

Don’t say it won’t happen – we all know it will. Combine this with AI and look at the potential.

Be afraid, be very afraid.
Shane Lynch
New Zealand, April 17

“Another reads and interprets brain signals while the user scrolls through dating apps, presumably to provide better matches.”

This is how they trap the unwary and make us Guinea Pigs.

Make it useful for innocuous things (online dating, how much more innocuous can you get?) and one day we wake up and find we need to be wired up to breathe because we forgot how to.

(I realise that assisted breathing isn’t a joke – I have been on assisted breathing more times than I care to remember).
J F
Texas, April 17

Excellent step forward but technology changes fast. We must be proactive instead of reactive or we will always be playing in slow-motion against an Olympic level athlete.

We should have robust legal protections for our biometric data more broadly – if exceptions are needed, they can be carved out narrowly and specifically later.
Kevin
Los Angeles, April 17

Opt-Out is, and has always been, a joke. It is certainly inappropriate here. People should have to explicitly Opt-IN to have this data collected. And violation should be punishable by incarceration.
Ellen
Minnesota USA, April 17

I’m proud Minnesota is attempting to get ahead of the neural data sharing and storage game. To wait will be too late.

I have faith in the state’s attempts to do so because we have a great liberal governor and two-chamber legislature that have already protected women’s rights to privacy and reproductive freedoms, as well.
Jinbo
NYC, April 17

At some point can we also revisit the whole mantra “information wants to be free”?

Information doesn’t have “wants”.

Corporations on the other hand, do want information to be free- so they can maximize their profits from the sale of yours.

Information is personal property.

There’s no point in blathering on further until we address that foundational principle.

And corporations ought to have no say in the matter.
AC
Newman, GA, April 17

Re the comments about current technology of this sort not being ready for prime time:

It’s not possible to assess this legislation fully on the basis of this article, but wouldn’t it be better to get ahead of the Big Tech curve, for once?

And I don’t mean only getting ahead of the technological advancement curve, but also getting ahead of two other curves: (1) “There’s enough money behind this now to fund lobbying that will kill anything to stop it before it’s too late” curve and (2) “There are now so many users mesmerized by marketing that they can’t live without this” curve.

In the tech company world, history doesn’t merely rhyme; it actually does repeat itself.
Happy/Sad NJ
New Jersey, April 17

I have little sympathy for universities and corporations that lobby for less restrictive rules in this area. We’ve already seen what happens with personal data as well as the inferences that companies make even from anonymous data.

Instead, I’d prefer completely restricted laws, basically forbidding collecting any information without specific consent, forbidding use and storage of that data in any way other than explicitly agreed to, and complete restrictions on sale of this data without consent AND compensation. Lastly, there should be massive, uncontested penalties for breaking any of these rules, enough to deter these mega-rich companies from trying it. A complete and enforceable opt-in required system.
Hen
Sutton, April 17

When a social media company employee has access to your private data, you are not always anonymous to that employee or the company. Those companies sell your information to advertisers and you are not always anonymous.

Without enforcing privacy rights, a person is open to a lot of invasive experiences. For example, those times you start getting ads about the random product you mentioned to your friend ten minutes ago.

Without neuro-rights… you might get an ad about the lingerie you were thinking of buying but hadn’t looked up (let’s say you had never bought lingerie or spoken about it before). Then later you see an ad about car accidents, thirty minutes before the alert about a traffic accident that just happened in your neighborhood. The ad didn’t help you, but the AI/algorithm knew about the accident before you did. The speed of ad relevancy in all of these scenarios is connected to AI and increased computing power. Combine that with your thoughts and things get next-level futuristic in less than two years.

That’s how insidious the invasion of privacy will continue to be if we don’t own the data, the hardware and the knowledge of how these devices/services work. Let’s say you don’t care about the above lingerie scenario, but you do care if your thoughts about a pot brownie you want to buy suddenly become illegal. Try your best not to think about that pot brownie!

This is a pivotal moment and so many of us have very little awareness of the impact of neuro-rights.
14 Recommend
Louis (FL), 2h ago

@Hen How about a government that pushes hard for fascism because it knows that a large percentage of the citizenry have been traumatized by images of violence and war? AI knows what makes people easier to control and makes them discard their reasoning: fear.
Sam JFM (New York), April 17

I use OpenBCI (open-source brain-computer interface) software and an EEG that are not consumer grade. I am developing an app that will guide users to a particular meditation brainwave state. While I do have to ask permissions in the app, even if I could collect their brainwave data, the data is currently not sophisticated and doesn’t warrant collection. Simply put, the gadgets out there, including the Neurowave game I hacked to make a simple EEG for my cellphone, and my sophisticated EEG alike, translate macro neurological phenomena and lump them into five brainwave states categorized by frequency in hertz: delta, theta, alpha, beta, gamma. It’s like having an alphabet to describe the brain with only five letters. EEGs are fairly simplistic and the data is too broad-stroke to provide anyone with the keys to your brain. But glad my brainwaves will be asked permission first just the same!
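[Editor’s note: the five-band bucketing Sam JFM describes can be illustrated in a few lines. This is a minimal hypothetical sketch, not code from the commenter’s app or from OpenBCI; the band boundaries below are the commonly cited approximate ranges, and exact cutoffs vary between devices and studies.]

```python
# Approximate conventional EEG frequency bands (Hz). These boundaries
# are illustrative; different devices and papers draw the lines slightly
# differently.
BANDS = [
    ("delta", 0.5, 4.0),    # deep sleep
    ("theta", 4.0, 8.0),    # drowsiness, meditation
    ("alpha", 8.0, 13.0),   # relaxed wakefulness
    ("beta", 13.0, 30.0),   # active thinking
    ("gamma", 30.0, 100.0), # high-frequency activity
]

def classify_band(freq_hz: float) -> str:
    """Map a dominant EEG frequency (Hz) to its conventional band name."""
    for name, lo, hi in BANDS:
        if lo <= freq_hz < hi:
            return name
    return "out of range"

# A 10 Hz dominant rhythm falls in the alpha band; 2 Hz in delta.
print(classify_band(10.0))  # alpha
print(classify_band(2.0))   # delta
```

The point of the sketch is the commenter’s “five-letter alphabet”: whatever the raw signal contains, consumer pipelines typically reduce it to one of these five coarse labels.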
6 Recommend
Idonthaveaname (Washington State), April 17

@Sam JFM Is your device similar to that game-controller device used in the submersible that imploded near the Titanic? Or simply like Johnny Cash’s character in the song, with his ’41-’51-’61-’71 Cadillac whose parts he picked up at the factory in his lunch bucket? By the way, I think your device is a great innovation. Now if I could only figure out how to get my iMac to record songs I like onto CDs that will play on an old stereo, because my child who has autism will only listen to songs played on that old stereo.
1 Recommend
RR (Wisconsin), 2h ago

@Sam JFM re: “It’s like having an alphabet to describe the brain with only five letters. EEGs are fairly simplistic and the data is too broad stroked to provide anyone with the keys to your brain.”

Back in the early days of molecular biology, many researchers were convinced that DNA couldn’t be the genetic material because it has only four “letters” — how could anything as complex as a living organism possibly be defined, created, and maintained using a four-letter alphabet? I’m just sayin’.
Eve (Denver), April 17

Will this be legislated by individual states? If so, the U.S. will continue to be a mixed bag of progressive/proactive states vs regressive/reactive states, impacting the quality of life and liberty for the citizens within them.
12 Recommend
Jon (Chicago), 2h ago

@Eve Exactly. Red states will mandate participation and free sharing. It’s Capitalism, don’t ya know.
TW (Oakland), April 17

Reading the article, it seems that this doesn’t ban the collection of the data or even the sale of the data collected; it just allows opting out at any time.

Past opt-out laws generally haven’t gotten many people to actually opt out.
2 Recommend
McX (Orbis Tertius), April 17

@TW Actually, how do you opt out? This is a genuine question. I want to opt out of this before it even gets started.
4 Recommend
Mike Bozeman (San Antonio, Texas), 2h ago

@McX Don’t use a wearable
Ramesh G (SF Bay Area), April 17

Less than meets the eye here. Aside from reading mild electrical waves correlated with simple motor actions, it is simply not possible, within the known laws of physics, to read a person’s thoughts or memories.
The reason is that brain fluid is something like salt water, which attenuates electrical signals to less than microvolts within a few millimeters.
Nearly all consciousness and mental activity occurs inches deep within the cerebral cortex, completely inaccessible except perhaps on a consenting patient lying still deep inside a 3-ton MRI machine.
6 Recommend
b fagan (chicago), April 17

@Ramesh G – re-read article. It includes: “And last year, scientists were able to translate the brain activity of a paralyzed woman and convey her speech and facial expressions through an avatar on a computer screen.”

Look at papers being published in IEEE journals and elsewhere and there’s tons of research in determining the emotional state, reaction type and all sorts of other mental/emotional responses from people – lots of it from other sensors monitoring pulse, perspiration – or even IR cameras remotely viewing facial changes in blood flow and eye dilation or gaze direction.

Much of that being done even without skull sensors, and much of it built to combine with other data to draw more robust conclusions.

So yes, there is much that meets the eye here. Unless we put restrictions into force with more teeth than the timid US legal system has shown so far, we’re allowing industry to collect, collate, pick apart and combine ever-increasing recorded data about people and their actions, locations and, now, mental states.

I’m in the tech industry and do NOT have any of these apps on my phone or use any other of the medical/health/whatever monitoring tools that put data into the hands of others. Partly because securing data costs money companies often don’t spend, and partly because data = revenue either while a company is in business, or as the valuable asset of a failed startup.

We need more regulations on this topic.
11 Recommend
MScott (Northeast), April 17

@Ramesh G
Just because it can’t be done today without large-scale MRIs doesn’t mean the technology won’t advance tomorrow. Nor does it mean that intrepid marketers won’t be busy selling whatever information they can glean whenever they get a chance. If businesses aren’t required to respect a fundamental level of human privacy, they won’t do it.
9 Recommend
McX (Orbis Tertius), April 17

@b fagan I agree with you about the need for regulations, like yesterday, but these procedures helping paralyzed people require training. It remains true that there is no device that allows one to simply wave around some wand that picks up people’s thoughts. Brain waves: maybe one day. But brain waves are not thoughts.
2 Recommend
Edward (Utah), April 17

In all of this personal information gathering in all its forms, why has no one asked for, or filed lawsuits to obtain, payment to the original owners of this product? WHY am I not being paid by every one of these collectors for the information which ultimately belongs to ME?
11 Recommend
TW (Oakland), April 17

@Edward Read the terms of service agreements that you signed when you signed up to use these products.
2 Recommend
PRober (Redwood Valley CA), April 17

Attempts to limit the gathering of personal information are almost always met with charges of holding back the advance of the human race. The question is who is defining that advancement. Usually it’s the techno-euphorics, for whom more, faster, more thorough, more comprehending is the only advancement. They won’t stop until all life is in their purview and under their control. Not because they want that world, but because they want the vindication of their own talents and actions. Life completely structured and informed by technology is that vindication, their “immortality totem.”
Historically such monomaniacal visions are counteracted by a host of other visions. But with higher ed now so monetized and the humanities rendered obsolete, alternative visions have no constituency. The tyranny of technology is a self-fulfilling prophecy.
22 Recommend
Non scio (Phoenix), April 17

MONEY is what is coveted by technology companies, and technology is only their tool to shovel it in.
13 Recommend
Pamalam (USA), April 17

This is wise. Actually, it shouldn’t require laws; this should simply be banned. Abuse by third parties is increasing, and developers use this to shift blame and avoid responsibility. Result: no one is held liable. Later, the data can be sold, moved, or the terms of use changed. All this should be stopped too. Humans should have the right to opt out of all of this. Sure, we can stop using technology, but hasn’t that grown impossible now? Tech and business have created a necessity of use, and that means we have no real option to opt out.

We should feel safe in our homes; that should translate to the tech world too, but it is glaringly unsafe.
13 Recommend
SNN (KCKS), 2h ago

@Pamalam That’s fine to say, but you’re banning ebilluence and reading the riot act to the microbiomes that live in your guts and on your skin etc. Maybe just take a blue team with you when taking your morning pastorale (at an 8.)
Caesius (LINY), April 17

Why? Why would anyone buy these devices and use them? These are nothing but digital supplements. Unregulated. Mostly untested, woefully weak in any hard proof of their efficacy, but heavily overhyped…and sold at a premium (overpriced, built in the proverbial sweatshops) to be the mouse wheel for consumers to run on…for the huge profit of these tech companies and their Bro’ CEOs.

Why?
8 Recommend
SNN (KCKS), 2h ago

@Caesius You have a mouse wheel you can run on? I don’t even have a trackball that tough, but this gig doesn’t get longer and you may as well get input devices and immersive consoles that serve you well. Single atomic layer devices are implementing neural networks well and the types are expected to get their native typechecking interfaces together soon. See you on the CO border creating puts on this data soon!
irradiated me (saint louis park, mn), April 17

Am I the only reader who thinks it improbable that millions of dollars would be spent developing this brain/data tech if it were only intended for a medicinal purpose? I could be wrong; it is possible that capitalism has a soft spot for the ADA that I had not previously noticed.
30 Recommend
Vocal Citizen (New England), April 17

@irradiated me Billions are being spent by big tech, and by technology-based manufacturing, developing smart medical equipment and applications to be used with patients in non-medical locations, including wearables and in-home connected equipment. There must be stronger safeguards than those currently in place to ensure patients own their data and control whom that data is shared with. The abuses of data use are already mounting and will only exponentially explode from here. We must restructure and safeguard personal information, including biometric data. While I applaud the Colorado bill, it does not go far enough.
8 Recommend
Will (Seattle), April 17

Hardly anyone will need this or benefit from it. Humanity got along in various constructive ways for thousands of years without the internet and even electricity. Just another example of commercialism’s ongoing effort to artlessly disrupt, which if left unchecked can only mean…

As recently as the 1980s computers were widely understood to be little more than annoyances. Their nature hasn’t changed; nor will it ever.
9 Recommend
Concerned Reader (MD), April 17

@Will

“As recently as the 1980s computers were widely understood to be little more than annoyances.”

You are writing this on a smartphone or computer with many times more computing power than a supercomputer of the 1980s, likely connected wirelessly at speeds that were unimaginable then, over an Internet that didn’t exist at the time.

In other words, things have changed drastically, and our message exchange is possible only because of that.
15 Recommend
Studentofhistory (Boulder, Colorado), April 17

@Will
So, should fully paralyzed individuals throw away their chance at using computers like the rest of us, getting jobs like the rest of us, etc.? Should we deny people with amputated limbs the chance to have artificial limbs in the future that are highly similar to their original limb, with touch sensation pumped right to the correct area of their brain?

I’ve worked on this technology in the past (I’m a computer engineer), and it’s a proper lifesaver for some potential users. For others that work heavily with the technology, it will be a massive productivity force-multiplier and break limits of our interaction with computers. In general it seems so far like this is going to be a very, very careful development. Even then, that doesn’t mean (and hasn’t in the past) that we should hold back the future of Humanity based on fear.

You have to realize how far humanity has come. You’re making your comment with a phone that has more than humanity’s SUM TOTAL of computing power in FLOP/s in the 1960s and early ’70s.
9 Recommend
Amica Silentia (California), April 17

@Studentofhistory
There’s always, always, a poster child figure for new technologies. Indeed, if I was paralyzed and someone clapped a wired-up headset on my skull and this allowed me to live a more normal life, maybe I would be thrilled, but on the other hand I might be horrified to think of the potential abuses such as mind control under authoritarian regimes. There really are people in society who can think about the unintended consequences of a brave new world.
7 Recommend
Neuro (US), April 17

The free-for-all should stop. Humans are being harvested for profit, as fast as can be managed, before the sluggish and paid-off political wheels turn to prevent it. The system is certainly rigged against the populace and otherwise designed to benefit a few. Vote for better, smarter political representatives committed to protecting Americans.
35 Recommend
Rick Simmons (Connecticut, US), April 17

If a person agrees to neural data usage, I agree with its use for positive, holistic reasons only. In no way do I agree with takings (ideas etc.). Nor do I agree with its use by judges and courts, who could convict based on a thought of doing something or not. Reading brains can be dangerous because sometimes the brain comes to the wrong conclusions. I can imagine walking around with others saying: “you think I am this or you think I am that.” Sounds like a crazy world to me. But, as long as a person in need formally agrees, I think it would help in controlling a wheelchair or a computer, etc. It may help with learning.
4 Recommend
Ginger (Colorado), April 17

Do we really need the State to protect us from idea theft? How ’bout get yourself off of the Internet and take a walk, see a friend, go for a walk with a friend.
9 Recommend
Vocal Citizen (New England), April 17

@Ginger

The problem is that increasingly these technologies are being moved from entertainment into healthcare, work environments, and other applications where individuals must utilize the systems to get care or attain/maintain employment. It is no longer enough to just stop using social media.
57 Recommend
Studentofhistory (Boulder, Colorado), April 17

@Ginger
It’s not idea theft, it’s the fact that in the future, we will use neural interfaces to… interface with technology. Whether external or internal to the head, neural interfaces will initially be used by disabled individuals and eventually be used by the rest of the population to interface with technology. I’m both a Coloradan and an Engineer (I did a research project into external neural interfaces years ago), and this is honestly a very good piece of legislation. This is GDPR/CO data rights laws extended to the next most-profitable advertising opportunity before they have the chance to lobby it away, and you should be all for it.

Imagine this:
There will soon be a market for people’s thought data. Instead of just seeing what you Google, what you look at on Amazon, etc, advertisers will likely be able to buy data about what you think and what is sensed by the neural interface.
12 Recommend
Jane Doe (Berlin), 2h ago

@Ginger
Pretending that walking through the park can mitigate the pitfalls of the digital age is naive at best. In the worst case, your suggestion is dangerous for the foundations of a democratic society.
Stewart (Behind the Orange County curtain), April 17

“I think, therefore I am” has been rebooted:

“I think, therefore I am a commodity”.
93 Recommend
Idonthaveaname (Washington State), April 17

@Stewart We already are, aren’t we? Are we citizens anymore? Not when the politicians only think of us as consumers. We’re living within something that both uses us (information) and sells us (income). With me, perhaps, they’d get GIGO!
1 Recommend
Name (Location), April 17

If we can’t get rational policy and safeguards in place as this very invasive technology becomes ubiquitous, I guess we’ll all be donning tinfoil hats!
17 Recommend
David Bible (Houston), April 17

Wow. I did not know about these products or the tracking, but my years of sci-fi reading take this concept to some awful places.

Definitely need this bill passed, in Colorado and nationally.

Maybe the users of these products should go ahead and throw them in a river.
32 Recommend
Studentofhistory (Boulder, Colorado), April 17

@David Bible
So, should fully paralyzed individuals throw away their chance at using computers like the rest of us, getting jobs like the rest of us, etc.? Should we deny people with amputated limbs the chance to have artificial limbs in the future that are highly similar to their original limb, with touch sensation pumped right to the correct area of their brain?

I’ve worked on this technology in the past (I’m a computer engineer), and it’s a proper lifesaver for some potential users. For others that work heavily with the technology, it will be a massive productivity force-multiplier and break limits of our interaction with computers. I’ve read hard sci-fi my entire life and while this has some opportunity to be a bad thing, in general it seems so far like this is going to be a very, very careful development. Even then, that doesn’t mean (and hasn’t in the past) that we should hold back the future of Humanity based on fear.
2 Recommend
McX (Orbis Tertius), April 17

@Studentofhistory I am for the use of this technology to help people in a medical context, but I don’t buy into the idea that the future of “Humanity” equates with technological innovation. And I’m really suspicious of this standard canard of increased productivity. Neither of these factors necessarily constitute “progress,” and in fact I think that a good deal of fear is warranted.
3 Recommend
Ugly and Fat Git (Boulder, CO), April 17

My beautiful state Colorado is working towards the future while Arizona is still struggling with basic rights for women.
160 Recommend
DavidJ (NJ), April 17

I imagine the Supreme Court will someday rule that corporations are entitled to your thoughts. Another Citizens United suit.
56 Recommend
USMCR (Denver), April 17

This legislation does NOT pertain to Rep Boebert who we would happily export.
2 Recommend
mtnlion (Colorado), 2h ago

@USMCR The *absence* of brain activity is not protected by this law.
FM (Michigan), April 17

“Consumers have grown accustomed to the prospect that their personal data, such as email addresses, social contacts, browsing history and genetic ancestry, are being collected and often resold by the apps and the digital services they use.”

Not this consumer. I will not be growing accustomed to this. Ever.
130 Recommend
Sarah (San Francisco), April 17

@FM I would never use any social media or Google product for this privacy reason alone. Also, be sure to never have your GPS turned on, always delete all browser cookies every user session, and never provide or allow access to any of these data to any company, site, app, or product.
9 Recommend
FM (Michigan), April 17

@Sarah agreed. Also would suggest LineageOS for MicroG as an alternate OS for android phones which entirely removes all the Google components.
5 Recommend
Michael Doran (Springfield MA), April 17

I take issue with the statement that the privacy of our clinical medical data is protected by federal law.

1: There is no such thing as “de-identification” of personal medical data. What is being bought and sold in an unregulated marketplace in this country is fully identifiable. (It just takes a little effort and the right software.) And what is being shared (with, or in most cases without, a patient’s full knowledge and approval) with high-tech companies for research and development has the double downside of containing lots of inaccurate information that can never be fully expunged from a person’s medical data once it has left the hospital/clinic/doctor’s office.

2: Everyone’s prescription drug history is collected by the 50-state PDMP system, sent “in real time” to the Department of Justice, and made accessible nationwide to law enforcement and public health authorities without a warrant in many jurisdictions. There is no transparency: patients are not notified when their data is accessed by people other than their medical providers, and the media in this country refuses to investigate this mislabeled, intrusive Bush-era domestic surveillance program.

The media and our state/federal representatives have not done nearly enough to address what I view as a widespread, intolerable violation of civil and human rights in this country.
77 Recommend
august 1 2023 (NY), April 17

@Michael Doran It may be “intolerable,” but this is already happening. There is no containment for AI.
7 Recommend
Amica Silentia (California), April 17

@Michael Doran
A dismal result of the capture of whole institutions by the computer industry. Makes me nostalgic about the bulging hand-written medical charts that were only abandoned a few decades ago. To steal data you’d have to break into a hospital and deal with a lot of paper files in sometimes undecipherable doctors’ writings. Now the internet makes it so much easier – for insurance companies and collection agencies.
8 Recommend
Wolfgang Roberts (Oregon), 2h ago

@Michael Doran interesting comments. Will you please cite your sources for us?
Gomez (California), April 17

This is great for the people of Colorado and I hope California follows suit. Each person should have complete control of their data and especially their biological data.
27 Recommend
David J (San Francisco), April 17

The only data anyone should be able to collect and reuse without a warrant are: browsing history, purchasing history. Access to data about who’s communicating with whom, and when, should be subject to judicial authorization (i.e., a search warrant). Data about insurance and medical records, dental records, mental health records, heart rates, brain activity, sleep patterns, race, living and business addresses, tax records, marriage status, citizenship, any history of breaking the law (from parking tickets to DUIs to felonies), financial records (except for investments), and social security should be subject to the most stringent protections, equally. Financial investments should probably be pretty transparent.
40 Recommend
Thiery la Jambe (Rive Gauche), April 17

@David J

Then you’ll be very disappointed in the new surveillance act working its way through Congress, as should any good citizen be.

Let’s not just talk about it. Please do a few searches and inform yourselves. I’m not preaching, just sharing. Resistance is essential, our freedom is on the line.
3 Recommend
Elizabeth (Atlanta), 2h ago

@David J why should browsing history automatically be inhaled and sold? Or surveilled without warrant?
1 Recommend
Charles (Massachusetts), April 17

Hopefully this begins a trend. Good grief though. Privacy has been RAVAGED by these companies (and also the government) in America. Let this be the beginning to the end of such invasive information-reaping.
183 Recommend
Thor (Germany), 2h ago

@Charles EXACTLY what I was thinking. Is there nothing left that is not trying to be mined to sell?