A Kenyan workers' organisation alleges Meta's decision was caused by the staff speaking out.
Meta says it's because Sama did not meet its standards, a criticism Sama rejects ...
It's total fantasy. I've worked in big tech. Casually uploading and providing company/contractor access to non-redacted intimate photos or pictures of the insides of people's homes vaguely "for the purpose of improving the customer experience" would not pass even a surface-level privacy or data-protection review anywhere I've ever worked. Do Meta even read what they are saying?
Well, you gotta give out blackmail material to the scam centers somehow. Otherwise they don't actually have leverage! Oh right... we don't want that happening.
I once read the manual of one of those small floor-cleaning robots (Ecovacs Deebot U2 Pro), and it basically said that by using it you were giving them the right to take pictures and send them to a remote server (to analyze issues or something like that).
Are you conflating telemetry with literally live-streaming your life to Meta? Because that's what makes the statement weird.
edit 2: OK, I see what you mean. But I'm wondering if it should be possible to consent to this via T&C. Basically the same issue as with many online services, turned up to 11, sure. And it involves OTHER people, who have not consented.
Stuff like this used to be outrage fuel even when it was more of a social experiment, e.g. the documentary "We live in public" or the "Big brother" TV show. By now, I'm sure there have been millions of influencers doing similar things, but it's very much not considered normal?
Streaming to an unknown number of employees might be considered different from streaming to the public, sure.
But the core question here is whether there's informed consent, and, IMO also, if it should be possible to consent to this when the other party is a company like Meta and the pretext is not deliberately seeking attention (like influencers and streamers do).
So people wearing these glasses have already agreed that Meta can monitor them. They also probably trust Meta when it says "when the glasses are off, nothing is recording", for better or worse. With that perspective in mind, it's not far-fetched to assume these same people will willingly be naked in front of recording devices they believe to be off.
Of course, anyone who has opened a newspaper in the last 10 years or so would know better, but I can definitely see some people not giving a fuck about it.
One of the bigger commercial niches for smart glasses is filming POV porn, so it is hardly surprising that sort of content ended up in the moderation queue. The project should have planned to account for that use case.
And I do appreciate how awkward it is for Meta to admit that use case exists. Even in the Oculus Go days there were a bunch of polite euphemisms internally to avoid mentioning "our device has to ship with a browser so people can watch porn on it"
I think Meta, like all companies, doesn’t want its subcontractors creating bad press for them.
So it doesn’t surprise me that Meta didn’t renew/cancelled a contract that is a net negative for them. Arguing over the reason seems fruitless, as no reason is needed per the terms of the contract (I assume, since breach of contract wasn’t brought up by the sub).
I believe the tricky privacy and security issues around smart glasses (and other "personal" tech) can be navigated successfully enough by a thoughtful, diligent, responsive company.
Which is why I'd never touch a personal tech device from Meta.
Their entire DNA is written to exploit their users for profit. In my judgement, they literally cannot and will never consider those issues as anything other than something to obscure to keep people unaware of the depth of the exploitation.
Yeah, I think it's more of a British English thing. It can also mean things like "in a fight". Like: "those two guys had a big row outside the pub the other night"
Facebook may have to rename itself NaughtyBook or SpyBook or Pr0nBook. They really want people to help them spy on other people here - including their sex life. Expect new sexy videos in 3 ... 2 ...
Why do they even need workers to classify naked content? They could filter some content prior to passing it to workers. They already have models to moderate explicit content.
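The pre-filtering idea above could look something like this in practice: run an automated explicit-content classifier first, act automatically on high-confidence scores, and only send the uncertain middle band to human reviewers. This is a toy sketch; the function names, scores, and thresholds are all made up for illustration, not anything Meta is known to use.

```python
# Hypothetical triage sketch: only uncertain items reach human reviewers.
def route_for_review(items, classify, flag_threshold=0.95, clear_threshold=0.05):
    """Split items into auto-flagged, auto-cleared, and human-review queues."""
    auto_flagged, auto_cleared, human_queue = [], [], []
    for item in items:
        score = classify(item)  # model's probability the content is explicit
        if score >= flag_threshold:
            auto_flagged.append(item)   # confident: block without human eyes
        elif score <= clear_threshold:
            auto_cleared.append(item)   # confident: pass without human eyes
        else:
            human_queue.append(item)    # uncertain: needs a human decision
    return auto_flagged, auto_cleared, human_queue

# Toy usage with a fake classifier (a score lookup table).
scores = {"clip_a": 0.99, "clip_b": 0.01, "clip_c": 0.50}
flagged, cleared, humans = route_for_review(list(scores), scores.get)
# flagged == ["clip_a"], cleared == ["clip_b"], humans == ["clip_c"]
```

The thresholds set the trade-off: widening the uncertain band catches more model mistakes but exposes human reviewers to more (and more sensitive) material.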
If you want to read more about how the unsavory aspects of AI training are off-loaded onto poor workers in third-world countries, I'd recommend Karen Hao's "Empire of AI". These workers are paid pennies an hour for unstable jobs that expose them to some horrific material.
The owner of the private space generally has authority to deny this already, there's no need for an additional law.
In the US at least, any private homeowner/renter can deny entry to their property, barring legal warrants and exceptional circumstances. A business can have a policy, and is generally legally protected as long as the policy is 1) equally applied, and 2) does not violate ADA... A court would have to weigh in if glasses are allowed or not for ADA... but I suspect there's already a case where a movie theater banned such glasses and they would probably(?) win, since such individuals could be expected to have non-recording glasses.
Why? What's the difference between that and one of the many, many concealed camera options that you don't even notice? Just that it's noticeable? I don't think that's a good enough reason for yet-more-regulation. You're already being recorded everywhere you go in public by the authorities, and often by people standing right next to you unnoticed, so just act accordingly.
Because they will be popular and lots of people will buy them and use them all the time, leading to much more generalized surveillance than the concealed options that only a tiny tiny fraction of people would buy or use (and that we should also regulate)
> and was a common practice among other companies.
Meta isn’t lying; you should assume other companies are doing it too. Tesla did it with their cameras, and assume any company with access to your camera does likewise - I would even assume CCTV cameras too. That’s why, for anything sensitive, you should try to use open-source stacks. You might lose some of the features, but it’s a needed compromise.
So I've never had a smart speaker in my house (Alexa, Apple, Google). I've just never been comfortable with the idea of having an always-on, cloud-connected microphone in my house. Not because I thought these companies would deliberately start listening and recording in my house, but because they will likely be careless with that data and it'll open the door for law enforcement to request it. Consider the Google Wi-Fi scraping case from Street View.
Or they might start scanning for "problematic" behavior, a bit like the Apple CSAM fingerprinting initiative.
So not one part of me would ever buy Meta glasses (or the Snap glasses before that). You simply don't have sufficient control over the recordings and big tech companies can't be trusted, as we've witnessed from outsourced workers sharing explicit images. And I bet that's just the tip of the iceberg.
I honestly don't understand why anyone would get these and trust Meta to manage the risks.
That is to say nothing of the new technological use cases that could develop from already existing technology. They just haven’t been thought of or developed yet.
Things like audio scanning your living space using those Alexa smart speakers with ultrasonics to get an image of not only everything in your space, but where you are in that space as well.
That technological use case only came out within the last five or so years, maybe closer to eight. Either way, I could see it coming before it became a thing, just because ultrasound imaging of an unborn child is a thing and ultrasound imaging of the sea floor is a thing - so why wouldn’t ultrasound imaging of your living space be a thing for a company that wants to know what you buy?
I never, ever had an Alexa. I only ever had a Google Home, because I got it for free with GPM, but I almost never used it because I hated the idea of it always listening.
I already regret Wi-Fi, because they’ve now figured out how to look through walls with it.
Meta said the contracting "did not meet (Meta's) standards". I am sure that is true. Meta's "standard" is not to reveal the illegal, immoral, unethical things Meta does. No matter what the harm.
Maybe a company with those standards should not get our business. Oops, no wait, maybe they mean the Friedman Doctrine standards? In that case they are entitled to do anything and everything to make a profit. No matter what the harm.
[edit: add last two sentences]
I used to work for Meta. I quit largely because of intense frustrations with the company. Meta has made a lot of mistakes, overlooked a lot of harms, and made a lot of short-sighted, selfish choices. Many things about the world are worse than they could be because of choices Meta has made.
So when I say that they really do have a zero-tolerance policy for anyone using their internal systems to violate user privacy, it's not because I'm eager to defend them. It's just true (at least, it was when I was there). There are internal systems dedicated to making sure you have access to what you need to do your job, and absolutely nothing else. All content you interact with through internal tools is monitored and logged. If you get caught trying to use whatever access your job gives you for anything other than doing your job, security immediately escorts you out of the building. This is drilled into new hires early and often. For everything Meta gets wrong, they really do take this seriously.
Yea but no. Meta is a defense contractor that hires out to 3rd parties exactly to do this. So you guys don't get to do that, but a lot of other people do. I hope that helped you sleep at night while you were there. But yea, it all gets bought and sold at the end of the day.
The irony is Meta wants to implement verification to protect kids. Meanwhile it's doing everything it can to exploit them at every single level, for profit and for the love of the game.
Yeah, why the hell is Meta watching people's videos either? Why PAY a company to invade our privacy and watch our videos? It's flipping BIZARRE.
Isn’t that obvious from the article? They’re labeling content for training AIs, something which is happening all over the world constantly.
> At the time of the publication, Meta admitted subcontracted workers might sometimes review content filmed on its smart glasses when people shared it with Meta AI.
They just got fired for "piercing the veil". They committed the sin of bringing attention to the invasion of privacy.
Unfortunately in today’s world where organizations are larger than many a country’s GDP, they really only have to face responsibility towards shareholders and maximizing profits is the thing they usually care about.
Is it illegal or immoral? Meta reviewing this material has to be approved by users and happens with their consent.
There was an example in the article where a user’s glasses kept recording the user’s wife after he took them off. That’s bad but on the user, not Facebook.
Seems similar to a situation where someone takes nudes of someone without their consent and then sends them off to a lab to be printed. The lab isn’t doing anything illegal or unethical printing them when they ask the user “are these legal” and the user replies “yes.” Unless you want to stop photo printers from ever printing nudes, I think the responsibility is on the user, not the firm.
Is there explicit approval? Or is it buried in the legal agreements?
Meta cancels the contract with the outsourcing company it hired to classify smart-glasses content, after employees at that company blow the whistle about serious privacy issues with the content they were paid to classify.
"Fun" bonus fact: This isn't the first time Sama (the outsourcing company) has had these problems.
OpenAI had them classify CSAM, so Sama fired them as a client back in 2022. https://time.com/6247678/openai-chatgpt-kenya-workers/
We're 4 years on, 3 years since that report broke. Not a single thing has improved about how tech companies operate.
How else do you want companies to remove and prevent CSAM? It seems like you must have some human involvement to train and monitor.
It’s a terrible job, I wouldn’t want to do it, but someone needs to. Perhaps one day, AI will be accurate enough to not need it, but even then you need someone to process complaints and waivers (like someone’s home photos being inaccurately flagged).
CSAM exists on social media because the platforms are so large that it's not possible to moderate them effectively. To me this is a no-go. If a business is so large that it cannot respect laws, it needs to be shut down.
The correct way to organize social media is in a federated way. Each server holds on average only a few hundred or few thousand people. Server moderators should be legally responsible for the content on their server. CSAM on social media would be 100x suppressed, because banning people is way easier on small servers.
Not many moderators would have to look at CSAM, because the structure of the system makes it unappealing to even try sharing CSAM, knowing you will be immediately blocked.
Isn't it more that tech companies are just higher profile and more integral to the political and social landscape than older companies? And reviewing the current political zeitgeist, they're in lockstep with what some, if not all, would just call fascism.
Sounds about right. If you know someone who uses these smart glasses, it's important not to tolerate them whatsoever. Don't speak with them or interact with them. I wouldn't even recommend being in their presence.
There's a name for these people, glassholes
> I wouldn't even recommend being in their presence.
Great! Now do people with smart TVs and people with smart phones
I know a bunch of people who use smart glasses, and I've used Ray-Ban Meta glasses myself for two years (mostly as speakers/mic, but occasionally the camera as well for some random shots - like cycling in a forest at a beautiful sunset). My default assumption for many years has been that if any photo/video goes to the cloud, it can potentially be leaked/stolen/used. I keep this assumption for both smartphones and smart glasses, yet I'd be happy to switch to Apple glasses when they're finally out.
Calls to stop speaking or interacting with people who use smart glasses sound like the dumbest thing I've ever read on HN.
You're aware of the privacy implications but think people talking about avoiding people who use them are proposing dumb arguments? I don't follow your logic.
Also make sure to avoid people with smartphones and places with video surveillance.
Don't let perfect be the enemy of good.
There's also nothing stopping us from stigmatizing the use of smartphones in public. Even a slight discouragement of it would be progress. It doesn't have to be all or nothing.
Is this an honest argument? Surely you can think of how glasses might be ... in a different league than the two items you mention?
Unless you are using these during sex, I consider a microphone 10x more privacy-intruding than a camera.
Security cameras afaik usually don't record audio, but all phones can record it. And they don't even need to be pointed in any specific direction.
Because a person wearing glasses usually can move, and video surveillance cameras usually can't? If that's not it, then spell it out for me, please. Also, why would I be deceptive in this discussion? I feel like I missed some ideological conflict.
Not meaningfully. Anyone holding a smartphone might be recording you. You’d better avoid them if you don’t want to be recorded.
Most people don't run around holding out their smartphone directly in front of them. It has to be pointed at the subject, and tends to be obvious.
Smart glasses, however, are always aimed at whatever the wearer is looking at. They may or may not be recording (note the reports of people hiding the LED indicators), and at a fair distance could easily be mistaken for a normal pair.
The general populace is much more likely to notice the former recording rather than the latter.
I've seen people keep their phone in their shirt pocket. The only reason it tends to be obvious is that most people aren't trying to be covert. Those aren't the ones you should be worried about.
This makes a valid point. People record strangers all the time, whether obviously or trying to be sneaky.
Just because you don’t notice it doesn’t mean it doesn’t happen.
However, this is still a different thing from smart glasses, which can be further segmented by who designed them.
Someone has to hold smartphone and point it at you.
If somebody was pointing a camera on me all the time? I would definitely avoid them.
People do that on my subway all the time.
It's the camera of their smartphone.
Not sure if it's ON though.
They point the camera of their smartphone directly at you?
At everything on the opposite side of the screen, typically. There is a recording light for Meta glasses, but not one for iPhones, for example: the "recording" indicators are all user-side there.
When I'm on public transport, people generally face their phones in such a way that they'd only be filming your feet or the floor... They don't hold them up at head height in such a way that other people would be recorded. Maybe it's just a cultural thing
Usually they are pointed at the ground when they're reading off them.
I want to get the Oakley Meta ones so I can record bike rides easier, should I not be tolerated?
A mostly-solitary sporting event (or one where you know all the other participants and can get their consent to record beforehand) seems like a reasonable use of these sorts of glasses. I wouldn’t personally give consent just as a sort of privacy reflex, but it really depends on your social circle.
Whistleblower protection is key for any working society. Only dictatorships and oligarchies protect criminals while shaming whistleblowers.
I do not care which country the outsourcing company is in. When criminals go global, protecting whistleblowers should go global too.
Mark Zuckerberg and disrespect for user privacy.
Name a more iconic duo.
> the content they were paid to classify
Well, yeah. If I went straight to the press to trash the reputation of my client's product, rather than communicating internally first to help them proactively address the issues, I would expect to get fired.
Not that I am remotely interested in defending Meta, or optimistic that they would proactively address privacy issues. But I don't feel that sympathetic to the outsourcing company here either.
I don't know what happened behind the scenes. I'm just going off what is said and not said in the article. If I were whistleblowing about something like this, I would take pains to describe what measures I took internally before going public. I didn't see any of that here.
EDIT: Look, to be clear, I think it's bad that naive or uninformed people are buying video recorders from Meta and unintentionally having their private lives intruded on by a company that, based on its history, clearly can't be trusted to be a helpful, transparent partner to customers on privacy. I think it's good that the media is giving people a reminder of this. I think it's good that the sources said something, even though the consequences they suffered seem inevitable. But to me, there is nothing essentially new to be learned here, and I don't know what can or should be done to improve the situation. I think for now, the best thing for people to do is not buy Meta hardware if they have any desire for privacy. Maybe there are laws that could help, but what should be in the laws exactly? It's not obvious to me what would work. I suspect that some of the reason people buy these products is for data capture, and that will sometimes lead to sensitive stuff being recorded. What should the rules be around this and who should decide? Personally I don't know.
What makes you think the outsourcing firm didn't raise these concerns in email or meetings? You think these people wanted to lose jobs and income? That's irrational.
Why reflexively defend a massive tech corporation caught repeatedly violating the law?
You would help conceal a crime against the people just because it's good business??
Congratulations, you have a bright future in politics and/or tech CEOing.
There are transgressions severe enough that your duty to stop them is heavier than your responsibility to "the reputation of your client's product." Amazing this needs to be stated, frankly.
Beautifully and succinctly put.
Proactively address the issues? Are you kidding me? This is not an issue that just happened to slip by; it is 100% by design. You're fooling no one.
What specifically do you mean? It is by design that smart glasses see the things happening in front of their users? Yes, it is. That is why people buy them.
Huh. There you go again, thinking everyone else is an idiot. Capture of users' video data by Meta is never acceptable. It would not be acceptable for any phone, and it is not acceptable for any glasses, ever.
Saving the data for any purpose other than allowing users to access it is bad enough; allowing Meta employees or contractors to view personal videos is on a whole new level.
I don't know why people buy smart glasses. Maybe they buy them for video capture. If so, the videos go to Meta's servers and Meta might do things with them. They might be criticized for not reviewing them in certain cases. That's one reason why I wouldn't buy Meta smart glasses.
If only we had the technology to record video without sending it to Meta's servers.
The main issue here is Facebook employees viewing users' private video streams (including of user nudity) without the users' knowledge.
The secondary issue is that it's generally frowned upon to make your employees view nudity in the workplace. Are there extenuating circumstances here? No, we have no evidence there are any extenuating circumstances here.
"Meta ended its contract with Sama" At this scale, this sounds like some insider-joke contract made up only to run a hustle on the side, capitalizing with stock options on the possibility of ad-hoc news-trading bots glitching out on the keyword.
> "We see everything - from living rooms to naked bodies," one worker reportedly said.
> Meta said this was for the purpose of improving the customer experience, and was a common practice among other companies.
Am I reading this correctly?! This is probably the weirdest statement I've read on the internet in twenty years.
> > Meta said this was for the purpose of improving the customer experience, and was a common practice among other companies.
> Am I reading this correctly?! This is probably the weirdest statement I've read on the internet in twenty years.
It's total fantasy. I've worked in big tech. Casually uploading and providing company/contractor access to non-redacted intimate photos or pictures of the insides of people's homes vaguely "for the purpose of improving the customer experience" would not pass even a surface-level privacy or data-protection review anywhere I've ever worked. Do Meta even read what they are saying?
Well, you gotta give out blackmail material to the scam centers somehow. Otherwise they don't actually have leverage! Oh right... We don't want that happening.
With lawyers like these, …
I once read the manual of one of those small floor cleaning robots (Ecovacs Deebot U2 pro), and it basically said that by using it you were giving them a right to take pictures and send them to a remote server (to analyze issues or something like that)
Meta is a defense contractor. They see the world a little differently from everyone else.
How is this weird? People have been trading away their privacy for the smallest possible gains in convenience for a long time.
Are you conflating telemetry with literally live-streaming your life to Meta? Because that's what makes the statement weird.
edit 2: OK, I see what you mean. But I'm wondering if it should be possible to consent to this via T&C. Basically the same issue as with many online services, turned up to 11, sure. And it involves OTHER people, who have not consented.
Stuff like this used to be outrage fuel even when it was more of a social experiment, e.g. the documentary "We live in public" or the "Big brother" TV show. By now, I'm sure there have been millions of influencers doing similar things, but it's very much not considered normal?
Streaming to an unknown number of employees might be considered different from streaming to the public, sure.
But the core question here is whether there's informed consent, and, IMO also, if it should be possible to consent to this when the other party is a company like Meta and the pretext is not deliberately seeking attention (like influencers and streamers do).
edit, clarified social media comparison
Not sure which is worse here - that Meta are recording video from customers' smart glasses, or that they are firing people who talk about it.
The latter, as they can't even claim to have done so by accident, or "it was just a bug".
Everything having to do with Meta, starting with its very name, has been evil from the start.
Can I squeeze in just a teeny tiny bit of… why the hell are you wearing an internet camera while naked and/or having sex?
… although I really extend that to why are you wearing an internet connected camera that is obviously going to be monitored by Meta.
So the people wearing these glasses have already agreed that Meta can monitor them. They also probably trust Meta when it says "When the glasses are off, nothing is recording", for better or worse. With that perspective in mind, it's not far-fetched to assume these same people will willingly be naked in front of recording devices they believe to be off.
Of course, anyone who opened a newspaper in the last 10 years or so would know better, but I can definitively see some people not giving a fuck about it.
The new sex tape is two minutes of a hairy navel going in and out of focus.
The Ray-Ban stays ON during sex!
One of the bigger commercial niches for smart glasses is filming POV porn, so it is hardly surprising that sort of content ended up in the moderation queue. The project should have planned to account for that use case.
And I do appreciate how awkward it is for Meta to admit that use case exists. Even in the Oculus Go days there were a bunch of polite euphemisms internally to avoid mentioning "our device has to ship with a browser so people can watch porn on it"
Why is there even a “moderation queue”? Aren’t these people’s private recordings?
I think Meta, like all companies, doesn’t want its subcontractors creating bad press for them.
So it doesn’t surprise me that Meta didn’t renew (or cancelled) a contract that is a net negative for them. Arguing over the reason seems fruitless, as no reason is needed per the terms of the contract (I assume, since breach of contract wasn’t brought up by the sub).
I believe the tricky privacy and security issues around smart glasses (and other "personal" tech) can be navigated successfully enough by a thoughtful, diligent, responsive company.
Which is why I'd never touch a personal tech device from Meta.
Their entire DNA is written to exploit their users for profit. In my judgement, they literally cannot and will never consider those issues as anything other than something to obscure to keep people unaware of the depth of the exploitation.
I wonder under what circumstances footage from the glasses is uploaded for classification.
Probably this is people asking the glasses something about what they see and the glasses uploading video for classification to generate an answer.
People think it is "just AI" so are not very concerned about privacy.
Big Tech and the race to the bottom of the ethical pit. We can still go lowerrrr!
https://archive.ph/ubWba
Absolutely no way I'd buy anything from Meta that has a camera built-in.
What does "in row" mean? For us non-native English speakers.
To add to the other replies, when it's an argument, it's pronounced like "how" not like "no".
“a noisy argument or fight”, from the Cambridge dictionary. I believe it’s primarily used in British English.
A row in this context is like a dispute or argument
It's also pronounced r-ow (ow, as in I hurt myself) in this context, instead of r-oh, in case that helps the OP
in an argument
"row" means "an argument"
Yeah, I think it's more of a British English thing. It can also mean things like "in a fight". Like: "those two guys had a big row outside the pub the other night"
Facebook may have to rename itself into NaughtyBook or SpyBook or Pr0nBook. They really want people to help them spy on other people here - including their sex life. Expect new sexy videos in 3 ... 2 ...
Why do they even need workers to classify naked content? They could filter some content prior to passing it to workers. They already have models to moderate explicit content.
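The kind of pre-filtering described above is straightforward to sketch. This is a minimal, hypothetical routing layer: `nsfw_score` stands in for a real image classifier (any open-source NSFW model would do), and the thresholds `AUTO_BLOCK` and `AUTO_PASS` are made-up names for illustration. Only genuinely ambiguous content would ever reach a human queue.

```python
# Sketch of a pre-filter that routes captured frames before any human review.
# `nsfw_score` is a stand-in for a real classifier; it is stubbed here so the
# routing logic is runnable on its own.

AUTO_BLOCK = 0.9   # confident explicit content: machine-handled, never shown to reviewers
AUTO_PASS = 0.1    # confident benign content: skips the human queue entirely

def nsfw_score(frame: bytes) -> float:
    """Hypothetical model call; a real system would run an image classifier."""
    return 0.95 if b"explicit" in frame else 0.02

def route(frame: bytes) -> str:
    """Send only ambiguous frames to human reviewers."""
    score = nsfw_score(frame)
    if score >= AUTO_BLOCK:
        return "auto-flagged"   # redacted from human reviewers
    if score <= AUTO_PASS:
        return "auto-passed"
    return "human-review"       # the small ambiguous remainder

print(route(b"explicit frame"))  # auto-flagged
print(route(b"living room"))     # auto-passed
```

Whether Meta's actual pipeline works this way is unknown; the point is simply that high-confidence explicit content can be triaged automatically, so routing it to contractors at all is a design choice, not a necessity.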
A question for the HN folks who work for Meta - Is the pay so good that it makes it worth working for such a morally bankrupt organization?
Unfortunately this news will have no impact: neither on customer behavior, nor on policy, nor on Meta's behavior.
Good. Anyone who works for such a company is immoral in my opinion.
If you want to read more about how unsavory aspects of AI-training are off-loaded onto poor workers in third-world countries, would recommend Karen Hao's "Empire of AI". These workers are paid pennies an hour for unstable jobs that expose them to some horrific material.
Not a fan of regulation in general, but would love to see a ban of cameras on glasses used in public spaces.
If anything they should be banned in private spaces, like if someone wearing them enters someone's home etc.
There is no expectation of privacy in public.
The owner of the private space generally has authority to deny this already, there's no need for an additional law.
In the US at least, any private homeowner/renter can deny entry to their property, barring legal warrants and exceptional circumstances. A business can have a policy, and is generally legally protected as long as the policy is 1) equally applied, and 2) does not violate ADA... A court would have to weigh in if glasses are allowed or not for ADA... but I suspect there's already a case where a movie theater banned such glasses and they would probably(?) win, since such individuals could be expected to have non-recording glasses.
Why? What's the difference between that and one of the many, many concealed camera options that you don't even notice? Just that it's noticeable? I don't think that's a good enough reason for yet-more-regulation. You're already being recorded everywhere you go in public by the authorities, and often by people standing right next to you unnoticed, so just act accordingly.
There is no difference. Both are creepy as hell and should be called out and ridiculed
“You're already being recorded everywhere you go in public by the authorities”
You are the frog being boiled.
Because they will be popular and lots of people will buy them and use them all the time, leading to much more generalized surveillance than the concealed options that only a tiny tiny fraction of people would buy or use (and that we should also regulate)
The problem is if it becomes socially normalized. If you're using a concealed camera and someone notices, you're a creep/asshole.
Meta is so evil
Evil is the current meta
This is what happens when you buy a camera from the "they trust me, dumb fucks" guy and put it on your face.
I don't think smart glasses are a good idea in the first place.
> and was a common practice among other companies.
Meta isn’t lying; you should assume other companies are doing it too. Tesla did it with their cameras, and you should assume any company with access to your camera does the same; I would even assume CCTV cameras too. That’s why, for anything sensitive, try to use open-source stacks. You might lose some of the features, but it’s a needed compromise.
So I've never had a smart speaker in my house (Alexa, Apple, Google). I've just never been comfortable with the idea of having an always-on, cloud-connected microphone in my house. Not because I thought these companies would deliberately start listening and recording in my house, but because they will likely be careless with that data and it'll open the door for law enforcement to request it. Consider the Google Wi-Fi scraping case from Street View.
Or they might start scanning for "problematic" behavior, a bit like the Apple CSAM fingerprinting initiative.
So not one part of me would ever buy Meta glasses (or the Snap glasses before that). You simply don't have sufficient control over the recordings and big tech companies can't be trusted, as we've witnessed from outsourced workers sharing explicit images. And I bet that's just the tip of the iceberg.
I honestly don't understand why anyone would get these and trust Meta to manage the risks.
That is to say nothing of the new technological use cases that could develop from already existing technology. They just haven’t been thought of or developed yet.
Things like audio scanning your living space using those Alexa smart speakers with ultrasonics to get an image of not only everything in your space, but where you are in that space as well.
That technological use case only came out within the last five or so years, maybe closer to eight. Either way, I could see it coming before it became a thing: ultrasound imaging of an unborn child is a thing, ultrasound imaging of the sea floor is a thing, so why wouldn’t ultrasound imaging of your living space be a thing for a company that wants to know what you buy?
I never ever ever had an Alexa. I only ever had a Google Home because I got it for free with GPM, but I almost never used it because I hated the idea of it always listening.
I already regret Wi-Fi, because they've now figured out how to look through walls with it.