this post was submitted on 02 Nov 2024
69 points (100.0% liked)

Technology

37740 readers

A nice place to discuss rumors, happenings, innovations, and challenges in the technology sphere. We also welcome discussions on the intersections of technology and society. If it’s technological news or discussion of technology, it probably belongs here.

Remember the overriding ethos on Beehaw: Be(e) Nice. Each user you encounter here is a person, and should be treated with kindness (even if they’re wrong, or use a Linux distro you don’t like). Personal attacks will not be tolerated.

This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.

founded 2 years ago
top 17 comments
[–] Catoblepas@lemmy.blahaj.zone 61 points 3 weeks ago (1 children)

About a month ago, Israel-based Corsight AI began offering its global clients access to a new service aimed at rooting out what the retail industry calls “sweethearting”: instances of store employees giving people they know discounts or free items.

Lol, I hope stores that use this lose millions on this stupid-ass privacy invasion. Anyone who believes that catching the occasional 10% employee discount for a friend will offset whatever the fuck this latest torment nexus costs frankly deserves to be swindled.

[–] iturnedintoanewt@lemm.ee 4 points 3 weeks ago (1 children)

That would only happen if it ends up in the news that a store uses it. I definitely can't imagine people dropping their main grocery store over a rumor like this.

[–] Catoblepas@lemmy.blahaj.zone 26 points 3 weeks ago* (last edited 3 weeks ago)

Oh, I don’t even mean from lost sales, I mean because this service is fundamentally going to cost more than the “theft” (lol) it’s allegedly stopping. If any one employee (or even a team) is doing this at scale and a business needs AI tracking customers to pick up on it, there is something drastically wrong.

This service is basically pure AI hype. It’s not doing anything a minimally engaged manager couldn’t already do with the salary you’re having to pay them anyway. Except the AI is also doing it worse and at a higher cost. Yay!

[–] scrubbles@poptalk.scrubbles.tech 32 points 3 weeks ago

We're already in the dystopia.

[–] prex@aussie.zone 30 points 3 weeks ago
[–] Peanutbjelly@sopuli.xyz 28 points 3 weeks ago* (last edited 3 weeks ago)

Big fan of AI stuff. Not a fan of this. This definitely won't have issues with minority populations and neurodivergent people falling outside the training distribution and causing false positives that enable more harassment of people who already get unfairly harassed.

Let this die with the mind-reading tactics it spawned from.

[–] t3rmit3 24 points 2 weeks ago* (last edited 2 weeks ago)

Not friendly enough when talking to customers? Bad employee.

Too friendly when talking to customers? Bad employee.

This is just about 1) creating an algorithmic justification for the racial profiling that managers already do, and 2) keeping employees in fear of termination so they put up with bullshit.

Side story about how shitty retail management is:

When I was working retail years ago (big box electronics store), our management employed a system of getting every new employee to 3 write-ups as fast as they could (I'm talking within a month of starting), using literally any excuse they could, so they could hold "one more write-up and you're fired" over their head.

"AI" is definitely going to become a new tool for employee suppression.

[–] Alice 16 points 3 weeks ago

This is horrifying for a lot of reasons, but it'd be nice if my boss had to see the number of people who yell at me for masking or tell me I look like a man every day. It wouldn't help anything; I just hate my boss and want her to feel the awkwardness.

[–] loops 15 points 2 weeks ago

Israel-based

Well there's your problem right there.

[–] Midnitte 10 points 2 weeks ago (1 children)

Absolutely no way this doesn't explicitly target certain groups of people and end up in a lawsuit.

[–] Doxin@pawb.social 4 points 2 weeks ago

There's no chance this doesn't turn out to be, among other things, an autism detector.

[–] wesker@lemmy.sdf.org 10 points 3 weeks ago* (last edited 3 weeks ago)

If it can detect suspicious unfriendliness, then I'm really in trouble.

[–] thingsiplay 8 points 3 weeks ago (1 children)

I would just get sunglasses and try to look suspicious to mess up their tracking.

[–] Nytixus@kbin.melroy.org 8 points 3 weeks ago (1 children)

And be theatrical. High-five the cashier, over-compliment them, get a bunch of friends to dance with them. Then by the end, act happy they offered you a discount.

[–] thingsiplay 4 points 3 weeks ago

I'm already trained in over-complimenting. I think if I just act like I always do, it's already sus. xD

[–] Dirac@lemmy.today 6 points 3 weeks ago

This is trash. I can’t wait for these people to install this garbage for $100K or something, and then get too many alerts to actually investigate, thus wasting their money (which they deserve for participating in this kind of surveillance) and proving that Corsight are a bunch of charlatans.

[–] ShellMonkey@lemmy.socdojo.com 5 points 3 weeks ago* (last edited 3 weeks ago)

They claim $100B in 'losses' to this kind of thing. Unless they're actually running in the red on their books, what they really mean is 'we think we should make at least $100B more per year'.

I'm sure the vast majority of that would go directly to the front-of-house employees they're pinning this on, too, definitely not to the execs and shareholders...