this post was submitted on 20 May 2024
13 points (100.0% liked)

Programming


Does anyone know, or can anyone guess, the business case for predictive text? On phone apps, it is often incredibly difficult to turn off. Why is that, do you think? (The examples I have recent experience with are Facebook and Outlook mobile apps.)

I would have thought that, for AI training purposes, they would want humans typing things and not just regurgitating canned responses. But apparently not?

top 14 comments
[–] PotatoesFall@discuss.tchncs.de 9 points 6 months ago

On Android you can just install another keyboard if your current one doesn't have a setting for this.

If I had to guess a business case, I'd say that predictive text as a feature gives you a "legitimate reason" to send your typing data to Google or whoever to train the prediction engine, and they want that data.

[–] Dave@lemmy.nz 8 points 6 months ago

How come the apps are controlling your keyboard? Shouldn't they use your phone's selected keyboard?

[–] pineapplelover@lemm.ee 7 points 6 months ago

I have disabled autocorrect in my (Android) keyboard settings. That disables it system-wide; works for me.

[–] Kissaki 4 points 6 months ago (1 children)

I assume you don't mean keyboard text predictions, which are a different thing, but the predictions built into the platforms themselves.

It's a new convenience feature. Something they, as a platform, can shine with to retain users and set themselves apart from other platforms.

Having training data is not the primary potential gain; it's user investment, retention, and interaction. Users choosing the generated text is still valid training data, though: whether they typed similar words themselves or accepted what was suggested is still input on user choice.
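
A rough sketch of the kind of accept/reject signal that could be collected (the event shape here is entirely hypothetical, not any platform's actual telemetry):

```python
# Hypothetical sketch: record whether a user accepted a platform suggestion.
# Both accepting and ignoring a suggestion are useful training signals.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SuggestionEvent:
    suggestion: str     # text the platform offered
    accepted: bool      # did the user send it as-is?
    typed_instead: str  # what the user actually sent

def log_suggestion_event(suggestion: str, final_text: str) -> SuggestionEvent:
    event = SuggestionEvent(
        suggestion=suggestion,
        accepted=(final_text == suggestion),
        typed_instead=final_text,
    )
    # A real system would queue this for a training pipeline; here we just print it.
    print(datetime.now(timezone.utc).isoformat(), event)
    return event

log_suggestion_event("Sounds good, thanks!", "Sounds good, thanks!")  # accepted
log_suggestion_event("Sounds good, thanks!", "ok")                    # rejected
```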

It does lead to convergence toward a centralized, standardized way of speaking, with a self-strengthening feedback loop.

[–] friendly_ghost 1 points 5 months ago

Thank you! This makes sense to me

[–] jarfil 3 points 6 months ago* (last edited 6 months ago)

On Android, most apps depend on the keyboard (IME) for this:

  • Gboard has a configurable suggestions bar where you can pick words, or not.
  • Microsoft SwiftKey works similarly, but it underlines the word you're typing.
  • AnySoftKeyboard works like SwiftKey.

The only exception I've seen is Copilot, which shows the suggested word inline, to be accepted with [tab], but you can still type a different one.

I've noticed no such behavior on Facebook. Have you checked your keyboard settings?

[–] Max_P@lemmy.max-p.me 2 points 6 months ago (1 children)

I have none of that on my phone, just a plain old keyboard.

But the reason it's everywhere is that it's the new hot thing, and every company in the world feels it has to get on board now or risk being left behind; nobody can let anyone else have a head start. It's incredibly dumb and shortsighted, but since actually innovating on features is hard and AI is cheap to implement, that's what every company goes for.

[–] PonyOfWar@pawb.social 4 points 6 months ago (2 children)

It's not new, nor is it AI. Predictive text suggestions have been in Android for ages now.

[–] thingsiplay 2 points 6 months ago

I think predictive text predates Android and smartphones: back when we had to press a key three times to get a specific character, T9 offered suggestions from what was just a dictionary. Having or not having a dictionary suggestion was the difference between life and death. Modern smartphones have far more compute power and resources, so they can analyze text in more depth. It's just the logical next step from the plain and simple dictionary.
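
For reference, a toy sketch of the dictionary lookup that T9-style prediction boils down to (the word list and examples here are made up for illustration):

```python
# Toy sketch of T9-style prediction: each letter maps to a digit on the
# 9-key pad, and a digit sequence maps back to candidate dictionary words.
KEYPAD = {
    "2": "abc", "3": "def", "4": "ghi", "5": "jkl",
    "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz",
}
LETTER_TO_DIGIT = {ch: d for d, letters in KEYPAD.items() for ch in letters}

# Tiny stand-in dictionary; a real one holds thousands of words,
# ranked by usage frequency.
DICTIONARY = ["home", "good", "gone", "hood", "hello", "help"]

def word_to_digits(word: str) -> str:
    return "".join(LETTER_TO_DIGIT[ch] for ch in word.lower())

def suggest(digits: str) -> list[str]:
    """Return dictionary words whose key sequence starts with the typed digits."""
    return [w for w in DICTIONARY if word_to_digits(w).startswith(digits)]

print(suggest("4663"))  # ['home', 'good', 'gone', 'hood'] -- all share keys 4-6-6-3
print(suggest("43"))    # ['hello', 'help']
```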

[–] megopie 1 points 5 months ago

See, it isn’t new and it isn’t AI, but it’s the same line of development as modern LLMs. They’ve just rebranded existing projects and lines of development as “AI technology” to be marketable.

[–] Paragone 2 points 5 months ago

It's simple:

Beat the population into learned-helplessness,

& then all the AI molestingware that the device can run can be running on it.

Desensitization/enforced-learned-helplessness.

It's just a conditioning-step, is all.

The profit is in the population not having any privacy left, & living only within the neuromarketing platforms that the mainstream operating systems are becoming.

It's just a step in the suckerpunching of humankind, is all.

_ /\ _

[–] TehPers 1 points 6 months ago

I've seen this in a few places on desktop, and I have no clue why it's even a feature. I'm not aware of anyone using it anywhere (although to be fair I haven't thought to ask).

As for why it's enabled by default, probably for visibility. The easiest way to get people to use a feature is to turn it on for them and require them to explicitly disable it (if that's even an option). For AI training, they could theoretically just capture typing data and messages regardless of whether the feature is enabled anyway.

[–] megopie 1 points 5 months ago* (last edited 5 months ago)

Might be that information about when you do and don't use the output is helpful for training. Like, if you use the output, that's a good sign the output is good.

[–] thingsiplay 1 points 6 months ago

Is that really the case? Would everything one types on the keyboard be sent to the companies and used as AI training data? Does any keyboard on a smartphone actually do that?