this post was submitted on 18 Jul 2024
115 points (100.0% liked)

Technology


A nice place to discuss rumors, happenings, innovations, and challenges in the technology sphere. We also welcome discussions on the intersections of technology and society. If it’s technological news or discussion of technology, it probably belongs here.

Remember the overriding ethos on Beehaw: Be(e) Nice. Each user you encounter here is a person, and should be treated with kindness (even if they’re wrong, or use a Linux distro you don’t like). Personal attacks will not be tolerated.

This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.

founded 2 years ago
top 47 comments
[–] OsrsNeedsF2P@lemmy.ml 122 points 4 months ago (2 children)

The dumbest part is like, why? How much work is it really to keep goo.gl links around?

In 2018, Google wanted developers to move to Firebase Dynamic Links, which detect the user's platform and send them to either the web or an app. Google ended up shutting down that service for devs too.

lmao

[–] jarfil 37 points 4 months ago* (last edited 3 months ago) (2 children)

How much work is it really to keep goo.gl links around?

A lot.

Goo.gl has a namespace for 10 billion entries, it used to keep tracking/analytics data for each link, with a user interface, and it would happily generate them for links to internal stuff.

Just keeping it running would take multiple racks of servers, plus ongoing security updates, accounting for changing web standards, and so on.

Keep in mind this isn't some self-hosted URL shortener with under a million entries and a peak of 10K users/second that you can slap onto a random server and forget about. It's a beast multiple orders of magnitude larger, requiring a multi-server architecture just for the database, plus more of the same for the analytics, admin interface... and users will expect it to return a result in a fraction of a second, worldwide.

[–] Kissaki 23 points 4 months ago (1 children)

They could drop all the tracking, though, and only serve the public redirects. A much simpler product that would keep existing web links working.
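The core of such a tracking-free, read-only service is conceptually tiny. A minimal sketch in Python (the short code and target URL below are made up, and at goo.gl scale the dict would obviously have to be a real replicated datastore):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical short-path -> long-URL mapping; a production service would
# back this with a replicated key-value store, not an in-memory dict.
REDIRECTS = {
    "/abc123": "https://example.com/some/long/path",
}

def resolve(path):
    """Return the long URL for a short path, or None if unknown."""
    return REDIRECTS.get(path)

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = resolve(self.path)
        if target is None:
            self.send_error(404, "Short link not found")
        else:
            self.send_response(301)  # permanent redirect
            self.send_header("Location", target)
            self.end_headers()

# To serve: HTTPServer(("", 8080), RedirectHandler).serve_forever()
```

The hard part isn't the logic, it's serving that lookup reliably at Google's traffic volume.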

[–] jarfil 2 points 3 months ago (1 children)

I think they've dropped the tracking already. Still, where's the money in that?

They also can't release the database, not without prior consent of the link creators, or risking exposing some login credentials some very smart people might've put in there.

[–] Kissaki 1 points 3 months ago (1 children)

Why does there have to be money in it when they're sunsetting the service?

[–] jarfil 1 points 3 months ago

Google/Alphabet is a for-profit corporation; it makes no sense for them to do anything without some sort of profit.

[–] Penguincoder 12 points 4 months ago

Good analysis, I agree and understand.

[–] jonne@infosec.pub 7 points 4 months ago

Yeah, shouldn't be too hard to at least keep the existing links working in a read only state.

[–] thingsiplay 75 points 4 months ago (1 children)
[–] Adanisi@lemmy.zip 2 points 4 months ago

Another one for the graveyard!

[–] corbin@infosec.pub 75 points 4 months ago (2 children)

That's a whole lot of link rot about to happen.

[–] ptz@dubvee.org 21 points 4 months ago (1 children)

I have probably saved hundreds of links from such a fate in my org. People there use them for everything, even in media where the links are clickable (i.e. not going out to print, where someone has to type them in).

Thankfully, I'm in a position to un-shorten them before they get published. lol

[–] tal@lemmy.today 11 points 4 months ago* (last edited 4 months ago) (1 children)

It might be interesting to have a search engine or someone else who has built a massive list of links visible online generate unshortened forms now before Google shuts down the service.

[–] jarfil 10 points 4 months ago (1 children)
[–] tal@lemmy.today 3 points 4 months ago

Thanks. It says that there are already browser plugins that use their database, so looks like there's already a way on both the scraper and user ends to programmatically avoid link rot here.

[–] anarchrist@lemmy.dbzer0.com 9 points 4 months ago

The Jedis are going to feel this one

[–] henfredemars@infosec.pub 61 points 4 months ago* (last edited 4 months ago) (13 children)

Don’t build your online life around Google services.

[–] smeg@feddit.uk 5 points 4 months ago

Don't rely on any company keeping a service running unless you've got a contract with them

[–] Penguincoder 18 points 4 months ago (2 children)
[–] ptz@dubvee.org 11 points 4 months ago

LOL¹⁰⁰

[–] thingsiplay 5 points 4 months ago

Googlel

Used instead of lol; when you're too dank for lol

- https://www.urbandictionary.com/define.php?term=Lel

[–] Moonrise2473@feddit.it 13 points 4 months ago (2 children)

How much money can they even save with this?

The savings probably couldn't cover a single day of the CEO's salary.

[–] jarfil 3 points 4 months ago* (last edited 4 months ago) (1 children)

URL shorteners in general, or just Google?

https://wiki.archiveteam.org/index.php/URLTeam

Goo.gl has a namespace for 10 billion entries, it used to keep tracking/analytics data for each link, with a user interface, and it would happily generate them for stuff like Google Maps links.

How much money would you say it takes to even maintain a system like that, plus update its security, not to mention account for changing web standards, at that scale?

[–] jonne@infosec.pub 5 points 4 months ago (1 children)

Probably like $50/month in cloud resources if you turned off all the extra stuff and only did redirects and kept it around in read only mode. You'd need to do some dev work up front and price that in as well, obviously.

[–] jarfil 4 points 4 months ago* (last edited 4 months ago)

$50/month would barely scratch the surface.

Let's take a conservative approach, and say there are:

  • only 1 billion links
  • each link only points to a URL of up to 100 characters in length on average (some will be 1000 or longer, but let's hope some are 50 or shorter)
  • less than 10 billion daily hits total (that's an average of 10/link)
  • the response time should be well under 50ms.

Now you're looking at 100GB of raw data to put into a database, that needs to return 100K answers/second, in less than 50ms each, worldwide, 24/7.

What is your estimated cloud cost for something like 256GB of RAM, 128 cores, 10Gbps connect, replicated across several zones, and 1TB/day outgoing transfer?

That's only for the redirect responses in read-only mode, nothing else. You will also need some maintenance to keep it running 24/7, for when a server catches fire or gets obsoleted, and when new exploits surface against your software stack.
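The estimate above can be sanity-checked with a quick back-of-the-envelope script, using the same conservative assumptions (1 billion links, 100-byte average URL, 10 billion hits/day, ~100 bytes per redirect response):

```python
# Back-of-the-envelope check of the conservative numbers above.
LINKS = 1_000_000_000           # 1 billion short links
AVG_URL_BYTES = 100             # average target URL length
DAILY_HITS = 10_000_000_000     # 10 billion redirects/day
RESPONSE_BYTES = 100            # approximate size of a 301 response

raw_data_gb = LINKS * AVG_URL_BYTES / 1e9            # ~100 GB of raw mappings
qps = DAILY_HITS / 86_400                            # ~116K lookups/second
egress_tb_per_day = DAILY_HITS * RESPONSE_BYTES / 1e12  # ~1 TB/day outgoing

print(raw_data_gb, round(qps), egress_tb_per_day)
```

Even with everything stripped down, that throughput and data volume is well past what a single cheap cloud instance handles.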

[–] halm@leminal.space 2 points 4 months ago (1 children)

Oh, the CEO's pay is secure from day one of the fiscal year. They're trying to pay the cleaning staff with this.

[–] metaStatic@kbin.earth 5 points 4 months ago (1 children)

What cleaning staff? Cleaning is just another part of your job now, this is purely for the shareholders.

[–] halm@leminal.space 1 points 4 months ago

Accurate 😬

[–] Gork@lemm.ee 11 points 4 months ago

Does Google no longer want to pay for the Greenland .gl TLD?

[–] radivojevic@discuss.online 11 points 4 months ago (1 children)

Why anyone uses a single Google product, I’ll never know.

[–] SweetCitrusBuzz 13 points 4 months ago* (last edited 3 months ago) (1 children)

Disclaimer: I don't use any google services myself.

Because it's free, works reliably as long as they keep it running, and is marketed well.

Plus, since they were early to the online tech game, they have many services that all link together.

There aren't many that will offer most users so much value for 'free'.

Most alternatives will have some cost if you want as much space as Google provides, paid either the same way as Google (with user data) or with money (which I can half accept; hosting isn't free, and I'd rather pay with money than with data). However, not everyone is in a position to pay with money, so data is usually what they pay with.

[–] DJDarren@thelemmy.club 2 points 4 months ago (1 children)

We use Google Forms and Sheets at work, precisely because it's easy for a bunch of us to access, and our boss is tight as fuck, so it being free is a massive draw.

I keep looking at other ways to perform the few functions we use, but ultimately I lack the knowledge and resources to roll my own.

[–] SweetCitrusBuzz 1 points 4 months ago

Yeah, I lack the knowledge and resources to roll my own too.

So I mostly rely on cryptpad for sharing/collaborating on different document types myself. I don't think it is necessarily free for businesses, but I am unsure.

[–] cupcakezealot@lemmy.blahaj.zone 8 points 4 months ago (2 children)

what were they used for? internal redirects like t.co? or something for customers? genuine question

[–] jarfil 14 points 4 months ago

Anyone could generate them, for free, and they came with analytics on the side. Google also generated them for sharing content from their services.

https://en.m.wikipedia.org/wiki/Google_URL_Shortener

[–] tal@lemmy.today 3 points 4 months ago

I'd assume that Google's value -- as with other link-shortening companies -- came from being able to add information tracking whenever someone clicked on that link.

If you mean customer value, might be formats where people had limited space to include links like traditional Twitter (which was originally 140 characters in a post, whereas URLs have no specification-mandated character limit).

[–] mp3@lemmy.ca 5 points 4 months ago* (last edited 4 months ago) (1 children)

Looks like I'll finally have to replace that link in my resume after all.

It was useful to know when a copy of my printed resume was accessed online through the link I added on the footer, at least while the console for it was online.

[–] Midnitte 1 points 4 months ago

You could do something similar, like adding +resume (plus addressing) to the email address in the link.