jasonpeacock 12 hours ago

What amazes me is that this wasn't the original plan. What product manager thinks "the best thing for our customers is to delete their data!".

> We understand these links are embedded in countless documents, videos, posts and more, and we appreciate the input received.

How did they think the links were being used?

  • borg16 11 hours ago

    I read in an earlier thread about this on HN: "this is a classic example of a data-driven product decision," aka we can reduce costs by $X if we just stop serving goo.gl links. Instead of actually wondering how this would impact the customers.

    Also helps that they are in a culture which does not mind killing services on a whim.

    • Aurornis 11 hours ago

      The Google URL shortener stopped accepting new links around 2018. It has been deprecated for a long time.

      I doubt it was a cost-driven decision on the basis of running the servers. My guess would be that it was a security and maintenance burden that nobody wanted.

      They also might have wanted to use the domain for something else.

      • cogman10 10 hours ago

        How much of a burden could this really be?

        The nature of something like this is that the cost to run it naturally goes down over time. Old links get clicked less so the hardware costs would be basically nothing.

        As for the actual software security, it's a URL shortener. They could rewrite the entire thing in almost no time with just a single dev. Especially since it's strictly hosting static links at this point.

        It probably took them more time and money to find inactive links than it'd take to keep the entire thing running for a couple of years.
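
        To make the "single dev, almost no time" claim concrete: a strictly static redirector really is a few dozen lines. Here is a minimal stdlib-only sketch; the codes and target URLs are invented for illustration, and a real service would load the table from storage and add logging, moderation, and rate limiting.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# In-memory, read-only code -> URL table. Entirely hypothetical entries;
# "strictly hosting static links" means nothing here ever changes.
LINKS = {
    "/abc123": "https://example.com/some/long/path",
    "/xyz789": "https://example.org/docs",
}

class Redirector(BaseHTTPRequestHandler):
    def do_GET(self):
        target = LINKS.get(self.path)
        if target:
            # Permanent redirect to the stored target URL.
            self.send_response(301)
            self.send_header("Location", target)
        else:
            # Unknown code: nothing to redirect to.
            self.send_response(404)
        self.end_headers()

# To run: HTTPServer(("127.0.0.1", 8080), Redirector).serve_forever()
```

        The sketch deliberately has no write path at all, which is the point being made: a deprecated shortener that accepts no new links is just a lookup table behind a redirect.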

        • simonw 9 hours ago

          "How much of a burden could this really be?"

          My understanding from conversations I've seen about Google Reader is that the problem with Google is that every few years they have a new wave of infrastructure, which necessitates upgrading a bunch of things about all of their products.

          I guess that might be things like some new version of BigTable or whatever coming along, so you need to migrate everything from the previous versions.

          If a product has an active team maintaining it they can handle the upgrade. If a product has no team assigned there's nobody to do that work.

          • rohit89 21 minutes ago

            > If a product has no team assigned there's nobody to do that work.

            This seems like a good eval case for autonomous coding agents.

          • Polizeiposaune 6 hours ago

            My understanding is that (at least at one point) binaries older than about six months were not allowed to run in production. But APIs are "evolving" irregularly so the longer you go between builds the more likely something is going to break. You really need a continuous build going to stay on top of it.

            Best analogy I can think of is log-rolling (as in the lumberjack competition).

            • hackernudes 5 hours ago

              Google is famously a monorepo and is basically the gold standard of CI/CD.

              What does happen is APIs are constantly upgraded and rewritten and deprecated. Eventually projects using the deprecated APIs need to be upgraded or dropped. I don't really understand why developers LOVE to deprecate shit that has users but it's a fact of life.

              Second hand info about Google only so take it with a grain of salt.

              • zdragnar 5 hours ago

                Simple: you don't get promoted for maintaining legacy stuff. You do get promoted for providing something new that people adopt.

                As such, developing a new API gets more brownie points than rebuilding a service that does a better job of providing an existing API.

                To be more charitable, having learned lessons from an existing API, a new one might incorporate those lessons learned and be able to do a better job serving various needs. At some point, it stops making sense to support older versions of an API as multiple versions with multiple sets of documentation can be really confusing.

                I'm personally cynical enough to believe more in the less charitable version, but it's not impossible.

                • Hammershaft 5 hours ago

                  I agree this is an overriding incentive that hurts customers & companies. I don't think there's an easy fix. Designing & creating new products demonstrates more promotion-relevant capabilities than maintaining legacy code does.

          • chrisjj 7 hours ago

            > I guess that might be things like some new version of BigTable or whatever coming along, so you need to migrate everything from the previous versions.

            Arrival of the new does not necessitate migration.

            Only departure of old does.

            • mbac32768 7 hours ago

              They deprecate internal infrastructure stuff zealously and tell teams they need to be off of such and such by this date.

              But it's worse than that because they'll bring up whole new datacenters without ever bringing the deprecated service up, and they also retire datacenters with some regularity. So if you run a service that depends on deprecated services you could quickly find yourself in a situation where you have to migrate to maintain N+2 redundancy but there's hardly any datacenter with capacity available in the deprecated service you depend on.

              Also, how many man-years of engineering do you want to spend on keeping goo.gl running? If you were an engineer, would you want to be assigned this project? What are you going to put in your perf packet? "Spent 6 months of my time, and bothered engineers on other teams, to keep this service that makes us no money running"?

              • fragmede 5 hours ago

                > If you were an engineer would you want to be assigned this project?

                If you're high-flying, trying to be the next Urs or Jeff Dean or Ian Goodfellow, you wouldn't, but I'm sure there are many thousands of people able to do the job who would just love to work for Google, collect a paycheck on a $150k/yr job, and do that for the rest of their lives.

              • ikiris 6 hours ago

                Because it costs money to run things, and no one wants to pay for something that they aren't getting career value for.

            • lokar 7 hours ago

              A lot of Google infra services are built around the understanding that clients will be re-built to pick up library changes pretty often, and that you can make breaking API changes from time to time (with lots of notice).

            • pavel_lishin 7 hours ago

              But if you don't retire the old, then you're endlessly supporting systems, forever. At some point, it does become cheaper to migrate everything to the new.

          • stouset 8 hours ago

            And you could assign somebody to do that work, but who wants to be employed as the maintainer of a dead product? It’s a career dead-end.

        • davidcbc 7 hours ago

          > How much of a burden could this really be?

          You know how Google deprecating stuff externally is a (deserved) meme? Things get deprecated internally even more frequently and someone has to migrate to the new thing. It's a huge pain in the ass to keep up with for teams that are fully funded. If something doesn't have a team dedicated to it eventually someone will decide it's no longer worth that burden and shut it down instead.

        • saagarjha 5 hours ago

          I assume the general problem is people using these links for bad purposes and having to deal with needing to moderate them.

        • londons_explore 10 hours ago

          I think the concern is someone might scan all the inactive links and find that some of them link to secret URLs, leak design details about how things are built, link to documents shared with 'anyone with the link' permission, etc.

          • cogman10 9 hours ago

            > I think the concern is someone might scan all the inactive links

            How? Barring a database leak I don't see a way for someone to simply scan all the links. Putting something like Cloudflare in front of the shortener with a rate limit would prevent brute-force scanning. I assume Google semi-competently made the shortener (using a random number generator), which would make it pretty hard to find links in the first place.

            Removing inactive links also doesn't solve this problem. You can still have active links to secret docs.
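
            A rough sketch of the brute-force math, where both the 6-character code length and the 1,000 requests/second rate limit are assumptions for illustration:

```python
# Back-of-the-envelope: can a rate-limited scanner sweep the keyspace?
ALPHABET = 62          # [a-zA-Z0-9], assuming case-sensitive random codes
CODE_LEN = 6           # assumed code length; goo.gl codes varied

keyspace = ALPHABET ** CODE_LEN        # every possible short code
rps = 1_000                            # assumed rate-limited guesses/second
sweep_days = keyspace / rps / 86_400   # time to try every code once

print(f"keyspace: {keyspace:,}")                       # 56,800,235,584
print(f"full sweep at {rps} req/s: {sweep_days:,.0f} days")  # ~657 days
```

            So even at a generous rate limit, exhaustively sweeping a single scanner takes on the order of years, though a botnet spread across many IPs changes that arithmetic considerably.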

            • bapak 3 hours ago

              > You can still have active links to secret docs.

              If they have a (passwordless) URL, they're not secret.

      • rdtsc 8 hours ago

        > I doubt it was a cost-driven decision on the basis of running the servers. My guess would be that it was a security and maintenance burden that nobody wanted.

        Yeah, I can't imagine it being a huge cost saver. But I'm guessing the people who developed it have long since moved on, and it stopped being a cool project. And depending on the culture inside Google, it just doesn't pay career-wise to maintain someone else's project.

      • mort96 8 hours ago

        Documents from 2018 haven't decayed or somehow become irrelevant.

      • rany_ 8 hours ago

        I really doubt it was about security/maintenance burdens. Under the hood, goo.gl just uses Firebase Dynamic Links which is still supported by Google.

        Edit: nevermind, I had no idea Dynamic Links is deprecated and will be shutting down.

        • quesera 7 hours ago

          Firebase Dynamic Links is shutting down at the end of August 2025.

          • rany_ 7 hours ago

            I had no idea. It's too late to delete my comment now.

            It's a really ridiculous decision though. There's not a lot that goes into a link redirection service.

      • dangus 3 hours ago

        I think the problem with URL shorteners like Google’s that includes the company name is that to the layperson there is possibly an implied level of safety.

        Here is a service that basically makes Google $0 and confuses a non-zero number of non-technical users when it sends them to a scam website.

        Also, in the age of OCR on every device they make basically no sense. You can take a picture of a long URL on a piece of paper then just copy and paste the text instantly. The URL shortener no longer serves a discernible purpose.

        • cbarrick 3 hours ago

          Shorter URLs mean fewer characters to encode in a QR code.
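
          For a sense of scale: in byte mode at error-correction level L, QR versions 1 through 4 hold 17, 32, 53, and 78 characters respectively (per the QR spec). A toy lookup from URL length to the smallest version that fits:

```python
# Byte-mode capacity at error-correction level L for the first few QR
# versions; illustrative only, taken from the standard capacity tables.
CAPACITY_L = {1: 17, 2: 32, 3: 53, 4: 78}

def min_qr_version(url: str) -> int:
    """Smallest QR version (from the toy table) that fits the URL."""
    n = len(url.encode("utf-8"))
    for version, cap in sorted(CAPACITY_L.items()):
        if n <= cap:
            return version
    raise ValueError("URL too long for this toy table")

print(min_qr_version("https://goo.gl/abc123"))  # 21 bytes -> version 2 (25x25)
```

          A long campaign URL with query parameters can easily push a code up a version or two, which means more modules and a harder scan at the same physical size.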

      • EGreg 9 hours ago

        How much does it really cost Google to answer some quick HTTP requests and redirect, vs. all their YouTube videos, etc.?

      • resize2996 10 hours ago

        "security and maintenance burden" == "cost" == "cost-driven decision"

        • aspenmayer 10 hours ago

          Capital inputs are one part of the equation. The human cost of mental and contextual overhead cannot be reduced to dollars and cents.

          • mixdup 9 hours ago

            Sure it can. It takes X people Y hours a day/week/month to perform tasks related to this service, including planning and digging up the context behind it. Those X people make Z dollars per year. It's an extremely simple math equation.

            • aspenmayer 5 hours ago

              Emotional labor doesn’t show up on a balance sheet.

    • jerlam 10 hours ago

      Goo.gl didn't have customers, it had users. Customers pay, either with money or their personal data, now or in the future. Goo.gl did not make any money or have a plan to do so in the future.

      • CydeWeys 10 hours ago

        One wonders why they don't, instead of shutting down, display a 15s unskippable YouTube-style interstitial ad prior to redirecting.

        That way they'd make money, they could fund the service instead of shutting it down, and there wouldn't be any linkrot.

        • gloxkiqcza 9 hours ago

          This is such an evil idea.

          • xp84 9 hours ago

            Why is it evil? Assume that a free URL shortener is a good thing and that shutting one down is a bad thing, and note that every link shortener has costs (not just the servers: constant moderation needs, as scammers and worse use them) and no revenue. The only possible outcome is for them all to eventually shut down, causing unrecoverable linkrot.

            Given those options, an ad seems like a trivial annoyance to anyone who very much needs a very old link to work. Anyone who still has the ability to update their pages can always update their links.

          • sincerely 8 hours ago

            This is how every URL shortener on the internet used to work.

      • franga2000 9 hours ago

        The monetary value of the goodwill and mindshare generated by such a free service is hard to calculate, but definitely significant. I wouldn't be surprised if it was more than it costs to run.

        • x0x0 6 hours ago

          And also the ongoing demonstration of why you should never trust Google.

          "Here's a permanent (*) link".

          [*] Definitions of permanent may vary wildly.

      • somat 7 hours ago

        I always figured most of the real value of these URL-hashing services was as a marketing/tracking mechanism. That is, sort of equivalent to the "share with" widgets that conveniently also dump tons of analytics to the services providing them.

        I'll be honest: I was never in an environment that would benefit from link shortening, so I don't really know if any end users actually wanted them (my guess: mainly Twitter), and I always viewed these hashed links with extreme suspicion.

    • thevillagechief 10 hours ago

      One of the complaints about Google is that it's difficult to launch products due to bureaucracy. I'm starting to think that's not a bad thing. If they'd done a careful analysis of the cost of jumping on this URL-shortener bandwagon, we wouldn't be here. Maybe it's not a bad thing that they move slower now.

      • margalabargala 22 minutes ago

        I would bet that the salaries paid to the product managers behind shutting this down, during the time they worked on shutting it down, outweigh the annual cost of running the service by an order of magnitude.

    • observationist 7 hours ago

      At this point, anyone depending on Google for anything deserves to get burned. I don't know how much more clearly they could tell their users that Google has absolutely no respect for users without drone shipping boxes of excrement.

    • miohtama 11 minutes ago

      For all HN commenters: if you are not paying for it, you are not a customer and thus you should not complain.

    • Imustaskforhelp 11 hours ago

      If companies can spend billions on AI, get nothing in return, and be okay with that by way of giving away free stuff (okay, I'll admit it's not completely free since you are the product, but still free),

      then they should also be okay with keeping the goo.gl links, honestly.

      Sounds kinda bad for goodwill, but this is literally Google; the one thing Google is notorious for is killing its products.

      • citizenpaul 11 hours ago

        This is basically modern SV business. This old data is costing us about a million a year to hold onto. KILL IT NOW WITH FIRE.

        Hey, let's also dump 100 billion dollars into this AI thing this year without any business plan or ideas to back it up. HOW FAST CAN YOU ACCEPT MY CHECK!

    • manquer 9 hours ago

      Hard to imagine costs were ever a factor.

      For a company running GCP and giving away things like Colab TPUs for free, the cost of running a URL service would be a trivial rounding error at best.

    • no_wizard 10 hours ago

      Arguably, this is them collecting the wrong types of data to inform decisions, if that isn't represented in the data.

    • j45 10 hours ago

      All while data and visibility are part of the business.

      Like other things spun down, there must not be enough value in the links.

  • charcircuit an hour ago

    One that is operating in an environment where strict privacy laws exist. User data stuck in legacy systems is a liability.

    Not only are things evolving internally within Google, laws are evolving externally and must be followed.

  • troupo 9 hours ago

    > How did they think the links were being used?

    Can't dig the document up right now, but in their Chrome dev process they say something along these lines: "even if a feature is used by only 0.01% of users, at scale that's a lot of users. Don't remove it until you've made sure the impact is negligible."

    At Google scale I'm surprised [1] this is not applied everywhere.

    [1] Well, not that surprised

    • cnst 9 hours ago

      Yup, 0.01% of users at scale is indeed a lot of users.

      This is exactly why many big companies like Amazon, Google and Mozilla still support TLSv1.0, for example, whereas all the fancy websites would return an error unless you're using TLSv1.3 as if their life depends on it.

      In fact, I just checked a few seconds ago with `lynx`, and Google Search even still works on plain old HTTP without the "S", too — no TLS required whatsoever to start with.

      Most people are very surprised by this revelation, and many don't even believe it, because it's difficult to reproduce this with a normal desktop browser, apart from lynx.

      But this also shows just how out of touch Walmart's digital presence really is: somehow they deem themselves important enough to mandate TLSv1.2 and the very latest browsers, unlike all the major ecommerce heavyweights, and they deny service to anyone who doesn't have the latest device with all the latest updates installed, breaking even slightly outdated browsers that do support TLSv1.2.

  • ChrisArchitect 10 hours ago

    So bizarre. Embedded links, docs, social posts, stuff that could be years and years old, and they're expecting recent traffic to them? Why do they seem to think their link shortener is only being used for, like, someone's social-profile linktree? Some marketing person's bizarre view of how the web is being used.

  • cellover 9 hours ago

    tail -f access.log maybe?

neilv 11 hours ago

"Actively used" criteria scrods that critical old document you found, in which someone trusted it was safe to use a Google link.

Not knowing all the details motivating this surprising decision, from the outside, I'd expect this to be an easy "Don't Be Evil" call:

"If we don't want to make new links, we can stop taking them (with advance warning, for any automation clients). But we mustn't throw away this information that was entrusted to us, and must keep it organized/accessible. We're Google. We can do it. Oddly, maybe even with less effort than shutting it down would take."

  • inetknght 11 hours ago

    > someone trusted it was safe to use a Google link.

    That someone made a poor decision to rely on anything made by Google.

    • progval 10 hours ago

      Hindsight is 20/20. Google was considered by geeks to be a very reliable company at some point.

      • wolrah 10 hours ago

        Using a link shortener for any kind of long-term link, no matter who hosts it, has never been a good idea. They're for ephemeral links shared over limited mediums like SMS or where a human would have to manually copy the link from the medium to the browsing device like a TV ad. If you put one in a document intended for digital consumption you've already screwed up.

        • CydeWeys 10 hours ago

          Link shorteners are old enough that more of the URLs they pointed to have likely rotted away than the link shorteners themselves.

          Go look at a decade+ old webpage. So many of the links to specific resources (as in, not just a link to a domain name with no path) simply don't work anymore.

          • babypuncher 9 hours ago

            I think it would be easy for these services to audit their link database and cull any that have had dead endpoints for more than 12 months.

            That would come off as far less user-hostile than this move while still achieving the goal of trimming truly unnecessary bloat from the database. It also doesn't require keeping track of how often a link is followed, which incurs its own small cost.
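
            A sketch of that audit policy (hypothetical, not any real service's code): the HTTP liveness check is injected as a function so the 12-month rule itself can be exercised without a network.

```python
from datetime import datetime, timedelta

GRACE = timedelta(days=365)  # the proposed "dead for more than 12 months"

def audit(links, probe, now):
    """links: dict code -> {"url": str, "dead_since": datetime | None}.
    probe(url) -> bool is the liveness check (an HTTP HEAD in practice).
    Returns (kept, culled) dicts."""
    kept, culled = {}, {}
    for code, rec in links.items():
        if probe(rec["url"]):
            rec["dead_since"] = None       # target is alive (again)
            kept[code] = rec
        else:
            if rec["dead_since"] is None:
                rec["dead_since"] = now    # first observed failure
            if now - rec["dead_since"] > GRACE:
                culled[code] = rec         # dead for over a year: drop
            else:
                kept[code] = rec           # dead, but within the grace period
    return kept, culled
```

            Note the grace period means a target that flaps back online resets its clock, which addresses the objection below that even a 404 target can carry useful information: only persistently dead endpoints are culled.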

            • xp84 9 hours ago

              > cull any that have had dead endpoints

              That actually seems just as bad to me, since the URL often has enough data to figure out what was being pointed to even if the exact URL format of a site has changed or even if a site has gone offline. It might be like:

              kmart dot com / product.aspx?SKU=12345678&search_term=Staplers or /products/swingline-red-stapler-1235467890

              Those URLs would now be dead and kmart itself will soon be fully dead but someone can still understand what was being linked to.

              Even if the URL is 404, it's still possibly useful information for someone looking at some old resource.

              • Wicher 3 hours ago

                Totally. Furthermore one can input that (now broken) URL into the Internet Archive to see if they might have snapshotted that red stapler page.

        • onli 6 hours ago

          We knew that. But it is very useful in documents that will be printed, especially if the original URL is complicated. That is why one would not use a random URL shortener, but Google's. After all, Google would never destroy those URLs, and the company will likely outlive us.

          I'm completely serious, and I have a PhD thesis with such links to back it up. Just in some footnotes, but still.

          Yes, maybe this shows how naive we were/I was. But it definitely also shows how deep Google has fallen, that it had so much trust and completely betrayed it.

        • 0cf8612b2e1e 6 hours ago

          I am constantly annoyed at O’Reilly and similar book vendors which seem to have a policy that all links should go through a shortener.

      • neilv 9 hours ago

        Yeah, when Google was founded, people acted like they were smart and benevolent and forward-thinking Internet techies (it was a type), and they got a lot of support and good hires because of that.

        Then, even as that was eroding, they were still seen as reliable, IIRC.

        The killedbygoogle reputation was more recent. And still I think isn't common knowledge among non-techies.

        And even today, if you ask a techie which companies have certain reliability capabilities, Google would be at the top of some lists (e.g., keeping certain sites running under massive demand, and securing data against attackers).

  • forty 9 hours ago

    "Don't Be Evil" has been deprecated for a while

    • neilv 4 hours ago

      I'm bringing it back.

  • kmeisthax 6 hours ago

    > Oddly, maybe even with less effort than shutting it down would take.

    Google has a number of internal processes that effectively make it impossible to run legacy code without an engineering team just to integrate breaking upstream API changes, of which there are many. Imagine Google as an OS, and every few years you need to upgrade from, say, Google 8 to Google 9, and there's zero API or ABI stability so you have to rewrite every app built on Google. Everyone is on an upgrade treadmill. And you can't decide not to get on that treadmill either because everything built at Google is expected to launch at scale on Google's shitty[0]-ass infrastructure.

    [0] In the same sense that Intel's EDA tools were absolutely fantastic when they made them and are holding the company back now

    • hirsin an hour ago

      The worst case is when this mentality of "just update your code" leaks out to the rest of us. I'm still scarred from some of the samesite shenanigans, breaking useful (not ads) boxed software because they figured everyone on the internet could "just update" their websites within six months of them putting out a dev blog post.

      It's just not an accurate view of how the world works.

  • userbinator 10 hours ago

    "Those who control the past control the present. Those who control the present control the future."

    Look at what happened to their search results over the years and you'll understand.

    • userbinator 3 hours ago

      Ironically, the voting pattern on my comment demonstrates the point exactly.

modeless 12 hours ago

What purpose does "deactivating" anything serve?

  • MajimasEyepatch 10 hours ago

    It may help prevent linkjacking. If an old URL no longer works but the goo.gl link is still available, it's possible that someone could take over the URL and use it for malicious purposes. Consider a scenario like this:

    1. Years ago, Acme Corp sets up an FAQ page and creates a goo.gl link to the FAQ.

    2. Acme goes out of business. They take the website down, but the goo.gl link is still accessible on some old third-party content, like social media posts.

    3. Eventually, the domain registration lapses, and a bad actor takes over the domain.

    4. Someone stumbles across a goo.gl link in a reddit thread from a decade ago and clicks it. Instead of going to Acme, they now go to a malicious site full of malware.

    With the new policy, if enough time has passed without anyone clicking on the link, then Google will deactivate it, and the user in step 4 would now get a 404 from Google instead.

    • dundarious 9 hours ago

      In this little story, what's the difference if the direct ACME URL was used? What does the goo.gl indirection have to do with anything?

      • xp84 9 hours ago

        Goo.gl was a terrible idea in the first place because it lends Google's apparent legitimacy (in the eyes of the average "noob") to unmoderated content that could be malicious. That's probably why they at least stopped allowing new ones to be made. By allowing old ones, they can't rule out the Google brand being used to scam and phish.

        e.g. Imagine SMS or email saying "We've received your request to delete your Google account effective (insert 1 hour's time). To cancel your request, just click here and log into your account: https://goo.gl/ASDFjkl

        This was a very popular strategy for phishing and it's still possible if you can find old links that go to hosts that are NXDOMAIN and unregistered, of which there are no doubt millions.

        • NewJazz 8 hours ago

          Yeah I'm pretty sure this is the main reason google is shutting the service down. They don't want their brand tainted by phishing attempts.

        • dundarious 5 hours ago

          This reputational risk argument makes sense. The post I was replying to seemed to be making a flawed argument about capabilities.

      • mattmaroon 2 hours ago

        Only insofar as Google might wish to prevent it since their brand was on the shortened url you clicked to get there. And people not having malware is surely good for Google indirectly.

        Presumably ACME used the link shortener because they wanted to put the shortened link somewhere, so someone’s going to click things like these. If Google can just delete a lot of it why not?

  • mystifyingpoi 11 hours ago

    It creates a good entry in the promo package for that Google manager. "Successfully conducted cost saving measure, cutting down the spend on the link shortener service by 70%". Of course, hoping that no one will check the actual numbers.

  • maven29 11 hours ago

    A warning shot to guard against an AT&T Bell-style forced divestiture?

    • imchillyb 11 hours ago

      I believe this is the simplest and most succinct answer, given the current anti-monopoly climate among the courts and prosecutors.

  • 42lux 12 hours ago

    Increasing database ops.

  • 18172828286177 12 hours ago

    [flagged]

    • zarzavat 11 hours ago

      Do PMs at Google have so much power that they can shut down a product used by billions of people?

      • afavour 11 hours ago

        They’re not shutting down a product, they’re removing old links.

        I’m not defending it, just that I can absolutely imagine Google PMs making a chart of “$ saved vs clicks” and everyone slapping each other on the back and saying good job well done.

      • deelowe 11 hours ago

        They can write the proposals to do so and if it gets picked up by a VP and approved, then they can cite that on their promo.

      • OutOfHere 11 hours ago

        The product was shut down a long time ago. They're now deleting inactive data of users.

  • Retr0id 12 hours ago

    Presumably, saving disk space on some google servers.

    • dietr1ch 12 hours ago

      More than disk space I think they care about having short links, higher cache hit rates and saving RAM on their fleet.

      • smaudet 11 hours ago

        I find even this incredibly stingy... Back of the envelope:

        10 * 4 * 3,000,000,000 / (1024^3)

        10 four-byte characters times 3 billion links, divided by bytes-per-GiB:

        Roughly 112 GiB of RAM.

        Which is like nothing to a search giant.

        To put that into perspective, my desktop computer's max mobo memory is 128 GB, so saying it has to do with RAM is like saying they needed to shut off a couple of servers and save maybe a thousand dollars.
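
        The same estimate as runnable code, under the comment's assumptions: 3 billion links, 10 characters per code, and a worst-case 4 bytes per character (ASCII codes would be 1 byte each, and this counts only the keys, not the target URLs or metadata).

```python
# Back-of-the-envelope memory estimate for the short-code table alone.
links = 3_000_000_000          # assumed number of issued links
bytes_per_code = 10 * 4        # 10 chars, worst-case 4 bytes each

total_gib = links * bytes_per_code / (1024 ** 3)
print(f"~{total_gib:.0f} GiB for the codes alone")  # ~112 GiB
```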

        This reeks of something else, if not just sheer ineptitude...

        • dietr1ch 11 hours ago

          > Roughly 111 GB of RAM. Which is like nothing to a search giant.

          You are forgetting job replication. A global service can easily have 100s of jobs across 10-20 datacenters. Saving 111 TiB of RAM can probably pay your salary forever; I think I paid mine with fewer savings while there. During covid there was a RAM shortage too, severe enough that there was a call to prefer trading CPU to save RAM, with changes to the rule-of-thumb resource costs.

          • nomel 10 hours ago

            > A global service can easily have 100s of jobs on 10-20 datacenters.

            There's obviously something in between maintaining current latency with 20 datacenters, increasing the latency a bit by reducing hosting to a couple hundred dollars' worth of servers, and setting the latency to infinity, which was the original plan.

            • dietr1ch 10 hours ago

              I'm guessing that they ran out of leeway with small tweaks and found that breaking inactive links was probably a better way out. We don't know the hit rates of what they call inactive nor the real cost it takes to keep them around.

              A service like this is probably on maintenance mode too, so simplifying it to use fewer resources probably makes sense, and I bet the PMs are happy about shorter links, since at some point you are better off not using a link shortener and instead just use a QR code in fear of inconvenience and typos.

      • Retr0id 11 hours ago

        If they really are only purging the inactive ones, this shouldn't impact cache hit rate much.

nsksl 12 hours ago

I don't understand. For you to see the message, you have to click on the link. Your clicking on the link must mean that the link is active, since it is getting clicks. So why is the link being deactivated for being inactive?

  • skybrian 11 hours ago

    > showed no activity in late 2024

    Apparently they measured it once by running a map-reduce or equivalent.

    I don’t see why they couldn’t measure it again. Maybe they don’t want it to be gamed, but why?

  • poyu 11 hours ago

    I interpreted "inactive" as the link that the shortener is linking to is not responding.

    • OutOfHere 11 hours ago

      No. Inactive means that the short URL hasn't been accessed in a while.

  • lathiat 11 hours ago

    If I had to guess, it's possibly something to do with fighting crawlers/bots/etc. triggering the activity detection, and running some kind of more advanced logic to try to ensure a link is really being used. Light-captcha style.

    But just a guess.

xp84 9 hours ago

I am pretty sure the terrible idea of putting the Google brand on something that can so easily be used for phishing is the reason they deprecated it in the first place. They should have used something without obvious branding.

quink 11 hours ago

And for an encore, I guess they'll start tearing out random pages in the books I didn't happen to read last August?

Uehreka 3 hours ago

“COOL URLS DON’T CHANGE! COOL URLS DON’T CHANGE!!” I continue to insist as I slowly shorten and turn into a bit.ly

alpb 7 hours ago

This whole thing has zero cost for Google to run. They could be good citizens and continue to provide this service for free, but they chose not to.

mixdup 9 hours ago

I'm sure there's some level of security implication, but maybe they could also archive the database of redirects with Archive.org, or just release it.

jjice 11 hours ago

I would've imagined that the goodwill (or more likely, the lack of bad will) from _not_ doing this would've been worth the cost, considering I can't imagine this is expensive to run.

yandie 11 hours ago

They probably saved the equivalent of an engineer's salary!!

ChrisArchitect 11 hours ago

Noticed recently on some google properties where there are Share buttons that it's generating https://share.google links now instead of goo.gl.

Is that the same shortening platform running it?

And also does this have something to do with the .gl TLD? Greenland? A redirect to share.google would be fine

  • ewoodrich 8 hours ago

    The key difference is share.google, as you mentioned, is for Google controlled properties whereas goo.gl allowed shortening any arbitrary user provided URL. Which opened up a giant can of worms with Google implicitly lending its brand credibility to any URL used by a scammer, phisher or attacker.

    • charlesabarnes 8 hours ago

      You can generate share.google links on chrome for any arbitrary url.

      • ewoodrich 7 hours ago

        How? I just tried each of the Share options for this thread in the desktop Share menu, and they all used the full URL. Including the QR code which I verified by saving as a PNG and scanning it outside of any Google app. I also haven't found any Share option in the iOS app either that doesn't use the full URL. But harder to test on mobile given the various permutations of sharing between random apps.

beanjuiceII 5 hours ago

i am not even sure what you can trust this company for anymore

alliao 8 hours ago

Oh Google, please get your mojo back. This is correct.

TZubiri 9 hours ago

Next step, deprecate those ridiculous forms.gle links that just train users to ignore domain names.

hk1337 9 hours ago

Yet another Google product on the chopping block. If products were people, they'd have a lot of blood on their hands.

lofaszvanitt 7 hours ago

Jesus, do not rely on Google for anything.

AlienRobot 12 hours ago

I'll never use a URL shortener again.

  • saurik 11 hours ago

    The same reason you did in the first place -- despite a ton of people who saw the future saying you shouldn't -- is the reason why the next generation of people will do it despite you trying to warn them.

  • SoftTalker 11 hours ago

    Any form of URL is at best a point in time reference.

    Shortened or not, they change, disappear, get redirected, all the time. There was once an idea that a URL was (or should be) a permanent reference, but to the extent that was ever true it's long in the past.

    The closest thing we might have to that is an Internet Archive link.

    Otherwise, don't cite URLs. Cite authors, titles, keywords, and dates, and maybe a search engine will turn up the document, if it exists at all.

  • Jabrov 12 hours ago

    Has there ever been one that survived for a really long time?

    • reddalo 11 hours ago

      Three random examples that come to my mind:

      - Tinyurl.com, launched in 2002, currently 23 years old

      - Urly.it, launched in 2009, currently 16 years old

      - Bitly.com, also launched in 2009

      So yes, some services survived a long time.

    • Imustaskforhelp 11 hours ago

      Honestly, that's a great question.

      I think I might be doing a self-plug here, so pardon me, but I'm pretty sure I could create a link shortener that lasts essentially forever. It has to do with crypto (which I don't adore as an investment, I must make that absolutely clear).

      Basically, I have created nanotimestamps, which can embed some data in the Nano blockchain, and that data could theoretically be a link.

      Now, the problem is that the link would be at least either a transaction ID, which is big, or some sort of seed passphrase...

      So no, it's not as easy as some passphrase, but I'm pretty sure Nano isn't going to dissolve; last time I checked it has 60 nodes, and anyone can host a node. And did I mention all of this is completely free? (There are no gas fees in Nano, which is why I picked it.)

      I'm not associated with the Nano team, and it would actually put their system under some strain if we used it this way, but their system allows for it... so why not cheat the system?

      Tldr: I'm pretty sure I can build a decentralized link shortener that can survive a really long time, but the trade-off is that the shortened link might actually become longer than the original link. I can still think of a way to shorten it, though.

      For example, I just realized that Nano has a way to catalogue transactions in time, so it's theoretically possible to catalogue transactions by time; a link is then basically just the nth transaction, and that n could be something like 1000232.

      So test.org/1000232 could lead to something like the YouTube rickroll. It could theoretically work. If literally anybody is interested, I can create a basic prototype, since I'm just so proud that I created some decent "innovation" in a space I'm not even familiar with (I ain't no crypto wizard).
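      The index-based scheme in the comment above can be sketched in a few lines. This is a hedged illustration only: it assumes every stored link is addressable by its position n in some append-only ledger (as with the hypothetical test.org/1000232), and it never touches the Nano network; it only shows how an integer index maps to and from a short base62 slug.

```python
# Hypothetical sketch: map an append-only ledger index to a short slug.
# Nothing here is Nano-specific; the ledger itself is assumed to exist.

ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ"


def encode(n: int) -> str:
    """Encode a non-negative transaction index as a base62 slug."""
    if n == 0:
        return ALPHABET[0]
    digits = []
    while n:
        n, r = divmod(n, 62)
        digits.append(ALPHABET[r])
    return "".join(reversed(digits))


def decode(slug: str) -> int:
    """Recover the transaction index from a slug."""
    n = 0
    for ch in slug:
        n = n * 62 + ALPHABET.index(ch)
    return n


# The millionth-ish ledger entry fits in four characters:
print(encode(1000232))          # 4ccM
print(decode(encode(1000232)))  # 1000232
```

      So instead of test.org/1000232, the gateway could serve test.org/4ccM; any independent gateway that can read the same ledger can resolve the same slug, which is the recoverability argument made downthread.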

      • ameliaquining 11 hours ago

        You can't address the risk that whoever owns the domain will stop renewing it, or otherwise stop making the web gateway available. Best-case scenario is that it becomes possible to find out what URL a shortened link used to point to, for as long as the underlying blockchain lasts, but if a regular user clicks on a link after the web gateway shuts down then they'll get an error message or end up on a domain squatting site, neither of which will provide any information about how to get where they want to go.

        • OutOfHere 7 hours ago

          These days one can register a domain for ten years, and have it auto-renew with prefunded payments that are already sitting in the account. This is what I did for the URL shortener I am developing.

          The same would have to be done for the node running the service, and it too has been prefunded with a sitting balance.

          Granted, there still exist failure modes, and so the bus factor needs to be more than one, but the above setup can in all probability easily ride out a few decades with the original person forgetting about it. In principle, a prefunded LLM with access to appropriate tooling and a headless browser can even be put in charge to address common administrative concerns.

        • Imustaskforhelp 10 hours ago

          I mean, yes, the web gateway can shut down, but with goo.gl, if things go down, there is no way of recovering at all.

          With the system I'm presenting, you could have a website like redirect.com/<some-gibberish>, and even if redirect.com goes down, then yes, that link would stop working. But what redirect.com does under the hood can be done by anybody, so:

          it would be possible for someone to archive redirect.com's main site, which might give instructions pointing to a regularly updated list (on GitHub or some other place) of working web gateways.

          Anybody could then go to archive.org, see what was meant, and try those. Or maybe we could have some sort of slug like redirect.com/block/<random-gibberish>, so that people come to understand "block" to mean this is just a gateway (a better, more niche word would help).

          But still, at the end of the day there would be some way of using that shortened link forever, making it permanent in some sense.

          Imagine someone uses a goo.gl link for some extremely important document, and then it somehow becomes inaccessible. Now... it's just gone?

          I think a way to recover that could really help. But honestly, I'm all in for feedback. Since there are zero fees, I would most likely completely open-source it, and since I'm not involved in this crypto project, I will most likely never earn anything from it even if I do make this. I just hope I could help make the internet a little less like a graveyard of dead links.

      • tqi 8 hours ago

        1) I think this means every link is essentially public? Probably not ideal.

        2) You don't actually want things to be permanent: users will inevitably shorten strings they didn't mean to / want to, so there needs to be a way to scrub them.

      • OutOfHere 11 hours ago

        It's not useful if the resulting URL is too long. It defeats the purpose of a URL shortener. The source URL can just be used then.

        • Imustaskforhelp 10 hours ago

          Yes, I did address that part, but honestly I can use the timestamp of when it was sent to the blockchain, or the transaction ID, which is generally really short, as I said in the comment. I will hack up a prototype tomorrow.

          • OutOfHere 7 hours ago

            It is the long URL that also needs to be stored, not just the short URL.

            If you want to use blockchain for this, I advise properly using a dedicated new blockchain, not spamming the Nano network.

      • wizzwizz4 11 hours ago

        > which can last essentially permanent

        Data stored in a blockchain isn't any more permanent than data stored in a well-seeded SQLite torrent: it's got the same failure modes (including "yes, technically there are a thousand copies… somewhere; but we're unlikely to get hold of one any time in the next 3 years").

        But yes, you have correctly used the primitives to construct a system. (It's hardly your fault people undersell the leakiness of the abstraction.)

        • Imustaskforhelp 10 hours ago

          Honestly, I agree with your point wholeheartedly. I was really into p2p technologies like iroh, etc., and at a fundamental level you are still trusting that someone won't just suddenly leave, so things can still very well go down... even in crypto.

          But compared to an SQLite torrent, the argument for crypto might be that since people's real money is involved (for better or for worse), data stored in the blockchain is much more likely to actually stay permanent. And like I said, I can use those 60 nodes for absolutely free, thanks to zero gas fees, compared to an SQLite torrent.

  • HPsquared 11 hours ago

    Finally a use for blockchain?

    • Imustaskforhelp 11 hours ago

      Oh boy... I think I found the man I can yap to about the idea I got while scrolling through HN: a link shortener on a blockchain with 0 gas fees. Here is the comment, since I don't want to spam the same comment twice. Have a nice day!

      https://news.ycombinator.com/reply?id=44760545

      • notpushkin 3 hours ago

        JFYI: single line breaks don’t work on HN. You can use double line breaks between paragraphs, or full stops between sentences

  • purplecats 12 hours ago

    could host your own

    • rs186 8 hours ago

      If you make it read-only, maybe. If anyone can generate a link, wait for your hosting provider to shout at you asking why there is so much spam/illegal content on your domain. Then you realize you can't actually manage a service like this.

    • Symbiote 11 hours ago

      I set one up at work using https://shlink.io/

      As we already have a PostgreSQL database server, the cost of running this is extremely low, and we avoid the GDPR (etc.) issues of using a third-party site.
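      For anyone weighing the "host your own" option: the core of a shortener really is just two queries against a table. This is a minimal hypothetical sketch (not Shlink, which is a full PHP application) using an in-memory SQLite database for illustration; a setup like the one above would point the same two queries at PostgreSQL instead.

```python
import secrets
import sqlite3

# Lowercase + digits keeps slugs unambiguous when read aloud or typed.
ALPHABET = "abcdefghijklmnopqrstuvwxyz0123456789"


def init(conn: sqlite3.Connection) -> None:
    conn.execute(
        "CREATE TABLE IF NOT EXISTS links ("
        " code TEXT PRIMARY KEY,"
        " url  TEXT NOT NULL)"
    )


def shorten(conn: sqlite3.Connection, url: str, length: int = 6) -> str:
    """Store url under a random code, retrying on the rare collision."""
    while True:
        code = "".join(secrets.choice(ALPHABET) for _ in range(length))
        try:
            with conn:  # commits on success, rolls back on error
                conn.execute("INSERT INTO links VALUES (?, ?)", (code, url))
            return code
        except sqlite3.IntegrityError:
            continue  # code already taken; roll again


def resolve(conn: sqlite3.Connection, code: str):
    """Return the stored URL, or None; the caller issues a 301 or a 404."""
    row = conn.execute(
        "SELECT url FROM links WHERE code = ?", (code,)
    ).fetchone()
    return row[0] if row else None


conn = sqlite3.connect(":memory:")
init(conn)
code = shorten(conn, "https://news.ycombinator.com/")
print(resolve(conn, code))  # https://news.ycombinator.com/
```

      The operational burden rs186 describes elsewhere in the thread (abuse, spam, takedowns) is exactly what this sketch leaves out, which is why read-only or internal-only deployments are the easy case.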

  • OutOfHere 11 hours ago

    The more correct generalization would be to never trust a Google product again with your data.

    Fwiw, I wrote and hosted my own URL shortener, also embeddable in applications.

  • rsync 10 hours ago

    "I'll never use a URL shortener again."

    I don't know if anyone should use a URL shortener or not ... but if you do ...

    "Oh By"[1] will be around in thirty years.

    Links will not be "purged". Users won't be tracked. Ads won't be served.

    [1] https://0x.co

    • echoangle 10 hours ago

      > "Oh By"[1] will be around in thirty years.

      How can you (or I) know that?

    • hoten 3 hours ago

      I'm not sure how to ask this without being rude, so I'll just shoot. Is this example satire? https://0x.co/examples.html

      What normal person would find this glove and manage to return it to its owner? Even if "0x.co" were written there too, I think most people wouldn't understand it to be a URL.

    • PKop 8 hours ago

      Says who? These assertions mean nothing and guarantee nothing.