• mkwt@lemmy.world · 67 upvotes · 8 months ago

      The usual business mindset on something like this is, “Sure, this is not economical for us, but it’s only for six to nine months while the software guys code up the real software. In the meantime, we’ll collect and maintain market share, and we’ll just swap in the real software when it’s ready.”

    • psmgx@lemmy.world · 15 upvotes · 8 months ago

      Offshoring continues to reap dividends. The only thing that’s going to stop that is AI, and that means no one, here or there, will have a job.

  • usualsuspect191@lemmy.ca · 26 upvotes · 8 months ago

    I swear I remember hearing about a science fiction story where the “self-driving” cars actually just had people hidden in them.

      • Holyhandgrenade@lemmy.world · 5 upvotes · 8 months ago (edited)

        I watched this sketch comedy show in Iceland as a kid, where every week they had a segment called “the men behind the curtains”. It was just people hidden away inside ATMs, vending machines, etc., pretending it was a machine doing the work.

    • Revonult@lemmy.world · 5 upvotes · 8 months ago

      A slow march to 40k servitors isn’t looking far-fetched. Why develop an AI/algorithm when you have no moral conscience and can just make a human do it?

  • EnderMB@lemmy.world · 19 upvotes · 8 months ago

    If it were Amazon, you know that at least 150 out of those 1,000 workers had already been threatened with a PIP before being put on a plan. Look up Focus and Pivot, Amazon’s policy that puts roughly 5-15% of corporate workers below director level on forced attrition each year.

    • dexa_scantron@lemmy.world · 18 upvotes · 8 months ago

      Nah, this was definitely outsourced to a company in India; you can abuse contract/vendor employees with way less effort than it takes to abuse full-time employees.

      • EnderMB@lemmy.world · 4 upvotes · 8 months ago

        Perhaps, although many labelling teams for other orgs at Amazon are in-house or work part-time hours. Amazon likes keeping things in-house because they don’t particularly give a fuck about abusing staff.

        • Krauerking@lemy.lol · 2 upvotes · 8 months ago

          No, I can assure you that even if they work directly in the building, there is no way there isn’t a middleman contract holder that allows the immediate firing of employees without effort. I work in this kind of tech. The goal is for these companies to have the fewest actual employees possible, because that gives the least liability and the fastest route to letting people go.

          Amazon does this with their delivery drivers; what makes you think they aren’t making sure there’s a middleman contract holder here too?

          • Coniferous@sh.itjust.works · 2 upvotes · 8 months ago

            I previously worked on Amazon data labeling. The priority was largely having everything done in-house due to privacy concerns; it’s a lot easier to act on privacy leaks coming from within. That said, the long-term strategy is using crowdsourced labeling for anything not having to do with customers or customer data. So it looks like you’re both right :)

          • EnderMB@lemmy.world · 1 upvote · 8 months ago

            Mostly because I work for Amazon, and can both see the org structure and know how it works in other orgs here.

    • dejected_warp_core@lemmy.world · 3 upvotes · 8 months ago (edited)

      “Focus and Pivot”

      This is also known as “stack ranking” and “rank and yank”.

      It’s a super-gross way to run a business. I can see how you might want to “cut the fat” when starting out or growing, but keeping a policy like that for the long haul means selecting for employees who are good at surviving that process. And that may not require one to be all that productive, just good at working the system.

      Anecdotes: https://news.ycombinator.com/item?id=4195136

      It’s also a recipe for a toxic work environment:

      https://www.cultureamp.com/blog/what-is-stack-ranking

  • jg1i@lemmy.world · 13 upvotes · 8 months ago

    Waymo and Cruise definitely have humans waiting to take over if the autonomous mode gets stuck… so that’s already true!

    • melpomenesclevage@lemm.ee · 6 upvotes · 8 months ago

      It’s not about automation; it’s about making labor fungible, kind of like the garment industry: local workers’ rights happen, and the call center instantly relocates.

      This is the globalization you get while jacking off with the monkey’s paw.

  • Portable4775@lemmy.zip · 13 upvotes · 8 months ago

    How long is it gonna take for them to use the data created by those Indians to train their AI model and replace them?

    • morphballganon@mtgzone.com · 17 upvotes · 8 months ago

      They’ve surely started working on it already. Current “AI” (LLMs) aren’t perfect. They require constant human adjustments.

      I’m an auditor for a “machine learning” algorithm’s work, and it develops new incorrect processes faster than it corrects them. This is because corrections require intervention, which involves a whole chain of humans, whereas learning new mistakes can happen seemingly spontaneously. The premise of machine learning is that it changes over time, but it has no idea which changes were good until it gets feedback.

      So, to answer your question, I’m sure they’re throwing a ton of money at that. But when will it be viable, if ever?
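
      Roughly what that audit loop looks like, as a minimal sketch (every name here is made up for illustration, not any real tooling): new model behaviour gets queued as it appears, and only a human verdict decides whether a change is kept or rolled back.

        # Hypothetical sketch of a human-in-the-loop audit queue; all names are illustrative.
        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class PendingChange:
            description: str  # what the model started doing differently

        @dataclass
        class AuditQueue:
            pending: List[PendingChange] = field(default_factory=list)

            def record(self, change: PendingChange) -> None:
                # New behaviour shows up "spontaneously"; all we can do is queue it for review.
                self.pending.append(change)

            def review(self, index: int, is_correct: bool) -> str:
                # The human verdict is the only signal separating good changes from bad ones.
                change = self.pending.pop(index)
                return f"kept: {change.description}" if is_correct else f"rolled back: {change.description}"

        queue = AuditQueue()
        queue.record(PendingChange("started splitting bundled items into separate purchases"))
        print(queue.review(0, is_correct=False))  # -> rolled back: started splitting ...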

  • RedSeries@lemmy.world · 13 upvotes · 8 months ago

    I remember trying MTurk out back in the day to try to make money on the side. It’s such a mind-numbing activity. Doesn’t surprise me that this is still the model for smart “automated” systems like this.

    • IMongoose@lemmy.world · 18 upvotes · 8 months ago

      I used one of these stores like 5-6 years ago, maybe more. It was basically a pop-up shop, so it was pretty new. It took like 15-20 minutes for me to be charged. I’m positive now that it was some poor Indian watching me, because it took so long.

  • Queen HawlSera@lemm.ee · 12 upvotes · 8 months ago

    The self-driving cars secretly being driven by somebody is literally the plot of Captain Laserhawk.

    • Ashe@lemmy.blahaj.zone · 11 upvotes · 8 months ago

      It’s also already happening. Waymo admitted to needing wayyyy more manual interventions than they let on.

  • pc36@lemmy.ca · 11 upvotes · 8 months ago

    I mean, ask Apple or Google how many people listen to their voice systems to manually improve their accuracy… you have to train the AI somehow.

    • WarlordSdocy@lemmy.world · 7 upvotes · 8 months ago

      Yeah, except with only 27 stores in the US using this tech, if you have 1,000 people reviewing the purchases, is it really a machine learning system, or are you just outsourcing the process to people in another country?

      • pc36@lemmy.ca · 1 upvote · 7 months ago

        You still have to keep training the model. These stores were in large, busy markets, and having people watch and critique the AI is how they continually train the model. It took Apple over 8 years to ‘announce’ they’re doing on-device voice recognition (they probably aren’t), and that was just voice recognition and LLM training, versus image recognition, which is hard on its own. Let alone tracking a person THROUGH a store and recognizing that someone picked something up and took it versus put it back or left it on another row.

        The real reason this probably happened is that those 1,000 people training the model reported failure metrics, on top of the stores showing losses due to error; the margin of error was probably greater than they wanted. Add in the biometric data they had integrated, which brings more layers of cost and privacy protection… it probably just doesn’t return the money they wanted, and they’ll try again in a few years, probably using more RFID on top of the image recognition and people tracking.
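
        For what it’s worth, the review loop being described tends to boil down to something like this minimal sketch (the threshold, class names, and reviewer callback are all assumptions for illustration, not Amazon’s actual system): low-confidence checkouts get escalated to a human, and the human’s corrected receipt is logged as both a failure metric and a training label.

          # Hypothetical sketch: route low-confidence checkouts to a human reviewer
          # and keep the corrections as training data. Names and threshold are made up.
          from dataclasses import dataclass
          from typing import Callable, List, Tuple

          CONFIDENCE_THRESHOLD = 0.90  # assumed cutoff; below this, a human double-checks

          @dataclass
          class CheckoutEvent:
              shopper_id: str
              predicted_items: List[str]
              confidence: float

          def process(event: CheckoutEvent,
                      human_review: Callable[[CheckoutEvent], List[str]],
                      training_log: List[Tuple[List[str], List[str]]]) -> List[str]:
              """Return the final receipt, escalating uncertain events to a reviewer."""
              if event.confidence >= CONFIDENCE_THRESHOLD:
                  return event.predicted_items
              corrected = human_review(event)                          # the human-review step
              training_log.append((event.predicted_items, corrected))  # failure metric + label
              return corrected

          # Example: the model isn't sure whether the chips went back on the shelf.
          log: List[Tuple[List[str], List[str]]] = []
          event = CheckoutEvent("shopper-42", ["soda", "chips"], confidence=0.55)
          print(process(event, lambda e: ["soda"], log))  # -> ['soda']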

  • JakenVeina@lemm.ee · 10 upvotes · 8 months ago

    “You can’t just steer people into bridge abutments”

    So thaaaaaat’s what happened with the Francis Scott Key Bridge.