• Perroboc@lemmy.world
    link
    fedilink
    English
    arrow-up
    59
    arrow-down
    1
    ·
    1 year ago

    That’s because you’re not engraving the suspect’s name in wooden balls based on the dreams of three people sleeping in some weird hot tubs.

  • flamingo_pinyata@sopuli.xyz
    link
    fedilink
    English
    arrow-up
    34
    ·
    1 year ago

    Less than 1%? Did they forget to flip a boolean condition?
    Like, that’s worse than random; it’s worse than if you intentionally wanted to be wrong.
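
    Something like this, presumably (a totally made-up sketch, not their actual code):

    ```python
    # Made-up illustration of a "forgot to flip the boolean" bug: one
    # inverted comparison turns a decent predictor into one that is
    # reliably wrong.
    def flag_high_risk(risk_score, threshold=0.9):
        # intended: return risk_score >= threshold
        return risk_score < threshold  # oops: flags everything *except* the likely spots
    ```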

    • deranger@lemmy.world
      link
      fedilink
      English
      arrow-up
      11
      arrow-down
      2
      ·
      edit-2
      1 year ago

      How do you figure that’s worse than random? Randomly attempting to predict crimes would likely be 0% accurate. I’m not supporting predictive policing at all, just curious what brought you to that conclusion.

      There are near-infinite failure conditions and only a few success conditions.
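
      Rough back-of-the-envelope with numbers I just made up, to show why a random place-and-time guess sits at effectively zero:

      ```python
      # Invented numbers: a city's worth of locations and time slots vs. the
      # handful of incidents a random guess would have to land on.
      city_blocks = 10_000           # candidate locations
      hours_per_week = 24 * 7        # candidate time slots
      incidents_per_week = 200       # actual incidents there are to "hit"

      cells = city_blocks * hours_per_week   # 1,680,000 place/time combinations
      p_hit = incidents_per_week / cells     # chance a single random guess is right
      print(f"{p_hit:.4%}")                  # ~0.0119% -- effectively zero
      ```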

      • three@lemm.ee
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        5
        ·
        edit-2
        1 year ago

        would likely be 0%

        shut the fuck up lmfao all you gotta do is say the black dude getting out of prison headed to the halfway house is going to rob the cornerstore and you’re at 97-98%

        this dude just asserted 0% like he has a doctorate in predictive policing j*sus chr*st

  • _haha_oh_wow_@sh.itjust.works
    link
    fedilink
    English
    arrow-up
    26
    ·
    edit-2
    1 year ago

    Police are notorious for using bullshit tech to try and justify their “investigations”. Remember Voice Stress Analysis? Total bullshit, but thousands of departments bought into it. There are probably still innocent people in prison because of it.

    • slaacaa@lemmy.world
      link
      fedilink
      English
      arrow-up
      6
      ·
      1 year ago

      Same with bite mark analysis, polygraphs, and (if I remember correctly) blood spatter analysis.

      • _haha_oh_wow_@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        1
        ·
        1 year ago

        Sort of; blood spatter is kinda legit. It’s derived from old tracking techniques, so it’s not totally bullshit (but it’s also not a superpower or anything). You can tell if someone was running and blood was dripping, or if it came from them getting repeatedly hit with something, etc. That’s part of forensics, some of which is legit science (though it’s not perfect, and there are people who are full of shit who hire themselves out as “experts” sometimes).

  • hperrin@lemmy.world
    link
    fedilink
    English
    arrow-up
    23
    arrow-down
    1
    ·
    1 year ago

    I know how they could make it thousands of times more accurate. Just rewrite it to always point at Wall Street.

  • ShaggySnacks@lemmy.myserv.one
    link
    fedilink
    English
    arrow-up
    18
    ·
    1 year ago

    An algorithm needs good data, and I would wager that the police are very good at keeping data that is racist and terrible.
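
    As a made-up example of garbage in, garbage out: a model trained on biased arrest records just hands the bias back.

    ```python
    # Toy sketch: if historical arrests over-represent one district because
    # that's where patrols were sent, a model "trained" on that record keeps
    # sending patrols there, regardless of where crime actually happens.
    from collections import Counter

    past_arrests = ["district_A"] * 90 + ["district_B"] * 10  # biased record, not real crime rates

    def predict_next_patrol(history):
        return Counter(history).most_common(1)[0][0]  # learns only the bias already in the data

    print(predict_next_patrol(past_arrests))  # district_A, reinforcing the original pattern
    ```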

  • profdc9@lemmy.world
    link
    fedilink
    English
    arrow-up
    18
    arrow-down
    3
    ·
    1 year ago

    The police need crimes and criminals to justify their existence. If the criminals are selected by a computer program, that is sufficient for their purposes.

      • DontMakeMoreBabies@kbin.social
        link
        fedilink
        arrow-up
        5
        arrow-down
        1
        ·
        edit-2
        1 year ago

        I mean that’s just not a realistic thing to believe. People aren’t actually unique or special.

        So at some point computational power will meet the right algorithm and suddenly we can model morons.

        Short sprint to predictive policing. And before anyone gets all bent out of shape, go ahead and ask a criminal defense attorney how many of their clients are ‘criminally stupid.’ Based on conversations I’ve had, I imagine the answer is ‘a fuck ton of them.’

        Of course that makes people feel weird so we probably won’t do it even when we can, but that’s not the same thing as ‘it’ll never be possible.’

        • snooggums@kbin.social
          link
          fedilink
          arrow-up
          5
          ·
          1 year ago

          There is no algorithm that will ever predict with 100% certainty that someone will do something, because there are too many factors to account for, including ones that only come up in the moment the opportunity to commit a crime arises. That doesn’t even cover the fact that an algorithm can only predict based on the information it is given, and its calculations rest on assumptions about people drawn from other data.

          At best it will be the technical equivalent of stop and frisk, with racist outcomes based on racist assumptions. Like most forensic stuff, it will just be technology used to justify what people already assume.

          Not to mention that stupid people doing stupid things makes them very unpredictable at the individual level.

          • DontMakeMoreBabies@kbin.social
            link
            fedilink
            arrow-up
            1
            arrow-down
            1
            ·
            1 year ago

            There is a lot wrong with your comment… And what you’re saying makes me suspect you have zero actual experience with the criminal justice system.

            Just one point before I duck out - we put folks in jail for life and even kill them on less than 100% proof because the standard is ‘reasonable doubt.’

        • ඞmir@lemmy.ml
          link
          fedilink
          English
          arrow-up
          3
          ·
          1 year ago

          There is far too much randomness in life to be able to predict everything, unless you can know everyone’s actions at all points in time. Which we seem to not be too far off from…

      • GeekyNerdyNerd@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        1
        ·
        1 year ago

        Highly unlikely that’ll be the case forever. We can already do population-level behavioral prediction for advertising purposes. It’s just a matter of time, quality data generation, and finding the right algorithm before we can accurately predict where and when police resources should be deployed to efficiently deter crime. Especially since we already have a decent idea of the factors that generally lead to spikes in crime rates: things like poverty, widespread social isolation and low social cohesion, alcohol and drug use, perceived opportunity, and the presence of easily victimized populations such as racial minorities, religious minorities, the disabled, and the LGBT+ community.

        Tbh, we don’t even need such an algorithm because we already know that the best ways to reduce crime are to increase protections for those minorities, alleviate poverty, reduce the presence of alcohol selling establishments, provide addiction/mental illness care, promote social cohesion, and have community events where law enforcement builds trust and bonds with their local communities, promoting co-operation and mutual respect between law enforcement and the people they are supposed to protect. In other words, the best ways to combat crime are the exact opposite of what everyone in the USA has generally been doing, especially conservative areas. Predictive policing is only even desirable because we don’t want to do the hard work of actually improving people’s lives and building communities where crime isn’t something people have/want to consider.

        • snooggums@kbin.social
          link
          fedilink
          arrow-up
          2
          arrow-down
          1
          ·
          1 year ago

          At best we can do what we already do and make estimates about groups, kind of like how we can make fairly accurate predictions about the climate while the weather at a specific person’s house is extremely unreliable in any detail more than a couple of days ahead, except for massive weather systems like hurricanes. As you noted, we already know the causes, but trends do not predict which individuals will commit crimes.

          There will be no point in time that an algorithm will be able to predict that an individual will commit a crime at a specific point in time.
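
          A made-up simulation of that climate-vs-weather point: the aggregate count is easy to estimate, but any one individual is basically a coin flip.

          ```python
          # Invented per-person rate, purely to illustrate aggregate vs. individual:
          import random

          random.seed(0)
          rate = 0.05                                       # assumed yearly offence rate
          people = [random.random() < rate for _ in range(100_000)]

          print(sum(people))   # aggregate: lands close to the expected 5,000
          print(people[42])    # individual: effectively a coin flip you can't call in advance
          ```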

          • GeekyNerdyNerd@sh.itjust.works
            link
            fedilink
            English
            arrow-up
            1
            ·
            1 year ago

            As you noted, we already know the causes, but trends do not predict which individuals will commit crimes. There will be no point in time that an algorithm will be able to predict that an individual will commit a crime at a specific point in time.

            I think we might’ve had a bit of miscommunication here. I wasn’t talking about predictive policing at an individual level; that’s highly unlikely to be possible, at least with traditional computing technologies (not to mention that individual-level predictive policing isn’t even desirable, for a multitude of reasons explored by many dystopian fiction authors throughout history). I meant the area level: being able to predict where and when crimes are likely to occur, and with regularity, predicting for example that a specific drug store will probably be robbed within a narrow window of time. Even if such an algorithm were only accurate to within a couple of hours, it would fundamentally change how law enforcement functions, as well as the purpose it serves. Instead of merely enforcing the law after a crime is committed, they could prevent the crime or catch the criminal mid-act without the need for informants, and without even knowing who they are going to arrest prior to catching them.

  • waterbogan@lemmy.world
    link
    fedilink
    English
    arrow-up
    4
    ·
    edit-2
    1 year ago

    How did they manage to do so spectacularly badly? I think part of the problem is that they were trying to predict times and locations rather than focusing on individual offenders. Past record is highly predictive of future behaviour: if an offender has committed assault half a dozen times, it is highly probable that they will commit another assault or a similar violent offence again; we just don’t know when or where. Poor-quality data may also be part of it - garbage in, garbage out.
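
    A crude sketch of what I mean, with completely invented rates: a person-level model at least has signal to work with, even though it says nothing about when or where.

    ```python
    # Hypothetical reoffence rates by number of prior assaults -- invented
    # numbers, only to show that past record carries signal while still
    # saying nothing about time or place.
    reoffence_rate_by_priors = {0: 0.05, 1: 0.20, 3: 0.45, 6: 0.70}

    def assault_risk(prior_assaults):
        # use the closest bracket at or below the offender's count
        brackets = [k for k in sorted(reoffence_rate_by_priors) if k <= prior_assaults]
        return reoffence_rate_by_priors[brackets[-1]] if brackets else 0.0

    print(assault_risk(6))   # 0.7 -- likely to offend again, but no idea when or where
    ```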