• PmMeFrogMemes@lemmy.world · 21 days ago

    This is what machine learning is supposed to be. Specialized models that solve a specific problem. Refreshing to read about some real AI research

    • Phoenixz@lemmy.ca · 20 days ago

      Yeah, this is exactly the kind of place where AI can actually shine, and we hear almost nothing about it because making fake porn videos of your daughter-in-law is somehow more important

      • Victor@lemmy.world · 21 days ago

        I was about to post a comment: finally, a use for AI that feels worth spending the energy on.

        • SugarCatDestroyer@lemmy.world · 20 days ago (edited)

          Well, it would be logical to say that anonymity is a threat. Plus, it makes it easier to block thought-criminals if they become a threat… :3

          What anonymity? Don’t joke with me here.

    • brendansimms@lemmy.world · 20 days ago

      After reading the actual published science paper referenced in the article, I would downvote the article because the title is clickbaity and does not reflect the conclusions of the paper. The title suggests that AI could replace pathologists, or that pathologists are inept. This is not the case. A better title would be “Pathologists use AI to determine if biopsied tissue samples contain markers for cancerous tissue that is outside the biopsied region.”

  • kalkulat@lemmy.world · 21 days ago

    From the article: " All 232 men in the study were assessed as healthy when their biopsies were examined by pathologists. After less than two-and-a-half years, half of the men in the study had developed aggressive prostate cancer…"

    HALF? I’d suggest staying away from that study… either they don’t know what they’re doing, or some AI made up that article.

    • brendansimms@lemmy.world · 20 days ago

      From the peer-reviewed paper: “This study examined if artificial intelligence (AI) could detect these morphological clues in benign biopsies from men with elevated prostate-specific antigen (PSA) levels to predict subsequent diagnosis of clinically significant PCa within 30 months”… so yes, these were all men who were already at high risk of cancer.

    • Hawk@lemmy.dbzer0.com · 21 days ago

      Maybe they specifically picked men with increased risk?

      Half sounds pretty nuts otherwise.

      • boonhet@sopuli.xyz · 20 days ago

        Yes, this is one of the kinds of AI I love

        The plagiarism machines are the kind most of us can’t stand.

      • Octavio@lemmy.world · 19 days ago

        Yep. My thesis is that a larger share of AI investment and energy should be directed toward more promising areas.

  • GreenKnight23@lemmy.world · 20 days ago

    this is bullshit.

    the study was performed by Navinci Diagnostics, which has a vested interest in the use of technological diagnostic tools like AI.

    the only way to truly identify cancer is through physical examination and tests. instead of wasting resources on AI, we should improve early detection by making those tests more efficient, so patients can test more often and at lower cost.

      • GreenKnight23@lemmy.world · 20 days ago (edited)

        it won’t because it’s an illusion of a test with unverifiable results.

        Imagine this: you want to know if you have cancer. you can get results from a biopsy, blood tests, and an MRI, all validated by specialist review, and it will take 3 months to collect and validate them. OR, you can run all the same tests and have results in 24 hours, but they aren’t validated by a specialist.

        so the real question is: why does it take 3 months, and how can we make it shorter without sacrificing quality, validity, or consistency?

        AI is not the answer.

  • GraniteM@lemmy.world · 21 days ago

    I thought the article was telling an unmarried woman that AI can find the cancer pathologists she’s been looking for. Not sure why they’d be hiding.