Fake news and ridiculing the dead — what’s wrong with Microsoft’s AI news


    Karlston


    A CNN report illustrates the news algorithm's lowlights.

A new CNN report about the MSN AI model’s news aggregation kicks off with examples of questionable editorial calls, like highlighting a story claiming President Joe Biden dozed off during a moment of silence for Maui wildfire victims (he didn’t), or an obituary that inexplicably referred to an NBA player as “useless.” An editorial staff of humans probably would’ve spotted the problems. But Microsoft’s system, which has felt more like a social experiment than a helpful tool since ditching human editors in favor of algorithms a few years ago, did not.

These stories being picked by MSN’s AI is no better than the travel guide (which Microsoft said was created by its algorithm and reviewed by a human) that suggested Ottawa tourists grab a meal at the local food bank, or the AI-created poll that asked readers to vote on why a young woman died.

    It’s not just Microsoft, of course. AI is creeping into journalism just as it is everywhere else. The BBC is undertaking AI experiments, sites like Macworld use chatbots to query their archive, and The Associated Press has used AI for its “Automated Insights” for over eight years.


Egregious examples from the last year, like error-riddled Star Wars stories and bad financial advice doled out by chatbots, show why AI chatbots shouldn’t be journalists. But at least those stories are generally just SEO plays.

Microsoft Start and MSN are presented as resources for finding actual news. But Microsoft’s automated system keeps featuring or generating content with needlessly upsetting language and outright falsehoods, and there’s little indication anyone involved in the process cares. There are no careless journalists to blame, no editors with names and faces to take (or even shirk) responsibility. It’s all just software doing what it’s made to do, and spokespeople shrugging when it goes wrong and promising to make sure it doesn’t happen again.

    Source



    Recommended Comments

    Human solutions for human problems.

    Technical solutions for technical problems.

What part of “humans need to edit or proofread reading material for other humans” has Nadella not figured out yet?

Or is he playing with his Co-Pilot a little too much?




