  • Microsoft will halt sale of emotion-reading tech and limit access to face recognition tools


    Karlston


    Microsoft has confirmed that it will pull back software that judges a person’s emotional state by processing their image. The company will also restrict access to its facial recognition technology.


    Following in Google’s footsteps, Microsoft is halting the sale of emotion-reading technologies and will limit “unrestricted” access to facial recognition technology. Existing customers will have just one year before losing access to Azure Face, a set of Artificial Intelligence (AI) tools that attempt to infer emotion, gender, age, smile, facial hair, hair, and makeup. Speaking about the development, Sarah Bird, principal group product manager at Microsoft's Azure AI unit, said:


    These efforts raised important questions about privacy, the lack of consensus on a definition of 'emotions,' and the inability to generalize the linkage between facial expression and emotional state across use cases, regions, and demographics.


    Microsoft has reportedly been reviewing whether emotion recognition systems are rooted in science. It is not immediately clear what the company means by this. One possibility is that it has been unable to perfect the algorithms that guess a person’s emotional state from an image. Alternatively, the company could be bolstering its position ahead of new rules and regulations governing the use of such tools.


    [Image: Microsoft Responsible AI]


    Apart from halting the sale of emotion-reading tech, Microsoft is also ending unrestricted access to its facial recognition technologies. The company has indicated that customers must obtain prior approval before using them. Microsoft’s customers presumably already have contractual obligations, but it is not clear whether the company is imposing additional restrictions or merely asking customers to sign a disclaimer absolving Microsoft of any legal penalties arising from misuse.


    For the time being, Microsoft has merely asked its clients “to avoid situations that infringe on privacy or in which the technology might struggle”. One obviously questionable use from a legal standpoint would be identifying minors, yet Microsoft isn’t specifically banning such uses.


    Microsoft is also putting some restrictions on its Custom Neural Voice feature, which lets customers create AI voices based on recordings of real people.

