World

Microsoft Engineer Says Company’s AI Tool Generates Sexual And Violent Images

March 7, 2024

Mr Jones claims he previously warned Microsoft management but saw no action

A Microsoft AI engineer, Shane Jones, raised concerns in a letter on Wednesday. He alleges the company's AI image generator, Copilot Designer, lacks safeguards against producing inappropriate content, such as violent or sexual imagery. Mr Jones claims he previously warned Microsoft management but saw no action, prompting him to send the letter to the Federal Trade Commission and Microsoft's board.

“Internally, the company is well aware of systemic issues where the product is creating harmful images that could be offensive and inappropriate for consumers,” Mr Jones states in the letter, which he published on LinkedIn. He lists his title as “principal software engineering manager”.

In response to the allegations, a Microsoft spokesperson denied neglecting safety concerns, The Guardian reported. They emphasised the existence of “robust internal reporting channels” for addressing issues related to generative AI tools. As of now, there has been no response from Shane Jones regarding the spokesperson's statement.

The central concern raised in the letter is about Microsoft's Copilot Designer, an image generation tool powered by OpenAI's DALL-E 3 system. It works by creating images based on text prompts.

This incident is part of a broader trend in the generative AI field, which has seen a surge in activity over the past year. Alongside this rapid development, concerns have arisen regarding the potential misuse of AI for spreading disinformation and producing harmful content that promotes misogyny, racism, and violence.

“Using just the prompt ‘car accident’, Copilot Designer generated an image of a woman kneeling in front of the car wearing only underwear,” Jones states in the letter, which included examples of the image generations. “It also generated multiple images of women in lingerie sitting on the hood of a car or walking in front of the car.”

Microsoft countered the accusations by stating that it has dedicated teams specifically tasked with evaluating potential safety concerns within its AI tools. Moreover, it says it has facilitated meetings between Jones and its Office of Responsible AI, suggesting a willingness to address his concerns through internal channels.

“We are committed to addressing any concerns employees have in accordance with our company policies, and appreciate the employee's effort in studying and testing our latest technology to further enhance its safety,” a spokesperson for Microsoft said in a statement to the Guardian.

Last year, Microsoft unveiled Copilot, its “AI companion,” and has extensively promoted it as a groundbreaking way of integrating artificial intelligence tools into both business and creative ventures. Positioned as a user-friendly product for the general public, the company showcased Copilot in a Super Bowl advertisement last month, emphasizing its accessibility with the slogan “Anyone. Anywhere. Any device.” Jones contends that portraying Copilot Designer as universally safe to use is reckless, and that Microsoft is neglecting to disclose widely known risks linked to the tool.
