The quality of Bing Image Creator has noticeably declined. In recent testing by Windows Latest across a range of prompts, the tool no longer produces the high-quality images it once did, and it now restricts certain terms, such as “short black hair,” or any wording that might be associated with race or color.
It seems we can’t enjoy nice things for long. Microsoft says improvements to Copilot and Bing Image Creator (which is powered by DALL-E 3) are on the way, courtesy of new models from its partner, OpenAI. But is that actually happening?
Although Copilot and Bing Image Creator might appear improved on the surface, user reports indicate that output quality has dropped sharply, made worse by increasingly strict content moderation.
For those unfamiliar, Bing Image Creator, accessible at https://www.bing.com/images/create, has long been regarded as one of the leading platforms for AI-generated imagery.
Many users previously considered it superior to Gemini, and some even believed it surpassed OpenAI’s own integration. Bing Image Creator also used to follow user prompts more closely.
Sadly, that responsiveness has diminished. The ongoing censorship in Bing Image Creator is frustrating many creators and pushing some to look for alternatives, as the tool has become nearly unusable for anyone experienced with AI image generation.
Following recent updates, images generated by Bing Image Creator often look less detailed and less vibrant. In the comparison below, the difference between images generated before and after the updates is stark; the latest Bing image comes across as lifeless, comparable to “taxidermized mannequins.”
“They just couldn’t resist tweaking it to create the illusion of an upgrade,” voiced one user.
A Reddit user highlighted, “The differences are glaring; after comparing thousands of portraits, they now seem devoid of life.” Another user remarked, “All images exhibit poorer lighting, diminished detail, and are overall inferior.”
The gallery above illustrates the differences between images created before and after the updates to Bing’s DALL-E 3 model; the older image looks rich and authentic, whereas the newer one appears awkward and cartoonish.
“It feels almost unusable now,” mentioned another user.
The violence-related censorship has also intensified. If you attempt to generate anything that might suggest violence, the AI promptly alters the prompt to include terms associated with kindness and positivity.
Additionally, attempting to create anime-style images with phrases like “short black hair” results in content warnings, compounding the frustration.
There’s also a strange quirk where Bing Image Creator and Copilot generate images of dogs even when the prompt has nothing to do with dogs.
“It feels as though the AI is overly cautious and punishes us with bland results,” commented one user.
“The images look reminiscent of childish drawings. I’d prefer it block the generation altogether than produce mockeries like this,” another user contended.
“It seems like they are actively undermining their own AI,” lamented yet another user.
“The joy is being stripped away,” another user stated. “Why can’t we simply have good things?”
There’s a growing outcry for Microsoft to rethink its approach, with some users urging the company to take the issue directly to OpenAI.