‘Palestine’ Crisis at Canva: Accusations of Hidden Censorship

Canva apologized after its Magic Layers tool changed the word ‘Palestine’ in users’ designs, and announced that it has resolved the problem.

Popular graphic design platform Canva drew attention over an unexpected text-alteration problem in one of its AI-powered tools. Users noticed that the platform’s new Magic Layers feature automatically replaced the word ‘Palestine’ in designs.

Images shared on the X platform showed that a design reading ‘Cats for Palestine’ was changed by the AI to ‘Cats for Ukraine’. This sparked widespread discussion among users.

Technical error in the Magic Layers feature

A Canva spokesperson stated that the company was aware of the issue with the Magic Layers feature and had quickly investigated and resolved it. The company expressed regret over the situation and apologized to users.

The spokesperson maintained that the problem was isolated and did not affect designs more broadly. However, some users reported being able to reproduce the error in their own projects.

Canva said it has put additional safeguards in place to prevent similar situations in the future, and has launched a comprehensive internal audit to trace the origin of the error.

The Magic Layers tool, launched last month, was designed to separate static images into editable layers. It is not yet clear why the tool altered the text, or what training data led to this result.

Similar discussions about artificial intelligence tools

This incident is not the first example of AI models exhibiting bias on certain topics. Meta’s AI tools on WhatsApp were previously criticized for producing controversial results when asked to generate an image of a Palestinian.

Similarly, in 2023, ChatGPT’s answers to some questions about Palestinians came under scrutiny.

The training processes and neutrality of AI systems remain a live topic in the technology world.

Canva emphasized that it is reviewing its internal testing processes to catch unexpected outcomes before they recur. The company stated that it will take the necessary steps to regain users’ trust.

What do you think about artificial intelligence tools making such text interventions?
