Over the past week, the internet has witnessed a mass exodus of painters, photographers, and other artists from Instagram. The exodus is a protest: these content creators are upset that Instagram is using their art as training material for AI.
In a world so entrenched in AI, with the proliferation of tools such as ChatGPT, Grammarly, and DeepBlue, can machine learning be put to harmful use? And what does that say about the world we live in today?
Mass exodus from Instagram
Many artists and musicians have stated that the reason for their exodus is that Meta is using their posts to train AI. Tensions have been rising between the two parties over this dispute, with content creators fearing that, at the end of the day, they might lose their jobs to AI.
Under current laws and regulations, no clause prohibits Meta from doing so, as almost everything posted publicly on the internet is considered fair game for AI training.
However, artists feel powerless in their predicament. While they need to rely on Meta's apps to market their work, they are unable to stop Meta from using that work as fodder for AI.
How entrenched are we in the world of AI?
But machine learning hasn't always been bad, has it? Some of the more common AI tools, like ChatGPT, are also built on machine learning: they learn from the prompts given to them and refine themselves to provide better, more comprehensive answers.
Another example is the writing assistant Grammarly, which likewise relies on machine learning, studying the grammar and writing styles of countless authors to perfect its craft as an expert proofreader.
If you are a WhatsApp user, you will have noticed the new Meta AI system the company has introduced into the app. Meta AI lets users generate pictures as well, a capability that depends on machine learning trained at the expense of these artists.
Hero or Zero?
So who is to judge who is right and who is wrong? If you are an avid user of image-generation software such as DALL·E or Midjourney, then you are no less guilty than Meta. Every one of us has a part to play in this. It is hypocritical to sit on the moral high horse and cry foul while reaping the benefits of generative AI behind closed doors.
The real question
Maybe the crux of the issue doesn't lie in what is right or wrong, but rather in respecting the work of these artists. Maybe what these tech companies should be doing is asking artists for permission before using their work for any AI training, so that the artists have the power and autonomy to make the final call.
Needless to say, there should be more laws and regulations to protect these artists' rights and to guard their work against copyright infringement and plagiarism.