May 1, 2024

There are two things I want to share in response to the trending Photolab and similar AI (Artificial Intelligence)-powered image-enhancing tools.
First, it is important to know what you’re signing up for before using free AI tools like Photolab. When you upload your photos, you’re not just getting cool and awesome pictures of yourself – you’re also helping improve the AI product by providing valuable personal data such as your name, email address, user IDs, and so on. There’s nothing inherently wrong with this exchange, as long as you’re aware that those fabulous images come at the cost of giving away the personal data found in your pictures for free. I know it is fun and cool, but just make sure you understand what’s being collected and decide if you’re okay with the trade-off.
As always, the problem arises when users don’t know about, or haven’t agreed to, their data being used to improve the algorithms companies rely on for product development. It’s not automatically wrong to use user data to make a product better, but it becomes a problem if users feel tricked. Just be sure you know what you’re getting into.
Additionally, everyone is capable of due diligence, and doing so helps us develop a critical mindset. For instance, before using an app like Photolab, I find it helpful in my decision-making process to at least know who is behind the app. And lo and behold, you might also find these facts interesting.
Photolab is an app developed by Linerock Investments LTD, a registered company in the British Virgin Islands with development and customer support offices in Russia, Ukraine, and Belarus. The founder, Victor Sazhin, is of Russian origin. While there are rumors about collected data being sent to Russia, Photolab spokesperson Kate Polezhaeva clarified to The Independent that user photos undergo hosting and processing on Amazon AWS and Microsoft Azure servers, both located outside the Russian Federation.
She said the Moscow River address is merely the location of the legal registration by the company’s lawyers. Armed with such information, you are better equipped to decide whether you feel safe using Photolab or not.
Second, let us ask ourselves: What values and norms are we promoting? While Photolab and similar AI tools are fun, their use seems to contribute to a distorted perception of beauty. When you think about it, the more we use these AI-powered image-enhancing tools to portray a person we are not, the more they seem to foster unrealistic standards. In the long run, the use of these tools promotes a culture where individuals may feel pressured to conform to an idealized image. So, while it’s fun to experiment with enhancing images, it is my utmost hope that we approach these AI tools with a critical mindset.
Someone asked me, “Does this also apply to the Remini AI app?” My short answer is yes.
The long version is as follows: Like Photolab, Remini AI is a generative AI tool that relies on big data. To collect the needed data, these apps (like many others) are offered for “free” – but the trade-off is your personal data. As I said, there’s nothing inherently wrong with companies collecting data to make their AI tools smarter and improve their photo-enhancing services. However, it’s crucial that they explicitly inform users about what they are doing, and that users stay aware.
Most importantly, our decision to use photo enhancers like Photolab, Remini AI, and the rest has a much bigger impact. No doubt about that. It prompts us to question what values and norms we want to promote and keep for the next generation to follow. By that, I mean we often express disapproval of vanity or discrimination and advocate for values like authenticity and acceptance.
But in the realm of photo-enhancement apps, our decisions and actions speak louder than words. To end, I advocate for making a conscious choice to actively participate in shaping a culture that values genuine self-expression and acceptance. Just always remember that our true beauty shines through when we embrace who we are, unfiltered.