Fit & Fabulous

British Fashion Model Agents Association Petitions UK Government to Protect Models from AI Misuse


Written by Intern Queeny George M.H., Team Allycaral

The British Fashion Model Agents Association (BFMA) and London-based agency The Milk Collective have released a petition urging the UK government to take legislative action to protect models from having their likeness exploited by artificial intelligence. The petition, which has already gathered more than 2,000 signatures, comes at a time when concerns are rising about AI-generated imagery threatening the future of real-world modelling careers.

In an era where brands can generate hyper-realistic images from scratch, many models worry that their unique appearances may soon be replicated digitally, without fair compensation, consent, or control. The BFMA, which represents over 5,000 models, stated in their petition that "clients should not be using AI to obtain, manipulate, distribute and potentially own models' data without their consent and without following industry-agreed principles & practices."

Historically, models have operated within clear contractual frameworks that define the terms of image usage, duration, and compensation. But AI disrupts this system, enabling content to be created and distributed without those protections. Brands like H&M have attempted a middle ground, partnering with select models to create digital twins for use in AI-generated campaigns. However, the larger industry lacks any universal safeguards.

Vogue recently faced criticism for running an AI-generated ad for Guess, highlighting the growing tension between creative innovation and ethical concerns. As the adoption of AI accelerates, the fear of replacement continues to grow, prompting many in the fashion industry to call for clearer guidelines and stronger legal frameworks.

The petition marks a significant moment in the dialogue around technology, image rights, and creative labour. It remains to be seen how regulators will respond, but the message from the modelling community is clear: AI must not erase the rights, or the livelihoods, of real people.

TechPulse

Google Ordered to Pay $12,500 for Street View Privacy Breach Involving Naked Man in Yard


July 30, 2025: In a ruling that has reignited global debates on privacy and digital surveillance, Google has been ordered to pay $12,500 in damages after an image on Street View showed a man naked in his own yard, without his knowledge or consent.

The man, whose identity has been withheld for privacy reasons, filed a complaint after discovering the image of himself online. The photo, which was briefly accessible on Google's Street View platform, showed him in a private moment on his residential property. Despite Google's attempts to blur identifying features, the image remained recognizable to some viewers.

A Question of Consent

While Google's Street View service is known for capturing public roads and urban spaces, this case challenges how the platform defines private versus public spaces, especially when high-resolution imagery captures individuals in vulnerable or unintended moments.

Legal experts say the ruling sets a precedent for accountability in digital mapping and highlights the need for stricter privacy protocols in the age of automated image collection.

Google's Response

A spokesperson for Google expressed regret over the incident, stating:

"We take privacy seriously and work to prevent these situations through advanced blurring technologies. We acknowledge the lapse in this case and will ensure the image is permanently removed."

The company also confirmed it is reviewing its Street View processes and its escalation mechanisms for privacy-related flags.

Bigger Picture: Tech and Ethics

This is not the first time Google's mapping platforms have come under scrutiny. Past incidents have raised concerns about facial recognition, license plate visibility, and location-based profiling. While the service remains a powerful tool for navigation and exploration, it is now equally central to discussions around privacy, consent, and digital ethics.

What This Means for You

  • If you appear in a Street View image, you have the right to request that it be blurred or taken down.
  • Private property visibility, even from a public road, can be contested under certain privacy laws.
  • This case may prompt greater user control and regulatory oversight for map-based services.

📖 For more on privacy rights, digital ethics, and tech accountability, visit allycaral.com