Friday, November 7, 2025

Security Bite: Beware sketchy ChatGPT clones slipping back into App Store charts


9to5Mac Security Bite is exclusively brought to you by Mosyle, the only Apple Unified Platform. Making Apple devices work-ready and enterprise-safe is all we do. Our unique integrated approach to management and security combines state-of-the-art Apple-specific security solutions for fully automated Hardening & Compliance, Next Generation EDR, AI-powered Zero Trust, and exclusive Privilege Management with the most powerful and modern Apple MDM on the market. The result is a fully automated Apple Unified Platform currently trusted by over 45,000 organizations to make millions of Apple devices work-ready with no effort and at an affordable cost. Request your EXTENDED TRIAL today and understand why Mosyle is everything you need to work with Apple.


Around this time two years ago, OpenAI's wildly popular GPT-4 API was spreading like wildfire all over the App Store. It wasn't long before AI-powered productivity apps, chatbot companions, nutrition trackers, and basically anything else you could think of dominated the charts, garnering millions of downloads. Fast forward to today, and many of those vibe-coded, opportunistic apps have disappeared, partly due to cooling hype but also Apple's tougher stance against knockoffs and misleading apps.

However, this week, security researcher Alex Kleber noticed that one misleading AI chatbot, impersonating OpenAI's branding, managed to reach the top of the Business category. Albeit on the less popular Mac App Store, this is still significant and warrants a PSA to be cautious about sharing personal information with these apps.

The top Business "AI ChatBot" app on macOS appears to impersonate OpenAI's branding, from its logo and name to its design and logic. Investigation shows it's made by the same developer as another nearly identical app. Both share matching names, identical interfaces and screenshots, and even the same support website that leads to a free Google page. They also appear under the same developer account and company address located in Pakistan.

Despite Apple's removal of most OpenAI copycat apps, these two slipped past review and now sit among the top downloads on the U.S. Mac App Store.

It goes without saying that an app's reviews, ranking, and even approval to the store don't necessarily guarantee safety when it comes to data privacy.

Sketchy GPT clone on the U.S. Mac App Store – 9to5Mac

A recent report published by Private Internet Access (PIA) found troubling examples of poor transparency in many of these personal productivity apps. One popular AI assistant that used the ChatGPT API quietly collected far more user data than its App Store description claimed. The listing said it only gathered messages and device IDs to improve functionality and manage accounts. Its privacy policy showed it also collected names, emails, usage stats, and device information, which often ends up being sold to the likes of data brokers or used for nefarious purposes.

Any GPT clone app that collects user inputs tied to real names is a recipe for disaster. Imagine a massive pool of conversations where every message is linked to the person who said it, sitting in a sketchy database run by a shell company with an AI-generated privacy policy that holds no water in the country where they reside. That's happening somewhere right now.

One might assume this is why the App Store has privacy labels. While Apple introduced them to help users understand what data an app collects and how it's used, these labels are self-reported by developers. Apple relies on their honesty. Developers can stretch the truth, and Apple has no system in place to verify it.

I think it's important to keep spreading the word that these apps are still out there, collecting who knows what information and data from unsuspecting users. They undoubtedly pose huge privacy risks. Spread the word!

FTC: We use income earning auto affiliate links. More.
