
China’s DeepSeek AI is watching what you type




China’s DeepSeek, the free artificial intelligence chatbot that’s undercutting American counterparts, has prompted worries about whether it’s safe to use.

While cybersecurity researchers say the app does not immediately appear to be uniquely dangerous, it still carries substantial privacy risks both as an app that follows China’s laws and as an artificial intelligence product that may collect and rearrange everything people tell it.

All large language models, or LLMs — the type of AI-driven advanced chatbot made famous by OpenAI’s ChatGPT — are built by first amassing massive amounts of data, and work in part by collecting what people type into them. DeepSeek, though more efficient than ChatGPT, is no different. 

Under Chinese law, all companies must cooperate with and assist Chinese intelligence efforts, potentially exposing data held by Chinese companies to government surveillance. That differs from the U.S., where American agencies generally need a court order or warrant to access information held by American tech companies.

But it’s possible to use DeepSeek and minimize how much data you send to China. Using the app or the chatbot through deepseek.com requires users to register an account, either with an email address or through a Chinese phone number, which most people outside China don’t have.

Lukasz Olejnik, an independent consultant and a researcher at King’s College London Institute for AI, told NBC News that this means people should be wary of sharing any sensitive or personal data with DeepSeek.

“Be careful about inputting sensitive personal data, financial details, trade secrets, or information about healthcare. Anything you type could be stored, analyzed, or requested by authorities under China’s data laws,” Olejnik said.

Ron Deibert, the director of the University of Toronto’s Citizen Lab, said that means DeepSeek users should be particularly cautious if they have reason to fear Chinese authorities. 

“Users who are high-risk in relation to mainland China, including human rights activists, members of targeted diaspora populations, and journalists should be particularly sensitive to these risks and avoid inputting anything into the system,” Deibert said.

One way to reduce what you send to China is to register DeepSeek with a new email account, not one you already use for other important services. That could keep the app, or potentially Chinese intelligence services, from being able to easily match what you tell DeepSeek with who you are on other parts of the internet.

For the more technologically savvy, it’s possible to download the DeepSeek AI model and ask it questions directly, without the Chinese company processing those requests. That not only prevents China from seeing whatever information you give the model, but it also means little or no censorship of topics that are blocked in Beijing, Olejnik said.
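For readers who want to try this, one common route is to pull the open-weight model from a public model hub and run it with an inference library on their own hardware. The sketch below uses Hugging Face’s transformers library with a small distilled DeepSeek checkpoint; the specific model ID, hardware assumptions, and generation settings are illustrative rather than details from the article.

```python
# Minimal sketch, assuming the open weights are downloaded from Hugging Face
# and that a small distilled checkpoint (the model ID below is an example)
# fits in local memory. Once the weights are on disk, nothing here contacts
# DeepSeek's servers.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"  # example checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

# Build a chat-formatted prompt and run generation entirely on local hardware.
messages = [{"role": "user", "content": "Summarize how large language models are trained."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
reply = tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True)
print(reply)
```

Desktop runner applications that wrap the same open weights accomplish the same thing; the key point is that the prompt stays on the user’s machine.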

DeepSeek has also prompted worries because its privacy policy declares that it collects a large amount of sensitive information from users, including what kind of device they’re using and “keystroke pattern or rhythms.” While some people may find that invasive, the collection is limited to what a person types into the app, not what they type into other apps, and the practice is not unheard of: TikTok and Facebook, for example, have had ways of tracking users’ keystrokes and mouse movements.

Deibert cautioned that while there are risks to giving information to a Chinese LLM, American ones carry risks as well.

“The same risks apply to all AI platforms, including those based in the United States,” Deibert said.

Deibert noted that many U.S. tech companies collect similar sensitive information, and that recently, they’ve worked to court President Donald Trump. “Anyone who is remotely critical of the administration, is a watchdog of the administration, or is part of a vulnerable or at-risk community, should exercise serious caution before using or inputting any data into what are largely ‘black boxes.’ Remember, as with virtually all social media platforms, users’ data is part of the raw material used to train those systems,” he said.


