
China’s DeepSeek AI is watching what you type




China’s DeepSeek, the free artificial intelligence chatbot that’s undercutting American counterparts, has prompted worries about whether it’s safe to use.

While cybersecurity researchers say the app does not immediately appear to be uniquely dangerous, it still carries substantial privacy risks both as an app that follows China’s laws and as an artificial intelligence product that may collect and rearrange everything people tell it.

All large language models, or LLMs — the type of AI-driven advanced chatbot made famous by OpenAI’s ChatGPT — are built by first amassing massive amounts of data, and work in part by collecting what people type into them. DeepSeek, though more efficient than ChatGPT, is no different. 

Under Chinese law, all companies must cooperate with and assist Chinese intelligence efforts, potentially exposing data held by Chinese companies to government surveillance. That system differs from the U.S., where American agencies generally need a court order or warrant to access information held by American tech companies.

But it’s possible to use DeepSeek while minimizing how much data you send to China. Using the app or the chatbot at deepseek.com requires users to register an account, either with an email address or with a Chinese phone number, which most people outside China don’t have.

Lukasz Olejnik, an independent consultant and a researcher at King’s College London Institute for AI, told NBC News that this means people should be wary of sharing any sensitive or personal data with DeepSeek.

“Be careful about inputting sensitive personal data, financial details, trade secrets, or information about healthcare. Anything you type could be stored, analyzed, or requested by authorities under China’s data laws,” Olejnik said.

Ron Deibert, the director of the University of Toronto’s Citizen Lab, said that means DeepSeek users should be particularly cautious if they have reason to fear Chinese authorities. 

“Users who are high-risk in relation to mainland China, including human rights activists, members of targeted diaspora populations, and journalists should be particularly sensitive to these risks and avoid inputting anything into the system,” Deibert said.

One way to reduce what you send to China is to register DeepSeek with a new email account, not one you already use for other important services. That could keep the app, or potentially Chinese intelligence services, from being able to easily match what you tell DeepSeek with who you are on other parts of the internet.

For the more technologically savvy, it’s possible to download the DeepSeek AI model and run it locally, asking it questions directly without having the Chinese company process those requests. That not only prevents China from seeing whatever information you give the model, but it also means little or no censorship of topics that are blocked in Beijing, Olejnik said.
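
For readers who want to try that route, one common approach (only one of several, and not something the article prescribes) is to load openly published DeepSeek weights with an off-the-shelf library such as Hugging Face Transformers. The sketch below is illustrative: the model ID is a hypothetical choice, and larger variants need a capable GPU.

```python
# Minimal sketch: run a DeepSeek model locally so prompts never leave your machine.
# The model ID below is an assumption -- check huggingface.co for the variant you
# actually want; bigger models need substantial GPU memory.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/deepseek-llm-7b-chat"  # hypothetical pick

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# From here on, inference is local; nothing is sent to deepseek.com.
messages = [{"role": "user", "content": "Summarize the privacy trade-offs of cloud chatbots."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Running the model this way trades convenience for control: you supply the hardware, but your questions stay on it.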

DeepSeek has also prompted worries because its privacy policy declares that it collects a large amount of sensitive information from users, including what kind of device they’re using and “keystroke pattern or rhythms.” While some people may find that invasive, it is limited to what a person types into the app and not what they type into other apps, and it is not unheard of: TikTok and Facebook, for example, have had ways of tracking users’ keystrokes and mouse movements.

Deibert cautioned that while there are risks to giving information to a Chinese LLM, American ones carry risks as well.

“The same risks apply to all AI platforms, including those based in the United States,” Deibert said.

Deibert noted that many U.S. tech companies collect similar sensitive information, and that recently, they’ve worked to court President Donald Trump. “Anyone who is remotely critical of the administration, is a watchdog of the administration, or is part of a vulnerable or at-risk community, should exercise serious caution before using or inputting any data into what are largely ‘black boxes.’ Remember, as with virtually all social media platforms, users’ data is part of the raw material used to train those systems,” he said.


