Be careful with the data you give DeepSeek … and every other AI chatbot


DeepSeek shook the tech world and the financial markets when it hit app stores a few weeks ago, promising the same kinds of high-performance AI models as well-known players like OpenAI and Google at a fraction of the cost.


But some in government and the security community are concerned that the open-source AI assistant's links to China could put American data at risk, drawing comparisons to the social media platform TikTok, which members of Congress voted overwhelmingly to ban last year.

These concerns aren't limited to DeepSeek. They're something everyone who downloads AI chatbot apps onto their phone should keep in mind, regardless of the national security debates happening in legislative halls. We lay out some useful tips below.

A pair of members of the House of Representatives recently announced plans to introduce legislation that would ban the app from all government devices, citing the Chinese Communist Party's ability to access data collected by DeepSeek and other Chinese-owned apps, as well as the potential for DeepSeek to be used to spread Chinese disinformation.

“This is a five-alarm national security fire,” US Rep. Josh Gottheimer, a Democrat from New Jersey, said in a statement.

“We've seen China's playbook before with TikTok, and we cannot allow that to happen again.”

Australia has banned the app on government devices. Some US states have done the same, with Texas being among the first. New York's governor also issued a statewide ban on DeepSeek for state government agencies and devices.

Officials in South Korea went a step further, requesting that DeepSeek be removed from the country's app stores until improvements are made to ensure it complies with South Korea's data protection laws.

DeepSeek's ties to China, along with its wild popularity in the US and the news coverage surrounding it, make for an easy comparison with TikTok, but security experts say the risks go beyond those posed by the social media platform.

Although DeepSeek may be the hot new AI assistant right now, a wave of new AI models and releases is on the horizon, which makes it important to be careful when using any kind of AI software.

Meanwhile, it will be a tough sell to convince the average person to avoid downloading and using DeepSeek, said Dimitri Sirota, CEO of BigID, a cybersecurity company specializing in AI security compliance.

“I think it's tempting, especially for something that's been in the news so much,” he said. “I think to some degree, people need to make sure they operate within a certain set of parameters.”

Why are people worried about DeepSeek?

Like TikTok, DeepSeek has ties to China, and user data is sent to cloud servers there. And as with TikTok, which is owned by China-based ByteDance, DeepSeek is required under Chinese law to hand over user data if the government demands it.

With TikTok, lawmakers on both sides of the aisle worried that American user data could be used by the Chinese Communist Party for intelligence purposes, or that the app itself could be modified to flood American users with Chinese propaganda. Those concerns ultimately helped push through the law passed last year that would ban TikTok unless it's sold to a buyer US officials deem appropriate.

But getting rid of DeepSeek, or any other AI, isn't as simple as banning an app. Unlike TikTok, which companies, governments and individuals can choose to avoid, DeepSeek is something people could end up encountering, and handing information to, without even knowing it.

Sirota said the average consumer may not even know which AI model they're interacting with. Many companies already run more than one kind of AI model, and the “brain,” or the specific AI model running behind a given chatbot, can be swapped for another in the company's collection while the consumer is interacting with it, depending on the tasks that need to be done.

Meanwhile, the hype surrounding AI in general isn't going away anytime soon. More models from other companies are coming, including some that, like DeepSeek, will be open source, and they'll draw the interest of companies and consumers alike.

As a result, focusing on DeepSeek alone removes only some of the data security risks, said Kelcey Morgan, senior manager of product management at Rapid7.

Instead of focusing on whichever model is currently in the spotlight, companies and consumers need to decide how much risk they're willing to tolerate with all kinds of AI, and put practices in place designed to protect their data.

“That's regardless of whatever hot new thing comes out next week,” Morgan said.

Could the Chinese Communist Party use DeepSeek data for intelligence purposes?

Cybersecurity experts say China has enough manpower and processing capability to extract the huge quantities of data collected by DeepSeek, combine it with information from other sources, and potentially build profiles of American users.

“I think we've entered a new era where compute is no longer the constraint,” Sirota said, pointing to the capabilities of companies like Palantir Technologies, which makes software that lets massive amounts of data be crunched for intelligence purposes, and adding that China has the same kinds of capabilities.

Although the people playing with DeepSeek now may be young and relatively unimportant, as with TikTok users, China is happy to play the long game, waiting to see whether any of them grow up to be someone influential and worth targeting.

That's something people in Washington, regardless of political leanings, have become increasingly aware of in recent years, said Andrew Borene, an executive at Flashpoint, the world's largest private provider of threat data and intelligence.

“We know that policymakers are aware, we know the tech community is aware,” he said. “My personal assessment is that I'm not sure the American consumer is necessarily aware of what those risks are, or where that data is going and why it might be a cause for concern.”

Borene stressed that anyone working in government should exercise “the highest levels of caution” if they choose to use DeepSeek, but he also said all users should keep in mind that their data could end up in the hands of Chinese officials.

“That's an important factor to be aware of,” he said. “You shouldn't need to read the privacy policy to figure that out.”

Keep your personal information to yourself. (Getty Images)

How to stay safe while using DeepSeek or other AI models

Given that it can be hard to know which AI model you're actually using, experts say it's best to be careful with all of them.

Here are some tips for doing that.

Be smart with AI, like with everything else. The usual tech best practices apply: use long, complex and unique passwords, always enable two-factor authentication when you can, and keep all your devices and software up to date.
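As a minimal sketch of the "long, complex and unique" password advice above, Python's standard `secrets` module (designed for cryptographic randomness, unlike `random`) can generate such a password. The length and character set here are arbitrary illustrative choices, not a recommendation from the article:

```python
import secrets
import string


def generate_password(length: int = 20) -> str:
    """Return a random password drawn from letters, digits and punctuation.

    Uses secrets.choice, which pulls from the OS's cryptographically
    secure random source, so each password is unique and unguessable.
    """
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))


if __name__ == "__main__":
    print(generate_password())
```

In practice, a password manager does this (and the "unique per site" bookkeeping) for you; the snippet just shows why such passwords are cheap to generate and impractical to guess.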

Keep personal information to yourself. Think before entering personal details about yourself into an AI chatbot. Yes, that covers the obvious things like Social Security numbers and banking information, but also the kinds of details that might not automatically raise red flags, like your address, where you work, or your friends' and coworkers' names.

Be skeptical. Just as you'd be wary of requests for information that arrive as emails, texts or social media posts, you should be wary of questions from an AI, too. Think of it like a first date, Sirota said. If the model asks oddly personal questions the first time you use it, walk away.

Don't rush to be an early adopter. Just because an AI or an app is new doesn't mean you have to have it right away. Decide for yourself how much risk you're willing to take on with software that has just hit the market.

Read the terms and conditions. Yes, it's a lot to wade through, but with any app or program, you really should read them before you start handing over your data. Borene said those terms can also offer insight into whether the AI or app collects and shares data from other parts of your device. If it does, turn those permissions off.

Be aware of America's adversaries. Borene said any app headquartered in China should be treated with suspicion, as should apps from other adversarial or unfriendly countries like Russia, Iran or North Korea. The privacy rights you may enjoy in places like the US or the European Union don't apply with those apps, regardless of what their terms and conditions say.


