Main Highlights:
- According to news reports, Oracle has begun reviewing TikTok’s algorithms and content moderation mechanisms.
- These reviews began last week, after TikTok’s June announcement that it had routed its US traffic to Oracle servers in response to reports that TikTok employees in China had accessed US user data.
- The new agreement allows Oracle to monitor TikTok’s infrastructure, helping the company assure US legislators that its app is not being controlled by Chinese government agencies.
TikTok may be the most popular short-video platform in the world, but it remains under scrutiny from regulators worldwide because of its links to the Chinese government.
TikTok’s algorithms and content moderation methods are being reviewed by Oracle to ensure they are not being manipulated by Chinese authorities. The reviews started last week, after all of TikTok’s US traffic was routed to Oracle’s infrastructure.
TikTok, which is owned by the Chinese company ByteDance, said in June that it was collaborating with Oracle “to further defend” its app, infrastructure, and the security of US user data.
The move comes as US legislators continue to scrutinise the company’s possible ties to Chinese officials.
What exactly is this audit, and why is TikTok being scrutinised?
Why are TikTok’s algorithms being monitored?
Oracle’s audit, according to the report, is intended to examine TikTok’s ties to the Chinese government. There have been claims in the past that TikTok attempted to restrict or censor videos mentioning “Tiananmen Square” or “Tibetan independence”. The report also says TikTok would suppress videos featuring political speech. It was previously alleged that the platform tried to suppress content from creators deemed unattractive, poor, or disabled.
While TikTok has disputed most of these accusations, it has come under increased scrutiny in the United States, with the Senate Intelligence Committee investigating its possible connections to the Chinese government.
The attention grew when a BuzzFeed investigation in June this year revealed internal records confirming that US user data had been accessed from China on multiple occasions. According to the report, engineers in China had access to US users’ data. TikTok denied the BuzzFeed claims in a formal letter to US Senators.
Former US President Donald Trump vowed to ban TikTok during his presidency because of its “connection” with the Chinese government.
How will this audit benefit TikTok?
TikTok, which is owned by Chinese conglomerate ByteDance, must demonstrate that its software and algorithms are neither controlled nor accessible by the Chinese government. TikTok said in June that it would move all US user data to Oracle’s servers.
TikTok is taking these measures as part of Project Texas, which it disclosed in a letter to US Senators on June 30, 2022.
According to the letter, Project Texas is intended to “help develop confidence with users and key stakeholders,” as well as make progress toward US regulatory compliance.
According to TikTok, Project Texas is intended to guarantee that the corporation “completely” protects “user data and US national security concerns.”
TikTok also stated in the letter that it stores “100% US user data by default in the Oracle cloud environment,” and that it is collaborating with “Oracle on new, improved data security safeguards that we intend to finalise in the near future.”
TikTok expects that regular analysis of its algorithms by Oracle will build confidence that the algorithm is not being manipulated by Chinese authorities. It hopes that, as a result, the app will not be banned or, worse, that ByteDance will not be forced to sell its most popular product to a foreign corporation.
But what exactly are the problems with TikTok’s algorithms?
While TikTok aims to show it is free of ties to the Chinese government, multiple investigations have found the algorithm itself, and how it operates, problematic. Reports say TikTok’s algorithm serves addictive and harmful material. A Wall Street Journal investigation found that the algorithm, which is designed to learn users’ viewing patterns, frequently serves up disturbing content.
The Wall Street Journal created 100 bot accounts for the investigation, and some of them were shown videos that encouraged eating disorders, sexualised minors, or discussed suicide.
The analysis indicated that most of the content users consume is driven by TikTok’s recommendation engine, which suggests what video to watch next. In July this year, the parents of two girls who died sued the company over its algorithm serving harmful content to children. According to The New York Times, the complaint filed in Los Angeles County claims TikTok was aware of its product’s addictive nature yet continued to steer minors toward dangerous content. The girls died last year after taking part in a blackout challenge, in which users are encouraged to hold their breath until they pass out.
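To see why an engagement-driven recommendation loop of the kind described above tends to narrow a feed, here is a minimal, purely hypothetical sketch in Python. It is not TikTok’s actual algorithm and every name in it is invented; it only assumes the general mechanism the reports describe: topics a user watches for longer get weighted more heavily, so the feed quickly concentrates on whatever the user lingers on, whether or not that content is healthy.

```python
import random
from collections import defaultdict

# Purely illustrative toy model, NOT TikTok's real recommender.
# Assumption: the feed weights topics by accumulated watch time.

TOPICS = ["cooking", "sports", "dieting", "comedy", "news"]

def pick_topic(weights):
    """Sample the next recommended topic in proportion to its learned weight."""
    total = sum(weights[t] for t in TOPICS)
    return random.choices(TOPICS, weights=[weights[t] / total for t in TOPICS])[0]

def simulate_feed(lingers_on="dieting", rounds=2000):
    weights = defaultdict(lambda: 1.0)  # start with uniform interest in every topic
    for _ in range(rounds):
        topic = pick_topic(weights)
        # Simulated engagement signal: the user watches one topic much longer.
        watch_time = 1.0 if topic == lingers_on else 0.1
        weights[topic] += watch_time    # reinforce whatever was watched longest
    total = sum(weights[t] for t in TOPICS)
    return {t: round(weights[t] / total, 2) for t in TOPICS}

if __name__ == "__main__":
    # After enough rounds, the "dieting" share dominates the simulated feed.
    print(simulate_feed())
```

The point of the sketch is only that a feedback loop with no notion of content safety amplifies whatever maximises watch time, which is the dynamic both the Wall Street Journal experiment and the lawsuit describe.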
TikTok has attempted to address some of these claims and has published its own transparency reports. According to a blog post, it will provide researchers with “accurate techniques to discover and analyse content and trends or conduct platform testing.” TikTok also intends to give researchers “anonymized data regarding content and behaviour” on the platform before the end of the year, and to make more of its content moderation tools available to them.