Humiliated as a "prostitute" and a "slut": who can excise the malignant tumor of cyber violence?

20:04, October 8, 2022 Caijing New Media

By Fan Shuo

Editor/Guo Liqin

In 2014, Monica Lewinsky, the former White House intern who had vanished from public view for many years, returned to the public eye. Lewinsky was involved in the 1998 sex scandal with then US President Bill Clinton, which ultimately led to Clinton's impeachment by Congress.

In speeches and signed articles, Lewinsky said she was probably the first person to suffer cyberbullying on a global scale. During the scandal she was branded a "prostitute", a "slut" and a "stupid woman". Lewinsky believed that in the viral spread brought by the rise of social media after the event, she "suffered more devastating humiliation".

Today, people far more innocent than Lewinsky are suffering similar pain.

Internationally, UNICEF defines bullying carried out with digital technology as "cyberbullying": repeated behavior on social media, instant-messaging platforms, gaming platforms and the like, aimed at intimidating, angering or shaming others.

In China, this kind of cyberbullying is commonly called "cyber violence". Although laws and regulations have not yet clearly defined its concept and scope, cyber violence is generally understood as an extension of real-world social violence into the online world, characterized by defamation, slander, infringement of reputation, damage to rights and interests, and incitement.

Zhu Wei, deputy director of the Communication Law Research Center at China University of Political Science and Law, told Caijing E-Law that, compared with the early Internet, cyber violence today is often spread by organized groups online and offline through "copied, matrixed and homogenized content". In individual cases, there may even be "stigmatization of a person or enterprise lasting many years".

In recent years, with cases that aroused intense public concern, such as the Hangzhou woman defamed while picking up a package and the suicide of Liu Xuezhou after cyber violence, cyber violence has become an Internet-ecology problem that cannot be ignored, and the importance of platform governance has become increasingly prominent.

Caijing E-Law spoke with victims of cyber violence, lawyers, social-platform content reviewers, and scholars involved in drafting relevant rules, trying to analyze the real reasons why cyber violence is so hard to stop and, on that basis, to explore possible solutions.

In the course of this investigation, Caijing E-Law found that the rights-protection dilemma facing victims of cyber violence is a composite problem involving platform technology upgrades, regulation of platform profit models, and breakthroughs in laws and regulations. Breaking the deadlock will require the joint efforts of all parties.

   Who are the perpetrators of cyber violence?

On August 17, 2022, a lawsuit related to cyber violence was heard at the Beijing Internet Court, with Beijing Microlive Vision Technology Co., Ltd., the operator of Douyin, as the defendant.

The plaintiff, Zhang Ting, is a Douyin blogger with tens of millions of followers. After unexpectedly becoming popular, she went through a year-long storm of online abuse. At the end of 2020, Zhang Ting began to share her story of leaving the countryside, completing her studies through work-study programs and student loans, and finally earning a doctorate. Within about a month, her account's followers grew from 2 million to 10 million.

At the same time, some self-media accounts on Douyin began to accuse Zhang Ting of academic fraud and of having capital operations behind her popularity. One account used livestreams and short videos to attack her for a year and a half. In early 2021, after learning that Zhang Ting had joined a university in Xi'an, the operator of that account reported her to the Shaanxi Provincial Department of Education, the Education Working Committee and other departments on grounds such as "abusing netizens" and "epidemic-related fraud". Zhang Ting ended up submitting nearly 30 written statements to the petition authorities.

In August 2021, Zhang Ting was diagnosed with anxiety, insomnia and depression at Peking University International Hospital. On August 20, she reported the account's operator to the police in Tongzhou District, Beijing, where the operator resides, on grounds of defamation. The police found that the case met the criteria for filing and issued a case-filing notice on October 17 of the same year.

Zhang Ting is just one of many victims of cyber violence. What makes her case different is that she recognized the important role the platform plays.

Zhang Ting then filed a lawsuit against the platform, demanding that Douyin disclose the real identity information of the account in question and immediately delete the infringing information posted by the account and its associated accounts.

   "Diaoyin has fully disclosed the identity information of the infringer." Zhang Ting told the Finance and Economics Law E, and she later withdrew the lawsuit against Diaoyin, while Infringement account The court has filed a lawsuit.

Zhou Zhaocheng, Zhang Ting's attorney, said that in cyber-violence litigation, because it is extremely difficult for an individual to obtain another person's identity information in the online world, suing the platform is the most lawful, direct and fastest way to obtain the infringer's basic identity information.

Zhou Zhaocheng believes that once notified, network service providers are legally required to take necessary measures against infringing users, such as deleting content, blocking it and disconnecting links; otherwise they may, to a certain extent, share tort liability with the infringing users. As platforms that collect and aggregate big data, they are also obliged to disclose the real identity information of infringing users to the rights holder in a timely manner.

On August 23, 2022, Sheng Ronghua, deputy director of the Office of the Central Cyberspace Affairs Commission and of the Cyberspace Administration of China, reported the results of the "Qinglang" special campaign against cyber violence at a State Council press conference. In the campaign, key website platforms intercepted more than 65.41 million pieces of information involving attacks, abuse, rumors and slander, and handled 78,000 accounts that had violated the rules.

The special campaign, launched in April this year, focuses on 18 online platforms where cyber violence is frequent and social influence is large, including Sina Weibo, Douyin, Baidu Tieba and Zhihu, and requires these platforms to carry out full-chain governance by establishing and improving measures such as monitoring and identification, real-time protection, intervention and disposal, tracing and accountability, and publicity and exposure.

   The victims' dilemma in producing evidence

Obtaining the infringer's identity information is only the first step of litigation. Because cyber violence is large in scale and the victim's burden of proof is costly, the parties and their lawyers often face multiple difficulties, and few victims take the initiative to defend their rights and interests through legal means.

Searching China Judgments Online with the keyword "cyber violence" and reading 53 relevant judgments, Caijing E-Law found that from 2013 to 2022 only 17 of them were cases directly confronting cyber violence.

At present, most online speech is anonymous, and cyber violence typically pits many against one. Zhu Zheng, director of Beijing Jingshi (Hefei) Law Firm, told Caijing E-Law that in the face of cyber violence it is difficult for victims to collect and preserve evidence completely on their own. In addition, the victim's cost of proof is high: collecting evidence takes a great deal of time and money, yet full compensation is rarely obtained in the final judgment. "This is a real dilemma," Zhu Zheng said.

A typical example is the Hangzhou case in which a woman was defamed while picking up a package.

In July 2020, Ms. Gu of Hangzhou was secretly filmed by Mr. Lang, the owner of a nearby convenience store, while collecting a parcel at the neighborhood pickup point. Lang and a Mr. He then fabricated a chat script about a "rich woman having an affair with the delivery courier" and posted it to a WeChat group. The secretly filmed video and the fabricated WeChat chat screenshots were later combined and forwarded by others, spreading to multiple WeChat groups, WeChat public accounts and other platforms. As the matter fermented online, Ms. Gu was abused, pushed out of her job, and diagnosed with depression. She said frankly that she had experienced "social death".

On October 26, 2020, Ms. Gu filed a private criminal prosecution with the People's Court of Yuhang District, Hangzhou, demanding that Lang and He be held criminally liable for defamation, on the grounds that they had fabricated facts and defamed her online and that the circumstances were serious.

Since Chinese law has no clear definition of cyber violence, such cases generally fall into the civil category; only the most serious defamation and insult connected with cyber violence can be handled as private criminal prosecutions.

Zheng Jingjing, the lawyer who represented the case, said frankly that the threshold for filing such cases is high, and that no judgment can be rendered in the defendant's absence. "If the defendant's whereabouts are unknown at the filing stage, the court may persuade the private prosecutor to withdraw the case; if the prosecutor refuses, the court can rule not to accept it," Zheng Jingjing said.

Fortunately, with the procuratorate's intervention, the case was eventually converted from a private criminal prosecution into a public prosecution, and in February 2022 it was included in the 34th batch of guiding cases issued by the Supreme People's Procuratorate.

Zheng Jingjing told Caijing E-Law that the case could be transferred to public prosecution because of its particular circumstances. Ms. Gu located the suspects through the chat records and the secretly filmed material; when she confronted them, they admitted fabricating the rumor. She immediately reported to the police, and the Yuhang District police imposed administrative penalties on the two suspects. Their identity information and the infringing acts were thus fixed through the administrative punishment, which was very helpful for the litigation.

On April 30, 2021, the People's Court of Yuhang District, Hangzhou, handed down its judgment: both defendants were found guilty of defamation and sentenced to one year's imprisonment, suspended for two years.

The case also highlights how hard it is to obtain and preserve evidence in this type of litigation. Jia Yu, chief procurator of the Zhejiang Provincial People's Procuratorate, said in a media interview that because a large amount of evidence exists only online, it is very difficult to gather evidence and defend one's rights through personal effort alone. Jia Yu revealed that in the Hangzhou parcel-pickup defamation case, the procuratorate spent nearly a month collecting evidence, which ultimately filled 18 volumes of case files and 76 discs.

Zhang Ting told Caijing E-Law that, to prepare for her lawsuit, she collected evidence only against the defendant and two of his followers; this ultimately produced five sets of notarized evidence at a notarization cost of nearly 100,000 yuan.

Zheng Jingjing said that "the rumor of Hangzhou women's taking express delivery was spread and fermented in the WeChat group at the beginning. In this case, once they were kicked out of the group by the group leader or were not spread in the WeChat group, it would be difficult for the victims to obtain evidence. In addition, according to the Interpretation of the Supreme People's Court on Several Issues Concerning the Application of Law in Handling Criminal Cases such as Defamation by Using Information Networks, only when the same defamation information is actually clicked and browsed more than 5000 times, or forwarded more than 500 times, can it be recognized as "serious circumstances" of fabricating facts to defame others as stipulated in the first paragraph of Article 246 of the Criminal Law And meet the criminal filing standards. In WeChat group, it is difficult to determine the forwarding times and browsing times of video and text messages.

At the time, Zheng Jingjing used the readership and click counts of articles reposting the rumor on public accounts as evidence. She has since taken on cases in which rumors spread only within WeChat groups. "I can only pursue the other party's tort liability through civil litigation, because it is hard to find evidence that meets the criminal filing standard," Zheng Jingjing said.
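
The numeric threshold Zheng Jingjing cites can be expressed as a simple check. The sketch below only illustrates the counting rule in the judicial interpretation; the function and field names are invented, and in practice the determination involves far more than tallying views and forwards.

```python
# Minimal sketch of the "serious circumstances" threshold cited above:
# the same defamatory item is clicked/viewed more than 5,000 times,
# or forwarded more than 500 times. Names here are hypothetical.

def meets_criminal_filing_threshold(views: int, forwards: int) -> bool:
    """Return True if the view/forward counts cross the cited thresholds."""
    return views > 5000 or forwards > 500

# Example: a rumor post viewed 4,200 times but forwarded 620 times
print(meets_criminal_filing_threshold(views=4200, forwards=620))  # True
```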

On the eve of the 2022 Spring Festival, Liu Xuezhou, a teenage boy from Hebei who had been searching for his birth parents, committed suicide in Sanya, Hainan, after suffering cyber violence.

Zhou Zhaocheng also represents the Liu Xuezhou cyber-violence suicide case. As he recalled, on the day of Liu Xuezhou's death, many of the self-media accounts that had attacked him online deleted the relevant videos in an attempt to destroy evidence, which created considerable obstacles for the lawyers' evidence collection. "So we still hope to communicate with the police and other judicial authorities, and hope the judiciary will intervene forcefully," Zhou Zhaocheng said.

Zhu Zheng believes that fundamentally solving the evidentiary difficulties in cyber-violence cases requires, on the one hand, putting the system first: further developing and refining judicial interpretations and rules on proof in cyber-violence cases, giving victims clear guidance on evidence collection so they do not collect evidence blindly or ineffectively, and making clear at the level of law or judicial interpretation that the victim's cost of obtaining evidence should be borne by the perpetrator. On the other hand, platforms' "one-click forensics", blockchain evidence storage and similar technologies should be fully used to reduce the cost of collecting and preserving evidence and to protect victims' rights.

The "one click forensics" function of the platform undoubtedly provides convenience for the parties to obtain evidence, but the technical implementation effect of "one click forensics" and its legal effect in practice still need to be verified.

"Compared with mature electronic-evidence notarization and blockchain evidence-storage technology, the 'one-click forensics' function is still a novelty," Zhu Zheng said.

Bilibili launched a "one-click forensics" function in July 2022, but Caijing E-Law could not find the corresponding entry in the Bilibili client. Bilibili's customer service said that users who need "one-click forensics" can leave a message directly with customer service. As for the technology behind the function, customer service gave no clear explanation, saying only that all information on the platform is recorded and that deletion of infringing content does not affect staff review and verification.

Shen He, a risk-control and security operations staffer at a social media platform, told Caijing E-Law that his platform is still developing its "one-click forensics" function, hoping to solve both the difficulty of evidence collection and the secondary harm users may suffer while collecting it; the function has not yet gone live.

According to Zheng Jingjing, collecting evidence of cyber violence is not just a matter of taking screenshots: evidence must satisfy the requirements of authenticity, legality and relevance. Once the publisher deletes the original post or modifies its content, the evidentiary value of earlier screenshots becomes questionable.

Drawing on her own case experience, Zheng Jingjing therefore suggests that victims of cyber violence preserve evidence through a notary office or a third-party evidence-storage platform. Besides traditional notarization, most Internet evidence-storage platforms can also help victims fix evidence. "For example, the electronic-evidence platform of the Hangzhou Internet Court uses blockchain technology for evidence storage, and the validity of this technology has been recognized by virtually all Internet courts in practice," Zheng Jingjing said. "Other courts may also follow similar practices."
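
The general idea behind such evidence-fixing services can be illustrated with a short sketch: take a cryptographic fingerprint of the captured material at the moment of collection and record it with a timestamp, so that later tampering can be detected. This is only a schematic of hash-based fixing under assumed file names; it is not the actual interface of the Hangzhou Internet Court platform, a notary office, or any blockchain service.

```python
# Schematic illustration of hash-based evidence "fixing": a SHA-256
# fingerprint plus a capture timestamp lets anyone later verify that
# the preserved file has not been altered. This is NOT the real API of
# any court or notary platform; it only shows the underlying idea.
import hashlib
import json
from datetime import datetime, timezone

def fix_evidence(path: str) -> dict:
    """Fingerprint a captured file (screenshot, page archive, video)."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return {
        "file": path,
        "sha256": digest,
        "captured_at": datetime.now(timezone.utc).isoformat(),
    }

def verify_evidence(path: str, record: dict) -> bool:
    """Check that the file still matches the fingerprint taken earlier."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest() == record["sha256"]

if __name__ == "__main__":
    # Demo with a temporary file standing in for a captured screenshot.
    import os
    import tempfile
    with tempfile.NamedTemporaryFile(delete=False, suffix=".png") as tmp:
        tmp.write(b"stand-in for captured evidence")
        path = tmp.name
    record = fix_evidence(path)
    print(json.dumps(record, indent=2))
    print("unchanged:", verify_evidence(path, record))
    os.remove(path)
```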

   What efforts have the platforms made?

Since 2022, a number of platforms have rolled out measures against cyber violence. What technologies and operational processes lie behind them, and to what extent can they solve the problem?

Wu Danjun, a partner at Beijing Guantao Zhongmao (Shanghai) Law Firm and a legal expert with the Shenzhen Big Data Research and Application Association, told Caijing E-Law that the law imposes requirements on the management of illegal information such as cyber violence and grants platforms certain governance powers. For example, the Provisions on the Administration of Mobile Internet Application Information Services stipulate that platforms may take measures such as warnings, restricting functions, suspending updates and closing accounts against users who publish illegal information through the platform.

Based on Caijing E-Law's interviews with victims of cyber violence and with platform risk-control staff, platforms are indeed improving their governance measures.

Zhang Ting recalled that during the controversy over her education at the end of 2020, she received a flood of defamatory and abusive comments and private messages. At the time, Douyin's prompts for closing comments and private messages were not well developed, and it did not occur to her to turn them off. But in July 2021, when the aforementioned account accused her of being "pro-Japan", her Douyin inbox was again flooded with insulting and defamatory messages; this time a notification popped up automatically, reminding her that she could close private messages and comments.

According to Caijing E-Law, since March 2022 Weibo, Douyin and Bilibili have successively launched functions such as "one-click protection", "one-click anti-violence" and "one-click forensics", while platforms such as Kuaishou, Baidu Tieba, Zhihu and Xiaohongshu allow users to set private-message and comment permissions themselves, so that when necessary they can promptly close private messages and comments to avoid the harm of cyber violence.

Zhu Wei believes that relying only on after-the-fact relief is not a good policy: platforms should issue early warnings while cyber violence is still fermenting, and quickly cut off the transmission chain once it is under way.

In fact, some platforms are already exploring mechanisms for detecting cyber-violence risks in advance.

According to Shen He, his platform uses technical means to collect raw corpus from videos, then uses automated models to classify the corpus and extract data models. Based on the extracted corpus features, the security team filters and flags the material and enters it into a risk database. Most of this data comes from automated recall, but some of the material the team focuses on comes from manual recall channels, including manual reports, user feedback and the security team's own discoveries.

Once a risk is identified, a dedicated team assesses the data against internal review rules and assigns priorities and types according to the different risks. At this stage, not every cyber-violence risk is blocked, and neither individual posts nor social relations are blocked.

[Chart: the platform's cyber-violence risk discovery process]
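
Based on Shen He's description, the workflow can be sketched roughly as follows: automated recall (a model scoring extracted corpus features) is merged with manual-recall channels, the results go into a risk database, and a dedicated team triages them by priority and type before any blocking decision. All names, thresholds and the toy classifier below are hypothetical; this is a sketch of the described process, not the platform's actual system.

```python
# Rough sketch of the risk-discovery workflow described above:
# automated recall + manual recall channels feed a risk database,
# where a dedicated team triages items by priority and type.
# Names, thresholds and the toy classifier are all hypothetical.
from dataclasses import dataclass

@dataclass
class RiskItem:
    content_id: str
    text: str
    source: str              # "auto_recall", "user_report", ...
    score: float = 0.0       # model confidence that this is abusive
    priority: str = "low"    # assigned during triage
    risk_type: str = "none"  # e.g. "insult", "rumor", "doxxing"

ABUSIVE_TERMS = {"slut", "trash", "die"}   # stand-in for the real lexicon

def auto_recall(content_id: str, text: str) -> RiskItem | None:
    """Toy automated classifier: flag text containing lexicon hits."""
    hits = [t for t in ABUSIVE_TERMS if t in text.lower()]
    if not hits:
        return None
    return RiskItem(content_id, text, source="auto_recall",
                    score=min(1.0, 0.4 * len(hits)), risk_type="insult")

def triage(item: RiskItem) -> RiskItem:
    """Dedicated-team step: assign priority; nothing is blocked yet."""
    item.priority = ("high" if item.score >= 0.8 or item.source == "user_report"
                     else "medium")
    return item

risk_database: list[RiskItem] = []

# Automated recall over newly published comments
for cid, text in [("c1", "great video!"), ("c2", "you slut, just die")]:
    flagged = auto_recall(cid, text)
    if flagged:
        risk_database.append(triage(flagged))

# Manual recall: a user report enters the same database
risk_database.append(triage(RiskItem("c3", "she faked her degree",
                                     source="user_report",
                                     score=0.5, risk_type="rumor")))

for item in risk_database:
    print(item.content_id, item.priority, item.risk_type, item.source)
```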

Shen He gave an example: if an event is marked high-risk, one of the speakers is a "big V" whose content may attract online abuse, and 20% of that account's followers are users with a tendency toward cyber violence, the platform will judge the account a high-risk target of cyber violence. The platform will then apply some appropriate shielding of the big V's social relations, such as closing comments, and will segment the 20% of violence-prone users so that the relevant content is not shown to them. "This strategy can be issued by the platform, or the victim of the attacks can actively trigger it through our 'one-click anti-violence' function," Shen He said.
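
Shen He's example amounts to a simple decision rule: a high-risk event, a large account likely to draw abuse, and a sizeable share of violence-prone followers (the 20% figure he cites) together mark the account as a high-risk target, after which comments are closed and the related content is hidden from the flagged follower segment. The sketch below illustrates that rule only; the names and the follower threshold are assumptions made for illustration.

```python
# Illustration of the decision rule in Shen He's example: a big-V account
# in a high-risk event, with a large share of violence-prone followers,
# is treated as a high-risk target; comments are closed and the related
# content is hidden from the flagged follower segment. All names and
# thresholds here are invented for this example only.
from dataclasses import dataclass

@dataclass
class Account:
    user_id: str
    followers: int
    violence_prone_ratio: float   # share of followers flagged as violence-prone

def is_high_risk_target(event_level: str, acct: Account,
                        big_v_threshold: int = 1_000_000,
                        prone_ratio_threshold: float = 0.20) -> bool:
    return (event_level == "high"
            and acct.followers >= big_v_threshold
            and acct.violence_prone_ratio >= prone_ratio_threshold)

def apply_protection(acct: Account) -> dict:
    """Protective strategy: issued by the platform, or triggered by the
    victim via a 'one-click anti-violence' style switch."""
    return {
        "close_comments": True,
        "restrict_private_messages": True,
        "hide_related_content_from_flagged_followers": True,
        "notify_user": f"Protection enabled for {acct.user_id}",
    }

blogger = Account(user_id="big_v_001", followers=10_000_000,
                  violence_prone_ratio=0.20)
if is_high_risk_target("high", blogger):
    print(apply_protection(blogger))
```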

   Platform governance still has limitations

However, existing platform governance measures also have their limitations.

On early warning, Shen He told Caijing E-Law that for content users have already published (both content created by authors and comments posted within ordinary social relations), the platform automatically recalls data and identifies risks, and the risks found in the retrieved data are then manually checked. Because it may affect the user experience, "we cannot directly apply measures such as blocking to risks recalled by the machines, so advance detection is not a means by which we can truly resolve cyber violence in a closed loop," he said.

Once a cyber-violence incident is under way, the platform activates proactive interception tools such as "one-click anti-violence" and also adds review capacity (senior editors and reviewers) on the model side as an in-event intervention. "The only drawback is that this intervention directly blocks the social relations of the account that may be targeted, that is, the public-opinion field of the cyber violence, and it cannot prevent cyber violence carried out on third-party platforms outside our own," Shen He said.

[Chart: the platform's governance measures before and during a cyber-violence incident]

The platform reviews cyber-violence information through machine prediction and preliminary screening combined with manual review, but errors and omissions are still hard to avoid.

Shen He gave an example: out of 100 posts or comments, the platform first runs them through an automated model, screens out, say, 50 suspected of cyber violence or of carrying that risk, and then sends them for manual review. "The current manual review mainly labels the information."

In manual review records shown by Shen He, Caijing E-Law saw that reviewers attached labels such as "potential cyber-violence target list" and "negative comments target the content, not the author", and then gave follow-up dispositions such as "recommend point deduction and ban" and "ban".
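
Shen He's "100 items in, roughly 50 to manual review" description corresponds to a two-stage funnel: a model prescreens everything, and only suspected items reach human reviewers, who attach labels and a disposition. The sketch below is a hypothetical rendering of that funnel; the scoring function, label logic and dispositions are invented for illustration and echo, but do not reproduce, the platform's real tooling.

```python
# Two-stage review funnel: an automated model prescreens all items, and
# only suspected ones go to human reviewers, who attach labels and a
# disposition. Labels echo those quoted above; everything else here
# (scoring, routing rules) is hypothetical.
from dataclasses import dataclass

@dataclass
class Item:
    item_id: str
    text: str
    model_score: float = 0.0
    labels: list = None
    disposition: str = "none"

def toy_score(text: str) -> float:
    """Stand-in for the platform's model: crude keyword score."""
    bad = ("slut", "die", "fraud")
    return min(1.0, sum(0.5 for w in bad if w in text.lower()))

def machine_prescreen(items, threshold=0.5):
    """Stage 1: keep only items the model suspects of cyber violence."""
    suspected = []
    for it in items:
        it.model_score = toy_score(it.text)
        if it.model_score >= threshold:
            suspected.append(it)
    return suspected

def manual_review(item: Item) -> Item:
    """Stage 2: a human reviewer labels the item and picks a disposition."""
    if "author" in item.text.lower():
        item.labels = ["potential cyber-violence target list"]
        item.disposition = "recommend point deduction and ban"
    else:
        item.labels = ["negative comment targets the content, not the author"]
        item.disposition = "no action"
    return item

stream = [Item("p1", "nice edit"),
          Item("p2", "the author is a fraud, die"),
          Item("p3", "this take is fraud-level bad")]
for it in machine_prescreen(stream):
    reviewed = manual_review(it)
    print(reviewed.item_id, reviewed.labels, reviewed.disposition)
```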

Xiao Le, a former Douyin content reviewer, told Caijing E-Law that the review department he worked in had corresponding training and review mechanisms for preventing cyber violence and suicide. Xiao Le said he mostly encountered verbal abuse in his work. Reviewers label and report content against sensitive-word lists, with tags such as "national curse words", abuse of a streamer or audience "constituting personal attacks", "dangerous speech" and "excessive remarks".

But the criteria for judging verbal violence or insult are ambiguous, and Xiao Le often found the calls difficult to make. "Some people are simply in the habit of using certain words as verbal tics and have no subjective intention to insult anyone," Xiao Le said.

According to earlier media reports, ByteDance has recruited content reviewers in large numbers since 2017; by 2021 it had more than 20,000, most of them regular employees.

Xiao Le said that balancing accuracy and efficiency in manual review is a major challenge. Reviewers must check videos for violations in the review system, then label and submit them. Xiao Le recalled having to view multiple videos within a few minutes; the reviewers on his team had to review at least 300 videos a day, each between one and five minutes long, while other teams were tasked with processing 1,000 videos of more than 30 minutes a day. During holidays or public-opinion outbreaks, the surge in content is handled through staggered overtime. Meanwhile, the back-end system monitors reviewers' efficiency and warns those who are too slow.

According to Xiao Le, reviewers generally work nine hours a day, with shifts arranged by supervisors a month in advance to ensure 24-hour coverage. Almost every reviewer has worked overnight, and overtime is the norm. By Xiao Le's observation, many of the reviewers he knows have left or plan to leave: "The turnover rate is high, and the platform is constantly recruiting."

Wu Danjun believes that when manual review capacity is insufficient and machine-review algorithm models are imperfect, some abusive comments may slip through while some normal comments are wrongly hit. Platforms therefore need to train reviewers regularly to improve their skills, and also need to regularly review, evaluate and verify the mechanisms, models, data and application results of their algorithms in order to improve the machine-review models.

   Exploring more effective governance models

How to amplify the commercial benefits of traffic while avoiding the legal risks of cyber violence is the challenge facing platform governance.

Zhu Wei said cyber violence takes many forms, such as fabricating facts, splicing material together, and quoting out of context to stir up and steer public opinion, which then triggers online attacks. In topic discussions, netizens often split into opposing camps. "Where there is opposition there is traffic, there are topics, even trending searches, and that brings benefits to the platform," Zhu Wei said.

Wang Sixin, a professor at Communication University of China and vice president of its Institute for a Community with a Shared Future for Mankind, also believes cyber-violence incidents have a direct impact on platform traffic. In narrow market-return terms, the greater the traffic, the greater the platform's revenue; but widespread cyber violence also brings legal and regulatory risks. "So the platform has to balance the two."

Shen He told Caijing E-Law that regulators pay close attention to his platform's moves. Once violations are found, in the most serious case the app can be removed from stores, so the cost of violation is high. "So we generally do not relax governance of cyber violence for the sake of popularity." He revealed that some entertainment-oriented content platforms may choose to take that risk.

For problems a platform fails to handle in time, Zhu Wei noted that the free 12377 hotline, reporting website and reporting mailbox run by the Center for Reporting Illegal and Harmful Information of the Central Cyberspace Affairs Office (the Cyberspace Administration of China) are also an effective remedy. "But at present, because there are so many reports, 12377 is overloaded and cannot meet all reporting needs," Zhu Wei said.

Zhang Ting told Caijing E-Law that when she first reported to Douyin, the platform only banned the reported account for a month; it was not permanently banned until she reported the violation through the 12377 mailbox.

What would a more effective regulatory model for cyber violence look like? Going forward, how can platform self-restraint and direct supervision by the relevant authorities work together?

Wu Shenkuo, assistant dean of the Internet Development Research Institute at Beijing Normal University, told Caijing E-Law that China can draw on the approaches of the United States and the European Union. The US approach is characterized by heavier penalties for infringement, emphasizing punishment of infringers as well as platforms' security-protection obligations. The EU focuses on constraints on specific types of information content and speech, most representatively the code of practice on disinformation introduced around 2018, along with laws and regulations on hate speech; the disinformation code emphasizes "platform self-restraint".

Wu Shenkuo believes that, globally, the scope of governance over specific online speech is expanding, and requirements on platform responsibility and punishment of the relevant actors are rising. China's current cyber-violence governance takes the Cybersecurity Law as its top-level design and builds rules at three levels: technical elements, organizational management and online content, emphasizing a new, comprehensive governance approach. In the process, China has also absorbed international practice and experience, for example technical blocking, platform protection obligations and multi-level channels for protecting rights.

Liu Xiaochun, executive director of the Internet Rule of Law Research Center at the University of the Chinese Academy of Social Sciences, suggested that future governance could shift from restraining content abusers toward equipping victims with means of self-help and relief: guiding netizens through reputation mechanisms and information-disclosure mechanisms such as displaying IP locations, and, through user education and technical function design, enabling Internet users to protect themselves in time.

Many interviewees noted that China's current laws and regulations do not explicitly regulate cyber violence. Should special legislation on cyber violence be introduced? Views differ.

"Platform governance alone is not enough." Zhu Wei believes that everyone has the right to express, but in the anonymous environment of cyberspace, group behavior is often irrational, lacking moral and legal constraints. The root of harnessing cyber violence is that the law should hold the public accountable, and the participants should be held accountable according to the situation. At the legislative level, special regulations should be formulated to deal with cyber violence.

Wang Sixin, by contrast, believes dedicated legislation on cyber violence is unrealistic. In his view, cyber violence is a diffuse problem whose social effects are hard to quantify. Zheng Jingjing also pointed out that cyber violence is a composite concept covering a variety of different behaviors: "Cyber violence is more like a collection of behaviors, which is difficult to define legally."

In civil torts, for instance, cyber violence usually involves disputes over reputation, privacy and online infringement liability; more serious cases involve the crimes of defamation, insult, and picking quarrels and provoking trouble; and for acts subject to administrative penalties, the Public Security Administration Punishment Law and the Cybersecurity Law also contain relevant provisions.

From a practical standpoint, Zheng Jingjing believes the most urgent problem in cyber-violence governance is to define more clearly the linking mechanism and operating rules for converting "private criminal prosecution" into "public prosecution" in legal relief.

"The state should further clarify the provisions on the transition from private prosecution to public prosecution through judicial interpretation and give clear guidance to legal practice." Zheng Jingjing suggested.

(At the request of the interviewees, Shen He and Xiao Le are pseudonyms.)
