With help from Sam Sutton
Meta's history includes more than its share of human rights controversies, from the Rohingya massacre to Cambridge Analytica.
So it's no surprise that the human rights community is skeptical of Meta's promise to revolutionize the internet itself through a 3D overlay on the world. Whether the company can prove those skeptics wrong may come down to the trade-offs it and its fellow virtual-world builders are willing to make.
For now, they're keeping their cards close to the chest, which is to be expected. Meta's human rights director, Miranda Sissons, shed some light on the subject last night at a panel discussion with an ominous title: "Human Rights and the Metaverse: Are We Already Outdated?"
Sissons began by touting AR/VR technology's genuine potential to improve quality of life through applications in fields like automotive safety and medical diagnostics. But that's not the "metaverse." And when it came to the rules for the new virtual spaces Meta is building, well… things got vaguer.
"Many of the most significant risks relate to our behavior as humans," Sissons said. "And many of these behaviors can be mitigated or prevented by guardrails, standards, design principles and design priorities."
But what exactly are those principles? The human rights community has developed a number of formal tools to assess a given technology's impact and prevent harms like those seen in the Rohingya and Cambridge Analytica cases, and Sissons insisted that the company adheres to human rights compliance frameworks put forward by groups like the United Nations and the World Economic Forum.
Katitza Rodriguez of the Electronic Frontier Foundation, who attended yesterday's session virtually, wants companies to impose strict limits on the types of data their devices can collect and store, including potential "emotion detection." She said the vision Sissons described may require Meta to make uncomfortable trade-offs.
"We need to educate and train engineers, marketing teams and others about the importance of human rights and the impact their products have on society," Rodriguez said. "It's difficult, but it's important… How do you mitigate human rights risk? Don't include facial recognition in your products. These are hard choices."
And there's no shortage of examples of what happens when those choices aren't made early in a technology's life.
"What we've learned from other immersive worlds like video games is that the norms set early on actually define the culture of the space," Chloe Poynton, the panel's moderator and an adviser at the consulting firm Article One, told me afterward.
Daniel Leufer, a senior policy analyst at Access Now, pushed back forcefully on the panel against the frequent refrain that regulators can't keep up with new technologies, saying that "very basic things like data protection, transparency and access to information do a lot of work."
Brussels, Leufer's home base, has clearly embraced that notion in recent years with a raft of regulations on data privacy and AI. And while Meta's promises may be vague for now, there are signs that U.S. regulators may be catching up: this week's surprising bipartisan draft privacy bill is beginning to clarify who has the authority to set and enforce privacy law.
Today, the National Endowment for Democracy released a report on the "Global Struggle for AI Surveillance," examining "both the impact of new technologies on democracy and the vectors for civil society to be involved in their design, deployment and operation."
The authors emphasize the threat that AI-powered surveillance poses to civil liberties and privacy, focusing on its potential to enhance authoritarian regimes' ability to suppress social groups and freedom of expression itself.
The report offers a number of potential remedies, including establishing more specific rules and regulations for the development and use of AI (as Europe is doing) and creating a global oversight body.
The report pays special attention to China, whose fearsome high-tech surveillance state is already a model for oppressive governments around the world.
"Beijing is moving fast to write the rules for AI systems," the authors write. "These efforts will give Beijing significant influence when it comes to shaping global rules for AI surveillance technology, which could diminish the role of human rights norms in these frameworks."
First in Digital Future Daily: Bitcoin has the lawyer who defended Obamacare before the U.S. Supreme Court.
Grayscale Investments has hired Donald Verrilli, the former U.S. solicitor general.
The SEC has rejected similar bids for Bitcoin-based exchange-traded funds, including a high-profile effort led by former Trump adviser Anthony Scaramucci, on the grounds that the risk is too high for individual investors.
Grayscale is bringing in Verrilli, who argued the Obama administration's landmark health care and same-sex marriage cases, to put the squeeze on the market regulator and, if need be, take it to court.
The company has been laying the groundwork for a legal challenge for months, arguing that rejecting its bid would be unfair given the SEC's approval of ETFs tied to Bitcoin futures contracts, a more indirect financial instrument regulated by the Commodity Futures Trading Commission. It has also sought to build a grassroots movement around its effort through an energetic public relations and advocacy campaign, covering Washington's Union Station with advertisements and flooding the SEC with letters of support.
"Hopefully we're doing everything we can to convince the commission that approval is the right answer, so there will be no need to go to court," Verrilli, a partner at Munger, Tolles & Olson, said in an interview. He added that the SEC will find it "very difficult to distinguish" between spot and futures [market] ETFs. — Sam Sutton
Human rights in the Metaverse-POLITICO