
Social media platforms must abandon algorithmic secrecy

More transparency is required about the algorithms that wield enormous power over billions of people

Algorithmic advances have spurred quantum leaps in myriad human endeavours — including medical devices, models for climate change, financial trading systems, GPS mapping, even online dating.

But while these algorithmic forces impact our daily lives in beneficial ways, they are often inaccessible and mysterious to the average citizen.

In driving our social media platforms, feeding us daily news, family updates and friend suggestions, algorithms have held us rapt for some time. Yet most of us have neither the training nor the tools to understand how these systems affect us, or the protocols that govern them.


We must take the word of others about how these arcane systems work and of what they are composed.

The testimony of technology executives called before the UK parliament or US congress to explain algorithmic data processes, or account for data breaches, tells us little about how their algorithms really operate.

And the algorithms animating our social media news feeds are often protected as trade secrets, and not found on a publicly accessible registry such as the US or UK Patent Office.

“Patents work on the basis of sufficient disclosure of an inventor’s scientific innovation to the benefit of society,” says Tanya Aplin, our colleague at King’s College. “Trade secrets, on the other hand, keep the knowhow of formulas and technical developments confidential.”

This balance between full disclosure and secrecy sits at the heart of the debate around the use of algorithmic forces.

Critically, these algorithmic systems have no form of community review. Also, an epistemological conundrum compounds the problem: we do not know who knows how these algorithms work.

Welton Chang, chief technology officer at Human Rights First, an advocacy group, says: “Within the labyrinthine structures of social media companies, it is doubtful that there is a department or team with full visibility of a platform’s secretive black box of algorithms.”

Algorithmic, bot-driven content has, in large part, helped to power election interference, fomented domestic rebellion and facilitated extremism online.

Unchecked and largely unaccountable, algorithm-driven platforms wield enormous power over billions of citizens worldwide.

Frank Pasquale, writing about secretive algorithms in his book The Black Box Society, says: “However savvy absolute secrecy may be as a business strategy, it is doubtful that public policy should be encouraging it.”

Protecting algorithmic “secret sauces” via trade secret law has become de rigueur over the past few years. Ironically, the antithesis of this approach, the open-source algorithm, may be not only these systems’ saving grace, but also a possible antidote to secrecy.

By disclosing their formulas for the benefit of society, open-source algorithms allow a cross section of professionals to examine the fundamental principles at play.
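To illustrate what such disclosure makes possible, consider a deliberately simplified, hypothetical feed-ranking formula (not any platform's actual algorithm): when the weights and inputs are published, an outside researcher can read exactly which signals drive ranking and probe them for bias.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author_followed: bool  # does the viewer follow the author?
    likes: int             # engagement signal
    hours_old: float       # recency signal

# Hypothetical, fully disclosed ranking weights. In an open-source
# algorithm these constants would be public and auditable.
W_FOLLOWED = 2.0   # boost for followed authors
W_LIKES = 0.1      # boost per like
W_DECAY = 0.5      # penalty per hour of age

def score(post: Post) -> float:
    """Score a post from social ties, engagement and recency.

    Because every term is visible, an auditor can check, for
    example, whether the engagement weight lets viral content
    drown out posts from accounts a user actually follows.
    """
    return (W_FOLLOWED * post.author_followed
            + W_LIKES * post.likes
            - W_DECAY * post.hours_old)

def rank_feed(posts: list[Post]) -> list[Post]:
    """Return posts ordered by descending score."""
    return sorted(posts, key=score, reverse=True)
```

A production ranking system would involve machine-learned models rather than three hand-set weights, but the principle is the same: disclosure of the features and their influence is precisely what lets security researchers, human rights groups and academics do the scrutiny described above.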


Security researchers can determine whether our personal data were put at risk during algorithmic processing. Human rights organisations can help to avoid infringement of our fundamental freedoms. Academics can dig into these systems for bias.

But, until we have some basic understanding of how social media algorithms use our personal data, platforms will always be able to resist accountability and efforts at regulation will be too imprecise to make an impact.

“Users have the right to know what inputs are being made both into the algorithms that choose their content and those used to moderate their content,” says Jillian York, author of Silicon Values.

While the underlying algorithms at play in apps remain opaque and inaccessible, a new step in this direction is Apple’s App Tracking Transparency programme. This feature returns some measure of control over personal data to users, who can prevent tracking across third-party apps and websites.

Full disclosure and transparency, as opposed to secrecy, form the foundations of liberal democracies. With platforms inextricably linked to our political and democratic processes, it is time to abandon secrecy and mystery in favour of transparency.

Social media platform users must be able to come to their own conclusions about the place of the digital algorithm in their lives.

Open, transparent, fair and accountable algorithmic decision-making processes should form the linchpin of operating principles set for and by platforms and policymakers.

Frederick Mostert is a professor of practice in intellectual property law at King’s College London and a member of the Digital Scholarship Institute.

Alex Urbelis is a partner at the Blackstone Law Group LLP and a member of Human Rights First’s Technology Advisory Board.
