How the internet can rebuild trust - FT中文网
Algorithms and generative AI models that decide what billions of users see should be transparent
The writer is co-founder of Wikipedia and author of ‘The Seven Rules of Trust’

When I founded Wikipedia in 2001, pioneers of the internet were excited by its promise to give the world access to truth and connection.

Two decades later, that optimism has curdled into cynicism. We scroll through feeds serving up news we no longer believe, interact with bots we cannot identify and brace for the next synthetic scandal created by fake images from artificial intelligence.

Before the web can move forward, it must remember how it earned trust in the first place.

The defining difference between web 1.0 and the platforms that dominate today is not technological sophistication but moral architecture. Early online communities were transparent about process and purpose. They exposed how information was created, corrected and shared. That visibility generated accountability. People could see how the system worked and participate in fixing its mistakes. Trust emerged not from perfection (there was still plenty of online trolling, flame wars and toxicity), but from openness.

Today’s digital landscape reverses that logic. Recommendation algorithms and generative AI models decide what billions of users see, yet their workings remain opaque. When platforms insist their systems are too complex to explain, users are asked to substitute faith for understanding.

AI intensifies the problem. Large language models can produce fluent paragraphs and convincing deepfakes. The tools that promised to democratise knowledge now threaten to make knowledge unrecognisable. If everything can be fabricated, the distinction between truth and illusion becomes a matter of persuasion.

Re-establishing trust in this environment requires more than fact-checking or content moderation. It requires structural transparency. Every platform that mediates information should make provenance visible: where data originated, how it was processed, and what uncertainty surrounds it. Think of it as nutritional labelling for information. Without it, citizens cannot make informed judgments and democracies cannot function.

Equally important is independence. As AI companies fight for dominance, the temptation to embed bias — commercial, political or cultural — into training data will be immense. Guardrails must ensure the entities curating public knowledge are accountable to the public, not just investors.

And we must revive civility too. Some of the best early online spaces relied on norms that valued reasoned argument over insult. They were imperfect but self-correcting because participants felt a duty to the collective project. Today’s social platforms monetise outrage. Restoring trust means designing systems that reward good-faith discourse — through visibility algorithms, community-based moderation, or friction that forces reflection before reposting.

Governments have a role to play but regulation alone cannot rebuild trust. It has to be observed in practice. Platforms should disclose not only how their algorithms work but also when they fail. AI developers should publish dataset sources and error rates.

The challenge of our time is not that information is scarce but that authenticity is. Important aspects of the early internet succeeded because people could trace what they read to another human being, even if the other human being was operating behind a pseudonym. The new internet must restore that chain of custody.

We are entering an era when machines can mimic any voice and invent any image. If we want truth to survive that onslaught, we must embed transparency, independence and empathy into the digital architecture itself. The early days of the web showed it could be done. The question is whether we still have the will to do it again.

Copyright notice: This article is the property of FT中文网. Without permission, no organisation or individual may reproduce, copy or otherwise use all or part of this article; infringement will be pursued.
