
Ethics in AI Live Event: Machines Judging Humans


About The Event

Part of the Colloquium on AI Ethics series presented by the Institute for Ethics in AI.

This event is also part of the Humanities Cultural Programme, one of the founding stones for the future Stephen A. Schwarzman Centre for the Humanities.

Over the past decade, algorithmic accountability has become an important concern for social scientists, computer scientists, journalists, and attorneys. Exposés have sparked vibrant debates about algorithmic sentencing. Researchers have exposed tech giants showing women ads for lower-paying jobs, discriminating against the aged, deploying deceptive dark patterns to trick consumers into buying things, and manipulating users toward rabbit holes of extremist content. Public-spirited regulators have begun to address algorithmic transparency and online fairness, building on the work of legal scholars who have called for technological due process, platform neutrality, and nondiscrimination principles.

This policy work is just beginning, as experts translate academic research and activist demands into statutes and regulations. Lawmakers are proposing bills requiring basic standards of algorithmic transparency and auditing. We are starting down a long road toward ensuring that AI-based hiring practices and financial underwriting are not used if they have a disparate impact on historically marginalized communities. And just as this “first wave” of algorithmic accountability research and activism has targeted individual encounters with existing systems, an emerging “second wave” of algorithmic accountability has begun to address more structural concerns. Both waves will be essential to ensure a fairer, and more genuinely emancipatory, political economy of technology. Second-wave work is particularly important when it comes to illuminating the promise and perils of formalizing evaluative criteria.
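To make "disparate impact" concrete, one common formalization (not described in the event text, and offered here only as an illustrative sketch) is the "four-fifths rule": an automated selection process is flagged for review when a protected group's selection rate falls below 80% of the most-favored group's. The short Python sketch below uses entirely hypothetical numbers.

    # Illustrative sketch only: the four-fifths (80%) rule is one common way to
    # formalize "disparate impact". Every name and number below is hypothetical.

    def selection_rate(selected: int, applicants: int) -> float:
        """Fraction of a group's applicants who pass the automated screen."""
        return selected / applicants

    # Hypothetical outcomes from an automated hiring screen.
    reference_rate = selection_rate(selected=30, applicants=100)  # 0.30
    protected_rate = selection_rate(selected=18, applicants=100)  # 0.18

    # Adverse impact ratio: protected group's rate relative to the reference group's.
    ratio = protected_rate / reference_rate  # 0.60

    print(f"Adverse impact ratio: {ratio:.2f}")
    if ratio < 0.8:
        print("Below the 0.80 threshold: potential disparate impact to investigate")

In practice such a ratio is only a screening heuristic; the statutes and regulations described above would typically pair it with further statistical analysis and justification requirements.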

Location

Valencia St, San Francisco, CA 900, United States

The Palace of Fine Arts, 3601 Lyon St, San Francisco, CA 94123, United States
Phone Number: 123 0039 68886
Email: admin@unicamp.com

Our Speakers

Bellezza
Content Writer
Charlie
Office Manager
Emerson
Developer
Lucinda
Manager
Orabelle
Art Director
Savanna Walker
Founder & CEO


  • Cost: Free
  • Event date: June 14, 2021
  • Event time: 10:30 AM – 11:30 AM
  • Location: United States
  • Organiser: Savanna Walker
  • Total slots: 20
  • Booked slots: 0

This event has expired