Please use this identifier to cite or link to this item: https://ir.iimcal.ac.in:8443/jspui/handle/123456789/4095
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Regana, Sowrya | - |
dc.date.accessioned | 2022-11-09T05:03:09Z | - |
dc.date.available | 2022-11-09T05:03:09Z | - |
dc.date.issued | 2020 | - |
dc.identifier.uri | https://ir.iimcal.ac.in:8443/jspui/handle/123456789/4095 | - |
dc.description.abstract | The fast-paced and multidirectional development of AI applications is supporting humans in areas ranging from buying a simple toothbrush to sending reusable spacecraft into space. As AI is increasingly adopted in crucial and sensitive applications, the need arises to study the biases that these systems bring to the table. Recent times have seen a significant increase in examples of AI systems reflecting or exacerbating machine biases, from racist facial recognition to sexist natural language processing. Recently, the American Civil Liberties Union filed a case against the Detroit Police for falsely arresting an African American due to a misidentification attributed to its facial recognition software. | en_US |
dc.language.iso | en_US | en_US |
dc.publisher | Students of PGDBA (Post Graduate Diploma in Business Analytics), IIM Calcutta | en_US
dc.relation.ispartofseries | Vol.1; | - |
dc.subject | AI | en_US |
dc.subject | American Civil Liberties Union | en_US
dc.subject | IBM Research | en_US |
dc.subject | Policy Makers | en_US |
dc.subject | Demographic Parity | en_US |
dc.subject | Equalized Odds | en_US |
dc.subject | Equal Opportunity | en_US |
dc.title | The world is not fAIr | en_US
dc.type | Article | en_US |
Appears in Collections: AINA 1.0 - Volume 1 Edition 2019-20
Files in This Item:
File | Description | Size | Format |
---|---|---|---|
The world is not fair.pdf | The world is not fair | 1.27 MB | Adobe PDF |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.