blog.geomblog.org
The Geomblog: 12/01/2014 - 01/01/2015
http://blog.geomblog.org/2014_12_01_archive.html
Ruminations on computational geometry, algorithms, theoretical computer science and life. Thursday, December 11, 2014. Accountability in data mining. For a while now, I've been thinking about the endgame in data mining. Or more precisely, about a world where everything is being mined for all kinds of patterns, and the contours of our life are shaped by the patterns learnt about us. How can I trust the patterns being discovered? And I a...
bigdatadialog.com
Bias in Human Decision-Making — Big Data Dialog
http://www.bigdatadialog.com/fairness/dw94l8a677lv8kmpwq15jtc8m67vol
Data Science and Ethical Decision Making. Bias in Human Decision-Making. October 16, 2015. This article considers these ideas in the context of our expectations of Amazon's same-day delivery service. There has been considerable recent press on algorithmic bias; there is even a workshop series. With machines making decisions, biases become much easier to quantify: we not only know the fractions along each dimension, but also how the decisions were...
jeremykun.com
What does it mean for an algorithm to be fair? | Math ∩ Programming
https://jeremykun.com/2015/07/13/what-does-it-mean-for-an-algorithm-to-be-fair
What does it mean for an algorithm to be fair? July 13, 2015. In 2014 the White House commissioned a 90-day study that culminated in a report (PDF) on the state of “big data” and related technologies. The authors give many recommendations, including this central warning: “algorithms can facilitate illegal discrimination!” Even if this is the most common question being asked on Google, it is mostly ignored ...
enriquedans.com
Machines learn. Yes, but... learn what? » Enrique Dans
https://www.enriquedans.com/2015/08/las-maquinas-aprenden-si-pero-de-que.html
Machines learn. Yes, but... learn what? An interesting article in TechCrunch, “Machine learning and human bias: an uneasy pair,” put me on the trail of several initiatives by police departments in various American cities and their use of tools reminiscent of “Minority Report” and its Pre-crime Department, but which have effectively been a reality for some time now. A 2013 article in the Chicago Tribune. In the state of Alabama, the city of Oxford uses an application from... Meanwhile...
fairness.haverford.edu
On algorithmic fairness, discrimination and disparate impact.
http://fairness.haverford.edu/jekyll
On algorithmic fairness, discrimination, and disparate impact. What is algorithmic fairness and why is it important? This site serves to collect articles and research that will help to answer these questions. Our own take on the research questions behind these issues can be found in this paper. More research is collected at fatml.org. Could your data discriminate? Our entry in the Knight News Challenge. Stay tuned!
bigdatadialog.com
Fairness — Big Data Dialog
http://www.bigdatadialog.com/fairness
Data Science and Ethical Decision Making. Why We Are Hard On Amazon, And Should Be. August 19, 2016. It is more important to be fair when dealing with algorithms, and it is also easier to detect and correct unfairness. The Promise and Perils of Predictive Policing Based on Big Data. November 16, 2015. Originally published in The Conversation; also appeared in Gizmodo. In recent years... and the value it has provided. ...was an early adopter. It depends on so...
benjamin.dekosnik.com
writing – BENJAMIN DE KOSNIK
https://benjamin.dekosnik.com/tag/writing
Boyd, danah. 2006. “Friends, Friendsters, and MySpace Top 8: Writing Community Into Being on Social Network Sites.” First Monday 11:12, December. http://www.firstmonday.org/issues/issue11_12/boyd/index.html. Hu, Yuheng, Lydia Manikonda, and Subbarao Kambhampati. 2014. “What We Instagram: A First Analysis of Instagram Photo Content and User Types.” ICWSM. Keefe, Patrick Radden. “The Detectives Who Never Forget a Face.” The New Yorker, August 22, 2016. Lee, Pamela. “Identity Theft.” Rank everything fro...