- A Human Rights Approach to U.S. Cybersecurity Strategy
"Cybersecurity policies need to take seriously the human rights of individual users by putting people's empowerment and well-being at their center," writes Stonehill College's Anwar Mhajne. As the Biden administration looks to defend human rights through its foreign policy, it's important to find a multilateral approach that encompasses the cyber realm and "puts individual rights at its center."
- The Doorstep: TikTok & the Normalization of Protests Around the World, with Dr. Tia C. M. Tyree
Howard University's Professor Tia Tyree joins "Doorstep" hosts Tatiana Serafin and Nick Gvosdev to discuss social media and youth activism in 2021 and beyond. The digital native generation is taking its online activism offline more swiftly and easily than ever with TikTok as the platform of choice. What responsibilities do tech giants and governments have to support this mobilization? How will global societies be reshaped as Gen Z takes power?
- AI & Equality Initiative: Algorithmic Bias & the Ethical Implications
In this AI & Equality Initiative podcast, Senior Fellow Anja Kaspersen speaks with three researchers working with the University of Melbourne's Centre for AI and Digital Ethics about bias in data and algorithms. How can these types of biases have adverse effects on health and employment? What are some legal and ethical tools that can be used to confront these challenges?
- Facing a Pandemic in the Dark: An Update on Cox's Bazar & COVID-19, with Razia Sultana
Three weeks ago, Razia Sultana, a Rohingya lawyer and activist, wrote an article for the Carnegie Council website about how over 1 million Rohingya refugees, living in unsanitary conditions with no Internet access in makeshift camps in Cox's Bazar, Bangladesh, are dealing with the COVID-19 pandemic. In this Q&A, she gives an update on the situation.
- The Risks and Rewards of Big Data, Algorithms, and Machine Learning, with danah boyd
How do we analyze vast swaths of data and who decides what to collect? For example, big data may help us cure cancer, but the choice of data collected for police work or hiring may have built-in biases, explains danah boyd. "All the technology is trying to do is say, 'What can we find of good qualities in the past and try to amplify them in the future?' It's always trying to amplify the past. So when the past is flawed, it will amplify that."