
Terms of Service with Clare Duffy

CNN Audio

Available Episodes (5 of 7)
  • Should You Say Yes to a TSA Face Scan?
    If you’ve been to the airport recently, you might have noticed something different while going through security: a TSA agent taking your photo. More than 200 US airports are now using facial recognition technology to identify passengers. Face scans can speed up the security process, but according to Dr. Joy Buolamwini, this technology comes with big-picture risks. She explains what’s going on with these face scans — and how to opt out. Read more about Dr. Joy’s nonprofit, the Algorithmic Justice League, here. What questions do you have about the technology in your life? Email us at [email protected]. Learn more about your ad choices. Visit podcastchoices.com/adchoices
    27:11
  • Remembering All Your Passwords is Hard. Let's Make It Easier
    CNN Audio’s senior producer Haley has a confession: she uses the same password for everything. Research shows she’s not alone. Why is this a problem? And what’s the safest way to create and manage your passwords? Rachel Tobac, an ethical hacker and CEO of SocialProof Security, has some answers that won’t make your head spin. The password managers Rachel mentions include Bitwarden, 1Password, Dashlane, and KeePass. What questions do you have about the technology in your life? Email us at [email protected]. Learn more about your ad choices. Visit podcastchoices.com/adchoices
    24:14
  • Debunking Conspiracies This Holiday Season? Let AI Help.
    Conspiracy theories can take many forms, from misgivings about the first moon landing to false claims that the 2020 election was stolen. These kinds of beliefs are nothing new, but social media has helped make many of them more mainstream. As anyone who’s tried to reason with a conspiracy theorist knows, it’s hard to debunk such deeply held beliefs – and arguing with a loved one about them can be emotionally taxing. What if an AI chatbot could help? A recent study, published in Science, asked that very question — and the results were surprising. Thomas Costello, an assistant professor of psychology at American University and co-author of the study, breaks down the findings. Try chatting with the bot yourself at debunkbot.com. What questions do you have about the technology in your life? Email us at [email protected]. Learn more about your ad choices. Visit podcastchoices.com/adchoices
    27:57
  • Why It Feels Like Your Phone Is Listening to You
    It’s a familiar feeling if you spend enough time on the internet: you talk about something in the real world, and then you see that same thing advertised to you online. This uncanny experience has led many people to wonder: are our devices listening to us? And is that how online advertisers are able to serve such specific ads? David Choffnes, associate professor at Northeastern University, says it’s not that simple. You can read the full study David discussed in this episode here. What questions do you have about the technology in your life? Email us at [email protected]. Learn more about your ad choices. Visit podcastchoices.com/adchoices
    28:56
  • Deepfake Revenge Porn Is Rising: What Can You Do?
    A new kind of deepfake revenge porn is sweeping the internet. Using artificial intelligence, bad actors can do things like superimpose your face on a nude body, creating convincing and harmful images. Tech companies and lawmakers are trying to play catch-up, but the truth is these tools are still easy to access. So how can you and your loved ones stay safe from this dangerous technology? Carrie Goldberg, a lawyer specializing in digital harassment and sex crimes, has some answers. If you've had someone create deepfakes of you, or if you're a parent and this happened to your child, we'd appreciate hearing from you about how you handled it. Email us at [email protected]. If you find an explicit image of yourself or a loved one on social media, here are some resources for getting it taken down:
      • Take It Down
      • StopNCII.org
      • Meta’s takedown form
      • Google’s takedown form
    Learn more about your ad choices. Visit podcastchoices.com/adchoices
    27:38


About Terms of Service with Clare Duffy

New technologies like artificial intelligence, facial recognition and social media algorithms are changing our world so fast that it can be hard to keep up. This cutting-edge tech often inspires overblown hype — and fear. That’s where we come in. Each week, CNN Tech Writer Clare Duffy will break down how these technologies work and what they’ll mean for your life in terms that don’t require an engineering degree to understand. And we’ll empower you to start experimenting with these tools, without getting played by them.


v7.0.0 | © 2007-2024 radio.de GmbH
Generated: 12/14/2024 - 1:01:29 AM