Sora 2 Cameos: What Could Go Wrong Probably Will…

Deepfakes, rebranded as "Cameos" in OpenAI's Sora tool, let users personalize videos but raise significant concerns. Users' likenesses can be exploited without their consent, opening the door to scams and misinformation. While the feature offers creative fun, its lack of safeguards poses real risks, underscoring the need for accountability and regulation in AI-generated content.

Data Duped: Your Superpower against Misinformation

The book Data Duped helps people make better decisions and avoid misinformation

With the launch of Data Duped: How to Avoid Being Hoodwinked by Misinformation, which I co-wrote with Jeffrey D. Camm, we hope to bring curious readers closer to understanding data. Not just the math and statistics, but also how data is used in many of our everyday interactions. For example, that not-so-random advertising you … Continue reading Data Duped: Your Superpower against Misinformation

Why Amazon’s Just Walk Out Retail Experience Feels Creepy

I have a secret. Recently I was in a convenience store, picked up a few items and… I just walked out. I didn't check out, stop at a counter, or talk to anyone. I just left. It's not what you think. I wasn't breaking the rules, but the secret is I still feel creeped out by the … Continue reading Why Amazon's Just Walk Out Retail Experience Feels Creepy