Is AI the Scammer’s New Best Friend?

AI is empowering the masses and levelling the playing field, but not always for the better. Carrying out sophisticated cyber attacks used to require a team of skilled hackers and expensive infrastructure; AI has drastically lowered the technical barrier to entry for individuals and groups looking to engage in fraud and cybercrime.
Microsoft’s latest Cyber Signals report on AI-assisted scams notes that AI software is increasingly used in fraud attempts in a number of ways:
- Misusing legitimate apps for malicious purposes.
- Using fraud-specific tools from the cybercrime underground.
- Scanning and scraping the web to build detailed target profiles.
- Creating fake AI-enhanced product reviews and storefronts.
- Crafting entire fake brands with business histories and testimonials.
- Deploying deepfakes, voice cloning, phishing emails, and fake websites to appear credible.
AI is Levelling Up Fraud in Record Time
In just the past year, Microsoft reportedly blocked $4 billion worth of fraud attempts and about 1.6 million bot signups per hour. Among the AI-powered attacks it flagged using a combination of the methods highlighted above, two in particular are worth a closer look.
Firstly, e-commerce fraud. AI has drastically shortened the time needed to create fraudulent websites, so scammers can now launch convincing storefronts in minutes instead of days or weeks. These sites mimic real brands with AI-generated product descriptions, fake customer reviews, and even realistic-looking photos. Some go a step further by using AI chatbots to field customer complaints and delay refunds to make the scams seem even more legitimate.
Job and employment scams are on the rise too, and AI is making them harder to spot. Scammers are using generative AI to churn out fake job listings, recruiter profiles, and personalised phishing emails faster than ever, at scale and with startling realism. Once a job seeker is engaged, they’re often asked to hand over personal info, sometimes even banking details, and as Microsoft warns, unsolicited offers promising high pay with minimal qualifications should always raise suspicion.
Keep Calm and Question Everything
To stay one step ahead of these scams, Microsoft shared a few tips worth keeping in mind.
To protect yourself as an online shopper, here are three things you can do:
- Don’t let pressure tactics trick you – “Limited-time” deals and countdown timers are designed to rush you into buying before you stop to check whether the site is genuine.
- Only click on verified ads – Many scam sites spread through AI-optimised social media ads. Cross-check domain names and reviews before purchasing (see the short sketch after this list).
- Be sceptical of social proof – Scammers can use AI-generated reviews, influencer endorsements, and testimonials to exploit your trust.
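To make the domain-checking habit concrete, here is a minimal sketch (not from Microsoft’s report) of how a cautious shopper’s script or a browser tool might flag lookalike domains; the brand list, similarity threshold, and function name are illustrative assumptions.

```python
# Minimal, illustrative sketch: flag domains that closely imitate well-known
# brands (typosquatting). Brand list and threshold are assumptions, not a
# real protection product's configuration.
from difflib import SequenceMatcher

KNOWN_BRANDS = ["amazon.com", "microsoft.com", "paypal.com"]  # illustrative only


def imitated_brand(domain: str, threshold: float = 0.85):
    """Return the brand a domain appears to imitate, or None if nothing is close."""
    domain = domain.lower().removeprefix("www.")
    for brand in KNOWN_BRANDS:
        # An exact match is the real site; a near-but-not-exact match is suspicious.
        if domain != brand and SequenceMatcher(None, domain, brand).ratio() >= threshold:
            return brand
    return None


if __name__ == "__main__":
    for candidate in ["rnicrosoft.com", "paypa1.com", "microsoft.com"]:
        hit = imitated_brand(candidate)
        print(f"{candidate}: {'looks like ' + hit if hit else 'no obvious lookalike'}")
```

Real protection tools rely on far larger brand lists and smarter heuristics, but the idea is the same: a domain that is almost, but not quite, one you trust deserves extra scrutiny.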
And when job hunting, remember that legitimate employers will never ask you to:
- Provide personal or financial information – Unsolicited SMS and email messages offering high-paying jobs with minimal qualifications are typically fraudulent. Never hand over a Social Security number, banking details, or passwords to an unverified employer.
- Pay for a job opportunity – Employment offers that include requests for payment, offers that seem too good to be true, and a lack of formal communication platforms can all be indicators of fraud.
- Communicate only over unofficial channels – If recruiters and hiring managers stick to SMS, WhatsApp, or non-business email accounts, that’s a red flag. Legitimate employers use official company platforms for hiring, and personal or sensitive information should only ever be shared over secure channels.
AI is evolving faster than we can say “deepfake,” and while that’s exciting for tech, it’s a headache for anyone trying to avoid scams. As these tools get smarter, expect the next wave of scams to be even harder to spot. So keep your guard up, trust your instincts, and remember: if something looks too good to be true, it probably is.