
It seems artificial intelligence (AI) is everywhere. It's easy to access, which means scammers use it, too.
AI-related scams make use of the following:
- Voice cloning, in which scammers use AI tools to clone voices and pose as someone you know.
- Deepfakes, in which scammers use AI-manipulated images and videos to impersonate people.
- Phishing scams, in which generative AI tools produce highly convincing, official-looking emails to trick you into sharing sensitive information or clicking malicious links.
The Nebraska Department of Banking and Finance (NDBF) is urging Nebraskans to be aware of the increased risk of investment fraud involving AI. Below are some key warning signs.
Unregistered/Unlicensed Investment Platforms
Remember that federal and state securities laws generally require securities firms, professionals, exchanges, and other investment platforms to be registered. A promoter’s lack of registration should be taken as a prompt to do additional investigation before you invest money.
When trading securities or seeking investment advice, make sure you are working with a registered investment professional and on a registered exchange or other type of registered investment platform. Use no-cost tools to confirm the registration status of investment advisers or broker-dealers – and check for any disciplinary history. Contact the NDBF if you need assistance.
Watch the U.S. Securities and Exchange Commission’s (SEC) educational videos for tips on spotting fake investment platforms.
Promises of Quick or Guaranteed Profits
It might seem exciting to invest in AI-focused companies that claim they are leaders in developing or using this emerging technology. But bad actors often use the hype around new developments, like AI or crypto assets, to lure investors into schemes. These bad actors might use catchy AI-related buzzwords and make claims that their companies or business strategies guarantee huge gains. Red flags of these types of scams include:
- High-pressure sales tactics;
- Promises of quick profits; or
- Claims of guaranteed returns with little or no risk.
False claims about a public company’s products and services relating to AI also might be part of a “pump-and-dump” scheme in which fraudsters profit at the expense of unsuspecting investors. In a pump-and-dump scheme, promoters try to “pump” up, or increase, the stock price of a company by spreading positive – but false – information, usually online through ads, fake news coverage, or social media promotions. These rumors can cause investors to buy the stock, driving up the price. Then the promoters or others working with them “dump” their own shares before the hype ends. Typically, after the promoters profit from their sales, the stock price drops, and the remaining investors lose most of their money.
Microcap stocks, some of which are penny stocks or nano-cap stocks, may be particularly vulnerable to investment schemes, including scams involving AI-related claims.
Celebrity Endorsements
A growing number of investors are using social media to research opportunities and connect with others. Influencers have taken notice, and social media has become saturated with financial content, leading to the rise of the financial influencer or “finfluencer.”
While celebrities and other well-known personalities might be great at what they do in their respective professions, a celebrity endorsement does not mean that an investment is legitimate or is appropriate for all investors.
Keep in mind that, according to a 2025 survey of securities regulators by the North American Securities Administrators Association (NASAA), investors are often targeted through social media platforms, like Facebook and X (31.7%), as well as text- and voice-based communication platforms, like Telegram and WhatsApp (31.3%).
Scammers are also using YouTube and Vimeo (14.1%) and short-form video content like TikTok and Instagram Reels (19%) to reach victims.
Suspicious Content
Scammers are using AI to generate professional-looking graphics, videos, and other content that creates a sense of legitimacy – including deepfake images, videos, and voices of celebrities and of people known to the intended victims.
For example, some scam artists are using AI-generated audio to try to lure potential victims into thinking a grandchild is in financial distress and in need of money.
Scammers might use deepfake videos to imitate the CEO of a company announcing false news to manipulate the price of a stock, or they could use AI technology to produce realistic-looking websites or marketing materials to promote fake investments or fraudulent schemes. In some cases, bad actors even impersonate government officials.
Steps to Protect Yourself
Here are some simple steps you can take right now to guard against AI-related scams:
- Limit the information you share online.
- If someone calls you from an unfamiliar number, hang up and call them back at a number you know is theirs.
- Research financial claims made in videos, and don’t trust a single video alone.
- Enable multi-factor authentication and follow good password practices.
- Register your phone number for one-time password log-in.
- Update software and apps to install the latest security patches.
- Regularly check your account security and privacy settings.
When it comes to AI-assisted investment scams, keep these points in mind:
- Carefully review the disclosures that companies are making and assess their promotional campaigns. If a company appears focused more on attracting investors through promotions than on developing its business, you might want to compare it to other companies working on similar AI products or services.
- Use the SEC’s EDGAR database to access disclosures for public companies.
- Verify that a communication from a federal, state, or provincial agency or other regulatory body is genuine before responding by contacting their office directly using the contact information on their website. Be sure to independently search for contact information rather than clicking on links or calling numbers in any communications you receive.
- Because fraudsters might impersonate legitimate investment professionals – and even use phony personal websites to bolster their credibility – verify that you are communicating with the actual investment professional and not an imposter. Compare the phone number or website for the firm as disclosed in the firm’s Client Relationship Summary (Form CRS).
- Be on the lookout for scammers using AI technology to impersonate family or friends. AI-powered tools might be able to access personal information about you online or on social media, so be wary of any unsolicited communication asking you to invest your money — even if it sounds like it was written just for you. Fraudsters pretending to be friends or family members in distress seek to stir up your emotions. Consider creating a password or phrase for family members to verify their identity in case of an emergency.
- Be cautious about using AI-generated information to make investment decisions. AI-generated information might rely on data that is inaccurate, incomplete, or misleading. Even when based on accurate input, information resulting from AI can be faulty or completely fabricated.
- Be aware that AI can generate and spread false or inaccurate information. Confirm the authenticity of underlying sources and review multiple sources of information before making investment decisions.
If you’re a victim of financial fraud, or you’ve identified something that appears to be a scam, contact the NDBF at (402) 471-2171 or file a complaint online as soon as possible.