nsKnox (https://nsknox.net/)
Protect every payment by automatically detecting and preventing fraud attempts in real time.

Inside the $25M Hong Kong Deepfake Scam: A Comprehensive Analysis

Leveraging nsKnox’s expertise, this analysis unpacks how the deepfake scam in Hong Kong likely transpired and what businesses can learn to protect themselves.


In February 2024, a Hong Kong-based multinational firm fell victim to a sophisticated deepfake scam, losing $25 million USD.

The attackers employed AI technology to create convincing video deepfakes of the company’s CFO and other executives, which they used during a video conference to deceive an employee into authorizing large fund transfers.

Drawing on nsKnox’s understanding of how deepfake payment fraud attacks are typically executed, here is our perspective on how the $25 million Hong Kong deepfake scam likely transpired. This is based on the steps outlined in an interview with nsKnox’s CISO, Yaron Libman (also featured in this newsletter).

1. Targeting and Reconnaissance:

  • Information Gathering: The attackers likely conducted extensive research on the target organization to identify the key decision-makers, particularly those involved in financial approvals, such as the CFO.
  • Data Collection: Harvesting audio and video footage of the company’s CFO and other executives. This could have been sourced from public interviews, webinars, or social media platforms. Such material was critical for training AI models to convincingly replicate their speech patterns and facial expressions.
  • Identifying Vulnerabilities: Through phishing attacks, insider knowledge, or social engineering tactics, the attackers gained information on internal workflows, payment protocols, and approval authority chains.

This reconnaissance gave the attackers the data they needed to build convincing deepfake models and craft a legitimate-appearing scenario.

2. Deepfake Creation:

With sufficient raw data collected, the attackers most likely employed advanced deepfake AI technology to:
  • Develop Hyper-Realistic Models: Using state-of-the-art algorithms widely available today, they replicated the CFO’s facial features, voice, and micro-expressions, ensuring the deepfake could perform convincingly in real-time video interactions.
  • Enhance Real-Time Adaptability: The deepfake system was probably fine-tuned to respond fluidly during live communication, seamlessly simulating the CFO’s responses to eliminate suspicion.
  • Authenticate the Setup: The attackers likely mirrored legitimate internal video conferencing systems or spoofed official meeting links, further reducing suspicion.

3. The Attack Execution:

The attackers initiated a video call, using the deepfake CFO to issue instructions to a targeted employee who held financial authority.

  • Manipulating Trust: During the call, the deepfake executives instructed the employee to transfer funds to specified bank accounts, presenting the request as strictly confidential.
  • Creating Urgency: The deepfake ‘CFO’ likely framed the request as an urgent, high-stakes transaction, leveraging their perceived authority to discourage hesitation or secondary verification.
  • Exploiting Trust: The employee, trusting the visual and verbal cues of the supposed CFO, authorized the transfer of $25 million to the fraudulent accounts provided during the call.

This stage relied on exploiting technological and psychological vulnerabilities, such as employees’ inherent trust in senior leadership and reluctance to challenge high-level directives.

4. Completion and Concealment:

Once the funds were transferred, the attackers likely employed a series of steps to obscure their tracks:
  • Layered ‘Money Laundering’: The funds were likely dispersed through multiple mule accounts and shell companies across numerous jurisdictions, making them difficult to trace or recover.
  • Operational Anonymity: The attackers likely operated through anonymized networks, leaving minimal digital footprints that could tie them to the crime.
  • Delayed Realization: The fraud was discovered only after the funds had been transferred and subsequent communications raised suspicions; by then the money had already been laundered and was effectively irretrievable.
  • Investigation: Upon investigation, it was revealed that the video conference had been manipulated using deepfake technology, leading to the unauthorized transfer.

How Can Companies Prevent Such Attacks?

In today’s era of synthetic reality, the increasing prevalence of deepfake technology has rendered voice and video verification methods unreliable, making traditional approaches inadequate. Corporations need a technology-driven solution that can securely and accurately validate bank account details without the need for phone or video callbacks. By adopting a deterministic approach to verifying payee bank account information, businesses can eliminate the need for fraud-prone phone calls and video conferencing.

PaymentKnox™ for Corporates by nsKnox is a comprehensive payment validation platform designed to address the complexities of modern financial fraud. It provides deterministic account validation by cross-referencing transaction details against verified databases and can validate any account anywhere in the world using bank KYC data.

This ensures that payments are routed only to legitimate and pre-approved recipients, significantly reducing the risk of fraud.
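
To make the idea of deterministic validation concrete, here is a minimal, purely illustrative sketch. It is not nsKnox’s implementation; the registry, vendor IDs, and account numbers below are hypothetical. The point is that the release of funds depends on a pre-verified record of the payee’s account, not on who appears to be asking:

```python
# Conceptual sketch only -- not nsKnox's implementation. It assumes a
# hypothetical registry of bank-verified payee accounts (e.g., built from
# bank KYC data) and blocks any payment whose destination is not on it.
from dataclasses import dataclass

@dataclass(frozen=True)
class Payee:
    vendor_id: str
    bank_code: str       # e.g., routing number / sort code / BIC
    account_number: str

# Hypothetical registry of deterministically verified accounts.
VERIFIED_ACCOUNTS: set[Payee] = {
    Payee("VENDOR-001", "021000021", "000123456789"),
}

def release_payment(payee: Payee, amount: float) -> bool:
    """Release funds only if the payee account is pre-verified.

    A deepfaked 'CFO' can pressure an employee, but cannot add an
    account to the verified registry, so the transfer is refused.
    """
    if payee not in VERIFIED_ACCOUNTS:
        print(f"BLOCKED: {payee.vendor_id} account not verified -- escalate to review")
        return False
    print(f"Released {amount:,.2f} to verified account of {payee.vendor_id}")
    return True

# A fraudulent account supplied during a deepfake call fails the check.
release_payment(Payee("VENDOR-001", "021000021", "999999999999"), 25_000_000)
```

Deterministic validation of this kind shifts the question from “does the caller look and sound right?” to “is this account independently known to belong to the payee?”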

While video conferencing has become a regular part of our daily workplace routines, it’s important to be aware of potential risks.

Below are steps to help your teams protect themselves from external threats during calls and video conferences. However, it’s crucial to stress that these precautions alone are not sufficient when it comes to transferring funds:

  • Employee Training: Regularly educate employees on the risks of deepfake threats and provide them with the tools and training to recognize the red flags of fraudulent communications.
    • Question unusual requests and verify them through alternative channels, even when they appear to come from senior executives.
    • Double-check any instructions or requests that bypass standard protocols, such as changes in payment processes, high-pressure demands, or deviations from typical communication channels.
    • Encourage employees to confirm sensitive instructions through established independent channels, even if the request appears urgent.

  • Voice and Video Verification Systems: Leverage AI-driven tools designed to detect deepfake anomalies. Platforms like Sensity and Microsoft Video Authenticator can identify issues such as unnatural lip-syncing, irregular speech patterns, audio-visual inconsistencies, discrepancies in facial light reflections, or other subtle details indicative of deepfake content.
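
Vendor detection APIs differ and are not shown here. The snippet below is only a hedged sketch of how a detection score might be used as an advisory signal in a payment workflow; the function name and threshold are hypothetical, and the score never substitutes for independent account validation:

```python
# Illustrative sketch only. `deepfake_score` stands in for whatever score a
# commercial detector returns (0.0 = likely authentic, 1.0 = likely synthetic);
# no real vendor API is shown here.
RISK_THRESHOLD = 0.7  # assumed threshold -- tune per tool and false-positive tolerance

def review_call_recording(deepfake_score: float, payment_requested: bool) -> str:
    """Advisory triage: detection flags a call for review, it never approves a payment."""
    if deepfake_score >= RISK_THRESHOLD:
        return "Treat call as suspect: freeze any related payment and alert security."
    if payment_requested:
        # Even a 'clean' score is not proof of authenticity -- fall back to
        # deterministic account validation and out-of-band confirmation.
        return "Proceed only after independent account validation and callback."
    return "No payment instruction on call: no action required."

print(review_call_recording(deepfake_score=0.82, payment_requested=True))
```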

Conclusion

The $25 million deepfake scam in Hong Kong highlights how advanced AI-driven fraud can manipulate trust and authority within financial operations.

As deepfake technology continues to evolve, voice and video verification can no longer be relied upon, rendering traditional methods insufficient. To address this, companies must adopt technology-based solutions that verify bank account details without depending on phone or video callbacks.

Additionally, measures such as employee training and the implementation of voice and video authentication systems can help mitigate vulnerabilities associated with deepfake AI.

Q&A: The Growing Threat of Deepfake AI in B2B Paymentshttps://nsknox.net/blog/qa-the-growing-threat-of-deepfake-ai-in-b2b-payments/ https://nsknox.net/blog/qa-the-growing-threat-of-deepfake-ai-in-b2b-payments/#respond Sun, 19 Jan 2025 18:00:25 +0000 https://nsknox.net/?p=21108In today's rapidly evolving digital landscape, the rise of deepfake AI technology has introduced a new and significant threat to B2B payment security

The post Q&A: The Growing Threat of Deepfake AI in B2B Payments appeared first on nsKnox.

]]>

In today’s rapidly evolving digital landscape, the rise of deepfake AI technology has introduced a new and significant threat to B2B payment security.

With cybercriminals using artificial intelligence to create highly convincing fake videos, audio, and images, businesses are at risk of falling victim to sophisticated fraud attempts.

These attacks can easily bypass traditional security measures, leading to unauthorized transactions, stolen data, financial losses, and reputational damage. In this interview with nsKnox’s CISO, Yaron Libman, we explore the dangers posed by deepfake AI, its impact on B2B payments, and how companies can safeguard themselves against this emerging threat.

Q. What are Deepfakes, and how do they work?

A. Deepfakes are highly realistic but fabricated digital content created using artificial intelligence (AI) and machine learning (ML). By training algorithms on massive amounts of real data, such as voice recordings, videos, and images, deepfake technology can generate media that closely mimics actual individuals. This means a person’s voice, facial expressions, and videos can be convincingly faked. These deepfakes can impersonate company executives, stakeholders, or clients to manipulate financial transactions or obtain sensitive data.

Q. What is the future of deepfakes in the context of B2B payment security?

A. Deepfake AI attacks are effective because humans have a natural tendency to trust the data they receive. However, companies must verify and validate this data, regardless of its source. As deepfake technology advances and becomes more accessible, these attacks will likely increase in frequency and become more challenging to detect. The future of payment security will likely rely on more sophisticated, technology-driven tools capable of identifying deepfakes in real time. Additionally, closer collaboration between private payment protection companies and security firms, together with regular audits and employee training, will be critical in developing systems and frameworks that can stay ahead of this evolving threat.

Q. Why are Deepfake scams such a significant threat to B2B payment security, and how are they used in fraud?

A. Deepfake AI poses a serious risk to B2B payment security by enabling cybercriminals to impersonate trusted individuals or entities with alarming accuracy. Criminals can create deepfake videos or audio messages of executives—such as CEOs, CFOs, or Treasurers—directing employees to authorize fraudulent transactions or redirect payments. Without proper verification protocols, these impersonations can lead to substantial financial losses, data breaches, and reputational damage.

Deepfakes can be deployed in multiple ways to manipulate payment systems.
For example:

  • Executive Fraud: Scammers create deepfake content mimicking a senior executive to pressure employees into transferring funds to fraudulent accounts.
  • Invoice Manipulation: Attackers use deepfake-generated voices or signatures to alter payment details or submit fake invoices, tricking finance teams into processing unauthorized payments.
  • Social Engineering Attacks: Cybercriminals craft realistic deepfake communications from clients or vendors requesting sensitive information or urgent payments.

These attacks are especially dangerous because they are designed to bypass traditional security measures like email verification or invoice checks. Even well-trained staff can be deceived, highlighting the critical need for multi-factor authentication, independent technology-based account validation, and robust employee training to mitigate these risks.
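
As a rough, assumption-laden illustration of that layering (every field and function name below is hypothetical, not a description of any specific product), a payment instruction received over a video call or email would only clear once several independent controls agree:

```python
# Rough illustration of layering controls; every check name here is hypothetical.
from dataclasses import dataclass

@dataclass
class PaymentInstruction:
    origin_channel: str          # "video_call", "email", "erp_workflow", ...
    account_is_verified: bool    # deterministic validation against bank-sourced data
    confirmed_out_of_band: bool  # callback via an independently sourced contact
    within_approval_policy: bool # amount and approver match standing policy

def approve(instruction: PaymentInstruction) -> bool:
    """Approve only when independent controls agree; the requesting channel
    itself (even a convincing video call) carries no authority."""
    checks = (
        instruction.account_is_verified,
        instruction.confirmed_out_of_band,
        instruction.within_approval_policy,
    )
    return all(checks)

urgent_call = PaymentInstruction("video_call", account_is_verified=False,
                                 confirmed_out_of_band=False, within_approval_policy=False)
print(approve(urgent_call))  # False -- urgency on a call is not a control
```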

Q. Is there a specific case that serves as a wake-up call for businesses about deepfake fraud?

A. Consider this scenario: you’re on a video call with your company’s CFO. They appear exactly as you’ve always known them—the same voice, mannerisms, and familiar expressions you’ve observed over years of meetings. They urgently request a multi-million-dollar transfer to a vendor’s account for a time-sensitive deal. Would you question it?

This is exactly how the $25 million deepfake scam in Hong Kong unfolded. Attackers used advanced AI to create a highly realistic video of the CFO and conducted a live video call with an unsuspecting employee, completely mimicking the CFO’s appearance and behavior. By leveraging trust, authority, and a sense of urgency, the scammers persuaded the employee to bypass standard verification procedures and approve the transfer. By the time the fraud was discovered, the funds were long gone.

What makes this case so alarming is how authentic the deception was—these are not basic scams but sophisticated and carefully planned schemes. The distinction between real and fake is increasingly blurred, leaving companies at risk. Without advanced, technology-based protections—such as deterministic account validation and deepfake fraud detection tools—any business could fall victim. In today’s landscape, even seeing and hearing can no longer be trusted.

Q. How can companies stay ahead of deepfake-related risks?

A. Staying ahead of deepfake attacks requires continuous investment in technological, automated security solutions, regular employee training, and raising awareness of and alertness to unexpected or unusual payment requests.

Businesses should also conduct regular security audits of their Master Vendor Files (MVF) and ERP systems, stay informed about emerging AI threats, and implement proactive measures such as automated fraud detection technologies. By adopting strong security protocols and keeping abreast of technological advancements, companies can stay one step ahead of cybercriminals and protect their financial systems from fraud, including deepfake attacks.
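
As one hedged example of what such an MVF audit might check (the record layout below is invented purely for illustration), a simple rule can flag vendors whose bank details changed recently and have not yet been re-verified:

```python
# Minimal sketch of one MVF audit rule, on an assumed (hypothetical) record
# layout: flag vendors whose bank details changed recently and have not yet
# been re-verified, since swapped bank details are a classic fraud footprint.
from datetime import date, timedelta

vendors = [  # hypothetical master vendor file extract
    {"vendor_id": "V-100", "bank_details_changed": date(2024, 5, 2), "reverified": True},
    {"vendor_id": "V-200", "bank_details_changed": date(2024, 8, 1), "reverified": False},
]

def flag_for_audit(records, today: date, window_days: int = 90):
    """Return vendor IDs with recent, not-yet-reverified bank-detail changes."""
    cutoff = today - timedelta(days=window_days)
    return [r["vendor_id"]
            for r in records
            if r["bank_details_changed"] >= cutoff and not r["reverified"]]

print(flag_for_audit(vendors, today=date(2024, 8, 15)))  # ['V-200']
```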

Q. How does nsKnox’s solution protect companies from deepfake attacks?

A. The nsKnox PaymentKnox™ solution protects companies from deepfake attacks by offering a multi-layered approach that continuously verifies account ownership and protects payments throughout the transaction journey. It focuses on deterministically verifying account data based on details from the banking system instead of trusting or trying to confirm the source.

nsKnox ensures that every transaction adheres to a verified payment process, making it extremely difficult for fraudsters to carry out deepfake impersonations or manipulations. This robust, multi-layered verification system offers companies a powerful defense against evolving threats like deepfakes, safeguarding payment security and maintaining trust across the entire B2B transaction process.

Deepfake AI presents a severe and growing threat to B2B payment security. Businesses must adopt robust safeguards and remain vigilant to mitigate the risks posed by this emerging technology. While the threat landscape evolves, so must our defenses against it.

The post Q&A: The Growing Threat of Deepfake AI in B2B Payments appeared first on nsKnox.

]]>
https://nsknox.net/blog/qa-the-growing-threat-of-deepfake-ai-in-b2b-payments/feed/ 0
Babcock Power Sets New Standards with Advanced B2B Payment Security

Implements nsKnox PaymentKnox™ Platform for Global Treasury & AP to Prevent B2B Payment Fraud

Brunswick’s advanced B2B Payment Protection method for Global Treasury & AP

The nsKnox PaymentKnox™ platform offers Brunswick worldwide validation of vendor bank accounts, eliminating fraud throughout the entire payment process.

Treasury Dragons Webinar


B2B Payments fraud is a rapidly evolving challenge for corporations of every size – and treasury teams have a crucial role in preventing it.
That’s why nsKnox and Treasury Dragons have joined forces for this captivating masterclass to examine the scale of the problem and look at ways of dealing with it.

In this 45-minute webinar, we will:

  • Reveal the results of exclusive research by Treasury Dragons and launch the Payments Fraud Index
  • Explain current approaches to managing payment fraud
  • Examine the awareness of various payment fraud types and the treasury’s role in preventing them
  • Zoom in on the emerging threat from AI and Deepfakes
  • Show how new technologies can eliminate payment fraud

This event is for treasury, shared services, AP, and finance teams who want to increase their understanding of the threat and develop a strategy to harden their organization against attacks from fraudsters.

We’ll hear from:

  • Mike Hewitt, CEO of Treasury Dragons, will talk us through the data from our exclusive research project.
  • Nithai Barzam, President & COO of fintech-security company nsKnox, will offer advice on securing incoming and outgoing B2B payments against the growing menace of fraud.

If this content interests you – click HERE to read our full Payment Fraud Report

Is Artificial Intelligence Changing the Fraud Landscape?


We proudly present the 2024 Payment Fraud Report, in partnership with Treasury Dragons, featuring the Payment Fraud Index.
This report unveils the profound impact of artificial intelligence (AI) and Deepfake technologies on the payment fraud landscape.

The survey reveals that over 80% of corporations are increasingly concerned about the threat of fraud, with 67% experiencing payment fraud attacks.

Furthermore, 55% of respondents acknowledge that the rise of AI is significantly heightening the risk of payment fraud in their organizations.

There is also encouraging news: this year’s survey shows that Treasury departments are stepping up to their leading role in combating payment fraud, with an increase in the adoption of automated systems to counter these threats.

Fill out the form to download this invaluable asset and stay informed on the latest trends in payment fraud.

If this content interests you – click HERE to watch our Payment Fraud Masterclass

Triple Defense: Safeguard Client Payments with Three Layers of Anti-Fraud Technology

Sophisticated crime rings are increasingly targeting corporate AP/AR departments, causing a surge in payment fraud that concerns financial institutions. In 2022, 84% of large companies were impacted by payment fraud, with some incidents resulting in losses exceeding $100 million. These losses expose banks to significant monetary and reputational risks.

Despite heavy investments in cybersecurity, banks find their corporate clients remain vulnerable. Traditional protocols, such as bank letters, are outdated in the face of advanced fraud tactics like generative AI, call spoofing, and Deepfake attacks.

How can banks and credit unions protect both their corporate customers and themselves? The answer lies in adopting multiple layers of anti-fraud technology.

Download the white paper, “Triple Defense: Safeguard Client Payments with Three Layers of Anti-Fraud Technology,” to learn about the latest approaches to combating payment fraud and enhancing client security.

Click HERE to watch our video interview

Banks Can Do More to Protect Corporate Payments Against Rising Fraud

As payment fraud rises, 86% of large companies with revenue over $1 billion have been impacted, and banks are under pressure to protect their corporate clients.
Despite heavy investment in fraud mitigation, financial institutions find their corporate clients remain vulnerable due to inadequate defenses.
Traditional bank letters, easily forged in the age of Deepfake AI, are no longer sufficient for verifying approved beneficiaries. Modern, technology-based anti-fraud solutions are essential for banks and credit unions to fortify their defenses.
In this video, you’ll gain insights on:

  • The weakest links in existing payment protocols
  • The inadequacies of AML and KYC regulations against global crime rings
  • Risks to banks from corporate clients’ weak anti-fraud protections
  • Enhancing security beyond signatory rights
  • Technology-based solutions that also offer new revenue opportunities

Watch this interview with Nithai Barzam of nsKnox and American Banker to learn how banks can better protect their corporate clients from payment fraud and enhance their security measures.

Watch Now

Building and Maintaining Trust in Supply Chains


Trust is at the heart of effective buyer and supplier ecosystems. 

But as supply chains become more complex, electronic payment more prevalent and fraud more sophisticated, how do companies protect and maintain that trust? 

Join Eli Soffer, VP of Finance at global software and services company Amdocs, alongside Vincent Beerman from working capital management platform provider Taulia, and Nithai Barzam on behalf of nsKnox, to discuss how trust can be maintained in the face of these challenges.

Secure Your Finances: Watch Our Webinar on Combating Deepfake AI Fraud


In an era where artificial intelligence can mirror reality, are you equipped to tackle the emerging threat of deepfakes in the B2B landscape?

With deepfake AI technologies becoming an accessible tool to fraudsters, protecting your organization has never been more critical.

Click here to watch the recording of our insightful webinar featuring Debra R. Richardson and our President & COO, Nithai Barzam.
This session will empower you with expert insights and strategies for safeguarding your organization against AI-generated deceptions using proactive solutions.

What You’ll Discover:

  • Recognize the Threat: Learn how cybercriminals exploit deepfake AI technology to manipulate business communications and redirect payments
  • Spot the Signs: Identify the telltale signs of deepfake AI images, audio, and video
  • 5-Step Defense Strategy: Equip yourself with a robust plan to shield your business from email compromise
  • The Role of Automation: Understand why leveraging automated tools is essential in securing B2B payments
