Malicious bots are skewing analytics by using AI to mimic human behavior convincingly, defeating traditional defenses. A better approach is needed.
Across the Asia Pacific region (APAC), businesses are investing significant resources into hyper-personalization and seamless digital experiences, yet these investments are undermined by automated bot traffic that corrupts data accuracy.
Independent analyses from content delivery networks and security reports estimate that bots accounted for 40–50% of web traffic in 2024–2025: roughly 15% are good bots performing tasks such as indexing, while 30–37% are malicious bots scraping or disrupting sites.
Consequences include skewed analytics and degraded user experiences. This is no longer just an IT security challenge. In markets across APAC, where digital adoption rates are among the highest in the world, bot interference is eroding the very customer trust that brands spend millions to build.
Concert scams and fan loyalty erosion
High-profile concert ticket debacles have made headlines across the region.
Bots distort the digital marketplace by scraping pricing data, hoarding inventory, and manipulating user interactions. They mimic human behaviors such as scrolling, clicking, and hesitating, poisoning the analytics that teams rely on to optimize campaigns.
Legacy security systems may even flag genuine fans as suspicious, leading to account cancellations and public complaints, further alienating the very customers that brands are trying to serve.
When bots crowd out real people, the anger does not land on the bot operator; it lands on the brand.
Why traditional defenses are failing
What has changed today? Modern bots no longer simply follow scripts — they learn autonomously:
- Attackers are leveraging AI to create bots that can analyze defenses, adjust their behavior in real time, and convincingly mimic human actions.
- Traditional countermeasures, built around static rules and signature-based detection, are not just insufficient; they are also inadvertently training these bots to become more sophisticated.
- Each interaction helps malicious operators identify and circumvent security parameters more effectively.
- This creates a vicious cycle. As defenses grow more complex, they impose greater friction on legitimate users while simultaneously providing a richer learning environment for attackers.
The result is a lose-lose scenario: genuine customers face a degraded experience while bots become harder to stop.
A strategic shift: the economic deterrent
To address this, defenses should target the attackers' low-cost, high-volume business model. Breaking that economic equation can neutralize the threat.
This involves a digital Proof of Work requirement that introduces a computational puzzle into interactions. For a legitimate human user's device, solving one puzzle is invisible and effortless. For a bot attempting high-volume interactions, however, each puzzle consumes computing power and time.
That burden compounds at scale: an operator running thousands of bots must pay the computational cost thousands of times over. When the cost of the computing power required to solve these puzzles exceeds the potential value of the scalped ticket or scraped data, the attack becomes unsustainable.
Over time, this can make large-scale automation less attractive to bot operators.
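The mechanism can be illustrated with a minimal hashcash-style sketch. This is not any specific vendor's implementation; the function names, difficulty level, and nonce encoding are illustrative assumptions. The key asymmetry is that the client must try many hashes to find a valid nonce, while the server verifies with a single hash:

```python
import hashlib
import os

def solve(challenge: bytes, difficulty: int) -> int:
    """Find a nonce so that SHA-256(challenge || nonce) begins with
    `difficulty` zero hex digits. Expected work grows ~16x per digit."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(challenge + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

def verify(challenge: bytes, nonce: int, difficulty: int) -> bool:
    """Verification costs one hash: negligible for the server."""
    digest = hashlib.sha256(challenge + str(nonce).encode()).hexdigest()
    return digest.startswith("0" * difficulty)

challenge = os.urandom(16)   # a fresh challenge issued per request
nonce = solve(challenge, 4)  # tens of thousands of hashes: trivial once,
                             # expensive a million times over
assert verify(challenge, nonce, 4)
```

Because each challenge is random and single-use, a bot cannot precompute or reuse solutions; every additional request forces fresh work.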
Reclaiming digital integrity
The goal is simple: make the cost of launching an attack outweigh the potential rewards, so that continuing becomes economically irrational for the operator.
By implementing measures that force bots to burn their own resources, organizations can push attackers to move on to softer targets.
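The economics can be made concrete with a back-of-the-envelope calculation. Every figure below is hypothetical, chosen only to show how the break-even comparison works, not drawn from any real attack or pricing data:

```python
# Hypothetical figures for illustration only.
requests_needed = 1_000_000   # bot requests to secure one scarce item
seconds_per_puzzle = 10.0     # CPU time the proof-of-work forces per request
cost_per_cpu_hour = 0.10      # assumed cloud compute price, USD

cpu_hours = requests_needed * seconds_per_puzzle / 3600
attack_cost = cpu_hours * cost_per_cpu_hour
resale_profit = 150.0         # assumed margin on one scalped ticket

print(f"attack cost: ${attack_cost:.2f} vs reward: ${resale_profit:.2f}")
print("unsustainable" if attack_cost > resale_profit else "still profitable")
```

Under these assumptions the compute bill (about $278) exceeds the payoff, so the operator loses money on every attempt; tuning puzzle difficulty is what moves an attack across that break-even line.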
Critically, this approach preserves the customer experience. Whether the inventory in question is a concert ticket, a limited-edition fashion drop, or a flash sale, it remains accessible to genuine human consumers without the friction of CAPTCHAs, waiting rooms, or other mechanisms that punish the wrong people.
In the digital economy, trust is the foundation. Protecting it means ensuring that attackers, not customers, bear the cost of malicious activity.