Cost Harvesting (AML.T0034)

Tactic: Impact
Maturity: Feasible
Reference: atlas.mitre.org/techniques/AML.T0034

Description

Adversaries may deliberately drive a victim’s AI services beyond normal operating capacity with the intent of increasing the cost of services. This may be achieved via high-volume, low-complexity queries (Excessive Queries) or low-volume, high-complexity queries (Resource-Intensive Queries). In Generative AI or Agentic AI systems, adversarial prompts may be introduced into the model’s context to drive up resource usage (Agentic Resource Consumption).
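To see why both query patterns matter, consider a toy cost model for a pay-per-use API. The prices and numbers below are hypothetical illustrations, not any real provider's billing, but they show how a flood of cheap calls and a handful of expensive ones both translate into victim-side spend:

```python
# Toy cost model for a pay-per-use LLM API.
# All prices are hypothetical, for illustration only.
PRICE_PER_1K_TOKENS = 0.002  # assumed combined token price (USD)
PRICE_PER_REQUEST = 0.0001   # assumed fixed per-call overhead (USD)

def query_cost(tokens: int) -> float:
    """Estimated cost of one API call processing `tokens` tokens."""
    return PRICE_PER_REQUEST + tokens / 1000 * PRICE_PER_1K_TOKENS

def campaign_cost(n_requests: int, tokens_per_request: int) -> float:
    """Total estimated victim-side cost of an attack campaign."""
    return n_requests * query_cost(tokens_per_request)

# Excessive Queries: high volume, low complexity.
flood = campaign_cost(n_requests=100_000, tokens_per_request=50)

# Resource-Intensive Queries: low volume, high complexity.
heavy = campaign_cost(n_requests=500, tokens_per_request=100_000)

print(f"flood: ${flood:.2f}")
print(f"heavy: ${heavy:.2f}")
```

Either pattern inflates the bill; they differ mainly in how visible they are to volume-based rate limiting, which is why complexity-aware metering matters.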

Unlike Resource Hijacking, where adversaries leverage AI resources such as compute, memory, or storage for their own purposes, Cost Harvesting applies resource-centric pressure to a service with the ultimate goal of causing financial harm to the victim.

Cost Harvesting is especially relevant for cloud-hosted, pay-per-use AI/ML platforms (e.g., LLM APIs, generative image services, vision-language pipelines). By manipulating request volume or request complexity, an attacker can:

  • Inflate the victim’s compute or storage consumption, leading to higher operational costs.
  • Trigger autoscaling mechanisms that provision additional resources, further amplifying cost and exposure.
  • Saturate internal queues or GPU/TPU pipelines, causing latency spikes, request throttling, or outright service unavailability for legitimate users.
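A common defensive counterpart to the effects above is to meter estimated per-client spend rather than raw request counts. The sketch below is illustrative only: the class name, budget, and window are hypothetical values, not a prescribed implementation.

```python
import time
from collections import defaultdict

# Hypothetical guard that caps estimated per-client spend.
# Thresholds below are illustrative, not recommendations.
BUDGET_USD = 1.00      # max estimated spend per client per window
WINDOW_SECONDS = 3600  # rolling budget window

class CostGuard:
    """Tracks estimated per-client spend inside a rolling window
    and rejects callers whose next request would exceed the budget."""

    def __init__(self):
        # client_id -> list of (timestamp, estimated_cost_usd)
        self._spend = defaultdict(list)

    def allow(self, client_id: str, estimated_cost_usd: float, now=None) -> bool:
        now = time.time() if now is None else now
        cutoff = now - WINDOW_SECONDS
        # Drop charges that have aged out of the window.
        recent = [(t, c) for t, c in self._spend[client_id] if t >= cutoff]
        self._spend[client_id] = recent
        if sum(c for _, c in recent) + estimated_cost_usd > BUDGET_USD:
            return False  # over budget: throttle or reject the call
        recent.append((now, estimated_cost_usd))
        return True
```

Because the check uses estimated cost per call, it catches both Excessive Queries (many small charges) and Resource-Intensive Queries (a few large ones), which a plain request-rate limiter would treat very differently.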

Sub-techniques

  • Excessive Queries
  • Resource-Intensive Queries
  • Agentic Resource Consumption

How GTK Cyber trains on this

GTK Cyber's hands-on AI security courses cover adversarial-AI techniques across the MITRE ATLAS framework, including the Impact tactic this technique falls under. Our practitioner-led training is taught by Charles Givre and other field-tested SMEs and focuses on real adversarial scenarios, not slide decks.

View AI security courses →

Train your team on real adversarial-AI attacks.

GTK Cyber's AI red teaming courses are taught by practitioners who break models for a living.

View AI Security Courses