AugeLab Linux API
Deploy your .pmod projects directly to Linux environments. Our specialized API provides hardware-accelerated inference, Python bindings, and containerized deployment options for seamless industrial integration.







Edge Devices & Embedded Deployments
Ship computer vision models to ARM and Linux edge devices with a lightweight runtime, predictable performance, and automation-friendly I/O.
Native Performance on Any Hardware
Python 3.12 Core
Modern Python support with the ultra-fast 'uv' package manager for clean and reliable environments.
CUDA v12.8 Accelerated
Optimized OpenCV-GPU modules and CUDA kernels for maximum throughput on NVIDIA Jetson and GPU servers, as illustrated in the example below.
Docker Ready
Fully containerized deployment workflows with pre-built Ubuntu-based CUDA images.
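To give a feel for the GPU path mentioned above, here is a minimal sketch using OpenCV's CUDA bindings. It assumes an OpenCV build compiled with CUDA modules (as on a Jetson image) and a local test image named frame.png; neither is provided by the API itself.

import cv2

# Assumes an OpenCV build with CUDA modules enabled (e.g. a Jetson image)
frame = cv2.imread("frame.png")

# Upload the frame to GPU memory, resize it on the device, then download the result
gpu_frame = cv2.cuda_GpuMat()
gpu_frame.upload(frame)
gpu_resized = cv2.cuda.resize(gpu_frame, (640, 480))
resized = gpu_resized.download()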
Seamless Implementation
Use our Scenario API to integrate vision tasks into your custom applications.
from studio import StudioScenario

# Initialize scenario with your key
scenario = StudioScenario(verification_code="YOUR_KEY")
scenario.load_scenario("inspection.pmod")

# Poll the scenario until it reports a result
while True:
    results = scenario.run(())
    if results[0]:
        break

# Clean up resources
scenario.cleanup()
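For longer-running deployments, it can help to guarantee that cleanup() runs even if an iteration fails. The wrapper below is a hypothetical convenience, not part of the shipped API; it only combines the calls already shown above.

import contextlib
from studio import StudioScenario

@contextlib.contextmanager
def open_scenario(path, key):
    # Hypothetical helper: wraps load/run/cleanup so resources are always released
    scenario = StudioScenario(verification_code=key)
    scenario.load_scenario(path)
    try:
        yield scenario
    finally:
        scenario.cleanup()

# Poll the loaded scenario until it reports a positive result
with open_scenario("inspection.pmod", "YOUR_KEY") as scenario:
    while True:
        results = scenario.run(())
        if results[0]:
            break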
Enterprise Deployment
On-Premise
Full control over your data and infrastructure with local deployments.
Kubernetes
Scale across your infrastructure with orchestrated containers.
Edge API
Lightweight binary execution for dedicated edge sensors.