LINUX RUNTIME

AugeLab Linux API

Deploy your .pmod projects directly on Linux. Our specialized API provides hardware-accelerated inference, Python bindings, and containerized deployment options for seamless industrial integration.

$ uv pip install studio --extra-index-url https://pyrepo.augelab.com --extra-index-url https://download.pytorch.org/whl/cpu --index-strategy unsafe-best-match
$ python my_project.py
# Starting AugeLab Runtime...
[INFO] Using CUDA 12.8
[INFO] Model loaded successfully
[PROCESS] Running inference in 12 ms

Edge Devices & Embedded Deployments

Ship computer vision models to ARM and Linux edge devices with a lightweight runtime, predictable performance, and automation-friendly IO.

Raspberry Pi (ARM64)
Headless deployments for lineside monitoring, kiosks, and compact inspection stations.
NVIDIA Jetson (CUDA)
High-throughput inference on embedded GPUs for real-time industrial automation.
Industrial PCs & Gateways (x86_64)
Run models near the camera with on-prem reliability and easy integration to PLC/SCADA.
Containers & Orchestration
Docker/Kubernetes-friendly runtime to standardize installs across fleets.
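One way to picture the "automation-friendly IO" mentioned above is a headless loop that emits one JSON line per frame, so PLC/SCADA bridges, gateways, or log collectors can consume results as a stream. This is a minimal sketch only: `run_inference` is a hypothetical stand-in for a real model call (e.g. a loaded .pmod scenario), not part of the AugeLab API.

```python
import json
import sys


def run_inference(frame):
    # Hypothetical stand-in for a real model call (e.g. a loaded
    # .pmod scenario); here we flag bright frames as "defect".
    return {"defect": frame.get("intensity", 0) > 200}


def process_frames(frames, out=sys.stdout):
    """Emit one JSON line per frame so downstream automation
    (PLC bridges, gateways, log collectors) can tail the stream."""
    emitted = []
    for i, frame in enumerate(frames):
        record = {"frame": i, **run_inference(frame)}
        line = json.dumps(record)
        print(line, file=out)
        emitted.append(line)
    return emitted


if __name__ == "__main__":
    sample = [{"intensity": 40}, {"intensity": 230}]
    process_frames(sample)
```

Line-delimited JSON keeps the edge device decoupled from whatever consumes the results, which is what makes the same loop portable across Raspberry Pi, Jetson, and industrial-PC deployments.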

Native Performance on Any Hardware

Python 3.12 Core

Modern Python support with ultra-fast 'uv' package management for clean and reliable environments.

CUDA v12.8 Accelerated

Optimized OpenCV-GPU modules and CUDA kernels for maximum throughput on NVIDIA Jetson and GPU servers.

Docker Ready

Fully containerized deployment workflows with pre-built Ubuntu-based CUDA images.

Seamless Implementation

Use our Scenario API to integrate vision tasks into your custom applications.

from studio import StudioScenario

# Initialize the scenario runner with your verification key
scenario = StudioScenario(verification_code="YOUR_KEY")

# Load a .pmod project exported from AugeLab Studio
scenario.load_scenario("inspection.pmod")

# Run inference in a loop; inputs are passed as a tuple (empty here)
while True:
    results = scenario.run(())
    if results[0]:  # stop once the first output signals completion
        break

# Release model and device resources
scenario.cleanup()
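To guarantee that cleanup() runs even if the inference loop raises, the scenario can be wrapped in a small context manager. This is a sketch, not part of the studio package; it only assumes the load_scenario()/cleanup() methods shown in the example above.

```python
from contextlib import contextmanager


@contextmanager
def managed_scenario(scenario, pmod_path):
    """Load a .pmod scenario and guarantee cleanup() on exit.

    `scenario` is any object exposing load_scenario() and cleanup(),
    e.g. a StudioScenario instance as in the example above (assumed API).
    """
    scenario.load_scenario(pmod_path)
    try:
        yield scenario
    finally:
        # Runs on normal exit, break, or exception
        scenario.cleanup()


# Usage (assuming the StudioScenario API shown above):
# with managed_scenario(StudioScenario(verification_code="YOUR_KEY"),
#                       "inspection.pmod") as s:
#     while not s.run(())[0]:
#         pass
```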

Enterprise Deployment

On-Premise

Full control over your data and infrastructure with local deployments.

Kubernetes

Scale across your infrastructure with orchestrated containers.

Edge API

Lightweight binary execution for dedicated edge sensors.