PyRun Features

PyRun's core features are designed for simplicity, scalability, and performance on your cloud infrastructure.

Effortless Execution

Run standard Python code seamlessly in the cloud. PyRun handles server management, scaling, and optimization automatically on your AWS account.

Integrated & Automated

Enjoy a VS Code-like web IDE, automated runtime builds (from `environment.yml` or a `Dockerfile`), and integrated configuration for tools like Lithops, all of which simplify setup.

Scalable & Versatile

Leverage first-class support for Lithops (FaaS) and Dask. Easily scale from simple scripts to massively parallel computations on AWS. (Support for Ray and Cube is coming soon!)

Real-Time Monitoring

Gain instant insights with detailed metrics for CPU, memory, disk, network usage, and task timelines. Understand and optimize workloads effectively.

Cost-Effective

Pay only for the AWS resources you use. PyRun's automation reduces infrastructure overhead and optimizes resource utilization, saving you time and money.

Data Cockpit

Easily select, partition, and manage data from S3 or public registries using the integrated DataPlug library for streamlined parallel processing with Lithops.

Seamless Setup

Your Cloud IDE: Launch, Code, Run 🚀

PyRun provides a VS Code-like web interface with a file browser, editor, and terminal. Get started instantly with pre-configured Dask or Lithops templates.

Customize your runtime effortlessly by editing `environment.yml` or adding a `Dockerfile`. PyRun automatically detects changes and rebuilds the environment, letting you focus solely on your code.
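For instance, a runtime can be described with a conda-style `environment.yml`; the snippet below is only an illustrative sketch, with placeholder package names rather than anything PyRun requires:

```yaml
# Illustrative environment.yml for a PyRun workspace (packages are examples only)
name: pyrun-workspace
channels:
  - conda-forge
dependencies:
  - python=3.11
  - numpy
  - pandas
  - pip
  - pip:
      - lithops   # serverless execution framework with first-class PyRun support
```

Edit this file and PyRun detects the change and rebuilds the environment automatically.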

Explore Workspaces
[Image: PyRun integrated workspace showing the code editor and one-click run]
[Image: PyRun workflow: code, run, monitor]
Simplified Workflow

Code, Run, Monitor, Repeat 🔄

PyRun streamlines your cloud development cycle. Focus on your Python code within the integrated workspace. Define dependencies easily with `environment.yml` or a `Dockerfile`.

Click 'Run' to execute on your AWS account, leveraging Lithops or Dask. Monitor progress in real-time, analyze results, and iterate quickly. It's cloud computing made simple.
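As a concrete illustration, a typical job run this way is a standard Lithops map. The sketch below uses the public Lithops API; it assumes PyRun has already configured the executor backend for your AWS account, as the workspace templates do:

```python
# Minimal Lithops sketch: fan a function out over inputs on serverless workers.
# Assumes the Lithops backend is already configured by the PyRun workspace.
import lithops

def double(x):
    return x * 2

fexec = lithops.FunctionExecutor()  # picks up the workspace configuration
fexec.map(double, range(10))        # one serverless task per input
print(fexec.get_result())           # [0, 2, 4, ..., 18]
```

PyRun's monitoring views then show the resulting task timeline and resource usage for the run.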

See the Basic Workflow
Data Handling Made Easy

Introducing Data Cockpit ✨

PyRun's Data Cockpit simplifies preparing data for distributed cloud jobs. Select datasets from your S3 buckets or public registries, or upload them directly.

Intelligently partition your data using the integrated DataPlug library, optimize batch sizes with benchmarking, and generate metadata ready for parallel processing frameworks like Lithops. Spend less time on data prep, more on analysis.
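As a rough sketch of how this fits together, assume Data Cockpit hands your code a list of partition descriptors (the dictionary layout below is an assumption for illustration, not Data Cockpit's actual output format); each partition then becomes one Lithops task:

```python
# Hedged sketch: feed partitioned data to Lithops, one task per partition.
# `data_slices` stands in for the partition metadata produced by Data Cockpit;
# its exact shape (bucket / key / byte range) is assumed here for illustration.
import lithops

data_slices = [
    {"bucket": "my-bucket", "key": "dataset.csv", "byte_range": (0, 1_000_000)},
    {"bucket": "my-bucket", "key": "dataset.csv", "byte_range": (1_000_000, 2_000_000)},
]

def process_slice(slice_info):
    # Real code would read the given byte range from S3 and process it.
    return f"processed {slice_info['key']} bytes {slice_info['byte_range']}"

fexec = lithops.FunctionExecutor()
fexec.map(process_slice, data_slices)  # one serverless task per partition
print(fexec.get_result())
```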

Learn About Data Cockpit
[Image: PyRun Data Cockpit showing data selection and partitioning]

Ready to Accelerate Your Python Workloads?

Experience PyRun's features firsthand. Sign up and connect your AWS account in minutes to run scalable Python jobs effortlessly.

Get Started for Free