Run TimeGPT in a distributed manner on top of Ray

Ray is an open-source unified compute framework for scaling Python workloads. In this guide, we will explain how to use TimeGPT on top of Ray.

Outline:

  1. Installation

  2. Load Your Data

  3. Initialize Ray

  4. Use TimeGPT on Ray

  5. Shut Down Ray

1. Installation

Install Ray through Fugue. Fugue provides an easy-to-use interface for distributed computing that lets users execute Python code on top of several distributed computing frameworks, including Ray.
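To give a feel for the interface, here is a toy sketch of ours (not part of the TimeGPT workflow): Fugue's transform applies an ordinary pandas function on the engine of your choice.

import pandas as pd
import fugue.api as fa

def double(df: pd.DataFrame) -> pd.DataFrame:
    # An ordinary pandas function; Fugue applies it per partition.
    df['y'] = df['y'] * 2
    return df

toy = pd.DataFrame({'y': [1.0, 2.0, 3.0]})
# engine=None runs locally; engine="ray" runs the same code on a Ray cluster.
result = fa.transform(toy, double, schema="*", engine=None)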

Note

You can install Fugue with Ray support using pip:

pip install "fugue[ray]"

The quotes prevent some shells (for example, zsh) from interpreting the square brackets.

If executing on a distributed Ray cluster, ensure that the nixtla library is installed across all the workers.
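If your cluster supports Ray runtime environments, one way to make sure the dependencies are present is to declare them when connecting; Ray then installs them on each worker. A minimal sketch, assuming a hypothetical cluster address:

import ray

# Sketch only: "ray://<head-node>:10001" is a placeholder;
# replace it with your cluster's actual address.
# The packages listed under runtime_env are installed on every worker.
ray.init(
    address="ray://<head-node>:10001",
    runtime_env={"pip": ["nixtla", "fugue[ray]"]},
)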

2. Load Your Data

You can load your data as a pandas DataFrame. In this tutorial, we will use a dataset that contains hourly electricity prices from different markets.

import pandas as pd

# Load hourly electricity prices from several markets.
# The dataset has three columns: unique_id (market), ds (timestamp), y (price).
df = pd.read_csv(
    'https://raw.githubusercontent.com/Nixtla/transfer-learning-time-series/main/datasets/electricity-short.csv',
    parse_dates=['ds'],
)
df.head()
  unique_id                  ds      y
0        BE 2016-10-22 00:00:00  70.00
1        BE 2016-10-22 01:00:00  37.10
2        BE 2016-10-22 02:00:00  37.10
3        BE 2016-10-22 03:00:00  44.75
4        BE 2016-10-22 04:00:00  37.10
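As a quick sanity check, you can confirm the long format that TimeGPT expects (one row per series per timestamp) and count the series; this sketch uses only the columns shown above:

# Each series is identified by unique_id; ds holds the timestamp, y the value.
print(df['unique_id'].nunique())                          # number of distinct markets
print(df.groupby('unique_id')['ds'].agg(['min', 'max']))  # time span per series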

3. Initialize Ray

Initialize Ray and convert the pandas DataFrame to a Ray Dataset. Here we spin up a small local cluster for demonstration; in a real deployment you would connect to an existing cluster instead.

import ray
from ray.cluster_utils import Cluster

# Start a local Ray cluster with a head node that has 2 CPUs.
ray_cluster = Cluster(
    initialize_head=True,
    head_node_args={"num_cpus": 2},
)

# Connect to the cluster we just created.
ray.init(address=ray_cluster.address, ignore_reinit_error=True)
2024-05-10 11:09:17,240 WARNING cluster_utils.py:157 -- Ray cluster mode is currently experimental and untested on Windows. If you are using it and running into issues please file a report at https://github.com/ray-project/ray/issues.
2024-05-10 11:09:19,076 INFO worker.py:1564 -- Connecting to existing Ray cluster at address: 127.0.0.1:63694...
2024-05-10 11:09:19,092 INFO worker.py:1740 -- Connected to Ray cluster. View the dashboard at 127.0.0.1:8265
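
With the cluster running, the pandas DataFrame can be turned into a Ray Dataset. A minimal sketch using ray.data.from_pandas (the variable name ray_df is our choice, not required by the API):

import ray.data

# Convert the pandas DataFrame into a Ray Dataset so the
# forecasting calls can be distributed across the workers.
ray_df = ray.data.from_pandas(df)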