Replies: 2 comments 1 reply
I have taken a step back, reviewed the entire architecture, and aligned this perfectly with your existing … Here is the exact, production-ready code for … 1. Create …
I had a change forced on me, but MinIO has to be on the laptop, because Lightning overwrites the services when it restarts with a new VM.
…use Tailscale for everything.
In fact, dropping Cloudflare Tunnels for your internal node-to-node communication is the best architectural decision you can make.
Cloudflare Tunnels are great for exposing a website to the public internet, but they are HTTP-first and add overhead. Tailscale creates a true, encrypted peer-to-peer Virtual Private Cloud (VPC). Your laptop and Lightning AI will act exactly as if they are plugged into the same network switch, communicating over their private `100.x.x.x` IP addresses. Here is the v0.11.0 Architecture Diagram using 100% Tailscale.
🗺️ v0.11.0 Architecture Diagram (Tailscale-Only)
Note: In v0.11.0 (Phase A), Lightning AI is ONLY acting as your Data Lake. The compute (JobWorker) hasn't moved to Lightning yet—that happens in v0.12.0 (Phase B).
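In text form, the topology looks like this (roles and ports are from this thread; the tailnet IPs are illustrative placeholders):

```
 Laptop (100.x.x.A)                        Lightning AI (100.x.x.B)
 +----------------------+                 +------------------------+
 |  API server (boto3)  |    Tailscale    |  MinIO on :9000        |
 |  JobWorker (OpenCV)  | <=============> |  (Data Lake only,      |
 +----------------------+   (WireGuard)   |   until v0.12.0)       |
                                          +------------------------+
```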
🔄 The v0.11.0 Data Flow (Step-by-Step)
When you upload a video in v0.11.0, here is exactly what happens over the Tailscale network:
1. Your laptop's server, using `boto3`, streams the video directly over Tailscale to Lightning AI (`http://100.x.x.B:9000/forgesyte-jobs/video/input/job-123.mp4`).
2. The job is created with `status=pending`.
3. The `JobWorker` running in the background on your laptop sees the pending job.
4. The `JobWorker` downloads the MP4 from Lightning AI (via Tailscale) to a local `/tmp/` file so OpenCV can read it.
5. The `JobWorker` uploads the resulting `output.json` back to MinIO on Lightning AI over Tailscale.

🛠️ How to configure this right now
1. On Lightning AI:
Install Tailscale, log in, and start MinIO:
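A sketch of those commands (the install one-liner is Tailscale's official script; the MinIO data directory and credentials are assumptions, so change them):

```shell
# Install and bring up Tailscale (prints a login URL on first run)
curl -fsSL https://tailscale.com/install.sh | sh
sudo tailscale up

# Start MinIO as the data lake; data dir and credentials are placeholders
export MINIO_ROOT_USER=admin
export MINIO_ROOT_PASSWORD=change-me
minio server ~/minio-data --address ":9000" &
```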
Run `tailscale ip -4` to get Lightning's IP (e.g., `100.100.100.5`).

2. On Your Laptop (`server/.env`):

Update your environment variables to point
`boto3` to the Lightning Tailscale IP.

Why this makes v0.12.0 (Ray) incredibly easy
By doing this now, your data gravity is already established on the GPU node.
When we move to Phase B (v0.12.0) and move the
`JobWorker` to Lightning AI using Ray, the Ray worker will fetch the video from `http://localhost:9000` (since it's on the same machine as MinIO). The download speed will be near-instantaneous (disk speed), completely bypassing the network.

Are you ready for the `s3_storage.py` and `factory.py` code to lock this in?