Introduction to Private Serverless Models
As mentioned earlier, each fal app runs in an isolated environment that is discarded right after each request (unless `keep_alive` is set). For certain use cases, however, you may need to persist results after the run is over. In such scenarios, you can use the `/data` volume, which is mounted on each machine and shared across all of the apps you are running at any point in time; it is linked to your fal account.
```py {2,11,21,26}
import fal
from pathlib import Path

DATA_DIR = Path("/data/mnist")


class FalModel(
    fal.App,
    requirements=["torch>=2.0.0", "torchvision"],
):
    machine_type = "GPU"

    @fal.endpoint("/")
    def text_to_image(self, input: Input) -> Output:
        import torch
        from torchvision import datasets

        already_present = DATA_DIR.exists()

        if already_present:
            print("Test data is already downloaded, skipping download!")

        test_data = datasets.FashionMNIST(
            root=DATA_DIR,
            train=False,
            download=not already_present,
        )
        ...
```
When you invoke this app for the first time, you will notice that torchvision downloads the test dataset to `/data/mnist`. Subsequent invocations, even ones that land on a fresh machine (i.e. not covered by a previous invocation's `keep_alive`), will skip the download and proceed directly to your logic.
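
The same volume works in the other direction as well: you can write results out so they survive the machine being recycled. The snippet below is a minimal sketch of that pattern, not part of the fal API itself: the `RESULTS_DIR` path, the `PersistingApp` class, the `/save` route, the `SaveInput`/`SaveOutput` models, and the `"CPU"` machine type are all illustrative choices, and it assumes pydantic models are used for endpoint inputs and outputs as in the earlier examples.

```py
import json
from pathlib import Path

import fal
from pydantic import BaseModel

# Hypothetical location on the shared volume; any path under /data works.
RESULTS_DIR = Path("/data/my-app/results")


class SaveInput(BaseModel):
    run_id: str
    payload: dict


class SaveOutput(BaseModel):
    stored_at: str
    total_runs: int


class PersistingApp(fal.App):
    machine_type = "CPU"

    @fal.endpoint("/save")
    def save(self, input: SaveInput) -> SaveOutput:
        # Files written under /data outlive this machine, so they are visible
        # to every future request, including ones served by other machines.
        RESULTS_DIR.mkdir(parents=True, exist_ok=True)
        target = RESULTS_DIR / f"{input.run_id}.json"
        target.write_text(json.dumps(input.payload))

        # Count everything persisted so far, including runs from earlier machines.
        total = len(list(RESULTS_DIR.glob("*.json")))
        return SaveOutput(stored_at=str(target), total_runs=total)
```

Since the volume is shared across all of your apps, it is a good idea to namespace your paths (as with `/data/my-app/results` above) so different apps do not overwrite each other's files.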