Getting Your Model into Production: A Technical Guide to Slate

You've built a model. It works. Now what?

This isn't about building models or picking algorithms. This is about the mechanics of getting a working model into Epic's Nebula platform so it can actually drive outcomes. Specifically, it's about Slate, the Docker-based environment that gets you from a pickle file to production.

What is Slate?

Slate is a Docker image built on Ubuntu with Python pre-installed. It contains Epic's tools for deploying models to Nebula and mirrors the exact runtime environment your model will run in. Think of it as: if it works in Slate, it'll work in production.

The Technical Setup

Slate uses a specific directory structure:

/home/eccp/workspace/
└── YourModelProject/
    ├── source/
    │   └── model_code.py         # Your prediction logic
    ├── resources/
    │   ├── train_data.tsv        # Sample data for testing
    │   ├── ondemand.json         # Formatted request payload
    │   └── your_model.pkl        # Serialized model
    └── pip_packages/             # Project-specific dependencies

The dsutils toolkit handles everything:

  • dsutils new-project - Creates the project structure
  • dsutils install <package> - Adds Python packages to pip_packages/
  • dsutils make-ondemand-payload - Converts test data to Nebula's JSON format
  • dsutils ondemand - Runs your model locally with test data
  • dsutils archive - Packages everything into a .zip for Hyperspace upload

The Workflow

1. Set up the container

docker run -it -v /path/to/your/project:/home/eccp/workspace slate-image

Volume mounting connects your local files to the container. You'll need this.

2. Write your model code

Your model_code.py needs a predict() function:

from epic import parcel
import pickle

def predict(data):
    # Input features must match the PAF configuration exactly,
    # including the raw columns consumed by feature engineering below
    ordered_columns = ['age', 'lab_value', 'medication_dose',
                       'systolic_bp', 'diastolic_bp']

    # Create parcel and extract DataFrame from Epic Chronicles data
    df = parcel.from_ondemand(ordered_columns)

    # Preprocess input - feature engineering happens here
    df['mean_arterial_pressure'] = (df['systolic_bp'] + 2 * df['diastolic_bp']) / 3
    df['calculated_feature'] = df['lab_value'] * df['medication_dose']

    # Load trained model
    with open('resources/your_model.pkl', 'rb') as f:
        model = pickle.load(f)

    # Predict using trained model
    predictions = model.predict(df)

    # Prepare output for Nebula
    return parcel.pack_ondemand_response(predictions)

Feature engineering happens here, not in SQL. Keep transformations with your model code.
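One way to honor that rule is to put the transformations in a single function that both your training script and predict() call. This is a minimal sketch, not Epic code; the column names are illustrative stand-ins, not a real PAF configuration:

```python
import pandas as pd

def engineer_features(df: pd.DataFrame) -> pd.DataFrame:
    """Derive model features from raw input columns.

    Calling this one function at training time and again inside
    predict() keeps the feature logic in exactly one place.
    """
    out = df.copy()
    out["mean_arterial_pressure"] = (out["systolic_bp"] + 2 * out["diastolic_bp"]) / 3
    out["calculated_feature"] = out["lab_value"] * out["medication_dose"]
    return out

# Example: one synthetic patient row
raw = pd.DataFrame({
    "systolic_bp": [120.0],
    "diastolic_bp": [80.0],
    "lab_value": [1.5],
    "medication_dose": [10.0],
})
features = engineer_features(raw)
# mean_arterial_pressure = (120 + 2 * 80) / 3 ≈ 93.33
```

If train-time and serve-time features are computed by different code paths, skew between them is one of the most common silent failure modes in deployment.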

3. Test locally

Put sample data in resources/train_data.tsv, then:

dsutils make-ondemand-payload

PYTHONPATH=pip_packages dsutils ondemand

That PYTHONPATH variable is critical if you've installed custom package versions: it ensures your packages load first, ahead of the base runtime's versions.
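The mechanism is plain sys.path ordering: whichever directory appears first supplies the module. Here's a self-contained sketch (temp directories stand in for Slate's runtime and your pip_packages/; `demopkg` is a made-up module name):

```python
import importlib
import pathlib
import sys
import tempfile

base = pathlib.Path(tempfile.mkdtemp())
runtime = base / "runtime_site"   # stands in for Slate's pre-installed packages
local = base / "pip_packages"     # stands in for your pinned project packages
for directory, version in [(runtime, "1.0"), (local, "2.16.1")]:
    directory.mkdir()
    (directory / "demopkg.py").write_text(f"__version__ = '{version}'\n")

# Only the runtime directory on the path: the old version loads.
sys.path.append(str(runtime))
import demopkg
first = demopkg.__version__

# Prepending the project directory (what PYTHONPATH=pip_packages does)
# makes the pinned version win when the module is resolved again.
sys.path.insert(0, str(local))
demopkg = importlib.reload(demopkg)
second = demopkg.__version__
print(first, second)  # 1.0 2.16.1
```

Forgetting the prefix means your local test silently exercises the runtime's package versions rather than the ones you pinned, so a passing test proves nothing about your dependencies.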

4. Package and deploy

dsutils archive

This creates archive/YourOrg.ModelName.1.0.0.zip. Hand that to IT for Hyperspace upload.
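Conceptually, the archive is just the project tree bundled into a versioned zip. The exact contents dsutils produces aren't documented here, but this stdlib sketch shows the idea (the layout and archive name mirror the article's examples; treat the details as illustrative):

```python
import pathlib
import tempfile
import zipfile

# Build a throwaway project tree matching the article's layout.
project = pathlib.Path(tempfile.mkdtemp()) / "YourModelProject"
for subdir, filename in [("source", "model_code.py"),
                         ("resources", "your_model.pkl"),
                         ("pip_packages", ".keep")]:
    (project / subdir).mkdir(parents=True)
    (project / subdir / filename).write_text("")

# Bundle everything into a versioned zip, preserving relative paths.
archive = project.parent / "YourOrg.ModelName.1.0.0.zip"
with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
    for path in sorted(project.rglob("*")):
        if path.is_file():
            zf.write(path, path.relative_to(project.parent))

with zipfile.ZipFile(archive) as zf:
    names = zf.namelist()  # e.g. YourModelProject/source/model_code.py
```

The point of the relative-path bookkeeping is that the archive unpacks to the same structure Slate tested against, so "works in Slate" still holds after upload.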

 

Things That Will Trip You Up

Package versions: Slate versions (DSRT 4, DSRT 5) come with pre-installed packages. If you need different versions, use dsutils install tensorflow==2.16.1 and remember the PYTHONPATH=pip_packages prefix when testing.

Permissions: Docker volume mounting can cause permission errors. Quick fix for testing: chmod 777 the problem files inside the container. For production, work with IT on proper permissions.

PAF alignment: The ordered_columns list you pass to parcel.from_ondemand() must exactly match the Epic PAF records IT configures. Column mismatches can fail silently until deployment.
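A cheap defense is to validate the payload's columns at the top of predict() so a mismatch fails loudly on the first local test instead of silently in production. This helper is illustrative, not part of dsutils or the epic package:

```python
import pandas as pd

def check_columns(df: pd.DataFrame, ordered_columns: list[str]) -> pd.DataFrame:
    """Raise immediately if expected PAF columns are missing,
    and return the DataFrame restricted to the expected order."""
    missing = [c for c in ordered_columns if c not in df.columns]
    if missing:
        raise ValueError(f"Payload is missing PAF columns: {missing}")
    # Reindexing also enforces the column order the model was trained on.
    return df[ordered_columns]

# Example: a payload with columns in the wrong order still comes out right.
payload = pd.DataFrame({"lab_value": [1.5], "age": [40], "medication_dose": [10.0]})
checked = check_columns(payload, ["age", "lab_value", "medication_dose"])
```

Column *order* matters as much as presence: many serialized models map features positionally, so a reordered DataFrame scores garbage without erroring.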

The Point

There's no model to tune here. But for any model to drive actual outcomes, its scores have to surface inside the clinical workflow. Slate is how you get there with Epic.

Your deliverable is a working .zip file that passes local testing. Everything after that (Hyperspace upload, Nebula deployment) happens on the IT side.


#MachineLearning #Healthcare #Epic #MLOps #DataScience
