▶ Watch it directly on YouTube

- Logistic Regression churn prediction
- FastAPI REST endpoint
- OpenAI summarisation for human-readable explanations
- Install requirements:

```shell
pip install -r requirements.txt
```

- Run locally:

```shell
uvicorn app:app --reload
```

`app:app` loads the FastAPI app from `app.py`. `--reload` enables auto-reload for development (useful for code changes, not for production).
- Test with:

```shell
curl -X POST http://localhost:8000/predict -H "Content-Type: application/json" -d '{
  "age": 45.0,
  "tenure": 24.0,
  "monthly_charges": 79.85,
  "total_charges": 1800.0,
  "contract_type": "Month-to-month",
  "payment_method": "Electronic check"
}'
```

Set your OpenAI API key:

```shell
export OPENAI_API_KEY="your_actual_openai_api_key"
```

```
Python_GML_ML_Pipeline/
├── app.py               # FastAPI application
├── requirements.txt     # Python dependencies
├── logistic_model.pkl   # Trained ML model (placeholder)
├── scaler.pkl           # Feature scaler (placeholder)
├── Dockerfile           # Container configuration
├── .gitignore           # Git ignore rules
└── README.md            # This file
```
- Build the Docker image:

```shell
docker build -t python-gml-ml-pipeline .
```

- Run the Docker container:

```shell
docker run -p 8000:8000 -e OPENAI_API_KEY="your_actual_openai_api_key" python-gml-ml-pipeline
```

- Access the FastAPI app:
  - API: http://localhost:8000
  - Health Check: http://localhost:8000/health
  - Interactive Docs: http://localhost:8000/docs
  - OpenAPI Schema: http://localhost:8000/openapi.json
Execution Flow:
- Load trained model and scaler (using `joblib`).
- API endpoint receives JSON data (new user or input data).
- DataFrame creation & scaling for consistency with training.
- Model predicts churn probability (or other target).
- Returns JSON response with prediction for integration into apps or dashboards.
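The flow above can be sketched end to end. This is a self-contained illustration, not the project's actual code: it trains a tiny stand-in model on made-up numeric features (named after the curl example), exports it with `joblib.dump`, then replays what a `/predict` request does — reload, build a DataFrame, scale, and score:

```python
import joblib
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

# Stand-in training data: [age, tenure, monthly_charges, total_charges]
X = np.array([[45.0, 24.0, 79.85, 1800.0],
              [30.0,  2.0, 95.00,  190.0],
              [60.0, 60.0, 20.00, 1200.0],
              [25.0,  1.0, 99.00,   99.0]])
y = np.array([0, 1, 0, 1])  # 1 = churned

scaler = StandardScaler().fit(X)
model = LogisticRegression().fit(scaler.transform(X), y)

# Export artifacts under the same names the project tree uses
joblib.dump(model, "logistic_model.pkl")
joblib.dump(scaler, "scaler.pkl")

# --- What a /predict request does ---
model = joblib.load("logistic_model.pkl")
scaler = joblib.load("scaler.pkl")

payload = {"age": 45.0, "tenure": 24.0,
           "monthly_charges": 79.85, "total_charges": 1800.0}
df = pd.DataFrame([payload])                     # DataFrame creation
scaled = scaler.transform(df.values)             # scaling, consistent with training
prob = float(model.predict_proba(scaled)[0, 1])  # churn probability
print({"churn_probability": round(prob, 4)})
```

The scaler must be fitted once at training time and only applied (never refitted) at request time, otherwise the features the model sees drift from what it was trained on.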
Run locally:

```shell
uvicorn app:app --reload
```

Run in Docker:

```shell
docker build -t python-gml-ml-pipeline .
docker run -p 8000:8000 -e OPENAI_API_KEY="your_actual_openai_api_key" python-gml-ml-pipeline
```

Key reasons to use FastAPI:
- Modern async Python framework
- Automatic OpenAPI schema & Swagger docs
- Production-grade performance
- Can be integrated into microservices/SaaS
Made with ❤️ by Pierre-Henry Soria. A super passionate & enthusiastic Problem-Solver / Senior Software Engineer. Also a true cheese 🧀, ristretto ☕, and dark chocolate lover! 🍫
- `logistic_model.pkl` and `scaler.pkl` are placeholders. Train and export your own models using `joblib.dump`.
- This project is a modern, production-ready ML pipeline, showcasing deployment and explainability best practices for 2025 and beyond.

"AI models become valuable when they're deployable, explainable, and integrated into real products that create business value."
