Stage 4 Solutions
Consulting, Marketing, Technology Services

Senior Financial Data Analyst - Jupyter/Python & Cloud Models (Remote)

United States · Remote · Contract · Posted 1 day ago · Visa sponsorship available

We are looking for a Senior Financial Data Analyst for a large, global B2B high-tech company. In this role, you will lead the end-to-end conversion and automation of our Public Cloud Cost & Usage Long Range Planning (LRP) model. You will migrate the model into a maintainable, production-grade Jupyter Notebooks environment — complete with automated source-system pipelines, versioned model snapshots, and a Monte Carlo simulation engine for probabilistic forecasting.
This is a 5.5-month contract (extensions likely), 40 hours per week, remote role in the US.
This is a W2 position with Stage 4 Solutions. Health benefits and a 401K are offered.
Responsibilities
Excel LRP Model Analysis & Migration

  • Conduct a full structural audit of the existing Excel LRP model: document every driver, input assumption, formula chain, interdependency, and output metric.
  • Map the driver hierarchy — identifying primary cost drivers (e.g., workload growth, instance mix, unit price per region, and SKUs per region for each hyperscaler) and their upstream input sources.
  • Re-engineer the entire model in Python within a modular Jupyter Notebooks architecture, preserving exact calculation fidelity while eliminating manual steps.
  • Validate migrated outputs against Excel results line-by-line before decommissioning the spreadsheet workflow.
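As a sketch of what that line-by-line validation could look like, the comparison below flags any line item where the migrated output drifts from the Excel baseline beyond a tolerance. The region index and `spend` column are illustrative stand-ins, not the model's actual schema:

```python
import pandas as pd

def validate_outputs(excel_df, model_df, tol=1e-6):
    """Return the rows where migrated model output differs from the Excel baseline."""
    # Align on index and columns so each line item is compared like-for-like
    excel_aligned, model_aligned = excel_df.align(model_df, join="outer")
    diff = (excel_aligned - model_aligned).abs()
    return diff[diff.gt(tol).any(axis=1)]

# Illustrative data standing in for exported Excel outputs and migrated results
excel = pd.DataFrame({"spend": [100.0, 250.0]}, index=["us-east-1", "eu-west-1"])
model = pd.DataFrame({"spend": [100.0, 250.5]}, index=["us-east-1", "eu-west-1"])
report = validate_outputs(excel, model)
```

An empty `report` would be the signal that the spreadsheet workflow is safe to decommission; any surviving rows point to the exact line items still diverging.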

Source System Pipeline Engineering

  • Design and build automated data ingestion pipelines for all model input feeds, including:
      • Cost & Usage APIs — feeds from the homegrown FinOps Public Cloud Control Tower, which hosts actual cost and usage for AWS, Azure, and GCP
      • Migration demand — Sales input on renewal activities that trigger migration to public cloud, plus the relevant workload forecasts
      • Rate card feeds — net effective cost for each hyperscaler region per SKU
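A minimal normalisation step for one such feed might look like the following. The field names (`provider`, `region`, `sku`, `usage_month`, `cost_usd`) are assumptions for illustration, not the Control Tower's actual schema:

```python
import pandas as pd

def normalize_cost_usage(raw_records):
    """Flatten raw Cost & Usage records into a tidy frame keyed by
    provider, region, SKU, and usage month (field names illustrative)."""
    df = pd.DataFrame(raw_records)
    # Collapse daily line items to the monthly grain the LRP model consumes
    df["usage_month"] = pd.to_datetime(df["usage_month"]).dt.to_period("M")
    return (df.groupby(["provider", "region", "sku", "usage_month"],
                       as_index=False)["cost_usd"].sum())

# Illustrative records standing in for an API response
sample = [
    {"provider": "aws", "region": "us-east-1", "sku": "m5.xlarge",
     "usage_month": "2024-01-15", "cost_usd": 1200.0},
    {"provider": "aws", "region": "us-east-1", "sku": "m5.xlarge",
     "usage_month": "2024-01-20", "cost_usd": 300.0},
]
tidy = normalize_cost_usage(sample)
```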

Driver-Based Calculation Engine

  • Build a parameterised calculation engine that accepts driver assumptions as structured inputs (YAML / JSON config or a dedicated assumptions notebook) and propagates them through the model deterministically.
  • Produce standard LRP output tables: monthly/quarterly/annual spend by each hyperscaler region and by Purpose.
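The shape of such an engine — structured driver assumptions in, deterministic output tables out — can be sketched with stdlib JSON alone. The driver names, regions, and growth formula below are illustrative assumptions, not the actual model:

```python
import json

# Driver assumptions as a structured config (YAML would work the same way)
CONFIG = json.loads("""
{
  "regions": {
    "us-east-1": {"workload_units": 1000, "unit_price": 0.12},
    "eu-west-1": {"workload_units": 400,  "unit_price": 0.15}
  },
  "monthly_growth": 0.02,
  "horizon_months": 3
}
""")

def run_model(config):
    """Propagate driver assumptions deterministically: spend compounds
    monthly from workload_units * unit_price."""
    rows = []
    for region, drivers in config["regions"].items():
        base = drivers["workload_units"] * drivers["unit_price"]
        for month in range(1, config["horizon_months"] + 1):
            rows.append({
                "region": region,
                "month": month,
                "spend": round(base * (1 + config["monthly_growth"]) ** (month - 1), 2),
            })
    return rows

table = run_model(CONFIG)
```

Because every assumption lives in the config rather than in cell formulas, the same config file doubles as the audit trail for a run.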

Model Versioning & Variance Analysis Framework

  • Implement a versioning system that saves a complete, immutable snapshot of every model run — capturing inputs, assumptions, driver values, and full output tables — with a timestamp and named version tag.
  • Build a variance analysis module that can compare any two saved versions and decompose the difference into contributions from named drivers.

Monte Carlo Simulation Engine

  • Design and implement a Monte Carlo simulation layer that treats key model drivers as probability distributions rather than point estimates.
  • Calibrate distributions (normal, log-normal, triangular, uniform, or empirical) against historical actuals and forward-looking business intelligence.
  • Run simulations (target: 10,000+ iterations) to produce probabilistic output distributions for total cloud spend and major cost categories.
  • Generate P10 / P50 / P90 confidence interval outputs for executive scenario planning and risk-adjusted budget setting.
  • Build sensitivity/tornado analysis to rank drivers by their contribution to forecast variance.
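The core of such a simulation layer fits in a few lines of NumPy. The drivers, distribution parameters, and spend formula below are hypothetical; the P10/P50/P90 cut and the correlation-based tornado ranking follow the pattern described above:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # simulation iterations

# Hypothetical drivers sampled as distributions rather than point estimates
workload_growth = rng.normal(loc=0.05, scale=0.02, size=N)        # normal
unit_price = rng.lognormal(mean=np.log(0.12), sigma=0.1, size=N)  # log-normal
discount = rng.triangular(left=0.05, mode=0.10, right=0.20, size=N)

base_units = 1_000_000
spend = base_units * (1 + workload_growth) * unit_price * (1 - discount)

# Confidence intervals for risk-adjusted budget setting
p10, p50, p90 = np.percentile(spend, [10, 50, 90])

# Tornado ranking: |correlation| of each driver's samples with total spend
drivers = {"workload_growth": workload_growth,
           "unit_price": unit_price,
           "discount": discount}
tornado = sorted(((name, abs(np.corrcoef(x, spend)[0, 1]))
                  for name, x in drivers.items()),
                 key=lambda t: -t[1])
```

Fixing the generator seed makes runs reproducible, which matters once simulation outputs feed the versioned snapshot and variance framework.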

Reporting, Visualisation & Stakeholder Enablement

  • Build interactive output dashboards within notebooks covering: waterfall variance bridges, scenario fan charts, driver sensitivity charts, and spend-by-category treemaps.
  • Automate recurring model refresh workflows so Finance team members can re-run the full model with a single command.

Requirements

  • 5+ years in financial modelling, FP&A, or cloud economics roles.
  • 3+ years of hands-on Python / Jupyter financial modelling (not data science alone).
  • Demonstrable Git proficiency; experience with code review in a team modelling context is preferred.
  • Experience with Jupyter Notebooks financial modelling — mandatory
  • Experience with Excel LRP / driver-based modelling — mandatory
  • Experience with source system pipeline engineering
  • Experience with model versioning and variance analysis
  • Bachelor's degree in Finance, Economics, Mathematics, Computer Science, or a quantitative discipline. CFA, CIMA, CPA, or AFP/FP&A certification is a plus.

Please submit your resume to our network at https://www.stage4solutions.com/careers/ (apply to the Senior Financial Data Analyst - Jupyter/Python & Cloud Models (Remote) role).
Please feel free to forward this job post to others you think may be interested.
Stage 4 Solutions is an equal-opportunity employer.
We celebrate diversity and are committed to providing employees with an inclusive environment that is free of discrimination and harassment. All employment decisions are based on the job requirements and candidates’ qualifications, without regard to race, color, religion/belief, national origin, gender identity, age, disability, marital status, genetic information or other applicable legally protected characteristics.
Compensation: $103/hr – $107.14/hr on W2
