
Analytics Platform Engineer

Oregon, United States · Hybrid · Full Time · $90,000–$140,000/yr · Posted 1 month ago


Analytics Platform Engineer 🚀 at Chief Detective LLC

Location:
Oregon-based remote-hybrid (preferred; occasional in-person work). Candidates in California, Colorado, and Washington will also be considered.

Salary Range:
$90,000–$140,000 (DOE)

Other Requirements:
Applicants must be currently authorized to work in the United States on a full-time basis; this position is not eligible for visa sponsorship (now or in the future).

OVERVIEW

Chief Detective is a growth-focused agency and deep brand partner working with D2C brands (notably in the apparel and beauty spaces).

We are hiring an Analytics Platform Engineer to own the systems that power accurate, fast, and trusted performance reporting across our internal team and client work. This role sits at the intersection of data engineering (pipelines and warehouses), analytics engineering (dbt modeling and data quality), and light DevOps/platform ownership (automation, reliability, and security).

Our core tech stack relies on BigQuery, GCP, dbt Cloud, and Looker Studio, with ClickUp used for project management.

THE ROLE

You will own our "data product" layer end-to-end, from ingestion and modeling to reporting and reliability.

Key Responsibilities:

- Pipeline Management: Build and maintain reliable batch and scheduled data pipelines into BigQuery.
- dbt Cloud Ownership: Manage production workflows, including scheduled/cron-based runs and CI-style validation for safe changes (tests, controlled deploys).
- Data Modeling: Design and maintain dbt models (from staging to marts), ensuring thorough documentation and testing.
- Reporting & Optimization: Build and support reporting in Looker Studio; tune performance and apply cost-aware modeling (recognizing that Looker Studio queries drive BigQuery usage).
- Reliability & Monitoring: Implement monitoring and alerting for data freshness and pipeline failures.
- GCP Automation: Build lightweight automation and integrations in GCP (e.g., Cloud Functions) when necessary.
- Security & Access: Manage access patterns and operational safety in GCP, including IAM, service accounts, and secret management.
- Stakeholder Collaboration: Work closely with our teams and leadership to define and maintain trusted KPI definitions and analytics deliverables for clients.
- Project Management: Operate within ClickUp, maintaining clear scopes, milestones, and written updates.

Requirements:

- Strong SQL proficiency; hands-on BigQuery experience strongly preferred.
- AI Tool Fluency: Comfortable leveraging AI-assisted coding and agentic tools (such as Cursor, Codex, or Claude) to accelerate development, automate boilerplate, and debug efficiently.
- Proven experience running dbt in production (building models, writing tests, and managing job scheduling in dbt Cloud).
- Working knowledge of Google Cloud (GCP) fundamentals, including projects, permissions, and service accounts.
- Comfort shipping small automation services (often in Python) and interacting with various APIs.
- Exceptional debugging skills: the ability to trace a "dashboard is wrong" ticket all the way back through models and pipelines.
- Familiarity with Git and standard pull request (PR) workflows.
- Strong communication and documentation habits (maintaining runbooks, metric definitions, and system notes).

Preferred Qualifications:

- Degree in Computer Science, Information Systems, Mathematics, a related quantitative field, or equivalent practical experience.
- Relevant industry certifications (e.g., Google Cloud Professional Data Engineer).
- Experience with marketing or ecommerce analytics data (ad platforms, Shopify, and GA-style event data).
- Experience with data observability patterns (freshness SLAs, anomaly detection, and alert tuning).
- Infrastructure-as-code experience (e.g., Terraform) or CI/CD pipeline ownership.
- Familiarity with GCP workflow tools (Cloud Scheduler/Workflows, Pub/Sub).

BENEFITS

- Competitive Salary: $90k–$140k, commensurate with experience.
- Comprehensive Benefits: Group medical, dental, and vision coverage; short- and long-term disability; life insurance; 401(k) eligibility after one year of service, with matching contributions; paid time off; sick leave; an Employee Assistance Program; and more.
- Professional Growth: Opportunity to shape our analytics future, access ongoing professional development, and scale your career quickly.
- Innovative Culture: Work with a passionate, entrepreneurial team dedicated to building game-changing analytics solutions in a flexible, supportive environment.
- Job Location: While this is a remote position (outside of occasional requests to work in person), you ideally will *live* and *work* in the state of Oregon; strong resumes from California, Colorado, and Washington will also be considered.

If you’re ready to leverage data and innovation to fuel explosive growth and love the idea of working closely with smart, driven colleagues, we can’t wait to meet you.

Learn more about Chief Detective at www.chiefdetective.com!
