Senior Data Engineer
Presidential Innovation Fellows
Data Science
United States
USD 130k / year
Posted on Aug 7, 2025
Job Summary
We are looking for a highly versatile, hands-on Senior Data Engineer to lead ingestion, transformation, infrastructure management, and system resiliency efforts for our growing data platform. The role sits at the intersection of data engineering, infrastructure reliability, and systems integration, and is ideal for someone who thrives in full-stack data environments and wants to own data flow from source to warehouse.
You’ll be responsible for designing robust data pipelines, optimizing Snowflake and AWS performance, and implementing best practices for database reliability, disaster recovery, and automated data quality.
Duties/Responsibilities
- ETL & Integration
  - Design and implement scalable ETL/ELT pipelines to bring external data (APIs, files, logs) into Snowflake
  - Normalize semi-structured data formats (JSON, CSV, CDC files) into well-structured Snowflake schemas
  - Integrate RESTful APIs; manage authentication, pagination, throttling, and payload transformation (see the ingestion sketch after this list)
- Snowflake & Data Warehousing
  - Own Snowflake schema design, warehouse tuning, cost control, and query optimization
  - Design and implement data partitioning strategies to improve performance and reduce cost
  - Manage backup, snapshot, and restore strategies across data warehouse and staging environments
- Infrastructure & Reliability
  - Manage and optimize AWS infrastructure components (S3, RDS, EC2, IAM, CloudWatch)
  - Lead disaster recovery planning, including automated backups, restore testing, and failover strategies
  - Build and maintain archival and historical data restore processes
- Monitoring & Quality
  - Implement monitoring, alerting, and observability across the ETL and warehousing layers
  - Build and maintain automated data quality checks (null checks, referential integrity, drift detection, etc.)
  - Maintain documentation of orchestration processes, dependencies, and runbooks
- Collaboration & Governance
  - Work with cross-functional teams (data analysts, engineers, DevOps, and business stakeholders) to ensure data trust and system reliability
  - Support security, access control, compliance, and auditability in data platforms
  - Manage and support data copy processes between environments (e.g., UAT → PROD, historical → active)
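To give a flavor of the ingestion work above, here is a minimal sketch of pulling from a cursor-paginated REST API with client-side throttling and staging the records as newline-delimited JSON for a Snowflake COPY INTO load. The endpoint URL, bearer-token auth, and the `data`/`next_cursor` response fields are hypothetical, not taken from this posting.

```python
# Minimal sketch: paginated REST API ingestion with throttling.
# The endpoint, auth scheme, and payload field names below are assumptions.
import json
import time

import requests

API_URL = "https://api.example.com/v1/records"  # hypothetical source API
API_TOKEN = "..."                               # injected from a secret store in practice

def fetch_all_records(page_size: int = 500, max_rps: float = 2.0):
    """Yield records across pages, sleeping between calls to respect rate limits."""
    session = requests.Session()
    session.headers["Authorization"] = f"Bearer {API_TOKEN}"
    cursor = None
    while True:
        params = {"limit": page_size}
        if cursor:
            params["cursor"] = cursor
        resp = session.get(API_URL, params=params, timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        yield from payload["data"]              # assumed response envelope
        cursor = payload.get("next_cursor")
        if not cursor:                          # last page reached
            break
        time.sleep(1.0 / max_rps)               # crude client-side throttling

def write_ndjson(path: str) -> None:
    """Stage records as newline-delimited JSON, a format Snowflake's
    COPY INTO can load directly from an internal or external stage."""
    with open(path, "w", encoding="utf-8") as f:
        for record in fetch_all_records():
            f.write(json.dumps(record) + "\n")

if __name__ == "__main__":
    write_ndjson("records.ndjson")
```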
Required Skills/Abilities
- Strong SQL and Python skills for data transformation, scripting, and automation
- Expert-level experience with Snowflake, including schema design, warehouse tuning, and performance optimization
- Deep understanding of API integration patterns and experience with tools such as Postman or Swagger
- Familiarity with ingestion frameworks such as dbt, Fivetran, or custom ETL tools
- Experience with orchestration tools like Airflow or Kestra
- Solid knowledge of AWS services including S3, EC2, RDS, CloudWatch, and IAM
- Ability to interpret and troubleshoot database performance reports such as AWR or query execution plans
- Understanding of database internals, indexing, partitioning strategies, and configuration tuning
- Experience implementing disaster recovery, database backup/restore, and data archival procedures
- Ability to design and maintain automated data quality checks (see the sketch after this list)
- Strong documentation and collaboration skills
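To illustrate the automated quality checks listed above, here is a minimal sketch using the snowflake-connector-python package. The connection parameters, warehouse/database names, and the ORDERS/CUSTOMERS tables in the check queries are illustrative assumptions.

```python
# Minimal sketch: SQL-driven data quality checks (null checks and
# referential integrity). Table and connection names are assumptions.
import os

import snowflake.connector

CHECKS = {
    # Each query returns a violation count; 0 means the check passes.
    "orders_customer_id_not_null":
        "SELECT COUNT(*) FROM ORDERS WHERE CUSTOMER_ID IS NULL",
    "orders_customer_fk_integrity":
        """SELECT COUNT(*) FROM ORDERS o
           LEFT JOIN CUSTOMERS c ON o.CUSTOMER_ID = c.CUSTOMER_ID
           WHERE c.CUSTOMER_ID IS NULL""",
}

def run_checks() -> dict:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",   # hypothetical warehouse
        database="PROD",            # hypothetical database/schema
        schema="CORE",
    )
    results = {}
    try:
        cur = conn.cursor()
        for name, sql in CHECKS.items():
            cur.execute(sql)
            results[name] = cur.fetchone()[0]   # violation count
    finally:
        conn.close()
    return results

if __name__ == "__main__":
    failures = {k: v for k, v in run_checks().items() if v > 0}
    if failures:
        # In production this would feed alerting rather than just exiting.
        raise SystemExit(f"Data quality violations: {failures}")
```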
Required Education and Experience
- Bachelor’s degree in Computer Science, Information Systems, Engineering, or a related field
- 5+ years of experience in data engineering, data infrastructure, or similar roles
- Experience delivering and maintaining data pipelines in cloud-based production environments
- Track record of managing Snowflake performance and AWS-hosted systems at scale
- Demonstrated ability to work across data ingestion, transformation, and infrastructure reliability
Preferred Qualifications
- Master’s degree in Computer Science, Information Systems, Engineering, or a related field
- Experience in at least one domain-specific data model such as Insurance, CRM, or Call Center systems
- Familiarity with real-time or event-driven data architectures (see the sketch after this list)
- Exposure to data governance, compliance, and access control frameworks
- Experience with infrastructure-as-code tools like Terraform or CloudFormation
- Hands-on experience with CI/CD practices for data workflows
- Relevant certifications such as SnowPro Core, AWS Certified Data Analytics – Specialty, or Solutions Architect – Associate
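As one hedged illustration of an event-driven architecture, the sketch below polls a hypothetical Kinesis stream named "events" with boto3 and yields micro-batches of decoded events; in production a managed consumer (e.g., the Kinesis Client Library or Snowpipe Streaming) would typically replace this hand-rolled loop.

```python
# Minimal sketch: event-driven micro-batch consumer over a single Kinesis
# shard. The stream name and event payload shape are assumptions.
import json
import time

import boto3

kinesis = boto3.client("kinesis")
STREAM = "events"   # hypothetical stream name

def consume(shard_id: str, batch_size: int = 100):
    """Poll one shard and yield decoded events in small batches."""
    iterator = kinesis.get_shard_iterator(
        StreamName=STREAM,
        ShardId=shard_id,
        ShardIteratorType="TRIM_HORIZON",   # read from the oldest record
    )["ShardIterator"]
    while iterator:
        resp = kinesis.get_records(ShardIterator=iterator, Limit=batch_size)
        batch = [json.loads(r["Data"]) for r in resp["Records"]]
        if batch:
            yield batch
        iterator = resp.get("NextShardIterator")
        time.sleep(1)   # stay under per-shard read limits

if __name__ == "__main__":
    shard = kinesis.describe_stream(StreamName=STREAM)["StreamDescription"]["Shards"][0]
    for events in consume(shard["ShardId"]):
        print(f"received {len(events)} events")  # stage to S3/Snowflake in practice
```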
Physical Requirements
- Prolonged periods of sitting or standing at a desk and working on a computer.
Bamboo is committed to the principles of equal employment. We are committed to complying with all federal, state, and local laws providing equal employment opportunities, and all other employment laws and regulations.