Senior Data Engineer
What We Offer:
- Canteen Subsidy
- Night Shift Allowance (as per company policy)
- Health Insurance
- Tuition Reimbursement
- Work-Life Balance Initiatives
- Rewards & Recognition
Role Overview: We are seeking an experienced Data Engineer with strong expertise in Google BigQuery, Dataform, and modern data pipelines. The ideal candidate will design, develop, and maintain scalable data models, ETL/ELT processes, and analytics-ready datasets to support our reporting, analytics, and business intelligence use cases.
Key Responsibilities
Data Engineering & Pipeline Development
- Design and build scalable, high-performance ELT pipelines using Dataform on BigQuery.
- Develop and optimize SQLX models, incremental pipelines, and data transformations.
- Implement data quality checks, assertions, and validation frameworks within Dataform.
- Manage pipeline orchestration, scheduling, CI/CD, and environment configurations.
- Build and maintain dimensional models (Star/Snowflake schema) for analytics and reporting.
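As an illustration of the kind of Dataform work involved, a minimal SQLX incremental model with built-in assertions might look like the following sketch (dataset, table, and column names are hypothetical):

```sqlx
config {
  type: "incremental",
  schema: "analytics",
  bigquery: {
    partitionBy: "DATE(event_ts)",
    clusterBy: ["customer_id"]
  },
  assertions: {
    nonNull: ["event_id", "customer_id"],
    uniqueKey: ["event_id"]
  }
}

SELECT
  event_id,
  customer_id,
  event_ts,
  amount
FROM ${ref("raw_events")}
-- On incremental runs, process only rows newer than what is already loaded
${when(incremental(), `WHERE event_ts > (SELECT MAX(event_ts) FROM ${self()})`)}
```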
BigQuery Development & Optimization
- Create, optimize, and maintain BigQuery datasets, tables, views, partitions, clusters, and materialized views.
- Optimize query performance, reduce cost, and manage slot utilization.
- Implement best practices for data governance, security, and access control.
- Work with large-scale datasets and ensure efficient processing.
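For context, the partitioning and clustering mentioned above are typically declared when a table is created; a sketch in BigQuery DDL (dataset and column names are illustrative):

```sql
CREATE TABLE IF NOT EXISTS analytics.fact_orders
(
  order_id    STRING NOT NULL,
  customer_id STRING,
  order_ts    TIMESTAMP,
  amount      NUMERIC
)
PARTITION BY DATE(order_ts)  -- limits scanned bytes for date-filtered queries
CLUSTER BY customer_id       -- co-locates rows on a common filter/join key
OPTIONS (
  partition_expiration_days = 730
);
```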
Monitoring & Maintenance
- Monitor pipeline performance, identify bottlenecks, and resolve issues proactively.
- Ensure high data availability, reliability, and accuracy across systems.
Required Skills & Qualifications
Technical Skills
- 5+ years of experience as a Data Engineer or similar role.
- Strong hands-on expertise with Google BigQuery (mandatory).
- Deep knowledge of Dataform (or dbt experience with willingness to adopt Dataform).
- Proficiency in SQL (advanced-level query optimization, analytics functions, modeling).
- Experience with ELT/ETL architecture, data modeling (Star, Snowflake).
- Hands-on with GCP services like Cloud Storage, Cloud Composer, IAM, Cloud Functions (good to have).
- Experience building incremental data pipelines and managing dependencies.
- Familiarity with Git, CI/CD workflows, and version-controlled development.
- Good understanding of data warehousing concepts and best practices.
Soft Skills
- Strong analytical and problem-solving skills.
- Ability to manage multiple tasks in an agile environment.
- Excellent communication and documentation skills.
- Self-driven and able to work with minimal supervision.
To apply for this job email your details to priya.mittal@etechtexas.com