Lead Data Engineer
A Lead Data Engineer (Database Developer) is needed in Santa Monica, CA, United States.
Client: Disney Entertainment and ESPN Product & Technology
Location: Santa Monica, CA
Job Description
The Lead Data Engineer partners with business, analytics, and infrastructure teams to design and build data pipelines for Content performance metrics. Responsibilities include sourcing internal and external data, designing table structures, defining ETL strategies, and implementing automated data quality checks. Additionally, this role mentors junior data engineers in pipeline development.
Responsibilities
- Lead the design and implementation of solutions to complex technical problems.
- Lead and contribute to the design and growth of our Data Products and Data Warehouses around Content movements and metrics.
- Use sophisticated analytical thought to exercise judgement and identify innovative solutions.
- Partner with technical and non-technical colleagues to understand data and reporting requirements.
- Collaborate with Data Product Managers, Data Architects, and other Data Engineers to design, implement, and deliver successful data solutions.
- Design table structures and define ETL pipelines to build performant Data solutions that are reliable and scalable.
- Develop Data Quality checks.
- Develop and maintain ETL routines using orchestration tools such as Airflow (an illustrative sketch follows this list).
- Serve as an advanced resource to other Data Engineers on the team and mentor junior members.
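To give a sense of the Airflow-based ETL and data quality work described above, here is a minimal, purely illustrative sketch. It assumes Airflow 2.4+ with the TaskFlow API; the DAG name, tasks, source, and row schema are hypothetical and not specified in this posting.

```python
# Illustrative only: a minimal daily ETL DAG with a simple data quality check.
# Assumes Airflow 2.4+ (TaskFlow API); all names and schemas are hypothetical.
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def content_metrics_pipeline():
    @task
    def extract():
        # Pull raw content-performance events (source is hypothetical).
        return [{"title_id": 1, "views": 100}]

    @task
    def transform(rows):
        # Shape rows for a warehouse fact table (schema is hypothetical).
        return [{"title_id": r["title_id"], "daily_views": r["views"]} for r in rows]

    @task
    def load(rows):
        # Basic data quality check before loading: refuse empty loads.
        assert rows, "Data quality check failed: no rows to load"
        print(f"Loading {len(rows)} rows into the warehouse")

    load(transform(extract()))

content_metrics_pipeline()
```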
Basic Qualifications
- 7+ years of data engineering experience developing large data pipelines.
- Strong understanding of data modeling principles, including dimensional modeling and data normalization.
- Good understanding of SQL engines, with the ability to conduct advanced performance tuning.
- Ability to think strategically, analyze, and interpret market and consumer information.
- Strong written and verbal communication and presentation skills.
- Excellent conceptual and analytical reasoning competencies.
- Comfortable working in a fast-paced and highly collaborative environment.
- Familiarity with Agile Scrum principles and ceremonies.
Preferred Qualifications
- 4+ years of work experience implementing and reporting on business key performance indicators in data warehousing environments.
- 5+ years of experience using analytic SQL, working with traditional relational databases and/or distributed systems (Snowflake or Redshift).
- 3+ years of experience with programming languages (e.g., Python, PySpark).
- 3+ years of experience with data orchestration/ETL tools (e.g., Airflow, NiFi).
- Experience with Snowflake, Databricks/EMR/Spark & Airflow is a plus.
Required Education
- Bachelor’s Degree in Computer Science, Information Systems, or related field, or equivalent work experience.
- Master’s Degree is a plus.