Omm IT Solutions - Lead Snowflake Data Engineer (New York, Contractor)
A Lead Snowflake Data Engineer is needed in New York, United States.
Client: Omm IT Solutions
Location: New York, NY, US
Employment Type: Contract
Job Description
We are seeking a Lead Snowflake Data Engineer to design, own, and deliver end-to-end data engineering solutions in modern cloud environments. This role focuses on building scalable, high-performance data pipelines using Snowflake and Cortex AI, with full lifecycle ownership—from ingestion and transformation to modeling, optimization, and consumption.
Key Responsibilities
- Lead the design and development of end-to-end ELT pipelines using Snowflake.
- Architect scalable data models optimized for performance, cost, and analytics consumption.
- Build and maintain backend data services using Python and PySpark.
- Leverage Snowflake Cortex AI to enable advanced analytics and intelligent data products.
- Drive performance tuning across pipelines, including query optimization, clustering, and warehouse scaling.
- Enforce best practices in data governance, security, and compliance.
- Collaborate across business, analytics, and engineering teams to deliver high-quality solutions.
- Provide technical leadership and mentorship to engineering teams.
- Communicate architecture decisions and trade-offs effectively in client-facing environments.
Requirements
Required Qualifications
- 10+ years of data engineering experience, or equivalent ownership of production-grade data platforms.
- Deep expertise in:
  - Snowflake (data modeling, performance tuning, optimization).
  - Python and PySpark.
  - Advanced SQL.
- Proven ability to design and deliver end-to-end data pipelines (ingestion, transformation, modeling, consumption) in cloud environments (AWS preferred).
- Ownership of at least one production-grade Snowflake pipeline, end-to-end.
- Strong foundation in modern data warehousing:
  - Dimensional modeling (star/snowflake schemas).
  - ELT/ETL design patterns.
  - Data marts and optimization strategies.
- Experience with distributed data processing and large-scale datasets.
- Hands-on experience with Snowflake Cortex AI integration.
- Working knowledge of React.js or similar frameworks.
- Strong understanding of data governance, security, and compliance.
- Ability to:
  - Clearly explain and defend architectural decisions.
  - Design systems that perform reliably at scale.
  - Balance performance, cost, and maintainability.
Technical Depth
MUST HAVE:
Candidates should be able to clearly explain and apply the following in real-world scenarios:
- Snowflake Performance & Scaling:
  - Warehouse scaling modes (auto-scale, multi-cluster) and when to use them.
  - Clustering keys and performance trade-offs.
  - Cost vs. performance optimization strategies.
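For context, the scaling and clustering concepts above can be sketched in Snowflake SQL; this is an illustrative fragment only, and the warehouse and table names (`analytics_wh`, `sales_fact`, `event_date`) are hypothetical:

```sql
-- Hypothetical multi-cluster warehouse: auto-scales between 1 and 4 clusters
-- to absorb concurrency spikes, and auto-suspends after 5 idle minutes
-- so cost tracks actual usage.
CREATE WAREHOUSE IF NOT EXISTS analytics_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 4
  SCALING_POLICY = 'STANDARD'
  AUTO_SUSPEND = 300
  AUTO_RESUME = TRUE;

-- Clustering key on a large fact table: improves pruning for range scans
-- on event_date, at the cost of background re-clustering credits.
ALTER TABLE sales_fact CLUSTER BY (event_date);
```

The trade-off a candidate would be expected to articulate: a larger warehouse size speeds up a single heavy query, while additional clusters absorb concurrent queries; clustering keys pay off only when query filters align with the chosen columns.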
- Snowflake Storage & Optimization:
  - Micro-partitioning and its impact on pruning and query performance.
  - Practical optimization techniques for large datasets.
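As an illustration of the micro-partitioning point above: Snowflake keeps min/max metadata per micro-partition, so a selective filter on a well-clustered column scans only the partitions that can match. A minimal sketch (table and column names hypothetical):

```sql
-- A selective date filter lets Snowflake prune micro-partitions whose
-- min/max metadata for event_date falls outside the range.
SELECT order_id, amount
FROM sales_fact
WHERE event_date BETWEEN '2024-01-01' AND '2024-01-07';

-- SYSTEM$CLUSTERING_INFORMATION reports clustering depth and partition
-- overlap, a proxy for how effective pruning on event_date will be.
SELECT SYSTEM$CLUSTERING_INFORMATION('sales_fact', '(event_date)');
```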
- End-to-End Pipeline Design:
  - Designing a complete ELT pipeline using Snowflake.
  - Deciding where transformations should occur (Snowflake vs. external processing).
  - Ensuring scalability, maintainability, and performance across the pipeline.
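The ELT pattern above can be sketched as two steps: load raw files into a staging table, then transform inside Snowflake so compute scales with the warehouse. This is an illustrative skeleton only; the stage, schemas, and columns (`@raw_stage`, `raw.orders_stage`, `analytics.orders`) are hypothetical:

```sql
-- Extract/Load: bulk-load Parquet files from an external stage.
COPY INTO raw.orders_stage
  FROM @raw_stage/orders/
  FILE_FORMAT = (TYPE = 'PARQUET')
  MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;

-- Transform: incremental upsert into the analytics model via MERGE,
-- keeping the transformation inside Snowflake (the "T" in ELT).
MERGE INTO analytics.orders AS tgt
USING (
  SELECT order_id, customer_id, order_ts::DATE AS order_date, amount
  FROM raw.orders_stage
) AS src
ON tgt.order_id = src.order_id
WHEN MATCHED THEN UPDATE SET tgt.amount = src.amount
WHEN NOT MATCHED THEN INSERT (order_id, customer_id, order_date, amount)
  VALUES (src.order_id, src.customer_id, src.order_date, src.amount);
```

Pushing the transform into Snowflake avoids moving data out and back; external processing (e.g., PySpark) tends to be the better choice when transformations need custom libraries or non-SQL logic.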
Additional Information
This is a 100% onsite position in New York, NY.