Data Engineer

🏢 Lakshya Software Technologies Pvt. Ltd.  •  📍 India

Job Description

Snowflake DBT Engineer
Total experience: 7+ years
Location: Pune / Mumbai / Chennai / Bangalore

Mandatory Skills:
- Snowflake: 6 years
- DBT: 6 years
- Python: 3 years
- Airflow: 2 years
- Azure cloud experience is good to have

Requirements:
- 5+ years of production experience with Snowflake, Databricks, DBT, and Python on Azure/GCP/AWS, including hands-on development of moderate to complex ETL/ELT data pipelines
- Strong in Apache Spark and Delta Lake; strong SQL and Python skills; experience with data engineering and ETL pipelines
- Advanced SQL in Snowflake (CTEs, views, UDFs, materialized views) and DBT
- Partitioning, clustering, and performance tuning in Snowflake
- Streams, Tasks, and Snowpipe for real-time and incremental data pipelines
- Cost and query optimization (e.g., using the result cache, pruning, and compute credit controls)
- DBT: data modeling, SQL mastery, Jinja templating, workflow optimization and configuration, data loading and cleaning
- 3+ years of hands-on Python experience
- 3+ years of hands-on experience developing DAGs with Cloud Composer (Airflow)
- 3+ years of hands-on experience with Databricks, including the ability to resolve complex SQL query performance issues
- 4+ years of ETL development experience in Python; experience parallelizing pipelines is a plus
- Demonstrated ability to troubleshoot complex query, pipeline, and data quality issues

Responsibilities:
- Develop, test, and deploy robust ETL pipelines on Snowflake on MS Azure/AWS/GCP using (but not limited to) Snowflake, dbt, Databricks, Cloud Composer, and Cloud Run to ingest structured and unstructured data from on-prem SQL Server instances into Snowflake models, databases, and schemas
- Implement data validation checks, error handling, and logging to ensure pipeline reliability
- Automate deployment workflows using CI/CD pipelines (GitHub Actions, etc.) and infrastructure-as-code (IaC) tools
- Monitor pipelines via Cloud Logging and Cloud Monitoring, implementing alerting for data latency or quality issues
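The "data validation checks, error handling, and logging" responsibility can be sketched in plain Python. This is a minimal illustration only; the field names (`id`, `amount`) and the specific checks are hypothetical assumptions, not taken from the posting:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def validate_rows(rows):
    """Split an ingested batch into valid and rejected rows.

    Rows failing a check are logged and routed to a reject list
    instead of aborting the whole load. The checks below (required
    'id', non-negative 'amount') are illustrative assumptions.
    """
    valid, rejected = [], []
    for row in rows:
        errors = []
        if not row.get("id"):
            errors.append("missing id")
        if row.get("amount", 0) < 0:
            errors.append("negative amount")
        if errors:
            log.warning("rejected row %r: %s", row, "; ".join(errors))
            rejected.append((row, errors))
        else:
            valid.append(row)
    log.info("batch validated: %d ok, %d rejected", len(valid), len(rejected))
    return valid, rejected

batch = [
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": 5.0},   # fails the required-id check
    {"id": 3, "amount": -2.0},     # fails the non-negative check
]
ok, bad = validate_rows(batch)
```

In a production pipeline this kind of check would typically run as a step in the orchestrator (e.g., an Airflow task), with the reject counts feeding the alerting described above.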