Senior Data Engineer

Job Description:

Design, develop, and maintain data pipelines and ETL processes using Snowflake, AWS services, Python, and DBT.

Collaborate with data scientists and analysts to understand data requirements and implement solutions.

Optimize data workflows for performance, scalability, and reliability.

Troubleshoot and resolve data-related issues in a timely manner.

Stay updated on the latest technologies and best practices in data engineering.

Must Have Skills:

  • Snowflake, Snowpark: The candidate should have a deep understanding of the Snowflake data warehousing platform and be proficient in using Snowpark for data processing and analytics.
  • AWS Services (Airflow): The candidate should have hands-on experience with AWS services, particularly Apache Airflow for orchestrating complex data workflows and pipelines.
  • AWS Services (Lambda): Proficiency in AWS Lambda for serverless computing and event-driven architectures is essential for this role.
  • AWS Services (Glue): The candidate should be well-versed in AWS Glue for ETL (Extract, Transform, Load) processes and data integration.
  • Python: Strong programming skills in Python are required for developing data pipelines, data transformations, and automation tasks.
  • DBT: Experience with DBT (data build tool) for modeling data and creating data transformation pipelines is a plus.

Job Category: Day 1 Onsite Hybrid
Job Type: 12 Months Contract
Job Location: Fort Mill SC
