Sr. Data Engineer (AWS, SQL, and Snowflake)

Job Description:

Duties and Responsibilities:

Maintain and extend our data warehouse and analytics environment using Python, Snowflake, and AWS.

Collaborate with the data services team and data architect to develop a strategy for long-term data platform architecture.

Assist application teams with the collection of transactional and master data from source systems.

Design and implement data movement and transformation pipelines (e.g., AWS Glue, Apache Airflow, dbt, Snowflake).

Design and develop systems to maintain the business’s data warehouse, ETL processes, and business intelligence.

Manage and mentor other engineers (full-time, contractor, or third-party resources) while remaining hands-on.

Design processes and algorithms to enhance data quality and reliability.

Make analytical data available and accessible through a wide range of business intelligence and reporting toolsets (e.g., Microsoft Power BI, Google Looker).

Implement proactive monitoring and alerting to ensure operational stability and supportability.

Collaborate with data analysts, data scientists, security engineers, and architects to achieve the best possible technical solutions that address our business needs.

Perform the analysis and critical thinking required to troubleshoot data-related issues and assist in their resolution.

Assist with production issues in the data warehouse, such as reloading data and rerunning transformations.

Qualifications:

11+ years in data warehousing, big data, or ETL development (preferably using AWS S3, Glue, and Apache Airflow).

Minimum of 1 year of experience implementing a full-scale data warehouse solution on Snowflake.

Expert proficiency with Snowflake internals and with integrating Snowflake with other technologies for data processing and reporting.

Advanced SQL knowledge and experience with relational databases and query authoring, as well as working familiarity with a variety of database systems.

5+ years of experience with a programming language (preferably Python).

Experience designing, building, and maintaining data processing and transformation pipelines (AWS Glue, Apache Airflow, dbt, Snowflake).

Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.

Experience building processes that support data transformation, data structures, metadata, dependency, and workload management.

Technical expertise in data models, data mining, and data segmentation techniques.

Knowledge of CI/CD deployment practices.

Strong Python and SQL skills, with the ability to write efficient queries.

Desired Skills:

Strong experience with AWS data services (e.g., S3, DynamoDB, Glue, Athena, Redshift, Lambda, SQS, SNS, and API Gateway). Experience with AWS DMS is a plus.

AWS Solutions Architect Associate or AWS Developer Certification (desired).

Strong experience with the Snowflake cloud data platform.

SnowPro certification (preferred).

Experience working in an Agile/Scrum environment.

Experience with the dbt data transformation framework (preferred).

Experience using Apache Airflow to manage data pipelines and workflows.

Experience with Microsoft Azure DevOps for work item management and CI/CD deployment pipelines.

Thanks,

Team – TECHLOGIC SOLUTIONS LLC

www.techlogicusa.com

“LET’S WORK TOGETHER”

Job Category: Day 1 OnSite
Job Type: Long Term Contract
Job Location: Charlotte, NC
