We are seeking a highly skilled and motivated Data Engineer to join our team. As a Data Engineer, you will play a crucial role in designing, developing, and maintaining our data infrastructure to enable efficient data processing, analysis, and reporting. Expertise in DBT, Snowflake, SQL, and Python is essential for building and optimizing the data pipelines that keep our data ecosystem available, scalable, and reliable. Experience with Matillion (or any other data ingestion tool) and with Control-M is a plus. Team leadership experience is mandatory.
Responsibilities:
- Collaborate with cross-functional teams to understand data requirements and translate them into effective data solutions.
- Design and implement DBT Python models.
- Design and implement DBT test cases as part of the validation framework.
- Create DBT singular and generic test cases.
- Experience working with Matillion tools is a plus.
- Convert existing stored procedures into DBT models.
- Design and implement robust and scalable data pipelines using Snowflake, SQL, and Python to extract, transform, and load (ETL) data from various sources.
- Optimize data workflows for performance, reliability, and maintainability.
- Develop and maintain data models, ensuring data accuracy, consistency, and integrity.
- Work with large datasets, both structured and unstructured, to derive meaningful insights.
- Identify and address data quality issues through data profiling, validation, and cleansing techniques.
- Monitor and troubleshoot data pipelines to ensure data availability and accuracy.
- Continuously improve data processes and automation to streamline data operations.
- Collaborate with data scientists and analysts to support their data needs and facilitate data-driven decision-making.
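For candidates less familiar with the DBT Python models mentioned above, the sketch below shows the general shape of one: dbt injects a `dbt` object and a Snowpark `session`, the function references an upstream model, and it returns a DataFrame that dbt materializes. The model name `stg_orders` and the `status` column are illustrative assumptions, not part of this role's actual codebase.

```python
import pandas as pd

def model(dbt, session):
    """Minimal sketch of a DBT Python model (names are illustrative)."""
    dbt.config(materialized="table")
    # dbt.ref() returns the upstream relation; on Snowflake this is a
    # Snowpark DataFrame, which we convert to pandas for the transform.
    orders = dbt.ref("stg_orders").to_pandas()
    # Simple transformation: keep completed orders only.
    return orders[orders["status"] == "completed"]
```

In a real project this file would live under `models/` and dbt would supply `dbt` and `session` at run time; the equivalent validation logic would be covered by the singular and generic test cases listed above.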
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field (Master's preferred).
- Strong experience with DBT and Snowflake for data warehousing and analytics.
- Proficiency in SQL for querying and manipulating data.
- Extensive programming experience in Python for building data pipelines and automating data processes.
- Solid understanding of data modeling concepts and database design principles.
- Familiarity with data integration, transformation, and ETL techniques.
- Experience with version control systems (e.g., Git, Azure Repos) and CI/CD pipelines is a plus.
- Excellent problem-solving skills and ability to work in a dynamic, collaborative environment.
- Strong communication skills to collaborate effectively with technical and non-technical stakeholders.