Company

RiskSpan

riskspan.com
Location

Remote, but you must be in the following location

  • 🇮🇳 India

Data Engineer with Strong Snowflake, SQL, and Python Experience

Description

About RiskSpan

RiskSpan is a leading source of analytics, modeling, data and risk management solutions for the Consumer and Institutional Finance industries. We solve business problems for clients such as banks, mortgage-backed and asset-backed securities issuers, equity and fixed-income portfolio managers, servicers, and regulators that require our expertise in the market risk, credit risk, operational risk, and information technology domains.

This position is based in India, working remotely for a US client. You must live in India to be considered for this position.

We are seeking a highly skilled and motivated Data Engineer to join our team. As a Data Engineer, you will play a crucial role in designing, developing, and maintaining our data infrastructure to enable efficient data processing, analysis, and reporting. Your expertise in Snowflake, SQL, and Python will be essential in building and optimizing data pipelines to ensure the availability, scalability, and reliability of our data ecosystem.

Responsibilities:

  • Collaborate with cross-functional teams to understand data requirements and translate them into effective data solutions.

  • Design and implement robust and scalable data pipelines using Snowflake, SQL, and Python to extract, transform, and load (ETL) data from various sources.

  • Optimize data workflows for performance, reliability, and maintainability.

  • Develop and maintain data models, ensuring data accuracy, consistency, and integrity.

  • Work with large datasets, both structured and unstructured, to derive meaningful insights.

  • Identify and address data quality issues through data profiling, validation, and cleansing techniques.

  • Monitor and troubleshoot data pipelines to ensure data availability and accuracy.

  • Continuously improve data processes and automation to streamline data operations.

  • Collaborate with data scientists and analysts to support their data needs and facilitate data-driven decision-making.
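To illustrate the data profiling, validation, and cleansing work described above, here is a minimal, self-contained Python sketch. The schema (`loan_id`, `upb`, `note_rate`) and the validation rules are hypothetical examples, not an actual RiskSpan data model; in practice such checks would run inside a Snowflake-backed pipeline.

```python
# Minimal data-validation sketch with a hypothetical loan-record schema:
# loan_id (non-empty string), upb (non-negative unpaid principal balance),
# note_rate (percentage in a plausible range). Records that fail any rule
# are routed to a reject list with the reasons attached for review.

def validate_loans(records):
    clean, rejected = [], []
    for rec in records:
        errors = []
        if not rec.get("loan_id"):
            errors.append("missing loan_id")
        upb = rec.get("upb")
        if not isinstance(upb, (int, float)) or upb < 0:
            errors.append("invalid unpaid principal balance")
        rate = rec.get("note_rate")
        if not isinstance(rate, (int, float)) or not 0 <= rate <= 25:
            errors.append("note_rate out of range")
        if errors:
            rejected.append({**rec, "errors": errors})
        else:
            clean.append(rec)
    return clean, rejected

loans = [
    {"loan_id": "A1", "upb": 250000.0, "note_rate": 6.5},
    {"loan_id": "", "upb": 180000.0, "note_rate": 5.0},   # missing id
    {"loan_id": "A3", "upb": -10.0, "note_rate": 4.25},   # negative balance
]
clean, rejected = validate_loans(loans)
```

In a production pipeline, the rejected records would typically land in a quarantine table for remediation rather than being silently dropped.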

Qualifications:

  • Bachelor's degree in Computer Science, Information Technology, or a related field (Master's preferred).

  • Strong domain knowledge of mortgage data is a must.

  • Strong experience with Snowflake for data warehousing and analytics.

  • Proficiency in SQL for querying and manipulating data.

  • Extensive programming experience in Python for building data pipelines and automating data processes.

  • Solid understanding of data modeling concepts and database design principles.

  • Familiarity with data integration, transformation, and ETL techniques.

  • Experience with version control systems (e.g., Git) and CI/CD pipelines is a plus.

  • Excellent problem-solving skills and ability to work in a dynamic, collaborative environment.

  • Strong communication skills to collaborate effectively with technical and non-technical stakeholders.

It would be helpful, though not mandatory, for the candidate to have basic domain experience in order to understand the terminology used and to come up with a testing plan specific to the assets we are dealing with.