Remote, but you must be based in the following location
CAPCO POLAND
*We are looking for Poland-based candidates.
Joining Capco means joining an organisation that is committed to an inclusive working environment where you’re encouraged to #BeYourselfAtWork. We celebrate individuality and recognise that diversity and inclusion, in all forms, are critical to success. It’s important to us that we recruit and develop as diverse a range of talent as we can, and we believe that everyone brings something different to the table – so we’d love to know what makes you different. Such differences may mean we need to make changes to our process to allow you the best possible platform to succeed, and we are happy to cater to any reasonable adjustments you may require. You will find the section to let us know of these at the bottom of your application form, or you can mention it directly to your recruiter at any stage and they will be happy to help.
Capco Poland is a global technology and management consultancy specialising in driving digital transformation across the financial services industry. We are passionate about helping our clients succeed in an ever-changing industry.
We are also experts focused on development, automation, innovation, and long-term projects in financial services. At Capco, you can code, write, create, and work at your maximum capabilities without getting dull, tired, or foggy.
We're seeking a skilled Senior Big Data Engineer to join our team. The ideal candidate will be responsible for designing, implementing, and maintaining scalable data pipelines and solutions for on-prem, migration, and cloud projects involving large-scale data processing and analytics.
BIG DATA ENGINEER @ CAPCO - WHAT TO EXPECT
You will work on engaging projects with some of the largest banks in the world, projects that will transform the financial services industry.
You’ll be part of a digital engineering team that develops new financial and data solutions and enhances existing ones, with the opportunity to work on exciting greenfield projects as well as established Tier 1 bank applications used by millions of users.
You’ll be involved in digital and data transformation processes through a continuous delivery model.
You will work on automating and optimising data engineering processes, developing robust and fault-tolerant data solutions, and enhancing security standards across both cloud and on-premises deployments.
You’ll be able to work across different data, cloud and messaging technology stacks.
You’ll have the opportunity to learn and work with specialised data and cloud technologies to widen your skill set.
THINGS YOU WILL DO
Work alongside clients to interpret requirements and define industry-leading solutions
Design and develop robust, well-tested data pipelines
Demonstrate best practices in engineering and SDLC, and help clients adhere to them
Build event-driven, loosely coupled distributed applications
Develop both on-premises and cloud-based solutions
Build and improve strong relationships with peers, senior stakeholders, and clients
Lead and mentor a team of junior and mid-level engineers
Contribute to security designs, drawing on advanced knowledge of key security technologies, e.g. TLS, OAuth, and encryption
Support internal Capco capabilities by sharing insight, experience and credentials
TECH STACK: Python, OOP, Spark, SQL, Hadoop
Nice to have: GCP, Pub/Sub, BigQuery, Kafka, Juniper, Apache NiFi, Hive, Impala, Cloudera, CI/CD
SKILLS & EXPERIENCES YOU NEED TO GET THE JOB DONE
Strong cloud experience with GCP
Hands-on experience with Python; Scala and Java are also nice to have but not required
Experience with most or all of the following data and cloud technologies: Hadoop, Hive, Spark, Flume, PySpark, Dataproc, Cloudera, Airflow, Oozie, S3, Terraform, etc.
Hands-on experience with schema design for structured and semi-structured data
Experience using messaging technologies – Kafka, Kafka Connect, Spark Streaming, Amazon Kinesis
Strong experience in SQL
Good understanding of the differences and tradeoffs between SQL and NoSQL, and between ETL and ELT
Understanding of containerisation (Docker, Kubernetes) and orchestration techniques
Experience with data lake formation and data warehousing principles and technologies – BigQuery, Redshift, Snowflake
Experience using version control tools such as Git
Familiarity with good development practices and optimisation techniques
Experience designing, building, and maintaining CI/CD pipelines with Jenkins or CircleCI
Enthusiasm and the ability to pick up new technologies as needed to solve problems
WHY JOIN CAPCO?
Employment contract and/or Business to Business - whichever you prefer
Possibility to work remotely
Speaking English on a daily basis, mainly with international stakeholders and peers
Multiple employee benefits packages (MyBenefit Cafeteria, private medical care, life insurance)
Access to a platform with 3,000+ business courses (Udemy)
Access to required IT equipment
Paid Referral Program
Participation in charity events e.g. Szlachetna Paczka
Ongoing learning opportunities to help you acquire new skills or deepen existing expertise
Being part of the core squad focused on the growth of the Polish business unit
A flat, non-hierarchical structure that will enable you to work with senior partners and directly with clients
A work culture focused on innovation and creating lasting value for our clients and employees
ONLINE RECRUITMENT PROCESS STEPS
Screening call with the Recruiter
Technical interview: first stage
Client Interview
Feedback/Offer
We have been informed of several recruitment scams targeting the public. We strongly advise you to verify identities before engaging in recruitment-related communication. All official Capco communication will be conducted via a Capco recruiter.