Job Title: Intermediate Data Engineer
Contract Length: Until May 30th (possible extension or conversion to FTE based on funding and performance)
Location: WFH (Toronto, ON)
The Intermediate Data Engineer operates within the Data Engineering & Delivery team. This member of the team will help create reusable data assets, pipelines, and services for broad enterprise use while working on several ongoing data projects across the Bank.
We are looking for someone who will work with other data engineers, DevOps engineers, and other development/engineering teams across the Bank to develop technical solutions for enterprise data pipelines and services.
Candidate Value Proposition:
The successful candidate will gain exposure to various departments across the Bank as well as to emerging technologies such as Kubernetes and Spark. The contract may also be extended or converted to FTE.
Typical Day in Role:
• Design and implement data pipelines, services and components to enable enterprise-wide use of data
• Automate and refactor data pipeline and service code
• Work closely with data engineers and DevOps engineers to build well-managed, reusable data assets that drive real business outcomes
• Participate in planning and retrospective sessions, attend stand-ups, etc.
Must Have Skills/Requirements:
1) Experience with software engineering best practices such as code reviews, testing frameworks, maintainability and readability – 5+ years of hands-on experience
2) Experience building data / big data software – 2+ years of hands-on experience
3) Experience working with big data technologies – Spark or Talend (both preferred) – 2+ years of hands-on experience
4) Experience with object-oriented programming languages – Scala or Java (Scala preferred) – 5+ years of hands-on experience
5) Experience working with relational databases (e.g. MySQL, PostgreSQL, Oracle) – 5+ years of hands-on experience
6) Understanding of CI/CD tooling (e.g. Jenkins, Git, Bitbucket) – 3+ years of hands-on experience
Nice to Have Skills:
– Experience with ETL tools (e.g. Talend)
– Previous experience working in a financial institution
– Knowledge and understanding of container and micro-services technologies (e.g. Docker, Kubernetes)
– Understanding of data management disciplines such as data quality, data profiling, etc.
– Experience with Confluence or Jira
– Certifications related to Hadoop, Scala or Spark
– A GitHub profile or online blog they can share
Soft Skills:
• Strong communication skills, both written and spoken
• Team player, self-starter
• Attention to detail, high standards for quality
• Ability to write and maintain clear documentation (Confluence)
Education:
– Degree in Computer Science or a related field is preferred