Location: WFH, Toronto
Contract Duration: 4 months
Creating reusable enterprise data assets across business lines and projects. The primary goal is producing modelled, analytics-ready data that is prepared for consumption by the analytics team. The role is a Data Engineer with ETL architecture responsibilities, creating enterprise data assets.
Story Behind the Need
Project Summary: A high-impact project is being developed on our Kubernetes-based analytics platform. The project has a clearly defined mission, and we are searching for a candidate to design and develop the data pipelines and the complete end-to-end process. It’s an exciting opportunity to build a project that realizes the value of a high-performance environment, strong engineering, quality data, and customer analytics.
Candidate Value Proposition:
The organization is focused on building high-value solutions across the bank. The organization consists of accomplished professionals who blend expertise in computer science, data, machine learning, and banking to deliver innovative projects that drive the business. A key part of success for the group is having scalable, fast, and continuously evolving data processes, and this is the responsibility of the Data Engineering team. As an integral part of the team, the Data Engineer will be responsible for all technical aspects of the data pipelines that power advanced and descriptive analytics work.
• Work with analytics teams in Canadian Banking to understand their project requirements
• Work with engineering colleagues on solution design
• Build, test, and scale data pipelines
• Build automated data management into the project
Qualifications/Must Have skills:
1. At least 7 years of hands-on programming experience with tools such as Python, Scala, and/or Java
2. Comfortable working with databases and distributed data environments (PostgreSQL, DB2, HDFS/Hive, MinIO)
3. Familiarity with agile tooling to efficiently build as a team: git, Jira, Confluence, etc.
4. High aptitude for diving in and picking up new technologies as required
5. At least 5 years of hands-on experience building ETL pipelines
Nice to have:
1. Experience in finance is an asset
2. Experience with Scala is preferred
3. Experience with Spark is preferred
4. Experience with MinIO, Kubernetes, and Airflow is an asset
5. Familiarity with data management capabilities (data registration, data quality, etc.) and ability to learn advanced data management toolsets
Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent relevant experience