Contract Length: 4 months (potential for extension)
Location: WFH, Toronto
Story Behind the Need:
Business Group: The team delivers data integration solutions for a variety of business lines and owns the CBT data aggregation/ETL and data delivery applications. It supports and/or interfaces with a variety of platforms and technologies: Talend, Spark, Information Builders (Data Migrator and Data Quality Centre), Python, Java, Scala, DB2, COBOL, Hadoop, Hive, etc.
Client is looking for 3 energetic, results-oriented Data Engineers. Reporting to the Senior Manager, the Data Engineer is responsible for ensuring alignment with the Bank's Technology Blueprint and for working with various cross-functional teams during the design phase of large, complex data integration projects. The role calls for a strong technical specialist who is self-motivated and able to contribute to the Bank's overall technology goals with minimal supervision.
The 3 Data Engineers will focus on the TSYS project, developing ETL jobs and providing input on the architecture being built to process new statements using Talend. The work will also involve handling complex data formats. The project is currently in the initial stage of reviewing requirements and assessing solutions.
Typical Day in the Role:
• Design and develop ETL jobs using Talend.
• Provide technical guidance to diverse projects that differ in size and scope, from inception to delivery, overseeing the technical solution and identifying, reducing, and mitigating risks.
• Guide and assist in the definition of non-functional requirements and the delivery of a highly scalable, secure, and flexible solution.
• Collaborate with multiple technical teams to understand the interdependencies, commonality, and variability of the solutions under development, then help the teams deliver a platform built from reusable, configurable capabilities.
• Provide input to the continuous improvement of processes and to the adoption of the latest technologies and methodologies.
• Ensure Agile and DevOps practices are applied to software development and architecture design.
Must Have Skills/Requirements:
1) 4+ years of IT development experience
2) 3+ years: Strong knowledge of and hands-on experience running Talend Data Fabric.
3) 3+ years: Strong knowledge and experience with Java and Python, SQL/HQL, Linux OS Scripting/Commands and REST API.
4) 4+ years: Working knowledge of and experience with Spark and Hadoop (Hive).
5) 4+ years: Knowledge of and experience with IDEs such as Eclipse; CI/CD tools, including code repository, version control, and code promotion tools such as Bitbucket and Jenkins; Confluence; and defect-tracking tools such as Jira.
Nice to Haves:
• Apache open-source projects such as Sqoop or Zookeeper
• Hands-on experience on scaled Agile methodologies
• Talend certification
• Passionate about technology; maintains superior knowledge of emerging industry trends across multiple disciplines.
• Thinks outside the box and is able to lead teams to innovate.
• Possesses strong problem-solving skills to rapidly assess problem situations and recommend alternate solutions to effectively resolve high-level, complex problems.
• Strong verbal and written communication skills (strong presentation/demo skills are a plus).
• Bachelor's degree in Computer Science or a related field.