Contract Duration: 5 months
Location Address: WFH, Toronto
• Business group: Data Provisioning team for Risk Management
Candidate Value Proposition:
1. Advanced technologies to learn: Big Data, Hadoop, DataStage, and a variety of other technical tools
2. Many challenging projects and change requests (CRs) in the backlog; opportunity to add value
3. Intense, busy team – a lot to do and learn
Typical Day in the role:
– Expectation for this role is a 60/30/10 split:
– 60% development and responding to the CR backlog
– 30% production support – may require 2-3 support shifts per month
– 10% administrative duties and documentation
Project Summary: The main function of the Data Engineer is to develop, evaluate, test, and maintain architectures and data solutions within our organization.
Seeking an experienced Data Engineer who specializes in several stages of the SDLC. Reporting to the Senior Manager, Development, the incumbent acts as the liaison between the Development and Operations teams in the Risk Management department.
The successful candidate would be responsible for several phases of the Systems Development Life Cycle (SDLC), including feasibility studies, research, analysis, development, and testing for new systems or enhancements to existing medium to large initiatives.
The incumbent must work under tight deadlines, conflicting priorities, and changing project requirements. The incumbent must be able to react quickly to resolve problems and, on occasion, work overtime to meet deadlines and implement systems with minimal impact to end users.
To join the team, the individual must be proactive and dynamic, demonstrate initiative, be eager to learn, adapt to a fast-paced environment, and thrive on challenge.
Qualifications/ Must have:
1. Must have 5-7 years of hands-on technical experience programming with ETL tools – DataStage, Talend/Python
2. Experience developing in both Agile and Waterfall environments
3. At least 2 years of experience with continuous integration tools such as Jenkins, GitHub, Puppet, Chef, etc.
4. 2+ years of experience with big data and data warehousing concepts/principles, including Hadoop technologies such as Hive and Spark
5. 2+ years of SAS or the R language
6. 2+ years of experience with, and working knowledge of, the most common software design patterns
7. 5+ years of experience working with relational databases (Sybase, SQL Server, Oracle)
8. 5-7 years of experience writing and maintaining related documentation
Soft Skills: Strong to excellent communication skills are essential
1. Attention to detail and high standards for quality
2. Desire to learn and to grow yourself and your team
3. Passion for driving teams toward high performance, and deep pride in quality craftsmanship that delights users
Nice to Have skills:
1. Experience designing and implementing cloud-based applications; awareness of the main public cloud offerings (Microsoft Azure, AWS, GCE)
2. Experience working with scripting languages (Bash, PowerShell)
3. Financial and credit risk / loss experience
4. Prior FI or banking experience
5. IFRS 9 exposure is desired
Bachelor's degree (or equivalent or higher) in Computer Science.