Location Address: WFH, Toronto
Contract Duration: 7 months + good chance of extension
Story Behind the Need
• Business group: The group ensures ongoing compliance and optimization of trade platforms on a global scale.
• Project: The group is working on a Trade Surveillance Project that will leverage the bank's Data Lake to create new data feeds for vendor applications using Spark.
Candidate Value Proposition
• The successful candidate will have the opportunity to use the latest technology to optimize trade platforms on a global scale. The candidate will also have the opportunity to join a flexible team that utilizes Agile best practices to ensure project deliverables continue to be met.
Typical Day in Role
• Analyze highly complex business requirements; generate technical specifications to design or redesign complex software components and applications
• Design, implement, automate, and maintain large-scale enterprise data ETL processes (a minimal illustrative sketch follows this list).
• Build high-performance algorithms, prototypes, predictive models, and proofs of concept.
• Leverage industry best practices to design, test, implement and support a solution
• Ensure quality, security, and compliance requirements are met for the supported area
• Be flexible and thrive in an evolving environment
• Adapt to change quickly and adjust work accordingly in a positive manner
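As a rough illustration only (not part of the formal role description), the ETL responsibility above might resemble a minimal PySpark job that reads raw trade records from the Data Lake and publishes a curated daily feed for a vendor application; all paths, column names, and the view name below are hypothetical.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Start a Spark session for the feed-generation job (app name is illustrative).
spark = SparkSession.builder.appName("trade-surveillance-feed").getOrCreate()

# Read raw trade records from the data lake (path is a placeholder).
trades = spark.read.parquet("/datalake/raw/trades")

# Use Spark SQL to filter and reshape the previous day's trades into a feed.
trades.createOrReplaceTempView("trades")
feed = spark.sql("""
    SELECT trade_id, trader_id, instrument, quantity, price, trade_ts
    FROM trades
    WHERE trade_ts >= date_sub(current_date(), 1)
""")

# Write the curated feed back to the lake, partitioned by trade date,
# where a downstream vendor application or search-index loader can pick it up.
(feed.withColumn("trade_date", F.to_date("trade_ts"))
     .write.mode("overwrite")
     .partitionBy("trade_date")
     .parquet("/datalake/feeds/trade_surveillance"))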
Candidate Requirements/Must Have Skills:
1) 2+ years' experience with Spark and Spark SQL
2) 2-3 years' experience with Elasticsearch
3) 2+ years using Agile methodology throughout the SDLC
4) Recent lead experience (minimum 2 years) leading teams of upwards of 6 people
5) Strong communication skills required to clearly articulate technical requirements to sprint teams in virtual settings
Nice-to-Have Skills:
– Apache NiFi experience is a plus
– Python programming is a plus
– Capital Markets experience in Regulatory or Derivatives technology is a plus
Degrees or certifications:
• Bachelor's degree in a technical field such as computer science, computer engineering, or a related field is required