Location Address: WFH, Toronto
Contract Duration: 12/06/2021 to 10/31/2022
Candidate Value Proposition:
– The successful candidate will have the opportunity to work with Minio and Trino, and the group also has plans to move to Google Cloud Platform.
Typical Day in Role:
• Building end-to-end data pipeline solutions using technologies such as Python, Hadoop, and Spark.
o Identifying and resolving problems when the current data pipeline fails
o Continuously improving a new product
o Building new code from an older project
• Communicating with data scientists, strategists, the product team, etc., and identifying any data gaps that exist.
• Will need to be flexible to assist on other projects where their expertise is required
• Will need to help data scientists develop, validate, and operationalize the models they build
Candidate Requirements/Must Have Skills:
• 4+ years’ experience with Python, Spark, Hadoop, and shell scripting; strong SQL knowledge
• Foundational knowledge of MongoDB and Sqoop
• Proficiency with the Tidal scheduler
• Knowledge of Minio, Trino, Airflow, and Kubernetes
• Banking industry experience
• Strong verbal and written communication skills
Degrees or certifications:
• Bachelor's degree in a technical field such as computer science, computer engineering, or a related discipline required.