Nexus Systems Group


Data Engineer

December 9, 2024

Data Engineer – Senior
Ministry – Government Client
Toronto – 3 days onsite
CRJMC
14-month contract

Must Have Skills
·        7+ years using ETL tools such as Microsoft SSIS, stored procedures, T-SQL 
·        2+ years Delta Lake, Databricks and Azure Databricks pipelines
o   Strong knowledge of Delta Lake for data management and optimization.
o   Familiarity with Databricks Workflows for scheduling and orchestrating tasks.
·        2+ years Python and PySpark 
·        Solid understanding of the Medallion Architecture (Bronze, Silver, Gold) and experience implementing it in production environments. 
·        Hands-on experience with CDC tools (e.g., GoldenGate) for managing real-time data.
·        SQL Server, Oracle
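The Medallion Architecture (Bronze, Silver, Gold) called for above is normally built with Delta Lake tables and PySpark on Databricks; the layering idea itself can be sketched in plain Python. All names and data below are illustrative, not from the posting:

```python
# Framework-free sketch of Medallion layering: Bronze (raw) -> Silver (clean) -> Gold (aggregated).
# In Databricks these would be Delta tables written by PySpark jobs; plain dicts stand in here.

bronze = [  # raw, append-only ingests; may contain duplicates and bad rows
    {"id": 1, "amount": "100", "ts": "2024-01-01"},
    {"id": 1, "amount": "100", "ts": "2024-01-01"},   # duplicate
    {"id": 2, "amount": None,  "ts": "2024-01-02"},   # bad row (missing amount)
    {"id": 3, "amount": "50",  "ts": "2024-01-02"},
]

def to_silver(rows):
    """Clean and deduplicate: drop rows with missing amounts, cast types, keep one row per id."""
    seen, out = set(), []
    for r in rows:
        if r["amount"] is None or r["id"] in seen:
            continue
        seen.add(r["id"])
        out.append({"id": r["id"], "amount": int(r["amount"]), "ts": r["ts"]})
    return out

def to_gold(rows):
    """Aggregate for consumption: total amount per day."""
    totals = {}
    for r in rows:
        totals[r["ts"]] = totals.get(r["ts"], 0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'2024-01-01': 100, '2024-01-02': 50}
```

Each layer only ever reads from the one below it, which is what makes the pattern easy to reprocess and audit in production.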
Experience:
·        7+ years of experience working with SQL Server, T-SQL, Oracle, PL/SQL development or similar relational databases
·        2+ years of experience working with Azure Data Factory, Databricks and Python development
·        Experience building data ingestion and change data capture using Oracle GoldenGate
·        Experience in designing, developing, and implementing ETL pipelines using Databricks and related tools to ingest, transform, and store large-scale datasets
·        Experience in leveraging Databricks, Delta Lake, Delta Live Tables, and Spark to process structured and unstructured data.
·        Experience working with building databases, data warehouses and working with delta and full loads
·        Experience with data modeling and tools, e.g. SAP PowerDesigner, Visio, or similar
·        Experience working with SQL Server SSIS or other ETL tools, solid knowledge and experience with SQL scripting
·        Experience developing in an Agile environment
·        Understanding of data warehouse architecture with a delta lake
·        Ability to analyze, design, develop, test and document ETL pipelines from detailed and high-level specifications, and assist in troubleshooting.
·        Ability to utilize SQL to perform DDL tasks and complex queries
·        Good knowledge of database performance optimization techniques
·        Ability to assist in the requirements analysis and subsequent developments
·        Ability to conduct unit testing and assist in test preparations to ensure data integrity
·        Work closely with Designers, Business Analysts and other Developers
·        Liaise with Project Managers, Quality Assurance Analysts and Business Intelligence Consultants
·        Design and implement technical enhancements of Data Warehouse as required.
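The change data capture work described above reduces to applying an ordered stream of insert/update/delete events to a keyed target table. GoldenGate emits trail files and Delta Lake applies them with MERGE INTO; this tool-agnostic sketch (event shape is invented for illustration) shows only the core merge semantics:

```python
# Illustrative CDC apply loop: merge ordered change events into a target keyed by primary key.
# Real pipelines would consume GoldenGate/Debezium output and use Delta Lake MERGE INTO.

def apply_cdc(target: dict, events: list) -> dict:
    """Apply insert/update/delete events, in order, to a key -> row mapping."""
    for ev in events:
        key, op = ev["key"], ev["op"]
        if op in ("insert", "update"):
            target[key] = ev["row"]      # upsert semantics
        elif op == "delete":
            target.pop(key, None)        # tolerate deletes of already-missing keys
        else:
            raise ValueError(f"unknown op: {op}")
    return target

table = {1: {"name": "alice"}}
changes = [
    {"op": "insert", "key": 2, "row": {"name": "bob"}},
    {"op": "update", "key": 1, "row": {"name": "alicia"}},
    {"op": "delete", "key": 2},
]
print(apply_cdc(table, changes))  # {1: {'name': 'alicia'}}
```

Ordering matters: replaying the same events out of order would leave the target in the wrong state, which is why CDC tools preserve commit order.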
 
 
Technical Skills (70 points)
 
·        Experience in developing and managing ETL pipelines, jobs, and workflows in Databricks.
·        Deep understanding of Delta Lake for building data lakes and managing ACID transactions, schema evolution, and data versioning.
·        Experience automating ETL pipelines using Delta Live Tables, including handling Change Data Capture (CDC) for incremental data loads.
·        Proficient in structuring data pipelines with the Medallion Architecture to scale data pipelines and ensure data quality.
·        Hands-on experience developing streaming tables in Databricks using Structured Streaming and readStream to handle real-time data.
·        Expertise in integrating CDC tools like GoldenGate or Debezium for processing incremental updates and managing real-time data ingestion.
·        Experience using Unity Catalog to manage data governance, access control, and ensure compliance.
·        Skilled in managing clusters, jobs, autoscaling, monitoring, and performance optimization in Databricks environments.
·        Knowledge of using Databricks Autoloader for efficient batch and real-time data ingestion.
·        Experience with data governance best practices, including implementing security policies, access control, and auditing with Unity Catalog.
·        Proficient in creating and managing Databricks Workflows to orchestrate job dependencies and schedule tasks.
·        Strong knowledge of Python, PySpark, and SQL for data manipulation and transformation.
·        Experience integrating Databricks with cloud storage solutions such as Azure Blob Storage, AWS S3, or Google Cloud Storage.
·        Familiarity with external orchestration tools like Azure Data Factory
·        Implementing logical and physical data models
·        Knowledge of FHIR is an asset
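Databricks Autoloader and Structured Streaming's readStream, both cited above, share one underlying idea: checkpointed state records what has already been ingested, so each run picks up only new arrivals. Stripped of Spark (the real API is spark.readStream with a checkpoint location), the bookkeeping looks roughly like this toy version:

```python
# Toy Autoloader-style incremental ingestion: a checkpoint set records which inputs
# were already processed, so repeated runs ingest only newly arrived files.

def incremental_ingest(available_files, checkpoint: set):
    """Return files not yet processed, and record them in the checkpoint."""
    new_files = [f for f in available_files if f not in checkpoint]
    checkpoint.update(new_files)
    return new_files

checkpoint = set()
print(incremental_ingest(["a.json", "b.json"], checkpoint))            # ['a.json', 'b.json']
print(incremental_ingest(["a.json", "b.json", "c.json"], checkpoint))  # ['c.json']
```

In Databricks the checkpoint lives in durable storage, which is what makes the stream restartable with exactly-once file discovery.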
Design Documentation and Analysis Skills (20 points)
·        Demonstrated experience in creating design documentation such as:
o   Schema definitions
o   Error handling and logging
o   ETL Process Documentation
o   Job Scheduling and Dependency Management
o   Data Quality and Validation Checks
o   Performance Optimization and Scalability Plans
o   Troubleshooting Guides
o   Data Lineage
o   Security and Access Control Policies applied within ETL
·        Experience in Fit-Gap analysis, system use case reviews, requirements reviews, coding exercises and reviews.
·        Participate in defect fixing, testing support and development activities for ETL
·        Analyze and document solution complexity and interdependencies including providing support for data validation.
·        Strong analytical skills for troubleshooting, problem-solving, and ensuring data quality.
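The data quality and validation checks listed above are commonly documented and implemented as named rule predicates run against each batch, with failures logged for the troubleshooting guides. A minimal illustration (rule names and row shape are made up):

```python
# Minimal data-quality gate: run named rule predicates over a batch and collect failures.

RULES = {
    "id_not_null":     lambda row: row.get("id") is not None,
    "amount_positive": lambda row: isinstance(row.get("amount"), (int, float)) and row["amount"] > 0,
}

def validate(batch):
    """Return {rule_name: [offending rows]} for every rule that any row violates."""
    failures = {}
    for row in batch:
        for name, rule in RULES.items():
            if not rule(row):
                failures.setdefault(name, []).append(row)
    return failures

batch = [{"id": 1, "amount": 10}, {"id": None, "amount": -5}]
print(validate(batch))
```

An empty result means the batch may proceed to the next layer; a non-empty one is typically quarantined and logged rather than silently dropped.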
 
 
Communication and Leadership Skills (10 points)
 
·        Ability to collaborate effectively with cross-functional teams and communicate complex technical concepts to non-technical stakeholders.
·        Strong problem-solving skills and experience working in an Agile or Scrum environment.
·        Ability to provide technical guidance and support to other team members on Databricks best practices.
·        Must have previous work experience conducting Knowledge Transfer sessions, ensuring that team members receive the knowledge required to support the system.
·        Must develop documentation and materials as part of review and knowledge transfer to other team members.

 

  • Apply Now
  • See All Jobs
