About the Role
Our public sector client is seeking a Senior Data Engineer for a 12-month, full-time contract, with the possibility of up to a 24-month extension. The Data Engineer will be required on a full-time basis (7.25-hour workday, 36.25 hours/week) to work on service innovation, program review, and digital transformation projects across the Client’s enterprise environment. Data Engineers will work as part of cross-functional program review or product delivery teams. These teams, led by Client product owners, will work collaboratively and collectively participate in a full range of activities, including field research; backlog definition and refinement; and sprint planning and execution. Digital transformation projects review the current state of services, identify future opportunities, and then deliver new services that are efficient, effective, and affordable.
The Client is seeking a talented and versatile Data Engineer to join its dynamic team. The ideal candidate will have a strong foundation in data engineering practices, combined with the analytical skills necessary to derive actionable insights from data. This role involves designing, implementing, and maintaining robust data pipelines and architectures, as well as performing detailed data analysis to support business decisions.
Responsibilities will include:
Services and project deliverables should evolve as the work progresses, in response to emerging user and business needs, as well as design and technical opportunities. However, the following must be delivered (iteratively) over the course of the project:
Data Engineering:
• Design, build, and maintain data pipelines on-premises and in the cloud (Azure, GCP, AWS) to ingest, transform, and store large datasets. Ensure pipelines are reliable and support multiple business use cases.
• Create and optimize dimensional models (star/snowflake) to improve query performance and reporting. Ensure models are consistent, scalable, and easy for analysts to use.
• Integrate data from SQL, NoSQL, APIs, and files while maintaining accuracy and completeness. Apply validation checks and monitoring to ensure high-quality data (a minimal sketch of this pattern follows this list).
• Improve ETL/ELT processes for efficiency and scalability. Redesign workflows to remove bottlenecks and handle large, disconnected datasets.
• Build and maintain end-to-end ETL/ELT pipelines with SSIS and Azure Data Factory. Implement error handling, logging, and scheduling for dependable operations.
• Automate deployment, testing, and monitoring of ETL workflows through CI/CD pipelines. Integrate releases into regular deployment cycles for faster, safer updates.
• Manage data lakes and warehouses with proper governance. Apply security best practices, including access controls and encryption.
• Partner with engineers, analysts, and stakeholders to translate requirements into solutions. Prepare curated data marts and fact/dimension tables to support self-service analytics.
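As an illustration of the ingestion and validation work described above, here is a minimal sketch of an ingest-validate-load step. It is illustrative only: the file name, column names, and SQLite staging target are assumptions rather than the Client's actual stack, where tools such as SSIS or Azure Data Factory would typically perform this work.

# Minimal ingest-validate-load sketch (hypothetical orders.csv feed,
# local SQLite staging table standing in for the real warehouse).
import csv
import sqlite3

REQUIRED = {"order_id", "order_date", "amount"}  # assumed schema

def load_orders(csv_path: str, db_path: str) -> int:
    """Ingest a CSV extract, apply basic validation, and load a staging table."""
    with open(csv_path, newline="") as fh:
        rows = list(csv.DictReader(fh))
    # Validation check: reject the whole batch if expected columns are missing.
    if rows and not REQUIRED.issubset(rows[0]):
        raise ValueError(f"missing columns: {REQUIRED - rows[0].keys()}")
    # Validation check: drop rows with null keys rather than loading bad data.
    clean = [r for r in rows if r["order_id"] and r["amount"]]
    con = sqlite3.connect(db_path)
    with con:
        con.execute(
            "CREATE TABLE IF NOT EXISTS stg_orders "
            "(order_id TEXT, order_date TEXT, amount REAL)"
        )
        con.executemany(
            "INSERT INTO stg_orders VALUES (?, ?, ?)",
            [(r["order_id"], r["order_date"], float(r["amount"])) for r in clean],
        )
    con.close()
    # Return the loaded row count so callers can log it for monitoring.
    return len(clean)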
Data Analytics:
• Analyze datasets to identify trends, patterns, and anomalies. Use statistical methods, DAX, Python, and R to generate insights that inform business strategies (see the sketch following this list).
• Develop interactive dashboards and reports in Power BI using DAX for calculated columns and measures. Track key performance metrics, share service dashboards, and present results effectively.
• Build predictive or descriptive models using statistical, Python, or R-based machine learning methods. Design and integrate data models to improve service delivery.
• Present findings to non-technical audiences in clear, actionable terms. Translate complex data into business-focused insights and recommendations.
• Deliver analytics solutions iteratively in an Agile environment. Mentor teams to enhance analytics fluency and support self-service capabilities.
• Provide data-driven evidence to guide corporate priorities. Ensure strategies and initiatives are backed by strong analysis, visualizations, and models.
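Similarly, the anomaly detection described above might, in its simplest form, look like the sketch below. The metric name and figures are hypothetical; in practice the data would come from the curated data marts and the results would feed Power BI dashboards.

# Minimal z-score anomaly flagging sketch (illustrative data only).
from statistics import mean, stdev

def flag_anomalies(values: list[float], z_threshold: float = 2.0) -> list[int]:
    """Return the indices of values whose z-score exceeds the threshold."""
    mu, sigma = mean(values), stdev(values)
    return [i for i, v in enumerate(values) if abs(v - mu) > z_threshold * sigma]

monthly_volumes = [120, 118, 125, 122, 119, 240, 121]  # hypothetical metric
print(flag_anomalies(monthly_volumes))  # -> [5], the outlier month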
Working Hours
Standard hours of work are 08:15 – 16:30 Alberta time, Monday through Friday, excluding holidays observed by the Client. Work must be done from within Canada due to network and data security requirements.
This resource will primarily work remotely; however, in the event of an onsite meeting in Edmonton, AB, neither the Client nor Handis Consulting Ltd. will pay for travel to attend on-site meetings, nor for any expenses related to relocation, commuting, housing/accommodation, or food and drink.
Equipment
The candidate shall be responsible for providing all of their own equipment. The computer's operating system must be a modern version of Windows or macOS that is compatible with Azure Virtual Desktop (AVD) and related software for remote access. Windows is preferred due to better compatibility. AVD and related remote-access software will be installed on the resource's computer.
Requirements
Education
Bachelor’s degree in Computer Science, IT, or a related field of study
Work Experience
3 years’ experience designing efficient dimensional models (star and snowflake schemas) for warehousing and analytics
3 years’ experience ensuring data quality, security, and governance
5 years’ experience as a Data Analyst, Data Engineer, or in a similar role
2 years’ experience using Git, collaborative workflows, CI/CD pipelines, containerization (Docker/Kubernetes), and Infrastructure as Code (Terraform, ARM, CloudFormation) to deploy and migrate data solutions
5 years’ experience manipulating and extracting data from diverse on-premises and cloud-based sources
3 years’ experience with SSIS, Azure Data Factory (ADF), and using APIs for extracting and integrating data across multiple platforms and applications
2 years’ experience performing migrations across on-premises, cloud, and cross-database environments
2 years’ experience in application development, with working knowledge of modern technologies including Next.js, Node.js, D3.js, GitHub Actions, and BuildMaster automation
2 years’ experience with databases and data integration, including PostgreSQL, MongoDB, Azure Cosmos DB, Azure Synapse, and Talend
1 year of exposure to AI/ML tools and workflows relevant to data engineering, such as integrating AI-driven analytics or automation within cloud platforms like Databricks and Azure
Standard Security Clearance and Potential Enhanced Security Clearance
The contractor shall, prior to commencement of the Services, provide the client, on its request and at no cost to the client/Handis Consulting Ltd., with a current criminal record check.
Should the candidate be assigned to a team requiring Enhanced Security Clearance, the Data Engineer must provide the Client with an Enhanced Security Clearance. Over the course of the SOW, the candidate may also be required to complete higher-level security clearances, such as Royal Canadian Mounted Police Top Secret Clearance. Please ensure applicants are eligible to obtain these clearances if required by the Client.
Resource References
Three references, for whom similar work has been performed, must be provided, with the most recent reference listed first. The Client may complete reference checks to assist with scoring of the proposed resource.
SUBMISSION MUST INCLUDE:
· RESUME
· ALL REQUIRED EXPERIENCE MUST BE DESCRIBED IN RESUME UNDER THE JOB/PROJECT WHERE EXPERIENCE WAS ATTAINED
· RESOURCE REFERENCES
If this opportunity is of interest to you, please provide your resume detailing all of your relevant experience and certifications, as well as three recent references. Candidates must have, or immediately obtain, Incorporated Business status.
Handis Consulting prides itself on being an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, sexual orientation, or any other characteristic protected by law.
About the Company