Application closing date: Sunday, 21 August 2022 • 11:59 pm, Canberra time

Estimated start date: Thursday, 01 September 2022

Location of work: ACT

Length of contract: 6 months

Contract extensions: 2x 12 months

Security clearance: Must have Baseline

Rates: $120 - $150 per hour (inc. super)

The Department of Agriculture, Fisheries and Forestry (DAFF) requires an experienced Data Engineer with a background in developing Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT) processes across several data and analytics platforms.

The role will be responsible for design, development and unit testing activities across several data movement and transformation processes within DAFF. These processes are being uplifted to Microsoft Azure, and the successful candidate will require experience with the following technologies:

1. MS Azure Stack
    a. Data Integration
        i. Data Factory
        ii. SQL Server Integration Services
    b. Data Store
        i. SQL Server
        ii. Synapse Analytics
        iii. Blob/File Storage
        iv. Cosmos DB
    c. Analytics
        i. Azure Databricks
        ii. Stream Analytics

2. Development Tools – DevOps

3. Data technology solutions – sourcing data (from Oracle, Ingres, Azure and Wintel Server network file shares), collecting, ingesting and storing it

4. Strong data visualization skills would also be an advantage

Positions are based in our Canberra offices. While the department’s current Working Away from the Office (WAFO) policy encourages staff to work in the office a minimum of 40% of the time (such as 2 days per week or 4 days per fortnight), this team requires people to work in the office 60% of the time, or 3 days a week. Variations can be discussed with the director; however, any request for less than 40% in the office needs to be supported by the director and approved by the CIO. Due to the nature of some projects, there are some positions where working from home will not be available.

Essential Criteria

  1. Experience developing ETL/ELT processes for data movement and transformation.
  2. Experience developing data storage layers optimized for query performance.
  3. Experience with Azure cloud technologies, working with engineering, storage and analytics services.
  4. Experience working with multi-disciplinary teams using an Agile methodology.

Desirable Criteria

  1. Experience working with Azure Data Factory and Databricks.
  2. Experience working with Data Lake.