Contract Data Modeller (178027), Sydney CBD, Sydney, Australia

Salary: AUD 80 - AUD 100 per hour

We’re currently seeking an experienced data modeller to join a highly skilled data and analytics team working on cutting-edge enterprise solutions. This is an exciting opportunity to deliver advanced data models, scalable architectures, and high-performance pipelines across cloud and on-prem environments.

In this role, you’ll collaborate with cross-functional teams, lead technical delivery, and help shape the future of data platforms, governance, and architecture for major clients.

About the Role

This is a hands-on role focused on designing, implementing, and maintaining advanced data models and end-to-end pipelines. You’ll help define data structures and architecture across various platforms, enabling high-quality analytics, machine learning, and business intelligence solutions.

The successful candidate will bring a deep understanding of data architecture, big data ecosystems, and modern cloud platforms, combined with strong communication and leadership skills.

Key Responsibilities

  • Design, develop and maintain scalable data models across cloud and on-prem platforms

  • Build, test and support batch and near real-time data flows and pipelines

  • Translate business requirements into technical solutions using established data modelling techniques

  • Apply modelling approaches including 3NF, Dimensional Modelling, ER, and Data Vault

  • Deliver Proof of Concepts and working demos for client engagements

  • Collaborate on delivery engagements and workshops with internal and external stakeholders

  • Lead or mentor junior team members and contribute to capability development

  • Work across data architecture, governance, and metadata strategies in large environments

  • Ensure technical standards, performance SLAs, and data quality frameworks are maintained

Skills and Experience – Mandatory

  • Experience designing and implementing data models for Big Data / DWH platforms (Star and Snowflake schemas)

  • Familiarity with industry data models such as IFW, BDW, or FSDM (IBM, Teradata)

  • Strong proficiency in tools like Erwin and Sparx Systems Enterprise Architect

  • Expertise in Python, Scala, SQL, and shell scripting

  • Strong background in cloud data platforms (Azure, AWS, GCP, IBM), including tools such as Azure Synapse and Cosmos DB

  • Experience building scalable, fault-tolerant services with high availability

  • Strong grasp of ETL, data warehousing and BI best practices

  • Practical experience with Agile, Kanban, or Waterfall project methodologies

  • Confident presenting to stakeholders and senior leadership

  • Previous team leadership or mentoring experience

  • Degree in Computer Science, IT, or a related field

Desirable Skills

  • One or more industry-recognised certifications (e.g. Azure Data Engineer, AWS Big Data, GCP Professional Data Engineer)

  • Knowledge of Data Mesh architecture and domain-driven data products

  • Understanding of modern infrastructure, containerisation, and virtualisation

If this role is of interest, please apply and Experis will be in contact.
