We've been proudly bringing joy to tables and smiles to faces with the delicious food we've been crafting for over 100 years. A large part of our strong growth and success comes from always looking for new ways to do things and thinking about how we can create a better, more sustainable future together. Our success as an industry leader in providing deliciously good food means we can continue to invest in our future as we work towards becoming Australia and New Zealand's first choice for poultry.
Your opportunity to join our Technology team in North Ryde!
Inghams is seeking a talented Senior Data Engineer to join our AI, Data & Digital Innovation team and play a foundational role in building enterprise-scale data capabilities that will power our business for the next decade. This is your opportunity to architect and deliver production-grade data infrastructure at Australia and New Zealand's largest integrated poultry producer.
We're in the midst of an exciting digital transformation journey, rolling out our cloud data platform powered by AWS. As our Senior Data Engineer, you'll be instrumental in delivering our data transformation initiative: integrating legacy ERP, manufacturing execution, and operational systems with modern cloud architecture, establishing the data foundation that enables AI/ML innovation, consumer insights, and intelligent automation across our operations.
Key responsibilities:
- Deliver enterprise-scale data infrastructure at pace, using AI-powered development tools to accelerate pipeline development, testing, and deployment cycles. Build production-ready data solutions rapidly through AI-assisted coding, automated testing frameworks, and modern DevOps practices that compress traditional delivery timelines.
- Experiment with emerging AWS services, pioneer new architectural patterns, and implement cutting-edge data engineering techniques.
- Build scalable, AI-enabled data pipelines that serve as the foundation for ML/AI innovation, advanced analytics, and intelligent automation. Design data infrastructure that enables rapid experimentation, supports diverse AI/ML workloads, and unlocks new business capabilities across the enterprise.
- Rapidly prototype and productionise data solutions that directly impact business outcomes, iterating quickly based on stakeholder feedback and operational metrics.
- Champion a culture of rapid innovation and continuous experimentation, leveraging AI-assisted development to compress build cycles, improve code quality, and capture knowledge automatically.
- Develop and optimise ETL/ELT processes to integrate legacy ERP, manufacturing execution, and operational systems with modern cloud-based data architecture.
- Implement data modelling and schema design for AWS services including S3 (data lake), Redshift (data warehouse), and supporting services, applying dimensional modelling and data vault techniques to ensure scalability and performance.
- Establish data governance, quality frameworks, and data contracts across enterprise data sources.
- Collaborate with AI, Analytics and BI teams to ensure data availability, discoverability, and fitness for purpose across use cases from operational reporting to advanced ML model training and real-time inference.
- Support consumer analytics and customer intelligence capabilities with reliable data infrastructure, APIs, and integration patterns.
- Work with cross-functional teams to define data requirements, establish data contracts, and deliver against business-driven use cases.
- Maintain data security, compliance, and access controls in line with enterprise governance frameworks and AWS best practices.
- Monitor, troubleshoot and improve data pipeline performance, reliability and cost-efficiency across the AWS environment, implementing observability, alerting, and automated remediation where appropriate.
- Document data architecture, data flows, and technical specifications to support knowledge sharing, platform scalability, and onboarding of future team members.
Key requirements:
- Strong AWS data engineering experience with hands-on expertise in S3, Redshift, Glue, Lambda, Step Functions, Athena, and related services.
- Production-grade Python and SQL skills with extensive experience building ETL/ELT pipelines, writing complex queries, optimising database performance, and implementing data quality frameworks.
- Data modelling and architecture expertise with practical knowledge of dimensional modelling, data vault, schema design patterns, and understanding of how to structure data for both analytical and operational use cases.
- Infrastructure as Code (IaC) proficiency using Terraform, CloudFormation, or CDK to provision and manage cloud data infrastructure in a repeatable, version-controlled manner.
- CI/CD and DevOps practices with experience implementing automated testing, deployment pipelines, and monitoring for data infrastructure using tools like GitHub Actions, Jenkins, or AWS CodePipeline.
- Data governance and quality mindset, understanding the importance of data contracts, lineage, validation, and metadata management.
- Business translation and communication skills, with the ability to confidently explain technical data concepts to non-technical stakeholders, gather requirements from business users, and translate ambiguous needs into concrete technical solutions.
- Problem-solving and systems thinking with a genuine interest in understanding business challenges, evaluating trade-offs, and delivering practical, maintainable solutions.
- 6–8 years of hands-on data engineering experience in a commercial environment, with at least 3 years focused on cloud data platforms and production pipeline development.
- Experience in FMCG, manufacturing, or supply chain environments is advantageous.
We’re looking for people who are curious, caring, courageous and committed to join us; people who want to contribute their best work every single day and continue delivering deliciously good food that’s… Always Good!