Senior Data Engineer
Company Description
Nine is Australia’s largest locally owned media company – the home of Australia’s most trusted and loved brands spanning News, Sport, Lifestyle, and Entertainment. We pride ourselves on creating the best content, accessed by consumers when and how they want – across Publishing, Broadcasting and Digital.
We bring people together by celebrating the big occasions and connecting the everyday moments. Australia belongs here. We bring our purpose to life via three shared values: We walk the talk, turn over every stone and keep it human.
Job Description
Data Engineering builds and manages the pipelines, applications and infrastructure for collecting, processing, and storing Nine’s data. We are responsible for collecting hundreds of millions of events from client devices each day and streaming them directly into our data warehouse. We extract data from the majority of Nine’s enterprise systems, centralise it in our data warehouse, and create rich data models for analysis and to inform business operations and strategy.
The team is also responsible for building solutions to surface data to users including dashboards, Slack integrations and automated static reports, as well as providing data feeds to downstream systems for personalising audience experiences and commercialisation.
The Agile squad you’ll join functions at the cross-section of Nine’s Streaming services and data products. 9Now is Australia’s leading broadcast video on demand platform with almost half of all BVOD minutes watched on one of our apps. Data plays a critical role in the streaming value chain - in particular through the collection of audience metrics which inform broadcast ratings and underpin commercialisation efforts on 9Now.
Role responsibilities
- Act as the VOZ and Total TV subject matter expert (SME) for the Streaming data squad.
- Design and implement data models that align with business requirements and support efficient querying and analysis.
- Design, develop and maintain data pipelines that extract, load and transform data from various sources into the data warehouse.
- Ensure the pipelines are reliable, scalable, and efficient, considering factors including data volume, velocity, and variety.
- Implement data quality checks and validation processes to maintain accurate and reliable data.
- Set up monitoring and alerting to proactively identify and resolve issues.
- Perform root cause analysis for incidents and implement corrective actions.
- Contribute to the technical design and vision for the data platform.
Employee responsibilities
- Collaborate with cross-functional teams, including product managers, data scientists, analysts, and business stakeholders to understand data requirements and deliver solutions that meet business needs.
- Communicate technical concepts and solutions effectively to non-technical stakeholders.
- Stay updated with emerging data engineering technologies and trends, and evaluate their potential benefits for the organisation's data ecosystem.
Qualifications
Essential qualifications, experience & skills
- Fluent in at least one programming language (e.g. Python, C#, Java).
- Proficient in SQL.
- Understanding of OLTP concepts including normalised schema design.
- Understanding of OLAP concepts and data model design.
- Experience leading the design and build of data pipelines and ETL processes, and managing data infrastructure.
- Experience leading workflow automation and managing task dependencies using a tool such as Apache Airflow.
Problem-Solving & Leadership:
- A strong analytical mindset, a passion for tackling complex technical challenges, and the ability to find creative solutions to issues.
- Excellent communication and interpersonal skills, with the ability to lead technical discussions, build consensus, and work effectively with cross-functional teams.
- Deep understanding of cloud migration strategies, best practices, and potential pitfalls.
Desirable qualifications, experience & skills
- A bachelor's or master's degree in computer science, software engineering, data engineering, data science, information technology or a related field.
- Experience with Google Cloud Platform, AWS or Azure including services for data storage, processing, and analytics.
- Experience with cloud deployment and test automation using a CI/CD solution such as Cloud Build or Concourse.
- Experience in deploying cloud infrastructure as code (IaC) using Terraform or similar.
Additional Information
Our Commitment to Diversity and Inclusion:
At Nine, we are committed to fostering a workforce that embraces all aspects of diversity and inclusion and where practices are equitable to ensure our people experience a sense of belonging. From day one, you'll be encouraged to bring your whole self to work and will be supported to perform at your best. Should you require any adjustments to the recruitment process in order to equitably participate, we encourage you to advise us at the time of application.
We encourage applications from Aboriginal and Torres Strait Islander people, people with disabilities, and of all ages, nationalities, backgrounds and cultures.
Disclaimer: We do not accept unsolicited agency resumes and are not responsible for any fees related to unsolicited resumes.
Work rights: Please note to apply for this role you must already have the right to lawfully work and live in Australia.