
Data Engineer


Optum

2mo ago

  • Job
    Full-time
    Mid & Senior Level
  • Data
  • Dublin

AI generated summary

  • You must have experience with data pipelines, SQL, DevOps, Python, Snowflake or Databricks, AWS/Azure/GCP, Airflow, agile methodologies, and both real-time and batch data processing.
  • You will develop and maintain ETL pipelines, integrate data from multiple sources, perform data transformation, ensure data quality, implement governance, and monitor systems for reliability.

Requirements

  • Hands-on experience developing data pipelines that demonstrate a strong understanding of software engineering principles
  • Ability to debug complex data issues while working on very large data sets with billions of records
  • Fluent in SQL, with experience using window functions and other advanced features
  • Understanding of DevOps tools, Git workflow and building CI/CD pipelines
  • Ability to work with business and technical audiences in business requirement meetings, technical whiteboarding exercises, and SQL coding/debugging sessions
  • Well versed in Python across multiple general-purpose use cases, including but not limited to developing data APIs and pipelines
  • Experience with Snowflake or Databricks
  • Experience with relational data models and OLAP processing
  • Experience building data pipelines on AWS, Azure, or GCP, following best practices in cloud deployments
  • Familiar with Airflow or similar orchestration tool
  • Experience working in projects with agile/scrum methodologies
  • Experience with shell scripting languages
  • Experience with JavaScript/ReactJS/NodeJS
  • Experience working with both real-time and batch data, knowing the strengths and weaknesses of both and when to apply one over another
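As an illustration of the SQL fluency the role calls for, here is a minimal window-function sketch, run against an in-memory SQLite database (the `claims` table and its columns are hypothetical, chosen only for the example):

```python
import sqlite3

# Hypothetical claims table: rank each member's claims by amount.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (member_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?)",
    [("a", 120.0), ("a", 80.0), ("b", 50.0)],
)

# ROW_NUMBER() OVER (...) is a window function: it numbers the rows
# within each member's partition, ordered by claim amount descending.
rows = conn.execute(
    """
    SELECT member_id, amount,
           ROW_NUMBER() OVER (
               PARTITION BY member_id ORDER BY amount DESC
           ) AS rn
    FROM claims
    """
).fetchall()

# Keep only each member's largest claim.
largest = sorted((m, amt) for m, amt, rn in rows if rn == 1)
print(largest)  # [('a', 120.0), ('b', 50.0)]
```

The same pattern (deduplicating or ranking within a partition) scales to billion-row tables on Snowflake or Databricks, where the optimizer rather than the query shape does the heavy lifting.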

Responsibilities

  • Data Pipeline Development: Develop and maintain data pipelines that extract, transform, and load (ETL) data from various sources into a centralized data storage system, such as a data warehouse or data lake. Ensure the smooth flow of data from source systems to destination systems while adhering to data quality and integrity standards.
  • Data Integration: Integrate data from multiple sources and systems, including databases, APIs, log files, streaming platforms, and external data providers. Handle data ingestion, transformation, and consolidation to create a unified and reliable data foundation for analysis and reporting.
  • Data Transformation and Processing: Develop data transformation routines to clean, normalize, and aggregate data. Apply data processing techniques to work with complex data structures, resolve missing or inconsistent data, and prepare the data for analysis, reporting, or machine learning tasks.
  • Frameworks and Best Practices: Contribute to common frameworks and best practices in code development, deployment, and automation/orchestration of data pipelines.
  • Data Governance: Implement data governance in line with company standards.
  • Monitoring and Support: Monitor data pipelines and data systems to detect and resolve issues promptly. Develop monitoring tools, alerts, and automated error handling mechanisms to ensure data integrity and system reliability.
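The responsibilities above can be sketched in miniature as an extract-transform-load function with a basic data-quality gate. All names here are illustrative, not Optum's actual stack:

```python
def extract(raw_records):
    """Extract: pull records from a source (here, an in-memory list)."""
    return list(raw_records)

def transform(records):
    """Transform: clean and normalize; drop rows failing a quality check."""
    cleaned = []
    for rec in records:
        # Data-quality gate: skip records with missing required fields.
        if rec.get("member_id") is None or rec.get("amount") is None:
            continue
        cleaned.append({
            "member_id": str(rec["member_id"]).strip().lower(),
            "amount": round(float(rec["amount"]), 2),
        })
    return cleaned

def load(records, warehouse):
    """Load: append validated records to the destination store."""
    warehouse.extend(records)
    return len(records)

warehouse = []
raw = [
    {"member_id": " A1 ", "amount": "120.456"},
    {"member_id": None, "amount": "10"},   # rejected by the quality gate
    {"member_id": "B2", "amount": 50},
]
loaded = load(transform(extract(raw)), warehouse)
print(loaded, warehouse)
```

In practice each stage would be a task in an orchestrator such as Airflow, with the quality gate emitting metrics and alerts rather than silently dropping rows.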

FAQs

What is the job title for this position?

The job title for this position is Data Engineer.

Where is the job located?

The job is located in Ireland and follows a hybrid work model.

What is the primary focus of the Data Engineer role?

The primary focus of the Data Engineer role is to streamline the flow of information and deliver insights to manage Stars Analytics, helping to make healthcare decisions easier for members and saving on healthcare costs.

What types of data will the Data Engineer be working with?

The Data Engineer will work with Medicare & Retirement data migration to the cloud and integrate new data sources to improve predictive model accuracy and drive organizational efficiencies.

What are the primary responsibilities of the Data Engineer?

The primary responsibilities include data pipeline development, data integration, data transformation and processing, contributing to code development best practices, data governance implementation, and monitoring support for data systems.

What qualifications are required for this Data Engineer position?

Required qualifications include hands-on experience developing data pipelines, debugging complex data issues, fluency in SQL, understanding DevOps tools and CI/CD pipelines, and proficiency in Python.

What are some preferred qualifications for this position?

Preferred qualifications include experience with Snowflake or Databricks, building data pipelines on cloud platforms like AWS, Azure, or GCP, familiarity with Airflow, and experience with agile/scrum methodologies.

Is experience with large data sets important for this role?

Yes, the ability to debug complex data issues while working on very large data sets with billions of records is essential.

Will the Data Engineer be working independently or with a team?

The Data Engineer will collaborate with both business and technical audiences, so teamwork is an important aspect of the role.

Does Optum have a commitment to diversity and inclusion?

Yes, Optum is committed to diversity and inclusion, as reflected in their mission and employment policy. They provide equal opportunity to all applicants without regard to various protected characteristics.

Industry: Science & Healthcare
Employees: 10,001+

Mission & Purpose

Optum is a health services and technology company that provides a wide range of solutions to improve healthcare delivery and outcomes. They offer services in healthcare management, data analytics, pharmacy benefit management, and technology solutions to healthcare providers, payers, employers, and government agencies. Optum's ultimate mission is to improve the health system's efficiency and effectiveness, creating a healthier world for everyone. Their purpose lies in collaborating with healthcare partners to address complex challenges, such as improving care coordination, reducing healthcare costs, and enhancing patient experiences. By leveraging data-driven insights and innovative technology, Optum aims to empower healthcare professionals and organisations to deliver high-quality, patient-centered care and drive positive healthcare outcomes for individuals and communities worldwide.

Culture & Values

  • Integrity

    Honour commitments. Never compromise ethics.

  • Compassion

    Walk in the shoes of people we serve and those with whom we work.

  • Relationships

    Build trust through collaboration.

  • Innovation

    Invent the future, learn from the past.

  • Performance

    Demonstrate excellence in everything we do.