Job Information

UnitedHealth Group Senior Data Engineer - Azure - Multiple Locations in Noida, India

Combine two of the fastest-growing fields on the planet with a culture of performance, collaboration and opportunity and this is what you get. Leading edge technology in an industry that's improving the lives of millions. Here, innovation isn't about another gadget, it's about making health care data available wherever and whenever people need it, safely and reliably. There's no room for error. Join us and start doing your life's best work.(sm)

The project, under Data Solutions in Optum, is a cloud migration initiative to build a unified data platform that caters to the unique data needs of our consumers and end users. As part of it, we are migrating the end-to-end data delivery stack to the Azure cloud. Our team is looking for experts with experience in migrating large warehouse datasets to the cloud, building complex data pipelines for data loading, and performing analytics. We are essentially looking for data engineers with experience in the Azure ecosystem and hands-on experience in the design, development, and deployment of large-scale Azure cloud solutions.

Primary Responsibilities:

  • As an engineer, design technical solutions and develop them using Azure cloud-native tools and software

  • Copy large data sets from on-premises systems to Azure Data Lake Storage Gen2 and develop data pipelines using Azure Data Factory

  • Load the data into MPP database systems such as Synapse Analytics or Snowflake and perform analytics using Azure Databricks (see the sketch after this list)

  • Automate end-to-end data pipeline executions using scheduling tools such as Airflow or ADF-native schedulers

  • Version code on GitHub and implement CI/CD using Jenkins pipelines with automated code deployment

  • Develop complex SQL queries running on terabytes of warehouse data, optimize their overall run time, monitor data shuffles within queries, and tune SQL performance

  • Participate in ETL design discussions, specification and requirements review, and feature and backlog grooming

  • Focus primarily on building "everything as code" and developing generic, reusable components

  • Emphasize effective logging and debugging, covering failure and exception scenarios

  • Apply alternative design thinking and develop workaround solutions for blockers, edge cases, and corner-case scenarios in data migration

  • Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
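For context on the load step referenced in the list above, here is a minimal PySpark sketch of reading Parquet files staged on ADLS Gen2 and loading them into a Synapse Analytics table via the Databricks Synapse connector. The storage account, container, paths, table, and JDBC details are hypothetical placeholders, not the actual environment for this role.

# Minimal PySpark sketch (Azure Databricks): read Parquet from ADLS Gen2 and
# load it into a Synapse Analytics (dedicated SQL pool) table.
# All storage, table, and JDBC details below are hypothetical placeholders.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already provided on Databricks

# Hypothetical source path on ADLS Gen2 (abfss = ADLS Gen2 driver)
source_path = "abfss://raw@examplestorageacct.dfs.core.windows.net/claims/2024/"

df = spark.read.parquet(source_path)

# Light cleanup before loading
df_clean = df.dropDuplicates().filter("claim_amount IS NOT NULL")

# Write to Synapse using the Databricks Synapse connector; tempDir stages the
# data in ADLS Gen2 before it is moved into the dedicated SQL pool.
(df_clean.write
    .format("com.databricks.spark.sqldw")
    .option("url", "jdbc:sqlserver://example-synapse.sql.azuresynapse.net:1433;"
                   "database=dw;encrypt=true;loginTimeout=30")
    .option("forwardSparkAzureStorageCredentials", "true")
    .option("dbTable", "dbo.claims_curated")
    .option("tempDir", "abfss://staging@examplestorageacct.dfs.core.windows.net/tmp/")
    .mode("overwrite")
    .save())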

Required Qualifications:

  • 4+ years of experience in data engineering and in working on large data warehouse MPP database systems such as Synapse Analytics, including design and development of ETL on distributed platforms

  • 4+ years of experience in constructing large, complex queries on terabyte-scale warehouse database systems

  • 3+ years of experience in the Azure cloud ecosystem, particularly with Azure resources for migrating data and ETL from on-premises to the cloud

  • 2+ years of experience with Azure database offerings such as SQL DB and Postgres DB, constructing data pipelines using Azure Data Factory, and designing and developing analytics using Azure Databricks

  • 1+ years of experience with cloud scheduling tools, using Apache Airflow, Logic Apps, or any native/third-party cloud scheduling tool

  • Hands-on experience with Azure runtime execution services such as Azure Containers and Azure Function Apps, including real-world experience developing, building, and deploying Azure Functions

  • Knowledge of multiple solutions for integrating ADF data pipelines, SQL scripts, and Databricks jobs into an Airflow DAG or Azure Logic App (see the sketch after this list)

  • Knowledge of data quality and data profiling management, and of supporting open-source tools for the same
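As a reference for the ADF-to-Airflow integration mentioned above, here is a minimal Airflow DAG sketch that triggers an Azure Data Factory pipeline and then submits a Databricks notebook run. The connection IDs, pipeline, factory, resource group, and notebook names are hypothetical, and the schedule syntax assumes Airflow 2.4+ with the Microsoft Azure and Databricks provider packages installed.

# Minimal Airflow DAG sketch: trigger an ADF pipeline, then run a Databricks
# notebook task. All names and connection IDs below are hypothetical.

from datetime import datetime

from airflow import DAG
from airflow.providers.microsoft.azure.operators.data_factory import (
    AzureDataFactoryRunPipelineOperator,
)
from airflow.providers.databricks.operators.databricks import (
    DatabricksSubmitRunOperator,
)

with DAG(
    dag_id="adf_to_databricks_daily_load",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",  # daily at 02:00 (Airflow 2.4+ "schedule" argument)
    catchup=False,
) as dag:

    # Run the ADF copy pipeline that lands on-premises extracts into ADLS Gen2
    run_adf_copy = AzureDataFactoryRunPipelineOperator(
        task_id="run_adf_copy_pipeline",
        azure_data_factory_conn_id="azure_data_factory_default",
        pipeline_name="copy_onprem_to_adls",     # hypothetical pipeline name
        resource_group_name="rg-data-platform",  # hypothetical resource group
        factory_name="adf-data-platform",        # hypothetical factory name
        wait_for_termination=True,
    )

    # Submit a Databricks notebook run that loads the staged files downstream
    run_databricks_load = DatabricksSubmitRunOperator(
        task_id="run_databricks_load",
        databricks_conn_id="databricks_default",
        new_cluster={
            "spark_version": "13.3.x-scala2.12",
            "node_type_id": "Standard_DS3_v2",
            "num_workers": 2,
        },
        notebook_task={"notebook_path": "/pipelines/load_claims_to_synapse"},
    )

    run_adf_copy >> run_databricks_load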

Preferred Qualifications:

  • Azure certifications, e.g., AZ-200, AZ-201, or AZ-400

  • 1+ years of experience working with cloud-native monitoring and logging tools such as Log Analytics, or experience with comparable third-party services

  • Proficiency in at least one programming language, preferably Python

  • Excellent documentation experience and skills

  • Knowledge of cloud shell, either PowerShell or Bash

  • Knowledge of the Apache open-source software stack for quick integration

  • Ability to work as an individual contributor within a team and to peer-review code commits

  • Excellent written and verbal communication skills for communicating with customers, technically and procedurally

  • Highly self-motivated with excellent interpersonal and collaborative skills

  • Solid analytical and problem-solving skills

  • Able to anticipate risks and obstacles and develop plans for mitigation

Careers with Optum. Here's the idea. We built an entire organization around one giant objective: make the health system work better for everyone. So when it comes to how we use the world's largest accumulation of health-related information, or guide health and lifestyle choices or manage pharmacy benefits for millions, our first goal is to leap beyond the status quo and uncover new ways to serve. Optum, part of the UnitedHealth Group family of businesses, brings together some of the greatest minds and most advanced ideas on where health care has to go in order to reach its fullest potential. For you, that means working on high performance teams against sophisticated challenges that matter. Optum, incredible ideas in one incredible company and a singular opportunity to do your life's best work.(sm)

Job Keywords: Senior Data Engineer, Data Engineer, Data Engineering, Azure, Data Warehouse, MPP, Synapse Analytics, ETL, Azure Cloud, ETL Development, SQL DB, Postgres DB, Azure Data Factory, Azure Data Bricks, Scheduling Tools, Apache Airflow, Noida, UP, Uttar Pradesh, Hyderabad, TG, Telangana, Gurgaon, HR, Haryana