Job Information

UKG (Ultimate Kronos Group) Sr Data Engineer in Noida, India

Job Summary: The Software Engineer - Data will be responsible for designing, developing, and maintaining robust data pipelines and architectures on the Google Cloud platform. The ideal candidate will have extensive experience with orchestration and data engineering tools as well as cloud data warehouses. They will be adept at working with various data sources, including files, FTP, and third-party APIs such as Salesforce and Qualtrics, as well as critical systems like ERPs. The role requires a deep understanding of medallion architecture and star schema data modeling to transform raw data into a cohesive enterprise data warehouse (EDW) structure. The candidate should be comfortable working in an agile/scrum environment, utilizing standardized frameworks, Git repositories, and CI/CD processes for deployment.
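By way of illustration, a minimal Apache Airflow DAG for this kind of orchestrated ingestion might look like the sketch below. This is a sketch only, assuming a recent Airflow 2.x install; the DAG id, task names, and the stubbed Salesforce-to-BigQuery steps are hypothetical, not details from the posting.

    # Illustrative sketch: orchestrating a daily API ingestion with Airflow.
    # DAG id, task names, and the stubbed steps are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_salesforce_accounts(**context):
        # In a real pipeline this would call the Salesforce REST API and
        # land the raw JSON in a bronze-layer bucket.
        pass


    def load_to_bigquery(**context):
        # In a real pipeline this would load the landed files into a
        # BigQuery staging table.
        pass


    with DAG(
        dag_id="salesforce_to_bq_bronze",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(
            task_id="extract_salesforce_accounts",
            python_callable=extract_salesforce_accounts,
        )
        load = PythonOperator(
            task_id="load_to_bigquery",
            python_callable=load_to_bigquery,
        )
        extract >> load  # extraction must finish before the load runs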

Key Responsibilities:

• Software Development: Write clean, maintainable, and efficient code for various software applications and systems. Design, develop, and maintain scalable data pipelines using modern cloud data tools such as Apache Airflow, DataFlow, Google BigQuery, Azure Databricks, and Delta Lake (Lakehouse). Ingest data from various sources, including files, FTP, third-party APIs (e.g., Salesforce, Qualtrics), and ERPs. Implement medallion architecture in data lakes (Azure Data Lake & Google Cloud Storage) and star schema data modeling to transform and integrate data into a unified EDW structure using advanced SQL and Python (PySpark); see the illustrative sketch after this list.

• Collaboration: Work with cross-functional teams, business data analysts, product owners, and business users to understand business requirements and translate them into technical solutions.

• Governance: Ensure data quality, integrity, and security across all data pipelines and storage solutions.

• Technical Leadership: Contribute to the design, development, and deployment of complex software applications and systems, ensuring they meet high standards of quality and performance.  

• Project Management: Manage execution and delivery of features and projects, negotiating project priorities and deadlines to ensure successful, timely, high-quality completion. Participate in agile/scrum ceremonies, contributing to sprint planning, stand-ups, and retrospectives.

• Architectural Design: Participate in design reviews with peers and stakeholders and in the architectural design of new features and systems, ensuring scalability, reliability, and maintainability.   

• Code Review: Diligently review code developed by other developers, provide feedback, and maintain a high bar of technical excellence, ensuring code adheres to industry-standard best practices: coding guidelines; elegant, efficient, and maintainable code; observability built in from the ground up; unit tests; and so on.

• Testing: Build testable software, define tests, participate in the testing process, and automate tests using tools (e.g., JUnit, Selenium) and design patterns, leveraging the test automation pyramid as a guide.

• Service Health and Quality: Maintain the health and quality of services and incidents, proactively identifying and resolving issues. Use service health indicators and telemetry to drive action, providing recommendations to optimize performance. Conduct thorough root cause analysis and drive the implementation of measures to prevent future recurrences. Maintain and support the data fabric / data platform that data analysts and scientists leverage for end-to-end data and information.

• DevOps Model: Work in a DevOps model, taking ownership from gathering requirements with product management through designing, developing, testing, deploying, and maintaining the software in production.

• Documentation: Properly document new features, enhancements, or fixes to the product, and contribute to training materials.

• Develop data products, including web applications, API endpoints, and data visualizations, that deliver information and insights to business users.  

• Develop, maintain, and support the front-end Data Marketplace.
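As a rough illustration of the medallion-architecture and star-schema work described above, the PySpark sketch below moves data from a bronze landing zone through a silver cleaning step into a gold star-schema fact table. The paths, column names, and Delta Lake setup are assumptions for the sake of the example, not details from the posting.

    # Illustrative sketch: bronze -> silver -> gold (medallion) flow ending
    # in a star-schema fact table. Paths and column names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("medallion_sketch").getOrCreate()

    # Bronze: raw files landed as-is (e.g., from FTP or an API export).
    bronze = spark.read.json("/lake/bronze/salesforce/accounts/")

    # Silver: deduplicated records with conformed types.
    silver = (
        bronze
        .dropDuplicates(["account_id"])
        .withColumn("created_at", F.to_timestamp("created_at"))
    )

    # Gold: join to a conformed dimension and keep only fact-table columns.
    dim_region = spark.read.format("delta").load("/lake/gold/dim_region/")
    fact_accounts = (
        silver.join(dim_region, on="region_code", how="left")
              .select("account_id", "region_key", "created_at", "annual_revenue")
    )

    # Assumes the session is configured for Delta Lake (delta-spark package).
    fact_accounts.write.format("delta").mode("overwrite").save("/lake/gold/fact_accounts/")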

Qualifications:

• Bachelor’s degree in Computer Science, Information Technology, or a related field.

• 6+ years of experience in data engineering, with a focus on Azure and/or Google cloud data technologies.

• Proficiency in cloud data warehousing and orchestration tools such as Azure Data Factory, Apache Airflow, DataFlow, Google BigQuery, Azure Databricks, and Delta Lake (Lakehouse).

• Strong knowledge of data ingestion techniques from various sources, including files, FTP, and third-party APIs.

• Experience with medallion architecture and star schema data modeling.

• Proven ability to monitor, troubleshoot, and tune data pipelines for optimal performance.

• Advanced skills in SQL, Python (PySpark) and JavaScript.

• Familiarity with agile/scrum methodologies and standardized frameworks.

• Experience with Git repositories and CI/CD processes.

• Excellent problem-solving skills and attention to detail.

• Strong communication and collaboration skills.

• Self-starter with a proactive approach to learning and development.

Preferred Qualifications:

• Experience with Apache Spark frameworks

• Experience with some ML Engineering concepts and with deploying and serving ML assets using tools such as Vertex AI, Databricks ML (MLflow), and Kubeflow

• Experience with test automation frameworks and tools

It is the policy of Ultimate Software to promote and assure equal employment opportunity for all current and prospective Peeps without regard to race, color, religion, sex, age, disability, marital status, familial status, sexual orientation, pregnancy, genetic information, gender identity, gender expression, national origin, ancestry, citizenship status, veteran status, and any other legally protected status entitled to protection under federal, state, or local anti-discrimination laws. This policy governs all matters related to recruitment, advertising, and initial selection of employment. It shall also apply to all other aspects of employment, including, but not limited to, compensation, promotion, demotion, transfer, lay-offs, terminations, leave of absence, and training opportunities.
