Job Information
Nielsen Senior Software Engineer - Big Data (Java/Scala, Spark, Python, AWS) in Gurgaon, India
At Nielsen, we believe that career growth is a partnership. You ultimately own, fuel and set the journey. By joining our team of nearly 14,000 associates, you will become part of a community that will help you to succeed. We champion you because when you succeed, we do too. Embark on a new initiative, explore a fresh approach, and take license to think big, so we can all continuously improve. We enable your best to power our future.
As a Senior Software Developer, you will be a contributor on a Scrum/DevOps team, focusing on analyzing, developing, testing, and supporting highly complex application software built on Big Data.
Your primary objective is to ensure project goals are achieved and are aligned with business objectives. You will also work closely with your Scrum team and program team to test, develop, refine and implement quality software in production via standard Agile methodologies.
Responsibilities
Stay informed about the latest technology and methodology by participating in industry forums, having an active peer network, and engaging actively with customers
Cultivate a team environment focused on continuous learning, where innovative technologies are developed and refined through collaborative effort
Build scalable, reliable, cost-effective solutions for both the Cloud and on-premises with an emphasis on quality, best-practice coding standards, and cost-effectiveness
Build and test Cloud-based applications for new and existing backend systems to help development teams migrate to the cloud.
Build reusable platform code and components that can be used by multiple project teams.
Understand the enterprise architecture within the context of existing platforms, services and strategic direction.
Implement end-to-end solutions with sound technical architecture in a Big Data analytics framework, along with customized, scalable solutions, with a primary focus on performance, quality, maintainability, cost, and testability.
Drive innovative solutions within the platform to establish common components, while allowing customization of solutions for different products.
Develop design specifications and a continuous build and deployment strategy to drive Agile methodology.
Set and manage expectations with consultants engaged in the projects.
Provide cloud integration development support to various project teams.
Build rapid technical prototypes for early customer validation of new technologies
Collaborate effectively with Data Science to understand, translate, and integrate methodologies into engineering build pipelines
Collaborate with product owners to translate complex business requirements into technical solutions, providing leadership in the design and architecture processes.
Provide expert mentorship to project teams on technology strategy, cultivating advanced skill sets in application engineering and implementing modern software engineering practices.
Mentor junior team members, providing guidance and support in their professional development
Key Skills
Domain Expertise
Bachelor’s degree in computer science or engineering, plus 5-8 years of experience in information technology solutions development.
Must have strong implementation expertise in cloud architecture.
Must have strong analytical and technical skills in troubleshooting and problem resolution.
Must have the ability to provide solutions utilizing best practices for resilience, scalability, cloud optimization and security.
3+ years of experience developing distributed big data processing applications using Apache Spark, and building applications with immutable infrastructure in the AWS Cloud using automation technologies such as Terraform, Ansible, or CloudFormation.
A quick learner who can pick up new technologies, programming languages, and frameworks in a short span of time.
Technical Skills
Experience in Service-oriented architecture, Spark Streaming, and Git.
Experience in software development using programming languages and tools/services: Java or Scala, Big Data, Hadoop, Spark, Spark SQL, Presto/Hive, Cloud (preferably AWS), Docker, RDBMS (such as Postgres and/or Oracle), Linux, Shell scripting, GitLab, Airflow.
Experience in big data processing using Apache Spark with Scala.
Experience with orchestration tools: Apache Airflow or similar tools.
Strong knowledge of Unix/Linux OS, commands, shell scripting, Python, JSON, and YAML.
Agile Scrum experience in application development is required.
Strong knowledge of AWS S3 and PostgreSQL or MySQL.
Strong knowledge of AWS compute services: EC2, EMR, AWS Lambda.
Strong knowledge of GitLab/Bitbucket.
AWS Certification is a plus.
Mindset and attributes
Very strong verbal and written communication skills.
Advanced analytical and technical skills in troubleshooting and problem resolution
Ability to coach, mentor and provide guidance to junior colleagues