Apex Systems, Inc. GCP Data Engineer in Dearborn, Michigan

Full Job Description

Job#: 2038905

Job Description:

On-site in Dearborn, MI; 12+ month contract

Skills:

Required Technical Skills:

  • Proficiency in data pipeline tools and frameworks such as Apache Airflow, Cloud Composer, and Cloud Dataflow.

  • Strong knowledge of GCP services, including BigQuery, Cloud Storage, Cloud Run, and Cloud Functions.

  • Experience with SQL, Python, and other programming languages commonly used in data engineering.

  • Familiarity with data modeling, ETL processes, and data integration techniques.

Soft Skills:

  • Excellent problem-solving and analytical skills.

  • Strong communication and collaboration abilities.

  • Ability to work independently and as part of a team in a fast-paced, dynamic environment.

Education:

Required: Bachelor of Science
Preferred: Master of Science

 

Description:

Key Responsibilities:

1. Data Pipeline Development:

  • Design, build, and maintain scalable and robust data pipelines on GCP using tools such as Apache Airflow, Cloud Composer, and Cloud Dataflow.

  • Implement data integration solutions to ingest data from various sources, including cloud storage and third-party APIs.

2. Data Warehousing:

  • Develop and optimize data warehouse solutions using BigQuery and other GCP services (a brief sketch of one such optimization appears after this list).

  • Ensure data accuracy, consistency, and security within the data warehouse environment.

  • Monitor and troubleshoot data pipeline and warehouse issues to maintain system reliability.

3. Cloud Platform Expertise:

  • Utilize GCP services such as Cloud Storage, Cloud Run, and Cloud Functions to build scalable and cost-effective data solutions.

  • Implement best practices for cloud infrastructure management, including resource provisioning, monitoring, and cost optimization.

4. Collaboration and Communication:

  • Work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver high-quality data solutions.

  • Collaborate with cross-functional teams to design and implement data models, ETL processes, and reporting solutions.

5. Automation and Optimization:

  • Develop automated workflows using Apache Airflow and Astronomer to streamline data processing and improve efficiency (a minimal sketch of such a pipeline appears after this list).

  • Continuously optimize data pipelines for performance, scalability, and cost-effectiveness.

6. Documentation and Training:

  • Create and maintain comprehensive documentation for data pipelines, data models, and infrastructure components.

  • Provide training and support to team members and stakeholders on data engineering best practices and GCP services.
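
For illustration, the following is a minimal sketch of the kind of pipeline described in items 1 and 5: an Airflow DAG that loads files from Cloud Storage into BigQuery. It assumes Airflow 2.4+ with the apache-airflow-providers-google package installed; the DAG id, bucket, dataset, and table names are hypothetical placeholders, not details from this posting.

```python
# Illustrative sketch only: a daily Airflow DAG that appends newly landed
# JSON files from a (hypothetical) GCS bucket into a BigQuery table.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="example_gcs_to_bigquery",  # hypothetical name
    schedule="@daily",                 # Airflow 2.4+ keyword
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Load newline-delimited JSON from GCS into BigQuery, letting
    # BigQuery infer the schema and appending to the existing table.
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="example-landing-bucket",  # hypothetical bucket
        source_objects=["events/*.json"],
        destination_project_dataset_table="example-project.analytics.events",
        source_format="NEWLINE_DELIMITED_JSON",
        autodetect=True,
        write_disposition="WRITE_APPEND",
    )
```

On Cloud Composer (GCP's managed Airflow), a DAG file like this is deployed by copying it into the environment's dags/ folder in Cloud Storage.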
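
And a sketch of one common warehouse optimization from item 2, assuming the google-cloud-bigquery Python client: creating a time-partitioned, clustered table so that time- and user-filtered queries scan fewer bytes. The project, dataset, and schema are again hypothetical.

```python
# Illustrative sketch only: create a partitioned, clustered BigQuery table.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

schema = [
    bigquery.SchemaField("event_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("user_id", "STRING"),
    bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
]

table = bigquery.Table("example-project.analytics.events", schema=schema)
# Partition by event time so date-bounded queries read only the relevant
# partitions, and cluster by user_id to co-locate rows that are
# frequently filtered or joined on that column.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",
)
table.clustering_fields = ["user_id"]

client.create_table(table, exists_ok=True)
```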

