Capgemini

Data Engineer (Databricks & Google Cloud Platform)

Capgemini Nashville Metropolitan Area

Base pay range

$46,000.00/yr - $90,000.00/yr

Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you’d like, where you’ll be supported and inspired by a collaborative community of colleagues around the world, and where you’ll be able to reimagine what’s possible. Join us and help the world’s leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

About the Role



  • Primary Skills: Databricks, Google Cloud Platform (GCP), Python, SQL
  • Location: Nashville, TN (Onsite)
  • Employment Type: Full-time
  • No sponsorship available for this role


Position Overview

We are seeking a highly skilled Data Engineer with strong expertise in Databricks and Google Cloud Platform (GCP) to design, develop, and optimize large-scale data pipelines and analytics solutions. The ideal candidate will have a proven background in modern data engineering practices, cloud-based architectures, and advanced data transformation frameworks.


Key Responsibilities

  • Design, build, and maintain scalable ETL/ELT pipelines using Databricks (PySpark/Spark SQL).
  • Develop and optimize data workflows on Google Cloud Platform, including BigQuery, Cloud Storage, Dataflow, and Pub/Sub.
  • Implement data ingestion, transformation, and integration processes for structured and unstructured data.
  • Collaborate with cross-functional teams to support data modeling, analytics, and machine learning initiatives.
  • Ensure data quality, reliability, governance, and security standards are met.
  • Perform pipeline performance tuning and optimize cloud resources to reduce compute cost.
  • Automate workflow orchestration using tools such as Airflow, Databricks Jobs, or Cloud Composer.
  • Contribute to the design of data lakehouse and data warehouse architectures.
  • Maintain technical documentation, best practices, and operational runbooks.


Required Qualifications

  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • 3+ years of experience in Data Engineering or related roles.
  • Strong hands-on experience with Databricks, including Spark, Delta Lake, and notebooks.
  • Proficiency in GCP data services such as BigQuery, Cloud Storage, Cloud Composer, Pub/Sub, Dataflow, etc.
  • Strong programming skills in Python and advanced proficiency in SQL.
  • Experience with CI/CD pipelines, Git-based workflows, and DevOps practices.
  • Knowledge of data modeling, data lakehouse principles, and distributed computing.


Preferred Qualifications

  • Certification in Databricks or Google Cloud Platform.
  • Experience with Terraform or Infrastructure-as-Code.
  • Exposure to real-time streaming technologies (Kafka, Pub/Sub).
  • Experience working in Agile environments.
  • Familiarity with machine learning pipelines (MLflow, GCP Vertex AI).


Soft Skills

  • Strong communication and stakeholder management skills.
  • Ability to work independently and in collaborative, cross-functional teams.
  • Strong analytical and problem-solving mindset.
  • Attention to detail and commitment to high-quality deliverables.



The base compensation range for this role in the posted location is $46,000 to $90,000.

Capgemini provides compensation range information in accordance with applicable national, state, provincial, and local pay transparency laws. The base compensation range listed for this position reflects the minimum and maximum target compensation Capgemini, in good faith, believes it may pay for the role at the time of this posting. This range may be subject to change as permitted by law.

The actual compensation offered to any candidate may fall outside of the posted range and will be determined based on multiple factors legally permitted in the applicable jurisdiction.

These may include, but are not limited to: geographic location, education and qualifications, certifications and licenses, relevant experience and skills, seniority and performance, market and business considerations, and internal pay equity.

It is not typical for candidates to be hired at or near the top of the posted compensation range.

In addition to base salary, this role may be eligible for additional compensation such as variable incentives, bonuses, or commissions, depending on the position and applicable laws.

  • Seniority level: Entry level
  • Employment type: Full-time
  • Job function: Consulting
  • Industries: IT Services and IT Consulting; Business Consulting and Services
