Asset & Wealth Management-Dallas-Associate-Data Engineering

The Goldman Sachs Group
$1.8 trillion in assets
United States, Texas, Dallas
Jul 08, 2025

Goldman Sachs Asset & Wealth Management:

Goldman Sachs Asset Management (GSAM) is the asset management arm of Goldman Sachs and supervises more than $1.8 trillion in assets. GSAM has been providing discretionary investment advisory services since 1988 and has investment professionals in all major financial centers around the world.

Our team of engineers in Quant Data Engineering (QDE) supports quantitative research and investment strategies through efficient data acquisition, transformation, and access. We build critical data pipelines that enable seamless ingestion, storage, and processing of vendor datasets of any size, format, or structure. QDE empowers researchers and portfolio managers by rapidly onboarding and productionizing datasets and by providing distributed compute infrastructure (Spark/Dask) for large-scale data analysis.

We are seeking an Associate Software Engineer to join our Quant Data Engineering team in the Asset Management Division in Dallas.

Responsibilities and Qualifications

  • Bachelor's or Master's degree in Computer Science or a related technical discipline
  • Work in an Agile environment, managing the end-to-end systems development cycle from requirements analysis through coding, testing, UAT, implementation, and maintenance
  • Work in a dynamic, fast-paced environment that provides exposure to big data platforms and research
  • Contribute to production support and maintenance of the data platform, including incident management, troubleshooting, alert monitoring, and problem management

Skills and Experience We Are Looking For

  • 2-5 years of experience as a data engineer, with hands-on experience building ETL pipelines
  • Knowledge of software development, design, and core programming concepts in at least one of these languages: Python, Scala, Java
  • Expertise in big data technologies, including Hadoop, Spark, and distributed computing
  • In-depth knowledge of relational and columnar databases, with SQL and PL/SQL querying skills
  • Strong programming experience in at least one of PySpark or Scala
  • Knowledge of AWS and Snowflake cloud technologies would be a plus
  • Experience with shell scripting in Unix or Linux environments
  • Experience working in a Git-based CI/CD SDLC environment
  • Comfortable multitasking, open to learning new technologies, and able to work as part of a global team
  • Strong problem-solving and analytical skills, along with excellent communication skills
