Data Engineer (J00122495)


Smarter insights for smarter decisions.

Job Summary 

Equifax is seeking a Data Engineer who can prepare data for analytical use by building data pipelines that gather data from multiple sources and systems; integrating, consolidating, and cleansing that data; and structuring it for downstream analysis.

Who is Equifax? 

At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence.

We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best.

The Perks of Being an Equifax Employee

We offer excellent compensation packages with market competitive pay, comprehensive healthcare packages, 401k matching, schedule flexibility, work from home opportunities, paid time off, and organizational growth potential.
Grow at your own pace through online courses at Learning @ Equifax.

What You’ll Do

  • Create and maintain data pipelines on Google Cloud Platform using Google Dataflow, Pub/Sub, BigQuery, and Cloud Storage (or their equivalents on other platforms such as AWS, Azure, or Hadoop)
  • Work with data scientists to build and optimize our AI solutions for greater functionality in our data systems
  • Build analytics tools that use the data pipeline to provide actionable insights into operational efficiencies
  • Identify, design, and implement process improvements: optimize data delivery and automate manual processes
  • Maintain data integrity and regionalization by defining boundaries across multiple GCP zones
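The integration, consolidation, and cleansing work described above can be illustrated with a minimal sketch. This is a hypothetical example (not Equifax code), and the source names and field names such as `customer_id` and `email` are assumptions chosen for illustration:

```python
from collections import defaultdict

def cleanse(record):
    """Normalize one raw record: trim whitespace, lowercase the email,
    and drop records that lack a usable join key."""
    rec = {k: v.strip() for k, v in record.items() if v is not None}
    if not rec.get("customer_id"):
        return None
    if "email" in rec:
        rec["email"] = rec["email"].lower()
    return rec

def consolidate(*sources):
    """Merge records from multiple source systems keyed on customer_id;
    earlier sources win when the same field appears more than once."""
    merged = defaultdict(dict)
    for source in sources:
        for raw in source:
            rec = cleanse(raw)
            if rec:
                for key, value in rec.items():
                    merged[rec["customer_id"]].setdefault(key, value)
    return dict(merged)

# Two hypothetical source systems with overlapping, messy data
crm = [{"customer_id": " 42 ", "email": "Ada@Example.com "}]
billing = [{"customer_id": "42", "plan": "pro"},
           {"customer_id": "", "plan": "free"}]  # no key: dropped

result = consolidate(crm, billing)
print(result)
```

In a production pipeline the same cleanse/consolidate steps would typically run as transforms inside a managed service such as Google Dataflow rather than in-process as shown here.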

Qualifications:

  • Bachelor’s Degree in Computer Science, Statistics, Mathematics, or another quantitative field
  • 2+ years of experience with REST APIs and a programming language such as Java, Python, Scala, or Go
  • 2+ years of experience with relational SQL and NoSQL databases
  • 2+ years of experience with big data tools such as Google Dataflow, Google DataPrep, or Hadoop

Extra Points for any of the following 

  • Hands-on experience with cloud technologies and Google Cloud data tools such as Bigtable and BigQuery
  • Strong analytical skills and attention to detail and accuracy
  • Work experience in regulated, data-compliance-driven environments; credit industry domain knowledge is preferred

We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

If this sounds like somewhere you want to work, don’t delay, apply today – we’re looking for you!

