Hotline: 678-408-1354

Title: Strong programmer with Data engineering tools: AWS, Redshift, Python, Spark, ETL, Data Pipeline

Location: Bellevue, WA

Duration: Full-time

Looking for a strong programmer/coder

  • We are not looking for consultant types; we want people who have worked at consumer software technology companies, ideally in e-commerce.
  • Experience at a consumer tech startup is a huge plus.

As a Software Data Engineer, you’ll be part of a team responsible for the integrity and accessibility of all of Postmates’ business-critical data. You’ll contribute to our data pipelines, our analytics tools, and our data science and machine learning infrastructure, as well as help design and scale our architecture to meet future needs. You’ll work with teams across the organization, making sure that engineers have the tools to generate and store data and that business and data science consumers have the information they need at their fingertips.

We’re looking for engineers with a proven track record of shipping high-impact tools. We care much more that you understand how to build simple, clear, and reliable tools than that you have experience with any given toolset or pattern. We love learning, and we expect that you will learn new things and teach us new things as we build out the Postmates data infrastructure.

RESPONSIBILITIES:

  • Design and build reliable, easy-to-use data pipelines and data systems
  • Roll out new tools and features on existing big data storage, processing, and machine learning systems
  • Triage, identify, and fix scaling challenges
  • Perform cost-benefit analyses of short-term needs vs. long-term data scaling and company growth
  • Educate product managers, analysts, and other engineers about how best to use our systems to answer hard business questions and make better decisions using data

REQUIREMENTS:

  • You possess strong computer science fundamentals: data structures, algorithms, programming languages, distributed systems, and information retrieval.
  • You’ve built large-scale data pipeline and ETL tooling before, and have strong opinions about writing beautiful, maintainable, understandable code.
  • You’ve worked professionally with both streaming and batch data processing tools, and understand the tradeoffs.
  • You understand the challenges of working with schema-based and unstructured data, and enjoy the challenge of collecting data flexibly and accurately.
  • You have extensive experience with at least one RDBMS platform (Postgres, SQL Server, MySQL, etc.)
  • You are a strong communicator. Explaining complex technical concepts to product managers, support, and other engineers is no problem for you.
  • You love it when things work, you understand that things break, and when things do fail you dive in to understand the root causes of failure and fix whatever needs work.

BONUS POINTS:

  • A Master’s degree (or higher) in a technical field (C.S., Math, Physics, Engineering…)
  • AWS development and operations experience (EMR, S3, Data Pipeline, etc.)
  • Experience with the Apache ecosystem (Kafka, Spark, Storm, ZooKeeper, etc.)
  • Experience with Amazon Redshift data warehouse
  • A solid math and statistics background

Best Regards,

Anshul Mishra

Cogent Infotech

1035 Boyce Road, Suite 108, Pittsburgh PA 15241

Tel: 412-212-1146
