
Principal Data Engineer | UT/Remote | Full-time

Job Description

To follow up with any questions, please contact Junaid at 510-574-9220 / junaid@triuneinfomatics.com

Position: Principal Data Engineer
Location: UT/Remote
Duration: Full-time

We are looking for a Principal Data Engineer who will play a critical role in our data engineering strategy and lead the development of our Enterprise Data and Analytics Platforms.
We are looking for a passionate professional who can blend the fast-changing technology landscape of Big Data & Advanced Analytics with the complex and high-impact space of e-commerce and direct sales.
Identify processes and tools that can benefit from an increased focus on automation to enable seamless development and self-service analytics workloads.
Obtain and maintain technical expertise in available data manipulation and preparation tools and programming languages, including Matillion, Python, Spark, and Elastic MapReduce.
Build, manage, and optimize data pipelines using a variety of ELT/ETL tools, including custom infrastructure and third-party tooling such as the AWS platform.
Collaborate with internal engineering teams and vendors to understand the business logic of source systems and ensure the veracity of datasets.
Document existing production data logic and the business processes that influence it, in order to close knowledge gaps between business, engineering, and data collection teams.
Requirements
8-10 years of experience delivering data engineering solutions with both batch and streaming capabilities
Experience building, testing, automating, and optimizing data pipelines
Experience using AWS big data and analytics services
Strong understanding and prior use of SQL; highly proficient in the workings of data technologies such as Hadoop, Hive, Spark, Kafka, low-latency data stores, and Airflow
Deep understanding of data testing techniques, with a proven record of driving sustainable change in software development practices to improve quality and performance
Proficiency with data querying languages (e.g., SQL) and programming languages (e.g., Python, Java)
Expertise in selecting context-appropriate data modeling techniques, including Kimball dimensional modeling, slowly changing dimensions, and snowflake schemas
Passion for software development and data; highly skilled in performing data extraction, transformation, and processing to optimize quantitative analyses across various business functions
Familiarity with Scrum, DevOps, and DataOps methodologies, and supporting tools such as JIRA
Experience with AWS technologies such as Redshift, RDS, S3, Glacier, EC2, Lambda, API Gateway, Elastic MapReduce, Kinesis, and Glue
Experience managing AWS infrastructure as code, including the use of CloudFormation, Git, and GitLab
Strong presentation skills and the ability to communicate analytical and technical concepts with confidence and in an easy-to-understand fashion to technical and non-technical audiences

Nice to Have
General understanding of machine learning tools and frameworks, such as SageMaker, AWS Forecast, TensorFlow, PyTorch, etc.
Familiarity with statistical and mathematical programming languages (e.g. R, MATLAB, etc.)
Professional certifications, such as AWS Certified Solutions Architect - Associate or AWS Certified Big Data - Specialty


Contact Us

Eltas Enterprises Inc.
3978 Windgrove Crossing
Suite 200A
Suwanee, Georgia
30024, USA
contact@eltasjobs.com