Hotline: 678-408-1354

Hadoop Administrator

This position is responsible for the support, design, maintenance, configuration, and monitoring of Costco’s new Enterprise Data Hub (EDH) environment, which spans multiple technology layers (e.g., hardware, OS, database, application, security). This position requires a sharp, hard-working individual who is ready to pitch in and do whatever it takes for the team to successfully stand up, roll out, and support our growing EDH environment. Along with the technical work, there will be a need to establish successful processes and documentation. This individual will participate in administration of the EDH environment, integration of external tools and software, and all strategic and tactical planning of the EDH environment required to meet Costco-specific business needs. The individual filling this role will also be expected to deliver excellent customer service and to interact in a highly effective manner with other team members and management. Costco is looking for a creative individual who can effectively manage projects and multiple tasks to meet business objectives.

Tasks and responsibilities

  • Recommends security management best practices, including the ongoing promotion of awareness of current threats, auditing of server logs and other security management processes, and adherence to established security standards.
  • Designs and implements a toolset that simplifies provisioning and support of a large cluster environment.
  • Plans and executes major platform software and operating system upgrades and maintenance across physical and virtualized environments.
  • Works with project teams to integrate EDH access points.
  • Maintains a strong focus on design, build, deployment, security, and system hardening, including securing services.
  • Creates and maintains detailed up-to-date technical documentation.
  • Proactively manages EDH system resources to assure maximum system performance and appropriate additional capacity for peak periods and growth.
  • Reviews performance stats and query execution/explain plans, and recommends changes for tuning Hive/Impala queries.
  • Partners with Infrastructure to identify the server hardware, software, and configurations necessary for optimally running big data workloads (e.g., Spark, Hive, Impala).
  • Provides support to the user community using incident and problem management tools, email, and voicemail.
  • Periodic off-hours work is required, including weekends and holidays. Must be able to provide 24x7 on-call support as necessary.
  • Maintains current knowledge of industry trends and standards in the Big Data ecosystem.

Required skills and abilities

  • Minimum 4 years of experience supporting products running on AIX, Linux or other UNIX variants.
  • Minimum 2 years of shell scripting experience.
  • Minimum 5 years of experience supporting production JVMs.
  • Understanding of Hadoop and distributed computing
  • Understanding of Hadoop components (e.g., HDFS, YARN, ZooKeeper, Sqoop, Hive, Impala, Hue, Sentry, Spark, Kafka, Flume)
  • Understanding of message-based architecture
  • Version control experience (Git preferred)
  • Demonstrated ability in systems management and administration (e.g., applying fixes, loading the OS, creating gold images, resolving system issues, working with vendors to resolve issues)
  • Proficient with operating system utilities as well as server monitoring and diagnostic tools to troubleshoot server software and network issues.
  • Knowledge of Relational Databases (DB2, Oracle, SQL Server, DB2 for iSeries, MySQL, Postgres, MariaDB)
  • Deep knowledge of associated industry protocol standards such as LDAP, DNS, and TCP/IP
  • Understanding of enterprise-level services: Active Directory, PKI infrastructure (Venafi)
  • Understanding of Kerberos, SAML, cross-realm authentication and Kerberized services.
  • Security experience, including SSL/TLS certificates, system hardening, penetration tests, etc.
  • Experience with VMware (vSphere), virtualized environments, and/or Microsoft Azure
  • Configuration management experience (Puppet preferred)
  • Experience in system/database backup and recovery

Recommended skills, abilities, and certifications

  • Experience with Tivoli Storage Management and/or Commvault
  • B.S. degree in Computer Science or equivalent formal training and experience.

Apply: Use the link below to upload all required documents:

https://chm.tbe.taleo.net/chm02/ats/careers/requisition.jsp?org=COSTCO&cws=1&rid=1877

Applicants and employees for this position will not be sponsored for work authorization, including, but not limited to H1-B visas. If hired, you will be required to provide proof of authorization to work in the United States. Apart from any religious or disability considerations, open availability is needed to meet the needs of the business.


Contact Us

Eltas Enterprises Inc.
3978 Windgrove Crossing
Suite 200A
Suwanee, Georgia
30024, USA
contact@eltasjobs.com
