Innovate to solve the world’s most important challenges
Create robust and accessible data stores from a variety of sources. You will be responsible for data acquisition, cleaning, transformation, and storage. You will be curious and enjoy diving deep into raw data to figure out the best way to manage and organize it. You will work closely with data scientists and business analysts to build world-class products, mixing ETL, development, and systems knowledge to achieve business goals.
Be a key player in the Honeywell Connected strategy by creating a data platform for applications and analytics.
Use state-of-the-art Cloud and Big Data technology and work with other experts across the company.
Develop in Python, Java, and Scala using Big Data and other technologies.
YOU MUST HAVE
- Bachelor's degree in computer science, IT, or engineering
- Positive attitude with the ability to get up to speed quickly with emerging technologies.
- Experience migrating data from relational sources (MS SQL, Oracle, MySQL) into a Hadoop platform using Hadoop technologies (Spark, Hive, NiFi, Pig, Sqoop)
- Strong Linux skills
- Experience with Java or another JVM-based language, and the ability to deal with ambiguous requirements
- Experience with scalable open source data processing frameworks
- Interest in machine learning
- Ability to work in a fast-paced and sometimes ambiguous environment
Due to US export control laws, must be a US citizen, permanent resident, or have protected status.
- Continued Professional Development
- Some Travel Required
- Job ID: HRD26792
- Category: Engineering
- Location: 1944 E Sky Harbor Circle, Phoenix, AZ 85034 USA
Honeywell is an equal opportunity employer. Qualified applicants will be considered without regard to age, race, creed, color, national origin, ancestry, marital status, affectional or sexual orientation, gender identity or expression, disability, nationality, sex, or veteran status.