Landmark Health

(4) Hadoop Developers

Job Locations: US-MD-Ellicott City | US-MD-Baltimore | US-CA-El Segundo | US-CA-Huntington Beach
Posted Date: 5 days ago (7/29/2020 10:58 AM)
Category: Information Technology
Type: Regular Full-Time
Job ID: 2020-4066

Overview

Landmark Health is hiring four (4) Hadoop Developers to join a growing organization and support our Data Integration Team in Huntington Beach. Our Hadoop Developers will work side by side with our IT development team on the design and development of our data feeds, data warehouses, data lakes, and enterprise reporting. They will bring a strong working knowledge of ETL development using SQL Server Integration Services (SSIS) and will liaise between our internal stakeholders, business partners, and IT teams to determine requirements and apply business rules to data. They will also make recommendations on data modeling, support ingestion of our health plan partners' data, and implement the H-Scale tool, built on top of the Hadoop ecosystem, to integrate, scale, and manage healthcare data. If you have a passion for Hadoop, we want to hear from you!
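
To give a hedged sense of the ingestion and ETL work described above, here is a minimal PySpark sketch that loads a hypothetical partner feed from HDFS, applies a simple business rule, and registers the result as a Hive table. The paths, column names, database, and rule are illustrative assumptions only, not Landmark's actual schemas or pipeline.

    # Minimal PySpark sketch: ingest a (hypothetical) partner claims feed
    # from HDFS and expose it as a Hive table for downstream reporting.
    # Paths, column names, and database names are illustrative assumptions.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("partner-feed-ingest")
        .enableHiveSupport()          # requires a Hive-enabled Spark install
        .getOrCreate()
    )

    # Read the raw delimited feed dropped by the partner (assumed layout).
    claims = (
        spark.read
        .option("header", "true")
        .option("inferSchema", "true")
        .csv("hdfs:///landing/partner_x/claims/")   # hypothetical path
    )

    # Apply a simple business rule before loading (illustrative only).
    clean = claims.dropDuplicates(["claim_id"]).filter("claim_amount >= 0")

    # Write to a managed Hive table, partitioned for query performance.
    (
        clean.write
        .mode("overwrite")
        .partitionBy("service_date")
        .saveAsTable("staging.partner_x_claims")    # hypothetical database/table
    )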

 

This opportunity is office-based at our corporate headquarters in Huntington Beach, CA, or in Ellicott City, MD, or Los Angeles, CA.

 

Unfortunately, visa sponsorship is NOT offered for these roles.

Responsibilities

• Architect, design, develop, implement, configure, and document current and future Hadoop infrastructure
• Perform software installation and configuration, database backup and recovery, storage management, performance tuning, database connectivity, and security
• Implement and support various components of the Hadoop ecosystem
• Monitor Hadoop cluster connectivity and performance (a brief illustrative sketch follows this list)
• Support and maintain HDFS
• Set up new Hadoop users
• Administer new and existing Hadoop infrastructure
• Support data modeling, design, and implementation
• Work closely with infrastructure, network, database, business intelligence, and application teams to ensure business applications are readily available and performing within agreed-upon service levels
• Develop tools for automated build, test, deployment, and management of the platform
• Continuously improve integration and delivery systems
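
As a rough, non-authoritative sketch of the monitoring duty noted above, the following Python snippet polls the Hadoop namenode's WebHDFS REST endpoint to confirm that HDFS answers a directory listing. The host name, port, and alerting step are assumptions; a real deployment would plug into the team's actual monitoring stack.

    # Rough health-check sketch: poll WebHDFS to verify the namenode answers.
    # Host/port and the alert step are illustrative assumptions (port 9870 is
    # the Hadoop 3.x default; Hadoop 2.x typically uses 50070).
    import sys
    import requests

    NAMENODE = "http://namenode.example.internal:9870"   # hypothetical host

    def hdfs_is_up() -> bool:
        """Return True if WebHDFS responds to a root directory listing."""
        try:
            resp = requests.get(
                f"{NAMENODE}/webhdfs/v1/?op=LISTSTATUS", timeout=10
            )
            return resp.status_code == 200
        except requests.RequestException:
            return False

    if __name__ == "__main__":
        if not hdfs_is_up():
            # In practice this would page on-call or feed a dashboard; here
            # we just exit nonzero so a scheduler can alert on the failure.
            sys.exit("HDFS namenode did not respond to WebHDFS LISTSTATUS")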

Qualifications

• Bachelor’s degree in Computer Science, Information Systems, Information Technology, Management Information Systems, or equivalent experience preferred
• 3+ years designing, developing, and implementing solutions in Hadoop environments
• 3+ years supporting technologies in the Hadoop ecosystem
• Expertise with HDFS, YARN, MapReduce, Hive, Pig, and Spark
• Proficiency with related Hadoop technologies such as NiFi, HBase, Flume, and Sqoop
• Ability to write reliable, manageable, and highly efficient code using Hadoop tools (a brief example follows this list)
• Proficiency with SQL scripting
• Experience in backend programming, particularly Java, JavaScript, Python, R, or Scala
• Working knowledge of Linux/Unix, distributed computing, networks, and network security
• Strong problem-solving and analytical skills
• Effective oral and written communication skills for both technical and non-technical audiences
• Excellent systems and data analysis skills
• Ability to prioritize and manage competing projects
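
To give candidates a concrete, hedged sense of the day-to-day coding these qualifications imply, here is a small PySpark SQL example that summarizes paid claims by partner and month. All table and column names are invented for illustration and do not reflect real Landmark schemas.

    # Illustrative only: the kind of Spark SQL work the qualifications imply.
    # Table and column names are invented; they do not reflect real schemas.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("claims-summary")
        .enableHiveSupport()
        .getOrCreate()
    )

    # Monthly paid-claims totals per health plan partner (hypothetical tables).
    summary = spark.sql("""
        SELECT p.partner_name,
               date_format(c.service_date, 'yyyy-MM') AS service_month,
               SUM(c.claim_amount)                    AS total_paid
        FROM   staging.partner_x_claims c
        JOIN   reference.partners p
          ON   c.partner_id = p.partner_id
        GROUP  BY p.partner_name, date_format(c.service_date, 'yyyy-MM')
    """)

    summary.show(20, truncate=False)

Comparable logic could be written in HiveQL, Java, or Scala; Python is used here only for brevity.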
