Keasis Inc: Hadoop Developer (12 - 18M Contract)
Developer
💰 Negotiable
📍 Charlotte, United States
Twine Jobs
Based in Manchester, United Kingdom
Developer is needed in Charlotte, United States.
This job has been crawled from the web.
Client: Keasis Inc
Location: Charlotte, NC
Contract Type: Contract (12 - 18 months)
Required Skills
- 3-6 years of experience with the Hadoop stack and storage technologies: HDFS, MapReduce, YARN, Hive, Sqoop, Impala, Spark, Flume, Kafka, and Oozie
- Extensive knowledge of Big Data enterprise architecture (Cloudera preferred)
- Excellent analytical capabilities and a strong interest in algorithms
- Experience with HBase, RDBMS, SQL, ETL, and data analysis
- Experience with NoSQL technologies (e.g., Cassandra, MongoDB)
- Experience with Unix/Linux scripting and job scheduling (Autosys)
- Experience with team delivery/release processes and the cadence of code deployment and release
- Research-oriented, motivated, proactive self-starter with strong technical, analytical, and interpersonal skills
- A team player with strong verbal and written communication skills, capable of working with Architects, Developers, Business/Data Analysts, QA, and client stakeholders
- Versatile developer with balanced development skills and the business acumen to work quickly and accurately
- Proficient understanding of distributed computing principles; continuously evaluates new technologies, innovates, and delivers solutions for business-critical applications
Desired Skills
- Object-oriented programming and design experience.
- Degree in Computer Science or equivalent.
- Experience with automated testing methodologies and frameworks, including JUnit, is a plus
- Python web frameworks (Django, Flask), data wrangling, and analytics in a Python-based environment
- Python fundamentals: data structures, collections, and pandas for handling files and other data types, visualizations, etc.
- Knowledge of visual analytics tools (Tableau)
- Experience with Big Data analytics, business intelligence, and industry-standard tools integrated with the Hadoop ecosystem (R, Python)
- Data integration and data security on the Hadoop ecosystem (Kerberos)
- Any Big Data certification (e.g., Cloudera CCP or CCA) is a plus
Salary: $50.00 - $65.00 per hour
Experience:
- Hadoop stack and storage technologies: 3 years (Required)
- Big Data enterprise architecture (e.g., Cloudera): 4 years (Required)
- HBase, RDBMS, SQL, ETL, and data analysis: 3 years (Required)
- NoSQL technologies (e.g., Cassandra, MongoDB): 2 years (Required)
- Hadoop Developer: 6 years (Required)
- W2 position: 1 year (Required)
Posted 2 years ago
No longer accepting applications