Hadoop Developer

Location: San Francisco, CA
Date Posted: 07-02-2013

You should have a track record of expertise in designing, implementing, and troubleshooting software optimized for large-scale environments. You thrive on learning and applying new technologies. You're capable of managing your time well and working efficiently and independently. Excellent communication skills, both written and verbal, are required.
 

  • Minimum of 5 years of professional experience.
  • 3 years of solid experience building, configuring, and monitoring highly scalable distributed computing solutions.
  • Expertise in developing high-quality, object-oriented code (Java preferred) deployed on Linux/Unix.
  • Well-versed in automated testing with experience using test frameworks like JUnit.
  • Strong system and application troubleshooting and performance tuning skills (hardware, Linux, networking, JVMs, etc.).
  • Bachelor's degree in Computer Science or a related field (Master's a plus).
  • Devoted to teamwork, collaboration, and knowledge sharing.

Qualifications

  • Experience with Hadoop, MapReduce, and related large-scale distributed systems technologies.
  • Experience with NoSQL storage technologies such as HBase, BigTable, Redis, MongoDB, or Cassandra.
  • Experience with Hibernate or other ORM technologies.