Unidev is looking for a Senior Hadoop Developer to develop, create, and modify general computer applications software or specialized utility programs. At Unidev, you will work with a team of talented individuals who share a strong passion for their work.
Job responsibilities and duties include:
- Implement data analytics processing algorithms on Big Data batch and stream processing frameworks (e.g., Hadoop MapReduce, Spark, Kafka) using languages such as Python and Scala.
- Perform data acquisition, preparation, and analysis leveraging a variety of data programming techniques in Spark using Scala.
- Work on complex issues where analysis of situations and data requires an in-depth evaluation of variable factors.
- Load data from different datasets and decide which file format is most efficient for a given task.
- Source large volumes of data from diverse data platforms into the Hadoop platform.
- Install, configure, and maintain the enterprise Hadoop environment.
- Build distributed, reliable, and scalable data pipelines to ingest and process data in real-time.
- Fetch impression streams, transaction behaviors, clickstream data, and other unstructured data.
- Define Hadoop job flows and manage Hadoop jobs using a scheduler such as Oozie.
- Review and manage Hadoop log files.
- Design and implement Hive table schemas and HBase column family schemas within HDFS.
- Assign schemas and create Hive tables with suitable formats and compression techniques.
- Develop efficient Pig and Hive scripts with joins on datasets using various techniques.
- Apply different HDFS file formats and structures, such as Parquet and Avro, to speed up analytics.
- Fine-tune Hadoop applications for high performance and throughput.
- Troubleshoot and debug runtime issues across the Hadoop ecosystem.
- Develop and document technical design specifications.
- Design and develop data integration solutions (batch and real-time) to support enterprise data platforms including Hadoop, RDBMS, and NoSQL.
- Implement Spark Streaming architecture and integration with a JMS queue using custom receivers.
- Develop and deploy API services in Java Spring.
- Create Hive and HBase data source connections in Spring.
- Implement multi-threading in Java/Scala.
- Lead technical meetings, as required, convey ideas clearly, and tailor communication to the audience (technical and non-technical).
- Mentor Big Data Developers on best practices and strategic development.
Qualified applicants must also have experience with the following:
- Hadoop/Big Data Ecosystem and Architecture
- Hive, Spark, HBase, Sqoop, Impala, Kafka, Flume, Oozie, and MapReduce
- Programming experience in Java, Scala, Python, and Shell Scripting
- SQL and Data modeling
Master’s degree in Computer Science, Applied Computer Science, Engineering, or any related field of study, plus at least two (2) years of experience in the job offered or in any related position(s).
Unidev offers a full suite of benefits including:
- 100% premium-paid medical insurance for employee-only coverage
- 100% premium-paid dental insurance for employee-only coverage
- Optional vision insurance
- Optional life insurance
- Paid time off and paid holidays
- 401(k) plan
Unidev is a female-owned and operated IT consulting firm that provides custom software development, mobile app development, and staff augmentation services for clients across the country. Operating under the values of humanity, integrity, perseverance, and joy, Unidev helps people and businesses achieve their full potential through intentional and intelligent teamwork.
Unidev is an equal opportunity employer. Employment is contingent upon the successful completion of a background check.
To apply, submit your resume to Human Resources at firstname.lastname@example.org.