Hadoop Developer

Employment Information

Full-Time Contract

Job Overview

We are looking for an experienced Hadoop Developer to join our team in Warsaw for a 3-year contract. In this role, you will design, implement, and optimize big data solutions using the Hadoop ecosystem. The ideal candidate will have strong expertise in Hadoop, Spark, and other big data technologies and will work closely with data engineers and architects to build scalable, efficient data pipelines and systems.

Key Responsibilities

  • Develop and maintain robust big data applications using Hadoop and related technologies.
  • Design and implement data pipelines and ETL processes using Hadoop, Spark, and Kafka.
  • Optimize performance of big data applications to ensure efficiency and scalability.
  • Work with cross-functional teams to ensure smooth integration of data systems.
  • Manage and process large datasets from diverse sources.
  • Troubleshoot and resolve issues with Hadoop applications and data systems.
  • Collaborate with data scientists, analysts, and engineers to ensure data is available and ready for analysis.
  • Stay updated with the latest trends and best practices in big data technologies.

Technical Skills Required

  • Hadoop Ecosystem: Expertise in Hadoop, HDFS, MapReduce, Hive, Pig, and YARN.
  • Data Processing: Experience with Spark for batch and real-time data processing.
  • ETL Frameworks: Familiarity with ETL tools and processes for data integration and transformation.
  • Programming Languages: Proficiency in Java, Scala, or Python for big data development.
  • SQL/NoSQL: Strong understanding of both relational (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., HBase, MongoDB).
  • Data Streaming: Knowledge of Kafka or similar stream processing tools for real-time data.
  • Cloud Platforms: Experience with cloud platforms such as AWS, GCP, or Azure (a plus).
  • Version Control: Experience with Git for version control and collaborative development.
  • Automation Tools: Familiarity with tools like Apache NiFi, Airflow, or similar workflow automation platforms.

Qualifications

  • 4+ years of professional experience as a Hadoop Developer or in a similar role.
  • A degree in Computer Science, Software Engineering, or a related field.
  • Experience with the full lifecycle of big data solutions, from development to deployment.
  • Ability to work independently and within a collaborative team environment.
  • Strong problem-solving and debugging skills.
  • Excellent communication skills in English; knowledge of Polish is a plus.

What We Offer

  • Competitive Salary: €4,500 – €6,500 per month (based on experience).
  • Contract: 3-year contract with the possibility of extension.
  • Career Growth: Opportunities to work on large-scale, innovative data projects.
  • Work-Life Balance: Flexible working hours and the option for remote work.
  • Benefits: Health insurance, paid leave, and access to professional development resources.
  • Dynamic Team: Collaborate with experts in big data and cutting-edge technologies.
Tech Jobs - Discover Your Dream IT & Tech Career in Europe
