Big Data Engineer
Full Time · Remote · Hybrid
Posted 2 months ago
Employment Information
Industry: Data Science
Job Level: Entry Level
Open Positions: 1
Salary: Attractive
Experience: Junior, 1-3 Years
Job Type: Full Time, Remote, Hybrid
Location: Malmö, Sweden
Key Responsibilities:
Design, build, and maintain scalable and efficient big data pipelines for data ingestion, processing, and storage (a rough illustrative sketch follows this list).
Develop and optimize data architecture to handle large volumes of structured and unstructured data.
Collaborate with data scientists, analysts, and engineers to ensure data quality and accessibility.
Implement and manage big data technologies such as Hadoop, Spark, Kafka, and others.
Automate data workflows and build reusable data processing frameworks.
Monitor and troubleshoot data pipelines and system performance to ensure reliability and scalability.
Ensure data security and compliance with relevant regulations (e.g., GDPR).
Collaborate with cross-functional teams to align data solutions with business needs and goals.
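To give a concrete flavour of the pipeline work described above, here is a minimal sketch of an ingest-process-store job using PySpark Structured Streaming with Kafka. It is not part of the original posting; the broker address, topic name, event schema, and output paths are placeholder assumptions, and running it also requires the Spark Kafka connector package on the classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import (
    StructType, StructField, StringType, DoubleType, TimestampType
)

# All names below (topic, broker, paths, schema) are illustrative placeholders.
spark = (
    SparkSession.builder
    .appName("event-ingestion-sketch")
    .getOrCreate()
)

# Assumed shape of each incoming JSON event.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Ingest: read a stream of raw events from a Kafka topic.
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Process: parse the JSON payload and keep only well-formed records.
events = (
    raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
    .where(col("event_id").isNotNull())
)

# Store: append the cleaned events to Parquet files in the data lake.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3a://data-lake/events/")
    .option("checkpointLocation", "s3a://data-lake/_checkpoints/events/")
    .outputMode("append")
    .start()
)

query.awaitTermination()
```

The checkpoint location is what lets a restarted job resume where it left off, which is one common way to keep a streaming pipeline reliable.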
Required Qualifications:
Bachelor's degree in Computer Science, Data Engineering, or a related field.
Experience working with big data technologies (e.g., Hadoop, Spark, Kafka).
Proficiency in programming languages like Python, Java, or Scala.
Strong knowledge of SQL and NoSQL databases.
Experience with cloud platforms such as AWS, Azure, or Google Cloud.
Familiarity with data warehousing and ETL processes.
Knowledge of distributed systems and parallel computing.
Strong problem-solving skills and ability to work in a fast-paced environment.
Preferred Skills:
Experience with data lakes and data processing frameworks.
Knowledge of real-time data streaming technologies.
Familiarity with DevOps practices and containerization tools (e.g., Docker, Kubernetes).
Experience with data governance and data security best practices.
Excellent communication and teamwork skills.
Salary and Benefits:
Competitive salary.
Health insurance and wellness programs.
Paid vacation and public holidays.
Opportunities for professional growth and career development.
Collaborative and innovative work environment.
Skills: Python, Docker, AWS, ETL Processes, Google Cloud, Azure, DevOps
Tags: Python, Docker, Kubernetes, SQL, Data Engineer, ETL Processes