Warsaw, Poland
Posted 12 months ago

Job Title: Big Data Engineer - PySpark
Type: Permanent
Salary: 20,000-25,000 PLN per month (negotiable)
Hiring Location: Warsaw, Poland
Location Status: Currently work from home due to COVID
Must-Have Skills: Big Data (PySpark), Hadoop, Unix, shell scripting
Databricks Certified Developer or similar certification

Job Details:
Java/PySpark technical lead with a minimum of 5-6 years of experience in PySpark
Requirement gathering and understanding; analyzing and converting functional requirements into concrete technical tasks
PySpark, Hive, HDFS, Impala, HBase, Hadoop MapReduce, Linux shell scripting
Understanding of cloud, Docker, and Kubernetes will be a plus
Agile/Scrum methodology experience is required; experience with SCMs such as Git and tools such as JIRA
Experience in RDBMS and NoSQL databases
Strong experience in data structures, algorithms, etc.
Well versed with the SDLC, with exposure to its various phases
Strong hands-on working experience with multiple Big Data technologies
Data warehouse exposure is good to have
Responsible for systems analysis: design, coding, and unit testing

Full-Time Duration: 3 Months

Job Features

Job Category: Big Data Engineer PySpark

Apply Online
