
Big Data Developer (Remote, relocation to Poland) Ukraine or Remote

Job #: 65203
EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.

DESCRIPTION


We are looking for aspiring Big Data Developers with at least one year of commercial experience to join our growing Data Practice and make our team even stronger. The projects and technologies we work with are very diverse and cover the full range of technologies currently on the market and represented by open-source communities.

We provide our services to clients in many domains, including finance, healthcare, and insurance, so you will have a chance to develop in any direction you choose.

You can join one of our offices, which are located in Warsaw, Krakow, Wroclaw and Gdansk.
#REF_R2R_PL

Responsibilities

  • Data ingestion from data sources that expose data through different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, time-series data, SAP, and many others based on various proprietary systems. You will research and implement data ingestion using Big Data technologies
  • Data processing/transformation using various technologies such as Spark and cloud services. You will need to understand the business logic of your part and implement it in any language supported by the underlying data platform

Requirements

  • Advanced knowledge of one of the following languages: Java, Scala, Python, or C#
  • Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow
  • Experience with Google Cloud Platform, Amazon Web Services, a Cloudera (or other) Hadoop distribution, or Databricks

Nice to have

  • Spark Streaming
  • Kafka Streaming / Kafka Connect
  • Snowflake
  • ELK Stack
  • Docker / Kubernetes
  • Cassandra / MongoDB
  • CI/CD: Jenkins, GitLab, Jira, Confluence, and other related tools

We offer

  • Competitive compensation depending on experience and skills
  • Individual career path
  • Unlimited access to LinkedIn learning solutions
  • Social package - medical insurance, sports
  • Paid sick leave and regular vacations
  • English classes with native speakers (certified English teachers)
  • Flexible work hours
