Extensive experience in the Python, Scala, or Java programming language; experience with Spark or PySpark; experience with CI/CD and relevant tools like GitHub Actions; experience in business process analysis and automation or a similar field is considered a..
2024-10-24 16:04:07 | SEB (Skandinaviska Enskilda Banken) | Experience with analysis and creation of data pipelines, data architecture, ETL/ELT development, and with processing structured and unstructured data, including post-go-live activities. Ability to analyse data and identify issues (e.g. gaps, inconsistencies..
2024-10-25 11:27:57 | Euroclear | Requirements: at least 4-5 years of experience in data engineering; expertise in the Azure cloud service platform; proven experience with large databases and data warehouses; expertise in Databricks: Delta Tables, DLT, Unity Catalog, performance tuning, etc.
2024-10-25 16:50:37 | Avenga | Ul. Centrum
2024-11-15 13:56:05 | Kevin Edward Consultancy | Ul. Powstańców Śląskich 3
2024-11-15 13:56:04 | SoftKraft | Poland, remote work
2024-11-15 12:55:19 | SII 4,6 | Your responsibilities: develop and maintain data ingestion pipelines using Cassandra, Kafka, Spark, and Scala; collaborate closely with cross-functional teams to ensure data consistency, accuracy, and high performance; optimize data integration..
2024-11-15 12:40:41 | This offer is for you if: you have at least 5 years of commercial experience gained in a similar position; you know Python, PySpark, and Pandas and have experience working with JupyterLab notebooks; you are fluent in SQL and use it daily..
2024-11-15 11:29:45 | Scalo | This offer is for you if: you have at least 4 years of professional experience as a Data Engineer; you are fluent in Python, SQL, DBT, and Airflow; you have experience working with AWS services (e.g. EventBridge, Kinesis); you are familiar with building..
2024-11-15 11:22:06 | Scalo | Experience in data engineering; experience working with cloud solutions (preferably AWS, also GCP or Azure); experience with cloud data platforms (e.g. Snowflake, Databricks); proficiency with Infrastructure as Code (IaC) technologies like Terraform..
2024-11-15 10:24:09 | Provectus | Hybrid work (office in Warsaw); B2B contract
2024-11-14 17:59:02 | Technical skills: expertise in SQL (writing and optimizing complex queries, preferably in T-SQL or Snowflake dialects); writing clean and self-documenting code; practical knowledge of code version control (preferably Git); experience with Azure DevOps (..
2024-11-14 17:36:28 | emagine sp. z o.o. 4,4 | Our company operates in the IT services sector, specializing in delivering advanced technology solutions. We value innovation, professionalism, and the pursuit of excellence. Our..
2024-11-14 16:30:00 | A solid technical background in data and analytics engineering. Proven experience in building and managing data pipelines and models using tools such as dbt, Fivetran, and Snowflake. A strong understanding of data governance, privacy, and compliance..
2024-11-14 14:56:39 | Link Group | Ul. Fabryczna 6
2024-11-14 14:56:03 | Holisticon Connect | Your responsibilities: develop innovative solutions for complex data challenges, using creativity and technology; design and build data systems that fulfill client requirements and optimize user experience; collaborate across teams to integrate..
2024-11-14 13:40:42 | At Scalo, we deliver software projects and support our partners in growing their business. We create software that enables people to make changes and take action..
2024-11-14 12:36:14 | Scalo Sp. z o.o. 5,0 | We are looking for a Data Engineer with strong ETL, Informatica, and Oracle expertise to join Sii and develop advanced data integration solutions. Your Informatica skills will be crucial in building and optimizing robust ETL workflows that support..
2024-11-14 11:50:03 | Sii Polska | Your responsibilities: you will join a newly launched project in the banking/finance area; work on building a data lake solution based on quality and scope requirements; maintain detailed documentation..
2024-11-14 10:40:42 | Ul. Aleje Jerozolimskie
2024-11-14 09:56:04 | Scalo | Ul. Powstańców Śląskich
2024-11-14 09:56:04 | Scalo | BI/Data Engineer for Data Warehouse (remote CZ/SK/PL). The main requirements are MS Fabric or Board. We are looking for candidates who are interested in working remotely, live in the Czech Republic, Slovakia, or Poland, and are nationals or permanent..
2024-11-14 09:38:14 | Manpower Group
2024-11-14 09:36:22 | Scalo Sp. z o.o. 5,0 | Our customer is seeking a Junior Data Engineer to join a large, expert team focused on data architecture, analytics, and AI solutions. This long-term opportunity is ideal for candidates looking to grow within the field of data engineering. The project..
2024-11-14 07:01:28 | B2B.net S.A. | Senior Data Engineer. Your responsibilities: working with Spark and Python to define and maintain data ingestion and transformation; building a distributed, highly parallelized big-data processing pipeline that processes massive amounts of data (both structured..
2024-11-14 07:01:28 | B2B.net S.A. | Ul. Fabryczna 6
2024-11-13 15:56:05 | Holisticon Insight 5,0 | Your responsibilities: day to day, ensure the customers' models are running as intended and troubleshoot issues that arise; support strategic projects by working side by side with the data scientists, from design to delivery, on processes around the..
2024-11-13 15:40:46 | Requirements: min. 5 years of experience as a Data Engineer/Big Data Engineer/Cloud Data Engineer; excellent SQL (and/or NoSQL) skills; very good Python programming skills; experience with Databricks/Snowflake; experience..
2024-11-13 14:56:04 | Team Connect 5,0 | Remote
2024-11-13 12:56:06 | Team Connect 5,0 | Ul. Jana Nowaka-Jeziorańskiego 53A, 03-982 Warszawa
2024-11-13 11:56:06 | CLOUDFIDE | 6-8 years of experience as a Data Engineer; GCP Data Engineer Certification (required); hands-on experience with GCP services, Dataproc, dbt, and Terraform; familiarity with Azure is a plus but not mandatory. Devire IT Outsourcing is a form of cooperation..
2024-11-13 11:26:33 | Devire | Must have: at least 4 years of hands-on experience with Spark and Python; proficiency in Spark SQL or Hive SQL, with a minimum of 4 years of experience; 3+ years working with the Hadoop/Hive ecosystem or other big-data technologies; background in designing..
2024-11-13 11:24:38 | Devire | Ul. Prosta 67
2024-11-12 16:56:04 | Falck Digital Technology Poland 5,0 | Your responsibilities: design and implement sophisticated data models and database systems, ensuring they are scalable, resilient, and optimized for business needs; build and maintain the infrastructure for seamless extraction, transformation, and..
2024-11-12 15:40:49 | Basic qualifications: 5+ years of relevant work experience with a Bachelor's degree, or at least 2 years of work experience with an advanced degree (e.g. Masters, MBA, JD, MD), or 0 years of work experience with a PhD; or 8+ years of relevant work experience..
2024-11-12 14:54:12 | VISA | Qualifications: 7 or more years of relevant work experience with a Bachelor's degree, or 6 or more years of relevant experience with an advanced degree (e.g. Masters, MBA, MD); minimum of 5+ years' experience in building data engineering pipelines;..
2024-11-12 14:44:41 | VISA | Your responsibilities: design, build, and optimize scalable data pipelines with SQL, PySpark, and Python; use Apache Airflow, AWS Glue, Kafka, and Redshift for data processing and orchestration; manage AWS infrastructure (Lambda, S3, CloudWatch,..
2024-11-12 14:40:45 | Your responsibilities: designing, building, testing, and maintaining scalable data management systems; integrating new data management technologies and software engineering tools; creating and implementing methods for collecting and..
2024-11-12 14:40:45 | Your responsibilities: active participation in all phases of the software development lifecycle; writing well-designed, testable, and efficient code; working with large data sets (several petabytes); creating technical documentation;..
2024-11-12 14:40:45 | Must have: design and develop data pipelines: create efficient and scalable data pipelines using GCP services such as Dataflow (Apache Beam), Dataproc (Apache Spark), and Pub/Sub. Data storage solutions: implement and manage data storage solutions using..
2024-10-25 15:26:53 | Experis Polska | Requirements: hard skills: Azure Databricks (Unity Catalog, workflows, Python, Spark/PySpark); Azure Data Factory; MSSQL; ADLS blob storage (delta tables, parquet files); DevOps (pipelines, CI/CD); advanced SQL; ready to analyse tasks. Soft skills: open-minded; En..
2024-10-25 14:39:24 | Experis Polska