Requirements: at least 4-5 years of experience in Data Engineering; expertise in the Azure cloud service platform; proven experience with large databases and data warehouses; expertise in Databricks: Delta Tables, DLT, Unity Catalog, performance tuning, etc.
2024-11-05 19:40:02 | Experis Polska | The ideal candidate: has hands-on experience with SQL; is familiar with Azure; is proficient in Python, with extensive experience in PySpark; has solid experience in data modeling and implementing data layers for solutions; has experience with Databricks…
2024-11-05 14:50:00 | Falck Digital Technology Poland Sp. z o.o. | Deep familiarity with dbt: must have hands-on experience with Snowflake and dbt (data build tool). Mastery of SQL and database management: strong command of SQL and production databases. Proven BI and analytics expertise: skilled in producing dashboards…
2024-11-15 21:46:29 | CouponFollow, a System1 Company | Your responsibilities: designing and building data pipelines; collaborating with other team members; implementing new features with PySpark; conducting tests in Apache Airflow; deploying in Kubernetes; optimizing data; working with the latest…
2024-12-03 13:40:44 | Hays IT Contracting | Hays IT Contracting is cooperation based on B2B terms. Our company matches IT specialists with the most interesting technology projects on the market. Join the group of 500 satisfied contractors and work for Hays clients! Currently, for our…
2024-12-03 11:53:57 | Ul. Fabryczna 6
2024-12-02 14:56:04 | Holisticon Insight 5,0 | This offer is for you if: you have at least 4 years of experience in a Data Engineering role; you have experience working with the Azure platform (Azure Data Factory, Azure SQL Database, Azure Key Vault, Azure OpenAI Service); you work with Databricks: Delta…
2024-12-02 14:44:54 | Scalo | At least 4 years of experience as an SQL Developer / Data Engineer; knowledge of Python and SQL, including T-SQL; writing complex stored procedures and performance optimization; ability to build ETL processes (SSIS, ADF); experience…
2024-12-02 10:46:14 | XTB | Your responsibilities: building data pipelines, including designing, creating, and maintaining robust, scalable, and efficient ETL/ELT pipelines to handle various data sources and large-scale data processing; data modeling…
2024-11-28 15:40:36 | Requirements: deep knowledge of data engineering principles and working experience in developing advanced analytics data pipelines; knowledge of SQL and excellent coding skills in Python / Scala; experience working with big data technologies (Hadoop…
2024-11-28 14:00:27 | Avenga 4,3 | Your responsibilities: working together with Platform Engineers to assess and choose the most suitable technologies and tools for the project; development and committing of new functionalities and open-source tools; executing intricate data intake…
2024-11-28 13:40:37 | Remote
2024-11-26 13:56:03 | TransferGo | At Scalo, we deliver software projects and support our partners in growing their business. We build software that enables people to make changes and take action…
2024-11-20 18:36:18 | Scalo Sp. z o.o 5,0 | Ul. Stanisława Żółkiewskiego 17b
2024-11-20 17:56:02 | Medius | Your responsibilities: develop new data pipelines and maintain our data ecosystem, with a strong focus on fault-tolerant data ingestion, storage, and lifecycle management, as well as the computation of metrics, reports, and derived insights…
2024-11-20 16:40:29 | Ul. Powstańców Śląskich 9
2024-11-20 15:56:05 | Scalo | Your responsibilities: design and develop new techniques and data pipelines to enable various insights for internal and external clients within market research; improve our infrastructure required for optimal extraction, transformation, and loading…
2024-11-20 14:40:41 | Must-have: 5+ years of experience in data engineering focused on data integration, ETL, and data warehousing; proficiency in SQL, Python, and Spark; experience with data platforms like Databricks and Snowflake; strong data modeling skills; knowledge…
2024-11-20 10:17:10 | ROCKWOOL Global Business Service | Ul. Tadeusza Czackiego 15/17
2024-11-20 09:56:04 | Billennium 4,3 | Your responsibilities: design and implement data pipelines using dbt for transformation and modeling; manage and optimize data warehouse solutions on Snowflake; develop and maintain ETL processes using Fivetran for data ingestion; utilize…
2024-11-20 09:40:39 | Possess a Bachelor’s degree in Computer Science or Software Engineering, or demonstrate substantial experience as a seasoned app developer. Demonstrate a minimum of 3 years of proficiency in Python, SQL, and the data systems development life cycle. Experience…
2024-11-20 09:34:38 | Proxet | Ul. Tadeusza Czackiego 15/17
2024-11-19 15:56:04 | Sunscrapers | At least 5 years of professional experience as a data engineer; undergraduate or graduate degree in Computer Science, Engineering, Mathematics, or similar; excellent command of spoken and written English, at least C1; strong professional experience with…
2024-11-19 14:17:47 | Sunscrapers | Salary bracket: 160-200 PLN net / hour (B2B). About the role: We're excited to share that we're seeking a Data Engineer to join our team! This role plays a vital part in our company, and we're looking for candidates with exceptional skills and expertise…
2024-11-19 11:50:03 | Proficiency in a programming language like Python, Scala, or Java; knowledge of Lakehouse platforms (Databricks); experience working with messaging systems (Kafka); familiarity with version control systems, particularly Git; experience as a programmer and…
2024-11-19 10:29:56 | GetInData | Part of Xebia 5,0 | Senior Data Engineer (GCP) (She/He/They) 41_R00169044 | Responsibilities: collaborate on solutions using Google Cloud Platform (GCP) services, integrating with on-premise and cloud solutions; deploy Data & AI solutions on GCP, leveraging artificial intelligence and…
2024-11-19 07:01:19 | Accenture Sp. z o.o. | Description of knowledge and experience: 4+ years of experience as a Data Engineer; experience in Spark and Python; strong understanding of Spark SQL or Hive SQL; experience with the Hadoop/Hive ecosystem or other Big Data technologies; previous experience in…
2024-11-17 17:02:23 | emagine sp. z o.o. 3,9 | Deep familiarity with dbt: must have hands-on experience with Snowflake and dbt (data build tool). Mastery of SQL and database management: strong command of SQL and production databases. Proven BI and analytics expertise: skilled in producing dashboards…
2024-11-15 21:46:29 | CouponFollow, a System1 Company | This offer is for you if: you have at least 5 years of commercial experience gained in a similar position; you know Python, PySpark, and Pandas, and have experience working with JupyterLab notebooks; you are fluent in SQL and on a daily basis you use…
2024-11-15 11:29:45 | Scalo | Experience in data engineering; experience working with cloud solutions (preferably AWS, also GCP or Azure); experience with cloud data platforms (e.g., Snowflake, Databricks); proficiency with Infrastructure as Code (IaC) technologies like Terraform…
2024-11-15 10:24:09 | Provectus | Your responsibilities: you will join a newly created project in the banking/finance domain; work on building a data lake solution based on quality and scope requirements; maintaining detailed documentation…
2024-11-14 10:40:42 | Ul. Aleje Jerozolimskie
2024-11-14 09:56:04 | Scalo | At Scalo, we deliver software projects and support our partners in growing their business. We build software that enables people to make changes and take action…
2024-11-14 09:36:22 | Scalo Sp. z o.o 5,0 | Senior Data Engineer | Your responsibilities: working with Spark & Python to define and maintain data ingestion and transformation; building a distributed and highly parallelized big data processing pipeline which processes massive amounts of data (both structured…
2024-11-14 07:01:28 | B2B.net S.A | Ul. Fabryczna 6
2024-11-13 15:56:05 | Holisticon Insight 5,0 | Ul. Prosta 67
2024-11-12 16:56:04 | Falck Digital Technology Poland 5,0 | Your responsibilities: design, build, and optimize scalable data pipelines with SQL, PySpark, and Python; use Apache Airflow, AWS Glue, Kafka, and Redshift for data processing and orchestration; manage AWS infrastructure (Lambda, S3, CloudWatch…
2024-11-12 14:40:45 | Minimum 5 years in data engineering with demonstrated expertise in designing, building, and optimizing data workflows. Core languages: proficient in SQL, PySpark, and Python. Frameworks and tools: skilled in using Apache Airflow, AWS Glue, Kafka, Redshift…
2024-11-12 10:48:10 | Devire | Remote
2024-11-11 13:56:05 | TransferGo | Experience in data engineering; experience working with cloud solutions (preferably AWS, also GCP or Azure); experience with cloud data platforms (e.g., Snowflake, Databricks); proficiency with Infrastructure as Code (IaC) technologies like Terraform…